Thursday, October 6, 2016

Our Infrared Street View (ISV) program has won the JUMP Competition sponsored jointly by CLEAResult, the largest provider of energy efficiency programs and services in North America, and the National Renewable Energy Laboratory (NREL), a national laboratory of the US Department of Energy (DOE). This JUMP Competition called for innovations in using smartphones' sensing capabilities to improve residential energy efficiency. Finalists were selected from a pool of submitted proposals and invited to pitch to the audience at the CLEAResult Energy Forum held in Austin, TX, on October 4-6, 2016. Each competition has only one winner among many good ideas; this year, we happened to be it.

IR homework

We envision Infrared Street View as an infrared (IR) counterpart of Google's Street View (I know, I know, this is probably too big to swallow for an organization that is a few garages small). Unlike Google's Street View, which works in the range of visible light, Infrared Street View will provide a gigantic database of thermal images in the range of invisible IR light that molecules emit as thermal radiation. Viewed another way, these images form a massive 3D web of temperature data points. What is the value of this big data? If the data are collected in the right way, they can represent the current state of energy efficiency of our neighborhoods, towns, cities, and even states. In a sense, what we are talking about is a thermographic information system (TIS).

We are not the only group that has realized this possibility (though we are likely the first to come up with the notion and name of a TIS). A few startup companies in the Boston area worked on this frontier earlier this decade, but none of them has tapped the potential of smartphone technology. With a handful of drive-by trucks or fly-by drones carrying mounted infrared cameras, it would probably take these companies a century to complete a thermal survey of the entire country. Furthermore, trucks can only take images from the front of a building and drones can only take images from above, which means that their data are incomplete and cannot be used to create the thermal web we are imagining. In some cases, unsolicited thermal scans of people's houses may even cause legal trouble, as thermal signatures can accidentally disclose sensitive information.

Our solution is based on FLIR ONE, a $200-ish thermal camera that can be plugged into a smartphone (iOS or Android). The low cost of FLIR ONE, for the first time in history, makes it possible for the public to participate in this thermal survey. But even with the relatively low price tag, it is simply unrealistic to expect that a lot of people will buy the camera and scan their own houses. So where can we find a lot of users who would volunteer to participate in this effort?

Let's look elsewhere. There are four million children entering the US education system each year. Every single one of them is required to spend a sizable chunk of their education learning thermal science concepts -- in a way that currently relies on formalism (the book shows you the text and math; you read the text and do the math). IR cameras, capable of visualizing otherwise invisible heat flow and distribution, are no doubt the best tool for teaching and learning thermal energy and heat transfer (except for the visually impaired -- my apologies). I think few science teachers would disagree. And starting this year, educational technology vendors like Vernier and Pasco are selling IR cameras to schools.

What if we teach students thermal science in the classroom with an IR camera and then ask them to inspect their own homes with the camera as a homework assignment? We then ask them to obtain their parents' permission and contribute their IR images to the Infrared Street View project. If millions of students do this, we will have an ongoing crowdsourcing project that can engage and mobilize generations of students to come.

Sensor-based artificial intelligence

We can't take students' IR images seriously, I hear you say. True, students are not professionals and they make mistakes. But there is a way to teach them to act and think like professionals, which is actually a goal of the Next Generation Science Standards that will define the next two or three decades of US science education. Aside from a curriculum that teaches students how to use IR cameras (skills) and how to interpret IR images (concepts), we are also developing a powerful smartphone app called SmartIR. This app includes many innovations, and two of them may lead to true breakthroughs in the field of thermography.

Thermogram sphere

The first is sensor-based intelligence. Modern smartphones have many built-in sensors, including visible-light cameras, that are capable of collecting multiple types of data, and increasingly powerful computer vision libraries enrich this capability even further. Machine learning can infer what students are trying to do by analyzing these data, and SmartIR can then automatically guide them in real time. This kind of artificial intelligence (AI) can help students avoid common mistakes in infrared thermography and accelerate their thermal surveys, especially when they are scanning buildings independently, with no experienced instructor around to help. For example, SmartIR can check whether an inspection is being done at night or during the day. If it is during the day (because the clock says so or the ambient light sensor says so), SmartIR will suggest that students wait until nightfall, which eliminates the side effect of solar heating and increases the indoor-outdoor temperature difference. With an intelligent app like this, we may be able to increase the quality and reliability of the IR images fed to the Infrared Street View project.
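The day-versus-night check described above can be sketched as a small piece of decision logic. This is only an illustrative sketch, not SmartIR's actual code; the function names and the lux threshold are our assumptions.

```python
from datetime import time

# Hypothetical sketch of a SmartIR-style day/night check.
# The 50-lux threshold and the 8pm-5am window are illustrative
# assumptions, not values from the actual app.
DAYLIGHT_LUX_THRESHOLD = 50.0

def ok_to_scan(now: time, ambient_lux: float) -> bool:
    """Return True if conditions look right for an outdoor IR scan."""
    is_night_by_clock = now >= time(20, 0) or now <= time(5, 0)
    is_dark_by_sensor = ambient_lux < DAYLIGHT_LUX_THRESHOLD
    # Require the clock and the ambient light sensor to agree.
    return is_night_by_clock and is_dark_by_sensor

def advise(now: time, ambient_lux: float) -> str:
    """Turn the check into real-time guidance for the student."""
    if ok_to_scan(now, ambient_lux):
        return "Conditions look good: scan away."
    return ("Wait until after dark: solar heating skews readings, and the "
            "indoor-outdoor temperature difference is larger at night.")
```

Requiring both signals to agree is one way to handle the two cues the text mentions; a real app would also need to deal with indoor lighting fooling the light sensor.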

Virtual infrared reality (VIR) viewed with Google Cardboard

The second is virtual infrared reality, or VIR for short: true, immersive thermal vision. VIR is a technology that integrates infrared thermography with virtual reality (VR). Based on the phone's orientation and GPS sensors, SmartIR can position thermal images on what we call a thermogram sphere and knit them together to render a seamless IR view. A VIR can be uploaded to Google Maps so that the public can experience it with a VR viewer such as Google Cardboard. We don't know whether VIR will do any better than 2D IR images in promoting the energy efficiency business, but it is reasonable to assume that many people would not mind seeing a cool (or hot) view like this while searching for their dream houses. For building science professionals, VIR may have further implications, because it naturally organizes the thermal images of a building into a more holistic view of what is going on thermally.
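One way to picture the thermogram sphere: each thermal image is placed on a sphere according to the phone's orientation when it was captured, then projected into a panorama. A minimal sketch of that placement, assuming an equirectangular projection (our assumption; SmartIR's internals may differ):

```python
def sphere_to_equirect(azimuth_deg: float, elevation_deg: float,
                       width: int, height: int) -> tuple:
    """Map a camera orientation (azimuth 0-360 degrees, elevation
    -90..90 degrees) to pixel coordinates in an equirectangular
    panorama of the given size. Used to place each thermal image
    on the thermogram sphere."""
    x = int((azimuth_deg % 360.0) / 360.0 * width)
    y = int((90.0 - elevation_deg) / 180.0 * height)
    # Clamp to valid pixel indices at the edges.
    return min(x, width - 1), min(max(y, 0), height - 1)
```

For example, a shot facing due south (azimuth 180) at the horizon (elevation 0) lands at the center of a 4096 x 2048 panorama. Stitching the overlapping images seamlessly is of course the hard part; this only shows where each tile goes.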

With these innovations, we may eventually be able to realize our vision of inventing a visual 3D web of thermal data, or the thermographic information system, that will provide a massive data set for governments and companies to assess the state of residential energy efficiency on an unprecedented scale and with incredible detail.

Sunday, October 2, 2016

Many solar facilities use racking systems to hold and move arrays of solar panels. Racks are now supported in our Energy3D software. This new feature allows users to design many different kinds of solar farms, solar parks, and solar canopies, ranging from small scale (a few dozen panels) to large scale (a few thousand).

Fig. 2: Multiple racks

Mini solar stations often use a single rack to hold an array of solar panels (Figure 1). This may be the best option when solar panels cannot be installed on a building's roof. You have probably seen this kind of setup at nature centers, where the buildings are often shaded by surrounding trees.

If you have more space, you can probably install multiple racks (Figure 2), especially if you are considering driving them with altazimuth dual-axis solar trackers. This configuration is also seen in some large photovoltaic power stations.

Fig. 3: Rack arrays

Larger solar farms typically use arrays of long racks (Figure 3). Each rack can be driven by a horizontal single-axis tracker. Taller racks usually require larger inter-rack spacing, which may be an advantage because it allows maintenance trucks to drive through. SunPower recently experimented with growing crops and raising animals in the inter-rack space of its Oasis 3.0 system. So arrays of taller racks may be desirable if you want to combine green energy with green agriculture.
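The spacing-versus-height trade-off follows from basic shadow geometry: a rack of height h casts a shadow of roughly h / tan(a) when the sun is at elevation angle a. A rough sketch of the estimate, where the worst-case elevation angle is a design input you would choose (e.g., the winter solstice sun at your latitude), and panel tilt and azimuth are ignored for simplicity:

```python
import math

def min_row_spacing(rack_height_m: float,
                    worst_sun_elevation_deg: float) -> float:
    """Approximate row spacing (in meters) needed so one rack's shadow
    does not reach the next rack at the chosen worst-case solar
    elevation angle. Ignores panel tilt and sun azimuth, so treat the
    result as a first-cut estimate only."""
    return rack_height_m / math.tan(math.radians(worst_sun_elevation_deg))

# e.g., a 3 m tall rack designed for a 20-degree winter sun elevation
# needs roughly 8.2 m between rows -- enough for a maintenance truck.
```

Doubling the rack height doubles the required spacing, which is why taller racks naturally open up the drive-through (or farming) corridors mentioned above.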

Fig. 4: Solar canopy above a parking lot

If you raise the height of a rack, it becomes a so-called solar canopy that provides shade for human activities, much as the green canopies of trees do. The most common type of solar canopy converts a parking lot into a power station while sheltering cars from the summer sun (Figure 4).

Designing solar canopies for schools' parking lots may be a great engineering project for students to undertake, and it is being integrated into our Solarize Your School project. In fact, Figure 4 shows a design for a real parking lot at Natick High School in Massachusetts. The hypothetical design has more than 1,500 solar panels (each measuring 0.99 x 1.96 m) and costs over a million dollars.
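A back-of-the-envelope check of the numbers above, using the panel dimensions from the design; the per-panel wattage is our assumption (typical of panels of this era), not a figure from the actual project:

```python
# Rough sanity check of the Natick High School canopy design.
# Panel count and dimensions are from the design; the 300 W
# per-panel rating is an assumed typical value, not the real spec.
num_panels = 1500
panel_area_m2 = 0.99 * 1.96                    # ~1.94 m2 per panel
watts_per_panel = 300                          # assumed

total_area_m2 = num_panels * panel_area_m2     # ~2,900 m2 of panels
capacity_kw = num_panels * watts_per_panel / 1000.0  # ~450 kW
```

At roughly half a dollar per installed watt just for modules in 2016, plus racking, inverters, and labor, a total cost north of a million dollars is entirely plausible for an array of this size.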