A Small Step towards a Big Infrared Street View

The Infrared Street View is an award-winning project funded by the National Science Foundation. The idea is to create a thermal equivalent of Google's Street View that would serve as the starting point for developing a thermographic information system (i.e., the "Internet of Temperature" as the thermal part of the Internet of Things). This is an ambitious goal that is normally only attainable through big investments from Wall Street or by big companies like Google. However, being a single developer without a lot of resources, I decided to give it a shot on my own. Not being a Googler, I am counting on the citizen scientists out there to help me build the Infrared Street View. But the first step is to create a free app so that they have a way to contribute.

My journey effectively started in mid-July 2018. In two months, I have learned how to develop a powerful app. As a result, the Infrared Street View is coming into sight! This blog article shows some of the (imperfect but promising) results, as demonstrated in Figures 1 and 2.


Fig. 1: Panoramas in visible light and infrared light generated by SmartIR
This milestone is about developing the functionality for creating infrared panoramas, so that anyone with a smartphone that has an infrared camera attachment such as FLIR ONE can produce a panoramic image and contribute it to the Infrared Street View, much like what you can do with Google's Street View app. Although this sounds easy, it is by no means a small challenge, as we must work under the constraint of a very slow infrared thermal camera that can take fewer than ten pictures per second. We must provide an easy-to-use interface so that most people can do the job without being overwhelmed. And we must overcome the challenge of stitching the imperfect thermal images together to produce a seamless panoramic picture. Although there are many image stitchers out there, no one can be sure that they are applicable to thermal images, as those stitchers may have been optimized only for visible-light images.
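To get a sense of that last problem, here is a minimal sketch, not SmartIR's actual pipeline, of what it looks like to feed a set of overlapping thermal frames to a general-purpose stitcher. It assumes the OpenCV 4.x Java bindings, and the file names are placeholders; whether the default feature matching, tuned for visible light, holds up on low-contrast thermal images is exactly what has to be tested.

```java
// Hedged sketch: try an off-the-shelf stitcher on thermal frames (OpenCV 4.x Java bindings assumed).
import java.util.ArrayList;
import java.util.List;

import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.stitching.Stitcher;

public class ThermalStitchTest {

    public static void main(String[] args) {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME); // load the native OpenCV library

        // Load the overlapping thermal frames captured by the image collector (placeholder names).
        List<Mat> frames = new ArrayList<>();
        for (String name : new String[] {"ir_000.png", "ir_001.png", "ir_002.png"}) {
            Mat img = Imgcodecs.imread(name);
            if (!img.empty()) frames.add(img);
        }

        // PANORAMA mode assumes the camera rotates about a fixed point,
        // which matches the "turn the phone in place" capture procedure.
        Stitcher stitcher = Stitcher.create(Stitcher.PANORAMA);
        Mat panorama = new Mat();
        int status = stitcher.stitch(frames, panorama);

        if (status == Stitcher.OK) {
            Imgcodecs.imwrite("ir_panorama.png", panorama);
        } else {
            // Thermal frames often lack the distinctive features the default matcher relies on;
            // a failure here is what motivates a thermal-specific stitcher.
            System.err.println("Stitching failed with status " + status);
        }
    }
}
```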

Fig. 2: Panoramas in visible light and infrared light (two coloring styles) generated by SmartIR
To help users make 360° panoramas, we have to guide them to aim at the right angles so that the resulting images can be used for stitching. These images should be evenly distributed in the azimuthal space, and they should overlap considerably so that the stitcher has a clue about how to knit them together. SmartIR uses the on-board sensors of the smartphone to detect the orientation of the infrared camera. A number of circles are shown on the screen to indicate where the user should aim the camera's cursor. When the cursor is within a circle, an image is automatically taken and stored in a 360° scroller. The user just turns the phone a number of times from a fixed position, and a series of images will be generated for stitching. The following YouTube video shows how this image collector works.


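For the curious, the sketch below illustrates in simplified form how a phone's rotation-vector sensor can drive this kind of guidance: target azimuths are spread evenly around the circle, and a capture is triggered when the camera points within a small tolerance of an uncaptured target. This is only an illustration under those assumptions, not SmartIR's actual code; the class name PanoramaGuide, the onAimAcquired callback, and the 5° tolerance are hypothetical.

```java
// Minimal sketch (hypothetical class, not SmartIR source): use Android's rotation-vector
// sensor to detect the camera's azimuth and fire a capture callback near each target angle.
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class PanoramaGuide implements SensorEventListener {

    public interface CaptureCallback {
        void onAimAcquired(int targetIndex, float azimuthDegrees);
    }

    private static final float TOLERANCE_DEGREES = 5f; // how close to a target circle counts as "aimed"

    private final SensorManager sensorManager;
    private final CaptureCallback callback;
    private final float[] targetAzimuths;   // evenly spaced directions, e.g. every 30 degrees
    private final boolean[] captured;       // which targets have already been photographed
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3];

    public PanoramaGuide(SensorManager sensorManager, int numberOfShots, CaptureCallback callback) {
        this.sensorManager = sensorManager;
        this.callback = callback;
        this.targetAzimuths = new float[numberOfShots];
        this.captured = new boolean[numberOfShots];
        for (int i = 0; i < numberOfShots; i++) {
            targetAzimuths[i] = i * 360f / numberOfShots; // even distribution around the full circle
        }
    }

    public void start() {
        Sensor rotationVector = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
        sensorManager.registerListener(this, rotationVector, SensorManager.SENSOR_DELAY_UI);
    }

    public void stop() {
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) return;
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);
        // orientation[0] is the azimuth in radians, measured from magnetic north
        float azimuth = (float) Math.toDegrees(orientation[0]);
        if (azimuth < 0) azimuth += 360f;
        for (int i = 0; i < targetAzimuths.length; i++) {
            if (!captured[i] && angularDistance(azimuth, targetAzimuths[i]) < TOLERANCE_DEGREES) {
                captured[i] = true;
                callback.onAimAcquired(i, azimuth); // the app would take a thermal image here
                break;
            }
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // not needed for this sketch
    }

    // shortest angular distance between two compass headings, in degrees
    private static float angularDistance(float a, float b) {
        float d = Math.abs(a - b) % 360f;
        return d > 180f ? 360f - d : d;
    }
}
```

In practice, once all targets are marked as captured, the collected frames would be handed off to the stitcher, in the spirit of the earlier stitching sketch.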
Although this is still in a very primitive form, it nonetheless represents the first concrete step towards the realization of the Infrared Street View. Stitchers for infrared thermal images still need improvements to truly achieve seamless effects similar to those for visible-light images. Tremendous challenges still lie ahead. I will keep interested folks posted as I inch towards the goal, and I am quite optimistic that we can get somewhere, even though we are not Googlers.
