Experimenting with Extended Reality in our Innovation Lab
At the Concord Consortium, we’re always experimenting with new ideas, using new technologies to support classroom inquiry in STEM, and researching the effects on learning. We’re a group of thinkers and tinkerers. As we invent new tools for tomorrow’s learners, our Innovation Lab draws from the future.
We’re currently fascinated by the potential of extended reality (XR)—a general term that includes technologies such as augmented reality (AR), mixed reality (MR), and virtual reality (VR)—to improve learning in schools, homes, and informal learning spaces like museums. From the practical to the fantastical—Google Cardboard to the HTC VIVE—we’re exploring approaches and technologies that blend the real world with the virtual or immerse users in an imagined space to help students understand the world around them.
GRASP (Gesture Augmented Simulations for Supporting Explanations), a project funded by the National Science Foundation, is studying the role that motions of the body play in forming explanations of scientific phenomena. We’re designing web-based gesture-controlled simulations for students to investigate molecular heat transfer, the pressure-volume relationship in gases, and the causes of the Earth’s seasons. The simulations use the Leap Motion controller to track the position and movement of a user’s hands in three dimensions. For example, students can explore the changing seasons by controlling the tilt of the Earth with the angle of their hands, as in the sketch below.
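To make that interaction concrete, here is a minimal sketch of how a web page might map hand roll to the Earth’s axial tilt using the leapjs browser library that accompanies the Leap Motion controller. The setEarthTilt function and the 45-degree clamp are our illustrative assumptions, not GRASP’s actual code.

```typescript
// A minimal sketch, not the GRASP codebase: map the roll of a tracked hand
// to the axial tilt of an Earth model, assuming the leapjs browser library
// (https://github.com/leapmotion/leapjs) is loaded via a script tag.
declare const Leap: any; // global provided by leap.js

const MAX_TILT_DEG = 45; // clamp so the model stays readable (assumption)

function setEarthTilt(degrees: number): void {
  // Hypothetical hook into the seasons simulation's rendering layer.
  console.log(`axial tilt: ${degrees.toFixed(1)} degrees`);
}

Leap.loop((frame: any) => {
  const hand = frame.hands[0];
  if (!hand) return; // no hand in view this frame

  // hand.roll() returns the palm's roll angle in radians.
  const rollDeg = hand.roll() * (180 / Math.PI);
  const tilt = Math.max(-MAX_TILT_DEG, Math.min(MAX_TILT_DEG, rollDeg));
  setEarthTilt(tilt);
});
```

Driving the model from a per-frame callback like this keeps the simulation responsive: every tracked frame updates the tilt directly, with no buttons or sliders in between.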
Our Infrared Street View program is a counterpart to Google’s Street View. IR cameras can visualize otherwise invisible heat flow and distribution, so we built Infrared Street View around the low-cost FLIR ONE thermal camera, which plugs into a smartphone (iOS or Android) to collect heat flow data from buildings (Figure 1). We envision a time when a massive crowdsourcing project could engage students to collect data that demonstrates the current state of energy efficiency of their neighborhoods, towns, and states. Infrared Street View won the 2016 JUMP competition, sponsored jointly by CLEAResult, the largest provider of energy efficiency programs and services in North America, and the National Renewable Energy Laboratory, a national laboratory of the U.S. Department of Energy.
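For a sense of what one crowdsourced observation might look like, here is a hypothetical TypeScript record for a single student-submitted thermal snapshot. The field names and sample values are illustrative assumptions, not the project’s actual schema.

```typescript
// A hypothetical sketch of a crowdsourced thermal observation; field names
// and values are illustrative, not Infrared Street View's actual data model.
interface ThermalObservation {
  buildingId: string;   // or a street address
  lat: number;          // where the image was captured
  lon: number;
  capturedAt: string;   // ISO 8601 timestamp
  outdoorTempC: number; // ambient temperature for context
  minTempC: number;     // coldest pixel in the IR frame
  maxTempC: number;     // hottest pixel (e.g., a leaky window frame)
  imageUrl: string;     // the uploaded FLIR ONE image
}

// Example record a student might submit on a cold winter evening:
const sample: ThermalObservation = {
  buildingId: "main-st-042",
  lat: 42.4604,
  lon: -71.3489,
  capturedAt: "2017-01-15T18:30:00Z",
  outdoorTempC: -5,
  minTempC: -4,
  maxTempC: 12,
  imageUrl: "https://example.org/ir/main-st-042.jpg",
};
```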
The William K. Bowes, Jr. Foundation provided funding for our Learning Everywhere project, which is experimenting with attaching the Leap Motion to the headset of an HTC VIVE, a fully immersive room-scale VR system, so students can become part of the microscopic world and use their hands to control simulations. (Think Ms. Frizzle on a Magic School Bus field trip.) The standard VIVE controllers can provide haptic feedback, but we’re making the experience controller-free, so we use audio feedback (e.g., changing volume and pitch) to help students understand their movements, for instance when they pull two molecules apart to explore ionic and covalent bonds. We also prototyped a simple room-scale solar system simulation where a learner can walk around the surface of a model Earth, observe shadows on a distant moon, or teleport and see the system from an alternative perspective.
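As an illustration of that audio mapping, the sketch below uses the standard Web Audio API to raise pitch and volume as two molecules are pulled apart. The moleculeSeparation hook and the frequency range are hypothetical stand-ins, not the Learning Everywhere implementation.

```typescript
// A minimal sketch, not the Learning Everywhere code: sonify the distance
// between two virtual molecules, rising in pitch and volume as the student
// pulls them apart. moleculeSeparation() is a hypothetical hook into the
// VR simulation, returning the current separation in nanometers.
declare function moleculeSeparation(): number;

const MAX_SEPARATION = 1.0; // nm at which the bond is fully broken (assumption)

// Note: browsers typically require a user gesture before audio can start.
const ctx = new AudioContext();
const osc = ctx.createOscillator();
const gain = ctx.createGain();
osc.type = "sine";
osc.connect(gain).connect(ctx.destination);
osc.start();

function sonify(): void {
  const t = Math.min(moleculeSeparation() / MAX_SEPARATION, 1);
  // Map separation onto 220-880 Hz and 0-50% volume.
  osc.frequency.value = 220 + t * 660;
  gain.gain.value = 0.5 * t;
  requestAnimationFrame(sonify); // update once per rendered frame
}
requestAnimationFrame(sonify);
```

A continuous mapping like this gives the learner the same kind of moment-to-moment feedback a vibrating controller would, without putting anything in their hands.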
We’ve just started to explore Microsoft’s HoloLens to bring the physical and digital worlds together in augmented reality. The HoloLens is a wireless headset with a transparent visor that lets a user see holograms layered on top of the real world. We’re experimenting with teaching complex subjects through gesture-based interactions with virtual holograms, which give students the freedom to move around and explore a simulation from different perspectives.
At the 2017 Augmented World Expo, we gathered with like-minded geeks and technology futurists ready to discuss innovative uses of AR and VR. Enthusiasm and inventiveness were in the air. We don’t know where the future of AR and VR will lead us, but we’re optimistic about the possibilities for STEM teaching and learning.
This material is based upon work supported by the William K. Bowes, Jr. Foundation and the National Science Foundation under grants DUE-1432424 and DRL-1512868. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.