Mixed Reality Brings Science Concepts to Life
In his Critique of Pure Reason, the Enlightenment philosopher Immanuel Kant asserted that “conception without perception is empty; perception without conception is blind… The understanding can intuit nothing, the senses can think nothing. Only through their union can knowledge arise.” More than 200 years later, his wisdom is still enlightening our work in mixed-reality science experiments.
Mixed reality refers to the integration of real and virtual worlds to create new environments in which physical and digital objects coexist and interact in real time, providing user experiences that are impossible in either the real or virtual world alone. Perception is a cognitive process that occurs in the real world, while conception is a cognitive process that can be stimulated by virtual reality. Mixed reality couples the two processes.
Enacting science concepts in the real world
One way to look at the cognitive potential of mixed reality is to start by examining hands-on activities. Students enjoy hands-on activities because they provide perceptual experiences that feel real. For these experiences to make sense, however, students must be prepared with the conceptual framework needed to understand what they perceive. For example, while conducting an experiment about a gas law, students must be able to reason about the results using the kinetic theory (a gas is made of many interacting molecules in perpetual random motion). In this case, the temperature, pressure and volume of a gas can be perceived, while molecules, their motions and their collisions cannot—these are concepts scientists developed to explain the perceivable properties of gases.
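To make that link concrete, here is a minimal sketch in Python of how kinetic theory ties the temperature we can perceive to molecular speeds we cannot. The constants and the function name are illustrative, not part of any curriculum software.

```python
import math

# Kinetic theory links the perceivable temperature of a gas to the
# imperceptible speed of its molecules: the average translational kinetic
# energy per molecule is (3/2)*k_B*T, so the root-mean-square speed is
# sqrt(3*k_B*T/m).
K_B = 1.380649e-23   # Boltzmann constant, J/K
M_N2 = 4.65e-26      # approximate mass of one N2 molecule, kg

def rms_speed(temperature_kelvin, molecule_mass_kg):
    """Root-mean-square molecular speed predicted by kinetic theory."""
    return math.sqrt(3 * K_B * temperature_kelvin / molecule_mass_kg)

# Room-temperature air (mostly nitrogen) at about 293 K: roughly 510 m/s.
print(f"{rms_speed(293, M_N2):.0f} m/s")
```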
Traditionally, students learn the kinetic theory first and then investigate gas laws in the lab. But this sequence does not guarantee that the conceptual and experimental learning will be integrated. Even if students have studied a concept and performed well on a written test, they can still fall back on possibly erroneous preconceptions in the lab, as if they had never been taught the concept.
To enhance conceptual learning in lab activities, we can use powerful computers to render abstract concepts as visual, dynamic simulations and use sensors to seamlessly integrate the simulations with perceptual experiences in the real world (Figure 1). Such simulations can respond to changes in physical properties caused by the user’s actions. In this way, students can see the science concepts at work in the real world. For example, students can walk around a building holding a tablet that is running a molecular simulation of air and experience how the air temperature they feel is related to the simulated motion of air molecules displayed on the screen. And when a student ventures outside, the tablet can run a simulation revealing how water molecules form a regular lattice structure when the environmental temperature is below the freezing point; the lattice breaks apart when the student walks back inside.
These mixed-reality activities represent a novel method of blending computer simulations into the real world. Simulations capable of reacting to changes in the environment provide a way to translate the otherwise obscure numeric data from sensors into compelling visualizations of science concepts.
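As a rough illustration of this idea, the sketch below maps a temperature sensor reading onto the speeds of simulated molecules by rescaling their velocities so that the simulated kinetic energy tracks the sensed temperature. The function names and numbers are hypothetical, not the project’s actual code.

```python
import math
import random

def read_temperature_sensor():
    """Stand-in for a real sensor driver; returns temperature in kelvin."""
    return 293.0 + random.uniform(-0.5, 0.5)

def rescale_velocities(velocities, current_temp, sensed_temp):
    """Kinetic energy scales linearly with temperature, so each velocity
    component is scaled by sqrt(T_sensed / T_current)."""
    factor = math.sqrt(sensed_temp / current_temp)
    return [(vx * factor, vy * factor) for vx, vy in velocities]

# One step of the render loop: sense, rescale, then draw the molecules.
velocities = [(random.gauss(0, 300), random.gauss(0, 300)) for _ in range(100)]
sensed = read_temperature_sensor()
velocities = rescale_velocities(velocities, current_temp=293.0, sensed_temp=sensed)
```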
Situating computer simulations with perceptual anchors
Another way to look at the cognitive potential of mixed reality is to start by examining learning with computer simulations. Simulations of invisible properties and processes are now widely used to teach science concepts. However, visual simulations of invisible phenomena alone are often insufficient for learning, because cognition requires a real-world context. For conceptual understanding to take root, students must find ways to connect new concepts to their perceptual experiences and integrate them with their current knowledge.
To help students make these mental connections, instructional designers often contextualize science animations with graphics that represent familiar objects. For example, clicking an image of a bike pump in a gas simulation adds molecules; clicking an image of a Bunsen burner adds heat, and so on. These images serve as the perceptual anchors that link the picture of random molecular motion to the everyday experiences of pressure and temperature. These anchors, however, are limited to visual perception.
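In code, this conventional approach amounts to wiring click targets to simulation actions, roughly as in the sketch below. The image identifiers and the simulation methods are hypothetical.

```python
# Clickable images of familiar objects are wired to actions on the simulated gas
# (the identifiers and the sim interface are illustrative only).
CLICK_ACTIONS = {
    "bike_pump": lambda sim: sim.add_molecules(10),   # pump in more gas
    "bunsen_burner": lambda sim: sim.add_heat(5.0),   # speed up the molecules
}

def on_image_clicked(image_id, sim):
    CLICK_ACTIONS[image_id](sim)
```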
What if, instead of clicking on images, students could actually exert force or add heat to compress or heat a simulated gas (Figure 2)? In this way, the otherwise abstract simulation can be meaningfully situated in a familiar environment and connected with different kinds of perception (e.g., spatial, mechanical and thermal senses). In this mixed-reality configuration, natural user actions are mapped to variables in the simulation, creating the illusion that students are physically manipulating the virtual molecules at an extremely small scale.
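Compared with the click-driven anchors above, the mixed-reality version replaces mouse events with sensor readings. A minimal sketch of such a mapping layer, assuming an ideal gas model; the constants, sensor readings and the sim interface are hypothetical.

```python
# Hypothetical mapping from sensor readings to simulation variables.
ATMOSPHERIC_PRESSURE = 101325.0   # Pa
PISTON_AREA = 1.0e-3              # m^2, cross-section of the virtual piston
GAS_CONSTANT = 8.314              # J/(mol*K)
MOLES = 0.01                      # amount of simulated gas (illustrative)

def piston_volume(force_newtons, temperature_kelvin):
    """Quasi-static ideal gas: the gas compresses until its pressure balances
    atmospheric pressure plus the pressure of the student's push."""
    external_pressure = ATMOSPHERIC_PRESSURE + force_newtons / PISTON_AREA
    return MOLES * GAS_CONSTANT * temperature_kelvin / external_pressure

def update_simulation(force_reading, temperature_reading, sim):
    sim.set_temperature(temperature_reading)   # thermal sense drives heating
    sim.set_volume(piston_volume(force_reading, temperature_reading))  # push drives compression
```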
Students can even “feel” interatomic forces using mixed reality. Figure 3 shows an activity that connects the sense of force with a visualization of interatomic interaction. Students can investigate the interatomic force as a function of the distance between two atoms. They will find that once the atoms start to overlap, the atoms cannot be pushed any closer no matter how hard the student pushes the spring, and that the attractive force is greatest at a certain separation but quickly diminishes as the atoms move farther apart. In this way, students will discover the van der Waals force. This activity can be extended to teach other atomic-scale interactions, such as ionic bonds, covalent bonds, hydrogen bonds and protein-ligand docking.
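The force-versus-distance behavior described above is commonly modeled with the Lennard-Jones potential. The article does not name the underlying model, so the following sketch should be read as one plausible choice rather than the software’s actual physics engine.

```python
# Lennard-Jones potential: V(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6).
# The force is F(r) = -dV/dr: strongly repulsive when the atoms overlap,
# attractive at intermediate separations, and vanishing at large distances.

def lennard_jones_force(r, epsilon=1.0, sigma=1.0):
    """Signed magnitude of the interatomic force; positive means repulsive."""
    sr6 = (sigma / r) ** 6
    return 24 * epsilon * (2 * sr6 * sr6 - sr6) / r

# The force crosses zero at the equilibrium separation r = 2**(1/6) * sigma,
# and the attraction is strongest near r = (26/7)**(1/6) * sigma (about 1.24 * sigma).
```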
Collaborating on inquiry
Mixed-reality labs can use multiple sensors to activate and enhance multimodal perception in science simulations. This allows a group of students to manipulate a simulation jointly through multiple inputs. For example, one student exerts force to compress or decompress a virtual gas while another uses a hot or cold object to change its temperature. Together, they investigate Gay-Lussac’s law (at constant volume, the pressure of a gas is proportional to its absolute temperature). Or imagine two students, each controlling the temperature or the number of molecules of a virtual gas in one of two compartments separated by a movable piston. They would discover Charles’s law (at constant pressure, the volume of a gas is proportional to its absolute temperature) or Avogadro’s law (at constant temperature and pressure, the volume of a gas is proportional to the number of molecules). This kind of mixed reality enables students to physically play the roles of different variables in a science concept and collaboratively learn how those variables are related.
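One way to see how a single simulation can host all three investigations is that each law is a limiting case of the ideal gas law PV = nRT. A minimal sketch follows; it is illustrative, not the project’s code.

```python
R = 8.314  # gas constant, J/(mol*K)

def volume(n_moles, temperature_k, pressure_pa):
    """Ideal gas: V = nRT/P. Holding P constant recovers Charles's law (V ~ T);
    holding T and P constant recovers Avogadro's law (V ~ n); holding V constant
    and solving for P recovers Gay-Lussac's law (P ~ T)."""
    return n_moles * R * temperature_k / pressure_pa
```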
Next-generation educational technology
The Next Generation Science Standards (NGSS) emphasize the use of computer simulations across the sciences. Uniting student actions in the real world with the reactions of simulated molecules, mixed reality provides an unprecedented way to interact with science simulations. It represents an important direction for next-generation educational technology that promises to support the NGSS.
Charles Xie (qxie@concord.org) directs the Mixed-Reality Labs project.
This material is based upon work supported by the National Science Foundation under grant IIS-1124281. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.