Mixed-Reality Labs

Hands-on learning can, and arguably should, be supplemented with computer simulations, which allow students to see the invisible. This project explores innovative technologies that connect the real and virtual worlds to accelerate and deepen student learning. We're integrating sensors and simulations to develop powerful mixed-reality environments that support scientific inquiry and engineering design.

Hands-on labs provide rich context and multi-sensory experiences, but often fail to reveal the underlying concepts clearly. Virtual labs help focus student attention on the concepts through visual, interactive simulations, but often lack a sense of reality. Combining these two types of learning into single "mixed-reality" experiences should capture the advantages of both and deepen learning. We've partnered with the University of Virginia to study mixed-reality labs that integrate sensors and simulations to enhance laboratory experiences in high school chemistry and physics courses.

Many sensor-simulation couplings are possible. For example, a relative humidity sensor, a temperature sensor, or a salinity sensor can be combined with a molecular dynamics simulation to study the effect of temperature or salinity on evaporation rate. An anemometer can be linked to a computational fluid dynamics simulation to investigate fluid flow. A motion detector and a force sensor can be used with a mechanics simulator to study kinematics, statics, or dynamics. An electromagnetic field sensor can be used with an electrodynamics simulator to study many invisible field effects of electromagnetism. We're intrigued by these integrations, and we're building a wide variety of prototypes to test in classrooms.
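These couplings share a common software pattern: sample the sensor, push the value into the simulation, advance the simulation, and redraw. Below is a minimal Python sketch of that loop; `read_sensor` and the `Simulation` class are hypothetical stand-ins for whatever probeware driver and simulation engine a particular lab uses.

```python
import time

POLL_INTERVAL = 0.1  # seconds between sensor readings


def read_sensor() -> float:
    """Placeholder: return the latest reading from a physical probe."""
    raise NotImplementedError("wire this to the actual sensor driver")


class Simulation:
    """Placeholder for a simulation with one parameter tied to the sensor."""

    def set_parameter(self, value: float) -> None: ...
    def step(self) -> None: ...
    def render(self) -> None: ...


def run_coupled(sim: Simulation) -> None:
    # Each cycle samples the real world, pushes the value into the model,
    # advances the model one step, and redraws, so the virtual experiment
    # tracks the physical one in real time.
    while True:
        sim.set_parameter(read_sensor())
        sim.step()
        sim.render()
        time.sleep(POLL_INTERVAL)
```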

We're developing two mixed-reality technologies: the Frame Technology and the Kinect Lab. These applications provide powerful real-time inquiry environments for students to explore the natural world and make connections to fundamental concepts in science.

Principal Investigators

Charles Xie
Edmund Hazzard
Jennifer L. Chiu

Project Inquiries

qxie@concord.org

Project Partners

University of Virginia

This material is based upon work supported by the National Science Foundation under Grant No. IIS-1124281. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Project research is investigating whether virtual and physical experiments, integrated through mixed-reality strategies, can mutually enhance each other and promote deeper conceptual understanding. There have been few comparable studies. There is some research on the effectiveness of probeware and ample studies on the use of simulations of all kinds. A number of studies have looked at the relative effectiveness of virtual and physical labs, and even the benefit of combining distinct virtual and physical labs. But there have been no studies on how virtual and physical experiments can seamlessly work together to transform student learning.

We plan formative research designed to identify important issues for later, more detailed studies.

We will focus on the following research questions:

  1. How do students use mixed-reality activities? What kinds of new opportunities and practices do the mixed-reality labs afford students and teachers? How do these affordances correlate with learning and epistemologies of science?
  2. How can mixed-reality labs promote deep and coherent learning of science content and processes? Are mixed-reality activities more effective than similar activities that do not combine sensors and simulations?
  3. What kinds of teacher and curricular support can best enhance teaching and learning based on mixed-reality labs?

Our research hypothesis is that mixed-reality labs will provide two key elements that afford deeper learning: accelerated inquiry and augmented inquiry. For example, an augmented reality camera based on the high-performance 3D motion sensor built into Microsoft's Kinect controller can track and analyze motions in real time, giving students instantaneous results and reducing the time they spend collecting and analyzing data. The additional information available in an augmented camera view reveals scientific knowledge that would otherwise lie beyond students' perception.
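As one illustration of augmented inquiry, the sketch below shows the kind of real-time analysis such a camera can overlay: estimating velocity and acceleration from time-stamped position samples, like those a Kinect-style depth sensor produces. The sample data here are synthetic (an object in free fall), not measured values.

```python
def finite_differences(samples):
    """Return (t, v, a) triples from (t, x) samples via central differences."""
    results = []
    for i in range(1, len(samples) - 1):
        (t0, x0), (t1, x1), (t2, x2) = samples[i - 1], samples[i], samples[i + 1]
        v = (x2 - x0) / (t2 - t0)  # central-difference velocity at t1
        a = ((x2 - x1) / (t2 - t1) - (x1 - x0) / (t1 - t0)) / ((t2 - t0) / 2)
        results.append((t1, v, a))
    return results


# Synthetic example: an object in free fall, sampled 30 times per second.
samples = [(i / 30, 0.5 * 9.8 * (i / 30) ** 2) for i in range(10)]
for t, v, a in finite_differences(samples):
    print(f"t={t:.3f} s  v={v:.2f} m/s  a={a:.2f} m/s^2")
```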

We're developing and studying two kinds of mixed-reality environments. One group of activities uses data acquired in real time from a physical experiment to control a virtual experiment. The advantage of this coupling is that abstract concepts or invisible processes can be visualized on the computer screen while the physical experiment is under way.

For instance, in a laboratory inquiry into gases, students can measure and plot pressure and temperature in real time as the volume of a gas is changed. At the same time, these values can drive a molecular simulation of the gas that permits students to "see" what is happening at the molecular level.
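A minimal sketch of that coupling follows, where `read_temperature` and `read_pressure` are hypothetical sensor-driver calls and `sim` stands for any molecular simulation with a settable thermostat. The measured temperature drives the molecular view, while the ideal gas law predicts the pressure the sensor should report.

```python
R = 8.314  # gas constant, J/(mol*K)


def predicted_pressure(n_moles: float, volume_m3: float, temp_k: float) -> float:
    """Ideal gas law: P = nRT / V, in pascals."""
    return n_moles * R * temp_k / volume_m3


def couple_step(sim, n_moles, volume_m3, read_temperature, read_pressure):
    """One update cycle: real readings drive the model and test the theory."""
    temp_k = read_temperature()   # from the physical temperature probe
    sim.set_thermostat(temp_k)    # molecular view now tracks the real gas
    sim.step()
    measured = read_pressure()    # from the physical pressure sensor
    expected = predicted_pressure(n_moles, volume_m3, temp_k)
    # Plotting measured against expected lets students connect the
    # macroscopic readings to the molecular collisions on screen.
    return measured, expected
```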

A second integration strategy uses physical and virtual experiments in parallel, challenging the student to match the results measured by the sensors and the results computed by the simulations. The learning potential in this configuration stems from the ability to go back and forth between both worlds, adjusting the virtual experiment to match the physical experiment and then adjusting the physical experiment to test the fidelity of the virtual experiment.
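One simple way to quantify the match between the two worlds is a root-mean-square residual between the measured and simulated time series, which students drive toward zero by adjusting either experiment. A minimal sketch, assuming both series are sampled at the same time points:

```python
import math


def rms_residual(measured, simulated):
    """Root-mean-square difference between measured and simulated series."""
    if len(measured) != len(simulated):
        raise ValueError("series must be sampled at the same time points")
    return math.sqrt(
        sum((m - s) ** 2 for m, s in zip(measured, simulated)) / len(measured)
    )


# Example: a small mismatch between the two worlds yields a small residual.
print(rms_residual([1.0, 2.0, 3.0], [1.1, 1.9, 3.2]))  # about 0.14
```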

All the activities are being developed in collaboration with classroom teachers who teach high school physics and chemistry. Implementations of several activities in eight classrooms in Massachusetts and Virginia will be compared to classes covering similar content that do not use mixed-reality environments.
