Innovator Interview: Jie Chao


Q. When you tell other people what you do, what do you say?

A. I’m trying to figure out what learning will look like in the future and how learning can shape that future. I’m interested in getting students to do authentic, relevant projects. I use learning analytics and data mining to help me understand student learning.

Q. How did you come to educational technology?

A. I’m very proud of my university, but education in China is really frustrating. I majored in chemistry but didn’t get into labs until my final year. That was too late. I had skipped so many lectures to go mountain climbing. I stumbled into an educational services company after graduation and became interested in the science of learning and teaching. I applied to the University of Virginia Instructional Technology program and have been passionate about learning sciences and pedagogy ever since.

Q. How does your background play into your philosophy about education?

A. I struggle with the contrast between American and Chinese education. One is very liberal, with little emphasis on facts; the other puts too much emphasis on facts and not enough on active learning. I lean towards the active learning camp, because active learning is a more powerful way to incorporate new knowledge, though it brings challenges for educators because every learner is different. That's where computers come in: we can build a big sandbox where everyone can learn on personalized and productive tracks.

Q. What’s been interesting about the Mixed-Reality Labs project?

A. Mixed-Reality integrates the power of computer simulations with sensors to enhance science learning. Simulations are effective learning tools in many ways, but they cannot replicate many of the unique affordances of labs with physical materials. Kids like to touch things. When you experience reality, you're not speaking with the software creator; philosophically, you're speaking with God. It's right there, but it's mysterious. We wanted to marry these two. We use sensors to take data from physical labs and drive simulations in real time. We also use sensors to generate direct effects on simulations. Finally, we use infrared cameras to look at reality through IR imaging.
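To make the sensor-to-simulation coupling concrete, here is a minimal sketch of the general idea, not the Mixed-Reality Labs code itself: a hypothetical read_sensor() stands in for a real probe driver, and its readings continuously set the boundary condition of a toy heat-flow simulation.

```python
import time
import random

def read_sensor():
    """Hypothetical stand-in for a real temperature probe driver."""
    return 20.0 + random.uniform(-0.5, 0.5)  # degrees Celsius

def step_simulation(sim_temp, boundary_temp, dt=0.1, k=0.05):
    """One step of a toy heat-exchange model: the simulated object
    relaxes toward the boundary temperature set by the sensor."""
    return sim_temp + k * (boundary_temp - sim_temp) * dt

sim_temp = 25.0
for _ in range(100):              # run for roughly ten seconds
    boundary = read_sensor()      # the physical lab drives the simulation
    sim_temp = step_simulation(sim_temp, boundary)
    print(f"sensor={boundary:.2f} C  simulated={sim_temp:.2f} C")
    time.sleep(0.1)
```

The point of the sketch is only the coupling pattern: each pass through the loop pulls a fresh reading from the physical world and folds it into the simulated one.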

An IR camera is a great tool to support inquiry. When you see the moon with a telescope, you ask about the dark spots. If you don't see them, you never ask the questions. Similarly, IR adds sensing abilities, making it natural to ask questions. It's easy to imagine bringing IR imaging into augmented reality devices like wearable glasses. Students could then do experiments and see physical reality with six or seven senses!

Q. How do you use learning analytics?

A. Our Energy3D data is so rich, it almost replicates the classroom, though I’m drowning in data. The ability to collect data at such a fine grain size is like having a new sense for asking questions and looking for patterns. Currently, we’re looking at three high-level design categories—construction of prototypes, analysis of student design, and reflection on the design process. We’ve used cluster analysis to explore different types of effort allocation in these three design activities. Similar techniques are also used for analyzing profiles of explored design space, analytic space, and design episodes. We’ll apply sequence-matching techniques to enable machine recognition of design behaviors. There’s a lot to explore.
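As an illustration of the kind of cluster analysis described here, the sketch below, which is not the project's actual pipeline, groups hypothetical students by the fraction of logged time they spend on construction, analysis, and reflection, using scikit-learn's KMeans. The effort numbers are invented.

```python
import numpy as np
from sklearn.cluster import KMeans

# Each row: fraction of a student's logged time spent on
# [construction, analysis, reflection]. Values are made up for illustration.
effort = np.array([
    [0.80, 0.15, 0.05],
    [0.75, 0.20, 0.05],
    [0.40, 0.45, 0.15],
    [0.35, 0.50, 0.15],
    [0.30, 0.30, 0.40],
    [0.25, 0.35, 0.40],
])

# Group students by how they allocate effort across the three design activities.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(effort)

for label, centroid in enumerate(kmeans.cluster_centers_):
    members = np.where(kmeans.labels_ == label)[0]
    print(f"cluster {label}: students {members.tolist()}, "
          f"construction/analysis/reflection centroid = {np.round(centroid, 2)}")
```

Each resulting cluster centroid describes a distinct effort-allocation profile, which is the sort of pattern the fine-grained Energy3D logs make it possible to look for.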

Q. Tell us about your first year at the Concord Consortium.

A. I’ve loved it! I’ve learned so much from everyone, especially Charles [Xie]. He is such a visionary thinker and hands-on doer. And our research forum gives me a great window to see what everyone is doing. I’m excited about all the potential for collaboration here.