[Photo: Students using the Seismic Explorer]

Making Uncertainty Accessible to Science Students


Theoretical physicist Werner Heisenberg once proclaimed, “What we observe is not nature itself, but nature exposed to our method of questioning.” The goal of science is to develop a fundamental understanding of natural phenomena, but the road to understanding is neither straightforward nor simple. Nature is complex, and it is this complexity that both excites scientists and limits their explanations.

Uncertainty cannot be avoided in scientific research. Indeed, it is an essential part of doing science. Uncertainty comes from limitations in the theories and methods scientists apply, including what they already know, what instruments they use to collect data, how they sample data, and what analyses they use to uncover mechanisms among identified variables. Recognizing uncertainty in science means understanding how scientific knowledge develops over time. Uncertainty invites productive, critical reflection on what can and cannot be explained. Working with uncertainty requires minimizing known errors while making room for the potentially unknown.

However, uncertainty is rarely introduced in science classrooms, for fear of turning students into science doubters. For example, A Framework for K-12 Science Education seems to recommend avoiding the discussion of uncertainty in science: “although science involves many areas of uncertainty as knowledge is developed, there are now many aspects of scientific knowledge that are so well established as to be unquestioned foundations of the culture and its technologies.”1 Nonetheless, over the past decade we have made important contributions to research on the role and nature of uncertainty. The overarching goal of the National Science Foundation-funded projects featured in this issue is to research the role of uncertainty in the study of Earth science.

Eliciting uncertainty in scientific argumentation

To elicit student ideas about uncertainty, we created uncertainty-infused scientific argument writing tasks in the High-Adventure Science modules. After students investigated data from scientists or from computational models, they were asked to make a claim, explain their reasoning based on the data to justify the claim, select their level of uncertainty from 1 (not at all certain) to 5 (very certain), and attribute sources of uncertainty. We validated these scientific argumentation tasks along with their rubrics.2 Analysis of pre- and post-tests from roughly 6,300 students taught by 132 teachers showed significant improvements in writing scientific arguments with uncertainty after students completed the modules (effect sizes ranging from 0.35 to 0.54 standard deviations).
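To make the structure of these tasks concrete, here is a minimal sketch that models the four components of a response (claim, explanation, certainty rating, and attribution) as a simple data record. The class name, field names, and example text are illustrative assumptions, not the project's actual data format.

```python
from dataclasses import dataclass

@dataclass
class UncertaintyInfusedArgument:
    """One student response to an uncertainty-infused argumentation task (illustrative)."""
    claim: str                    # the student's answer to the science question
    explanation: str              # reasoning that ties the claim back to the data
    certainty_rating: int         # self-reported certainty, 1 (not at all) to 5 (very certain)
    uncertainty_attribution: str  # where the student says the uncertainty comes from

    def __post_init__(self):
        if not 1 <= self.certainty_rating <= 5:
            raise ValueError("certainty rating must be on the 1-5 scale")

# A hypothetical response to a water sustainability prompt
response = UncertaintyInfusedArgument(
    claim="Groundwater levels will keep falling if pumping continues at the current rate.",
    explanation="Every model run we tried showed the water table dropping over the simulated decades.",
    certainty_rating=3,
    uncertainty_attribution="The model simplifies recharge, so real aquifers may behave differently.",
)
```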

Characterizing a taxonomy of student uncertainty attribution

Based on our analysis of students’ uncertainty-infused scientific arguments, we constructed a taxonomy representing five distinct ways students attribute sources of uncertainty:

1) Students include no information about uncertainty attribution.

2) Students express personal uncertainty attribution statements.

3) Students use words like “data,” “reasoning,” or “knowledge” without citing any specific details.

4) Students include scientific descriptions of the theoretical basis or empirical findings associated with the investigation.

5) Students elaborate theoretical, empirical, measurement-related, and analytical limitations associated with the investigation.

This taxonomy can be used by teachers to engage students in productive discourse about the scientific uncertainty involved in their investigations and about the nature of science.
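To illustrate how responses might be coded against this taxonomy, the sketch below assigns each level an ordinal value. The enum names, comments, and helper function are hypothetical coding conveniences, not the published rubric.

```python
from enum import IntEnum

class UncertaintyAttributionLevel(IntEnum):
    """Illustrative ordinal coding of the five-level taxonomy described above."""
    NO_ATTRIBUTION = 1          # no information about uncertainty attribution
    PERSONAL = 2                # personal statements about the student's own confidence
    GENERIC_SOURCE = 3          # names "data," "reasoning," or "knowledge" without specifics
    SCIENTIFIC_DESCRIPTION = 4  # cites the theoretical basis or empirical findings of the investigation
    ELABORATED_LIMITATIONS = 5  # elaborates theoretical, empirical, measurement, and analytical limits

def attributes_uncertainty_to_the_science(level: UncertaintyAttributionLevel) -> bool:
    """Levels 4 and 5 ground uncertainty in the investigation itself rather than in the student."""
    return level >= UncertaintyAttributionLevel.SCIENTIFIC_DESCRIPTION
```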

Supporting uncertainty attribution through automated feedback

After years of honing and validating the scientific argumentation tasks and rubrics, we developed automated scoring models to evaluate students’ performances and to identify types of feedback students would need to improve their performance. We engineered an automated scoring and feedback system called HASBot and embedded it within the uncertainty-infused scientific argumentation tasks in two curriculum modules on climate change and water sustainability. Students submitted their initial scientific arguments and received an automated score and feedback in real time with HASBot. Eighteen teachers from 11 states implemented the two modules. Our findings show that students wrote significantly better scientific arguments after the climate change module (a 0.85 standard deviation increase) and the water sustainability module (a 1.52 standard deviation increase).3
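The sketch below shows the general shape of such a score-then-feedback loop: a scoring model assigns an attribution level, and the system returns a feedback message keyed to that level. The keyword heuristic stands in for the trained scoring model, and the feedback strings are invented for illustration; neither is HASBot's actual model or message text.

```python
# Illustrative feedback messages keyed to the five attribution levels (not HASBot's actual text).
FEEDBACK_BY_LEVEL = {
    1: "Tell us how certain you are about your claim and what makes you certain or uncertain.",
    2: "Try to connect your certainty to the investigation itself, not only to how you feel.",
    3: "You mentioned the data in general. Which specific data made you more or less certain?",
    4: "Good use of the science. Are there limits to the data or the model you could name?",
    5: "Strong attribution. Could anything still unknown change your claim?",
}

def score_attribution(argument_text: str) -> int:
    """Crude keyword stand-in for a trained automated scoring model (returns a level from 1 to 5)."""
    text = argument_text.lower()
    if "limitation" in text or "measurement" in text:
        return 5
    if "model" in text or "evidence" in text:
        return 4
    if "data" in text or "reasoning" in text:
        return 3
    if "i think" in text or "not sure" in text:
        return 2
    return 1

def feedback_for(argument_text: str) -> str:
    """Return real-time feedback matched to the machine-assigned attribution level."""
    return FEEDBACK_BY_LEVEL[score_attribution(argument_text)]

print(feedback_for("I am uncertain because the data only cover a few decades."))
```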

Examining uncertainty arising from simulation models

We are currently exploring different ways to support students’ consideration of uncertainty and elicit their thinking about it. In the GEODE project, for example, students make model-based claims to explain real-world evidence in the context of plate tectonics. They are scaffolded to examine the limitations in applying knowledge gained from Tectonic Explorer models to the real-world seismic and eruption data visualized in Seismic Explorer. We are investigating how students address sources of uncertainty while connecting model-based understanding to real-world data.

Characterizing risks associated with natural hazards based on uncertainty

A new area of our research is the study of natural hazards, which allows us to explore the uncertainty involved in risk assessment. To make predictions, scientists identify patterns in historical data and interpret them based on their understanding of how natural phenomena, such as hurricanes and wildfires, behave. Four types of uncertainty come into play when considering hazards and potential risks: 1) measurement uncertainty in data collection, 2) modeling uncertainty of complex systems, 3) temporal uncertainty due to difficulties in recounting past events and predicting future events, and 4) transitional uncertainty in making sense of and communicating about uncertain results. We are exploring how students think about and work with uncertainty during their investigations of risk, focusing in particular on system uncertainty (i.e., complex systems cannot be modeled exactly) and prediction uncertainty (i.e., future events cannot be predicted precisely).

Estimating uncertainty through Monte Carlo simulation

We are also exploring how students make sense of risk when it’s represented as the probability of experiencing a negative impact due to a natural hazard for a given location. Students learn how volcanologists estimate the probabilities of hazardous events (e.g., the collapse of a building), and they are guided to program computational models to represent variables and their relationships. For instance, students model how tephra disperses after volcanic eruptions. Like scientists, students estimate the risk for a particular negative impact to a community living near a volcano. We are currently exploring how students interpret the uncertainty as represented in the output of a Monte Carlo simulation.
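To give a feel for what such an estimate looks like, the following sketch repeatedly samples uncertain inputs (eruption size, wind, and column height) and counts how often a tephra-load threshold for roof collapse is exceeded. The formula, parameter values, and threshold are illustrative placeholders under simplified assumptions, not a published tephra-dispersal model or the project's actual simulation.

```python
import random

def simulate_tephra_load(eruption_volume_km3: float, distance_km: float) -> float:
    """One Monte Carlo draw of tephra load (kg/m^2) at a town a given distance from the vent.

    The fall-off with distance and the random factors are illustrative stand-ins:
    load grows with eruption size and shrinks with distance, while lognormal and
    uniform noise represent unknown wind conditions and eruption column height.
    """
    wind_factor = random.lognormvariate(0, 0.5)   # unknown wind speed and direction
    column_factor = random.uniform(0.5, 1.5)      # uncertain eruption column height
    base_load = 20000 * eruption_volume_km3 / (distance_km ** 2)
    return base_load * wind_factor * column_factor

def estimate_collapse_risk(trials: int = 100_000, collapse_threshold: float = 200.0) -> float:
    """Estimate P(roof collapse) as the fraction of trials whose load exceeds the threshold."""
    collapses = 0
    for _ in range(trials):
        volume = random.uniform(0.1, 1.0)         # the size of the next eruption is itself uncertain
        load = simulate_tephra_load(volume, distance_km=10.0)
        if load >= collapse_threshold:
            collapses += 1
    return collapses / trials

if __name__ == "__main__":
    # Repeating the estimate shows that the probability itself carries uncertainty.
    for run in range(3):
        print(f"Run {run + 1}: estimated collapse risk = {estimate_collapse_risk():.3f}")
```

Because each run of the estimate comes out slightly different, a sketch like this also makes visible the prediction uncertainty students are asked to interpret in the simulation output.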

Our research on uncertainty has evolved. Each new project brings its own challenges in designing curriculum and assessment materials so students can think about the uncertainty embedded in science. Throughout, we hope to teach students how to weigh evidence, what it means when an Earth scientist talks about uncertainty in data, how models are critical tools for understanding Earth phenomena even though there are limitations to using them as evidence for scientific claims, and how estimating probabilities of risk and impacts is critical even in the face of uncertainty. We hope that students learn that science includes both curiosity and uncertainty, and that it is possible to refine our understanding of the natural world while at the same time embracing uncertainty as an important feature of the scientific endeavor.

Hee-Sun Lee (hlee@concord.org) is a senior research scientist.

1. National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas, p. 44. Washington, DC: National Academies Press.

2. Lee, H.‐S., Liu, O. L., Pallant, A., Roohr, K. C., Pryputniewicz, S., & Buck, Z. E. (2014). Assessment of uncertainty‐infused scientific argumentation. Journal of Research in Science Teaching, 51(5), 581-605.

3. Lee, H.-S., Pallant, A., Pryputniewicz, S., Lord, T., Mulholland, M., & Liu, O. L. (2019). Automated text scoring and real-time adjustable feedback: Supporting revision of scientific arguments involving uncertainty. Science Education, 103(3), 590-622.

This material is based upon work supported by the National Science Foundation under grant DRL-1812362. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.