Students should learn science by doing science. They should select their own question, design and execute a study, draw conclusions based on their data, and communicate their findings. InquirySpace has demonstrated that typical students can learn to use an integrated set of computer-based tools to undertake sophisticated, open-ended investigations similar to the approach and thinking used by real scientists.
Science should be learned the way scientists learn it, through inquiry-based learning or "extended inquiry." But teachers who take this advice seriously face a major problem: finding projects that are feasible and interesting for students. Projects cannot be too complex or too long; they cannot demand expensive equipment or require unusual skills. And projects must address the course learning goals. These constraints limit the range of feasible student projects. InquirySpace gives students tools, guidance, and ideas that greatly expand the range and sophistication of meaningful open-ended science investigations.
The ability to create and run collaborative inquiry activities can make inquiry far more effective and widespread in introductory science instruction. InquirySpace can be used across many grades and in diverse schools: not only in colleges and high-performing schools, but also in under-resourced schools where students often perform below grade level and for whom text-based instruction is decontextualized and difficult.
InquirySpace uses three proven technologies—the versatile modeling environments of NetLogo and the Molecular Workbench, real-time data collection from probes and sensors, and the powerful visual data exploration capabilities of CODAP. These tools are integrated into a coherent, Web-based environment enabling rich, collaborative scientific inquiry.
All InquirySpace curriculum materials are available for free on the Learn portal.
The IS project identified and theoretically characterized Parameter Space Reasoning (PSR) as a powerful and general approach to inquiry-based experimentation. Parameter space is multi-dimensional and defined by outcomes and input parameters students can manipulate in experiments in both physical and simulation setups. The PSR construct was used to develop a test consisting of 28 items in four investigation contexts with varying degrees of transfer. Two investigation contexts addressed identical experimentation situations involving pendulum and spring/mass systems (near transfer); one addressed a medium transfer context involving distance and time graphs of a long distance running situation; the other addressed a far transfer situation related to multivariate relationships in everyday gardening. The test also addressed five PSR reasoning types:
- PSR1. Describe a scientific phenomenon using a time series graph.
- PSR2. Set, measure or obtain an outcome variable from a time series graph.
- PSR3. Make a parameter space graph using data from a time series graph.
- PSR4. Identify patterns represented in a parameter space graph and explain the patterns.
- PSR5. Treat, identify or explain outliers or errors in a primary or a parameter space graph.
The Parameter Space Reasoning approach raises a number of questions. Can students understand all the parts of PSR: the two kinds of graphs, the difference between parameters and outcomes, the relationship between points in parameter space and a time-series? Can they master all the reasoning steps in PSR? Can students coordinate this approach to produce a coherent explanation of the system being investigated? Can they use PSR to undertake their own investigations? To answer these questions, the project undertook three research studies.
The PSR Pre-Post Assessment
To determine whether students were able to understand and apply PSR, we included five aspects of PSR in pre- and post-tests (Lee, 2014). We looked for gains in five tasks: understanding time-series graphs, obtaining an outcome, making a parameter-space graph, explaining the patterns in a particular graph, and explaining unusual features. Gains in these five elements were measured with the 28-item assessment described above, embedded in four investigation contexts of varying degrees of transfer. Rasch analysis confirmed the assessment's reliability and other psychometric properties.
The PSR assessment was administered to a total of 231 students enrolled in three different high school physics courses. The assessment was used as a pre-test (before using any IS materials) and a post-test (as the last IS activity). Overall, students scored significantly better on the post-test, indicating that they did increase their understanding of PSR and its application to near and medium transfer contexts, but not to far transfer. There were significant differences in the gains by students in different courses. The net gains observed in this initial research were not as large as expected, suggesting that improvements in the assessment, technology, teacher preparation, class time, and materials might result in larger gains in PSR.
Bayesian Knowledge Tracing Analysis of Game Performance
The pre-post PSR assessment indicates only that students' PSR scores increased, not why. To gain additional detail about student progress in understanding the relationships in a mechanical system, the project undertook a detailed analysis of student learning patterns in the Ramp Game, which students played just after the pre-test.
The Ramp Game is played by groups of two to four students and consists of five levels (with several steps at each level) in order of increasing difficulty. To study patterns of learning, the software recorded every parameter change students made as well as students' scores at each level. We analyzed student progress across 447 game levels produced by 64 student groups in two physics classrooms using a computational algorithm called Bayesian Knowledge Tracing (BKT). We improved conventional BKT algorithms substantially, increasing the speed of BKT calculations by a factor of at least 10,000 and, more importantly, increasing their precision. We named this advanced version Monte Carlo BKT, or MC-BKT (Gweon et al., 2015a; 2015b).
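To make the method concrete, here is a minimal sketch of the standard BKT update and of the Monte Carlo idea of fitting parameters by random sampling. This is a generic illustration, not the project's MC-BKT implementation; the function names, the four-parameter model, and the simple maximum-likelihood sampler are expository assumptions.

```python
import random

def bkt_update(p_known, correct, p_slip, p_guess, p_transit):
    """One BKT step: Bayesian posterior over the hidden 'skill known'
    state given a single right/wrong response, then a learning transition."""
    if correct:
        evidence_known = p_known * (1 - p_slip)
        evidence_unknown = (1 - p_known) * p_guess
    else:
        evidence_known = p_known * p_slip
        evidence_unknown = (1 - p_known) * (1 - p_guess)
    posterior = evidence_known / (evidence_known + evidence_unknown)
    # The student may learn the skill after any attempt.
    return posterior + (1 - posterior) * p_transit

def sequence_likelihood(responses, p_init, p_slip, p_guess, p_transit):
    """Probability of an observed right/wrong sequence under one parameter set."""
    p_known, likelihood = p_init, 1.0
    for correct in responses:
        p_correct = p_known * (1 - p_slip) + (1 - p_known) * p_guess
        likelihood *= p_correct if correct else (1 - p_correct)
        p_known = bkt_update(p_known, correct, p_slip, p_guess, p_transit)
    return likelihood

def monte_carlo_fit(responses, n_samples=5000, seed=0):
    """Fit the four BKT parameters by drawing random points in parameter
    space and keeping the maximum-likelihood draw (slip and guess are
    bounded below 0.5, a standard BKT identifiability constraint)."""
    rng = random.Random(seed)
    best_params, best_like = None, -1.0
    for _ in range(n_samples):
        params = (rng.random(),          # p_init
                  rng.random() * 0.5,    # p_slip
                  rng.random() * 0.5,    # p_guess
                  rng.random())          # p_transit
        like = sequence_likelihood(responses, *params)
        if like > best_like:
            best_params, best_like = params, like
    return best_params
```

Random sampling like this trades exhaustive grid search for speed and can be tuned for precision by drawing more samples, which is the general trade-off the MC-BKT work exploits.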
Using MC-BKT, we were able to identify seven distinct learning patterns. As expected, one pattern showed no improvement in scores, and another showed quick mastery of the knowledge associated with a game level. More interesting were the five patterns showing students' struggles followed by successful learning, at different rates and with varying degrees of success (Lee et al., 2015; Pallant, Lee, & Kimball, 2015).
Observational Analysis of Screencasts
To gain additional insights into student thinking, the project created screencasts of some student groups as they worked with the IS tools. This research used the same screencast technology students used for their reports, but the research recordings were left running for the entire class period.
For the Ramp Game, screencasts provided an independent way to identify student learning patterns, allowing us to determine whether the patterns identified by MC-BKT analysis matched those identified by experienced educational researchers (Lee et al., 2015). We collected screencasts from 21 of the 64 student groups who played the Ramp Game, representing 32% of the data analyzed with MC-BKT. Using a structured analysis of these screencasts, we were able to identify all but one MC-BKT cluster, and we observed three sub-clusters within the remaining cluster. There was 84% agreement between the clusters identified through video analysis and those identified through MC-BKT. The analysis also told us that students learned more about the physics used in the game than about PSR. We also analyzed the screencasts of the spring/mass investigation and observed the following themes (Stephens & Pallant, 2015):
- Students responded strongly to graphical anomalies.
- Students coordinated data representations and features to make sense of puzzling data.
- Even disengaged students frequently reasoned about patterns in their data, about what constituted acceptable variation, and about what data should be rejected or ignored.
- Students moved from viewing graphs as tasks they had to perform to viewing them as tools for understanding an experiment or a concept.
While it is impressive that students were able to coordinate all the technology and techniques to investigate the systems, we were interested in whether students could apply the PSR approach to questions of their own. Teachers were willing to devote only two or three class periods to independent projects, so these projects were quite limited. Half of the students selected one of the systems they had already explored and successfully investigated the effect of a new parameter on the outcome. The other half posed their own questions, such as finding the force applied to a roller coaster of a given design. One group attempted a more ambitious system: they built a long ramp in the hallway and tried to fit a parachute to the back of a toy car so that it would reach terminal speed. Unfortunately, there was insufficient time and resources to construct a functional ramp-car-parachute system, and they were unable to obtain any data.
This project has demonstrated that typical students can learn to use an integrated set of computer-based tools to undertake sophisticated, open-ended investigations that are similar to the approach and thinking used by scientists. This allows students to experience the practices of science as envisioned in the NGSS. Although the project focused on secondary physics and physical science content, it should be widely applicable. All science disciplines depend on investigating cause and effect, which is essentially the impact of parameters on outcomes, or PSR.
The goal of IS was to test a unique approach to scientific exploration before investigating its applicability to diverse students. For this reason, we have no data on student diversity. However, one research site was a public charter middle school that randomly selected inner-city students. It was clear from observing these students that many had had very limited prior science lab experience and were under-performing in language skills and algebra. Nevertheless, most were able to perform the experiments and generate reasonable screencast reports. Although further research is needed, it seems likely that the combination of real-time data acquisition and display, graphical reasoning, and screencast reports, coupled with the avoidance of algebra and writing, will broaden access to authentic scientific explorations.
Teacher uptake of inquiry activities is hindered by the pressure to expose students to the maximum amount of content. We have found a way to combine this emphasis on "doing science" with instruction in disciplinary core ideas by having students investigate systems that illustrate relevant science content. This is essential for wide adoption of open-ended, project-based teaching.
Other innovations of the project that will have broad impact include the creation of NetLogo Web and the MC-BKT algorithm. NetLogo Web is important because tens of thousands of NetLogo users can now, for the first time, create models that run in any browser. The MC-BKT algorithm we developed is likely to have a broad impact on educational research because it is faster and more accurate than other BKT algorithms; for the first time, it may be feasible to run this analysis in real time as a student plays, so that automated guidance based on MC-BKT could scaffold student learning. The project's video analysis strongly suggests that BKT was detecting meaningful learning patterns. In the future, MC-BKT could sort students by their learning patterns in any educational game and provide real-time assistance based on these patterns.
The IS Learning Progression
The project has developed a sequence of increasingly open-ended computer-based activities using games, sensors, and computational models to develop student investigative skills. To simplify their integration into traditional course instruction, the activities are suitable alternatives to standard treatments of common physical science content such as motion, oscillation, and friction.
The project has developed a way of analyzing data that provides a template students can use with many kinds of investigations. The basic idea is that a scientific explanation of a system is often stated in terms of how one or more variables affect some outcome. For instance, in one experiment, students explore a spring-mass system using force and motion detectors. A cup is supported by a light spring, various masses are added to the cup, the cup is released, and its vertical oscillations are recorded. The challenge is to investigate the effect of the mass and spring constant (the independent variables) on the period of oscillation of the system (the dependent variable). A distance sensor on the floor generates a time series (students learn to call this a "run"), which can be graphed as the height of the mass above the floor as a function of time. Then one parameter—say, mass—can be changed systematically and new runs are made that each generate a different time series from which the period (the dependent variable in this case) can be calculated.
After exploring several different values of the mass and observing different periods, students can use CODAP to graph the period as a function of the mass. This graph appears as a series of points, each representing the results of one run. Students can use additional graphs to explore the influence of the other independent variables. A description of these graphs can provide a complete explanation of the effect of each independent variable.
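The two layers of data described above can be made concrete with a short Python sketch, not the IS software itself (which uses real sensors and CODAP): each run produces a time series, and each time series is reduced to a single point in parameter space. The ideal-spring model, the particular parameter values, and the zero-crossing period extraction are illustrative assumptions.

```python
import math

def simulated_run(mass, k=25.0, amp=0.05, dt=0.001, duration=2.0):
    """One 'run': the time series a motion sensor might record for a cup
    of the given mass (kg) bobbing on an ideal spring (stiffness k, N/m).
    Returns (time, displacement) pairs."""
    omega = math.sqrt(k / mass)  # angular frequency, rad/s
    steps = int(duration / dt)
    return [(i * dt, amp * math.cos(omega * i * dt)) for i in range(steps)]

def extract_period(series):
    """Collapse a whole time series to a single outcome variable: the
    period, averaged over successive downward zero crossings."""
    crossings = []
    for (t1, h1), (t2, h2) in zip(series, series[1:]):
        if h1 > 0 >= h2:
            crossings.append(t2)
    gaps = [b - a for a, b in zip(crossings, crossings[1:])]
    return sum(gaps) / len(gaps)

# Sweep one parameter (mass) while holding the rest fixed; each run
# contributes a single (parameter, outcome) point in parameter space.
parameter_space = [(m, extract_period(simulated_run(m)))
                   for m in (0.1, 0.2, 0.4, 0.8)]
```

Plotting `parameter_space` (mass on the horizontal axis, period on the vertical) gives exactly the kind of parameter-space graph students build in CODAP, with each point summarizing one complete run.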
We have named the type of cognition necessary for students to engage successfully in this form of inquiry-based experimentation Parameter Space Reasoning (PSR). PSR is associated with planning experiments, operationalizing a set of parameters, navigating the parameter space through multiple experimental runs, identifying patterns in parameter space plots, and reflecting on sources of error (Lee et al., 2014).
The project relies heavily on having students read and interpret graphs accurately. Based on our observations of student weaknesses in this area, we developed a Ramp Game in NetLogo Web to introduce the IS approach (See Figure 1). To succeed in the game, it is essential for the player to read graphs and engage in PSR in order to move through five progressively more difficult challenges. Following this game, students use the same data graphing tools to investigate real and virtual spring-mass systems, and a parachute simulation (Figure 2). After these explorations, students are encouraged to undertake investigations to answer their own questions.
Figure 2: The Parachute Simulation. Data are generated by dropping the simulated parachute with different masses and sizes. Time-series of distance traveled and velocity are generated as the parachute drops. Runs are then displayed by CODAP as a table (upper right) and parameter-space graph (lower right). In this example, the user has attempted to match the final data to a straight line, but this is a poor match, indicating that another relationship is needed.
PSR is useful in many experiments—both lab based and simulated—wherever the experiments generate time-series datasets that depend on one or more independent variables. To enable students to undertake their own hands-on investigations, the project developed two activities that can accept data from one or two probes using the Sensor Connector, and pass those data to CODAP. To support projects based on student-generated models, we also have made it possible to connect any NetLogo Web model to CODAP. Any programmer who can use NetLogo and HTML can link a NetLogo Web model into CODAP (Finzer & Tinker, 2015). We plan to simplify this process in the future.
Articles and Papers
Finzer, W. (2014). Hierarchical data visualization as a tool for developing student understanding of variation of data generated in simulations. In Proceedings of the ninth international conference on teaching statistics (Vol. 6). Voorburg: International Statistics Institute.
Finzer, W., & Tinker, R. (2015). Under the hood: Embedding a simulation in CODAP. @Concord, 19(1), 14.
Gweon, G.-H., Lee, H.-S., Dorsey, C., Tinker, R., Finzer, W., & Damelin, D. (2015a). Tracking student progress in a game-like learning environment with a constrained Bayesian Knowledge Tracing model. In Proceedings of the Learning Analytics & Knowledge Conference 2015. Poughkeepsie, NY.
Gweon, G.-H., Lee, H.-S., Dorsey, C., Tinker, R., Finzer, W., & Damelin, D. (2015b). Tracking student progress in a game-like learning environment with a Monte Carlo Bayesian Knowledge Tracing model. Paper presented at the annual meeting of the American Physical Society, San Antonio, TX.
Hazzard, E. (2014). A new take on student lab reports. The Science Teacher, March 2014. (Posted with permission of The Science Teacher.)
Lee, H.-S., Gweon, G.-H., Dorsey, C., Tinker, R., Finzer, W., Damelin, D., Kimball, N., Pallant, A., & Lord, T. (2015). How does Bayesian Knowledge Tracing model student development of knowledge about a simple physical system? In Proceedings of the Learning Analytics & Knowledge Conference 2015. Poughkeepsie, NY.
Lee, H.-S., Pallant, A., Tinker, R., & Horwitz, P. (2014). High school students' parameter space navigation and reasoning during simulation-based experimentation. In Proceedings of the Eleventh International Conference of the Learning Sciences (ICLS 2014) (p. 681). Boulder, CO: International Society of the Learning Sciences.
Pallant, A., Lee, H.-S., & Kimball, N. (2015). Analytics and student learning: An example from InquirySpace. @Concord, 19(1), 8-9.
Stephens, A.L. and Pallant, A. (2015). From Graphs as Task to Graphs as Tool: Scaffolded Data Analysis. Manuscript submitted for publication.
Tinker, R. (2015). InquirySpace: A place for doing science.
Tinker, R., & Hazzard, E. (2012). InquirySpace: A space for real science. @Concord, 16(2), 8-9.