SPARKS

Building on the success of the CAPA project, SPARKS produces a sequence of formative assessments based on simulations of electronic circuits and test equipment. We collaborate on this project with the Center for Occupational Research and Development (CORD), Tidewater Community College, and the University of Illinois at Chicago.

In this project, we are developing assessments for students in a 2- or 4-year college-level introductory electronics course. The assessments are designed to be used either in the classroom or as self-paced activities outside of class. Each assessment challenges the student to accomplish some task—for example, making a measurement or trouble-shooting a circuit—using a computer-based simulation.

In addition to providing a realistic simulation, the computer monitors, records, and interprets the student's actions and generates scored reports for use by the student. In the future, reports will also be generated for the instructor, aggregating information across all students in a class. The reports will indicate not only whether students were able to accomplish the task, but also how they approached it.
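As a rough sketch of what such monitoring and scoring could look like, a simple action log and rubric might be structured as follows. The class names, event kinds, and rubric are hypothetical illustrations, not the project's actual software:

```python
from dataclasses import dataclass, field
from time import time

@dataclass
class ActionLog:
    """Records each step a student takes in the simulation (hypothetical sketch)."""
    actions: list = field(default_factory=list)

    def record(self, kind, **details):
        self.actions.append({"kind": kind, "time": time(), **details})

def score_measurement_task(log, expected_setting="DCV", tolerance=0.05, true_value=9.0):
    """Very simplified rubric: did the student set the meter correctly
    and report a value within tolerance of the true value?"""
    report = {}
    settings = [a for a in log.actions if a["kind"] == "set_meter"]
    report["correct_setting"] = any(a.get("mode") == expected_setting for a in settings)
    readings = [a for a in log.actions if a["kind"] == "report_value"]
    report["within_tolerance"] = any(
        abs(a.get("value", 0) - true_value) <= tolerance * true_value for a in readings
    )
    report["steps_taken"] = len(log.actions)
    return report

# Example run: a student sets the meter and reports a reading.
log = ActionLog()
log.record("set_meter", mode="DCV")
log.record("report_value", value=8.9)
print(score_measurement_task(log))
# {'correct_setting': True, 'within_tolerance': True, 'steps_taken': 2}
```

Keeping the raw action log separate from the rubric that interprets it is what allows the same recorded session to feed both a student-facing score and a report on how the task was approached.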

If a student does something wrong, the computer will point it out; if a critical step is left out, the software will observe and report that, too. Through the software we will encourage students to repeat the assessments to improve their scores (the assessments will incorporate random elements so that repetition does not become a trivial exercise in memorization). Longitudinal reports will enable both students and instructors to view test scores over time in order to compare and evaluate learning progress.
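The random elements could work roughly like the following sketch, which draws component values from the standard E12 resistor series so that each attempt presents a different circuit. The function and parameter names are hypothetical:

```python
import random

# Standard E12 resistor series mantissas; multiply by a decade to get a value in ohms.
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def random_series_circuit(n_resistors=3, seed=None):
    """Generate a fresh randomized series circuit for each attempt (illustrative only)."""
    rng = random.Random(seed)
    resistors = [rng.choice(E12) * rng.choice([100, 1_000, 10_000]) for _ in range(n_resistors)]
    supply = rng.choice([5.0, 9.0, 12.0, 15.0])
    total = sum(resistors)
    # The expected answer lets the software check the student's measurement automatically.
    expected_current = supply / total
    return {"resistors_ohms": resistors, "supply_v": supply,
            "expected_current_a": expected_current}

print(random_series_circuit(seed=42))
```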

This material is based upon work supported by the National Science Foundation under Grant No. DUE-0903243. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

The SPARKS project is creating formative assessments for introductory college-level electronics. The assessments analyze student performance to identify problems, guide students to understand their mistakes, and help them use that understanding to improve their performance.

The project draws on recent advances in the merger of cognitive science and psychometrics (DiBello, Roussos et al. 2007; Roussos, DiBello et al. 2007) that have focused attention on the scoring and interpretation of performances on assessment tasks, particularly a multi-faceted assessment strategy called skills diagnosis (Roussos, DiBello et al. 2007).

We are creating a model of the cognitive skills and proficiencies needed in a performance domain like electronics, and then generating a hypothesis about which skills are required to succeed on a given set of performance tasks. We are conducting several stages of data collection and model confirmation and revision, both sequentially and recursively.
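In the skills-diagnosis literature, this kind of hypothesis is often expressed as a Q-matrix that marks which skills each task is assumed to require. The skills, tasks, and entries below are illustrative placeholders, not the project's actual model:

```python
import numpy as np

skills = ["read_color_code", "use_multimeter", "apply_ohms_law",
          "analyze_series", "analyze_parallel"]
tasks = ["Resistor Color Code", "Measuring Voltage", "Series Circuits", "Parallel Circuits"]

# Q[i, j] = 1 if task i is hypothesized to require skill j.
Q = np.array([
    [1, 0, 0, 0, 0],   # Resistor Color Code
    [0, 1, 1, 0, 0],   # Measuring Voltage
    [1, 1, 1, 1, 0],   # Series Circuits
    [1, 1, 1, 0, 1],   # Parallel Circuits
])

def skills_required(task_name):
    """Return the skills the model hypothesizes a task requires."""
    row = Q[tasks.index(task_name)]
    return [s for s, needed in zip(skills, row) if needed]

print(skills_required("Series Circuits"))
# ['read_color_code', 'use_multimeter', 'apply_ohms_law', 'analyze_series']
```

Empirical performance data can then be used to confirm or revise the hypothesized entries of the matrix.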

We postulate a model of competence and task performance based on the judgments of domain experts and instructors. We will validate our initial assumptions by collecting empirical student performance data, including observations and interview protocols with small groups of students, and, once sufficient data have been collected, psychometrically estimate a formal diagnostic measurement model.

Together with our colleagues Jim Pellegrino and Lou DiBello at the University of Illinois at Chicago, we will develop a set of diagnostic problems or performances that we will embed in our formative assessments.

We will evaluate the effectiveness of our assessments in two contrasting ways:

  • By direct observation of – and interviews with – a small sample of individual teachers and students.
  • By examining the data we will collect from the very large number of students who will use our assessments via the Web.

We aim to cast light on the following questions:

  1. Do the students find the assessment activities engaging and useful?
  2. Do the teachers find the reports generated by the assessments helpful?
  3. Do the students use the assessments effectively (e.g., do they try them multiple times if they get a low score the first time)?
  4. Do the students’ scores on the assessments improve on later trials?
  5. Do students who use the assessments do better academically than students who don’t?
  6. Do students who use the simulated assessments ultimately do better with the real equipment?

We will obtain answers to the first two of these questions through interviews with students and teachers.

We will derive the third and fourth answers by analyzing the data created as each student completes an assessment.

The fifth question is more difficult to answer. Taking advantage of the fact that introductory electronics is typically a one-semester course, we will ask some of our participating teachers to run the course in the fall semester without the new assessments, and then introduce and use them in the spring with an entirely new group of students, presumably drawn from the same population. In making the comparison, we will use students' final grades in the class as the measure of learning outcomes.

We will address the final question by randomly dividing students into two groups: one cohort learning a skill by textbook and lecture, and the other by using the simulated assessments. We will then give both groups a hands-on test using real equipment and compare their scores.
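A minimal sketch of how that comparison might be analyzed, assuming each student's hands-on test yields a numeric score; the data and the choice of a Welch t-test are illustrative assumptions, not the project's specified analysis:

```python
from scipy import stats

# Hypothetical hands-on test scores for the two randomly assigned groups.
textbook_group   = [62, 70, 68, 75, 64, 71, 66]
simulation_group = [74, 69, 80, 77, 72, 78, 70]

# Welch's t-test does not assume equal variances between the groups.
t_stat, p_value = stats.ttest_ind(simulation_group, textbook_group, equal_var=False)

print(f"mean (simulation) = {sum(simulation_group) / len(simulation_group):.1f}")
print(f"mean (textbook)   = {sum(textbook_group) / len(textbook_group):.1f}")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")
```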

We have identified a number of topics and topic areas that are commonly taught in introductory electronics courses and that involve applying theoretical knowledge to solving practical, real-world problems. We will draw from these areas in developing the assessments for the SPARKS project. Here is a tentative list of the assessments to be created (an illustrative code sketch of the first few topics follows the list):

  1. Resistor Color Code
  2. Measuring Voltage
  3. Measuring Current
  4. Series Circuits
  5. Parallel Circuits
  6. Series–Parallel Circuits
  7. Oscilloscope 1
  8. Oscilloscope 2
  9. Time Constants
  10. Inductive/Capacitive Reactance
  11. AC Series Circuits
  12. AC Parallel Circuits
  13. AC Series–Parallel Circuits
  14. Resonant Circuits
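To give a concrete sense of the practical knowledge the first few topics exercise, here is a short illustrative sketch (not project code) that decodes a four-band resistor color code and computes series and parallel equivalent resistance:

```python
# Digit values for the standard resistor color code.
COLOR_DIGITS = {"black": 0, "brown": 1, "red": 2, "orange": 3, "yellow": 4,
                "green": 5, "blue": 6, "violet": 7, "gray": 8, "white": 9}
TOLERANCE = {"gold": 0.05, "silver": 0.10}

def decode_resistor(band1, band2, multiplier, tolerance="gold"):
    """Four-band code: two significant digits, a power-of-ten multiplier, and a tolerance band."""
    value = (10 * COLOR_DIGITS[band1] + COLOR_DIGITS[band2]) * 10 ** COLOR_DIGITS[multiplier]
    return value, TOLERANCE[tolerance]

def series(*resistors):
    """Equivalent resistance of resistors in series: simple sum."""
    return sum(resistors)

def parallel(*resistors):
    """Equivalent resistance of resistors in parallel: reciprocal of the sum of reciprocals."""
    return 1 / sum(1 / r for r in resistors)

print(decode_resistor("yellow", "violet", "red"))   # (4700, 0.05) -> 4.7 kΩ ±5%
print(series(1_000, 2_200))                         # 3200 ohms
print(round(parallel(1_000, 1_000), 1))             # 500.0 ohms
```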
