
The Quest to Improve the Teaching of Electricity in the St. Mark’s Introductory Physics Course

By Jacob Backon, STEM Faculty




In response to research indicating significant conceptual misunderstandings of basic electrical concepts, the physics teachers at St. Mark’s incorporated the CASTLE curriculum into the introductory physics course. Over the past few years, this curriculum has met two significant challenges: delivering efficient feedback in response to student model building, and the time it takes to move through the material. Canvas modules were used to address these challenges, and a concept test was administered before and after instruction to gather data on the effectiveness of these techniques. Preliminary data from a very small sample indicates that the CASTLE curriculum and Canvas modules resulted in higher concept-test scores than those reported for a more traditional style of instruction.


Over the past several years the physics teachers at St. Mark’s have been recreating the physics course that most students take in their third form year. One of the major changes to the curriculum was the introduction of a method of teaching electricity and magnetism called CASTLE. CASTLE is an acronym that stands for “Capacitor-Aided System for Teaching and Learning Electricity,” and is a hands-on, inquiry-based curriculum that uses DC circuits to provide students with observable physical phenomena that facilitate model building. The curriculum was developed by a group of high school and college teachers with support from the National Science Foundation and the US Department of Education National Diffusion Network, and has been peer reviewed.

The introduction of the curriculum has received mixed feedback from students and faculty. The major complaints have centered on the time it takes to develop the seemingly simple concept of Ohm’s Law, the use of non-conventional vocabulary to describe concepts, and the challenge of giving efficient feedback on student activities. Students often report a fatigue factor with the style of the curriculum, which requires a large amount of independent meaning-making from a diverse set of observable phenomena. In response to this feedback, the Canvas LMS was used to create a blended approach to CASTLE, and a concept test was used to collect objective data on the effectiveness of this method of instruction.


The methods presented in this study center around assessment of knowledge and delivery of the curriculum. The use of Canvas to help facilitate the CASTLE curriculum served two purposes: first, it allowed easy tracking of student progress and second, it provided a system for students to get instant feedback on their work with the use of preloaded quizzes. Here is an example of what the Canvas modules looked like:

[Screenshot: an example Canvas module for the CASTLE curriculum]

After each investigation, there is a zero-point quiz. These quizzes contain the same questions asked in the CASTLE activities and provide students with the answers from the curriculum resources. Once students complete a quiz, the correct answers are revealed, and they can immediately see where their misconceptions lie. The hope was that this presentation of the material would address the criticisms regarding timely feedback and also mitigate the fatigue factor by making the modules self-paced.

To establish a baseline of student knowledge about electricity before starting the CASTLE curriculum, the DIRECT concept test was administered. This multiple choice test is designed to measure conceptual understanding of the physics of DC circuits. The research validation summary provided by physport.org describes the DIRECT concept test as follows:

The multiple-choice questions on the DIRECT were developed based on instructional objectives, literature review and expert input. Free-response versions of the questions were given to students, and the responses were used to create the multiple-choice answers. The DIRECT was given to over 1000 students at the high school and university level across the US. Results were used to conduct appropriate statistical analyses of reliability, difficulty, discrimination and internal consistency, and some of these values were found to be above acceptable thresholds. The test was revised in response to the statistical analysis and student interviews.

Once the concept test was administered, the questions were put away and not reviewed until it was time to take the posttest. This was done to minimize the possibility of teaching to the test and skewing the posttest data.

Once preliminary data was collected, the modules were introduced in class and the students began working. In an effort to let students move at their own pace, only two deadlines were set over the course of window 3. These were primarily to ensure no one got too far off the pace of the fastest students. As window 3 came to a close, most students were in the middle of the 5th of 6 modules. Students finished the 6th module after spring break with a brief review class on the first day back. At the end of each module, students took a quiz designed to measure their understanding of the material from that module. These quizzes were provided as part of the CASTLE curriculum and were aligned to it.

After all students had completed the 6 modules that comprised the CASTLE curriculum the DIRECT concept test was readministered. In order to increase the chance of getting reliable results, a 10% bump to each student’s last quiz was offered if he or she did better on the post-test than the pretest. After the administration of the test a series of one-on-one interviews were conducted with select students based on their test results and classroom observations.


The data gathered is presented below. It should be noted that, while there were 14 students in the introductory physics class tested, two students missed either the pretest or posttest and are not represented on the graph. The scores of these two students are included in the class averages for their respective tests. One student has yet to complete the section 6 quiz for medical reasons.

[Graph: DIRECT Concept Test pretest and posttest scores by student]

[Graph: CASTLE quiz averages]

The percentages shown for the DIRECT Concept Test are the percentages of questions answered correctly. The raw gain is the difference between the pretest and posttest averages. The normalized gain attempts to correct for the very low pretest scores. The percentages shown for the CASTLE quiz average are percentages of points received out of the total possible. Point values were assigned to the questions on the CASTLE quizzes in as binary a fashion as possible: questions were marked either right or wrong, with one point awarded for a correct answer. On questions that required an explanation or more than a single answer, a rubric of important words or phrases earned students points. These words and phrases were taken from the study guides provided to students before the quiz was taken.
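As a sketch of how these two gain measures differ, the following assumes the standard Hake-style definition of normalized gain (gain divided by the room left to improve); the exact formula used in the study is not stated, and the numbers below are illustrative, not the class's actual averages.

```python
def raw_gain(pre: float, post: float) -> float:
    """Difference between posttest and pretest percentages."""
    return post - pre

def normalized_gain(pre: float, post: float) -> float:
    """Fraction of the possible improvement actually achieved.
    This corrects for low pretest scores: the same raw gain counts
    for more when there was less room to improve."""
    return (post - pre) / (100 - pre)

# Illustrative numbers only -- not the study's data.
print(raw_gain(23, 33))                   # 10
print(round(normalized_gain(23, 33), 3))  # 0.13
```

The same 10-point raw gain would yield a larger normalized gain for a student who started lower, which is why the normalized figure is the fairer comparison when pretest scores are very low.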


It is important to note that the sample size presented here is too small for any conclusions to be statistically significant. In the end, this data is being used to help inform the teaching of electricity in the future. The ultimate goal is to aggregate data across classes and years and compare it with national averages provided by peer-reviewed research studies in order to confirm the following conclusions.

On the DIRECT concept test, every student answered as many or more questions correctly on the posttest as on the pretest. Three students showed no gains in concept knowledge, but anecdotal comments after the pretest indicated that many students had never seen any of the material on electricity and had guessed on most of the questions. Cross-referencing the questions each of these students answered correctly on the pretest and posttest showed no overlap between the two tests. This suggests that the questions these students answered correctly on the pretest were guesses, and that their pretest scores should really have been 0%. In each case, zeroing the pretest score resulted in a gain similar to the class average.
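The zeroing argument can be made concrete with a hypothetical student (the numbers are invented for illustration and assume the Hake-style normalized gain):

```python
def normalized_gain(pre: float, post: float) -> float:
    # Hake-style normalized gain (assumed formula):
    # achieved gain as a fraction of the possible gain.
    return (post - pre) / (100 - pre)

# A student who scores 20% on both tests shows no apparent gain...
print(normalized_gain(20, 20))  # 0.0

# ...but if the pretest answers were guesses, treating the pretest
# as 0% reveals a positive gain.
print(normalized_gain(0, 20))   # 0.2
```

Because none of these students' correct pretest answers reappeared on the posttest, the 0% treatment is arguably the more honest baseline for them.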

Engelhardt and Beichner provide helpful comparison data from a 2004 study they published in the American Journal of Physics. In this paper, they report pretest scores from a “Traditional High School Physics Class” as 23% and posttest scores as 33%. This data suggests that the method of instruction in question was more effective than a “traditional physics class” where lectures, quizzes, and end-of-chapter problems are the primary mode of instruction. The DIRECT data suggests that the units on electricity were indeed effective, but it is unclear based on the methods of this study whether it was the CASTLE curriculum or the use of Canvas that was most effective. Student interviews after the posttest shed some light on this issue.

The students with the lowest gains, the students with the highest gains, and one student in the middle were interviewed following the DIRECT posttest. Across all interviews, students reported that they did not like the self-paced Canvas modules. Some of the reasons included feeling rushed to get through the activities and a desire to have the teacher explain concepts instead of reading about them on the computer. All students suggested having more check-ins as a class to answer questions and ensure that each student’s conceptual model of electricity was accurate. The largely negative response to the modules suggests that it was the curriculum itself, and not the presentation, that was most effective. It is worth noting that the quiz averages suggest the Canvas modules did not hinder any student’s learning. Students often report dissatisfaction with classes that are not teacher-led because they are unfamiliar with them and don’t consider them “good” teaching. It is possible that student dissatisfaction indicates a “better” learning experience, because it suggests the students recognize that the onus is on them to learn, and they don’t like having to shoulder that responsibility. Unfortunately, the methods of this study don’t provide true insight into the question of curriculum versus presentation, and any conclusions regarding which piece of instruction was more effective are only speculative.

Overall, the data gathered from this small sample of students suggests that the style of instruction used in this study is more effective than traditional non-inquiry based methods. It is unclear whether the use of Canvas to provide immediate feedback and a self-paced experience contributed to this effectiveness. Moving forward, it will be necessary to have more students take the DIRECT concept test before and after exposure to the CASTLE curriculum to increase the sample size and improve the statistical significance of the results. It would also be useful to compare results with classes that don’t use Canvas to supplement the CASTLE curriculum. In this way, it will be possible to determine which method contributes more to student understanding. The Canvas modules should also be supplemented with more all-class model building discussions to address student concerns regarding not having difficult concepts explained.

Jacob Backon teaches with a dual assignment in the Mathematics and Science Departments. He holds a B.A. from Bard College and an M.S. from Ohio University. A native of independent school life, Jacob grew up on the campus of Choate. He coaches JV soccer in the fall.


Brown, D. E. (1992, March). Teaching electricity with capacitors and causal models: Preliminary results from diagnostic and tutoring study data examining the CASTLE project. In NARST annual meeting, Boston, MA.

Engelhardt, P., & Beichner, R. (2004, January 1). Students’ understanding of direct current resistive electrical circuits. American Journal of Physics, 72(1), 98-115.

Madsen, A., McKagan, S. B., & Sayre, E. C. (2014). Best practices for administering concept inventories. arXiv preprint arXiv:1404.6500.
