Eleven years ago, SimBio received a grant from the National Science Foundation to “develop techniques for automatically providing immediate formative assessment as students are conducting simulation-based experiments and reasoning about their results”. As I wrote at the time, it was supposed to be a four-year project with partners from the Scheller Teacher Education Program at MIT and California State University, Fullerton. Over those years we produced a number of interesting results, including a surprising finding about how certain question formats can promote learning. And now, as these things go, eleven years later, we have published the last major piece of the project.
The paper, Designing Activities to Teach Higher-Order Skills: How Feedback and Constraint Affect Learning of Experimental Design, tested whether feedback and the manipulation of constraint help students learn complex skills like experimental design. The short answer is yes, in a big way. The details are interesting for groups like SimBio that develop biology teaching tools.
The research was based on early versions of SimBio’s Understanding Experimental Design (UED), a tutorial that helps students design experiments and summarize and interpret results (and has since become a customer favorite). Our first result showed that Understanding Experimental Design is effective – on an independent test, students demonstrated learning gains of Cohen’s d = 1.0 after using the tutorial, which, if you are not familiar with the measure, is considered well into the “large effect” range. Anecdotally, instructors relate that using the tutorial helps students design better independent projects later in the term.
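For readers unfamiliar with the measure: Cohen’s d is the difference between two group means divided by their pooled standard deviation, and by common convention 0.2 counts as a “small” effect, 0.5 as “medium”, and 0.8 or above as “large”. Here is a minimal sketch of the standard formula in Python (the scores below are invented purely for illustration, and the paper’s exact analysis may well differ):

```python
import numpy as np

def cohens_d(group_a, group_b):
    """Standard Cohen's d: difference in means divided by the
    pooled standard deviation of the two groups."""
    n_a, n_b = len(group_a), len(group_b)
    var_a = np.var(group_a, ddof=1)  # sample variances
    var_b = np.var(group_b, ddof=1)
    pooled_sd = np.sqrt(((n_a - 1) * var_a + (n_b - 1) * var_b)
                        / (n_a + n_b - 2))
    return (np.mean(group_a) - np.mean(group_b)) / pooled_sd

# Hypothetical post- vs. pre-tutorial test scores (illustration only)
post = [82, 75, 90, 68, 88, 79]
pre = [70, 62, 78, 55, 74, 66]
print(f"Cohen's d = {cohens_d(post, pre):.2f}")
```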
But why does it work so well? We thought it might be due either to the immediate feedback students receive as they design experiments in the tutorial, or to the way we constrained aspects of what students are able to design in order to guide them. It turned out that automated feedback to students, within an open-ended simulation, did improve learning gains. On the other hand, changing how open-ended the activity was – the degree of constraint on what students could do when designing experiments – made little difference to learning gains. It did, however, make a difference in our ability to provide feedback: the more constrained the activity, the easier it was to deliver specific automated responses. The conclusion, spelled out both in the CBE LSE paper and in a related book chapter, is that it is OK to manipulate the constraint of an open-ended activity in order to ensure students can be given quick, specific feedback.
It’s a lesson SimBio is carrying forward as we tackle other complex skills biology students need. Our current project, GraphSmarts Assessments, for example, assesses the graph construction and interpretation skills of biology undergraduates.