The Answers are in the Questions!
One of the reasons I enjoy working at SimBio (after three decades of building biology education software) is that we have always maintained a strong research emphasis behind our learning tools. Rather than focusing solely on producing new virtual labs, we spend a year or more on each new teaching tool: scouring the literature for learning objectives, identifying common student confusions via student testing and instructor reviews, and using results from our own published research and that of others to validate and improve our content. Having evidence that our content is effective feels good, and doing both fundamental and applied biology education research keeps life interesting. Indeed, we’re always looking for opportunities to continue the research side of our work.
Three years ago, at the SABER annual meeting, I heard a talk from the Automated Analysis of Constructed Response (AACR) group headquartered at Michigan State University. This multi-institution team was inspired by research showing that when instructors pose questions in highly constrained formats like multiple choice, they remain unaware of important nuances in their students’ comprehension of complex topics. SimBio has a long-running research program inspired by the same body of research. We’ve investigated question formats that are less constrained than multiple choice but more constrained than an essay, what have been termed “intermediate constraint” formats. In a recent paper in Computers & Education, we showed that using intermediate-constraint question formats in learning tools can help students learn more than they would with high-constraint (multiple choice) or lower-constraint (essay) questions.
Essay questions are hard to score automatically, which limits how practical they are for teaching and assessment at scale. AACR is successfully tackling this challenge: postdoc Kamali Sripathi gave a talk on developing new short-answer questions to assess student understanding of cellular respiration. Coincidentally, SimBio has a popular lab called Cellular Respiration Explored that teaches the same ideas AACR is attempting to measure. While we had good anecdotal feedback on the lab’s effectiveness, we lacked a formal measure of student learning. On the flip side, AACR’s student test population was rather narrow, consisting mostly of students at large 4-year research universities. Since Cellular Respiration Explored is used in a broad range of classes, we realized there was a unique opportunity for a collaboration that could test and improve both our tools.
The result was a satisfying study recently published in CBE Life Sciences Education, led by another AACR postdoc, Juli Uhl, under the direction of Kevin Haudek. We administered several AACR short-answer questions as a pre- and post-test around our Cellular Respiration lab. Two of the questions targeted concepts central to cellular respiration; the third was unrelated and served as a control. For us, it was gratifying to see large improvements in student understanding of cellular respiration, as measured by the AACR questions, after students used Cellular Respiration Explored (and, as expected, little change on the control question). The AACR team was able to investigate differences in understanding and types of confusion across a more diverse student body than they had previously tested, which, among other benefits, will help them improve their scoring algorithms.
For those looking to assess students, whether to determine if learning tools like SimBio’s labs are helpful, to pinpoint where your students need help, or to measure the effectiveness of your course as a whole, I definitely recommend checking out AACR’s website. Instructors are free to use AACR’s questions in their teaching and can then upload student answers to a web-based interface that returns an automated analysis. It’s a great service to take advantage of; we’re certainly happy to have discovered this new means of testing our own learning tools!
– Eli Meir, SimBio founder and author