One problem with biology labs in general, including our own virtual labs, is that students who can’t see are excluded from using them. That’s a shame, and something we’ve wanted to find a way around for a long time. That’s how I found myself in Baltimore a couple weeks ago, helping our vice-president Ellie Steinberg run workshops for the National Federation of the Blind’s Youth Slam.
A week-long summer program organized by the Jernigan Institute, Youth Slam this year was held on the Towson State campus in Baltimore, with workshops and classes designed for blind and low-vision high school students who are interested in science. Other than the ubiquitous canes, we met the same range of mellow, excited, and boisterous kids you’d find at any summer camp. The goal we set for our sessions was for the students to navigate through our prototype “accessible” version of our most popular virtual biology lab, Isle Royale, conduct a series of simulated experiments, and in the process learn something about population growth and predator-prey dynamics.
Going into this project, which is funded by the National Science Foundation, we figured there must be examples of accessible scientific displays we could use as the basis for our work. There are, in fact, nice tools for making static graphs accessible – specialized tactile printers and touch-pad interfaces, for example, that let a blind person interpret graphs using touch and sound. But these don’t work well if the graphs are constantly changing over time. In previous projects we used the pitch of a sound to indicate the height of a line on a graph, and a few other software packages do that as well. But to learn about predator-prey dynamics, you have to compare changes over time for multiple interacting species. Apparently, no readily available software package lets you listen to multiple variables at once.
So, we decided to try it. Working into the wee hours the night before our workshops, we gave each species a different timbre – flute for the wolves, viola for the moose, harmonica for grass, and so on. The students first focused only on the moose population. They easily distinguished linear versus exponential growth curves, and could hear the moose population skyrocketing initially and then crashing back down to its carrying capacity. So far so good. Next we challenged the students to figure out which of the three plant species the moose were eating. Listening to how the plant populations changed over time in comparison to the moose population, they figured this out as well. Then we reached the part of the lab where they add wolves, and the students had to compare the population dynamics of up to five species to figure out what was going on.
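For readers curious what that kind of mapping might look like in code, here is a rough sketch – not our actual lab software – of the basic idea: run a toy predator-prey simulation, let pitch track each population’s size, and give each species its own timbre. The species names, model parameters, and output file below are purely illustrative, and the whole thing uses only the Python standard library.

```python
# A minimal sketch of "one timbre per species" sonification (illustrative only).
import math, struct, wave

RATE = 44100          # audio samples per second
STEP = 0.15           # seconds of audio per simulated time step

def lotka_volterra(steps=200, dt=0.1):
    """Toy predator-prey model; returns a list of (prey, predator) values."""
    moose, wolves = 40.0, 9.0
    a, b, c, d = 0.6, 0.025, 0.5, 0.005        # illustrative rate constants
    series = []
    for _ in range(steps):
        moose += dt * (a * moose - b * moose * wolves)
        wolves += dt * (-c * wolves + d * moose * wolves)
        series.append((moose, wolves))
    return series

def pitch(value, lo, hi, f_lo=220.0, f_hi=880.0):
    """Map a population value onto a frequency between f_lo and f_hi."""
    frac = (value - lo) / (hi - lo) if hi > lo else 0.0
    return f_lo + frac * (f_hi - f_lo)

def tone(freq, seconds, shape):
    """One short tone; 'shape' picks the timbre (sine vs. square-ish)."""
    out = []
    for i in range(int(RATE * seconds)):
        s = math.sin(2 * math.pi * freq * i / RATE)
        if shape == "square":                  # crude second timbre for the predator
            s = 1.0 if s >= 0 else -1.0
        out.append(0.3 * s)
    return out

series = lotka_volterra()
m_lo, m_hi = min(m for m, _ in series), max(m for m, _ in series)
w_lo, w_hi = min(w for _, w in series), max(w for _, w in series)

samples = []
for moose, wolves in series:
    prey_voice = tone(pitch(moose, m_lo, m_hi), STEP, "sine")
    pred_voice = tone(pitch(wolves, w_lo, w_hi), STEP, "square")
    samples.extend(0.5 * (x + y) for x, y in zip(prey_voice, pred_voice))  # mix voices

with wave.open("predator_prey.wav", "w") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)
    wav.setframerate(RATE)
    wav.writeframes(b"".join(
        struct.pack("<h", int(max(-1.0, min(1.0, s)) * 32767)) for s in samples))
```

Played back, the two voices rise and fall out of phase with each other, which is exactly the kind of pattern we wanted the students to be able to pick out by ear.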
Having multiple voices worked! It was really exciting to see. Using a special graph interface we built, many of the students figured out the population cycling and the phase differences between species. Not only that, they were jazzed – some commented that it was the first time in their lives they had been able to really explore data on a graph. Afterwards, some were telling us about all the dumb stuff teachers came up with to keep them occupied in normal science labs – apparently even a stripped-down prototype virtual lab was miles better.
Simulation labs are challenging to make accessible because they are so visual, but this small experiment shows that with some thought and effort, it’s possible to go a long way towards accessibility. Our attempt is still quite rough around the edges and we have a lot of work left on the details, but we’re now very hopeful that in the near future, some judicious use of sound will let a whole new group of students take advantage of active science learning through virtual labs.