The Power of Predicting

Pretesting has been shown to improve students’ retention

By Heather Morton
Senior Editor, MindEdge Learning
When I first started working as a senior editor at MindEdge, I was baffled by the company’s routine reliance on pretests. Why test students on material they have not yet studied? Their performance on a multiple-choice pretest, I thought, would lead either to discouragement (“I’m obviously not qualified to take this course”) or to complacency (“Why bother completing the module?”).
Unsurprisingly, there turns out to be an evidence-based foundation for this pedagogical practice. Research shows that students who have completed a pretest retain 10 percent more information than students in a similarly situated control group who haven’t.1 Furthermore, there is some evidence that forcing students to predict an outcome—essentially what a pretest does—helps them engage more with the material, leading to better discussion-board postings.
Pretesting falls into a larger category of predicting, the act of having students forecast future information: how a novel will end, the effects of an economic factor they have not yet studied, how a formula might be modified to take into account another influence on profit.
Experts have several theories about how and why predicting works:
Pretesting increases students’ interest in the material that follows. Imagine you’re watching a curling match for the first time. If you’ve been forced to predict a winner in the match, you will pay closer attention to what follows. You might wonder whether the techniques you saw on the ice led to the win or loss of the team you chose. Similarly, when you come across material that the pretest asked about, you are likely to look for information that explains why your answer was right or wrong.
Pretesting helps students recognize what they should pay attention to. Novices in a field rarely have an intuitive sense of what information is important. In literature classes, for example, students may note that a character went out for a walk after dinner (unimportant) while failing to notice that other characters are addressing him as “sir” (important). A pretest primes students to focus on certain aspects of a topic. A pretest that asks about the social class of a character will direct students’ attention to information that reveals class while they are reading the novel—including how the character is addressed by others.
Pretesting primes retrieval, aiding students’ ability to connect new knowledge to old. This theory is a bit more technical. One of the most important ways we remember information is through its connection to other information. As a writing teacher, I see this most clearly with words that are only partially known. A student who has heard “benign” will connect it to “tumor” and know that a “benign tumor” is the best kind of tumor to have. She will know that long before she knows what the word “benign” means on its own. Our brains use a network of connections to store and retrieve information. “Tree” will be connected to “leaves,” “tall,” “plant,” “wood,” “forest,” “deciduous” and a host of other information we have on the topic. It turns out that we learn new information more easily when we have connected it to an already-existing network. A pretest asks us to ransack our minds for information on an unknown topic. This activation of prior knowledge makes it easier to connect, and later retrieve, the new information we are about to learn.
Experts caution that in the studies showing the effectiveness of prediction, students were provided immediate feedback on their predictions. Our use of prediction, therefore, should follow that model: without prompt feedback, it’s all too easy to remember the wrong guess you made on the pretest rather than the right answer you learned several days later. To ensure that you are harnessing the full power of prediction for your students, make sure the course provides feedback shortly after the prediction.
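For course developers who build these interactions in software, the pattern is straightforward: capture the learner’s prediction, then reveal the answer and explanation right away rather than days later. The sketch below is a minimal, hypothetical illustration in Python; the PretestItem class and its field names are invented for this example and do not correspond to any particular courseware platform.

    # Hypothetical sketch: a pretest question that returns feedback the moment
    # the learner answers, instead of days later when the module is graded.
    from dataclasses import dataclass

    @dataclass
    class PretestItem:
        prompt: str
        choices: list[str]
        correct_index: int
        explanation: str  # shown immediately, right or wrong

        def answer(self, chosen_index: int) -> str:
            if chosen_index == self.correct_index:
                return "Correct. " + self.explanation
            return ("Not quite. The answer is '"
                    + self.choices[self.correct_index] + "'. " + self.explanation)

    item = PretestItem(
        prompt="Which practice schedule best supports long-term retention?",
        choices=["Massed practice", "Interleaved practice", "Cramming the night before"],
        correct_index=1,
        explanation="Mixing problem types makes later recall easier.",
    )
    print(item.answer(0))  # the learner guesses, then sees the explanation right away

The point of the sketch is simply that the explanation travels with the question, so the feedback arrives while the learner’s guess is still fresh.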
A pretest, of course, is not the only opportunity to mobilize the learning power of prediction. Here are some other ways a course can use prediction to improve student retention:

  • In a finance course, a video might pause after a real-world problem is presented and ask students to predict which formula is most appropriate to solve it. Students choose the formula from a multiple-choice list. The video continues and reveals the formula the professor chose.
  • In a writing course, students might read the first draft of a paper and then type into a text box their prediction of the three most important issues the writer should fix in the next draft. Immediately afterward, the course reveals and explains what the instructor is looking for in the next draft.
  • An art history course might introduce the Baroque period by displaying, side by side, works of art from the Renaissance and the Baroque that depict the same Biblical story. Students are asked to post to a discussion board about how the Baroque piece renders the story differently.

Because predicting can be incorporated into a variety of technological applications and applied to almost any field, it is another powerful and versatile tool to boost your students’ learning in an online environment.


Footnotes

1 Lang, James M. Small Teaching: Everyday Lessons from the Science of Learning. San Francisco: Jossey-Bass, 2016, pp. 46-47.
For a complete listing of MindEdge’s courses about online learning, click here.


Copyright © 2018 MindEdge, Inc.

Want to Be More Creative? Take a Hike!

 

Where does true creativity come from? Does it arise from a spark of divine inspiration? Is it a gift limited only to towering geniuses like Mozart and Einstein? Science says: Not so much. According to the latest neurological research, creativity can be spurred by such quotidian activities as walking, napping, and even taking a shower. This week’s MindEdge video explains it all for you.

For a complete listing of MindEdge’s course offerings on creativity and innovation, click here.

For Course Designers, It’s All About Retention

Designing to help students remember what they learn

By Heather Morton
Senior Editor, MindEdge Learning
The good news for those of us who work in the field of online education is that this low-cost, flexible alternative continues to attract increasing numbers of students, even as overall college enrollment declines.1 Furthermore, we can feel good about our work: plenty of research supports the efficacy of online learning.2
The bad news is that online courses face one significant hurdle that face-to-face learning does not. That’s the ability—and tendency—of learners to click away from the course at the first feeling of difficulty. Students rarely walk out of class, but there’s no social pressure to keep a browser window open.
This liability of virtual education has implications for course design; instructors need to ensure that the learning experience is as frictionless as possible. But there is a deeper tension here, between what feels comfortable to the learner and what leads to genuine learning.
It turns out that a number of techniques shown to increase retention can feel difficult to students, decreasing the likelihood that they’ll stick around.
One of these uncomfortable-but-fruitful learning practices is called “interleaved practice,” which is practice that varies the skills being exercised. Interleaved practice is more effective than the comfortably familiar “massed practice,” which is, essentially, doing a lot of the same type of problem.
Unfortunately, our educational system is dominated by massed practice. In elementary school, children are given 20 pairs of fractions to multiply, one after another, in one sitting. In basketball, athletes practice shot after shot. In piano, a student plays the same measure repeatedly until it’s right. Massed practice leads to quick mastery and, more importantly, the quick feeling of mastery that learners associate with a successful learning experience. Students look for that same feeling in their online courses.
In fact, research suggests that all this repetition does not lead to greater long-term mastery. In a famous study, one group of eight-year-olds practiced throwing beanbags into a bucket three feet away. Another group practiced throwing beanbags into buckets two and four feet away. Then both groups were tested on how well they threw a beanbag into a bucket three feet away. The result? The group that practiced throwing the beanbag varying distances did significantly better than the group that practiced only on the target distance of three feet.
Based on these test results, a successful course in throwing a beanbag three feet would have the learner practice throwing it two and four feet, instead. But how can you explain to a student that this course will never give him any practice at the actual task he’s being asked to perform?
Luckily, other research does not support the idea that students become good at tasks by never practicing them. Rather, these studies suggest that frequently switching among related tasks leads to a better understanding of each one individually. This is the power of interleaving.
But this power is hard-won. It’s difficult. It doesn’t feel to the learner as though she’s mastered anything.
In Make It Stick: The Science of Successful Learning, Peter Brown, Henry Roediger III, and Mark McDaniel recount a study in which two groups of students were asked to calculate the volumes of four geometric figures. One group was given multiple problems at a time on a single figure, while the other was given the same practice problems in mixed order. During practice, the massed group (which worked on only one type of volume calculation at a time) performed significantly better on the calculations. But on a test a week later, the massed group averaged 20 percent correct, while the interleaved group averaged 63 percent correct.3
While learners are right to say that repetition leads to quicker mastery, that mastery fades more quickly. So how does an instructional designer negotiate the divide between a positive learning experience—one that keeps students around—and a beneficial learning result?
Into the breach comes “spaced practice,” a learning strategy that calls for practice to be broken into shorter sessions spread over a longer period of time. A course can offer learners the massed practice they appreciate and then gradually space out the practice and review, interleaving the new topic with older ones. The act of forgetting (a little) and then retrieving the new information or skill cements it in long-term memory. The quick boost to short-term memory that massed practice provides gradually gives way to more durable long-term memory.
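For designers who script their review sequences, one way to picture spaced practice is as a calendar that widens the gap between a topic’s sessions while folding older topics in alongside newer ones. The Python sketch below is a hypothetical illustration only; the review_schedule function, the weekly unit cadence, and the doubling interval are assumptions chosen for clarity, not values drawn from the research.

    # Hypothetical sketch: build a review calendar that spaces each topic's
    # practice sessions farther and farther apart, so reviews of older topics
    # naturally land on the same days as newer ones.
    def review_schedule(topics, first_gap=1, growth=2, sessions=4):
        calendar = {}
        for i, topic in enumerate(topics):
            intro_day = i * 7                    # assume one new unit per week
            gap = first_gap
            for _ in range(sessions):
                calendar.setdefault(intro_day + gap, []).append(topic)
                gap *= growth                    # each review waits twice as long
        return sorted(calendar.items())

    for day, due in review_schedule(["Renaissance", "Baroque", "Rococo"]):
        print(f"Day {day:2d}: review {', '.join(due)}")

Because each topic’s later reviews fall on days when other topics are also due, the schedule spaces and interleaves at the same time.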
Say you are designing an art history course that teaches students the characteristics of different periods. After presenting the characteristics of Renaissance art, you might have students explain how a few particular works of art exemplify the period (massed practice). The next task might be to pick out the seven Renaissance paintings from among a group of 14 paintings.
However, after students have completed a unit on the Baroque, they should be asked to sort Renaissance artworks, Baroque artworks, and works that belong to neither period. By that point, their knowledge of the Renaissance is likely to be fuzzy. Interleaving Renaissance and Baroque works will sharpen their grasp of the Baroque period while it shifts their knowledge of the Renaissance from short-term to long-term memory. After each new period, students can be asked to review Renaissance art in the context of recalling an increasing number of periods.
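A quiz engine can enforce that kind of interleaving mechanically by drawing every question from the pool of all periods covered so far, rather than from the newest unit alone. The snippet below is a hypothetical sketch in that spirit; the pools dictionary and the draw_interleaved_quiz function are invented for illustration and do not reflect any particular platform.

    import random

    # Hypothetical sketch: draw an identification quiz from every period
    # covered so far, so older material keeps resurfacing.
    pools = {
        "Renaissance": ["School of Athens", "The Last Supper", "Primavera"],
        "Baroque": ["Judith Slaying Holofernes", "The Calling of Saint Matthew"],
    }

    def draw_interleaved_quiz(pools, n_questions=6, seed=None):
        rng = random.Random(seed)
        periods = list(pools)
        questions = []
        for _ in range(n_questions):
            period = rng.choice(periods)         # mix periods rather than
            work = rng.choice(pools[period])     # drilling only the newest unit
            questions.append((work, period))
        return questions

    for work, answer in draw_interleaved_quiz(pools, seed=1):
        print(f"Which period does '{work}' belong to?  (answer: {answer})")

As each new unit is completed, its works are simply added to the pool, so the quiz keeps pulling earlier periods back into practice.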
As designers, we want our students to enjoy our classes and feel a sense of mastery. But we also want to set them up for long-term success. A good instructional designer works to keep students on the page—and to make it worth their while to have been there.


Footnotes

1 Straumsheim, C. A Volatile but Growing Online Ed Market. Inside Higher Ed, May 2017. https://www.insidehighered.com/news/2017/05/02/report-finds-growth-volatility-online-education-market/
2 Most recently, a meta-analysis of research by the U.S. Department of Education. The report compared online with face-to-face education and found that students in an online environment had modestly better outcomes. Evaluation of Evidence-Based Practices in Online Learning. U.S. Department of Education, September 2010. https://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf
3 Brown, P., Roediger III, H., McDaniel, M. Make It Stick: The Science of Successful Learning. Cambridge, MA: Belknap Press of Harvard University Press, 2014, pp. 49-50.
For a complete listing of MindEdge’s courses about online learning, click here.


Copyright © 2018 MindEdge, Inc.