Seven principles of student-centered learning

What are key principles for student-centered learning?

After many years of developing online learning, MindEdge has identified seven principles for improved student outcomes.

They are:

  • Place the student at the center of the learning experience.
  • Leverage existing knowledge and skills.
  • Respect the learner’s time.
  • Employ diagnostic assessments to ascertain gaps in knowledge.
  • Offer varied learning experiences.
  • Provide needed scaffolding and practice for learners.
  • Measure progress and mastery of learning objectives.

Student-centered instruction focuses on supporting and empowering the student in mastering the skills and knowledge in a given field of study. This requires shifting instructors from the traditional “sage on the stage” approach to one built around coaching and advising. Students who are engaged learn. Students who are challenged by interesting learning experiences learn.


Today’s students often have prior subject knowledge. Whenever possible, educators should capitalize on this foundation, valuing and extending what students bring to the classroom (traditional and virtual).

A well-constructed learning experience lets students know what is expected of them, and focuses assignments and assessments on the learning objectives outlined at the start of the course. Time is precious—avoid the extraneous or off-topic. Let students learn at their own pace.

Assessing prior student knowledge, and identifying gaps, is a valuable exercise for learners and instructors. This can be accomplished through simple diagnostic assessments at the beginning of the learning experience. It helps determine what should be emphasized, and what can be given a more cursory review.
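To make this concrete, here is a minimal sketch (in Python) of how a short diagnostic pre-test, with each question tagged by learning objective, can be scored per objective to flag gaps; the objective names, responses, and threshold below are hypothetical, invented only to illustrate the mechanic.

```python
# Minimal sketch: score a diagnostic pre-test by learning objective and flag gaps.
# Objective names, the learner's responses, and the threshold are hypothetical.
from collections import defaultdict

# Each question is tagged with the objective it probes.
questions = [
    {"id": 1, "objective": "Terminology"},
    {"id": 2, "objective": "Terminology"},
    {"id": 3, "objective": "Core calculations"},
    {"id": 4, "objective": "Core calculations"},
]

# correct[question_id] = True/False for one learner's pre-test answers.
correct = {1: True, 2: True, 3: False, 4: False}

def find_gaps(questions, correct, threshold=0.7):
    """Return the objectives whose pre-test accuracy falls below the threshold."""
    right, total = defaultdict(int), defaultdict(int)
    for q in questions:
        total[q["objective"]] += 1
        right[q["objective"]] += int(correct.get(q["id"], False))
    return [obj for obj in total if right[obj] / total[obj] < threshold]

print(find_gaps(questions, correct))  # -> ['Core calculations']
```

Objectives that surface here get fuller treatment; the rest can receive the more cursory review described above.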

Variety matters: it keeps learners engaged. The jury is still out on whether “learning styles” represents a valid way to categorize how we learn, but offering numerous ways for students to learn—including video, audio, infographics, interactives, assessments, games—appeals to many learners.

Not everyone “gets it” immediately, so it’s important to offer deeper levels of instruction for students who may struggle with new concepts. Whole-Part-Whole learning and adaptive learning are ways to integrate scaffolding and support into instruction.

At every phase of learning, assessing student progress is key. Asking students to demonstrate their mastery of material by synthesizing answers to higher-order questions is one helpful method of measurement.

These principles can be applied in all forms of learning (classroom, blended, online), and can provide a useful framework for developing and designing effective learning experiences.


Copyright © 2016 MindEdge, Inc.

Closing the 2 Sigma Learning Gap


We see a significant opportunity to use what we know about learning and the latest technology tools to dramatically improve student performance—to close the 2 Sigma Learning Gap.
This gap was identified by educational psychologist Benjamin Bloom. He and his fellow researchers found that the average student who was tutored one-to-one using “mastery learning techniques” performed two standard deviations (2 Sigma) better than students in a classroom. (Simply put, mastery learning techniques insist that students achieve mastery of knowledge and skills before proceeding to the next stage of learning.)
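To put the 2 Sigma figure in perspective: if classroom scores are roughly normally distributed, a two standard deviation advantage places the average tutored student at about the 98th percentile of the conventionally taught group. Here is a minimal check of that percentile (the normality assumption and the scipy call are ours, used only for illustration):

```python
# Sketch: what a two standard deviation (2 Sigma) shift means in percentile terms,
# assuming classroom scores are roughly normally distributed.
from scipy.stats import norm

effect_size = 2.0  # Bloom's reported tutoring advantage, in standard deviations
percentile = norm.cdf(effect_size)  # share of the comparison group below the average tutored student
print(f"{percentile:.1%}")  # -> 97.7%, i.e. roughly the 98th percentile
```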
Bloom argued that one-on-one instruction would be “too costly for most societies to bear on a large scale,” and his proposed solution was to uncover those key variables in instruction that could be tweaked to improve student performance and then applied broadly.

[Chart: the 2 Sigma gap (https://learningworkshop.mindedge.com/wp-content/uploads/2015/09/2sigma_gap.jpg)]

The top six factors for improvement that researchers uncovered (in rough order of importance) were the following:

  • Tutorial instruction
  • Reinforcement
  • Feedback-corrective
  • Cues and explanations
  • Student classroom participation
  • Student time on task

There has been a significant amount of experimentation and testing of these factors in the classroom and, increasingly, in online environments. In his initial paper, Bloom suggested that technology might be one way to scale mastery learning.

MindEdge’s approach

We design MindEdge Learning online courses and simulations to leverage technology to apply those six factors. We’ve integrated them into the five pedagogical tools best suited for adult learners. Those tools are:

  • Assessments
  • Gamification
  • Whole-Part-Whole Learning
  • Narrative Learning
  • Adaptive Learning

When we create courses, we look at how best to reach the learner by employing these tools, with Bloom’s factors in mind.

Employing the learning tools

Assessments, for example, can be a vital tool in achieving Bloom’s mastery learning. Research shows that students who are continuously questioned about what they’ve learned perform better. Indeed, initial testing taps into the counterintuitive concept of “learning by failing.” Diagnostic assessments can personalize learning and help students focus on mastering challenging topics. Assessments address the Bloom factors of Reinforcement, Feedback-corrective, and Cues and explanations.
Gamification (using game design elements in educational contexts) can engage and challenge the learner in different ways. Learning feels more personal when playing or competing, and educational research supports the value of “learning by playing.” Students who gravitate to a game environment are likely to spend more time engaging with the educational content. Gamification addresses the Bloom factors of Reinforcement, Feedback-corrective, and Student time on task.
Whole-Part-Whole Learning (WPWL) presents students with an overview of learning content (Whole), then guides them through the specific components of that knowledge or skill (Part), and then asks them to recreate that content (Whole). A pedagogical approach that has been adopted by adult educators, WPWL helps provide context and slows down the learning process. It addresses the Bloom factors of Reinforcement and Student classroom participation (in an online setting, the process of recreating the Whole can be structured to mimic classroom participation through instructor-led discussions or video conferences, or collaborative group work).
Narrative Learning (NL) engages students through case studies, scenarios, and simulations and asks them to apply their learning. We’ve found that students respond well to the real-world relevance of NL, and research suggests that humans are hard-wired to learn through story-telling. NL addresses the Bloom factors of Reinforcement, Student classroom participation, and Student time on task.
Adaptive Learning (AL) is the tool with the greatest promise for closing the 2 Sigma Learning Gap. It directly offers tutorial-like help that personalizes instruction and focuses on individual learning challenges. Students are helped through difficult topics by individualized scaffolding, varied content presentation, and iterative drills and problem solving. We’ve found AL works best in combination with the other teaching methods we employ—it’s best to offer learners a variety of approaches. AL addresses the Bloom factors of Tutorial instruction, Reinforcement, and Feedback-corrective.
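As a rough illustration of the loop adaptive learning depends on (a generic sketch, not MindEdge's implementation; the topics, threshold, and update rule are invented for the example): track a mastery estimate per topic, always drill the weakest topic, and update the estimate after each response.

```python
# Generic adaptive-drill loop (illustrative only, not MindEdge's implementation):
# drill the learner's weakest topic and nudge a simple mastery estimate per answer.
mastery = {"Topic A": 0.2, "Topic B": 0.5, "Topic C": 0.8}  # estimates in [0, 1]
THRESHOLD = 0.75  # treat a topic as mastered above this level (hypothetical)
STEP = 0.15       # how far one answer moves the estimate (hypothetical)

def next_topic(mastery):
    """Pick the lowest-mastery topic that is still below the threshold."""
    open_topics = {t: m for t, m in mastery.items() if m < THRESHOLD}
    return min(open_topics, key=open_topics.get) if open_topics else None

def record_answer(mastery, topic, is_correct):
    """Move the mastery estimate up or down and keep it within [0, 1]."""
    delta = STEP if is_correct else -STEP
    mastery[topic] = min(1.0, max(0.0, mastery[topic] + delta))

# One simulated step: the learner answers a drill on the weakest topic correctly.
topic = next_topic(mastery)          # -> "Topic A"
record_answer(mastery, topic, True)  # mastery["Topic A"] rises to about 0.35
print(topic, mastery)
```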
The following chart summarizes the way MindEdge employs these tools, their impact on students, and how they relate to Bloom’s six factors.

Pedagogical tool: Assessments (formative/diagnostic)
Approach: Students respond to low-stakes questions throughout the learning process.
Impact on students: Personalizes; frames the learning; ‘Learn by failing’
Bloom factors: Reinforcement; Feedback-corrective; Cues and explanations

Pedagogical tool: Gamification
Approach: Students learn through games and interactive exercises.
Impact on students: Personalizes; engages; ‘Learn by playing’
Bloom factors: Reinforcement; Cues and explanations; Student time on task

Pedagogical tool: Whole-Part-Whole Learning (WPWL)
Approach: Students are presented with an overview of learning content (Whole), then guided through the specific components of that knowledge or skill (Part), and then asked to recreate that content (Whole).
Impact on students: Frames the learning; develops cognitive skills; ‘Learn by reconstructing’
Bloom factors: Reinforcement; Cues and explanations; Student classroom participation (modified)

Pedagogical tool: Narrative Learning (NL)
Approach: Students are engaged through case studies, scenarios, and simulations and asked to apply their learning.
Impact on students: Makes the learning relevant; taps into narrative structure (conflict/resolution); ‘Learn by story’
Bloom factors: Reinforcement; Feedback-corrective; Student classroom participation (modified)

Pedagogical tool: Adaptive Learning (AL)
Approach: Students are helped through difficult topics by individualized scaffolding, varied content presentation, and iterative drills and problem solving.
Impact on students: Personalizes; targets common instructional pain points; ‘Learn by focus’
Bloom factors: Tutorial instruction; Reinforcement; Feedback-corrective

 

Closing the gap

Initial meta-research studies have suggested that online learning matches or exceeds traditional classroom instruction, although certainly not by 2 Sigma levels. We’re confident that when the pedagogical tools are applied correctly, more significant improvements in student performance are possible. MindEdge Learning’s academic partners have seen completion rates and test scores improve in courses that use these tools.
Our experience with adaptive learning in several difficult undergraduate courses (Composition and Critical Thinking) suggests that students welcome the personalized tutorial-like focus (with 95% of students finding the AL segments helpful). As part of our commitment to data-driven analysis, MindEdge Learning continues to explore ways to better capture the effects of different approaches on student performance, and to close the 2 Sigma Learning Gap.
 


Jefferson Flanders is president of MindEdge Learning. He has taught at the Arthur L. Carter Journalism Institute at New York University, at Babson College, and at Boston University.
 

Reference:

2 Sigma Learning Gap: See Benjamin Bloom (1984), “The 2 Sigma Problem: The Search for Methods of Group Instruction as Effective as One-to-One Tutoring,” Educational Researcher, 13(6), 4–16.
Assessments: ‘Learn by failing.’ See Henry L. Roediger and Bridgid Finn, “Getting It Wrong: Surprising Tips on How to Learn,” Scientific American, October 20, 2009.
Gamification: ‘Learn by playing.’ See Juho Hamari, Jonna Koivisto, and Harri Sarsa, “Does Gamification Work? A Literature Review of Empirical Studies on Gamification,” Proceedings of the 47th Hawaii International Conference on System Sciences (HICSS), IEEE, 2014, pp. 3025–3034.
Whole-Part-Whole Learning: ‘Learn by reconstructing.’ See R. A. Swanson and B. D. Law, “Whole-Part-Whole Learning Model,” Performance Improvement Quarterly, 6 (2010), 43–53.
Narrative Learning: ‘Learn by story.’ See M. C. Clark, “Narrative Learning: Its Contours and Its Possibilities,” New Directions for Adult and Continuing Education, 2010, 3–11.
Adaptive Learning: ‘Learn by focus.’ See Jefferson Flanders, “Exploring the Iceberg: Why Selective Adaptive Learning Meets the Needs of Students,” EdTech Digest, June 10, 2014.


Copyright © 2015 Jefferson Flanders

Bloom’s Taxonomy and learning

One common framework employed in designing instruction is Bloom’s Taxonomy, a pedagogical tool that helps trainers and educators organize learning activities by the type of thought they ask of students.

A committee of educators chaired by Benjamin Samuel Bloom, an educational psychologist, proposed this systematic approach, published in 1956 as the Taxonomy of Educational Objectives.

Bloom’s Taxonomy has helped provide a common language for educators. The model has three distinct learning domains: Cognitive, Affective, and Psychomotor. Within each domain, learning falls into various levels. It is generally true that mastery of higher levels of learning (such as synthesizing ideas to make something new) requires mastery of knowledge and abilities at the lower levels (such as comprehending the writing of others and being able to recall specific facts at will).

The Cognitive Domain

Learning in the Cognitive domain involves the development of skills of knowledge, comprehension, and critical thinking. Most online learning courses have the bulk of their objectives in the cognitive domain. There are six levels within the Cognitive domain (listed below in order of least demanding to most demanding):

  • Knowledge (Remembering). Requires recalling or recognizing material previously encountered.
  • Comprehension (Understanding). Requires demonstrating an understanding of those facts by sorting, comparing, and describing them, and by reducing them to more essential ideas and facts.
  • Application (Applying). Requires the use of knowledge in a new and different way.
  • Analysis (Analyzing). Requires the examination of information, the reduction of ideas and facts to more fundamental ones, or the identification of causes.
  • Evaluation (Evaluating). Requires that one present and defend judgments based on the information that has been learned.
  • Synthesis (Creating). Requires compiling information in different ways to make something new.

The classic pyramid of Bloom’s Cognitive domain shows the six levels with the least complex level at the bottom and most complex level at the top.

The progression of the hierarchy in this pyramid diagram has received some criticism over the years. One point of disagreement is whether evaluation or synthesis is the highest level of learning. Bloom’s original hierarchy set evaluation at the top, though many education scholars now place synthesis (creating) at the highest level. Other critics argue that while the first three stages of the hierarchy do occur in progression, the final three are actually parallel to one another. In addition, it has been suggested that the categories should be identified in verb form, since performance words tend to be verbs.

Applying Bloom’s Taxonomy to Learning

By focusing on the way learners process information and establishing six levels of cognitive learning, Bloom’s Taxonomy helps instructors to move students beyond simple knowledge, or fact-gathering, to more challenging orders of thinking, such as understanding, applying, analyzing, and synthesizing.

Based on a given level of cognitive learning, the system can be used to help:

  • define learning objectives for a course or program
  • formulate questions and assignments
  • establish assessments, essay topics, etc.
  • evaluate student discussions

Bloom’s Taxonomy can be a helpful guide in assessing how material will be presented and taught, and how a given pedagogical approach matches up with the learning outcomes that have been established.

Engaging Learners at all Levels

While much effort has been put into debating the most accurate ordering of performance skills, it’s best not to adhere slavishly to methods that require students to master lower-level thinking before engaging in higher-level thinking. As we learn, we often employ skills at multiple levels simultaneously, and a learner may skip from the first to the fifth level when encountering new information.

The fact that lower-level skills (such as recall) seem easier to teach and easier to test for has, in the past, led to poorly constructed educational resources that bore the learner with constant drilling before helping the learner engage in more thought-provoking and interesting applications of these skills. Scaffolding can help learners master less complex skills even while engaging them in real-world tasks.


Copyright © 2014 MindEdge, Inc.

Learning and Interactivity

People often wonder what, exactly, counts as interactive learning, assuming that interactivity has to be high-tech and draw upon complex graphics and expert coding techniques. In reality, an interactive course may be high-tech, but interactive learning can describe any course for which the development team considers and plans interactions between the learner and content, instructor, and other learners rather than just presenting content with the hope that the learner will absorb it passively.
We know that learners must actively construct their knowledge by converting new information and new experiences into learning. They need to be engaged in their learning to facilitate making connections, and that engagement happens through interactions available to the learner.
There are three broad modes of interaction in education:

  • Learner-content interaction. The learner’s interaction with content such as course readings, videos, activities, and games.
  • Learner-instructor interaction. The learner’s interaction with the instructor, which may include written feedback, face-to-face presentation, and meetings conducted in person or via voice or video conferencing features.
  • Learner-learner interaction. The learner’s interaction with other learners, which might include discussion board assignments, peer review of produced work, or official or unofficial group study sessions.

Planning for these interactions makes it more likely that the course you design will be effective. In fact, in Distance Education: A Systems View, Michael Moore and Greg Kearsley (2005) suggest that one of the most common reasons that distance education courses fail is that an inordinate amount of attention is placed on the presentation of information rather than on the cultivation of interaction between the learner and the course:

“Whether the primary communication medium is online or print, audio or videotape recordings, broadcasts or teleconferences, there is often an imbalance between the time and effort devoted to experts’ presentation of information and the arrangements made for the learner to interact with the content thus presented, and the instructor-learner interaction and learner-learner interaction that we have discussed. Simply making a video presentation or putting lecture material on a Web site is no more teaching than it would be to send the students a book through the mail.” (145)

To ensure that learners are learning deeply and actively, course developers and instructors need to work together to create courses with attention to the interactions we are asking of the learner. Best practice calls for an authentic “back-and-forth” with the learner, an approach which encourages active learning.


Copyright © 2013 MindEdge, Inc.

About the National Institute for Online Learning


Last week MindEdge Online added two courses on online learning from the National Institute of Online Learning (NIOL). It was a milestone for the Institute, which seeks to improve the quality and effectiveness of online learning, especially for adult learners, by promoting best practices and innovation in the field.

We founded NIOL last year for several reasons. First, MindEdge Learning had accumulated knowledge and expertise that we believed would be helpful for those involved in online education. MindEdge has developed effective online courses and simulations used by hundreds of thousands of students in higher education and the private sector. The Institute seemed an appropriate vehicle for that transfer of learning.

Second, in working with partners and new entrants to the field, we encountered something of a gap between theory and practice: some of those tasked with designing and creating online courses did not have prior grounding in learning theory or much exposure to the technology involved. We think the Institute can help educate those who want a deeper background in online learning.

Third, we wanted a place where those interested in educating adults would be able to find resources. While the recent emphasis on MOOCs and undergraduate online education is promising, we believe the challenges of designing and creating online courses and simulations for adults continue to deserve focused attention.

For those reasons, and others, we decided it was time for NIOL. What can you expect from the Institute in the near future? NIOL will be focused on training, education, consulting, and advocacy.

The Institute will offer additional courses focused on various aspects of online learning, including instructional design, course development, and key technologies. By the end of 2013, learners will have the opportunity to earn NIOL’s Online Learning Fundamentals Certificate, awarded for the successful completion of the Institute’s twelve introductory courses.

NIOL will also release occasional white papers focused on relevant learning topics (including narrative and adaptive learning) and will host webinars on best practices in course and simulation design and on technology issues.

The Institute will also be establishing an advisory board of academics, practitioners, educators, and others interested in online learning to help us keep NIOL abreast of the latest developments in the field.

You can learn more about the Institute at the NIOL website, or you can contact me directly at MindEdge Learning (info@mindedge.com) with any questions or suggestions.


Jefferson Flanders, an author and educator, is president of MindEdge. He has taught at the Arthur L. Carter Journalism Institute at New York University, Babson College, and Boston University.
Copyright © 2013 Jefferson Flanders

The Benefits of Instructional Scaffolding

Often, students struggle to understand new or difficult concepts and tasks on their own. Instructional scaffolding is a technique that incrementally guides students through these tasks by providing temporary support until the student is able to operate independently.

To prevent learners from becoming dependent on instructor assistance, successful scaffolding should be broken down into three stages: planning, execution, and fading. These stages create an arch-like process by gradually increasing instructor assistance and then gradually removing assistance. We’ve found this technique conforms nicely to the flow of online course assignments and helps course developers focus on what students need to learn rather than just what information needs to be presented to them.

The Stages

In the planning stage, break down the concept or task into manageable pieces or segments and consider how you can support the completion of the task. As students learn more and gain more experience, they require less assistance, so you should gradually remove the support to encourage the student to operate independently.

As you decide what support to provide at what time, you’ll also need to consider what the student already knows or is capable of and what kind of support they’ll need. In live teaching sessions, you can use polling or live-quiz techniques to assess what your students know already. In an online course, you’ll need to do plenty of research on your learners to figure out what they are likely to know. Then start by providing a bit more scaffolding than you think they will need. Advanced learners will find the first part of the lesson easy, but they’ll likely be satisfied with the extra practice. Just be sure that you don’t provide too little support for learners who actually need it. So start easy.

There are several ways to provide instructional support during execution. Popular methods include modeling the task or activity for the learner and providing annotations that explain each step in the process.

When it’s time to offer less support, allow the learner to complete more of the task, but provide opportunities for coaching. The online medium allows you to provide on-demand coaching in the form of pop-up explanations that display only when the learner asks for them. An interactive technique such as this one is beneficial when students come to an online course with varying levels of background knowledge or proficiency. As an added benefit, students tend to retain information better and grasp concepts more quickly when they can actively engage with the material.
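A minimal sketch of that on-demand pattern (the task and hints here are hypothetical, not drawn from a MindEdge course): the support exists up front, but nothing is shown until the learner asks for it.

```python
# Illustrative on-demand coaching: hints are prepared in advance but revealed
# only when the learner requests them. Task text and hints are hypothetical.
class HintedTask:
    def __init__(self, prompt, hints):
        self.prompt = prompt
        self._hints = list(hints)
        self._shown = 0

    def next_hint(self):
        """Reveal one more hint, or prompt independent work when hints run out."""
        if self._shown < len(self._hints):
            hint = self._hints[self._shown]
            self._shown += 1
            return hint
        return "No more hints; try completing the step on your own."

task = HintedTask(
    "Rank the risks from most to least severe.",
    ["Start by rating probability and impact for each risk.",
     "Combine the two ratings into a severity score, then sort by it."],
)
print(task.prompt)
print(task.next_hint())  # shown only because the learner asked
```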

Real World Applications

One final advantage of using instructional scaffolding in online learning courses is that scaffolded exercises let learners apply themselves to real-world work even when they are not yet practiced enough in the underlying skills to work such a difficult problem on their own.

For example, in MindEdge’s Project Risk Management: PMI-RMP® Exam Prep, we ask learners to work with a case study in which they create a list of risks ranked in order from most severe to least severe.

The process is a bit too complicated to complete without scaffolding. The learner must consider each risk and assess its probability of occurrence and its impact in order to determine its severity. Based on that severity, the risks must then be ranked in order, and resolutions must be documented. This is too much for a novice to do without help. So we break the process into a few steps.
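For readers who like to see the logic spelled out, here is a stripped-down sketch of the probability-and-impact-to-severity step followed by the ranking step; the scoring matrix and the sample risks are hypothetical, not the actual case-study content.

```python
# Illustrative sketch of the severity scoring and ranking the case study asks for.
# The level weights and example risks are hypothetical.
LEVELS = {"low": 1, "medium": 2, "high": 3}

def severity(probability, impact):
    """Combine probability and impact ratings into a single severity score."""
    return LEVELS[probability] * LEVELS[impact]

risks = [
    {"risk": "Supplier delay",    "probability": "high",   "impact": "medium"},
    {"risk": "Key staff departs", "probability": "low",    "impact": "high"},
    {"risk": "Scope creep",       "probability": "medium", "impact": "medium"},
]

# Rank from most to least severe, as the learner is asked to do.
ranked = sorted(risks, key=lambda r: severity(r["probability"], r["impact"]), reverse=True)
for r in ranked:
    print(r["risk"], severity(r["probability"], r["impact"]))
```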

Below, you can see an activity in which the learner determines whether there is low, medium, or high probability and low, medium, or high impact. Once a learner chooses the right answer for each category, the resultant severity is confirmed in the right column.


 

After completing this activity, the learner then sees the information provided in a static chart, which he or she must use to rank risks.


As you can see above, the learner isn’t asked to exercise more than one skill at a time. He or she doesn’t have to remember how to format a risk register. The template provides headers, and instructions appear above each step to make sure the learner knows how to use the information provided and how to input answers.

No single type of instruction will ever be sufficient to maximize student learning or retention. And instructional scaffolding poses challenges of its own: it’s not always possible to accurately determine what a learner is already capable of, what kind of support they’ll need, and when to let them try things independently. But scaffolding gives instructors an opportunity to guide students to the point where they can understand tasks and concepts on their own, and that makes it a valuable tool in the instructor’s toolbox.

Copyright © 2012 MindEdge, Inc.

Meeting the quality challenge in online learning

The question of whether online learning represents an effective way to educate and train has been answered. It’s clear that e-learning works for both institutions of higher learning and for corporations and other organizations by providing a convenient alternative, or supplement, to face-to-face learning.

Now the challenge has become one of quality. It’s time for a renewed focus on upgrading the quality of online learning. Some schools and organizations relied on instructors and self-taught course designers for their initial online offerings to students. These do-it-yourself (DIY) solutions often failed to leverage the strengths of the online medium for engagement, interactivity, and focused instruction.

Students have noticed. A report from Eduventures, a Boston-based research and consulting firm, has found that student interest in a virtual academic experience has plateaued, and that quality concerns are part of that leveling off of interest.

Eduventures reported that there has been “only a small bump over the last six years in the percentage of adult students who said online college is equal in quality to campus learning,” according to insidehighered.com.

The report also notes the growing competitiveness of the online education market with the arrival of massive open online courses (MOOCs) and new, innovative venture-backed entrants. Learners have many more options today—from both academic and non-academic players.

The quality challenge

We’re convinced that future success in online learning means developing courses and simulations that students will see as clearly better in quality than current offerings. That’s the quality challenge those of us creating online courses and simulations need to address.

Courses need to engage and encourage critical thinking (where we think narrative learning is key). They should make use of video, interactivity, and cloud-based tools, and empower instructors to coach and guide (rather than simply present). They should address different learning styles, and be accessible to all. Their content should flex to the new smartphone and mobile devices, and should incorporate external resources whenever appropriate.

When we design and develop online learning at MindEdge, we keep these factors in mind. We also recognize that a continuous improvement process is vital because the technology and platforms learners use remain in a state of flux. For example, our new Online College Courses (OCC) have been designed to fit on the smaller screens of the next wave of smartphones and mobile devices, and elements (games, exercises) in these courses will automatically adapt to a more limited viewing canvas.

Raising the quality level of online learning isn’t a one-time effort. It means listening to learners, our academic and corporate partners, and focusing on what works and what enhancements we should make. The educator John W. Gardner once said: “Excellence is doing ordinary things extraordinarily well.” We agree. So we’ll meet the challenge of quality by doing our best to make the ordinary, extraordinary.


Jefferson Flanders is president of MindEdge. He has taught at the Arthur L. Carter Journalism Institute at New York University, Babson College, and Boston University.
Copyright © 2012 Jefferson Flanders

Variety and online learning

Variety, the old saying goes, is the spice of life—it’s also crucial in keeping online learners alert and engaged.

At MindEdge, we see incorporating a variety of learning approaches (including video, text, interactive exercises, flipbook presentations, writing-to-learn exercises, mini-cases, simulations, book excerpts, etc.) as a best practice for learner-centered instruction and education.

Here are four specific benefits of focusing on variety:

  • Variety offers multiple entry points for the learner.

    Learners who quickly scan the assignments in an online course may enter the course with an assignment or activity that appeals to them. Some learners proceed sequentially but others skip around. Some may even start with quizzes or tests as a way of gauging their level of knowledge. Variety is of value to all of these learners.

  • Novelty helps keep the learner engaged.

    Research shows that learners are more likely to lose focus after 15 minutes or so of concentrating on new content (which is one reason why MindEdge looks to keep its video segments short). Variety breaks up the pace of learning; changing the way content is presented can spark fresh interest.

  • Learning variety can better illuminate difficult content.

    Not everyone processes information in the same way. Some learners find video presentations help them master challenging material—others prefer text, some are most comfortable with visual aids. A simple graph or chart can often elicit that “aha” moment. Consider the following graphic about Six Sigma from MindEdge’s “Quality Management Basics” course:


    Six Sigma Bell Curve

    The chart makes clear that processes operating at a “Six Sigma” level are 99.9997% defect-free (only 3.4 defects per million process outputs), and many learners will grasp that point more readily from the chart than from prose (a short calculation after this list shows where the 3.4-per-million figure comes from).

  • Different learning methods reinforce different learning goals.

    Learners benefit when different learning approaches align with outcomes: writing-to-learn exercises, for example, are an effective way to help students in synthesizing what they have learned. Learning games can make mastering definitions or concepts enjoyable rather than tedious. Narrative learning (case studies, simulations) connects with learners in ways that non-narrative presentations don’t.
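For the curious, the 3.4-defects-per-million figure cited in the Six Sigma example above can be checked in a few lines. It rests on the conventional Six Sigma assumption of a 1.5-sigma long-term process shift, so the defect rate corresponds to the normal-distribution tail beyond 6.0 - 1.5 = 4.5 standard deviations (scipy is used here only for that tail probability):

```python
# Check of the "3.4 defects per million" figure used in Six Sigma practice.
# Convention: a 1.5-sigma long-term shift, so defects fall in the one-sided tail
# beyond 6.0 - 1.5 = 4.5 standard deviations of a normal distribution.
from scipy.stats import norm

shifted_sigma_level = 6.0 - 1.5
defects_per_million = norm.sf(shifted_sigma_level) * 1_000_000
print(round(defects_per_million, 1))                           # -> 3.4
print(f"{1 - norm.sf(shifted_sigma_level):.5%} defect-free")   # -> 99.99966% defect-free
```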

To make sure that courses, simulations, and other learning resources provide varied approaches to learning, it’s important to plan ahead during the content development process. Instructional designers should consider the sequence and pacing of the learning. They should look for opportunities to introduce new and different learning elements. The pay-off will come in the form of learning that engages, entertains, and informs.


Copyright © 2012 MindEdge, Inc.

Anchored instruction and narrative learning

When we develop self-paced online learning we try to look at how people actually learn in their work environment and model our pedagogical approach based on that reality.
A large amount of learning occurs just-in-time. Many of us wait until we actually need a skill or technique before we learn it and apply it to solving a specific problem. For example, a marketing analyst may not master Excel macros until he or she needs them for a particularly complex spreadsheet. Or a computer programmer may not take a formal course in a new language or library (such as jQuery) but instead learn it piecemeal from books or online tutorials on an as-needed basis. This makes sense, after all: it often represents an efficient use of the most valuable resource we have—time.
When we emulate this pattern of learning it leads us naturally to embedding instruction in our narrative learning—what academics call anchored instruction.
How does this work in practice? When we develop an online learning resource we review the learning outcomes first and consider where specific skills or concepts can be integrated into the narrative environment. This is better illustrated through a real-life example.
Example: Anchored Instruction in a Simulation
When we developed a simulation focused on sustainable management, we knew that we wanted learners to use techniques for calculating return-on-investment (ROI) on competing projects that would improve sustainability.
In our “Taking the Helm at Coastal Industries Simulation,” this meant anchoring instruction at a decision point where Coastal Industries, a company that manufactures transformers, is considering three competing levels of energy conservation retrofits for its manufacturing plants. Learners are asked to figure out which of the three retrofit investments (Options A, B, and C) will yield the highest ROI. To prepare learners to conduct this analysis, we provide background on ROI techniques and give an example of how ROI works.
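The ROI arithmetic itself is simple, which is part of why it anchors well in a narrative decision point. The sketch below uses hypothetical figures (not the data from the Coastal Industries simulation) purely to illustrate the comparison learners make across Options A, B, and C.

```python
# Illustrative ROI comparison across three retrofit options.
# The cost and gain figures are hypothetical, not the simulation's data.
def roi(net_gain, cost):
    """Return on investment: net gain from the investment divided by its cost."""
    return net_gain / cost

options = {
    "Option A": {"cost": 100_000, "net_gain": 30_000},   # e.g. energy savings minus expenses
    "Option B": {"cost": 250_000, "net_gain": 90_000},
    "Option C": {"cost": 400_000, "net_gain": 120_000},
}

for name, o in options.items():
    print(f"{name}: ROI = {roi(o['net_gain'], o['cost']):.0%}")
# Option A: 30%, Option B: 36%, Option C: 30% -> Option B yields the highest ROI here.
```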
screenshot of simulation
Then learners are given the raw data in the format they are likely to encounter it in the real-world. The next step is for learners to employ a prepared worksheet (using the web-based Zoho tool) and calculate the various ROIs.
screenshot of Zoho sheets
Then learners choose one of three options based on this analysis. The simulation then reveals the correct ROI calculations, allowing learners to check their work and understand why a given decision is optimal based on the ROI results.
So learners have been asked to:

  • learn what ROI means and how it is calculated;
  • apply this knowledge to a real-world problem and calculate ROI for three competing projects;
  • make a decision based on their analysis and immediately see whether they have calculated ROI correctly and, consequently, made the optimal choice.

What makes this more than a stand-alone problem set is that this decision-making process is part of an ongoing narrative. Learners can see that making the correct “just-in-time” decision about ROI (as they would in the workplace) influences their aggregate score in the simulation, reflecting its impact on the company. They also see how a series of decisions over time (compressed in the simulation) can move an organization in a given direction.
We’ve found anchored instruction in narrative learning to be a powerful way to show learners the importance of applying tools and techniques in a real-world setting. They are more likely to grasp the significance of a given analytical approach or skill if they can envision its use in context and see how it is integrated into actual business circumstances.


Copyright © 2012 MindEdge, Inc.
More information on MindEdge’s Taking the Helm at Coastal Industries Simulation.

Best Practices: A Fading Approach to Worked Examples


Guest writer: Lindsey Collins Sudbury
MindEdge Senior Editor
We structure MindEdge courses using a whole-part-whole learning approach, a model that works well for adult learners. Within that broad whole-part-whole framework, we use different instructional techniques. For teaching specific parts, where concrete skills or knowledge is being transferred to the learner, we often turn to the fading approach to worked examples.
Instructional designers should not underestimate the importance of examples. Learners learn by example—oftentimes, whether for reasons of time, personal learning preference, or lack of confidence with the language, learners skip the instructional content and head straight for the examples. In fact, teaching through worked examples (example problems that provide concurrent instructions for solving them at each step) has been found to be more effective than opening up a problem for learners to solve on their own, even when they have received previous instruction (Nievelstein et al. 2010; Van Gog et al. 2006).

Why does it work?

Showing worked examples before turning learners loose to apply instructional content to a problem reduces unnecessary load on working memory and discourages the learner from filling in imagined—but incorrect or incomplete—guesses about how the process works (Cooper et al. 2001; Ginns et al. 2003) or skimming the steps rather than steeping themselves in the details necessary for learning. “Fading out” support for completion of these examples so that examples slowly become exercises or problems for the learner to solve independently is like offering training wheels. When you use the fading approach with worked examples, learners can learn by doing and understand the whole process before being asked to exercise a skill with insufficient help.
Remember to start out with the explanation of the whole concept, and ensure that the details provided in the worked examples aren’t too overwhelming for your learners. Before teaching any skill in detail, learners will usually want an overview that answers some of the main journalist’s questions: what the skill is, why it is necessary, and how—generally—it works. Answering these questions usually provides enough schema to jump into worked examples.

An implementation

We recently completed a project for a university in which we relied heavily on the fading approach to teach the skill of paraphrasing.
Paraphrasing is a skill that we take for granted, but teaching it to someone unfamiliar with it threatens overload at every step. To paraphrase properly, the learner must read a paragraph from a research article, look up any unfamiliar words, rephrase the paragraph in his or her own words (being careful not to introduce any personal opinions or biases), and cite the original source using proper citation conventions.
We chose examples from the four fields of our target learners in this multi-disciplinary course, but even so, only a quarter of the example reading passages are likely to come from a field familiar to any given learner. Therefore, even providing examples of paraphrases in the context of research writing can be tricky, because the content of the example itself can easily add mental strain unnecessary to learning a reliable process for paraphrasing.
Our solution was to use a fading approach to worked examples until the point in the lesson where most learners should be comfortable enough with the process to apply it to new content—and then to content that they may submit for grading.
The first worked example uses a judicious amount of clicking to draw attention to important components of a paraphrase. Annotations explain the thinking process for creating the paraphrase example and, in turn, structure the learner’s subsequent self-explanations about the process:
Annotated Example
You’ll notice that the learner is asked to do very little other than click to see how the example provides a model of the skill. As the learner builds a better mental model of the paraphrasing process, the worked examples begin to ask the learner to participate more in the process by identifying which of three sample paraphrases offers the best response to the task of paraphrasing a given passage.
To further fade support for task completion and allow the learner to apply his or her understanding of the concept to new situations, the next set of exercises asks the learner to identify which parts of erroneous paraphrase responses do not meet task requirements and then rewrite the paraphrase, eliminating the errors.
Step 2 asks learners to fix a negative response
Exercises that provide negative responses (poor models of the completed task) for the learner to correct have been shown to help learners develop self-explanation and self-monitoring skills that become helpful as the learner gains more independence. The use of negative responses has also been shown to lead to longer retention, as long as learners have developed the schema to support such explanations (Große and Renkl, 2007). So make sure that your examples offer plenty of annotations and feedback to support learning. We ensure that, as with any worked example, when the activity involves a negative response, the learner is ultimately provided with specific information about what is wrong with the negative response and with a positive model for completion.
Feedback is detailed and offers a correct solution.
Only after a sufficient number of examples and exercises with fading support is the learner asked to take on all of the steps of paraphrasing independently and generate his or her own paraphrase without support to structure the process. Still, however, the suggested response offers a sample solution and discusses any complex patterns of thinking necessary for understanding the paraphrase process used for that example.
After the learner completes as many of these practice examples and exercises as he or she wishes—and is offered enough feedback so that he or she understands how the process may be applied to any content—the final exercise walks the learner through the paraphrase submission task. For this task, which will be submitted for grading, the learner paraphrases a paragraph from an article that he or she is likely to use in his or her research paper. The fading approach has sufficiently modeled the processes necessary for completing this task, and the learner is now applying the process to his or her own work.

A sample worked example/exercise sequence

Item: Overview that answers what, why, and how
Purpose: Provides motivation for learning; places the skill in context.

Item: Example with annotations
Purpose: Offers more structure than any other example/exercise in the sequence. Helps the learner understand the goal of the tasks without resorting to forming an incorrect or incomplete understanding of the process.

Item: Example of a completed submission task with three different potential paraphrases. The learner is asked to review the possible answers and choose one. Feedback explains which paraphrase best fits task requirements and discusses the proficiencies and deficiencies of each choice.
Purpose: Allows the learner to understand the scope and requirements of the process being taught without having to complete the process himself or herself.

Item: Exercise that offers a negative example. The learner is asked to explain where it falls short and correct those aspects. Feedback reinforces the salient aspects of the process.
Purpose: Offers structure without asking the learner to independently complete the task before he or she is ready.

Item: Exercise that asks the learner to create a paraphrase independently. Feedback provides the correct answer and discusses the more complicated decisions that were necessary for completing the task.
Purpose: Helps the learner begin to independently apply skills to new situations.

Item: Exercise that walks the learner through completing a task for submission
Purpose: Structures completion of the task while letting the learner write independently. This exercise later acts as a reference or job aid to help the learner complete the task when he or she is called upon to paraphrase in the future.

Conclusion

We were faced with teaching a complex process that was bound to cause unnecessary strain to students—several of the steps of the paraphrase process, even when taught alone, are likely to be taxing on working memory. To help learners create a complete and correct model of the process in their minds, we decided to teach by example. We used a fading approach for worked examples: we began by showing complete examples with annotations alongside (which offered the most support), and we faded out to examples and exercises that allowed the learner to complete the paraphrase process more and more independently. Key to our success was careful annotations that helped learners understand what made for a successful paraphrase, followed by exercises and examples that provided extensive feedback alongside and after learner efforts.

Primary resources

Van Gog and Rummel offer a fantastic literature review of research on worked examples:
Van Gog, T., & Rummel, N. (2010). Example-based learning: Integrating cognitive and social-cognitive research perspectives. Educational Psychology Review, 22, 155–174.

Other helpful resources

Cooper, G., Tindall-Ford, S., Chandler, P., & Sweller, J. (2001). Learning by imagining. Journal of Experimental Psychology: Applied, 7, 68–82.
Ginns, P., Chandler, P., & Sweller, J. (2003). When imagining information is effective. Contemporary Educational Psychology, 28, 229–251.
Große, C. S., & Renkl, A. (2007). Finding and fixing errors in worked examples: Can this foster learning outcomes? Learning and Instruction, 17, 612–634.
Nievelstein, F., Van Gog, T., Van Dijck, G., & Boshuizen, H. P. A. (2010). The worked example and expertise reversal effect in less structured tasks: Learning to reason about legal cases. Manuscript submitted for publication.
Van Gog, T., Paas, F., & Van Merriënboer, J. J. G. (2006). Effects of process-oriented worked examples on troubleshooting transfer performance. Learning and Instruction, 16, 154–164.


Copyright © 2011 Lindsey Collins Sudbury