Instructional designers often strive to develop training material that is concise and easily digested by the target learners. They also strive to create assessments and questions that are valid, clear, and direct. After all, it’s best if the learner can focus on the learning event rather than on trying to interpret and decipher the meaning of the content.
At least, that’s a commonly held belief in training circles.
The reality is that content and assessments are often so clear and so clean that the learner’s brain coasts by on cruise control, without engaging the material.
Consider the following question:
If it takes 5 machines 5 minutes to make 5 widgets, how long will it take 100 machines to make 100 widgets?
100 minutes or 5 minutes?
This question is part of Shane Frederick’s Cognitive Reflection Test (CRT), which measures a person’s tendency to override an intuitive but incorrect answer with deliberate reflection. The correct answer is 5 minutes. Each machine takes 5 minutes to make one widget, so 100 machines would make 100 widgets in that same time period.
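The reasoning behind the widget answer can be checked with a quick sketch. This is just an illustration of the arithmetic, assuming each machine works independently at one widget per 5 minutes (the function name and parameters are invented for this example):

```python
def minutes_needed(machines, widgets, minutes_per_widget=5):
    # Each machine independently produces one widget every
    # minutes_per_widget minutes, so total output scales with machines.
    rate = machines / minutes_per_widget  # widgets produced per minute
    return widgets / rate

print(minutes_needed(5, 5))      # → 5.0
print(minutes_needed(100, 100))  # → 5.0  (not 100!)
```

Because the number of machines and the number of widgets scale together, the time stays constant at 5 minutes.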
The Brain Wants to Take the Easy Route
The human brain, however, strives to find quick and easy answers and connections with the least amount of cognitive effort. When the CRT, which includes the previous question and two others of similar design, was given to a group of Princeton students, 90% of the students got at least one of the questions wrong. The answers don’t require any higher-level math or problem-solving skills – they just require a minimum amount of logical reflection to rule out an immediate, intuitive, yet incorrect answer.
Now, consider this question:
In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?
24 days or 47 days?
This question is also part of the CRT, and the correct answer is 47. If the patch doubles in size each day and takes 48 days to cover the lake, then it covered half of the lake on the 47th day. Again, the question doesn’t require higher-level knowledge or skill, but there is an intuitive, incorrect answer that the brain wants to accept as correct with minimal evaluation.
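The doubling logic can also be verified with a small sketch. This simply restates the reasoning above in code, assuming the patch covers the full lake on day 48 (the function name is invented for this example):

```python
def coverage(day, full_day=48):
    # The patch doubles every day and covers the whole lake on full_day,
    # so stepping back one day halves the coverage. On any given day the
    # patch therefore covers 2 ** (day - full_day) of the lake.
    return 2 ** (day - full_day)

print(coverage(48))  # → 1 (the whole lake)
print(coverage(47))  # → 0.5 (half the lake, one doubling earlier)
print(coverage(24))  # → a tiny fraction, nowhere near half
```

Day 24, the intuitive answer, leaves the patch covering only about one sixteen-millionth of the lake.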
No Strain, No Gain
The fascinating part of the Princeton CRT testing, however, isn’t the questions. It’s how the presentation of the questions changed the results. 90% of the students got at least one of the three questions wrong when the CRT was presented in a regular, clear font. However, when the CRT was presented in a light gray, difficult-to-read font, the percentage of wrong answers dropped to 35%.
The difficult-to-read font resulted in more correct answers!
The added element of cognitive strain led the test-takers to apply greater effort in determining the answers and, more often, to reject incorrect or flawed intuition.
Adding a Few Potholes
Of course, this doesn’t mean that instructional designers should create all of their materials in light gray font. However, it does offer an important insight into training effectiveness.
By striving to create the smoothest, most efficient path to a learning outcome, we may actually decrease the success of a learning event. The smooth path offers the least resistance to the human brain’s assumptions, intuitions, and biases. As a result, it’s possible – or even probable – that important aspects of the training will be interpreted and applied incorrectly thanks to the brain’s automatic reliance on intuition and reluctance to engage situations that appear to have an easy answer.
The challenge in applying this concept, however, is that the most common method of mental application in eLearning is a clearly defined test and assessment. Simply adding a test question or two is neither sufficient nor effective. Instead, consider adding mental strain via these integrated, in-line learning techniques to achieve better learning results.
Strain The Brain
1. Add an Application Scenario
Rather than using explicit questions to test a learner’s knowledge retention, transition to an application scenario framed in the context of a case study or real-world business problem. One technique for creating cognitive strain is to include previously discussed elements alongside elements that are coming up in the next section of learning. The unknown content in the scenario will slow down the learner and set up a strong introduction to the content in the next section.
2. Add A Game
Games can be woven into the learning experience and framed so that they are seamlessly integrated with the course content. Virtually any scenario can be turned into a game if a system of performance rewards or goals is created in a fun context. Even simple scenarios can be turned into role-play games that, perhaps, contain elements spanning the entire course content.
3. Create a Discovery Activity
Discovery activities are designed so that learners can explore content with which they are not already familiar. Well-designed discovery activities present content in a sequence that produces “Aha!” moments of realization and then allow the learner to reflect on the learning that has occurred.
Of course, there are many other ways to create cognitive strain in eLearning, but simply placing a few strategic potholes on the learning pathway may be all that’s needed to bump learners out of cruise control and engage their higher-level cognitive processing. From a client perspective, it’s the difference between successful human change and a failed training program. From a learner’s perspective, it’s the difference between an engaging learning experience and a mind-numbing exercise.
For those of you who won’t rest until you know the third question in the CRT, here it is:
A bat and a ball cost $1.10 in total. The bat costs one dollar more than the ball. How much does the ball cost?
Leave your answer in the comments.