Computer tutors that can read students’ emotions
By Annie Murphy Paul
Human tutors — teachers who work closely with students, one on one — are unrivaled in their ability to promote deep and lasting learning. Education researchers have known this for more than 30 years, but until recently they haven’t paid much attention to one important reason why tutoring is so effective: the management of emotion. Studies show that tutors spend about half their time dealing with pupils’ feelings about what and how they’re learning. Now the designers of computerized tutoring systems are beginning to make sensing and responding to emotions a key part of the process, and they’re finding that users learn more as a result. At the same time, researchers are using the data generated by these programs to make new discoveries about emotion and its central role in learning.
One such discovery is that the feelings that dominate psychology’s conventional theories of emotion — such as psychologist Paul Ekman’s six “basic emotions” of anger, disgust, fear, joy, sadness and surprise — are not, by and large, the feelings that are involved in learning. In educational settings, it’s the “academic emotions” that occur most frequently: curiosity, delight, flow, engagement, confusion, frustration and boredom.
Researchers have found ingenious ways to identify these emotions in students; for example, the Posture Analysis Seat. This is a chair equipped with pressure sensors on its seat and back, allowing it to monitor the way learners are arranging their bodies. A student leaning forward is likely exhibiting interest and engagement; a student lolling back is apt to be bored or disengaged.
Then there’s the Pressure Mouse, a computer mouse that can detect how much pressure a user applies when clicking. Researchers have manipulated the level of frustration users feel (by employing a “frustration-inducing online application form,” of course) and have found that the more vexed users become, the greater the pressure they exert on the mouse.
Wireless skin conductance sensors collect another type of information about emotion. These small devices attach to the learner’s hand or arm and monitor nervous system arousal, which can be positive (excitement and curiosity) or negative (anxiety and frustration).
Cameras, too, may be trained on students as they learn on computers. One type of camera records and analyzes facial expressions: eyes opening wide in expectation or squinting in close attention, eyebrows knitting in concentration or rising in surprise. Another kind of camera can track head movements, following students’ gaze on the screen and noting any head shakes or nods. An eye tracker can monitor pupillary response (pupils dilate when learners feel interested), and a microphone can permit an analysis of the pitch and amplitude of learners’ voices.
An “affect-sensitive” computer program might use several of these sensors to collect information about the learner’s emotional state; an algorithm then sorts through the data flowing in from each channel and offers its best guess about whether the learner is feeling interested, bored, confused or frustrated. One computerized tutoring program uses “mind-reader software” to identify 22 facial feature points, 12 facial expressions and six mental states.
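To make the idea concrete, here is a minimal sketch of that kind of multi-channel guesswork. The sensor channels are the ones described above (posture seat, pressure mouse, skin conductance), but the numeric ranges, thresholds, and decision rules are invented for illustration; real systems use trained statistical models rather than hand-written rules like these.

```python
# A toy affect detector fusing three hypothetical sensor channels.
# All readings are normalized 0.0-1.0; thresholds are illustrative only.

def classify_affect(lean_forward, mouse_pressure, skin_conductance):
    """Guess a learner's state from three sensor channels.

    lean_forward:     posture-seat reading, 0.0 (slumped back) to 1.0 (leaning in)
    mouse_pressure:   click pressure, 0.0 (light) to 1.0 (very hard)
    skin_conductance: nervous-system arousal, 0.0 (calm) to 1.0 (high)
    """
    if mouse_pressure > 0.7 and skin_conductance > 0.6:
        return "frustrated"   # hard clicks plus high arousal
    if lean_forward < 0.3 and skin_conductance < 0.3:
        return "bored"        # slumped back and unaroused
    if lean_forward > 0.6 and skin_conductance > 0.4:
        return "engaged"      # leaning in, moderately aroused
    return "confused"         # mixed signals: default to uncertainty

print(classify_affect(0.8, 0.2, 0.5))  # engaged
print(classify_affect(0.5, 0.9, 0.8))  # frustrated
```

The point of the sketch is the architecture, not the thresholds: each channel contributes weak evidence, and only the combination yields a usable guess about the learner's state.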
But accurately detecting the learners’ feelings is only the first step. The computer then has to respond to such feelings in a way that promotes learning. A computerized tutoring program called Wayang Outpost, developed by researchers at the University of Massachusetts Amherst, features an onscreen avatar that subtly mirrors the emotions the learner is feeling. When the learner smiles, the avatar smiles too, making the learner feel understood and supported. When the learner expresses negative feelings, the avatar mirrors the facial expression of, say, frustration, and offers verbal reassurance: “Sometimes I get frustrated when solving these math problems.” Then — in a shift that researchers have found to be essential — the avatar pivots toward the positive. “On the other hand,” the avatar might add, “more important than getting the problem right is putting in the effort and keeping in mind that we can all do math if we try.”
Researchers carefully consider the wording of these messages — using them, for example, to promote a “growth mindset,” or the notion that ability is not fixed and can expand with effort. The messages delivered by the Affective AutoTutor, a computerized tutor developed by Sidney D’Mello of the University of Notre Dame and his colleagues, always attribute the source of the learners’ emotions to the material being studied, not to a deficiency in the learners themselves. If the learner seems bored, for example, the AutoTutor might respond with the comment, “This stuff can be kind of dull sometimes, so I’m gonna try and help you get through it. Let’s go.” If the AutoTutor senses that the learner is confused, it might advise, “Some of this material can be confusing. Just keep going and I am sure you will get it.”
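The response strategy described above can be reduced to a simple dispatch: map each detected negative state to a message that attributes the feeling to the material, then fall back to staying out of the way when the learner is doing fine. The bored and confused messages below are the AutoTutor lines quoted in the text; the dispatch structure and the frustrated and default messages are illustrative, not taken from any actual system.

```python
# A sketch of emotion-to-message dispatch. Messages for "bored" and
# "confused" are quoted from the Affective AutoTutor as described above;
# the rest of the wording is hypothetical.

RESPONSES = {
    "bored": ("This stuff can be kind of dull sometimes, so I'm gonna "
              "try and help you get through it. Let's go."),
    "confused": ("Some of this material can be confusing. Just keep "
                 "going and I am sure you will get it."),
    "frustrated": ("These problems can be tough. Putting in the effort "
                   "matters more than getting it right the first time."),
}

def respond(emotion):
    # Note: each message blames the material, never the learner.
    # For positive states (engaged, curious), offer light encouragement.
    return RESPONSES.get(emotion, "Nice work. Keep going!")

print(respond("bored"))
```

The design choice worth noticing is that the lookup table encodes a pedagogical policy: every negative-emotion message locates the difficulty in the material, which is what keeps the learner's self-belief intact.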
The many learning sessions scientists have run in their laboratories (and the lab is where most of this emerging technology still resides) have produced reams of data that can be analyzed for clues about the role emotion plays in learning — evidence that may then be integrated with findings collected from more conventional studies. Researchers working with affect-sensitive computers have confirmed that negative emotions like anxiety and frustration can consume cognitive resources, leaving fewer resources to devote to the learning task. Positive emotions like curiosity and surprise, by contrast, tend to improve performance on the learning task. Positive emotions promote the adoption of “mastery goals” — wanting to learn information for its own sake — while negative emotions promote the adoption of “performance goals” — wanting simply to get a good grade or test score. Positive feelings lead to flexible, creative and holistic ways of solving problems, while negative feelings lead to focused, detail-oriented and analytical ways of thinking.
Notwithstanding the cognitive benefits of positive emotion, researchers in affective computing also find that deep learning must always involve a fair amount of negative emotion, concentrated in the phase in which students are struggling mightily to grasp new ways of thinking. In fact, students show the lowest levels of enjoyment during learning under the conditions in which they learn the most, and the feeling of confusion turns out to be the best predictor of learning.
Patterns of negative and positive feelings tend to follow a predictable progression, in which students feel worst when they’re in the throes of “cognitive disequilibrium,” or a state of unresolved confusion, and then start to feel better as the material becomes more comprehensible. For learners who experience repeated failures to make headway, however, confusion transitions into frustration, which in turn results in disengagement and boredom (and ultimately, minimal learning).
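That progression reads naturally as a small state machine: an impasse pushes an engaged learner into confusion; resolving the impasse restores engagement; repeated unresolved impasses slide the learner through frustration into boredom. The states and transition table below are a sketch of the article's description, not a published model.

```python
# A toy state machine for the learning-emotion progression described above.
# Events: "impasse" (the learner hits unresolved confusion) and
# "resolved" (the material becomes comprehensible again).

TRANSITIONS = {
    ("engaged", "impasse"): "confused",
    ("confused", "resolved"): "engaged",
    ("confused", "impasse"): "frustrated",
    ("frustrated", "resolved"): "engaged",
    ("frustrated", "impasse"): "bored",
    ("bored", "resolved"): "engaged",
    ("bored", "impasse"): "bored",       # disengagement is sticky
}

def step(state, event):
    # Unknown (state, event) pairs leave the state unchanged.
    return TRANSITIONS.get((state, event), state)

# Three unresolved impasses in a row: engaged -> confused -> frustrated -> bored.
state = "engaged"
for event in ["impasse", "impasse", "impasse"]:
    state = step(state, event)
print(state)  # bored
```

Seen this way, a tutor's job is to intervene while the learner is still in the "confused" state, where the transition back to engagement is cheapest.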
Skilled human tutors likely arrived at these insights some time ago. Our computers are just now catching up to what good teachers have done forever: make students’ feelings part of the lesson.