In Brian Aldiss’ landmark short story “Supertoys Last All Summer Long,” a child whiles the day away with his sentient teddy bear. Since the story was published in 1969, the “boy and his robot” trope has become a staple of science fiction. According to new research, however, those relationships aren’t nearly so idyllic in real life.

During a six-month study at a Swedish primary school, students took lessons in urban geography from a robot tutor. As they played map-reading and city-planning games on an interactive touchscreen, the 10-to-12-year-olds received coaching from NAO-T14, SoftBank Robotics’ “interactive companion robot,” which was often the only “authority” in the room. If that sounds like a setup for the best school day ever (No teachers! Playing with robots!), here’s a transcript of one boy’s valiant struggle to operate the touchscreen:

ROBOT: Can you see this symbol?

Oliver moves over to the tool buttons and presses the measuring tool. Oliver deactivates the compass button and the measuring tool button. Oliver presses the measuring tool again. Oliver deactivates the measuring tool again. Oliver signals to the researcher by knocking on the door.

ROBOT: Is there really a tourist center there?

Researcher enters room.

OLIVER: How does the measuring tool work?

RESEARCHER: [briefly explains how to use the measuring tool]

Researcher leaves the room. Oliver looks at the screen and around the room for approximately 30 seconds, growing increasingly stressed. Oliver sits down in an armchair in the room and begins to cry.

ROBOT (perceives the emotional distress and tries to engage Oliver in small talk): What is your favorite subject? Mathematics is a good subject for robots to learn since it’s based on special rules. I only know geography though.

Oliver looks at the robot, but continues to cry.

I’m not sure what’s more tragicomic: the robot’s patronizing instruction (“Is there really a tourist center there?”) or the fact that its idea of small talk involves waxing lyrical about math. At any rate, the transcript is begging for adaptation into an indie short film.

Above: NAO-T14, getting ready to drop some knowledge.

Image Credit: © Volkova t a / Wikimedia Commons / CC-BY-SA-3.0 / GFDL

Oliver wasn’t the only one who had issues with his automated tutor. The study, published in Computers in Human Behavior, reports that “students had a hard time understanding what the robot said over the noise produced by the motors in its gesturing arms.” When they could comprehend what the robot was saying, its “monotone voice” tended to produce inattention. (My own students have cited at least one of these two issues on their evaluations, so I can sympathize.)

In all seriousness, the study offers a stark reminder of the need for teachers to assert humane control of smart technology as it evolves. While relatively few instructors will have NAO as a teaching assistant anytime soon, we do have an ever-expanding menu of apps, tablets, and software at our disposal. Contemporary education’s fixation on data pressures us to deploy these tools in the name of “engagement”: activities that, more often than not, frame education purely in terms of quantifiable metrics. This drive for maximal efficiency will only accelerate as artificial intelligence permeates our classrooms. Absent an empathetic teacher aware of technology’s depersonalizing power, sensitive students like Oliver will find themselves adrift in a sea of analytics dashboards and “mind-reading robo-tutors.” The whole episode reads as a parable about the dangers of leaving students to the care of unfeeling robots, especially given the proven benefits of teaching students emotional intelligence.

At the same time, it does little good to yearn for “dumb” boards and grading by candlelight. America’s chronic teacher shortage and the overcrowded classrooms that inevitably result from it make technology the ultimate frenemy of harried educators. We may hate the eternal need to learn operating instructions, but technology is also the easiest way to keep track of 70 students at a time. In any case, the presence (or absence) of technology isn’t the root issue. The study’s author concludes that malfunctioning technology, a “perceived lack of consistency and fairness,” and confusing instructions all produced “breakdowns that jeopardized children’s sense of agency.” In other words, what matters isn’t whether students are alone with an emotionally obtuse android, but whether they feel empowered in their quest for knowledge.

Instead of rejecting smart technology or ceding control to it, then, teachers need to incorporate it into pedagogy that highlights students’ agency. Sometimes this means building up to exercises that require sophisticated use of software. More often, it means a steady dose of old-fashioned humanity: modeling empathy, active listening, and other practices that treat students as whole persons. Teachers can also make use of apps that engage students’ feelings, tracking their emotions and guiding them through mindfulness exercises. Such practices can be time-consuming to incorporate, but they allow students to process difficulty or frustration, reminding them that failing to achieve a certain score isn’t a death sentence.

If pedagogy-as-data-transfer rules the day, the Swedish experiment offers a taste of what the future of educational technology might look like. Thankfully, a growing cohort of startups prioritizes “emotive computing,” developing technology that educators could one day use to customize coursework based on a student’s emotional response. By demonstrating demand for emotionally intelligent technology, teachers have the chance to steer the next 20 years of education. Conversely, if we merely use technology to convey information from screen to brain, the effect on our students may mirror the twist in “Supertoys Last All Summer Long”: the young protagonist turns out to be a robot himself.

Lucas Kwong is a professor of English at New York City College of Technology.
