Meet with Dan Donahoo: “Robots@School”


When “Robot University” was announced, journalist and digital education producer Dan Donahoo tweeted to me about a project he worked on: “Robots@School”.

Robots@School was a narrative-based research project to explore children’s expectations of robots and learning. Project Synthesis partnered with US-based research company Latitude and the LEGO Learning Institute to design a research framework and support the gathering and analysis of data worldwide.

The PDF report is on his Project Synthesis site. His research interested me because it shows that kids don’t hold negative views of robots. This affects my design: if some of the visitors have no negative bias, then they won’t be going through a transformation…

What if most of the visitors aren’t having their assumptions flipped? Am I aiming it at people who are against robots, or at people who are not? At the moment it is the former, but I have to design it so that it also works for people who do love robots. Generally speaking, who do I want to spend more time creating for? Easy: those who have already made the leap into inclusiveness. Is that preaching to the converted, then? No, it is making more of a world in which their views are reflected. Exclusiveness and intolerance are rampant. I want to see more inclusiveness everywhere, including in entertainment. Entertainment is our new folktale: the stories that moderate and reflect our realities and our morality.

Each robot could represent a different level or point of awareness/bias. Just as I initially did with AUTHENTIC, each of the NPCs could represent a different perspective on the same theme. So one robot could hate humans and want to destroy them (fulfilling the belief held by some people that this is what robots are like); one robot could love humans and want to save them; one robot can’t feel love, but can think for itself. Hmmm, I like the idea of robots with contradictions. One robot can hate humans, but obeys everything it is commanded to do (perhaps the reason for the hate?). One robot can love, but has no logical reason to do so. One robot can’t love, but thinks for itself. This relates to the wants-versus-needs approach to character design, but for some reason that isn’t grabbing me. Can ____ but ____ is what works for me right now.

But all of this is about designing for different audience expectations. So it is good that Dan’s study has revealed an audience group I hadn’t accounted for in my design. But what exactly did his study find?

From our recent work with children, we know that young people instinctively expect technology to respond to them in very human-like ways—to motivate and empower them, often serving as a sort of companion, rather than merely a tool for solving specific problems. While many adults think about technology as separate from humanness, kids tend to think of it as fundamentally human. It comforts us; it keeps us company; it helps us learn and grow; and, in some cases, it can fulfill certain emotional needs more reliably than other people.

Dan’s study used a “narrative-analysis” approach: kids were asked to “imagine their lives as if robots were a fixture in their learning environments—at school and beyond”, to write a story set in an environment like school or home, and to draw a picture to go with it. Some examples from the report:

[Children’s drawings from the report: “Larry” the robot, and a footy-playing robot]

What stood out for me was the idea that children were projecting their wishes onto the robot. So Dan and I had a good chat about this: while it is great that they see the robots in a wonderfully helpful (patient and loving) light, the robots are embodiments of the child’s desires. What if the robot didn’t do everything the child wanted? What if it had its own friends? What if it found some homework more boring than other homework? Would the kid still love it? It seems to me that while the robots are seen in a lovely light, they aren’t viewed as separate beings (which is perhaps partly due to the way children view the world anyway). What if they didn’t like maths? What if they preferred basketball to football? What if they didn’t like sport at all? “I have my limits.” “I can’t be everything for you.” So there is scope for transformation with children too…

Indeed, if we are to imagine a future with robots that are separate beings in their own right, then it isn’t about buying a robot that is automatically your friend and shares your interests. It is about meeting a robot that shares your interests…

 
