Cambridge University has built a robot that can mimic human facial expressions. Its name is Charles and it looks a lot like a prosthetic Sean Penn.
The automaton is part of the Department of Computer Science and Technology's research into whether humans engage more with machines that can display human body language and facial expressions.
By using a complicated system of mechanical servos controlled by a computer program, Charles can mimic a number of facial expressions that more or less make him look like a stunned mullet.
Think of it like a physical animoji – someone pulls a face at a camera, the computer analyses it, and then tells Charles which servos to activate in order to express the same look. As the researchers point out, mimicking real human muscles is bloody hard.
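That camera-to-servo loop can be sketched in a few lines. Everything here is invented for illustration – the expression labels, servo names, and angles are hypothetical stand-ins, not Charles's actual control program.

```python
# Hypothetical sketch of the animoji-style pipeline described above:
# classify the expression in a camera frame, then map it to servo
# positions. Labels, servo channels, and angles are all made up.

def expression_to_servo_angles(expression):
    """Map a detected expression label to per-servo angles (degrees)."""
    presets = {
        "neutral":  {"brow_left": 90, "brow_right": 90, "jaw": 90},
        "surprise": {"brow_left": 60, "brow_right": 60, "jaw": 130},
        "smile":    {"brow_left": 85, "brow_right": 85, "jaw": 100},
    }
    # Unknown expressions fall back to neutral rather than twitching.
    return presets.get(expression, presets["neutral"])

def mimic(frame_expression):
    """One cycle: take the frame's expression, return servo commands."""
    angles = expression_to_servo_angles(frame_expression)
    # A real controller would write these angles out to hardware here.
    return angles

print(mimic("surprise"))
```

The hard part, as Robinson notes below, isn't this mapping – it's that a handful of discrete servo presets can't reproduce the continuous interplay of dozens of facial muscles.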
“Charles is remarkably realistic, the prosthetics are very good, but the motors are just not like human muscles,” said Professor Peter Robinson.
“Our control programmes are just not quite fine enough and the monitoring of the human face we’re using at the moment is just not quite good enough and so it looks unnatural.”
The whole point of Charles is to see how people interact with a machine capable of displaying relatable emotional traits. The problem is, he sits pretty firmly in what researchers call the Uncanny Valley – the dip in people's affinity for something that appears nearly human, but not quite.
Robotic shortfalls aside, Robinson’s research actually delves into some pretty deep human behavioural questions.
“The more interesting question that this work has promoted is the social and theological understanding of robots that people have. Why do we, when we talk of robots, always think about things that look like humans, rather than abstract machines, and why are they usually malicious?”
“That tells us something about people more than it tells us about the technology of the machine.”
It’s a fascinating time to be alive, folks.