Scientists have discovered that when a robot is made of “smart” materials like carbon and diamond, it can recognize its human counterpart from a distance.
It’s not quite a game-changing discovery, but it’s significant nonetheless.
In a paper published in the journal Nature Communications, the researchers from the University of California, Santa Cruz and the University at Buffalo describe how they used a combination of algorithms and data to determine how and when a robotic arm could be programmed to respond to a human’s movements.
They then created a robot equipped with sensors that could detect the presence of a human in the environment, for example a person moving around a room.
These sensors were paired with control code that governed the robot’s movement.
The robot was programmed to move around a space and then to return to the starting position when it was given a clear cue to return.
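The behaviour described above can be sketched as a simple control loop: wander the space, remember the starting position, and go back only on a clear “return” cue. This is an illustrative sketch, not the researchers’ code; all names and the grid-world movement model are assumptions.

```python
# Illustrative sketch of a wander-and-return behaviour (not the published
# system): the robot takes random steps around a space and returns to its
# remembered starting position only when given the explicit "return" cue.

import random

class RoamingRobot:
    def __init__(self, start=(0, 0)):
        self.start = start      # remembered starting position
        self.position = start

    def wander(self, steps, seed=0):
        """Take a number of random unit steps around the space."""
        rng = random.Random(seed)
        for _ in range(steps):
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            self.position = (self.position[0] + dx, self.position[1] + dy)

    def on_cue(self, cue):
        """Return to the start only when the cue is the clear 'return' signal."""
        if cue == "return":
            self.position = self.start

robot = RoamingRobot()
robot.wander(steps=10)
robot.on_cue("return")
print(robot.position)  # back at the starting position: (0, 0)
```

Keeping the cue check explicit (`cue == "return"`) mirrors the article’s point that the robot returns only when given a clear signal, rather than on any detection event.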
The researchers found that a robot programmed to recognize humans could “remember” their presence in the space by scanning for a human at an appropriate distance.
In this way, the robot could be “befuddled” by a human and still be able to adjust its behaviour accordingly.
When they tested the robot with human participants, its ability to distinguish between them improved substantially.
The robot could also “read” human facial expressions and, once it had detected a human, distinguish between a person holding a toy and one who was not, for example.
This means the robot can respond more quickly to human signals, and can even “remember,” or “recognize,” human faces, so that it continues moving forward if it has not detected a signal from the human.
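The “remembering” behaviour described above amounts to keeping a short-lived record of the last human detection and acting on its freshness. The sketch below is a hypothetical illustration of that idea; the class name, tick-based timing, and the forgetting threshold are all assumptions, not the published method.

```python
# Hypothetical sketch of presence "memory": the robot records when it last
# detected a human signal and reports whether that sighting is recent enough
# to still count. If nothing fresh is remembered, the robot keeps moving.

class PresenceMemory:
    def __init__(self, forget_after=3):
        self.forget_after = forget_after  # ticks before a sighting is forgotten
        self.last_seen = None             # tick of the most recent detection

    def observe(self, tick, human_detected):
        """Record the current tick whenever a human signal is detected."""
        if human_detected:
            self.last_seen = tick

    def remembers_human(self, tick):
        """True if a human was detected recently enough to still 'remember'."""
        return (self.last_seen is not None
                and tick - self.last_seen < self.forget_after)

memory = PresenceMemory()
memory.observe(tick=0, human_detected=True)
print(memory.remembers_human(tick=2))   # True: the sighting is still fresh
print(memory.remembers_human(tick=10))  # False: forgotten, so keep moving
```

A decaying memory like this is one simple way to reconcile the two behaviours in the article: reacting quickly when a human signal is present, and moving forward when none has been detected for a while.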
“The robot that can distinguish between humans is not a robot built for human interaction,” said lead author Professor Christopher McBride, who is based at the University.
“It’s a robot designed for robotics.
That’s a really exciting finding, and this is just one example of how the field is moving forward.”
In the future, McBride said, robotic arms could be designed to recognize human facial patterns and respond to their movements in ways that would allow for “better navigation, for more realistic interaction.”
The research was funded by the US National Science Foundation (NSF), the National Institutes of Health (NIH), and the California Institute of Technology (Caltech).