Robotics, neuroscience, developmental psychology, and machine learning have converged in a new project led by University of California San Diego researchers. Meet Diego-san, a robotic one-year-old that "learns" to move and interact the same way a real baby does -- by watching you.
Besides bringing us that much closer to the uncanny valley, Diego-san is also providing a deeper understanding of sensory-motor and social intelligence in children.
Diego-san stands about 4 feet 3 inches tall and weighs 66 pounds. Its body has a total of 44 pneumatic joints, and its eerily lifelike head contains around 27 moving parts. High-definition cameras in its eyes allow it to "see" the world around it -- and the research team is developing algorithms that allow it to "learn" from cues like gestures, facial expressions and other movements. Just like a real baby.
As reported by science website Phys.org:
The robot is a product of the "Developing Social Robots" project launched in 2008. As outlined in the proposal, the goal of the project was "to make progress on computational problems that elude the most sophisticated computers and Artificial Intelligence approaches, but that infants solve seamlessly during their first year of life."
For that reason, the robot's sensors and actuators were built to approximate the levels of complexity of human infants, including actuators to replicate dynamics similar to those of human muscles. The technology should allow Diego-san to learn and autonomously develop sensory-motor and communicative skills typical of one-year-old infants.
Creepy as this science might seem, researchers hope it will provide new insights into the study of infant development and help us better understand developmental disorders such as autism.
The video below is the first footage of Diego-san to be publicly released, showing its robotic head moving through facial expressions it has learned from the humans around it.