BabyRobot: it sees, it hears and it feels

08 May 2006 | News
Italian researchers have designed a small humanoid robot to test a model of the human sense of "presence". They are now looking for investors to develop the technology further.

Getting things in perspective - Genoa University's BabyBot

Researchers at Genoa University in Italy have designed a small humanoid robot, BabyBot, to test a model of the human sense of “presence”, a combination of senses such as sight, hearing and touch. The work could result in applications in robotics, artificial intelligence and machine perception.

The researchers are now looking for investors to develop the technology further and transfer it to industry.

“With BabyBot we are investigating human development as a guideline [the instructions] for building the robots' intelligence,” said Giorgio Metta, assistant professor at the Laboratory for Integrated Advanced Robotics at Italy's Genoa University.

“We are doing this together with neuroscientists and developmental psychologists. The benefits are obvious. Once we have ‘intelligence’ as a plug-in to our machines we could then see a broad range of applications in every field of automation, computer systems, smart environments, etc. Further, understanding intelligence would open up new avenues to therapy, recovery from injuries, etc.”

The idea behind BabyBot comes from what happens when a baby tries to reach for an object it sees. If the child fails, the object is too far away. This teaches the child perspective. If the child does reach the object, he or she will then try to grasp it, taste it or shake it.

These actions all teach the child about the object and govern its perception of it, in a cumulative process rather than a single act. The conclusion is that humans’ expectations have an enormous influence on their perception.
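The cumulative reach-and-learn idea can be illustrated with a toy sketch (hypothetical names and numbers, not the project's actual code): each reach attempt either succeeds or fails, and the accumulated outcomes refine the robot's sense of what is within reach.

```python
# Illustrative sketch only: a toy robot learns "reachable distance" from
# the outcome of repeated reach attempts, a cumulative process rather
# than a single act. ARM_LENGTH is a made-up figure for the example.

ARM_LENGTH = 0.30  # metres; hypothetical reach limit of the toy robot


def attempt_reach(distance, arm_length=ARM_LENGTH):
    """A reach succeeds only if the object is within arm's length."""
    return distance <= arm_length


def learn_reach_limit(trial_distances):
    """Estimate the reach limit as the farthest successful reach so far."""
    successes = [d for d in trial_distances if attempt_reach(d)]
    return max(successes) if successes else 0.0


trials = [0.10, 0.25, 0.28, 0.35, 0.50]  # object distances tried, in metres
print(f"estimated reach limit: {learn_reach_limit(trials):.2f} m")  # 0.28 m
```

The point of the sketch is the feedback loop: failed reaches are as informative as successful ones, since together they bound where "too far away" begins.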

With a minimal set of instructions, BabyBot acts on its environment. For the senses, the researchers used sound, vision and touch, and focused on simple objects in the environment. Two experiments were conducted with BabyBot: it was made to touch an object and to grasp it. Once the visual scene is segmented, the robot can start learning about specific properties of objects that are useful, for instance, for grasping them.
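The "segment the scene, then learn object properties" step can be sketched as follows. This is a minimal illustration under assumed inputs (a toy grayscale image and a simple brightness threshold), not the project's actual vision pipeline; the property names are hypothetical.

```python
# Hedged illustration: threshold segmentation of a toy grayscale image,
# followed by computing the object's centroid and size, the kind of
# properties a grasp behaviour could then make use of.

def segment(image, threshold=0.5):
    """Return (row, col) coordinates of pixels brighter than the threshold."""
    return [(r, c) for r, row in enumerate(image)
                   for c, v in enumerate(row) if v > threshold]


def object_properties(pixels):
    """Centroid and pixel count: simple cues for where and how to grasp."""
    n = len(pixels)
    centroid = (sum(r for r, _ in pixels) / n,
                sum(c for _, c in pixels) / n)
    return {"centroid": centroid, "size": n}


image = [
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 0.9, 0.8, 0.0],
    [0.0, 0.7, 0.9, 0.0],
    [0.0, 0.0, 0.0, 0.0],
]
print(object_properties(segment(image)))  # centroid (1.5, 1.5), size 4
```

Real segmentation is far harder than a fixed threshold, which is precisely why the researchers bootstrap it through action: touching and grasping tell the robot which pixels belong together.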

BabyBot has 18 degrees of freedom distributed across the head, arm, torso and hand. The arm is a small off-the-shelf manipulator mounted on a rotating torso.

The robot’s sensory system is composed of a pair of cameras with space-variant resolution, two microphones each mounted inside an external ear, a set of three gyroscopes mimicking the human vestibular system, positional encoders at each joint, a torque/force sensor at the wrist and tactile sensors at the fingertips and the palm.

The BabyBot project was started by Giulio Sandini and Giorgio Metta at Genoa University. The research is funded under the European Commission’s FET (Future and Emerging Technologies) initiative of the IST programme, as part of the ADAPT project.

Never miss an update from Science|Business:   Newsletter sign-up