Scientists have created RoboEarth, which is essentially an internet designed for robots.
RoboEarth was created to let robots learn from one another and share what they know. The system, which runs in the cloud, is the result of a four-year project financially backed by the European Union. It was designed by students from several European universities, along with scientists at Philips.
“At its core RoboEarth is a world wide web for robots: a giant network and database repository where robots can share information and learn from each other,” said Rene van de Molengraft, who is the project leader for RoboEarth. “The problem right now is that robots are often developed specifically for one task. Everyday changes that happen all the time in our environment make all the programmed actions unusable.”
“A task like opening a box of pills can be shared on RoboEarth, so other robots can also do it without having to be programmed for that specific type of box,” van de Molengraft added.
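To make the idea concrete, here is a minimal toy sketch of the kind of shared "action recipe" store the article describes: one robot uploads the steps it learned for a task, and another retrieves them without being reprogrammed. All names here (`ActionStore`, `upload`, `fetch`) are hypothetical illustrations, not RoboEarth's actual API.

```python
# Toy, in-memory stand-in for a shared robot-knowledge repository.
# This only illustrates the share-and-reuse concept; RoboEarth's real
# system is a cloud database, not a Python dictionary.

class ActionStore:
    """Central repository where robots share task knowledge."""

    def __init__(self):
        self._recipes = {}

    def upload(self, task, steps):
        # One robot contributes the steps it learned for a task.
        self._recipes[task] = list(steps)

    def fetch(self, task):
        # Any other robot can reuse those steps without
        # being programmed for that specific task.
        return self._recipes.get(task)


store = ActionStore()

# Robot A shares how it opens a box of pills.
store.upload("open_pill_box", ["locate_box", "grip_lid", "twist", "lift"])

# Robot B, never programmed for this box, retrieves the recipe.
steps = store.fetch("open_pill_box")
print(steps)  # ['locate_box', 'grip_lid', 'twist', 'lift']
```

The key design point the article highlights is that the knowledge lives in one central place rather than in each robot's own firmware, which is also what makes the "single point of failure" observation below possible.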
Cloud technology is useful insofar as scientists and engineers won’t have to worry as much about a robot’s onboard computing power or storage space. And if most of the information a robot needs lives in one central place, we’ll at least know where to go to try to stop them if a robot uprising ever occurs.
Thinking along these lines is James Barrat, an author who has written extensively about artificial intelligence and its potential dangers. “In the short term, RoboEarth adds security by building in a single point of failure for all participating robots,” he said. “In the longer term, watch out when any of the nodes can evolve or otherwise improve their own software. The consequences of sharing that capability with the central ‘mind’ should be explored before it happens.”
[via BBC News]