BBC.co.uk - A US team aims to build a robot that can work out how to use nearby objects to solve problems or escape threats.
The machine has been dubbed a MacGyver Bot, after the TV character who cobbled together devices to escape life-threatening situations.
The challenge is to develop software that "understands" what objects are in order to deduce how they can be used.
The US Navy is funding the project and says the machines might ultimately be deployed alongside humans.
It is providing $900,000 (£562,000) to robotics researchers at the Georgia Institute of Technology to carry out the work.
"Our goal is to develop a robot that behaves like MacGyver, the television character from the 1980s who solved complex problems and escaped dangerous situations by using everyday objects and materials he found at hand," said project leader Prof Mike Stilman.
"Researchers in the robot motion planning field have traditionally used computerised vision systems to locate objects in a cluttered environment to plan collision-free paths, but these systems have not provided any information about the objects' functions."
Rescue bot
Prof Stilman said he planned to create software that first identified an object, then determined potential things that could be done with it, before turning it into "a simple machine" that could be used to complete an action.
Examples given include stacking boxes to climb over something, building a bridge out of debris or climbing on a chair to grab an object out of reach.
By the project's end, the software should be able to combine such tasks when necessary.
To test whether this is the case, the researchers hope to load the code onto Golem Krang - a robot already developed by Prof Stilman's laboratory - to see if it works in action.
The researchers ultimately envisage a situation in which the machine might be deployed to rescue trapped officers without needing to risk anyone else's life.
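The pipeline described above - identify an object, work out what could be done with it, then chain such uses together - can be sketched as a simple search over "affordances". The sketch below is purely illustrative: the object names, affordance table, and planner are assumptions for this example, not the actual Georgia Tech software.

```python
# Illustrative sketch only: a toy affordance-based planner mimicking the
# described pipeline. All names and tables here are hypothetical, not the
# real MacGyver Bot system.

from collections import deque

# Hypothetical affordance table: object -> actions it enables.
AFFORDANCES = {
    "box": ["stack_to_climb"],
    "plank": ["bridge_gap"],
    "chair": ["stand_on"],
}

# Hypothetical effects: action -> capability gained by performing it.
EFFECTS = {
    "stack_to_climb": "cross_wall",
    "bridge_gap": "cross_gap",
    "stand_on": "reach_high",
}

def plan(objects, goal):
    """Breadth-first search over action sequences built from nearby objects."""
    queue = deque([(frozenset(), [])])  # (capabilities gained, actions taken)
    seen = set()
    while queue:
        caps, actions = queue.popleft()
        if goal in caps:
            return actions
        if caps in seen:
            continue
        seen.add(caps)
        for obj in objects:
            for action in AFFORDANCES.get(obj, []):
                queue.append((caps | {EFFECTS[action]},
                              actions + [(obj, action)]))
    return None  # goal unreachable with the objects at hand

print(plan(["box", "plank"], "cross_gap"))  # -> [('plank', 'bridge_gap')]
```

A real system would of course have to ground the affordance table in perception and physical interaction, which, as the researchers quoted below note, is the hard part.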
Perception problem
One UK-based artificial intelligence (AI) researcher said the challenge was harder than it sounded.
"For example, vision alone is not enough to tell you if an object can support your weight or be used as a lever - you need to interact with it physically to understand its physical possibilities," Prof Barbara Webb, from the University of Edinburgh's School of Informatics, told the BBC.
"This is probably a harder problem for current robotics than making a plan to solve the task."
Another AI expert suggested the project might like to draw on existing research into how animals use tools.
"A monkey will use a stick to reach a longer stick to reach an out-of-reach banana," said Noel Sharkey, professor of artificial intelligence and robotics at the University of Sheffield.
"A crow will bend a piece of wire to get at trapped food.
"I have seen this kind of work with robots, but it is a very difficult project and like all research may not work as planned, but it is well worth the effort and will advance the field."