DARPA-Funded Robot Designed for Disaster Relief Tasks
By Cheryl Pellerin
American Forces Press Service
WASHINGTON, July 12, 2013 – One of the most advanced humanoid robots ever built was introduced to the public yesterday in Waltham, Mass.
The 6-foot-2-inch, 330-pound Atlas robot, built by Boston Dynamics, was funded by the Defense Advanced Research Projects Agency and is designed to help humankind deal with future disasters.
That’s the goal of the ongoing DARPA Robotics Challenge, or DRC. The DRC seeks to enable groundbreaking research and development in hardware and software that will help robots perform the most hazardous jobs in human-supervised humanitarian assistance and disaster-relief operations -- reducing casualties, avoiding further destruction and saving lives.
The challenge was launched in October 2012 and will end after the final robot trial in December 2014, when teams will compete for a $2 million grant from DARPA.
The first challenge event was virtual -- designed for those who didn’t have their own robots or hardware experience -- and produced seven winners who designed their own software to run virtual robots through a series of tasks in a DARPA real-time open-source simulator.
The winning teams each received an Atlas robot, which will be programmed with their software. Then the teams will compete with each other and with other robots in the next event. They also will receive DARPA funding and ongoing technical support from Atlas developer Boston Dynamics.
In December, the second event and first live competition -- open to the public -- will be held at the Homestead-Miami Speedway in Homestead, Fla.
“The Virtual Robotics Challenge was a proving ground for teams’ ability to create software to control a robot in a hypothetical scenario,” DARPA Program Manager Dr. Gill Pratt said in a statement.
“The DRC simulator tasks were fairly accurate representations of real-world causes and effects, but the experience wasn’t quite the same as handling an actual, physical robot,” said Pratt, adding, “Now these seven teams will see if their simulation-honed algorithms can run a real machine in real environments, and we expect all the teams will be further refining their algorithms using both simulation and experimentation.”
That software and the actions of a human operator through a control unit will guide each robot’s suite of sensors, actuators, joints and limbs.
The Atlas robot can make a range of natural movements and has an on-board, real-time control computer. The Atlas also boasts a hydraulic pump and thermal management, two arms, two legs, a torso and a head, 28 hydraulically actuated joints, a Carnegie Robotics sensor head with LIDAR and stereo sensors, and two sets of hands -- one provided by iRobot and one by the Department of Energy’s Sandia National Laboratory.
The term LIDAR, a combination of the words light and radar, refers to a sensing technology that uses laser beams to measure distances.
During a recent media roundtable, Pratt said DARPA wants to use the Robotics Challenge to prove that robots can operate in environments engineered for people -- opening doors, climbing stairs and moving around, even in environments degraded by some sort of disaster.
DARPA also wants to demonstrate that robots can be made to use tools designed for people, from screwdrivers to fire trucks, and that robots can be supervised by people who aren’t trained to operate robots.
Another area DARPA is advancing is the level at which people communicate with robots.
In the past, Pratt said, robots, particularly robots for explosive ordnance disposal, have been operated with human supervision at the motion level.
“A person will tell a robot to go a centimeter forward or a centimeter to the left or tell the arm to move forward to grasp and do things like that,” he explained.
But the DARPA Robotics Challenge is set up so communications are degraded, as they might be in a disaster, to the extent that such “teleoperation” won’t be a practical way to communicate with the machines, Pratt said.
“Instead,” he added, “what’s going to be necessary is for the teams to give task-level commands to the robot. Things like, open the door, go up the stairs, turn the handle. What that will require is for the robot itself to use [its own] perceptual processing … to understand what it is looking at and then to use behavior controls to execute the task while watching what the effect of the task is.”
The kinds of robots used today in disaster scenarios are derived from robots developed for explosive ordnance disposal tasks, Pratt said.
“They tend to be pretty small machines,” he added. “They have treads in most cases and they’re mainly used for inspection, so they help give situational awareness to first responders … but they don’t do anything to really affect the disaster.”
The hope is to develop machines that can intervene and help make a disaster less severe, Pratt said, adding that a good example occurred during the Fukushima Daiichi nuclear disaster following the Tōhoku earthquake and tsunami in March 2011.
“During the first 24 hours if it had been possible to vent the reactors, then the explosions would not have occurred and the disasters would have been much less severe,” he explained.
“Human beings, in fact, tried to do it but had to turn around and go back because their radiation dosimeters read too high,” Pratt said. “That was a perfect place where, if we could have sent a machine in quickly during the first day, the disaster would have been much less destructive.”