On June 5th, the Pentagon will hold the final round of its Robotics Challenge, where 25 teams from around the world will compete for a $2 million prize to see whose robot performs best in a simulated disaster zone. But there's more at stake than just money. The Pentagon hopes one day such robots might save lives.

The competition was spurred by the 2011 Fukushima nuclear disaster, which occurred after a magnitude 9.0 earthquake struck northeastern Japan, setting off a 14-meter-high tsunami. Workers inside the plant were hampered by radiation leaks and forced to evacuate before the plant could be secured. So the Pentagon has stepped in to develop robotic solutions to such crises.

The Defense Advanced Research Projects Agency (DARPA) has invested millions in the quest to develop robots that can work in disaster zones deemed too dangerous for humans. Hence, the Robotics Challenge. In the first stage of competition, the robots were tasked with navigating a computer-animated world. The second stage involved actual obstacles and tasks, but the finals next week are where the robots will really have to strut their stuff.

There will be eight tasks in all, and each robot will have 60 minutes to complete the course, which will consist of turning valves, cutting holes, climbing stairs and navigating uneven terrain. The goal of the competition is to test not only the robots' physical agility but also their awareness and cognition. The Pentagon envisions using robots to help mitigate disaster zones, but acknowledges that as the technology advances, the bots could have numerous applications, even serving as soldiers.

That prospect has ethicists up in arms. Recently, the U.N. sponsored discussions on the development of "Lethal Autonomous Weapons Systems" and the legal and ethical dilemmas they pose. The U.N. has previously called for banning the development of such robots, saying that "in addition to being physically removed from the kinetic action, humans would also become more detached from decisions to kill - and their execution."

There is even a Campaign to Stop Killer Robots. Mary Wareham, who coordinates this consortium of human rights groups, said, "We want to talk to the governments about how [the robots] function and understand the human control of the targeting and attack decisions. We want assurances that a human is in the loop."

The Pentagon, for its part, acknowledges such concerns while remaining focused on the potential of these advanced technologies.

"As with any technology, we cannot control what it is going to be used for," says Gill Pratt, DARPA's program manager. "We do believe it is important to have those discussions as to what they're going to be used for, and it's really up to society to decide that. But to not develop the technology is to deny yourself the capability to respond effectively, in this case to disasters, and we think it's very important to do that."

"We don't know what the next disaster will be but we know we have to develop the technology to help us to address these kinds of disaster."