April 17, 2009

Merciless Robots to Fight Future Wars By 2015

Tax Day Tea Party, Robot at White House

A robot is seen on the sidewalk in front of the north gate of the White House in Washington, Wednesday, April 15, 2009, after tax protesters threw what appeared to be a box of tea bags over the White House fence on Pennsylvania Avenue. (AP Photo/Charles Dharapak)

Military 50 Percent Robotic By 2015

AFP
February 6, 2009

Robots will form the armies of the future in a case of science fact catching up with fiction, a researcher told an elite Technology Entertainment and Design (TED) gathering on Wednesday.

Peter Singer, who has authored books on the military, warned that while using robots for battle saves the lives of military personnel, the move has the potential to exacerbate warfare by having heartless machines do the dirty work.

“We are at a point of revolution in war, like the invention of the atomic bomb,” Singer said. “What does it mean to go to war with U.S. soldiers whose hardware is made in China and whose software is made in India?”

Singer predicts that U.S. military units will be half machine, half human by 2015.

The U.S. Army already recruits soldiers using a custom war videogame, and some real-world weapon controls copy designs of controllers for popular videogame consoles.

Attack drones and bomb-handling robots are already common in battle zones.

Robots not only lack compassion and mercy; they also insulate living soldiers from horrors that humans might otherwise be moved to avoid.



“The United States is ahead in military robots, but in technology there is no such thing as a permanent advantage,” Singer said. “You have Russia, China, Pakistan and Iran working on military robots.”

There is a “disturbing” cross between robotics and terrorism, according to Singer, who told of a website that lets visitors detonate improvised explosive devices from home computers.

“You don’t have to convince robots they are going to get 72 virgins when they die to get them to blow themselves up,” Singer said.

Robots also record everything they see with built-in cameras, generating digital video that routinely gets posted on YouTube in graphic clips that soldiers refer to as “war porn,” according to Singer.

“It turns war into entertainment, sometimes set to music,” Singer said. “The ability to watch more but experience less.”

Robotics designer David Hanson offered hope when it comes to making robots a little more human. Hanson builds robots with synthetic flesh faces that read people’s facial expressions and mimic them. “The goal here is not just to achieve sentience, but empathy,” Hanson said. “As machines are more capable of killing, implanting empathy could be the seeds of hope for our future.”

Hanson demonstrated a lifelike robotic bust of the late Albert Einstein that makes eye contact and mimics people’s expressions.

“I smiled at that thing and jumped out of my skin when it smiled back,” TED curator Chris Anderson quipped. “It’s freaky.”
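The mirroring behavior Hanson describes can be pictured, at its simplest, as a perceive-classify-actuate loop: estimate the person’s expression from camera input, then drive the face’s actuators toward a matching pose. The short Python sketch below is a hypothetical illustration of that loop only; the function names, actuator names and canned poses are invented for the example and are not Hanson Robotics’ actual software.

import random
import time

# Target actuator positions (0.0-1.0) for a few canned expressions (illustrative only).
FACE_POSES = {
    "neutral": {"brow": 0.5, "eyelids": 0.5, "mouth_corners": 0.5},
    "smile":   {"brow": 0.6, "eyelids": 0.4, "mouth_corners": 0.9},
    "frown":   {"brow": 0.3, "eyelids": 0.6, "mouth_corners": 0.1},
}

def detect_expression() -> str:
    """Stand-in for a camera-based expression classifier."""
    return random.choice(list(FACE_POSES))

def set_actuator(name: str, position: float) -> None:
    """Stand-in for a motor command; here it only prints the target position."""
    print(f"  {name} -> {position:.2f}")

def mirror_once() -> None:
    """One pass of the loop: classify the human's expression, then copy it."""
    expression = detect_expression()
    print(f"mirroring: {expression}")
    for actuator, position in FACE_POSES[expression].items():
        set_actuator(actuator, position)

if __name__ == "__main__":
    for _ in range(3):
        mirror_once()
        time.sleep(0.5)

A real system would replace the random classifier with computer vision and smooth the motion between poses, but the basic mirror loop is no more complicated than this.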

Scientists Debate a Robot War in New Book ‘Wired for War’

Canadian Press
February 11, 2009

In the 1921 play that invented the word “robot” - Czech writer Karel Capek’s “Rossum’s Universal Robots” - mechanical, highly intelligent slaves mount a revolt and kill all humans but one.

Ever since, science fiction has explored the idea of robots outsmarting, dominating and destroying the human race. Author P. W. Singer, at 33 a Senior Fellow at the highly serious Brookings Institution, can’t resist the fascination of the topic, but he isn’t writing fiction. He treats the possibility with appropriate seriousness in “Wired for War,” a meticulous account of the latest military robots.

Two of his earlier books explored two of the hottest issues in 21st-century military affairs. “Corporate Warriors: The Rise of the Privatized Military Industry” examined the reinvigorated ancient profession of mercenaries. “Children at War” dealt with something relatively new: the recruitment and enslavement of boys and girls in their teens and even younger.

Singer says some 40 countries are making military robots. The motive: reduced casualties. “When a robot dies, you don’t have to write a letter to its mother,” Singer quotes one unit commander as saying.

Military robots are already being built with greater endurance, firepower and precision than human soldiers - and, for the moment, greater submissiveness. The trend is to make them more autonomous, able to take decisions according to built-in commands, unmoved by fear, pity, revenge or any other human emotion.

Whether they can or should be endowed with a system of ethics is controversial. How, for example, is a machine to tell a ragged soldier from a ragged civilian?

Scientists foresee the day when robots will develop what is called “strong AI” - high-level artificial intelligence - and use it to reproduce themselves without human intervention. Singer quotes Vernor Vinge, mathematician, computer scientist and science fiction writer, as predicting more than 15 years ago:

“Within the next 30 years, we will have the technological means to create superhuman intelligence. Shortly thereafter, the human era will be ended.”

Rodney Brooks, chief technical officer at iRobot, is more optimistic. The firm takes its name from Isaac Asimov’s “I, Robot,” whose fictional “laws” forbid robots from ever harming humans. The firm also makes the first mass-produced robotic vacuum cleaner. Brooks says there’ll never be a robot takeover because by then, people will be part computer, part human.

Singer’s exhaustively researched book, enlivened by examples from popular culture, ends with a hint that he’s worried, too.

“We are creating something exciting and new, a technology that might just transform humans’ role in their world, perhaps even create a new species,” he concludes. “But this revolution is mainly driven by our inability to move beyond the conflicts that have shaped human history from the very start. Sadly, our machines may not be the only thing wired for war.”


Pentagon Exploring Robot Killers That Can Fire on Their Own

McClatchy Newspapers
March 27, 2009

The unmanned bombers that frequently cause unintended civilian casualties in Pakistan are a step toward an even more lethal generation of robotic hunter-killers that would operate with limited, if any, human control.

The Defense Department is financing studies of autonomous, or self-governing, armed robots that could find and destroy targets on their own. On-board computer programs, not flesh-and-blood people, would decide whether to fire their weapons.

“The trend is clear: warfare will continue and autonomous robots will ultimately be deployed in its conduct,” Ronald Arkin, a robotics expert at the Georgia Institute of Technology in Atlanta, wrote in a study commissioned by the Army.

“The pressure of an increasing battlefield tempo is forcing autonomy further and further toward the point of robots making that final, lethal decision,” he predicted. “The time available to make the decision to shoot or not to shoot is becoming too short for remote humans to make intelligent informed decisions.”

Autonomous armed robotic systems probably will be operating by 2020, according to John Pike, an expert on defense and intelligence matters and the director of the security Web site GlobalSecurity.org in Washington.

This prospect alarms experts, who fear that machines will be unable to distinguish between legitimate targets and civilians in a war zone.

“We are sleepwalking into a brave new world where robots decide who, where and when to kill,” said Noel Sharkey, an expert on robotics and artificial intelligence at the University of Sheffield, England.

Human operators thousands of miles away in Nevada, using satellite communications, control the current generation of missile-firing robotic aircraft, known as Predators and Reapers. Armed ground robots, such as the Army’s Modular Advanced Armed Robotic System, also require a human decision-maker before they shoot.

As of now, about 5,000 lethal and nonlethal robots are deployed in Iraq and Afghanistan. Besides targeting Taliban and al Qaida leaders, they perform surveillance, disarm roadside bombs, ferry supplies and carry out other military tasks. So far, none of these machines is autonomous; all are under human control.



The Pentagon’s plans for its Future Combat Systems program envision increasing levels of independence for its robots.

“Fully autonomous engagement without human intervention should also be considered, under user-defined conditions,” said a 2007 Army request for proposals to design future robots.

For example, the Pentagon says that air-to-air combat may happen too fast to allow a remote controller to fire an unmanned aircraft’s weapons.

“There is really no way that a system that is remotely controlled can effectively operate in an offensive or defensive air-combat environment,” Dyke Weatherington, the deputy director of the Pentagon’s unmanned aerial systems task force, told a news conference on Dec. 18, 2007. “The requirement for that is a fully autonomous system,” he said. “That will take many years to get to.”

Many Navy warships carry the autonomous, rapid-fire Phalanx system, which is designed to shoot down enemy missiles or aircraft that have penetrated outer defenses, without waiting for a human decision-maker.

At Georgia Tech, Arkin is finishing a three-year Army contract to find ways to ensure that robots are used appropriately. His idea is an “ethical governor”: a computer system that would require robots to obey the internationally recognized laws of war and the U.S. military’s rules of engagement.

“Robots must be constrained to adhere to the same laws as humans or they should not be permitted on the battlefield,” Arkin wrote.

For example, a robot’s computer “brain” would block it from aiming a missile at a hospital, church, cemetery or cultural landmark, even if enemy forces were clustered nearby. The presence of women or children would likewise block an attack.
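As a purely illustrative aid, the veto logic just described can be pictured as a filter that sits between target selection and weapon release and can only say no. The short Python sketch below is a toy under that assumption; the class, field and rule names are invented for the example and are not Arkin’s actual governor, which would have to reason over far richer and noisier battlefield data.

from dataclasses import dataclass

# Sites the laws of war put off limits as aim points (illustrative subset).
PROTECTED_SITES = {"hospital", "church", "cemetery", "cultural landmark"}

@dataclass
class Target:
    kind: str                               # e.g. "armored vehicle", "hospital"
    civilians_present: bool = False         # any non-combatants near the target
    women_or_children_present: bool = False
    cleared_by_rules_of_engagement: bool = False

def governor_permits_engagement(target: Target) -> bool:
    """Return True only if no constraint vetoes firing on the target."""
    if target.kind in PROTECTED_SITES:
        return False                        # protected site: never a valid aim point
    if target.civilians_present or target.women_or_children_present:
        return False                        # non-combatants present: hold fire
    if not target.cleared_by_rules_of_engagement:
        return False                        # must also satisfy current rules of engagement
    return True                             # every constraint passed

if __name__ == "__main__":
    print(governor_permits_engagement(Target(kind="hospital")))                        # False
    print(governor_permits_engagement(
        Target(kind="armored vehicle", women_or_children_present=True)))               # False
    print(governor_permits_engagement(
        Target(kind="armored vehicle", cleared_by_rules_of_engagement=True)))          # True

The design point is that such a layer can only narrow what the weapon is allowed to do, never widen it. Sharkey’s objection, quoted below, is that the hard part is not the veto logic but reliably filling in fields like these from sensor data in the first place.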

Arkin contends that a properly designed robot could behave with greater restraint than human soldiers in the heat of battle and cause fewer casualties. “Robots can be built that do not exhibit fear, anger, frustration or revenge, and that ultimately behave in a more humane manner than even human beings in these harsh circumstances,” he wrote.

Sharkey, the British critic of autonomous armed robots, said that Arkin’s ethical governor was “a good idea in principle. Unfortunately, it’s doomed to failure at present because no robots or AI (artificial intelligence) systems could discriminate between a combatant and an innocent. That sensing ability just does not exist.”

Selmer Bringsjord, an artificial intelligence expert at Rensselaer Polytechnic Institute in Troy, N.Y., is worried, too.

“I’m concerned. The stakes are very high,” Bringsjord said. “If we give robots the power to do nasty things, we have to use logic to teach them not to do unethical things. If we can’t figure this out, we shouldn’t build any of these robots.”

Bailiffs Get Power to Use Force on Debtors
20,000 More U.S. Troops To Be Deployed For "Domestic Security" (December 2008)
Pentagon Deploys U.S. Troops for Terrorist Attack or Other Catastrophe (December 2008)
Combat Troops in Iraq Deployed to U.S. to "Help with Civil Unrest" (October 2008)
Gates Memo Announces Final Assimilation of National Guard and Reserve
America Under Martial Law (Video Part 1 of 3 - October 2008)
Brain-Machine Interface Technology Enables Controlling of a Robot by Human Thought Alone
Posse Comitatus on the Ropes: Northcom Responds to North Dakota Flooding
U.S. Recruiting Misfits for Army (Felons, Racists, Gang Members Fill in the Ranks)
U.S. Military Will Offer Path to Citizenship
Militarizing Police Depts. With Your Bailout Money
Pentagon Wants Packs of Robots to Detect "Non-cooperative Humans"
Is FEMA & DHS Preparing for Mass Graves and Martial Law Near Chicago?
Pentagon's Pain Beam to Get Tougher, Smaller, More Powerful
General Wants to Scan More U.S. Irises, Fingerprints
Internment Camps Readied for Mass Illegal Alien Influx? (February 2009)
New Orleans: Bombs, Choppers During Military Exercises Startle Residents (February 2009)
New Legislation Authorizes FEMA Camps in U.S. (January 2009)
En Route to Military Rule and the Demise of Civilian Government
U.S. Military Preparing for Domestic Disturbances
Life-Like Walking Female Robot Unveiled
Top Professor: Autonomous Killer Robots in the Field
Military Recruits Thousands More Warbots for New Unmanned Surge
Unmanned "Surge": 3000 More Robots for War

Robots are Narrowing the Gap with Humans
Killer Robots and a Revolution in Warfare
Robots That Kill for America
March of the Killer Robots
U.S. Air Force Unmanned Aircraft Systems Flight Plan 2009-2047
Upcoming Military Robot Could Feed on Dead Bodies
Are we on the brink of creating a computer with a human brain?
Secret U.S. spontaneous human combustion beam tested
Test brings SciFi depictions of laser weapons vaporizing targets into reality
Police give man CPR after shooting him with Taser
Rebellion-B-Gone: Chemical Neurowarfare
Police Buy Military-Style Sonic Devices
The Criminal Behavior of G-20 Police in Pittsburgh
Robocops Employ Scary Crowd-Stopping Technology at Pittsburgh Protests
Cops May Get Portable Pain Weapon
Test of Tactical Laser from a C-130 gunship burns hole in hood of vehicle
TSA Body Scanning Technology Strips Away Privacy
Israeli Robots Remake Battlefield
Welsh robot Adam takes A.I. to the next level
Robot Border Guards to Patrol Future Frontiers
First Annual National Robotics Week Is in Full Swing in US
MEART: the Machine that's Part Robot Part Rat
DNA Robots on the Move
iRobot’s Military Swarm of Wifi Bots Flips Into Action
Stanford’s Stickybot is Climbing Faster, Human Project Coming Soon? (video)

Updated 10/3/10 (Newest Additions at End of List)

