aNewDomain.net — Will future drones have a sense of morality? Will life-or-death decisions someday be left to machines, machines equipped not just with loads of data but also with a sense of right and wrong?
Check out the triangle drones and dancers in the video below. They express some of our most intriguing hopes and deepest fears for a drone-enabled future. Here’s our video of the week, created and performed by elevenplay.
The stunning video illustrates, through dance, audio and visual display, the double-edged future we can expect from drone technology. When Hellfire missiles are attached, we fear sitting in a drone’s crosshairs. Yet when drones deliver Amazon packages, smiling arrow and all, we are happy to be the center of their targeting system.
Drones are toys, weapons or advanced pizza delivery systems. In which crosshairs do you want to be? Technology always has multi-faceted uses; that is part of its appeal. It falls to us, as humans, to decide what to do with our creations.
How will we regulate these potentially disruptive innovations? Have we lost control? Technology is turning out to be society’s No. 1 “frenemy.”
Public Opinion Debate
According to Ron Arkin, a professor of computer technology at Georgia Tech and director of the Georgia Tech Mobile Robot Lab, “It is fairly easy to make robots that behave more ethically and humanely than humans on the battlefield.”
Roger Berkowitz, of the Hannah Arendt Center, worries about an increasing and alarming reliance on robotic decision makers in the fields of medicine and military strategy. How, he asks, does our desire to eliminate human fallibility disturb the fundamental landscape of personal and political life?
Is the conversation in the tech community about the uses of these products robust enough? I do not think so. Here are some voices that do have concerns. First, a view from an artist writing at Hyperallergic:
“Drones are coming into our world very quickly, and artists have already been reacting to, utilizing, and protesting them. This is not a gimmick. While a publicity stunt for now, Amazon has unveiled plans to use drones to expedite delivery to your door. Facebook even has ‘humanitarian’ plans to use drones to bring Internet access to the rest of the world. For better or worse, drones are coming; figuring out how humans will live in relationship to them will be our challenge.”
Hollywood and Drones
The latest X-Men blockbuster brings a new awareness that might wake up the masses. The Daily Beast asks whether Hollywood has turned against Obama with its drone portrayal:
“This might sound like WikiLeaks’ latest video — a top secret recording of a drone strike in Pakistan that wound up ‘collaterally damaging’ dozens of innocent bystanders.
But it’s actually a description of the beginning of the new X-Men blockbuster Days of Future Past — a film that (underneath its comic-book action and time-travel shenanigans) questions the use of military robots and highlights the damage they can inflict on civilians.”
The late media critic Neil Postman captured this ambivalence: “Because of its lengthy, intimate and inevitable relationship with culture, technology does not invite a close examination of its own consequences. It is the kind of friend that asks for trust and obedience, which most people are inclined to give because its gifts are truly bountiful. But of course, there is a dark side to this friend … it creates a culture without a moral foundation. It undermines certain mental processes and social relations that make human life worth living. Technology, in sum, is both friend and enemy.”
The Bottom Line
Drones can be incredibly useful and hold great potential, but they can’t continue to go unregulated. A handful of states have restricted drone use over privacy concerns; lawmakers in Washington should follow suit, since there is as yet no federal law.
Regulation is all the more urgent because the government is also embarking on an ambitious project to create “moral robots.” Backed by a $7.5 million grant from the Office of Naval Research (ONR), researchers are planning an in-depth survey to analyze what people think about when they make a moral choice. They will then attempt to simulate that reasoning in a robot.
At the end of the five-year project, the scientists must present a demonstration of a robot making a moral decision. One example would be a robot medic ordered to deliver emergency supplies to a hospital in order to save lives. On the way, it meets a soldier who has been badly injured. Should the robot abort the mission and help the soldier? For more, see www.theverge.com.
Where do you stand on issues of double-edged technologies and innovations?
For aNewDomain.net, I’m David Michaelis.
Based in Australia, David Michaelis is a world-renowned international journalist and founder of Link TV. At aNewDomain.net, he covers the global beat, focusing on politics and other international topics of note for our readers in a variety of forums. Email him at DavidMc@aNewDomain.net.