Cora Currier: Drone Makers Gather to Defend Their Much-Maligned Machines

October 21, 2013

A dispatch from the conference of the Association for Unmanned Vehicle Systems.

[Image from Flickr via diabloazul]

By Cora Currier
By arrangement with ProPublica

“I have some d-word difficulty,” said Michael Toscano, president and CEO of the Association for Unmanned Vehicle Systems International, a trade group for makers and enthusiasts of robots of air, land and sea.

The d-word, of course, is drones.

“Just when I say that word, ‘drrrrone,’” he intoned, waving his hands, “it has a negative connotation. Drone bees: They’re not smart, they just follow orders, they do things autonomously, and they die. When you think of a drone it’s just that, it does one thing and it blasts things out of the air.”

Toscano and I spoke over lunch at the Drones and Aerial Robotics Conference at New York University last weekend. Why was “drones” in the name? For one, it’s an attention grabber. For another, DARC is a “cool acronym,” said an organizer, even if it doesn’t help dispel the spooky associations that give Toscano a headache.

The conference was one part industry showcase, one part academic gathering, and one part workshop, reflecting the various camps of drone defenders and disparagers. Machines whirred around a stage in a demonstration, and their makers showed off a stream of videos of mountaintops, biking stunts, and cityscapes set to thumping music.

Far beyond their military uses, drones could pollinate crops, help firefighters–even accompany “a family on vacation in Hawaii,” said Colin Guinn, CEO of a company that makes drones for photography.

“There’s a reason we make the Phantom white, and not black. It’s not creepy. Look how cute it is!” said Guinn, referring to the small drone hovering at his side, flashing lights to charm its audience. (A researcher from Harvard arguably failed the creepy test, explaining to the audience what to consider “if you want to build a swarm of robotic bees.”)

The tech geeks, though, were almost outnumbered by those of another stripe: philosophers, lawyers, and critics who propose that drones are “a different ontological category,” of “social machines,” as Ryan Calo, a law professor at the University of Washington, put it.

I asked Patrick Egan, president of the Silicon Valley chapter of Toscano’s group and editor at an industry blog, whether drone manufacturers lie awake at night contemplating the ethics of technology, the brave new world their products represent.

“The hyperbole is out of control,” he said. “It is transformative technology, but not in the way people think.”

The conference brought out some “different perspectives,” said Egan, who also does consulting for the military. “I’m on this panel with a women’s studies professor. She wants to say I’m a Randian. I don’t even get that. Hey, I’ve read a little Ayn Rand; right now I’m reading Naked Lunch! It wasn’t the industry that inspired me to do that.”

The U.S. has virtually no commercial civilian drone market, as the Federal Aviation Administration has been slow to approve the widespread use of drones. In the past year, the public has increasingly pushed back against the drone war overseas and surveillance at home. ProPublica has covered the secrecy that surrounds the administration’s drone war, from signature strikes to civilian casualties. The lack of transparency (the government still won’t release documents related to its targeted killing program) has helped contribute to wariness about the pilotless craft.

But the industry line at the conference was that drones are merely a technological platform, with a range of possibilities. They don’t spy, or kill; the people ordering them around do.

A panel on “life under drones” in Pakistan and Afghanistan turned tense when the presenters said they couldn’t show images of drone victims. (The organizers said it was a technical issue.)

“I don’t understand the hostility,” one young engineer said in reaction.

Toscano hates that critiques of U.S. airstrikes zero in on drones. “It’s not a drone strike unless they physically fly the aircraft into whatever the target is. It is an airstrike because it launches a Hellfire missile or a weapon.”

Journalists in Yemen have made the same point about media using “drone” as a shorthand for U.S. military action in that country. But Toscano–who spent years involved in research and development at the Pentagon–also defends the use of military drones: “If they fly manned systems, some of them could be shot down. Would you want those pilots to be shot down?”

Domestic, unarmed drones were also scapegoats for the public’s concerns about privacy, he said. Other, more common technologies have already eroded privacy. The public lost privacy via “cellphones, they lost it on GPS, they lost it on the Internet. They can’t get that genie back in the bottle.” The difference with drones is that “we don’t have these systems flying.”

John Kaag, a philosopher at University of Massachusetts Lowell, had asked the audience at his lecture to stare into the eyes of the person next to them while he counted out five awkward seconds, to feel “the human” concern with surveillance. He advised the drone industry, “Make people know that you feel that.” Humans “are responsible, drones are not responsible.”

Toscano said he was fine with staring at the man beside him. “I’m an extrovert! The only thing I said to the guy is, ‘I don’t mind this at all but if you were a woman I’d probably enjoy it more.'”

And what about the concerns, both ethical and practical, that autonomous machines take humans out of the equation in novel and dangerous ways?

Cars already do a lot of things autonomously, Toscano offered. Car crashes kill thousands every year, but we consider the technology indispensable to modern life.

“If Martians came down to earth and said we will cure all of cancer on the globe, and for doing it, you have to give me 100,000 of your people for me to cannibalize, to eat, would we do the deal? Most people would say no. Our society does not believe that cannibalism is acceptable.”

“Right now, in human nature, it’s unacceptable for a machine to kill a human being,” he said.

That’s why people are uncomfortable with driverless cars or drones, Toscano said. He’s confident the “risk acceptance” will change, and that fears about the technology will become as quaint as 19th-century concerns about elevators.

Cora Currier covers national security for ProPublica.

