The Night the Lightbulb Went On
A partial history of the public conversation on drones, and the birth of a dissertation
Photo via Shutterstock
Remember how, in the old Looney Tunes, Wile E. Coyote would get an idea and a lightbulb would appear above his head? I’m pretty sure everyone knows that’s almost never how it works in real life, certainly not in academic research. Almost never. Once in a while it happens kind of like that, though. And it’s so rare, it’s basically never caught on video. Almost never. But in my case, it was.
Let’s go back in time to 6 March 2014. I’d been working late and suddenly remembered an event was happening at the Robert J. Dole Institute of Politics. I locked up my office and headed out on the spur of the moment to attend the Innovations Series panel on drone policy. Here I want to show you a moment from that panel that will contextualize many of my future posts: the night my dissertation was born.
First, though, it helps to understand the way drones were perceived in spring 2014. It was nearly 13 years after the 9/11 attacks. Barack Obama was President. The Iraq War had ended in 2011, but we were still hearing about it. Osama bin Laden had also been killed in 2011, but the War in Afghanistan, which had begun in 2001, was still ongoing. And a central feature of the fighting in Afghanistan was the drone. Models varied. Sometimes we heard about Predators, sometimes about Reapers. Mainstream news channels mentioned technical specifications from time to time, but most of the conversation around drones focused on the policy implications of killing at a distance, punctuating the discourse with photos of faceless fuselages that connoted a blind American justice.
Longform journalism turned out to be a major locus of public conversation on drone policy; and looking back on it today, I’m grateful for the many journalists who spent considerable time and effort leaving a document trail about the Drone Wars to underscore the horror many felt at this new form of warfare. J.M. Berger first clarified the links between the 9/11 attackers and Anwar al-Awlaki, a New Mexico-born imam who’d had contact with the terrorists who eventually hijacked the planes. On President Obama’s orders, al-Awlaki was assassinated by drone on 30 September 2011. Berger’s work underscored that al-Awlaki was a threat; but the imam’s American birth and citizenship made the Executive Branch’s unilateral decision to assassinate him via drone seem like execution without due process. This drew a howl from many, including Conor Friedersdorf and Ta-Nehisi Coates—two writers I don’t normally think of as having much in common.
But the real outrage occurred two weeks later, on 14 October 2011, when another drone strike killed al-Awlaki’s son, Abdulrahman, while he sat eating outdoors in Yemen with a cousin; the younger al-Awlaki, an American citizen like his father, was 16 years old.
Over the next two years, journalists worked to clarify the horror of living in areas where drone strikes were being ordered. In September 2012, Friedersdorf tried to convey the terror of living under drones, drawing on quotes from residents of tribal Pakistan and other affected areas to show the trauma people were experiencing. As a girl-dad, this one kills me:
“When children hear the drones, they get really scared, and they can hear them all the time so they're always fearful that the drone is going to attack them,” an unidentified man reported. “Because of the noise, we're psychologically disturbed, women, men, and children.”
Early in 2013, Gen. Stanley McChrystal, who had been relieved of command in Afghanistan in 2010, was quoted thus:
"What scares me about drone strikes is how they are perceived around the world… The resentment created by American use of unmanned strikes ... is much greater than the average American appreciates. They are hated on a visceral level, even by people who've never seen one or seen the effects of one."
Elsewhere in the article he referred to a “perception of American arrogance that says, ‘Well we can fly where we want, we can shoot where we want, because we can.’” This PR nightmare was exacerbated by a summer 2013 editorial from Nasser al-Awlaki, Abdulrahman’s grandfather, who tragically reflected on the drone strike that killed his grandson:
Nearly two years later, I still have no answers. The United States government has refused to explain why Abdulrahman was killed. It was not until May of this year that the Obama administration, in a supposed effort to be more transparent, publicly acknowledged what the world already knew — that it was responsible for his death.
Later that summer, Hassan Abbas was pointing out what should have been obvious: all this fear of drones was working against the U.S.’s purported objective of peacemaking.
By the fall of 2013, news coverage intensely focused on drone warfare as inhumane. Mark Bowden’s landmark essay, “The Killing Machines: How to Think About Drones,” offered a sense of how drones worked and how they were being used. And Pakistani civilians’ testimony before Congress that fall revealed a general alienation from their environment thanks to drones, including the now-infamous quote, “I no longer love blue skies. In fact, I now prefer grey skies. The drones do not fly when skies are grey.” The boy who said it was 13 years old.
That pretty much brings you up to speed on how drones were perceived on 6 March 2014, when Admiral Timothy Baird (Ret.) and Scott Winship, both of Northrop Grumman, spoke on drones at the Dole Institute. For context, it’s good to know that Top Gun (1986), which receives a mention in this video, was basically an infomercial for the Grumman F-14 Tomcat—“Grumman” as in “Northrop Grumman.” So Winship’s quip, “Can I say ‘Top Gun’?” reveals what a company man he was at this moment. (And for further context on films like Top Gun, see Samantha Quigley’s exposition on just how much Hollywood needs the Pentagon’s permission to make such a film.) To understand what I’m saying below, you’ll really want to watch the Dole Institute video below from 51:27 to 56:50. I’ll wait.
The first questioner you see is not someone I know, so if you happen to know who that was, email me with a name, would you?
Whoever he is, his long, meandering delivery is a study in how not to ask a question in any Q/A, especially an academic one; but given the gravity and newness of the issue he was trying to address, let’s cut him some slack. Clearly he’d done his homework. In fact, he sounds like he had read so much he was having trouble sifting through it all to find the most important points to make in order to set up his question. And if I understand him rightly, he was basically arguing that since the Northrop Grumman reps were employed by a company that acted as a material cause of the U.S. Government’s execution of its citizens without due process, they bore some of the moral blame for that breach of the al-Awlakis’ rights, as well as for any future in which those rights could be circumvented at will by the Executive Branch.
(By the way, assuming that’s what he was arguing, do you agree with him? Or do you think people should be able to say something that amounts to, “Hey, man, not my problem. I have a right to earn by doing this. Take it up with the people who give the orders”? Feel free to leave a comment below.)
What I found interesting about that first question, however, was that the questioner needed to resort to Terminator vocabulary in order to describe what he was concerned about. It was a policy forum in a Political Science research center and governmental archive, not an event at KU’s Hall Center for the Humanities. Science fiction references seemed out of place for a second; but, as it turns out, they weren’t. Even the moderator thought the question about film or fiction influencing drone development was “very interesting.”
The second questioner—the geek in the salmon dress shirt with the carelessly knotted tie who asked about fiction or film—that was, uh, me. Or a past version of me, anyway. Two years earlier I’d moved to Lawrence, Kansas, where KU is located, from Washington, D.C., where I earned my M.A. in Literature from American University. During the four years I was there, I got to know D.C. culture well enough to know that when you talk to people in military and government circles, you keep it brief and punchy so that the question or point is clear beyond misinterpretation. If you do that, you can dress like a civilian (or whatever I was dressed like that night) and still earn respect.
In this video, Winship was more chatty than most of the defense sector folks I’d met in D.C. (And if I had to guess, I’d say Baird has experience in intelligence. Note how he said almost nothing for much of the Q/A. When you’re talking, you’re not listening or observing.) My professional assessment is that, by loosing so many words, Winship boxed himself in. First, he openly acknowledged that he reads both fiction and nonfiction that influence his work. While that acknowledgement wasn’t damning in itself, it could easily have raised harder questions, like “Which fiction is currently influencing your work?” And at that point he’d be in dangerous territory, because an honest answer might offer clues about what he’s working on. And so, as we saw, second, Winship was either unwilling or unable to name then-current fiction relevant to drones, for example Daniel Suarez’s Kill Decision (2012). In fact, it looks like he was playing for time, trying somehow to come up with an answer.

As I’ve returned to the footage over the years, this has raised lots of questions. Was Winship in a position to name books on his TBR pile, but then decided against that approach lest he tip Northrop Grumman’s hand? Or was he just bulling his way through the Q/A and trying to seem relatable, like some average joe just doing his job like the rest of us?

If he was posing, it made him look foolish because, third—and this was his gravest error, at least from where I was sitting—grasping for some generally relatable or academically respectable response, he named a title that no well-read American would ever be glad to hear has influenced someone developing drone technology: Orwell’s Nineteen Eighty-Four. That’s when the lightbulb came on for me.
I left right after that. I was too restless. As I walked out I texted my dissertation advisor and she called within minutes. “Film and fiction have been influencing drone development,” I told her.
“That’s your dissertation topic,” she said.
And that’s the night my dissertation, The Rise of the Mechanimal (2020), was born.
What’s a “mechanimal”? More about that in a future post.
Dr. Aaron M. Long is a Lecturer in English at a flagship state university, and in Philosophy at a historied regional art school. He has published articles in Twentieth-Century Literature, The Nautilus, and Science Fiction Film & Television, among others. You can find him on Twitter (yeah, yeah, on X) or LinkedIn, and his website is here.
This is an interesting and compelling way to adapt academic work to a newer media format, and I'm interested to see where the story goes! Almost a serialized narrative of academic work.
To answer the question you asked--I think it's right that weapon manufacturers should share some of the moral blame for their misuse, given that most people who design weapons or manage weapon-manufacturing firms could easily get jobs elsewhere. (An interesting post on this here: https://www.hamiltonnolan.com/p/quit-your-evil-job?r=716j&utm_campaign=post&utm_medium=web)
Surely there are counterpoints. A single person quitting their job at Northrop Grumman won't stop war crimes, for example; surely other people might step in and take the job. But no single act of moral uprightness ever eradicates wickedness.
On the other hand, I could never get any of these jobs if I wanted to, so of course it's easy for me to criticize those who can and do.
Memories :) I taught Kill Decision that year (after Kate Hayles brought it to your summer seminar) and for a while in frosh classes after that. I'm struck by how much of the current Silicon Valley enthusiasm for AI seems born of a similar...imperfect reading of fiction. Indeed, much comment has been made about how cautionary tales are being used as blueprints. Palantir, anyone?