Lucy Suchman, Karolina Follis and Jutta Weber: Tracking and targeting
This introduction to the special issue of the same title sets out the context for a critical examination of contemporary developments in sociotechnical systems deployed in the name of security. Our focus is on technologies of tracking, with their claims to enable the identification of those who comprise legitimate targets for the use of violent force. Taking these claims as deeply problematic, we join a growing body of scholarship on the technopolitical logics that underpin an increasingly violent landscape of institutions, infrastructures, and actions, promising protection to some but arguably contributing to our collective insecurity. We examine the asymmetric distributions of sociotechnologies of (in)security; their deadly and injurious effects; and the legal, ethical, and moral questions that haunt their operations.
Karolina Follis: Visions and transterritory: the borders of Europe
This essay is about the role of visual surveillance technologies in the policing of the external borders of the European Union (EU). Based on an analysis of documents published by EU institutions and independent organizations, I argue that these technological innovations fundamentally alter the nature of national borders. I discuss how new technologies of vision are deployed to transcend the physical limits of territories. In the last twenty years, EU member states and institutions have increasingly relied on various forms of remote tracking, including the use of drones for the purposes of monitoring frontier zones. In combination with other facets of the EU border management regime (such as transnational databases and biometrics), these technologies coalesce into a system of governance that has enabled intervention into neighboring territories and territorial waters of other states to track and target migrants for interception in the “prefrontier.” For jurisdictional reasons, this practice effectively precludes the enforcement of legal human rights obligations, which European states might otherwise have with regard to these persons. This article argues that this technologically mediated expansion of vision has become a key feature of post–cold war governance of borders in Europe. The concept of transterritory is proposed to capture its effects.
Christiane Wilke: Seeing and unmaking civilians in Afghanistan: visual technologies and contested professional visions
While the distinction between civilians and combatants is fundamental to international law, it is contested and complicated in practice. How do North Atlantic Treaty Organization (NATO) officers see civilians in Afghanistan? Focusing on a 2009 air strike in Kunduz, this article argues that the professional vision of NATO officers relies not only on recent military technologies that allow for aerial surveillance, thermal imaging, and precise targeting but also on the assumptions, vocabularies, modes of attention, and hierarchies of knowledges that the officers bring to the interpretation of aerial surveillance images. Professional vision is socially situated and frequently contested within communities of practice. In the case of the Kunduz air strike, the aerial vantage point and the military visual technologies cannot fully determine what would be seen. Instead, the officers’ assumptions about Afghanistan, threats, and the gender of the civilian inform the vocabulary they use for coding people and places as civilian or noncivilian. Civilians are not simply “found”; they are produced through specific forms of professional vision.
Jon Lindsay: Target practice: Counterterrorism and the amplification of data friction
The nineteenth-century strategist Carl von Clausewitz describes “fog” and “friction” as fundamental features of war. Military leverage of sophisticated information technology in the twenty-first century has improved some tactical operations but has not lifted the fog of war, in part because the means for reducing uncertainty create new forms of it. Drawing on active duty experience with an American special operations task force in western Iraq from 2007 to 2008, this article traces the targeting processes used to “find, fix, and finish” alleged insurgents. In this case they did not clarify the political reality of Anbar province but rather reinforced a parochial worldview informed by the Naval Special Warfare community. The unit focused on the performance of “direct action” raids during a period in which “indirect action” engagement with the local population was arguably more appropriate for the strategic circumstances. The concept of “data friction”, therefore, can be understood not simply as a form of resistance within a sociotechnical system but also as a form of traction that enables practitioners to construct representations of the world that amplify their own biases.
M.C. Elish: Remote split: a history of US drone operations and the distributed labour of war
This article analyzes US drone operations through a historical and ethnographic analysis of the remote split paradigm used by the US Air Force. Remote split refers to the globally distributed command and control of drone operations and entails a network of human operators and analysts in the Middle East, Europe, and Southeast Asia as well as in the continental United States. Though often viewed as a teleological progression of “unmanned” warfare, this paper argues that historically specific technopolitical logics establish the conditions of possibility for the work of war to be divisible into discrete and computationally mediated tasks that are viewed as effective in US military engagements. To do so, the article traces how new forms of authorized evidence and expertise have shaped developments in military operations and command and control priorities from the Cold War and the “electronic battlefield” of Vietnam through the Gulf War and the conflict in the Balkans to contemporary deployments of drone operations. The article concludes by suggesting that it is by paying attention to divisions of labor and human–machine configurations that we can begin to understand the everyday and often invisible structures that sustain perpetual war as a military strategy of the United States.
I’ve discussed Christiane’s excellent article in detail before, but the whole issue repays careful reading.
And if you’re curious about the map that heads this post, it’s based on the National Security Agency’s Strategic Mission List (dated 2007 and published in the New York Times on 2 November 2013), and mapped at Electrospaces: full details here.
In Lucy Suchman’s marvellous essay on ‘Situational Awareness’ in remote operations she draws attention to what she calls bioconvergence:
A corollary to the configuration of “their” bodies as targets to be killed is the specific way in which “our” bodies are incorporated into war fighting assemblages as operating agents, at the same time that the locus of agency becomes increasingly ambiguous and diffuse. These are twin forms of contemporary bioconvergence, as all bodies are locked together within a wider apparatus characterized by troubling lacunae and unruly contingencies.
In the wake of her work, there has been a cascade of essays insisting on the embodiment of air strikes carried out by Predators and Reapers – the bodies of the pilots, sensor operators and the legion of others who carry out these remote operations, and the bodies of their victims – and on what Lauren Wilcox calls the embodied and embodying nature of drone warfare (‘Embodying algorithmic war: Gender, race, and the posthuman in drone warfare’ in Security dialogue, 2016; see also Lorraine Bayard de Volo, ‘Unmanned? Gender recalibrations and the rise of drone warfare’, Politics and gender, 2015). Lauren distinguishes between visual, algorithmic and affective modes of embodiment, and draws on the transcript of what has become a canonical air strike in Uruzgan province (Afghanistan) on 21 February 2010 to develop her claims (more on this in a moment).
And yet it’s a strange sort of embodying because within the targeting process these three registers also produce an estrangement and ultimately an effacement. The corporeal is transformed into the calculative: a moving target, a data stream, an imminent threat. If this is still a body at all, it’s radically different from ‘our’ bodies. As I write these words, I realise I’m not convinced by the passage in George Brant’s play Grounded in which the face of a little girl on the screen, the daughter of a ‘High Value Target’, becomes the face of the Predator pilot’s own daughter. For a digital Orientalism is at work through those modes of embodiment, one that interpellates those watching as spectators of what Edward Said once called ‘a living tableau of queerness’ that in so many cases will become a dead tableau of bodies which remain irredeemably Other.
There is a history to the embodiment of air strikes, as my image above shows. Aerial violence in all its different guises has almost invariably involved an asymmetric effacement. The lives – and the bodies – of those who flew the first bombing missions over the Western Front in the First World War; the young men who sacrificed their lives during the Combined Bomber Offensive in the Second World War; and even the tribulations and traumas encountered by the men and women conducting remote operations over Afghanistan and elsewhere have all been documented in fact and in fiction.
And yet, while others – notably social historians, investigative journalists and artists – have sought to bring into view the lives shattered by aerial violence, its administration has long mobilised an affective distance between bomber and bombed. As I showed in ‘Doors into nowhere’ and ‘Lines of descent’ (DOWNLOADS tab), the bodies of those crouching beneath the bombs are transformed into abstract co-ordinates, coloured lights and target boxes. Here is Charles Lindbergh talking about the air war in the Pacific in May 1944:
You press a button and death flies down. One second the bomb is hanging harmlessly in your racks, completely under your control. The next it is hurtling through the air, and nothing in your power can revoke what you have done… How can there be writhing, mangled bodies? How can this air around you be filled with unseen projectiles? It is like listening to a radio account of a battle on the other side of the earth. It is too far away, too separated to hold reality.
Or Frank Musgrave, a navigator with RAF Bomber Command, writing about missions over Germany that same year:
These German cities were simply coordinates on a map of Europe, the first relatively near, involving around six hours of flying, the second depressingly distant, involving some eight or nine hours of flying. Both sets of coordinates were at the centre of areas shaded deep red on our maps to indicate heavy defences. For me ‘Dortmund’ and ‘Leipzig’ had no further substance or concrete reality.
Harold Nash, another navigator:
It was black, and then suddenly in the distance you saw lights on the floor, the fires burning. As you drew near, they looked like sparkling diamonds on a black satin background… [T]hey weren’t people to me, just the target. It’s the distance and the blindness which enabled you to do these things.
One last example – Peter Johnson, a Group Captain who served with distinction with RAF Bomber Command:
Targets were now marked by the Pathfinder Force … and these instructions, to bomb a marker, introduced a curiously impersonal factor into the act of dropping huge quantities of bombs. I came to realize that crews were simply bored by a lot of information about the target. What concerned them were the details of route and navigation, which colour Target Indicator they were to bomb… In the glare of searchlights, with the continual winking of anti-aircraft shells, the occasional thud when one came close and left its vile smell, what we had to do was search for coloured lights dropped by our own people, aim our bombs at them and get away.
The airspace through which the bomber stream flew was a viscerally biophysical realm, in which the crews’ bodies registered the noise of the engines, the shifts in course and elevation, the sound and stink of the flak, the abrupt lift of the aircraft once the bombs were released. They were also acutely aware of their own bodies: fingers numbed by the freezing cold, faces encased in rubbery oxygen masks, and frantic fumblings over the Elsan. But the physicality of the space far below them was reduced to the optical play of distant lights and flames, and the crushed, asphyxiated and broken bodies appeared – if they appeared at all – only in their nightmares.
These apprehensions were threaded into what I’ve called a ‘moral economy of bombing’ that sought (in different ways and at different times) to legitimise aerial violence by lionising its agents and marginalising its victims (see here: scroll down).
But remote operations threaten to transform this calculus. Those who control Predators and Reapers sit at consoles in air-conditioned containers, which denies them the physical sensations of flight. Yet in one acutely optical sense they are much closer to the devastation they cause: eighteen inches away, they usually say, the distance from eye to screen. And the strikes they execute are typically against individuals or small groups of people (rather than objects or areas), and they rely on full-motion video feeds that show the situation both before and after in detail (however imperfectly). Faced with this highly conditional intimacy, as Lauren shows, the bodies that appear in the cross-hairs are produced as killable bodies through a process of somatic abstraction – leaving the fleshy body behind – that is abruptly reversed once the missile is released.
Thus in the coda to the original version of ‘Dirty Dancing’ (DOWNLOADS tab) – which I’ve since excised from what was a very long essay; reworked, it will appear in a revised form as ‘The territory of the screen’ – I described how
intelligence agencies produce and reproduce the [Federally Administered Tribal Areas in Pakistan] as a data field that is systematically mined to expose seams of information and selectively sown with explosives to be rematerialised as a killing field. The screens on which and through which the strikes are animated are mediations in an extended sequence in which bodies moving into, through and out from the FATA are tracked and turned into targets in a process that Ian Hacking describes more generally as ‘making people up’: except that in this scenario the targets are not so much ‘people’ as digital traces. The scattered actions and interactions of individuals are registered by remote sensors, removed from the fleshiness of human bodies and reassembled as what Grégoire Chamayou calls ‘schematic bodies’. They are given codenames (‘Objective x’) and index numbers, they are tracked on screens and their danse macabre is plotted on time-space grids and followed by drones. But as soon as the Hellfire missiles are released the transformations that have produced the target over the preceding weeks and months cascade back into the human body: in an instant virtuality becomes corporeality and traces turn into remains.
There are two difficulties in operationalising that last sentence. One is bound up with evidence – and in particular with reading what Oliver Kearns calls the ‘residue’ of covert strikes (see his ‘Secrecy and absence in the residue of covert drone strikes’, Political Geography, 2016) – and the other is one that I want to address here.
To do so, let me turn from the FATA to Yemen. The Mwatana Organisation for Human Rights in Sana’a has released a short documentary, Waiting for Justice, that details the effects of a US drone strike on civilians:
If the embedded version doesn’t work, you can find it on YouTube.
At 6 a.m. on 19 April 2014 a group of men – mainly construction workers, plus one young father hitching a ride to catch a bus into Saudi Arabia – set off from their villages in al-Sawma’ah to drive to al-Baidha city; 20 to 30 metres behind their Toyota Hilux, it turned out, was a Toyota Land Cruiser carrying suspected members of Al Qaeda in the Arabian Peninsula.
That car was being tracked by a drone: it fired a Hellfire missile, striking the car and killing the occupants, and shrapnel hit the Hilux. Some of the civilians sought refuge in an abandoned water canal, when the drone (or its companion) returned for a second strike.
Four of them were killed – Sanad Hussein Nasser al-Khushum (30), Yasser Abed Rabbo al-Azzani (18), Ahmed Saleh Abu Bakr (65) and Abdullah Nasser Abu Bakr al-Khushum – and five were injured: the driver, Nasser Mohammed Nasser (35), Abdulrahman Hussein al-Khushum (22), Najib Hassan Nayef (35), Salem Nasser al-Khushum (40) and Bassam Ahmed Salem Breim (20).
The film draws on Death by Drone: civilian harm caused by US targeted killing in Yemen, a collaborative investigation carried out by the Open Society Justice Initiative in the United States and Mwatana in Yemen into nine drone strikes: one of them (see pp. 42-48) is the basis of the documentary; the strike is also detailed by the Bureau of Investigative Journalism as YEM159 here.
That report, together with the interview and reconstruction for the documentary, have much to tell us about witnesses and residues.
In addition the father of one of the victims, describing the strike in the film, says ‘They slaughter them like sheep’…
… and, as Joe Pugliese shows in a remarkable new essay, that phrase contains a violent, visceral truth.
Joe describes a number of other US strikes in Yemen – by cruise missiles and by Hellfire missiles fired from drones (on which see here; scroll down) – in which survivors and rescuers confronted a horrific aftermath in which the incinerated flesh of dead animals and the flesh of dead human beings became indistinguishable. This is a radically different, post-strike bioconvergence that Joe calls a geobiomorphology:
The bodies of humans and animals are here compelled to enflesh the world through the violence of war in a brutally literal manner: the dismembered and melted flesh becomes the ‘tissue of things’ as it geobiomorphologically enfolds the contours of trees and rocks. What we witness in this scene of carnage is the transliteration of metadata algorithms to flesh. The abstracting and decorporealising operations of metadata ‘without content’ are, in these contexts of militarised slaughter of humans and animals, geobiomorphologically realised and grounded in the trammelled lands of the Global South.
Indeed, he’s adamant that it is no longer possible to speak of the corporeal in the presence of such ineffable horror:
One can no longer talk of corporeality here. Post the blast of a drone Hellfire missile, the corpora of animals-humans are rendered into shredded carnality. In other words, operative here is the dehiscence of the body through the violence of an explosive centripetality that disseminates flesh. The moment of lethal violence transmutes flesh into unidentifiable biological substance that is violently compelled geobiomorphologically to assume the topographical contours of the debris field.
By these means, he concludes,
the subjects of the Global South [are rendered] as non-human animals captivated in their lawlessness and inhuman savagery and deficient in everything that defines the human-rights-bearing subject. In contradistinction to the individuating singularity of the Western subject as named person, they embody the anonymous genericity of the animal and the seriality of the undifferentiated and fungible carcass. As subjects incapable of embodying the figure of “the human,” they are animals who, when killed by drone attacks, do not die but only come to an end.
You can read the essay, ‘Death by Metadata: The bioinformationalisation of life and the transliteration of algorithms to flesh’, in Holly Randell-Moon and Ryan Tippet (eds) Security, race, biopower: essays on technology and corporeality (London: Palgrave, 2016) 3-20.
It’s an arresting, truly shocking argument. You might protest that the incidents described in the essay are about ordnance not platform – that a cruise missile fired from a ship or a Hellfire missile fired from an attack helicopter would produce the same effects. And so they have. But Joe’s point is that where Predators and Reapers are used to execute targeted killings they rely on the extraction of metadata and its algorithmic manipulation to transform individualised, embodied life into a stream of data – a process that many of us have sought to recover – but that in the very moment of execution those transformations are not simply, suddenly reversed but displaced into a generic flesh. (And there is, I think, a clear implication that those displacements are pre-figured in the original de-corporealisation – the somatic abstraction – of the target).
Joe’s discussion is clearly not intended to be limited to those (literal) instances where animals are caught up in a strike; it is, instead, a sort of limit-argument designed to disclose the bio-racialisation of targeted killing in the global South. It reappears time and time again. Here is a sensor operator, a woman nicknamed “Sparkle”, describing the aftermath of a strike in Afghanistan conducted from Creech Air Force Base in Nevada:
Sparkle could see a bunch of hot spots all over the ground, which were likely body parts. The target was dead, but that isn’t always the case. The Hellfire missile only has 12 pounds of explosives, so making sure the target is in the “frag pattern,” hit by shrapnel, is key.
As the other Reaper flew home to refuel and rearm, Spade stayed above the target, watching as villagers ran to the smoldering motorbike. Soon a truck arrived. Spade and Sparkle watched as they picked up the target’s blasted body.
“It’s just a dead body,” Sparkle said. “I grew up elbows deep in dead deer. We do what we needed to do. He’s dead. Now we’re going to watch him get buried.”
The passage I’ve emphasised repeats the imaginary described by the strike survivor in Yemen – but from the other side of the screen.
Seen thus, Joe’s argument speaks directly to the anguished question asked by one of the survivors of the Uruzgan killings in Afghanistan:
How can you not identify us? (The question – and the still above – are taken from the reconstruction in the documentary National Bird). We might add: How do you identify us? These twin questions intersect with a vital argument developed by Christiane Wilke, who is deeply concerned that civilians now ‘have to establish, perform and confirm their civilianhood by establishing and maintaining legible patterns of everyday life, by conforming to gendered and racialized expectations of mobility, and by not ever being out of place, out of time’ (see her chapter, ‘The optics of war’, in Sheryl Hamilton, Diana Majury, Dawn Moore, Neil Sargent and Christiane Wilke, eds., Sensing Law [2017] pp 257-79: 278). As she wrote to me:
I’m really disturbed by the ways in which the burden of making oneself legible to the eyes in the sky is distributed: we don’t have to do any of that here, but the people to whom we’re bringing the war have to perform civilian-ness without fail.
Asymmetry again. Actors required to perform their civilian-ness in a play they haven’t devised before an audience they can’t see – and which all too readily misunderstands the plot. And if they fail they become killable bodies.
But embodying does not end there; its terminus is the apprehension of injured and dead bodies. So let me add two riders to the arguments developed by Lauren and Joe. I’ll do so by returning to the Uruzgan strike.
I should say at once that this is a complicated case (see my previous discussions here and here). In the early morning three vehicles moving down dusty roads and tracks were monitored for several hours by a Predator controlled by a flight crew at Creech Air Force Base in Nevada; to the south a detachment of US Special Forces was conducting a search operation around the village of Khod, supported by Afghan troops and police; and when the Ground Force Commander determined that this was a ‘convoy’ of Taliban that posed a threat to his men he called in an air strike executed by two OH-58 attack helicopters that killed 15 or 16 people and wounded a dozen others. All of the victims were civilians.

This was not a targeted killing, and there is little sign of the harvesting of metadata or the mobilisation of algorithms – though there was some unsubstantiated talk of the possible presence of a ‘High-Value Individual’ in one of the vehicles, referred to both by name and by the codename assigned to him on the Joint Prioritised Effects List. While the evidence for this seems to have been largely derived from chatter on short-wave radios picked up by the Special Forces on the ground, it is possible that a forward-deployed NSA team at Bagram was also involved in communications intercepts. Still, there was no geo-locational fixing, no clear link between these radio communications and the three vehicles, and ultimately it was the visual construction of their movement and behaviour as a ‘hostile’ pattern of life that provoked what was, in effect, a signature strike.

But this was not conventional Close Air Support either: the Ground Force Commander declared first a precautionary ‘Air TIC’ (Troops In Contact) so that strike aircraft could be ready on station to come to his defence – according to the investigation report, this created ‘a false sense of urgency’ – and then ‘Troops in Contact’.
Yet when the attack helicopters fired their missiles no engagement had taken place and the vehicles were moving away from Khod (indeed, they were further away than when they were first observed). This was (mis)read as ‘tactical maneuvering’.
My first rider is that the process is not invariably the coldly calculating sequence conjured by the emphasis on metadata and algorithms – what Dan McQuillan calls ‘algorithmic seeing’ – or the shrug-your-shoulders attitude of Sparkle. This is why the affective is so important, but it is multidimensional. I doubt that it is only in films like Good Kill (below) or Eye in the Sky that pilots and sensor operators are uncomfortable, even upset, at what they do. Not all sensor operators are Brandon Bryant – but they aren’t all Sparkle either.
All commentaries on the Uruzgan strike – including my own – draw attention to how the pilot, sensor operator and mission intelligence coordinator watching the three vehicles from thousands of miles away were predisposed to interpret every action as hostile. The crew was neither dispassionate nor detached; on the contrary, they were eager to move in for the kill. At least some of those in the skies above Uruzgan had a similar view. The lead pilot of the two attack helicopters that carried out the strike was clearly invested in treating the occupants of the vehicles as killable bodies. He had worked with the Special Operations detachment before, knew them very well, and – like the pilot of the Predator – believed they were ‘about to get rolled up and I wanted to go and help them out… [They] were about to get a whole lot of guys in their face.’
Immediately after the strike the Predator crew convinced themselves that the bodies were all men (‘military-aged males’):
08:53 (Safety Observer): Are they wearing burqas?
08:53 (Sensor): That’s what it looks like.
08:53 (Pilot): They were all PIDed as males, though. No females in the group.
08:53 (Sensor): That guy looks like he’s wearing jewelry and stuff like a girl, but he ain’t … if he’s a girl, he’s a big one.
Reassured, the crew relaxed and their conversation became more disparaging:
09:02 (Mission Intelligence Coordinator (MC)): There’s one guy sitting down.
09:02 (Sensor): What you playing with? (Talking to individual on ground.)
09:02 (MC): His bone.
….
09:04 (Sensor): Yeah, see there’s…that guy just sat up.
09:04 (Safety Observer): Yeah.
09:04 (Sensor): So, it looks like those lumps are probably all people.
09:04 (Safety Observer): Yep.
09:04 (MC): I think the most lumps are on the lead vehicle because everybody got… the Hellfire got…
….
09:06 (MC): Is that two? One guy’s tending the other guy?
09:06 (Safety Observer): Looks like it.
09:06 (Sensor): Looks like it, yeah.
09:06 (MC): Self‐Aid Buddy Care to the rescue.
09:06 (Safety Observer): I forget, how do you treat a sucking gut wound?
09:06 (Sensor): Don’t push it back in. Wrap it in a towel. That’ll work.
The corporeality of the victims flickers into view in these exchanges, but in a flippantly anatomical register (‘playing with … his bone’; ‘Don’t push it back in. Wrap it in a towel’).
But the helicopter pilots reported the possible presence of women, identified only by their brightly coloured dresses, and soon after (at 09:10) the Mission Intelligence Coordinator said he saw ‘Women and children’, which was confirmed by the screeners. The earlier certainty, the desire to kill, gave way to uncertainty, disquiet.
These were not the only eyes in the sky and the sequence was not closed around them. Others watching the video feed – the analysts and screeners at Hurlburt Field in Florida, the staff at the Special Operations Task Force Operations Centre in Kandahar – read the imagery more circumspectly. Many of them were unconvinced that these were killable bodies – when the shift changed in the Operations Centre the Day Battle Captain called in a military lawyer for advice, and the staff agreed to call in another helicopter team to force the vehicles to stop and determine their status and purpose – and many of them were clearly taken aback by the strike. Those military observers who were most affected by the strike were the troops on the ground. The commander who had cleared the attack helicopters to engage was ferried to the scene to conduct a ‘Sensitive Site Exploitation’. What he found, he testified, was ‘horrific’: ‘I was upset physically and emotionally’.
My second rider is that war provides – and also provokes – multiple apprehensions of the injured or dead body. They are not limited to the corpo-reality of a human being and its displacement and dismemberment into what Joe calls ‘carcass’. In the Uruzgan case the process of embodying did not end with the strike and the continued racialization and gendering of its victims by the crew of the Predator described by Lauren.
The Sensitive Site Exploitation – the term was rescinded in June 2010; the US Army now prefers simply ‘site exploitation‘, referring to the systematic search for and collection of ‘information, material, and persons from a designated location and analyzing them to answer information requirements, facilitate subsequent operations, or support criminal prosecution’ – was first and foremost a forensic exercise. Even in death, the bodies were suspicious bodies. A priority was to establish a security perimeter and conduct a search of the site. The troops were looking for survivors but they were also searching for weapons, for evidence that those killed were insurgents and for any intelligence that could be gleaned from their remains and their possessions. This mattered: the basis for the attack had been the prior identification of weapons from the Predator’s video feed and a (highly suspect) inference of hostile intent. But it took three and a half hours for the team to arrive at the engagement site by helicopter, and a naval expert on IEDs and unexploded ordnance who was part of the Special Forces detachment was immediately convinced that the site had been ‘tampered with’. The bodies had been moved, presumably by people from a nearby village who had come to help:
The bodies had been lined up and had been covered… somebody else was on the scene prior to us … The scene was contaminated [sic] before we got there.
He explained to MG Timothy McHale, who led the subsequent inquiry, what he meant:
The Ground Force Commander reported that he ‘wouldn’t take photos of the KIA [Killed in Action] – but of the strike’, yet it proved impossible to maintain a clinical distinction between them (see the right-hand panel below; he also reported finding bodies still trapped in and under the vehicles).
His photographs of the three vehicles were annotated by the investigation team to show points of impact, but the bodies of some of the dead were photographed too. These still photographs presumably also had evidentiary value – though unlike conventional crime scene imagery they were not, so far as I can tell, subject to any rigorous analysis. In any case: what evidentiary value? Or, less obliquely, whose crime? Was the disposition of the bodies intended to confirm that they had been moved, the scene ‘contaminated’ – the investigator’s comments on the photograph note ‘Bodies from Vehicle Two did not match blast pattern’ – so that any traces of insurgent involvement could have been erased? (There is another story here, because the investigation uncovered evidence that staff in the Operations Centres refused to accept the first reports of civilian casualties, and there is a strong suspicion that initial storyboards were manipulated to conceal that fact.) Or do the shattered corpses driven into metal and rock silently confirm the scale of the incident and the seriousness of any violation of the laws of war and the rules of engagement?
The Ground Force Commander also had his medics treat the surviving casualties, and called in a 9-line request (‘urgent one priority’) for medical evacuation (MEDEVAC). Military helicopters took the injured to US and Dutch military hospitals at Tarin Kowt, and en route they became the objects of a biomedical gaze that rendered their bodies as a series of visible wounds and vital signs that were distributed among the boxes of standard MEDEVAC report forms:
At that stage none of the injured was identified by name (see the first box on the top left); six of the cases – as they had become – were recorded as having been injured by ‘friendly’ forces, but five of them mark ‘wounded by’ as ‘unknown’. Once in hospital they were identified, and the investigation team later visited them and questioned them about the incident and their injuries (which they photographed).
These photographs and forms are dispassionate abstractions of mutilated and pain-bearing bodies, but it would be wrong to conclude from these framings that those producing them – the troops on the ground, the medics and EMTs – were not affected by what they saw.
And it would also be wrong to conclude that military bodies are immune from these framings. Most obviously, these are standard forms used for all MEDEVAC casualties, civilian or military, and all patients are routinely reduced to an object-space (even as they also remain so much more than that: there are multiple, co-existing apprehensions of the human body).
Yet I have in mind something more unsettling. Ken MacLeish reminds us that
for the soldier, there is no neat division between what gore might mean for a perpetrator and what it might mean for a victim, because he is both at once. He is stuck in the middle of this relation, because this relation is the empty, undetermined center of the play of sovereign violence: sometimes the terror is meant for the soldier, sometimes he is merely an incidental witness to it, and sometimes he, or his side, is the one responsible for it.
If there is no neat division there is no neat symmetry either; not only is there a spectacular difference between the vulnerability of pilots and sensor operators in the continental United States and the troops on the ground – a distance which I’ve argued intensifies the desire of some remote crews to strike whenever troops are in danger – but there can also be a substantial difference between the treatment of fallen friends and foes: occasional differences in the respect accorded to dead bodies and systematic differences in the (long-term) care of injured ones.
But let’s stay with Ken. He continues:
Soldiers say that a body that has been blown up looks like spaghetti. I heard this again and again – the word conjures texture, sheen, and abject, undifferentiated mass, forms that clump into knots or collapse into loose bits.
He wonders where this comes from:
Does it domesticate the violence and loss? Is it a critique? Gallows humor? Is it a reminder, perhaps, that you are ultimately nothing more than the dumb matter that you eat, made whole and held together only by changeable circumstance? Despite all the armor, the body is open to a hostile world and can collapse into bits in the blink of an eye, at the speed of radio waves, electrons, pressure plate springs, and hot metal. The pasta and red sauce are reminders that nothing is normal and everything has become possible. Some body—one’s own body—has been placed in a position where it is allowed to die. More than this, though, it has been made into a thing…
One soldier described recovering his friend’s body after his tank had been hit by an IED:
… everything above his knees was turned into fucking spaghetti. Whatever was left, it popped the top hatch, where the driver sits, it popped it off and it spewed whatever was left of him all over the front slope. And I don’t know if you know … not too many people get to see a body like that, and it, and it…
We went up there, and I can remember climbing up on the slope, and we were trying to get everybody out, ’cause the tank was on fire and it was smoking. And I kept slipping on – I didn’t know what I was slipping on, ’cause it was all over me, it was real slippery. And we were trying to get the hatch open, to try to get Chris out. My gunner, he reached in, reached in and grabbed, and he pulled hisself back. And he was like, “Holy shit!” I mean, “Holy shit,” that was all he could say. And he had cut his hand. Well, what he cut his hand on was the spinal cord. The spine had poked through his hand and cut his hand on it, ’cause there was pieces of it left in there. And we were trying to get up, and I reached down and pushed my hand down to get up, and I reached up and looked up, and his goddamn eyeball was sitting in my hand. It had splattered all up underneath the turret. It was all over me, it was all over everybody, trying to get him out of there…
I think Ken’s commentary on this passage provides another, compelling perspective on the horror so deeply embedded in Joe’s essay:
There is nothing comic or subversive here; only horror. Even in the middle of the event, it’s insensible, unspeakable: and it, and it …, I didn’t know what I was slipping on. The person is still there, and you have to “get him out of there,” but he’s everywhere and he’s gone at the same time. The whole is gone, and the parts – the eye, the spine, and everything else – aren’t where they should be. A person reduced to a thing: it was slippery, it was all over, that was what we sent home. He wasn’t simply killed; he was literally destroyed. Through a grisly physics, there was somehow less of him than there had been before, transformed from person into dumb and impersonal matter.
‘Gore,’ he concludes, ‘is about the horror of a person being replaced by stuff that just a moment ago was a person.’ Explosive violence ruptures the integrity of the contained body – splattered over rocks or metal surfaces in a catastrophic bioconvergence.
I hope it will be obvious that none of this is intended to substitute any sort of equivalence for the asymmetries that I have emphasised throughout this commentary. I hope, too, that I’ve provided a provisional supplement to some of the current work on metadata, algorithms and aerial violence – hence my title. As Linda McDowell remarked an age ago – in Working Bodies (pp. 223-4) – the term ‘meatspace’ is offensive in all sorts of ways (its origins lie in cyberpunk where it connoted the opposite to cyberspace, but I concede the opposition is too raw). Still, it is surely important to recover the ways in which later modern war and militarised violence (even in its digital incarnations) is indeed obdurately, viscerally offensive – for all of the attempts to efface what Huw Lemmey once called its ‘devastation in meatspace‘.
News from Lucy Suchman of an important essay she’s just completed with Jutta Weber on Human-Machine Autonomies, available from Academia.edu here.
This is how they begin:
This paper takes up the question of how we might think about the increasing automation of military systems not as an inevitable ‘advancement’ of which we are the interested observers, but as an effect of particular world-making practices in which we need urgently to intervene. We begin from the premise that the foundation of the legality of killing in situations of war is the possibility of discrimination between combatants and non-combatants. At a time when this defining form of situational awareness seems increasingly problematic, military investments in the automation of weapon systems are growing. The trajectory of these investments, moreover, is towards the development and deployment of lethal autonomous weapons; that is, weapon systems in which the identification of targets and initiation of fire is automated in ways that preclude deliberative human intervention. Challenges to these developments underscore the immorality and illegality of delegating responsibility for the use of force against human targets to machines, and the requirements of International Humanitarian Law that there be (human) accountability for acts of killing. In these debates, the articulation of differences between humans and machines is key.
Our aim in this paper is to strengthen arguments against the increasing automation of weapon systems, by expanding the frame or unit of analysis that informs these debates. We begin by tracing the genealogy of concepts of autonomy within the philosophical traditions that inform Artificial Intelligence (AI), with a focus on the history of early cybernetics and contemporary approaches to machine learning in behaviour-based robotics. We argue that while cybernetics and behaviour-based robotics challenge the premises of individual agency, cognition, communication and action that comprise the Enlightenment tradition, they also reiterate aspects of that tradition in the design of putatively intelligent, autonomous machines. This argument is made more concrete through a close reading of the United States Department of Defense Unmanned Systems Integrated Roadmap: FY2013-2038, particularly with respect to plans for future autonomous weapon systems. With that reading in mind, we turn to resources for refiguring agency and autonomy provided by recent scholarship in science and technology studies (STS) informed by feminist theory. This work suggests a shift in conceptions of agency and autonomy, from attributes inherent in entities, to effects of discourses and material practices that variously conjoin and/or delineate differences between humans and machines. This shift leads in turn to a reconceptualization of autonomy and responsibility as always enacted within, rather than as separable from, particular human-machine configurations. We close by considering the implications of these reconceptualizations for questions of responsibility in relation to automated/autonomous weapon systems.
Taking as a model feminist projects of deconstructing categorical distinctions while also recognising those distinctions’ cultural-historical effects, we argue for simultaneous attention to the inseparability of human-machine agencies in contemporary war fighting, and to the necessity of delineating human agency and responsibility within political, legal and ethical/moral regimes of accountability.
It’s a must-read, I think, especially in the light of a report from the New York Times of the Long Range Anti-Ship Missile (above) developed for the US military by Lockheed Martin:
On a bright fall day last year off the coast of Southern California, an Air Force B-1 bomber launched an experimental missile that may herald the future of warfare.
Initially, pilots aboard the plane directed the missile, but halfway to its destination, it severed communication with its operators. Alone, without human oversight, the missile decided which of three ships to attack, dropping to just above the sea surface and striking a 260-foot unmanned freighter…
The Pentagon argues that the new antiship missile is only semiautonomous and that humans are sufficiently represented in its targeting and killing decisions. But officials at the Defense Advanced Research Projects Agency, which initially developed the missile, and Lockheed declined to comment on how the weapon decides on targets, saying the information is classified.
“It will be operating autonomously when it searches for the enemy fleet,” said Mark A. Gubrud, a physicist and a member of the International Committee for Robot Arms Control, and an early critic of so-called smart weapons. “This is pretty sophisticated stuff that I would call artificial intelligence outside human control.”
Paul Scharre, a weapons specialist now at the Center for a New American Security who led the working group that wrote the Pentagon directive, said, “It’s valid to ask if this crosses the line.”
And the Israeli military and armaments industry, for whom crossing any line is second nature, are developing what they call a ‘suicide drone’ (really). At Israel Unmanned Systems 2014, a trade fair held in Tel Aviv just three weeks after Israel’s latest assault on Gaza, Dan Cohen reported:
Lieutenant Colonel Itzhar Jona, who heads Israel Aerospace Industries, spoke about “loitering munitions” — what he called a “politically correct” name for Suicide Drones. They are a hybrid of drone and missile technology that have “autonomous and partially autonomous” elements, and are “launched like a missile, fly like an UAV [unmanned aerial vehicle],” and once they identify a target, revert to “attack like a missile.” Jona called the Suicide Drone a “UAV that thinks and decides for itself,” then added, “If you [the operator] aren’t totally clear on the logic, it can even surprise you.”
Jona praised the advantage of the Suicide Drone because the operator “doesn’t have to bring it home or deal with all sorts of dilemmas.” The Suicide Drone will quickly find a target using its internal logic, which Jona explained in this way: “It carries a warhead that eventually needs to explode. There needs to be a target at the end that will want to explode. Or it won’t want to and we will help it explode.”
So thoughtful to protect ‘the operator’ from any stress (even if s/he might be a little ‘surprised’). Here is Mondoweiss’s subtitled clip from the meeting, which opens with a short discussion of the major role played by UAVs in the air and ground attacks on Gaza, and then Jona describes how ‘we always live on the border’:
My work on drones has been invigorated by reading an outstandingly creative essay by Lucy Suchman on ‘Situational Awareness: deadly bioconvergence at the boundaries of bodies and machines’, forthcoming at the ever-interesting Mediatropes. It’s sparked both an e-mail conversation and an invitation to speak at a symposium on Security by remote control: automation and autonomy in robot weapon systems at Lancaster University, 22-23 May. Here is the call for papers:
Remotely operated and robotic systems are central to contemporary military operations. Robotic weapons can select targets and deliver lethal force with varying degrees of human control, and technologies for fully autonomous weapon systems are currently in development. Alongside military reconnaissance and the prospective configuration of ‘killer-robots,’ drone technologies are being deployed for ostensibly peaceful purposes, most notably surveillance of public space, private property and national borders. More generally, the frame offered by contemporary security discourses has redrawn previous boundaries regarding the use of state violence in the name of homeland protection. But despite an extended history of investment in technologies that promise to rationalise the conflict zone and accurately identify the imminent threat, the legitimacy and efficacy of actions taken in the name of security are increasingly in question.
The purpose of this symposium is to present and debate current scholarship on the ethics and legality of robotic systems in war and beyond. By robotic systems we mean networked devices with on-board algorithms that direct machine actions (in this case, tracking, targeting and deploying force) in varying configurations of pre-programmed operation and remote human control. The line between automation and autonomy has come under renewed debate in the context of contemporary developments in remotely controlled weapon systems, most prominently uninhabited aerial vehicles or drones. For example, in April of 2013 a coalition led by Human Rights Watch initiated a campaign in favour of a legally binding prohibition on the development, production and use of fully autonomous weapon systems. Simultaneously, some military and robotics experts emphasize the advantages of automated weapons and argue that equipping robots with the capacity to make ethical judgments is an achievable technological goal. Within these debates, the ‘human in the loop’ is posited alternately as the safeguard against illegitimate killing, or its source. Implicit across the debate is the premise of a moment of decision in which judgements of identification and appropriate response are made.
While emerging arms control strategies focus on the ‘red line’ that would prohibit the development and use of weapons that remove human judgment from the identification of targets and the decision to fire, the question remains to what extent human judgment and decision-making are already compromised by the intensifications of speed, and associated increase in forms and levels of automation, that characterise contemporary war-fighting, particularly in situations of remote control. Rather than attempting to establish one or the other of these concerns as correct, or even as more important than the other, we seek to focus our discussion on the troubling space between automation and autonomy, to understand more deeply their intimate relations, and the inherent contradictions that conjoin them.
To explore the key stakes and lines of argument in this debate, we invite contributions from scholars in the fields of security, peace and conflict studies, international human rights law, anthropology/sociology of science and technology, technoculture and technomilitarism, computing, simulation and cyber law. The ambition for this event is to stimulate ongoing cross-disciplinary discussion and further research on this topic, drawing on the resources of the Lancaster University centres that are its co-sponsors.
Confirmed Speakers:
Patrick Crogan, Senior Lecturer in Film Studies at the University of the West of England in Bristol, scholar of technoculture, videogames and military technoscience, author of Gameplay Mode: War, Simulation, and Technoculture (2011);
Derek Gregory, Peter Wall Distinguished Professor and Professor of Geography at the University of British Columbia in Vancouver, author of multiple works interrogating social and spatial dimensions of conflict, and currently completing a book titled The Everywhere War (forthcoming);
M. Shane Riza, command pilot and former instructor at the U.S. Air Force Weapons School, author of Killing Without Heart: Limits on Robotic Warfare in an Age of Persistent Conflict (2013);
Christiane Wilke, Associate Professor in Law and Legal Studies at Carleton University, Canada. She has been researching legal responses to state violence and is working on a project about visuality, photography, and international law.
To indicate your interest in participating, or for further information, please contact Lucy Suchman l.suchman@lancaster.ac.uk.
I’m really excited about this; I’m part way through Shane Riza’s book, and it’s already clear that I’m going to learn a lot from the meeting.
The image at the top of this post comes from the CFP, incidentally, but the image below is Margaret Bourke-White’s classic photograph from the rubble of a bombed German city, which I use when I talk about the ways in which the trauma of air war dislocates the very sinews of language and the capacity to write and re-present (see ‘Doors in to nowhere’, an extended reflection on W.G. Sebald: DOWNLOADS tab). Perhaps I’ll use my time at Lancaster (given the name, a peculiarly appropriate place) to join the dots between the two images and revisit ‘The natural history of destruction’ for the twenty-first century….