Google eyes

The Oxford English Dictionary recognised ‘google’ as a verb in 2006, and its active form is about to gain another dimension. One of the most persistent anxieties amongst those executing remote warfare, with its extraordinary dependence on (and capacity for) real-time full motion video surveillance as an integral moment of the targeting cycle, has been the ever-present risk of ‘swimming in sensors and drowning in data’.

But now Kate Conger and Dell Cameron report for Gizmodo on a new collaboration between Google and the Pentagon as part of Project Maven:

Project Maven, a fast-moving Pentagon project also known as the Algorithmic Warfare Cross-Functional Team (AWCFT), was established in April 2017. Maven’s stated mission is to “accelerate DoD’s integration of big data and machine learning.” In total, the Defense Department spent $7.4 billion on artificial intelligence-related areas in 2017, the Wall Street Journal reported.

The project’s first assignment was to help the Pentagon efficiently process the deluge of video footage collected daily by its aerial drones—an amount of footage so vast that human analysts can’t keep up, according to Greg Allen, an adjunct fellow at the Center for a New American Security, who co-authored a lengthy July 2017 report on the military’s use of artificial intelligence. Although the Defense Department has poured resources into the development of advanced sensor technology to gather information during drone flights, it has lagged in creating analysis tools to comb through the data.

“Before Maven, nobody in the department had a clue how to properly buy, field, and implement AI,” Allen wrote.

Maven was tasked with using machine learning to identify vehicles and other objects in drone footage, taking that burden off analysts. Maven’s initial goal was to provide the military with advanced computer vision, enabling the automated detection and identification of objects in as many as 38 categories captured by a drone’s full-motion camera, according to the Pentagon. Maven provides the department with the ability to track individuals as they come and go from different locations.

Google has reportedly attempted to allay fears about its involvement:

A Google spokesperson told Gizmodo in a statement that it is providing the Defense Department with TensorFlow APIs, which are used in machine learning applications, to help military analysts detect objects in images. Acknowledging the controversial nature of using machine learning for military purposes, the spokesperson said the company is currently working “to develop policies and safeguards” around its use.

“We have long worked with government agencies to provide technology solutions. This specific project is a pilot with the Department of Defense, to provide open source TensorFlow APIs that can assist in object recognition on unclassified data,” the spokesperson said. “The technology flags images for human review, and is for non-offensive uses only. Military use of machine learning naturally raises valid concerns. We’re actively discussing this important topic internally and with others as we continue to develop policies and safeguards around the development and use of our machine learning technologies.”
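
The workflow the spokesperson describes – a model scores frames and flags candidates for a human to review, rather than acting on its own output – can be sketched in miniature. The sketch below is illustrative only: the stub `Detection` records and the threshold stand in for a real TensorFlow detector, whose actual models and thresholds are not public.

```python
from dataclasses import dataclass

# Stand-in for a trained detector's output: in the real pipeline a
# TensorFlow model would score each video frame; here a detection is
# just a labelled confidence score. All values below are invented.
@dataclass
class Detection:
    frame_id: int
    label: str
    confidence: float

REVIEW_THRESHOLD = 0.5  # assumed cut-off for illustration, not a published value

def flag_for_review(detections):
    """Return the detections a human analyst should review.

    The key point in the quoted description is that the model does not
    act on its own output – it only queues candidate frames for a person.
    """
    return [d for d in detections if d.confidence >= REVIEW_THRESHOLD]

frames = [
    Detection(1, "vehicle", 0.92),
    Detection(2, "vehicle", 0.31),
    Detection(3, "building", 0.77),
]
queue = flag_for_review(frames)
print([d.frame_id for d in queue])  # frames 1 and 3 go to an analyst
```

Whatever the real system looks like, the ‘human review’ step is the hinge on which the spokesperson’s ‘non-offensive uses only’ claim turns.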


As Mehreen Kasana notes, Google has indeed ‘long worked with government agencies’:

A 2017 report in Quartz shed light on the origins of Google and how a significant amount of funding for the company came from the CIA and NSA for mass surveillance purposes. Time and again, Google’s funding raises questions. In 2013, a Guardian report highlighted Google’s acquisition of the robotics company Boston Dynamics, and noted that most of the company’s projects were funded by the Defense Advanced Research Projects Agency (DARPA).

Drone Imaginaries and Society

News from Kathrin Maurer of a conference on Drone Imaginaries and Society, 5-6 June 2018, at the University of Southern Denmark (Odense).  I’ll be giving a keynote, and look forward to the whole event very much (and not only because, while I’ve visited Denmark lots of times and loved every one of them, I’ve never made it to Odense):

Drones are in the air. The production of civilian drones for rescue, transport, and leisure activity is booming. The Danish government, for example, proclaimed civilian drones a national strategy in 2016. Accordingly, many research institutions as well as the industry focus on the development, usage, and promotion of drone technology. These efforts often prioritize commercialization and engineering as well as setting up UAV (Unmanned Aerial Vehicle) test centers. As a result, urgent questions regarding how drone technology impacts our identity as humans, as well as its effects on how we envision human society, are frequently underexposed in these initiatives.

Our conference aims to change this perspective. By investigating cultural representations of civilian and military drones in visual arts, film, and literature, we intend to shed light on drone technology from a humanities’ point of view. This aesthetic “drone imaginary” forms not only the empirical material of our discussions but also a prism of knowledge which provides new insights into the meaning of drone technology for society today.

Several artists, authors, film makers, and thinkers have already engaged in this drone imaginary. While some of these inquiries provide critical reflection on contemporary and future drone technologies – for instance issues such as privacy, surveillance, automation, and security – others allow for alternative ways of seeing and communicating as well as creative re-imagination of new ways of organizing human communities. The goal of the conference is to bring together these different aesthetic imaginaries to better understand the role of drone technologies in contemporary and future societies.

The focus points of the conference are:

–     Aesthetic drone imaginaries: Which images, metaphors, ethics, emotions and affects are associated with drones through their representation in art, fiction and popular culture?

–     Drone technology and its implications for society: How do drones change our daily routines and shift the balance between the public and the private?

–     Historical perspectives on drones: In what way do drone imaginaries allow for a counter-memory that can challenge, for instance, the military implementation of drones?

–     Drones as vulnerability: Do drones make societies more resilient or more fragile, and are societies becoming overly dependent on advanced technologies?

–     Utopian or dystopian drone imaginaries: What dream or nightmare scenarios are provided by drone fiction and how do they allow for a (re)imagining of future societies?

–     Drones and remote sensing: In what way do drones mark a radically new way of seeing and sensing through their remote vertical gaze and operative images?

–     Drone warfare: Do drones mark a continuation or a rupture in the way we understand war and conflict, and how do they change the military imaginary?

The conference is sponsored by the Drone Network (Danish Research Council) and Institute for the Study of Culture at the University of Southern Denmark.

You can contact Kathrin at kamau@sdu.dk

The conference website is here.

Army of None

Coming next spring from Norton – and with a very clever title – Paul Scharre’s Army of None: Autonomous Weapons and the Future of War.

A Pentagon defense expert and former U.S. Army Ranger traces the emergence of autonomous weapons.

What happens when a Predator drone has as much autonomy as a Google car? Although it sounds like science fiction, the technology to create weapons that could hunt and destroy targets on their own already exists. Paul Scharre, a leading expert in emerging weapons technologies, draws on incisive research and firsthand experience to explore how increasingly autonomous weapons are changing warfare.

This far-ranging investigation examines the emergence of fully autonomous weapons, the movement to ban them, and the legal and ethical issues surrounding their use. Scharre spotlights the role of artificial intelligence in military technology, spanning decades of innovation from German noise-seeking Wren torpedoes in World War II—antecedents of today’s armed drones—to autonomous cyber weapons. At the forefront of a game-changing debate, Army of None engages military history, global policy, and bleeding-edge science to explore what it would mean to give machines authority over the ultimate decision: life or death.

You can get a taste – in fact a whole tasting menu – of Paul’s arguments at Just Security here.

Paul is at the Center for a New American Security, and you can download two reports (which have been endlessly ripped by others) on Robotics on the Battlefield: Part I is Range, persistence and daring and Part II is The Coming Swarm (both 2014).

Tracking and targeting

News from Lucy Suchman of a special issue of Science, Technology, & Human Values [42 (6) (2017)] on Tracking and targeting: sociotechnologies of (in)security, which she’s co-edited with Karolina Follis and Jutta Weber.

Here’s the line-up:

Lucy Suchman, Karolina Follis and Jutta Weber: Tracking and targeting

This introduction to the special issue of the same title sets out the context for a critical examination of contemporary developments in sociotechnical systems deployed in the name of security. Our focus is on technologies of tracking, with their claims to enable the identification of those who comprise legitimate targets for the use of violent force. Taking these claims as deeply problematic, we join a growing body of scholarship on the technopolitical logics that underpin an increasingly violent landscape of institutions, infrastructures, and actions, promising protection to some but arguably contributing to our collective insecurity. We examine the asymmetric distributions of sociotechnologies of (in)security; their deadly and injurious effects; and the legal, ethical, and moral questions that haunt their operations.

Karolina Follis: Visions and transterritory: the borders of Europe

This essay is about the role of visual surveillance technologies in the policing of the external borders of the European Union (EU). Based on an analysis of documents published by EU institutions and independent organizations, I argue that these technological innovations fundamentally alter the nature of national borders. I discuss how new technologies of vision are deployed to transcend the physical limits of territories. In the last twenty years, EU member states and institutions have increasingly relied on various forms of remote tracking, including the use of drones for the purposes of monitoring frontier zones. In combination with other facets of the EU border management regime (such as transnational databases and biometrics), these technologies coalesce into a system of governance that has enabled intervention into neighboring territories and territorial waters of other states to track and target migrants for interception in the “prefrontier.” For jurisdictional reasons, this practice effectively precludes the enforcement of legal human rights obligations, which European states might otherwise have with regard to these persons. This article argues that this technologically mediated expansion of vision has become a key feature of post–cold war governance of borders in Europe. The concept of transterritory is proposed to capture its effects.

Christiane Wilke: Seeing and unmaking civilians in Afghanistan: visual technologies and contested professional visions

While the distinction between civilians and combatants is fundamental to international law, it is contested and complicated in practice. How do North Atlantic Treaty Organization (NATO) officers see civilians in Afghanistan? Focusing on a 2009 air strike in Kunduz, this article argues that the professional vision of NATO officers relies not only on recent military technologies that allow for aerial surveillance, thermal imaging, and precise targeting but also on the assumptions, vocabularies, modes of attention, and hierarchies of knowledges that the officers bring to the interpretation of aerial surveillance images. Professional vision is socially situated and frequently contested within communities of practice. In the case of the Kunduz air strike, the aerial vantage point and the military visual technologies could not fully determine what would be seen. Instead, the officers’ assumptions about Afghanistan, threats, and the gender of the civilian inform the vocabulary they use for coding people and places as civilian or noncivilian. Civilians are not simply “found”; they are produced through specific forms of professional vision.

Jon Lindsay: Target practice: Counterterrorism and the amplification of data friction

The nineteenth-century strategist Carl von Clausewitz describes “fog” and “friction” as fundamental features of war. Military leverage of sophisticated information technology in the twenty-first century has improved some tactical operations but has not lifted the fog of war, in part, because the means for reducing uncertainty create new forms of it. Drawing on active duty experience with an American special operations task force in Western Iraq from 2007 to 2008, this article traces the targeting processes used to “find, fix, and finish” alleged insurgents. In this case they did not clarify the political reality of Anbar province but rather reinforced a parochial worldview informed by the Naval Special Warfare community. The unit focused on the performance of “direct action” raids during a period in which “indirect action” engagement with the local population was arguably more appropriate for the strategic circumstances. The concept of “data friction”, therefore, can be understood not simply as a form of resistance within a sociotechnical system but also as a form of traction that enables practitioners to construct representations of the world that amplify their own biases.

M.C. Elish: Remote split: a history of US drone operations and the distributed labour of war

This article analyzes US drone operations through a historical and ethnographic analysis of the remote split paradigm used by the US Air Force. Remote split refers to the globally distributed command and control of drone operations and entails a network of human operators and analysts in the Middle East, Europe, and Southeast Asia as well as in the continental United States. Though often viewed as a teleological progression of “unmanned” warfare, this paper argues that historically specific technopolitical logics establish the conditions of possibility for the work of war to be divisible into discrete and computationally mediated tasks that are viewed as effective in US military engagements. To do so, the article traces how new forms of authorized evidence and expertise have shaped developments in military operations and command and control priorities from the Cold War and the “electronic battlefield” of Vietnam through the Gulf War and the conflict in the Balkans to contemporary deployments of drone operations. The article concludes by suggesting that it is by paying attention to divisions of labor and human–machine configurations that we can begin to understand the everyday and often invisible structures that sustain perpetual war as a military strategy of the United States.

I’ve discussed Christiane’s excellent article in detail before, but the whole issue repays careful reading.

And if you’re curious about the map that heads this post, it’s based on the National Security Agency’s Strategic Mission List (dated 2007 and published in the New York Times on 2 November 2013), and mapped at Electrospaces: full details here.

The evolution of warfare


The latest issue of the International Review of the Red Cross (open access here) focuses on the evolution of warfare:

To mark the 100th anniversary of the First World War, the Review asked historians, legal scholars and humanitarian practitioners to look back at the wars of the past century from a humanitarian point of view. In using what we know of the past to illuminate the present and the future, this issue of the Review adopts a long-term perspective, with the aim to illustrate the changing face of conflict by placing human suffering ‒ so often relegated to the backdrop of history ‒ front and center. It focuses on WWI and the period immediately leading up to it as a turning point in the history of armed conflict, drawing important parallels between the past and the changes we are witnessing today.

Among the highlights: an interview with Richard Overy on the history of bombing; Eric Germain, ‘Out of sight, out of reach: Moral issues in the globalization of the battlefield’; Lindsey Cameron, ‘The ICRC in the First World War: Unwavering belief in the power of law?’; Rain Liivoja, ‘Technological change and the evolution of the law of war’; Claudia McGoldrick, ‘The state of conflicts today: Can humanitarian action adapt?’; and Anna Di Lellio and Emanuele Castano, ‘The danger of “new norms” and the continuing relevance of IHL in the post-9/11 era’.

Incidentally, there may be something Darwinian about the trajectory of modern war – but I’m not sure that ‘evolution’ is exactly the right word…

Seeing machines



The Transnational Institute has published a glossy version of a chapter from Steve Graham’s Vertical – called Drone: Robot Imperium – which you can download here (open access). Not sure about either of the terms in the subtitle, but it’s a good read and richly illustrated.

Steve includes a discussion of the use of drones to patrol the US-Mexico border, and Josh Begley has published a suggestive account of the role of drones but also other ‘seeing machines’ in visualizing the border.

One way the border is performed — particularly the southern border of the United States — can be understood through the lens of data collection. In the border region, along the Rio Grande and westward through the desert Southwest, Customs and Border Protection (CBP) deploys radar blimps, drones, fixed-wing aircraft, helicopters, seismic sensors, ground radar, face recognition software, license-plate readers, and high-definition infrared video cameras. Increasingly, they all feed data back into something called “The Big Pipe.”
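
Josh’s passage describes many different sensor types feeding a single collection system. As a toy illustration of what ‘feeding data back into’ one pipe might involve, the sketch below merges several timestamped feeds into a single ordered stream; the sensor names and payloads are invented, and the real system’s architecture is of course not public.

```python
import heapq

# Invented, already time-sorted feeds standing in for the sensor types
# named in the quoted passage (radar, cameras, seismic sensors, ...).
radar_feed = [(3, "radar", "track 17"), (9, "radar", "track 18")]
camera_feed = [(1, "camera", "frame 201"), (7, "camera", "frame 202")]
seismic_feed = [(5, "seismic", "event 42")]

# heapq.merge lazily interleaves sorted feeds by timestamp, giving one
# consolidated, time-ordered stream – a minimal model of a "big pipe".
big_pipe = list(heapq.merge(radar_feed, camera_feed, seismic_feed))
for ts, sensor, payload in big_pipe:
    print(ts, sensor, payload)
```

The point of the toy is simply that once heterogeneous feeds share a common stream, everything downstream – storage, search, correlation – can treat them as one body of data.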

Josh downloaded 20,000 satellite images of the border, stitched them together, and then worked with Laura Poitras and her team at Field of Vision to produce a short film – Best of Luck with the Wall – that traverses the entire length of the border (1,954 miles) in six minutes:

The southern border is a space that has been almost entirely reduced to metaphor. It is not even a geography. Part of my intention with this film is to insist on that geography.

By focusing on the physical landscape, I hope viewers might gain a sense of the enormity of it all, and perhaps imagine what it would mean to be a political subject of that terrain.

begley-fatal-migrations-1

If you too wonder about that last sentence and its latent bio-physicality – and there is of course a rich stream of work on the bodies that seek to cross that border – then you might visit another of Josh’s projects, Fatal Migrations, 2011-2016 (see above and below).

begley-fatal-migrations-2

There’s an interview with Josh that, among other things, links these projects with his previous work.

I have a couple of projects that are smartphone centered. One of them is about mapping the geography of places around the world where the CIA carries out drone strikes—mostly in Pakistan, Yemen, and Somalia. Another was about looking at the geography of incarceration in the United States—there are more than 5,000 prisons—and trying to map all of them and see them through satellites. I currently have an app that is looking at the geography of police violence in the United States. Most of these apps are about creating a relationship between data and the body, where you can receive a notification every time something unsettling happens. What does that mean for the rest of your day? How do you live with that data—data about people? In some cases the work grows out of these questions, but in other cases the work really is about landscape….

There’s just so much you can never know from looking at satellite imagery. By definition it flattens and distorts things. A lot of folks who fly drones, for instance, think they know a space just from looking at it from above. I firmly reject that idea. The bird’s eye view is never what it means to be on the ground somewhere, or what it means to have meaningful relationships with people on the ground. I feel like I can understand the landscape from 30,000 feet, but it is not the same as spending time in a space.

Anjali Nath has also provided a new commentary on one of Josh’s earlier projects, Metadata, that he cites in that interview – ‘Touched from below: on drones, screens and navigation’, Visual Anthropology 29 (3) (2016) 315-30.

It’s part of a special issue on ‘Visual Revolutions in the Middle East’, and as I explore the visual interventions I’ve included in this post I find myself once again thinking of a vital remark by Edward Said:

‘We are also looking at our observers.’

That’s part of the message behind the #NotaBugSplat image on the cover of Steve’s essay: but what might Said’s remark mean more generally today, faced with the proliferation of these seeing machines?


Game of Drones


Joe Pugliese has sent me a copy of his absorbing new essay, ‘Drone casino mimesis: telewarfare and civil militarization‘, which appears in Australia’s Journal of Sociology (2016) (online early). Here’s the abstract:

This article stages an examination of the complex imbrication of contemporary civil society with war and militarized violence. I ground my investigation in the context of the increasing cooption of civil sites, practices and technologies by the United States military in order to facilitate their conduct of war and the manner in which drone warfare has now been seamlessly accommodated within major metropolitan cities such as Las Vegas, Nevada. In the context of the article, I coin and deploy the term civil militarization. Civil militarization articulates the colonizing of civilian sites, practices and technologies by the military; it names the conversion of such civilian technologies as video games and mobile phones into technologies of war; and it addresses the now quasi-seamless flow that telewarfare enables between military sites and the larger suburban grid and practices of everyday life. In examining drone kills in the context of Nellis Air Force Base, Las Vegas, I bring into focus a new military configuration that I term ‘drone casino mimesis’.

I’m particularly interested in what Joe has to say about what he calls the ‘casino logic and gaming mimesis’ of ‘the drone habitus’. Most readers will know that ‘Nellis’ (more specifically, Creech Air Force Base, formerly Indian Springs), for long the epicentre of the US Air Force’s remote operations, is a short drive from Las Vegas – and those who have seen Omer Fast’s 5,000 Feet is the Best will remember the artful way in which it loops between the two.


Two passages from Joe’s essay have set me thinking.  First Joe moves far beyond the usual (often facile) comparison between the video displays in the Ground Control Station and video games to get at the algorithms and probabilities that animate them:

‘…there are mimetic relations of exchange between Las Vegas’s and Nellis’s gaming consoles, screens and cubicles.


‘Iconographically and infrastructurally, casino gaming and drone technologies stand as mirror images of each other. My argument, however, is not that both these practices and technologies merely ‘reflect’ each other; rather, I argue that gaming practices and technologies effectively work to constitute and inflect drone practices and technologies on a number of levels. Casino drone mimesis identifies, in new materialist terms, the agentic role of casino and gaming technologies precisely as ‘actors’ (Latour, 2004: 226) in the shaping and mutating of both the technologies and conduct of war. Situated within a new materialist schema, I contend that the mounting toll of civilian deaths due to drone strikes is not only a result of human failure or error – for example, the misreading of drone video feed, the miscalculation of targets and so on. Rather, civilian drone kills must be seen as an in-built effect of military technologies that are underpinned by both the morphology (gaming consoles, video screens and joysticks) and the algorithmic infrastructure of gaming – with its foundational dependence on ‘good approximation’ ratios and probability computation.’

And then this second passage where Joe develops what he calls ‘the “bets” and “gambles” on civilian life’:

‘[“Bugsplat” constitutes a] militarized colour-coding system that critically determines the kill value of the target. In the words of one former US intelligence official:

You say something like ‘Show me the Bugsplat.’ That’s what we call the probability of a kill estimate when we are doing this final math before the ‘Go go go’ decision. You would actually get a picture of a compound, and there will be something on it that looks like a bugsplat actually with red, yellow, and green: with red being anybody in that spot is dead, yellow stands a chance of being wounded; green we expect no harm to come to individuals where there is green. (Quoted in Woods, 2015: 150)

Described here is a mélange of paintball and video gaming techniques that is underpinned, in turn, by the probability stakes of casino gaming: as the same drone official concludes, ‘when all those conditions have been met, you may give the order to go ahead and spend the money’ (quoted in Woods, 2015: 150). In the world of drone casino mimesis, when all those gaming conditions have been met, you spend the money, fire your missiles and hope to make a killing. In the parlance of drone operators, if you hit and kill the person you intended to kill ‘that person is called a “jackpot”’ (Begley, 2015: 7). Evidenced here is the manner in which the lexicon of casino gaming is now clearly constitutive of the practices of drone kills. In the world of drone casino mimesis, the gambling stakes are high. ‘The position I took,’ says a drone screener, ‘is that every call I make is a gamble, and I’m betting on their life’ (quoted in Fielding-Smith and Black, 2015).
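
The colour coding the official describes translates almost directly into a binning rule over estimated probabilities of harm. The sketch below is purely illustrative: the thresholds and the grid of probabilities are invented, since the actual casualty-estimation models and their inputs are classified.

```python
# Illustrative sketch of the red/yellow/green coding described above:
# a grid of estimated probabilities of harm is binned into three colours.
# The threshold values are assumptions made for this illustration only.

def colour_code(p_harm, red_at=0.5, yellow_at=0.1):
    """Bin a probability-of-harm estimate into the described colours."""
    if p_harm >= red_at:
        return "red"      # "anybody in that spot is dead"
    if p_harm >= yellow_at:
        return "yellow"   # "stands a chance of being wounded"
    return "green"        # "we expect no harm to come to individuals"

# An invented grid standing in for "a picture of a compound".
compound_grid = [
    [0.9, 0.6, 0.2],
    [0.4, 0.15, 0.05],
    [0.1, 0.02, 0.0],
]
coded = [[colour_code(p) for p in row] for row in compound_grid]
for row in coded:
    print(row)
```

Reduced to code, the point Joe presses becomes stark: the ‘final math before the “Go go go” decision’ is exactly the kind of probability computation that underpins casino gaming.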

There is much more to Joe’s essay than this, but these passages add considerably to my own discussion of the US targeted killing program in the Federally Administered Tribal Areas of Pakistan in ‘Dirty dancing’.  You can find the whole essay under the DOWNLOADS tab, but this is the paragraph I have in mind (part of an extended discussion of the ‘technicity’ of the US targeted killing program and its reliance on kill lists, signals intercepts and visual feeds):

The kill list embedded in the [disposition] matrix has turned out to be infinitely extendable, more like a revolving door than a rolodex, so much so that at one point an exasperated General Kayani demanded that Admiral Mullen explain how, after hundreds of drone strikes, ‘the United States [could] possibly still be working its way through a “top 20” list?’  The answer lies not only in the remarkable capacity of al Qaeda and the Taliban to regenerate: the endless expansion of the list is written into the constitution of the database and the algorithms from which it emerges. The database accumulates information from multiple agencies, but for targets in the FATA the primary sources are ground intelligence from agents and informants, signals intelligence from the National Security Agency (NSA), and surveillance imagery from the US Air Force. Algorithms are then used to search the database to produce correlations, coincidences and connections that serve to identify suspects, confirm their guilt and anticipate their future actions. Jutta Weber explains that the process follows ‘a logic of eliminating every possible danger’:

‘[T]he database is the perfect tool for pre-emptive security measures because it has no need of the logic of cause and effect. It widens the search space and provides endless patterns of possibilistic networks.’

Although she suggests that the growth of ‘big data’ and the transition from hierarchical to relational and now post-relational databases has marginalised earlier narrative forms, these reappear as soon as suspects have been conjured from the database. The case for including – killing – each individual on the list is exported from its digital target folder to a summary Powerpoint slide called a ‘baseball card’ that converts into a ‘storyboard’ after each mission. Every file is vetted by the CIA’s lawyers and General Counsel, and by deputies at the National Security Council, and all ‘complex cases’ have to be approved by the President. Herein lies the real magic of the system. ‘To make the increasingly powerful non-human agency of algorithms and database systems invisible,’ Weber writes, ‘the symbolic power of the sovereign is emphasised: on “Terror Tuesdays” it (appears that it) is only the sovereign who decides about life and death.’ But this is an optical illusion. As Louise Amoore argues more generally, ‘the sovereign strike is always something more, something in excess of a single flash of decision’ and emerges instead from a constellation of prior practices and projected calculations.
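
Weber’s point that the database ‘widens the search space’ can be made concrete with a toy model: seed a list with named targets, then expand it transitively along recorded contact links. Each pass conjures new suspects from the relational structure itself, which is one way to see how a ‘top 20’ list keeps growing. All identifiers below are invented for illustration.

```python
# Toy model of a relational contact store: the transitive "expand" step
# means the list grows with each pass – the expansion is written into
# the constitution of the database, not read off the battlefield.
# All names are invented.

contacts = {
    "T1": {"A", "B"},
    "A": {"C"},
    "B": {"C", "D"},
    "C": {"E"},
}

def expand(seed, links, hops):
    """Widen a seed list by following contact links for `hops` passes."""
    current = set(seed)
    for _ in range(hops):
        current |= {n for person in current for n in links.get(person, set())}
    return current

seed = {"T1"}
for hops in range(4):
    print(hops, sorted(expand(seed, contacts, hops)))
```

A single seed yields six suspects within three hops; with denser real-world contact graphs the search space ‘provides endless patterns of possibilistic networks’, exactly as Weber argues.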