Tracking and targeting

News from Lucy Suchman of a special issue of Science, Technology and Human Values [42 (6) (2017)]  on Tracking and targeting: sociotechnologies of (in)security, which she’s co-edited with Karolina Follis and Jutta Weber.

Here’s the line-up:

Lucy Suchman, Karolina Follis and Jutta Weber: Tracking and targeting

This introduction to the special issue of the same title sets out the context for a critical examination of contemporary developments in sociotechnical systems deployed in the name of security. Our focus is on technologies of tracking, with their claims to enable the identification of those who comprise legitimate targets for the use of violent force. Taking these claims as deeply problematic, we join a growing body of scholarship on the technopolitical logics that underpin an increasingly violent landscape of institutions, infrastructures, and actions, promising protection to some but arguably contributing to our collective insecurity. We examine the asymmetric distributions of sociotechnologies of (in)security; their deadly and injurious effects; and the legal, ethical, and moral questions that haunt their operations.

Karolina Follis: Visions and transterritory: the borders of Europe

This essay is about the role of visual surveillance technologies in the policing of the external borders of the European Union (EU). Based on an analysis of documents published by EU institutions and independent organizations, I argue that these technological innovations fundamentally alter the nature of national borders. I discuss how new technologies of vision are deployed to transcend the physical limits of territories. In the last twenty years, EU member states and institutions have increasingly relied on various forms of remote tracking, including the use of drones for the purposes of monitoring frontier zones. In combination with other facets of the EU border management regime (such as transnational databases and biometrics), these technologies coalesce into a system of governance that has enabled intervention into neighboring territories and territorial waters of other states to track and target migrants for interception in the “prefrontier.” For jurisdictional reasons, this practice effectively precludes the enforcement of legal human rights obligations, which European states might otherwise have with regard to these persons. This article argues that this technologically mediated expansion of vision has become a key feature of post–cold war governance of borders in Europe. The concept of transterritory is proposed to capture its effects.

Christiane Wilke: Seeing and unmaking civilians in Afghanistan: visual technologies and contested professional visions

While the distinction between civilians and combatants is fundamental to international law, it is contested and complicated in practice. How do North Atlantic Treaty Organization (NATO) officers see civilians in Afghanistan? Focusing on a 2009 air strike in Kunduz, this article argues that the professional vision of NATO officers relies not only on recent military technologies that allow for aerial surveillance, thermal imaging, and precise targeting but also on the assumptions, vocabularies, modes of attention, and hierarchies of knowledges that the officers bring to the interpretation of aerial surveillance images. Professional vision is socially situated and frequently contested within communities of practice. In the case of the Kunduz air strike, the aerial vantage point and the military visual technologies cannot fully determine what would be seen. Instead, the officers’ assumptions about Afghanistan, threats, and the gender of the civilian inform the vocabulary they use for coding people and places as civilian or noncivilian. Civilians are not simply “found,” they are produced through specific forms of professional vision.

Jon Lindsay: Target practice: Counterterrorism and the amplification of data friction

The nineteenth-century strategist Carl von Clausewitz describes “fog” and “friction” as fundamental features of war. Military leverage of sophisticated information technology in the twenty-first century has improved some tactical operations but has not lifted the fog of war, in part because the means for reducing uncertainty create new forms of it. Drawing on active duty experience with an American special operations task force in Western Iraq from 2007 to 2008, this article traces the targeting processes used to “find, fix, and finish” alleged insurgents. In this case they did not clarify the political reality of Anbar province but rather reinforced a parochial worldview informed by the Naval Special Warfare community. The unit focused on the performance of “direct action” raids during a period in which “indirect action” engagement with the local population was arguably more appropriate for the strategic circumstances. The concept of “data friction”, therefore, can be understood not simply as a form of resistance within a sociotechnical system but also as a form of traction that enables practitioners to construct representations of the world that amplify their own biases.

M.C. Elish: Remote split: a history of US drone operations and the distributed labour of war

This article analyzes US drone operations through a historical and ethnographic analysis of the remote split paradigm used by the US Air Force. Remote split refers to the globally distributed command and control of drone operations and entails a network of human operators and analysts in the Middle East, Europe, and Southeast Asia as well as in the continental United States. Though often viewed as a teleological progression of “unmanned” warfare, this paper argues that historically specific technopolitical logics establish the conditions of possibility for the work of war to be divisible into discrete and computationally mediated tasks that are viewed as effective in US military engagements. To do so, the article traces how new forms of authorized evidence and expertise have shaped developments in military operations and command and control priorities from the Cold War and the “electronic battlefield” of Vietnam through the Gulf War and the conflict in the Balkans to contemporary deployments of drone operations. The article concludes by suggesting that it is by paying attention to divisions of labor and human–machine configurations that we can begin to understand the everyday and often invisible structures that sustain perpetual war as a military strategy of the United States.

I’ve discussed Christiane’s excellent article in detail before, but the whole issue repays careful reading.

And if you’re curious about the map that heads this post, it’s based on the National Security Agency’s Strategic Mission List (dated 2007 and published in the New York Times on 2 November 2013), and mapped at Electrospaces: full details here.

The evolution of warfare

irrc-2016

The latest issue of the International Review of the Red Cross (open access here) focuses on the evolution of warfare:

To mark the 100th anniversary of the First World War, the Review asked historians, legal scholars and humanitarian practitioners to look back at the wars of the past century from a humanitarian point of view. In using what we know of the past to illuminate the present and the future, this issue of the Review adopts a long-term perspective, with the aim to illustrate the changing face of conflict by placing human suffering ‒ so often relegated to the backdrop of history ‒ front and center. It focuses on WWI and the period immediately leading up to it as a turning point in the history of armed conflict, drawing important parallels between the past and the changes we are witnessing today.

Among the highlights: an interview with Richard Overy on the history of bombing; Eric Germain, ‘Out of sight, out of reach: Moral issues in the globalization of the battlefield’; Lindsey Cameron, ‘The ICRC in the First World War: Unwavering belief in the power of law?’; Rain Liivoja, ‘Technological change and the evolution of the law of war’; Claudia McGoldrick, ‘The state of conflicts today: Can humanitarian action adapt?’; and Anna Di Lellio and Emanuele Castano, ‘The danger of “new norms” and the continuing relevance of IHL in the post-9/11 era’.

Incidentally, there may be something Darwinian about the trajectory of modern war – but I’m not sure that ‘evolution’ is exactly the right word…

Seeing machines

 

graham-drone-cover

The Transnational Institute has published a glossy version of a chapter from Steve Graham‘s Vertical – called Drone: Robot Imperium – which you can download here (open access).  Not sure about either of the terms in the subtitle, but it’s a good read and richly illustrated.

Steve includes a discussion of the use of drones to patrol the US-Mexico border, and Josh Begley has published a suggestive account of the role of drones – and of other ‘seeing machines’ – in visualizing the border.

One way the border is performed — particularly the southern border of the United States — can be understood through the lens of data collection. In the border region, along the Rio Grande and westward through the desert Southwest, Customs and Border Protection (CBP) deploys radar blimps, drones, fixed-wing aircraft, helicopters, seismic sensors, ground radar, face recognition software, license-plate readers, and high-definition infrared video cameras. Increasingly, they all feed data back into something called “The Big Pipe.”

Josh downloaded 20,000 satellite images of the border, stitched them together, and then worked with Laura Poitras and her team at Field of Vision to produce a short film – Best of Luck with the Wall – that traverses the entire length of the border (1,954 miles) in six minutes:

The southern border is a space that has been almost entirely reduced to metaphor. It is not even a geography. Part of my intention with this film is to insist on that geography.

By focusing on the physical landscape, I hope viewers might gain a sense of the enormity of it all, and perhaps imagine what it would mean to be a political subject of that terrain.

begley-fatal-migrations-1

If you too wonder about that last sentence and its latent bio-physicality – and there is of course a rich stream of work on the bodies that seek to cross that border – then you might visit another of Josh’s projects, Fatal Migrations, 2011-2016 (see above and below).

begley-fatal-migrations-2

There’s an interview with Josh that, among other things, links these projects with his previous work.

I have a couple of projects that are smartphone centered. One of them is about mapping the geography of places around the world where the CIA carries out drone strikes—mostly in Pakistan, Yemen, and Somalia. Another was about looking at the geography of incarceration in the United States—there are more than 5,000 prisons—and trying to map all of them and see them through satellites. I currently have an app that is looking at the geography of police violence in the United States. Most of these apps are about creating a relationship between data and the body, where you can receive a notification every time something unsettling happens. What does that mean for the rest of your day? How do you live with that data—data about people? In some cases the work grows out of these questions, but in other cases the work really is about landscape….

There’s just so much you can never know from looking at satellite imagery. By definition it flattens and distorts things. A lot of folks who fly drones, for instance, think they know a space just from looking at it from above. I firmly reject that idea. The bird’s eye view is never what it means to be on the ground somewhere, or what it means to have meaningful relationships with people on the ground. I feel like I can understand the landscape from 30,000 feet, but it is not the same as spending time in a space.

Anjali Nath has also provided a new commentary on one of Josh’s earlier projects, Metadata, that he cites in that interview – ‘Touched from below: on drones, screens and navigation’, Visual Anthropology 29 (3) (2016) 315-30.

It’s part of a special issue on ‘Visual Revolutions in the Middle East’, and as I explore the visual interventions I’ve included in this post I find myself once again thinking of a vital remark by Edward Said:

we-are-also-looking-at-our-observers-001

That’s part of the message behind the #NotaBugSplat image on the cover of Steve’s essay: but what might Said’s remark mean more generally today, faced with the proliferation of these seeing machines?

 

Game of Drones

creechcasino_0

Joe Pugliese has sent me a copy of his absorbing new essay, ‘Drone casino mimesis: telewarfare and civil militarization‘, which appears in Australia’s Journal of Sociology (2016) (online early).  Here’s the abstract:

This article stages an examination of the complex imbrication of contemporary civil society with war and militarized violence. I ground my investigation in the context of the increasing cooption of civil sites, practices and technologies by the United States military in order to facilitate their conduct of war and the manner in which drone warfare has now been seamlessly accommodated within major metropolitan cities such as Las Vegas, Nevada. In the context of the article, I coin and deploy the term civil militarization. Civil militarization articulates the colonizing of civilian sites, practices and technologies by the military; it names the conversion of such civilian technologies as video games and mobile phones into technologies of war; and it addresses the now quasi-seamless flow that telewarfare enables between military sites and the larger suburban grid and practices of everyday life. In examining drone kills in the context of Nellis Air Force Base, Las Vegas, I bring into focus a new military configuration that I term ‘drone casino mimesis’.

I’m particularly interested in what Joe has to say about what he calls the ‘casino logic and gaming mimesis’ of ‘the drone habitus’.  Most readers will know that ‘Nellis’ (more specifically, Creech Air Force Base, formerly Indian Springs), for long the epicentre of the US Air Force’s remote operations, is a short drive from Las Vegas – and those who have seen Omer Fast‘s 5,000 Feet is the Best will remember the artful way in which it loops between the two.

drone-pilots

Two passages from Joe’s essay have set me thinking.  First Joe moves far beyond the usual (often facile) comparison between the video displays in the Ground Control Station and video games to get at the algorithms and probabilities that animate them:

‘…there are mimetic relations of exchange between Las Vegas’s and Nellis’s gaming consoles, screens and cubicles.


‘Iconographically and infrastructurally, casino gaming and drone technologies stand as mirror images of each other. My argument, however, is not that both these practices and technologies merely ‘reflect’ each other; rather, I argue that gaming practices and technologies effectively work to constitute and inflect drone practices and technologies on a number of levels. Casino drone mimesis identifies, in new materialist terms, the agentic role of casino and gaming technologies precisely as ‘actors’ (Latour, 2004: 226) in the shaping and mutating of both the technologies and conduct of war. Situated within a new materialist schema, I contend that the mounting toll of civilian deaths due to drone strikes is not only a result of human failure or error – for example, the misreading of drone video feed, the miscalculation of targets and so on. Rather, civilian drone kills must be seen as an in-built effect of military technologies that are underpinned by both the morphology (gaming consoles, video screens and joysticks) and the algorithmic infrastructure of gaming – with its foundational dependence on ‘good approximation’ ratios and probability computation.’

And then this second passage where Joe develops what he calls ‘the “bets” and “gambles” on civilian life’:

[‘Bugsplat’ constitutes a] militarized colour-coding system that critically determines the kill value of the target. In the words of one former US intelligence official:

You say something like ‘Show me the Bugsplat.’ That’s what we call the probability of a kill estimate when we are doing this final math before the ‘Go go go’ decision. You would actually get a picture of a compound, and there will be something on it that looks like a bugsplat actually with red, yellow, and green: with red being anybody in that spot is dead, yellow stands a chance of being wounded; green we expect no harm to come to individuals where there is green. (Quoted in Woods, 2015: 150)

Described here is a mélange of paintball and video gaming techniques that is underpinned, in turn, by the probability stakes of casino gaming: as the same drone official concludes, ‘when all those conditions have been met, you may give the order to go ahead and spend the money’ (quoted in Woods, 2015: 150). In the world of drone casino mimesis, when all those gaming conditions have been met, you spend the money, fire your missiles and hope to make a killing. In the parlance of drone operators, if you hit and kill the person you intended to kill ‘that person is called a “jackpot”’ (Begley, 2015: 7). Evidenced here is the manner in which the lexicon of casino gaming is now clearly constitutive of the practices of drone kills. In the world of drone casino mimesis, the gambling stakes are high. ‘The position I took,’ says a drone screener, ‘is that every call I make is a gamble, and I’m betting on their life’ (quoted in Fielding-Smith and Black, 2015).
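
To make that colour-coding a little more concrete, here is a deliberately toy sketch of the kind of thresholding the former official describes. The bands and the threshold values are my own inventions for the sake of illustration; they say nothing about the actual software or its parameters.

```python
def bugsplat_band(p_harm):
    """Map an estimated probability of harm at a location to a colour band."""
    if p_harm >= 0.9:
        return "red"      # anyone in that spot is expected to be killed
    if p_harm >= 0.3:
        return "yellow"   # stands a chance of being wounded
    return "green"        # no harm expected

# hypothetical estimates for different parts of a compound
estimates = {"courtyard": 0.95, "outer wall": 0.45, "neighbouring house": 0.05}
print({place: bugsplat_band(p) for place, p in estimates.items()})
```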

There is much more to Joe’s essay than this, but these passages add considerably to my own discussion of the US targeted killing program in the Federally Administered Tribal Areas of Pakistan in ‘Dirty dancing’.  You can find the whole essay under the DOWNLOADS tab, but this is the paragraph I have in mind (part of an extended discussion of the ‘technicity’ of the US targeted killing program and its reliance on kill lists, signals intercepts and visual feeds):

The kill list embedded in the [disposition] matrix has turned out to be infinitely extendable, more like a revolving door than a rolodex, so much so that at one point an exasperated General Kayani demanded that Admiral Mullen explain how, after hundreds of drone strikes, ‘the United States [could] possibly still be working its way through a “top 20” list?’  The answer lies not only in the remarkable capacity of al Qaeda and the Taliban to regenerate: the endless expansion of the list is written into the constitution of the database and the algorithms from which it emerges. The database accumulates information from multiple agencies, but for targets in the FATA the primary sources are ground intelligence from agents and informants, signals intelligence from the National Security Agency (NSA), and surveillance imagery from the US Air Force. Algorithms are then used to search the database to produce correlations, coincidences and connections that serve to identify suspects, confirm their guilt and anticipate their future actions. Jutta Weber explains that the process follows ‘a logic of eliminating every possible danger’:

‘[T]he database is the perfect tool for pre-emptive security measures because it has no need of the logic of cause and effect. It widens the search space and provides endless patterns of possibilistic networks.’

Although she suggests that the growth of ‘big data’ and the transition from hierarchical to relational and now post-relational databases has marginalised earlier narrative forms, these reappear as soon as suspects have been conjured from the database. The case for including – killing – each individual on the list is exported from its digital target folder to a summary Powerpoint slide called a ‘baseball card’ that converts into a ‘storyboard’ after each mission. Every file is vetted by the CIA’s lawyers and General Counsel, and by deputies at the National Security Council, and all ‘complex cases’ have to be approved by the President. Herein lies the real magic of the system. ‘To make the increasingly powerful non-human agency of algorithms and database systems invisible,’ Weber writes, ‘the symbolic power of the sovereign is emphasised: on “Terror Tuesdays” it (appears that it) is only the sovereign who decides about life and death.’ But this is an optical illusion. As Louise Amoore argues more generally, ‘the sovereign strike is always something more, something in excess of a single flash of decision’ and emerges instead from a constellation of prior practices and projected calculations.
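
If Weber’s point about ‘widening the search space’ seems abstract, a toy example may help: correlate separate intelligence feeds and anyone who co-occurs with a listed suspect becomes a candidate in the next pass. The feeds, names and matching rule below are entirely my own invention – a schematic sketch, not a description of how the disposition matrix actually works.

```python
# Three hypothetical feeds, each recording who was observed together.
feeds = {
    "signals": [{"A", "B"}, {"B", "C"}],   # e.g. phones in contact with one another
    "ground":  [{"C", "D"}],               # e.g. informant reports of meetings
    "imagery": [{"D", "E"}, {"A", "E"}],   # e.g. vehicles seen at the same compound
}

def expand(suspects, feeds):
    """One pass: anyone who co-occurs with a current suspect in any feed is added."""
    widened = set(suspects)
    for events in feeds.values():
        for event in events:
            if event & widened:
                widened |= event
    return widened

suspects = {"A"}
for step in (1, 2, 3):
    suspects = expand(suspects, feeds)
    print(f"pass {step}: {sorted(suspects)}")  # the list only ever grows
```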

The machinery of (writing about) bombing

I began the first of my Tanner Lectures – Reach from the Sky – with a discussion of the machinery of bombing, and I started by describing an extraordinary scene: the window of a Georgian terrace house in London being popped out – but not by a bomb.  The year was 1968, and the novelist Len Deighton was taking delivery of the first word-processor to be leased (not even sold) to an individual.

As Matthew Kirschenbaum told the story in Slate:

The IBM technician who serviced Deighton’s typewriters had just heard from Deighton’s personal assistant, Ms. Ellenor Handley, that she had been retyping chapter drafts for his book in progress dozens of times over. IBM had a machine that could help, the technician mentioned. They were being used in the new ultramodern Shell Centre on the south bank of the Thames, not far from his Merrick Square home.

A few weeks later, Deighton stood outside his Georgian terrace home and watched as workers removed a window so that a 200-pound unit could be hoisted inside with a crane. The machine was IBM’s MTST (Magnetic Tape Selectric Typewriter).

It was a lovely story, because the novel Deighton was working on – almost certainly the first to be written on a word-processor – was his brilliant account of bombing in the Second World War, Bomber.  It had started out as a non-fiction book (and Deighton has published several histories of the period) but as it turned into a novel the pace of research never slackened.

Deighton recalls that he had shelved his original project until a fellow writer, Julian Symons, told him that he was ‘the only person he could think of who actually liked machines’:

I had been saying that machines are simply machines… That conversation set me thinking again about the bombing raids. And about writing a book about them. The technology was complex but not so complex as to be incomprehensible. Suppose I wrote a story in which the machines of one nation fought the machines of another? The epitome of such a battle must be the radar war fought in pitch darkness. To what extent could I use my idea in depicting the night bombing war? Would there be a danger that such a theme would eliminate the human content of the book? The human element was already a difficult aspect of writing such a story.

And so Bomber was born.

The novel describes the events surrounding an Allied attack during the night of 31 June (sic) 1943 – the planned target was Krefeld, but the town that was attacked, a ‘target of opportunity’, was ‘Altgarten’.  And like the bombing raid, it was a long haul.  As Deighton explained:

I am a slow worker so that each book takes well over a year—some took several years—and I had always ‘constructed’ my books rather than written them. Until the IBM machine arrived I used scissors and paste (actually Copydex one of those milk glues) to add paras, dump pages and rearrange sections of material. Having been trained as an illustrator I saw no reason to work from start to finish. I reasoned that a painting is not started in the top left hand corner and finished in the bottom right corner: why should a book be put together in a straight line?

Deighton’s objective, so he said, was ‘to emphasize the dehumanizing effect of mechanical warfare. I like machines but in wars all humans are their victims.’

I pulled all this together in this slide:

Len Deighton BOMBER (Tanner Lecture 1).001

I then riffed off Deighton’s work in two ways.

First, I noted that Bomber was written at the height of the Vietnam War, what James Gibson calls ‘techno-war’:

Len Deighton TECHNOWAR (Tanner 2).001

I focused on the so-called ‘electronic battlefield’ that I had discussed in detail in ‘Lines of descent’ (DOWNLOADS tab), and its attempt to interdict the supply lines that snaked along the Ho Chi Minh Trail by sowing it with sensors and automating bombing:

Electronic battlefield 1 (Tanner Lectures).001 Electronic battlefield 2 (Tanner Lectures).001

The system was an expensive failure – technophiles and technophobes alike miss that sharp point – but it prefigured the logic that animates today’s remote operations:

Electronic battlefield 3 (Tanner Lectures).001

Second – in fact, in the second lecture – I returned to Bomber and explored the relations between Deighton’s ‘men and machines’.  There I emphasised the intimacy of a bomber crew in the Second World War (contrasting this with the impersonal shift-work that characterises today’s crews operating Predators and Reapers).  ‘In the air’, wrote John Watson in Johnny Kinsman, ‘they were component parts of a machine, welded together, dependent on each other.’  This was captured perfectly, I think, in this photograph by the inimitable Margaret Bourke-White:

Men-machines (BOURKE-WHITE) Tanner Lectures).001

Much to say about the human, the machine and the cyborg, no doubt, but what has brought all this roaring back is another image of the entanglements between humans and machines that returns me to my starting-point.  In a fine essay in The Paris Review, ‘This faithful machine‘, Matthew Kirschenbaum revisits the history of word-processing.  It’s a fascinating read, and it’s headed by this photograph of Len Deighton working on Bomber in his study:

deighton-home-office-1

Behind him you can see giant cut-away diagrams of British and German bombers, and on the left a Bomber Command route map to ‘the target for tonight’ (the red ribbon crossing the map of Europe), and below that a target map.  ‘Somber things,’ he called them in Bomber:

‘inflammable forest and built-up areas defined as grey blocks and shaded angular shapes.  The only white marks were the thin rivers and blobs of lake.  The roads were purple veins so that the whole thing was like a badly bruised torso.’

More on all that in my ‘Doors into nowhere’ (DOWNLOADS tab), and much more on the history of word-processing in Matthew’s Track Changes: A Literary History of Word Processing just out from Harvard University Press:

The story of writing in the digital age is every bit as messy as the ink-stained rags that littered the floor of Gutenberg’s print shop or the hot molten lead of the Linotype machine. During the period of the pivotal growth and widespread adoption of word processing as a writing technology, some authors embraced it as a marvel while others decried it as the death of literature. The product of years of archival research and numerous interviews conducted by the author, Track Changes is the first literary history of word processing.

Matthew Kirschenbaum examines how the interests and ideals of creative authorship came to coexist with the computer revolution. Who were the first adopters? What kind of anxieties did they share? Was word processing perceived as just a better typewriter or something more? How did it change our understanding of writing?

Track Changes balances the stories of individual writers with a consideration of how the seemingly ineffable act of writing is always grounded in particular instruments and media, from quills to keyboards. Along the way, we discover the candidates for the first novel written on a word processor, explore the surprisingly varied reasons why writers of both popular and serious literature adopted the technology, trace the spread of new metaphors and ideas from word processing in fiction and poetry, and consider the fate of literary scholarship and memory in an era when the final remnants of authorship may consist of folders on a hard drive or documents in the cloud.

And, as you’d expect, it’s available as an e-book.

Planetary bombing

NORAD's Santa

You’ve probably read the tinsel-and-glitter story about NORAD tracking Santa Claus on Christmas Eve – like Santa’s sleigh, it goes the rounds every year – but Matt Novak provides an appropriately explosive rendition of it here.

It was a smart move for the military. When American kids asked their parents what NORAD was, the U.S. parents would be able to respond “those are the people who help Santa” rather than “those are the people who are ensuring our second strike capabilities after you and everyone in your play group are turned to dust by a nuclear attack.”

Among other plums in the pudding, Matt pulls out a syndicated story from AP in December 1955, in which the military promised that it would ‘continue to track and guard Santa and his sleigh on his trip to and from the U.S. against possible attack from those who do not believe in Christmas‘ (emphasis added).

Atomic Weapons Requirements Study for 1959 JPEG

Just before Christmas this year, while NORAD was busy preparing to track Santa’s sleigh again, the National Security Archive at George Washington University released US Strategic Air Command’s Atomic Weapons Requirements Study for 1959, produced the year after that AP story.  The study

‘provides the most comprehensive and detailed list of nuclear targets and target systems that has ever been declassified. As far as can be told, no comparable document has ever been declassified for any period of Cold War history.’

Based on the Bombing Encyclopedia of the World, the Air Force planners proposed

the “systematic destruction” of Soviet bloc urban-industrial targets that specifically and explicitly targeted “population” in all cities, including Beijing, Moscow, Leningrad, East Berlin, and Warsaw. Purposefully targeting civilian populations as such directly conflicted with the international norms of the day, which prohibited attacks on people per se (as opposed to military installations with civilians nearby).

The study ‘listed over 1200 cities in the Soviet bloc, from East Germany to China, also with priorities established. Moscow and Leningrad were priority one and two respectively. Moscow included 179 Designated Ground Zeros (DGZs) while Leningrad had 145, including “population” targets.’  Every target was preceded by an eight-digit code from the Bombing Encyclopedia.

Selected SAC targets 1959 JPEG

William Burr provides an excellent, detailed commentary to accompany the Study here; you can also find more from Joseph Trevithick on this ‘catalog of nuclear death’ over at War is Boring here.

But all of this is prelude to the real plum in my Christmas pudding, the best paper I’ve read all year: Joseph Masco‘s ‘The Age of Fallout’ in the latest issue of History of the Present [5 (2) (2015) 137-168].

Being able to assume a planetary, as opposed to a global, imaginary is a surprisingly recent phenomenon. Although depictions of an earthly sphere are longstanding and multiple, I would argue that the specific attributes of being able to see the entire planet as a single unit or system is a Cold War creation. This mode of thinking is therefore deeply imbricated not only in nuclear age militarism, but also in specific forms of twentieth-century knowledge production and a related proliferation of visualization technologies.  A planetary imaginary includes globalities of every kind (finance, technology, international relations) – along with geology, atmosphere, glaciers, oceans, and the biosphere – as one totality.

What is increasingly powerful about this point of view is that it both relies on the national security state for the technologies, finances, and interests that create the possibility of seeing in this fashion, but also, in a single gesture, exceeds the nation-state as the political form that matters. A planetary optic is thus a national security creation (in its scientific infrastructures, visualization technologies, and governing ambitions) that transcends these structures to offer an alternative ground for politics and future making. Proliferating forms of globality – including the specific visualizations of science, finance, politics, and environment – each achieve ultimate scale and are unified at the level of the planetary. This achievement ultimately raises an important set of questions about how collective security problems can, and should, be imagined.

It’s a tour de force which, as these opening paragraphs show, is beautifully written too.  Joe begins with a richly suggestive discussion of the idea of ‘fallout’:

‘Fallout comes after the event; it is the unacknowledged-until-lived crisis that is built into the infrastructure of a system, program, or process. Fallout is therefore understood primarily retrospectively, but it is lived in the future anterior becoming a form of history made visible in negative outcomes.’

Its horizons are as much spatial as they are temporal – though Joe makes the sharp point that radioactive fallout was initially conceived as ‘the bomb’s lesser form’ and that it was the ‘explosive power of the bomb that was fetishized by the US military’ – and that fallout involves ‘individual actions and lived consequences, a post-sociality lived in isolation from the collective action of society or the war machine’ that mutates into what he sees as ‘an increasingly post-Foucauldian kind of governmentality’.

FALLOUT JPEG

When he elaborates the multiple registers in which radioactive fallout appears as an atmospheric toxicity Joe moves far beyond the nostrums of Peter Sloterdijk and others – which, to me anyway, seem to be based on almost wilfully superficial research – and connects it, both substantively and imaginatively, to contemporary critical discussions around global climate change and the Anthropocene.

In a cascade of maps and images, Joe shows how

Space and time are radically reconfigured in these fallout studies, constituting a vision of a collective future that is incrementally changing in unknown ways through cumulative industrial effects. The logics of a national security state (with its linkage of a discrete territory to a specific population) becomes paradoxical in the face of mounting evidence of ecological damage on a collective scale, not from nuclear war itself but rather from nuclear research and development programs. It is important to recognize that while cast as “experiments,” U.S. atmospheric nuclear tests were in reality planetary-scale environmental events.

In short, ‘since 1945 human beings have become post-nuclear creatures, marked with the signatures of nuclear weapons science.’

Towards the end of his essay, Joe says this:

In applying the lessons of the twentieth-century nuclear complex to contemporary geoengineering schemes to manage climate change, we might question 1) the claim to both newness and absolute crisis that installs a state of emergency and suspends normal forms of law and regulation; 2) a process that rhetorically reproduces the split between the event and its fallout so completely; and 3) the suggestion that geoengineering is a novel activity, that it is not an ancient practice with many antecedent examples to think with in assessing our current moment. We might also interrogate how the past fifty years of multidisciplinary work to create detailed visualizations of the planet has installed a dangerous confidence in globality itself, as increasingly high resolution visualizations come to stand in for both objectivity and sovereignty, and thus enable psychosocial feelings of control over vastly complex earth systems that remain, at best, only partially understood.

It’s an immensely provocative, perceptive paragraph; it not only makes me retrace my own wanderings through the nuclear wastelands (see here, here and here) but it also obliges me to rethink what I once called ‘the everywhere war’, to map its contours much more carefully  (the original impulse was simply to provide a counterpoint to those commentators who emphasised war time – ‘the forever war’, ‘permanent war’, ‘never-ending war’ – and who never noticed its spaces), and – particularly with that remark about ‘high resolution visualizations com[ing] to stand in for both objectivity and sovereignty’ in mind – perhaps even to see it as another dimension of Joe’s ‘Age of Fallout’.

Citizen Ex

Algorithmic citizenship JPEG

I’m late to this, so apologies, but if you are either weary of web-surfing or can’t get off your digital board, check out James Bridle‘s Citizen Ex project on ‘algorithmic citizenship’:

Algorithmic Citizenship is a form of citizenship which is not assigned at birth, or through complex legal documents, but through data. Like other computerised processes, it can happen at the speed of light, and it can happen over and over again, constantly revising and recalculating. It can split a single citizenship into an infinite number of sub-citizenships, and count and weight them over time to produce combinations of affiliations to different states.

Citizen Ex calculates your Algorithmic Citizenship based on where you go online. Every site you visit is counted as evidence of your affiliation to a particular place, and added to your constantly revised Algorithmic Citizenship. Because the internet is everywhere, you can go anywhere – but because the internet is real, this also has consequences.
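
To see how simple the underlying tally can be, here is a minimal sketch of the kind of calculation the description implies: each visit is attributed to the state where the site appears to be hosted, and the running shares become weighted ‘affiliations’. The host-to-country lookup and the flat weighting are my own illustrative assumptions, not James’s actual code.

```python
from collections import Counter

# hypothetical lookup from visited hosts to the state where they appear to be hosted
HOST_COUNTRY = {
    "news.bbc.co.uk": "GB",
    "www.nytimes.com": "US",
    "espresso.repubblica.it": "IT",
}

def algorithmic_citizenship(visited_hosts):
    """Turn a browsing history into weighted affiliations to different states."""
    counts = Counter(HOST_COUNTRY.get(host, "unknown") for host in visited_hosts)
    total = sum(counts.values())
    return {country: count / total for country, count in counts.items()}

history = ["news.bbc.co.uk", "www.nytimes.com", "www.nytimes.com", "espresso.repubblica.it"]
print(algorithmic_citizenship(history))  # e.g. {'GB': 0.25, 'US': 0.5, 'IT': 0.25}
```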

The basic idea is derived from an essay by John Cheney-Lippold in Theory, Culture and Society here:

Marketing and web analytic companies have implemented sophisticated algorithms to observe, analyze, and identify users through large surveillance networks online. These computer algorithms have the capacity to infer categories of identity upon users based largely on their web-surfing habits. In this article I will first discuss the conceptual and theoretical work around code, outlining its use in an analysis of online categorization practices. The article will then approach the function of code at the level of the category, arguing that an analysis of coded computer algorithms enables a supplement to Foucauldian thinking around biopolitics and biopower, of what I call soft biopower and soft biopolitics. These new conceptual devices allow us to better understand the workings of biopower at the level of the category, of using computer code, statistics and surveillance to construct categories within populations according to users’ surveilled internet history. Finally, the article will think through the nuanced ways that algorithmic inference works as a mode of control, of processes of identification that structure and regulate our lives online within the context of online marketing and algorithmic categorization.

From James’s Citizen Ex site you can download (from the banner, top left) an extension to your browser which – after you’ve browsed some more – will calculate, in a very rough and ready way, your own algorithmic citizenship.  Mine (from today’s little effort) is shown at the head of this post.

This may look like an entertaining distraction, but what lies behind it is of course deadly serious: read, for example, James’s (short) stories on Libya and Syria.

Created as a browser plug-in, Citizen Ex shows us the true physical locations of the sites we visit and the territories that govern our actions as we traverse the web. In this reality, every mouse click leaves a trace, as our personal data is collected and stored in locations around the globe. It is with this information that governments and corporations construct a notional vision of our lives. This is our ‘algorithmic citizenship’ — the way we appear to the network. This programmatic fluidity is far removed from the true complexity of human identity. It reduces it to something calculable, which has profound implications for our understanding of privacy, citizenship and the self.

It also has profound implications for surveillance and the digital production of the killing spaces of later modern war.  Read this alongside Louise Amoore‘s brilliant work on The politics of possibility and you can perhaps see where I’m going:

‘[W]hat comes to count as the actionable intelligence behind a sovereign decision is a mosaic of overwhelmingly ordinary fragments of a life that become, once arrayed together, secret and sensitive evidence…

‘Drawing some elements of past activities into the calculation, the mosaic nonetheless moves over the surface of multiple past subjects and events in order to imagine a future unknown subject.’

It’s not difficult to divine (sic) how ‘Citizen Ex’ becomes ‘Citizen-Ex’.

Matters of definition

Since my post on the use of drones to provide intelligence, surveillance and reconnaissance over Iraq and Syria I’ve been thinking about the image stream provided by Predators and Reapers.  There I used an image from what I think must be an MQ-9 Reaper operated by France which was in full colour and – this is the important part – in high definition.  Over the weekend the New York Times published a report, culled from the Italian magazine L’Espresso, which – together with the accompanying video clip (the link is to the Italian original not the Times version) – confirmed the power of HD full motion video, this time from a Reaper operated by Italy:

The footage … begins with grainy black-and-white images of an airstrike on what appears to have been a checkpoint on a road in northern Iraq, beneath a huge black flag.

Then there is something altogether different: high-resolution, color video of four distinct armed figures walking out of a house and along the streets of a town. At one stage, the picture suddenly zooms in on two of the suspected militants to reveal that one of them is almost certainly a child, propping a rifle on his shoulder that indicates how small he is relative to the man next to him. The images are so clear that even the shadows of the figures can be examined.

Italian drone video stills

But the significance of all this is less straightforward than it might appear.

First, not all drones have this HD capability.  We know from investigations into civilian casualty incidents in Afghanistan that the feeds from Predators but also early model (‘Block’) Reapers are frequently grainy and imprecise.  Sean Davies reports that the video compression necessary for data transmission squeezed 560 x 480 pixel resolution images into 3.2 MBps at 30 frames per second, whereas the newer (Block 5) Reapers provide 1280 x 720 pixel resolution images at 6.4 MBps.  The enhanced video feeds can be transmitted not only to the Ground Control Stations from which the aircraft are flown – and those too have been upgraded (see image below) – but also to operations centres monitoring the missions and, crucially, to ruggedized laptops (‘ROVERs’) used by special forces and other troops on the ground.
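
For a rough sense of what those figures imply, here is a back-of-the-envelope calculation of the raw video rate against the quoted link rate. Reading ‘MBps’ as megabits per second and assuming 8 bits per pixel are my assumptions, not Davies’s, so treat the ratios as indicative only.

```python
def feed_stats(width, height, fps, link_megabits_per_s, bits_per_pixel=8):
    """Compare the uncompressed video rate with the quoted transmission rate."""
    raw_megabits_per_s = width * height * fps * bits_per_pixel / 1e6
    return {
        "pixels_per_frame": width * height,
        "raw_Mbit_per_s": round(raw_megabits_per_s, 1),
        "link_Mbit_per_s": link_megabits_per_s,
        "implied_compression": round(raw_megabits_per_s / link_megabits_per_s, 1),
    }

print("early feed :", feed_stats(560, 480, 30, 3.2))    # heavily compressed
print("Block 5 HD :", feed_stats(1280, 720, 30, 6.4))   # far more pixels, even more compression
```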

ground-control-stations

The significance of HD full-motion video is revealed in the slide below, taken from a briefing on ‘small footprint operations’ in Somalia and Yemen prepared in February 2013 and published as part of The Intercept‘s Drone Papers, which summarises its impact on the crucial middle stage of the ‘find, fix, finish‘ cycle of targeted killing:

HD FMV impact on Fix

As you can see, HD FMV was involved in as many as 72 per cent of the successful ‘fixes’ and was absent from 88 per cent of the unsuccessful ones.

Second, Eyal Weizman cautions that the image stream shown on the Italian video was captured ‘either very early or very late in the day.  Without shadows we could not identify these as weapons at all.’  Infra-red images captured at night could obviously not provide definition of this quality, but even so-called ‘Day TV’ would not show clear shadows at most times of the day. In Eyal’s view, ‘showing these rare instances could skew our understanding of how much can be seen by drones and how clear what we see is.’

Third, no matter how high the resolution of the video feeds, we need to remember that their interpretation is a techno-cultural process.  One of the figures shown in the Italian video ‘is almost certainly a child’, reports the New York Times.  So bear in mind this exchange between the crew of a Predator circling over three vehicles travelling through the mountains of Uruzgan in February 2010 (see also here and here):

1:07 (MC): screener said at least one child near SUV

1:07 (Sensor): bull (expletive deleted)…where!?

1:07 (Sensor): send me a (expletive deleted) still, I don’t think they have kids out at this hour, I know they’re shady but come on

1:07 (Pilot): at least one child…Really? Listing the MAM [Military-Aged Male], uh, that means he’s guilty

1:07 (Sensor): well maybe a teenager but I haven’t seen anything that looked that short, granted they’re all grouped up here, but…

1:07 (MC): They’re reviewing

1:07 (Pilot): Yeah review that (expletive deleted)…why didn’t he say possible child, why are they so quick to call (expletive deleted) kids but not to call (expletive deleted) a rifle….

03:10 (Pilot): And Kirk97, good copy on that. We are with you. Our screener updated only one adolescent so that’s one double digit age range. How Copy?

03:10 (JAG25): We’ll pass that along to the ground force commander. But like I said, 12-13 years old with a weapon is just as dangerous.

In other words – it’s more than a matter of high definition; it’s also a matter of political and cultural definition.

Big Data and Bombs on Fifth Avenue

Big Data, No Thanks

James Bridle has posted a lightly edited version of the excellent presentation he gave to “Through Post-Atomic Eyes” in Toronto last month – Big Data, No Thanks – at his blog booktwo.  It’s an artful mix of text and images and, as always with James, both repay close scrutiny.

If you look at the situation we are in now, a couple of years after the Snowden revelations, most if not all of the activities which they uncovered have been, if not secretly authorised already, signed into law and continued without much fuss.

As Trevor Paglen has said: Wikileaks and the NSA have essentially the same political position: there are dark secrets at the heart of the world, and if we can only bring them to light, everything will magically be made better. One legitimises the other. Transparency is not enough – and certainly not when it operates in only one direction.  This process has also made me question my own practice and that of many others, because making the invisible visible is not enough either.

James talks about the ‘existential dread’ he feels caused not ‘by the shadow of the bomb, but by the shadow of data’:

It’s easy to feel, looking back, that we spent the 20th Century living in a minefield, and I think we’re still living in a minefield now, one where critical public health infrastructure runs on insecure public phone networks, financial markets rely on vulnerable, decades-old computer systems, and everything from mortgage applications to lethal weapons systems are governed by inscrutable and unaccountable softwares. This structural and existential threat, which is both to our individual liberty and our collective society, is largely concealed from us by commercial and political interests, and nuclear history is a good primer in how that has been standard practice for quite some time.

It’s a much richer argument than these snippets can convey.  For me, the high spot comes when James talks about IBM’s Selective Sequence Electronic Calculator (really), which turns out to be the most explosive combination of secrecy and visibility that you could possibly imagine.

I’m not going to spoil it – go and read it for yourself, and then the title of this post will make horrible sense.  You can read more in George Dyson‘s absorbingly intricate Turing’s Cathedral: the origins of the digital universe (Allen Lane/Penguin, 2012).

“I’ve looked at clouds from both sides now…”

Eyal Weizman‘s stunning Wall Exchange, “Forensic Architecture”, which he presented at the Vogue Theatre in Vancouver earlier this month, is now up on YouTube here and embedded below.

https://www.youtube.com/watch?v=qBDWPn7QcIg

If you are puzzled by my riff on Joni Mitchell, start around 46:10 (though you’ll miss a lot if you do…).

And while we’re on the subject of clouds, you’ll find a remarkable analysis of a different sort of militarized cloud in Tung-Hui Hu‘s A Prehistory of the Cloud from MIT:

We may imagine the digital cloud as placeless, mute, ethereal, and unmediated. Yet the reality of the cloud is embodied in thousands of massive data centers, any one of which can use as much electricity as a midsized town. Even all these data centers are only one small part of the cloud. Behind that cloud-shaped icon on our screens is a whole universe of technologies and cultural norms, all working to keep us from noticing their existence. In this book, Tung-Hui Hu examines the gap between the real and the virtual in our understanding of the cloud.

Hu shows that the cloud grew out of such older networks as railroad tracks, sewer lines, and television circuits. He describes key moments in the prehistory of the cloud, from the game “Spacewar” as exemplar of time-sharing computers to Cold War bunkers that were later reused as data centers. Countering the popular perception of a new “cloudlike” political power that is dispersed and immaterial, Hu argues that the cloud grafts digital technologies onto older ways of exerting power over a population. But because we invest the cloud with cultural fantasies about security and participation, we fail to recognize its militarized origins and ideology. Moving between the materiality of the technology itself and its cultural rhetoric, Hu’s account offers a set of new tools for rethinking the contemporary digital environment.

Prehistory of the Cloud

Here is Lisa Parks on what is surely one of the must-reads of the year:

“Hu’s riveting genealogy of the cloud takes us into its precursors and politics, and boldly demonstrates how fantasies of sovereignty, security, and participation are bound up in it. Much more than a data center, the cloud is a diffuse and invisible structure of power that has yielded a data-centric order. Imaginative and lucidly written, this book will be core to digital media studies.”