Google eyes

The Oxford English Dictionary recognised ‘google’ as a verb in 2006, and its active form is about to gain another dimension.  One of the most persistent anxieties amongst those executing remote warfare, with its extraordinary dependence on (and capacity for) real-time full motion video surveillance as an integral moment of the targeting cycle, has been the ever-present risk of ‘swimming in sensors and drowning in data’.

But now Kate Conger and Dell Cameron report for Gizmodo on a new collaboration between Google and the Pentagon as part of Project Maven:

Project Maven, a fast-moving Pentagon project also known as the Algorithmic Warfare Cross-Functional Team (AWCFT), was established in April 2017. Maven’s stated mission is to “accelerate DoD’s integration of big data and machine learning.” In total, the Defense Department spent $7.4 billion on artificial intelligence-related areas in 2017, the Wall Street Journal reported.

The project’s first assignment was to help the Pentagon efficiently process the deluge of video footage collected daily by its aerial drones—an amount of footage so vast that human analysts can’t keep up, according to Greg Allen, an adjunct fellow at the Center for a New American Security, who co-authored a lengthy July 2017 report on the military’s use of artificial intelligence. Although the Defense Department has poured resources into the development of advanced sensor technology to gather information during drone flights, it has lagged in creating analysis tools to comb through the data.

“Before Maven, nobody in the department had a clue how to properly buy, field, and implement AI,” Allen wrote.

Maven was tasked with using machine learning to identify vehicles and other objects in drone footage, taking that burden off analysts. Maven’s initial goal was to provide the military with advanced computer vision, enabling the automated detection and identification of objects in as many as 38 categories captured by a drone’s full-motion camera, according to the Pentagon. Maven provides the department with the ability to track individuals as they come and go from different locations.

Google has reportedly attempted to allay fears about its involvement:

A Google spokesperson told Gizmodo in a statement that it is providing the Defense Department with TensorFlow APIs, which are used in machine learning applications, to help military analysts detect objects in images. Acknowledging the controversial nature of using machine learning for military purposes, the spokesperson said the company is currently working “to develop policies and safeguards” around its use.

“We have long worked with government agencies to provide technology solutions. This specific project is a pilot with the Department of Defense, to provide open source TensorFlow APIs that can assist in object recognition on unclassified data,” the spokesperson said. “The technology flags images for human review, and is for non-offensive uses only. Military use of machine learning naturally raises valid concerns. We’re actively discussing this important topic internally and with others as we continue to develop policies and safeguards around the development and use of our machine learning technologies.”


As Mehreen Kasana notes, Google has indeed ‘long worked with government agencies’:

A 2017 report in Quartz shed light on the origins of Google and how a significant amount of funding for the company came from the CIA and NSA for mass surveillance purposes. Time and again, Google’s funding raises questions. In 2013, a Guardian report highlighted Google’s acquisition of the robotics company Boston Dynamics, and noted that most of the projects were funded by the Defense Advanced Research Projects Agency (DARPA).
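For readers wondering what ‘flagging images for human review’ amounts to in practice, here is a minimal sketch of that triage step. Everything in it is an assumption for illustration: the `detect` callable stands in for an object-detection model of the kind the TensorFlow APIs support, and the names `flag_for_review`, `Detection` and the 0.5 confidence threshold are hypothetical, not anything drawn from Maven or from Google’s actual interfaces.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Detection:
    label: str    # one of the few dozen object categories (e.g. 'vehicle')
    score: float  # detector confidence, between 0.0 and 1.0

def flag_for_review(
    frames: List[object],
    detect: Callable[[object], List[Detection]],
    threshold: float = 0.5,
) -> List[int]:
    """Return the indices of frames containing at least one detection
    above the confidence threshold; only these are queued for a human
    analyst, the rest are discarded from the review pipeline."""
    flagged = []
    for i, frame in enumerate(frames):
        if any(d.score >= threshold for d in detect(frame)):
            flagged.append(i)
    return flagged

# A stub detector standing in for the real model: frame 0 shows a
# confident vehicle detection, frame 1 nothing, frame 2 a weak one.
def stub_detect(frame):
    return {
        "f0": [Detection("vehicle", 0.9)],
        "f1": [],
        "f2": [Detection("person", 0.3)],
    }[frame]

print(flag_for_review(["f0", "f1", "f2"], stub_detect))  # [0]
```

The point of the sketch is how much the machine decides before any human sees anything: the threshold, set in code, determines which footage an analyst ever reviews.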

Big Data and Bombs on Fifth Avenue

James Bridle has posted a lightly edited version of the excellent presentation he gave to “Through Post-Atomic Eyes” in Toronto last month – Big Data, No Thanks – at his blog booktwo.  It’s an artful mix of text and images and, as always with James, both repay close scrutiny.

If you look at the situation we are in now, a couple of years after the Snowden revelations, most if not all of the activities which they uncovered have been, if not secretly authorised already, signed into law and continued without much fuss.

As Trevor Paglen has said: Wikileaks and the NSA have essentially the same political position: there are dark secrets at the heart of the world, and if we can only bring them to light, everything will magically be made better. One legitimises the other. Transparency is not enough – and certainly not when it operates in only one direction.  This process has also made me question my own practice and that of many others, because making the invisible visible is not enough either.

James talks about the ‘existential dread’ he feels caused not ‘by the shadow of the bomb, but by the shadow of data’:

It’s easy to feel, looking back, that we spent the 20th Century living in a minefield, and I think we’re still living in a minefield now, one where critical public health infrastructure runs on insecure public phone networks, financial markets rely on vulnerable, decades-old computer systems, and everything from mortgage applications to lethal weapons systems are governed by inscrutable and unaccountable softwares. This structural and existential threat, which is both to our individual liberty and our collective society, is largely concealed from us by commercial and political interests, and nuclear history is a good primer in how that has been standard practice for quite some time.

It’s a much richer argument than these snippets can convey.  For me, the high spot comes when James talks about IBM’s Selective Sequence Electronic Calculator (really), which turns out to be the most explosive combination of secrecy and visibility that you could possibly imagine.

I’m not going to spoil it – go and read it for yourself, and then the title of this post will make horrible sense.  You can read more in George Dyson’s absorbingly intricate account of Turing’s Cathedral: the origins of the digital universe (Allen Lane/Penguin, 2012).

Security archipelagos

Three short contributions that have caught my eye raise a series of interesting questions about contemporary ‘security archipelagos’ (in multiple senses of the term, hence the plural).

The term itself comes from Paul Amar, and Austin Zeiderman has a short but interesting review of his The Security Archipelago: Human-Security States, Sexuality Politics, and the End of Neoliberalism (Duke, 2013) over at Public Books (Public Culture‘s public site):

‘Amar asserts that we need an analytical framework focused on the rise of human security—a governance regime that “aim[s] to protect, rescue, and secure certain idealized forms of humanity.” This new regime is gradually replacing neoliberalism, Amar contends, “as the hegemonic project of global governance and of state administration.” This shift is evident in how security is now justified and pursued by states. The antagonistic relationship between security and human rights that characterized the “neoliberal market states” of the late 20th century is no longer so evident. The repressive security strategies that underpinned earlier development paradigms have been succeeded by the “promise to reconcile human rights and national security interests” in the interest of economic prosperity. Progressive and conservative security doctrines now agree on the imperative to “humanize” (or “humanitarianize”) both state and parastatal security apparatuses. The result, Amar argues, is what he calls the “human-security state”: a globally emergent governance regime with “consistent character and political profile.” From Latin America to the Middle East, political legitimacy is increasingly based on securing humanity against a range of malicious forces….

If the megacities of the Global South are indeed “laboratories” in which new logics and techniques of global governance are being created, it is up to other researchers to fill out and develop further Amar’s concept of the “security archipelago.” Though his study provides both the theoretical rationale and the analytical tools with which to do so, it may be worth questioning whether the “human” is necessarily central to emerging security regimes. For along with human security apparatuses and the human actors struggling to articulate progressive alternatives, a host of non-humans—drones, border fences, hurricanes—are actively producing the security landscape of the future.’

Secondly, I’ve been thinking about the ways in which the work of these ‘laboratories’ often relies on non-state, which is to say corporate, commercial sites (this isn’t news to Paul, of course, even if he wants to challenge our ideas about neoliberalism).  We surely know that the traditional concept of the military-industrial complex now needs wholesale revision, and I’ve noted before the timely and important essay by Jeremy Crampton, Sue Roberts and Ate Poorthuis on ‘The new political economy of geospatial intelligence‘ in the Annals of the Association of American Geographers 104 (1)  (2014) (to which I plan to return in a later post).  The latest MIT Technology Review has a short but suggestive essay by Antonio Regalado, ‘Spinoffs from Spyland’, which describes some of the pathways through which the National Security Agency commercializes (and thus potentially subcontracts and, in some cases, even subverts) its surveillance technology:

In 2011, the NSA released 200,000 lines of code to the Apache Foundation. When Atlas Venture’s Lynch read about that, he jumped—here was a technology already developed, proven to work on tens of terabytes of data, and with security features sorely needed by heavily regulated health-care and banking customers. When Fuchs’s NSA team got cold feet about leaving, says Lynch, “I said ‘Either you do it, or I’ll find five kids from MIT to do it and they’ll steal your thunder.’”

Eventually, Fuchs and several others left the NSA, and now their company [Sqrrl] is part of a land grab in big data, where several companies, like Splunk, Palantir, and Cloudera, have quickly become worth a billion dollars or more.

Over the summer, when debate broke out over NSA surveillance of Americans and others, Sqrrl tried to keep a low profile. But since then, it has found that its connection to the $10-billion-a-year spy agency is a boost, says Ely Kahn, Sqrrl’s head of business development and a cofounder. “Large companies want enterprise-scale technology. They want the same technology the NSA has,” he says.


And finally, before we rush to radicalise and globalise Foucault’s critique of the Panopticon, it’s worth reading my friend Gaston Gordillo’s cautionary note – prompted by the search for missing Malaysian Airlines Flight 370 – on ‘The Opaque Planet’:

The fascination with, and fetishization of, technologies of global location and surveillance often makes us forget that, for all their sophistication, we live on a planet riddled with opaque zones that will always erode the power of human-made systems of orientation, for the simple fact that no such system (contrary to what the NSA seems to believe) will ever manage to create an all-seeing God. This opacity is intrinsic to the textured, three-dimensional materiality of the surface of the planet, and is especially marked in the liquid vastness of the ocean.


Phil Steinberg has already commented on the geopolitics of the search, but Gaston draws our attention to the gaps in the surveillance capabilities of states, and here too the geopolitical meshes (and sometimes jibes against) the geoeconomic, as described in this report from Reuters:

Analysts say the gaps in Southeast Asia’s air defenses are likely to be mirrored in other parts of the developing world, and may be much greater in areas with considerably lower geopolitical tensions.

“Several nations will be embarrassed by how easy it is to trespass their airspace,” said Air Vice Marshal Michael Harwood, a retired British Royal Air Force pilot and ex-defense attache to Washington DC. “Too many movies and Predator (unmanned military drone) feeds from Afghanistan have suckered people into thinking we know everything and see everything. You get what you pay for. And the world, by and large, does not pay.”