The eyes have it…

The Disorder of Things has hosted a symposium on Antoine Bousquet's The eye of war: military perception from the telescope to the drone (Minnesota UP, 2018). Antoine's introduction is here.

There were four other participants, and below I’ve linked to their commentaries and snipped extracts to give you a sense of their arguments: but it really is worth reading them in full.

Kate Hall, 'Linear Perspective, the Modern Subject, and the Martial Gaze'

For Bousquet this future of globalised targeting, to which the birth of linear perspective has brought us, throws the role of the human into question. With the move of perception into the realm of the technical, Bousquet sees that perception has become a process without a subject, and as human agency is increasingly reduced, so too is the possibility for politics – leading, perhaps much like the concerns of the Frankfurt School, to passivity and a closing of the space of critique. For Bousquet the figure that captures this positioning or transformation of the human, and the image that ends the book, is the bomber instructor recording aircraft movement within a dark camera obscura tent. As Bousquet concludes, “…the camera obscura’s occupant is both a passive object of the targeting process and an active if compliant agent tasked with the iterative process and optimization of its performance. Perhaps this duality encapsulates the martial condition we inhabit today, caught between our mobilization within the circulatory networks of the logistics of perception and the roving crosshairs of a global imperium of targeting – and all watched over by machines of glacial indifference.”

If this is the figure that encapsulates the condition of the present, Bousquet has shown in Eye of War how its foundations are found in the early modern period. And in tracing this history, it is clear the future does not look promising for humans (both as passive subjects and as objects of lethal surveillance). But Bousquet does not give us a sense of how we might change course. Eye of War does not ask, where is the space for politics in this analysis of the present?

Dan Öberg, 'Requiem for the Battlefield'

While the culminating battle of the Napoleonic wars, Waterloo, was fought on a battlefield where 140,000 men and 400 guns were crammed into an area of roughly 3.5 miles, the latter half of the 19th century becomes characterised by the dispersal and implosion of the battlefield. As Bousquet has directed our attention to in his work, after the birth of modern warfare the battlefield dissolves due to the increased range of weapons systems. Its disappearance is also facilitated by how the military logistics of perception conditions the appearances of targets, particularly through how the “eye of war” manages to move from the commander occupying a high-point next to the field of battle, to being facilitated by balloons, binoculars, aerial reconnaissance, satellites, algorithms, and cloud computing. It is as part of this process we eventually reach the contemporary era where targeting is characterised by polar inertia, as targets arrive as digital images from anywhere on the globe in front of a stationary targeteer. However, I would like to argue that, parallel to this, there is a corresponding process taking place, which erases and remodels the battlefield as a result of the military disposition that is born with the operational dimension of warfare.

To grasp this disposition and its consequences we need to ponder the fact that it is no coincidence that the operational dimension emerges at precisely the time when the traditional battlefield is starting to disappear. As The Eye of War outlines, global targeting is enabled by a logistics of perception. However, the demand for maps and images as well as the attempts to make sense of the battlefield arguably receives its impetus and frame of reference from elsewhere. It finds its nexus in standard operating procedures, regulations, instructions and manuals, military working groups, administrative ideals, organisational routines, and bureaucratic rituals. And, as the battlefield is managed, coded, and homogenised, it simultaneously starts to become an external point of reference, enacted through operational analysis and planning far from the battlefield itself.

Matthew Ford, 'Totalising the State through Vision and War'

The technologies of vision that Antoine describes emerge from and enable the political and military imaginaries that inspired them. The technological fix that this mentality produces is, however, one that locks military strategy into a paradox that privileges tactical engagement over identifying political solutions. For the modern battlefield is a battlefield of fleeting targets, where speed and concealment reduce the chance of being attacked and create momentary opportunities to produce strategic effects (Bolt, 2012). The assemblages of perspective, sensing, imaging and mapping described in The Eye of War may make it possible to anticipate and engage adversaries before they can achieve these effects, but by definition they achieve these outcomes at the tactical level.

The trap of the martial gaze is, then, twofold. On the one hand, by locking technologies of vision into orientalist ways of seeing, strategies that draw on these systems tend towards misrepresenting adversaries in a manner that finds itself being reproduced in military action. At the same time, in an effort to deliver decisive battle, the state has constructed increasingly exquisite military techniques. These hold out the prospect of military success but only serve to further atomise war down tactical lines as armed forces find more exquisite ways to identify adversaries and adversaries find more sophisticated ways to avoid detection. The result is that the military constructs enemies according to a preconceived calculus and fights them in ways that at best manage outcomes but at worst struggle to deliver political reconciliation.

Jairus Grove, 'A Martial Gaze Conscious of Itself'

If we take the assemblage and the more-than-human approach of Bousquet’s book seriously, which I do, then we ought not believe that the dream of sensing, imaging, mapping, and targeting ends with the intact human individual. As an early peek at what this could become, consider Bousquet’s review of the late 1970s research on ‘cognitive cartography’ and the concern that human technology would need to be altered to truly take advantage of the mapping revolution. More than the development of GIS and other targeting technologies, the dream of cognitive mapping and conditioning was to manage the complex informatics of space and the human uses of it from the ground up. That is, in the making of user-friendly human subjects. One can imagine targeting following similar pathways. The “martial gaze that roams our planet” will not be satisfied with the individual any more than it was satisfied with the factory, the silo, the unit, or the home.

The vast data revolutions in mapping individual and collective behavior utilized in the weaponization of fake and real news, marketing research, fMRI advances and brain mapping, as well as nanodrones, directed energy weapons, and on and on, suggest to me that just as there has never been an end of history for politics, or for that matter war, there will be no end of history or limit to what the martial gaze dreams of targeting. I can imagine returns to punishment where pieces of the enemy’s body are taken. Jasbir Puar’s work on debility (see our recent symposium) already suggests such a martial vision of the enemy at play in the new wars of the 21st century. Following the long tails of Bousquet’s machinic history, I can further imagine the targeting of ideas and behaviors for which ‘pattern-of-life’ targeting and gait analysis are only crude and abstract prototypes.

If we, like the machines we design, are merely technical assemblages, then the molecularization of war described by Bousquet is not likely to remain at the level of the intact human, as if individuals were the martial equivalent of Planck’s quanta of energy. The martial gaze will want more unless fundamentally interrupted by other forces of abstraction and concretization.

Antoine’s response is here.

Lots to think about here for me – especially since one of my current projects on ‘woundscapes’ (from the First World War through to the present) is located at the intersection of the military gaze (‘the target’) and the medical gaze (‘the wound’) but rapidly spirals beyond these acutely visual registers, as it surely must…. More soon!

The Violence of Populism and Precarity

The LA Review of Books has an interesting interview (conducted by Brad Evans) with Mark Duffield here.  I’m not sure the posted title (‘The Death of Humanitarianism’) captures the range and force of Mark’s critique – it’s a long way from Didier Fassin, for example, whose work I also admire – but see for yourself:

Late liberalism’s turn to catastrophism is a response to global recalcitrance. A quarter-century ago, an emergent liberal interventionism boasted that the age of absolute sovereignty was over. As a result of pushback, however, such exceptionalism has evaporated. Coupled with the downturn, liberalism seems but one among many competing powers and truths. Greeted with alarm in the West and dismissed as so much backward or populist reaction, we have to be more open to the run of the present.

If the computational turn has allowed a post-humanist vision of a world that is smaller than the sum of its parts to consolidate, then late liberalism has authored a realist ontopolitics of accepting this world as it is — rather than worrying how it ought to be. It is a connected world of disruptive logistics, mobility differentials, data asymmetries, vast inequalities, and remote violence: a world of precarity.

Populism is seemingly an inevitable response to an unwanted future through the reassertion of autonomy. As a political model, it is instructive. Resistance requires the active recreation of autonomy. During the 1960s, large areas of social, economic, and cultural life still lay outside capitalism. The university campus, the shop floor, and the “Third World” as it was termed, already existed as areas of effective autonomy. For the New Left, this made them potential sites for liberation and revolution. In a connected world, such nurturing autonomy no longer exists.

Political pushback involves the recreation of autonomy via the repoliticization of ground and place through their imbrications with history, culture, and the life that should be lived. It is a resistance that seeks to renegotiate its position and reconnect with the world anew. And so the question we confront is: Can we reassert a progressive autonomy, or at least a humanitarian autonomy based on a resistance to the dystopia of permanent emergency?

When post-humanism holds that design has supplanted revolution, perhaps it’s time to imbue a new humanitarian ethic based on resisting design. A resistance that privileges more the sentiments of spontaneity, circulation, and necessary difference. We cannot imagine the yet to be. We can, however, encourage its arrival by resisting the negative loss and abjection of precarity through a politics of humanitarian critique.

The interview coincides with the publication of Mark’s new book, Post-humanitarianism: governing precarity in the digital world (Polity, December 2018):

The world has entered an unprecedented period of uncertainty and political instability. Faced with the challenge of knowing and acting within such a world, the spread of computers and connectivity, and the arrival of new digital sense-making tools, are widely celebrated as helpful. But is this really the case, or have we lost more than gained in the digital revolution?

In Post-Humanitarianism, renowned scholar of development, security and global governance Mark Duffield offers an alternative interpretation. He contends that connectivity embodies new forms of behavioural incorporation, cognitive subordination and automated management that are themselves inseparable from the emergence of precarity as a global phenomenon. Rather than protect against disasters, we are encouraged to accept them as necessary for strengthening resilience. At a time of permanent emergency, humanitarian disasters function as sites for trialling and anticipating the modes of social automation and remote management necessary to govern the precarity that increasingly embraces us all.

Post-Humanitarianism critically explores how increasing connectivity is inseparable from growing societal polarization, anger and political push-back. It will be essential reading for students of international and social critique, together with anyone concerned about our deepening alienation from the world.

Here is the list of contents:

Chapter One: Introduction – Questioning Connectivity
Chapter Two: Against Hierarchy
Chapter Three: Entropic Barbarism
Chapter Four: Being There
Chapter Five: Fantastic Invasion
Chapter Six: Livelihood Regime
Chapter Seven: Instilling Remoteness
Chapter Eight: Edge of Catastrophe
Chapter Nine: Connecting Precarity
Chapter Ten: Post-Humanitarianism
Chapter Eleven: Living Wild
Chapter Twelve: Conclusion – Automating Precarity

Death machines

New from Elke Schwarz, Death machines: the ethics of violent technologies (Manchester UP):

As innovations in military technologies race toward ever-greater levels of automation and autonomy, debates over the ethics of violent technologies tread water. Death Machines reframes these debates, arguing that the way we conceive of the ethics of contemporary warfare is itself imbued with a set of bio-technological rationalities that work as limits. The task for critical thought must therefore be to unpack, engage, and challenge these limits. Drawing on the work of Hannah Arendt, the book offers a close reading of the technology-biopolitics-complex that informs and produces contemporary subjectivities, highlighting the perilous implications this has for how we think about the ethics of political violence, both now and in the future.

Contents:

Introduction: The conditioned human
1. Biopolitics and the technological subject
2. Biopolitical technologies in Arendt and Foucault
3. Anti-political (post)modernity
4. Procedural violence
5. Ethics as technics
6. All hail our robot overlords
7. Prescription drones
Conclusion: For an ethics beyond technics

In my crosshairs

Two new books on the military gaze:

First, from the ever-interesting Roger Stahl, Through the Crosshairs: War, Visual Culture, and the Weaponized Gaze (Rutgers).

Now that it has become so commonplace, we rarely blink an eye at camera footage framed by the crosshairs of a sniper’s gun or from the perspective of a descending smart bomb. But how did this weaponized gaze become the norm for depicting war, and how has it influenced public perceptions?

Through the Crosshairs traces the genealogy of this weapon’s-eye view across a wide range of genres, including news reports, military public relations images, action movies, video games, and social media posts. As he tracks how gun-camera footage has spilled from the battlefield onto the screens of everyday civilian life, Roger Stahl exposes how this raw video is carefully curated and edited to promote identification with military weaponry, rather than with the targeted victims. He reveals how the weaponized gaze is not only a powerful propagandistic frame, but also a prime site of struggle over the representation of state violence.

Contents:

1 A Strike of the Eye
2 Smart Bomb Vision
3 Satellite Vision
4 Drone Vision
5 Sniper Vision
6 Resistant Vision
7 Afterword: Bodies Inhabited and Disavowed

And here’s Lisa Parks on the book:

“Immersing readers in the perilous visualities of smart bombs, snipers, and drones, Through the Crosshairs delivers a riveting analysis of the weaponized gaze and powerfully explicates the political stakes of screen culture’s militarization.  Packed with insights about the current conjuncture, the book positions Stahl as a leading critic of war and media.”

Incidentally, if you don’t know Roger’s collaborative project, The vision machine: media, war, peace – I first blogged about it five years ago – now is the time to visit: here.

And from Antoine Bousquet, The Eye of War: military perception from the telescope to the drone (Minnesota):

From ubiquitous surveillance to drone strikes that put “warheads onto foreheads,” we live in a world of globalized, individualized targeting. The perils are great. In The Eye of War, Antoine Bousquet provides both a sweeping historical overview of military perception technologies and a disquieting lens on a world that is, increasingly, one in which anything or anyone that can be perceived can be destroyed—in which to see is to destroy.

Arguing that modern-day global targeting is dissolving the conventionally bounded spaces of armed conflict, Bousquet shows that over several centuries, a logistical order of militarized perception has come into ascendancy, bringing perception and annihilation into ever-closer alignment. The efforts deployed to evade this deadly visibility have correspondingly intensified, yielding practices of radical concealment that presage a wholesale disappearance of the customary space of the battlefield. Beginning with the Renaissance’s fateful discovery of linear perspective, The Eye of War discloses the entanglement of the sciences and techniques of perception, representation, and localization in the modern era amid the perpetual quest for military superiority. In a survey that ranges from the telescope, aerial photograph, and gridded map to radar, digital imaging, and the geographic information system, Bousquet shows how successive technological systems have profoundly shaped the history of warfare and the experience of soldiering.

A work of grand historical sweep and remarkable analytical power, The Eye of War explores the implications of militarized perception for the character of war in the twenty-first century and the place of human subjects within its increasingly technical armature.

Contents:

Introduction: Visibility Equals Death
1. Perspective
2. Sensing
3. Imaging
4. Mapping
5. Hiding
Conclusion: A Global Imperium of Targeting

And here is Daniel Monk on the book:

The Eye of War is a masterful contemporary history of the martial gaze that reviews the relation between seeing and targeting. The expansion of ocularcentrism—the ubiquitization of vision as power—Antoine Bousquet shows us, coincides with the inverse: the relegation of the eye to an instrument of a war order that relies on the sensorium as the means to its own ends. As he traces the development of a technocracy of military vision, Bousquet discloses the vision of a military technocracy that has transformed the given world into units of perception indistinct from ‘kill boxes.’

Coming from excellent US university presses – unlike the commercial behemoths (you know who you are) favoured by too many authors in my own field (you know who you are too) – these books are both attractively designed and accessibly priced.

Googled

Following up my post on Google and Project Maven here, there’s an open letter (via the International Committee for Robot Arms Control) in support of Google employees opposed to the tech giant’s participation in Project Maven here: it’s open for (many) more signatures…

An Open Letter To:

Larry Page, CEO of Alphabet;
Sundar Pichai, CEO of Google;
Diane Greene, CEO of Google Cloud;
and Fei-Fei Li, Chief Scientist of AI/ML and Vice President, Google Cloud,

As scholars, academics, and researchers who study, teach about, and develop information technology, we write in solidarity with the 3100+ Google employees, joined by other technology workers, who oppose Google’s participation in Project Maven. We wholeheartedly support their demand that Google terminate its contract with the DoD, and that Google and its parent company Alphabet commit not to develop military technologies and not to use the personal data that they collect for military purposes. The extent to which military funding has been a driver of research and development in computing historically should not determine the field’s path going forward. We also urge Google and Alphabet’s executives to join other AI and robotics researchers and technology executives in calling for an international treaty to prohibit autonomous weapon systems.

Google has long sought to organize and enhance the usefulness of the world’s information. Beyond searching for relevant webpages on the internet, Google has become responsible for compiling our email, videos, calendars, and photographs, and guiding us to physical destinations. Like many other digital technology companies, Google has collected vast amounts of data on the behaviors, activities and interests of their users. The private data collected by Google comes with a responsibility not only to use that data to improve its own technologies and expand its business, but also to benefit society. The company’s motto “Don’t Be Evil” famously embraces this responsibility.

Project Maven is a United States military program aimed at using machine learning to analyze massive amounts of drone surveillance footage and to label objects of interest for human analysts. Google is supplying not only the open source ‘deep learning’ technology, but also engineering expertise and assistance to the Department of Defense.

According to Defense One, Joint Special Operations Forces “in the Middle East” have conducted initial trials using video footage from a small ScanEagle surveillance drone. The project is slated to expand “to larger, medium-altitude Predator and Reaper drones by next summer” and eventually to Gorgon Stare, “a sophisticated, high-tech series of cameras…that can view entire towns.” With Project Maven, Google becomes implicated in the questionable practice of targeted killings. These include so-called signature strikes and pattern-of-life strikes that target people based not on known activities but on probabilities drawn from long range surveillance footage. The legality of these operations has come into question under international[1] and U.S. law.[2] These operations also have raised significant questions of racial and gender bias (most notoriously, the blanket categorization of adult males as militants) in target identification and strike analysis.[3] These problems cannot be reduced to the accuracy of image analysis algorithms, but can only be addressed through greater accountability to international institutions and deeper understanding of geopolitical situations on the ground.

While the reports on Project Maven currently emphasize the role of human analysts, these technologies are poised to become a basis for automated target recognition and autonomous weapon systems. As military commanders come to see the object recognition algorithms as reliable, it will be tempting to attenuate or even remove human review and oversight for these systems. According to Defense One, the DoD already plans to install image analysis technologies on-board the drones themselves, including armed drones. We are then just a short step away from authorizing autonomous drones to kill automatically, without human supervision or meaningful human control. If ethical action on the part of tech companies requires consideration of who might benefit from a technology and who might be harmed, then we can say with certainty that no topic deserves more sober reflection – no technology has higher stakes – than algorithms meant to target and kill at a distance and without public accountability.

We are also deeply concerned about the possible integration of Google’s data on people’s everyday lives with military surveillance data, and its combined application to targeted killing. Google has moved into military work without subjecting itself to public debate or deliberation, either domestically or internationally. While Google regularly decides the future of technology without democratic public engagement, its entry into military technologies casts the problems of private control of information infrastructure into high relief.

Should Google decide to use global internet users’ personal data for military purposes, it would violate the public trust that is fundamental to its business by putting its users’ lives and human rights in jeopardy. The responsibilities of global companies like Google must be commensurate with the transnational makeup of their users. The DoD contracts under consideration by Google, and similar contracts already in place at Microsoft and Amazon, signal a dangerous alliance between the private tech industry, currently in possession of vast quantities of sensitive personal data collected from people across the globe, and one country’s military. They also signal a failure to engage with global civil society and diplomatic institutions that have already highlighted the ethical stakes of these technologies.

We are at a critical moment. The Cambridge Analytica scandal demonstrates growing public concern over allowing the tech industries to wield so much power. This has shone only one spotlight on the increasingly high stakes of information technology infrastructures, and the inadequacy of current national and international governance frameworks to safeguard public trust. Nowhere is this more true than in the case of systems engaged in adjudicating who lives and who dies.
We thus ask Google, and its parent company Alphabet, to:

  • Terminate its Project Maven contract with the DoD.

  • Commit not to develop military technologies, nor to allow the personal data it has collected to be used for military operations.

  • Pledge to neither participate in nor support the development, manufacture, trade or use of autonomous weapons; and to support efforts to ban autonomous weapons.

Google eyes

The Oxford English Dictionary recognised ‘google’ as a verb in 2006, and its active form is about to gain another dimension. One of the most persistent anxieties amongst those executing remote warfare, with its extraordinary dependence on (and capacity for) real-time full motion video surveillance as an integral moment of the targeting cycle, has been the ever-present risk of ‘swimming in sensors and drowning in data’.

But now Kate Conger and Dell Cameron report for Gizmodo on a new collaboration between Google and the Pentagon as part of Project Maven:

Project Maven, a fast-moving Pentagon project also known as the Algorithmic Warfare Cross-Functional Team (AWCFT), was established in April 2017. Maven’s stated mission is to “accelerate DoD’s integration of big data and machine learning.” In total, the Defense Department spent $7.4 billion on artificial intelligence-related areas in 2017, the Wall Street Journal reported.

The project’s first assignment was to help the Pentagon efficiently process the deluge of video footage collected daily by its aerial drones—an amount of footage so vast that human analysts can’t keep up, according to Greg Allen, an adjunct fellow at the Center for a New American Security, who co-authored a lengthy July 2017 report on the military’s use of artificial intelligence. Although the Defense Department has poured resources into the development of advanced sensor technology to gather information during drone flights, it has lagged in creating analysis tools to comb through the data.

“Before Maven, nobody in the department had a clue how to properly buy, field, and implement AI,” Allen wrote.

Maven was tasked with using machine learning to identify vehicles and other objects in drone footage, taking that burden off analysts. Maven’s initial goal was to provide the military with advanced computer vision, enabling the automated detection and identification of objects in as many as 38 categories captured by a drone’s full-motion camera, according to the Pentagon. Maven provides the department with the ability to track individuals as they come and go from different locations.

Google has reportedly attempted to allay fears about its involvement:

A Google spokesperson told Gizmodo in a statement that it is providing the Defense Department with TensorFlow APIs, which are used in machine learning applications, to help military analysts detect objects in images. Acknowledging the controversial nature of using machine learning for military purposes, the spokesperson said the company is currently working “to develop polices and safeguards” around its use.

“We have long worked with government agencies to provide technology solutions. This specific project is a pilot with the Department of Defense, to provide open source TensorFlow APIs that can assist in object recognition on unclassified data,” the spokesperson said. “The technology flags images for human review, and is for non-offensive uses only. Military use of machine learning naturally raises valid concerns. We’re actively discussing this important topic internally and with others as we continue to develop policies and safeguards around the development and use of our machine learning technologies.”
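To make the spokesperson’s phrasing a little more concrete: in purely illustrative terms, ‘open source TensorFlow APIs that can assist in object recognition’ usually means something like the sketch below, which loads a publicly available detection model from TensorFlow Hub and returns candidate detections for a person to review. The model handle, threshold and function name here are my own assumptions for the sake of illustration, and say nothing about what Google actually supplied to the DoD.

```python
# Purely illustrative sketch (not Maven): flag objects in an image for human review
# using a publicly available TensorFlow Hub detection model. The model handle and
# the 0.5 confidence threshold are arbitrary choices made for this example.
import tensorflow as tf
import tensorflow_hub as hub

detector = hub.load("https://tfhub.dev/tensorflow/ssd_mobilenet_v2/2")  # generic COCO-trained detector

def flag_for_review(image_path, score_threshold=0.5):
    """Return candidate detections above a confidence threshold; nothing is acted on automatically."""
    img = tf.image.decode_jpeg(tf.io.read_file(image_path), channels=3)  # uint8 tensor [H, W, 3]
    outputs = detector(tf.expand_dims(img, axis=0))                      # model expects a batch dimension
    scores = outputs["detection_scores"][0].numpy()
    classes = outputs["detection_classes"][0].numpy()
    boxes = outputs["detection_boxes"][0].numpy()
    keep = scores >= score_threshold
    # The output is simply a list of (class id, score, bounding box) tuples for an analyst to inspect.
    return list(zip(classes[keep], scores[keep], boxes[keep]))
```

Even in this toy form the point of the statement is visible: the software decides nothing by itself, but it does pre-structure what the human analyst is given to look at.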

 

As Mehreen Kasana notes, Google has indeed ‘long worked with government agencies’:

A 2017 report in Quartz shed light on the origins of Google and how a significant amount of funding for the company came from the CIA and NSA for mass surveillance purposes. Time and again, Google’s funding raises questions. In 2013, a Guardian report highlighted Google’s acquisition of the robotics company Boston Dynamics, and noted that most of the projects were funded by the Defense Advanced Research Projects Agency (DARPA).

Drone Imaginaries and Society

News from Kathrin Maurer of a conference on Drone Imaginaries and Society, 5-6 June 2018, at the University of Southern Denmark (Odense).  I’ll be giving a keynote, and look forward to the whole event very much (and not only because, while I’ve visited Denmark lots of times and loved every one of them, I’ve never made it to Odense):

Drones are in the air. The production of civilian drones for rescue, transport, and leisure activity is booming. The Danish government, for example, proclaimed civilian drones a national strategy in 2016. Accordingly, many research institutions as well as the industry focus on the development, usage, and promotion of drone technology. These efforts often prioritize commercialization and engineering as well as setting up UAV (Unmanned Aerial Vehicle) test centers. As a result, urgent questions regarding how drone technology impacts our identity as humans as well as its effects on how we envision human society are frequently underexposed in these initiatives.

Our conference aims to change this perspective. By investigating cultural representations of civilian and military drones in visual arts, film, and literature, we intend to shed light on drone technology from a humanities point of view. This aesthetic “drone imaginary” forms not only the empirical material of our discussions but also a prism of knowledge which provides new insights into the meaning of drone technology for society today.

Several artists, authors, film makers, and thinkers have already engaged in this drone imaginary. While some of these inquiries provide critical reflection on contemporary and future drone technologies – for instance issues such as privacy, surveillance, automation, and security – others allow for alternative ways of seeing and communicating as well as creative re-imagination of new ways of organizing human communities. The goal of the conference is to bring together these different aesthetic imaginaries to better understand the role of drone technologies in contemporary and future societies.

 The focus points of the conference are:

–     Aesthetic drone imaginaries: Which images, metaphors, ethics, emotions and affects are associated with drones through their representation in art, fiction and popular culture?

–     Drone technology and its implications for society: How do drones change our daily routines and push the balance between publicity and privacy?

–     Historical perspective on drones: In what way do drone imaginaries allow for a counter-memory that can challenge, for instance, the military implementation of drones?

–     Drones as vulnerability: Do drones make societies more resilient or more fragile, and are societies getting overly dependent on advanced technologies?

–     Utopian or dystopian drone imaginaries: What dream or nightmare scenarios are provided by drone fiction and how do they allow for a (re)imagining of future societies?

–     Drones and remote sensing: In what way do drones mark a radical new way of seeing and sensing by their remotely vertical gaze and operative images?

–     Drone warfare: Do drones mark a continuation or rupture of the way we understand war and conflict, and how do they change the military imaginary?

The conference is sponsored by the Drone Network (Danish Research Council) and Institute for the Study of Culture at the University of Southern Denmark.

You can contact Kathrin at kamau@sdu.dk.

The conference website is here.