Twilight Grotesques, Aftermath Innocents
On directing violence through the unknown knowns and fogs of forever war.
Edited by Althea May Atherton
Before there was the fog of war, the fog of war was there. The fog descended the first time a group of humans picked up arms and planned violence against their neighbors, though it took until mass gunpowder warfare for the term to be coined. It is a fog durable at all times, a miasma of unknowing that shrouds coming violence. It is only in the aftermath that any of the truths hidden by the fog are revealed.
“Fog of war”, broadly speaking, is a catch-all term for the uncertain and unknowable in war. War is violence amidst uncertainty, and it is impossible to accurately talk about war without talking about failures of perception.
This is a story about the fog of war. It is, in unequal parts, a story about learning war through video games, about mass gunpowder, about the promise of a video-game-esque gun modification, and about a drone strike in Kabul.
Video Frames
I first encountered “fog of war” as a gameplay mechanic in Warcraft II. In game, the fog manifests first as the unrevealed parts of the map, blank and unknown. Once a player has explored with units, the mostly fixed terrain of the world remains visible, but only what is within the line of sight of player units is seen in real time. The fog falls where the player’s forces are not.
Video games, especially the strategy games that I devoured as a kid, incorporate the fog as a hidden information mechanic. It makes scouting meaningful and surprise possible, as players try to move outside of their enemies’ awareness.
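To make that mechanic concrete, here is a minimal sketch of the two-layer fog in a Warcraft-style game, written in Python purely for illustration; the grid size, the circular sight radius, and every name below are my assumptions, not anything drawn from an actual engine.

```python
# A toy two-layer fog of war: "explored" remembers every tile a unit has
# ever seen, while "visible" is recomputed each tick from current positions.
WIDTH, HEIGHT, SIGHT = 16, 16, 3

explored = [[False] * WIDTH for _ in range(HEIGHT)]
visible = [[False] * WIDTH for _ in range(HEIGHT)]

def update_fog(units):
    """Recompute visibility from scratch; exploration only ever accumulates."""
    for row in visible:
        for x in range(WIDTH):
            row[x] = False
    for ux, uy in units:  # units is a list of (x, y) positions
        for y in range(max(0, uy - SIGHT), min(HEIGHT, uy + SIGHT + 1)):
            for x in range(max(0, ux - SIGHT), min(WIDTH, ux + SIGHT + 1)):
                if (x - ux) ** 2 + (y - uy) ** 2 <= SIGHT ** 2:
                    visible[y][x] = True
                    explored[y][x] = True

def tile_state(x, y):
    """A tile is live, remembered terrain only, or still unknown."""
    if visible[y][x]:
        return "live"
    return "remembered terrain" if explored[y][x] else "unknown"

update_fog([(2, 2), (10, 12)])
print(tile_state(2, 2), tile_state(15, 0))  # "live unknown"
```

What the player sees is whatever tile_state returns; enemy units only render on tiles that come back “live,” which is the whole point of the mechanic.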
In simulations that aim for more realism than Warcraft, the fog can extend beyond what units can simply see. In R.U.S.E., a 2010 strategy game bound by the nations and technologies of World War II, units and buildings are visible unless players take steps to conceal their forces.1 These steps can include hiding the location of buildings under camouflage netting, adopting radio silence to conceal unit movements until they open fire, and scrambling radio signals so infantry appear as tanks and vice versa. All of these measures fall apart under the eyes of scouts, which is what makes scouting so important in the game.
The military doesn’t talk about scouting as much any more (certain missions of service notwithstanding), instead grouping the whole mess of piercing the fog of war under “ISR,” or “Intelligence, Surveillance, and Reconnaissance.”
It’s a messy acronym, folding in everything from spy work to public assessments of ship numbers and tank inventories to persistent drone flights to the kind of observations made by infantry with spotter’s scopes.
In Command: Modern Air/Naval Operations, the player gets to simulate that acronym-heavy approach to war, fighting over real-world oceans and shorelines with units whose attributes are meticulously sourced to the best publicly available information.
In this game, the fog exists at the limit of sensors. A simulated radar will ping a ship in one place, but by the time a patrol boat can confirm the contact visually or with its own radar, the vessel may have moved. Command is a game about not knowing as much as it is about making informed decisions, and success depends on knowing the limits of sensors as well as their strengths.
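As a rough way to see why stale sensor contacts matter, here is a toy calculation in the same spirit, though not taken from Command itself: treat the last radar fix as a point, and let the area the contact could now occupy grow with its top speed and the time since detection. The 30-knot speed and every name below are assumptions made for the example.

```python
import math

KNOTS_TO_NM_PER_MIN = 1 / 60  # one knot is one nautical mile per hour

def uncertainty_radius_nm(minutes_since_fix, max_speed_knots=30):
    """Farthest the contact could have traveled since it was last detected."""
    return minutes_since_fix * max_speed_knots * KNOTS_TO_NM_PER_MIN

def search_area_nm2(minutes_since_fix, max_speed_knots=30):
    """Area of the circle the contact could now occupy."""
    radius = uncertainty_radius_nm(minutes_since_fix, max_speed_knots)
    return math.pi * radius ** 2

# Ten minutes after a radar fix, a 30-knot vessel could be anywhere in
# roughly 79 square nautical miles; an hour later, in more than 2,800.
print(round(search_area_nm2(10)), round(search_area_nm2(60)))
```

The numbers are crude, but the shape of the problem is the point: every minute a sensor goes without a fresh detection, the fog thickens quadratically.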
Each of these games understates the potential for fog, even when they go to great lengths to model sensor limits and active deception. By offering perfect, real-time information about where every friendly unit is, games give players an omniscience generals could only dream of.
Literal Fog, Metaphorical Fog
It is hard to identify a specific worst era of war, but the gunpowder wars of the 14th through 18th centuries are among the worst. The battles, conducted by professional militaries, forced tens of thousands into close proximity while threatening them with far-reaching artillery. A general commanding forces in this era would know some of what his army could expect to face, reported from spies and scouts, but both of those sources are fallible in human ways.
When the armies met, blackpowder gunfire created smoke, obscuring the violence between the volleys. As best I can tell, this fighting-induced obfuscation was never specifically termed “fog of war.”
Instead, “fog” is one of several impediments to executing violence that appears in On War, the definitive 1832 tome by Carl von Clausewitz. Clausewitz, a Prussian veteran of the Napoleonic wars, was familiar with both clouded battlefields and uncertainty in war. On War mentions fog four times, though only once in a sense that approximates the modern usage.
“All action takes place, so to speak, in a kind of twilight, which, like fog or moonlight, often tends to make things seem grotesque and larger than they really are,” wrote Clausewitz. If military jargon-crafters had stuck to his wording, “twilight of war” would be the term used to describe one of many ways uncertainty leads to poor decision making and friction in combat.
Instead, “fog of war” as a coinage comes from 1896, not too long after newly patented smokeless powder eliminated the clouds of smoke generated by rifle fire. Lonsdale Hale, military correspondent for The Times and a colonel in the Royal Engineers, used the term to describe “the contrast between the difficulty encountered in obtaining information in war and the ease with which it is obtained in peace maneuvers.” This fog, Hale reportedly said, “descends over the whole theater of war, including the battlefields,” and hides not just foes from a commander but friends, too.
A tremendous amount of military research and technique is about precisely identifying where enemies are. An underrated amount of work goes into correctly identifying where friendly forces are, too.
Engineering An End To Fog
Last week, I wrote about an after-market gun modification for Popular Science. Produced by Israeli defense firm Elbit, the “Assault Rifle Combat Application System” (ARCAS) is built for weapons like the familiar M4 carbine. The ARCAS works as a camera-enabled gunsight and, thanks to the computer in its foregrip, promises a host of processing functions, like ballistic calculation or target identification. It is, like most tech stories built around press releases, about the desired capability far more than the actual delivered capabilities.
I still think it’s worth looking at the promise of a weapon system, even when what is delivered is all but guaranteed to fall short of that. In this instance, Elbit includes a little promotional video about what the ARCAS can deliver to an individual soldier, and to a squad of soldiers.
“We can lift the fog of war to know what lies ahead,” says the video’s narrator, “and we never march alone.”
As portrayed, the ARCAS lets a soldier clearly see an enemy illuminated against a backdrop, helpfully highlighted in the bright red of a video game display. The friendly force tracking comes with a squad of green figures, all presumably using the same rifle software system, working together to do violence against these unaware red sentries.
The video is a sales pitch, aimed at the politicians and acquisition officers who approve buying new rifle-mounted sensors. It doesn’t highlight the risks inherent in creating a networked system that reveals the locations of every friendly soldier on it. That information need not ever be exposed to change how the military has to plan around it.2
Because a commander can never know exactly what an enemy knows about their forces, the fog of war remains unliftable. So long as a human in charge of combat has finite ability to perceive what is happening, fog exists, masking the movements of people and machines.
This is true even in video games, even with total information revealed (by cheat codes or other means). A programmed-in fog of war can hide some portion of battle, but the greater limit is that a person playing a game can only observe so much at any given time, and everything that keeps happening beyond their perception still matters.
At best, the fog of war can be managed, not lifted. What is worse than planning around uncertainty and limitations is pursuing violence convinced such limits do not exist.
Signature, Struck
On August 29th, a missile fired from a US drone struck a car in Kabul. The strike killed ten civilians.
“We are still assessing the results of this strike, which we know disrupted an imminent ISIS-K threat to the airport,” said Bill Urban, spokesman for U.S. Central Command, in a press release sent out hours after the drone strike. Urban’s insistence that the strike was justified, made on behalf of the entire military, hinges on the premise that the threat was not just knowable but known, and that the US-fired missile stopped it.
On September 17th, General Kenneth McKenzie of CENTCOM held a press conference about the investigation into the aftermath of the strike.
“I am now convinced that as many as 10 civilians including up to seven children were tragically killed in that strike,” said McKenzie. “Moreover, we now assess that it is unlikely that the vehicle and those who died were associated with ISIS-K or were a direct threat to U.S. forces.”
Beyond its grating use of the passive voice, McKenzie’s statement came only a week after a New York Times investigation revealed the identities of the car’s occupants, including Zemari Ahmadi, an electrical engineer who for 14 years had worked for the Kabul office of Nutrition and Education International. Forensic analysis of the drone strike by The Washington Post undermined the military’s explanation of a car filled with explosives.
McKenzie’s timeline of events and the timelines established by the press line up in important regards: Ahmadi drove a white Corolla, made several stops, and picked up passengers and cargo. Where the pre-launch assessment differed from the now-known truth of the strike is in the nature of the passengers, the packages, and the threat. The military had the car under surveillance for eight hours, watching through the cameras of as many as six MQ-9 Reaper drones.
This abundance of information is deceptive. We now know it was built on false premises, like misreading a residence as a safehouse for ISIS-K. We only know that because rigorous reporting, done in public, compelled the military to admit its error.
These errors have long plagued the entire targeted killing program, which used drones and other means to track, hunt, and kill humans based on private assessments of risk. The drone war, as the targeted killing program is largely known, generated dead bodies and good headlines, with follow-ups admitting error buried in later investigations if carried out at all.
We know the death of Ahmadi so clearly because the media was present, because the strike was highly visible in an urban area with abundant cameras and internet access, and because the strike itself was publicized immediately. Much of the war by air over Afghanistan took place far from cameras, far from accountability, far from apologies.
The cameras on a Reaper can read a license plate from 50,000 feet below where it is flying. The communications links between the Reaper and human operators, half the globe away, allow for near-real-time analysis of the video recorded. This tech, impressive now, is scheduled for regular upgrades, forever incrementally improving on what can already be done.
It is not enough, it will never be enough, to cut through the Clausewitzian “twilight of action,” to reveal things that seem “grotesque and larger than they really are” in their normal size, without the heightened monstrosity of fear warping perceptions.
War is about decisions in the face of uncertainty, and in the press conference, McKenzie repeatedly emphasized that though this strike was clearly in error, it was part of a range of decisions that mostly produced the desired outcome for the US, which was a withdrawal from Afghanistan.
Without declassified assessments or extensive public investigation, it is impossible to know if McKenzie is right about the other actions taken. Even with that, the intentions and movements of armed groups exist beyond surveillance. The fog of war sits heavy over the last twenty years.
I am drafting this on September 18th, two decades to the day since George W. Bush signed the 2001 AUMF and ushered in the durable legal apparatus of forever war.
Under this fog of forever war, deaths like Ahmadi’s are common, while investigations revealing the murderous error are not. No amount of technology will fix the uncertainty of war, despite the promises of arms makers.
Thank you all for reading this. I appreciate every time I get to sink into a topic like this, and learn in the process of writing.
If you’re looking for a wordy reflection on the tech that 9/11 built, and in turn how Silicon Valley was built by war tech, I was a guest on two episodes of the This Machine Kills podcast this month. The first is available for anyone, while the second is just for Patreon backers for now, but you can listen to a preview.
In addition, may I recommend checking out what we’re doing over at Discontents? This month I really appreciated the entry from Gaby Del Valle of BORDER/LINES, about the grim continuity of border cruelty from Trump to Biden.
The maps in R.U.S.E. are also symmetrical, and when zoomed out every game appears to take place on a table in a war room. R.U.S.E. retains the major players of WWII as factions, though it is much more a game about seeing whether Italian or Japanese or French tanks are better than it is any meaningful exploration of the war.
A friendly unit tracker is sometimes referred to as a “blue force tracker,” after the military jargon of blue for one’s own side, green for allies, and red for enemies. The danger of using a “blue force tracker” is that the enemy might gain access to the same information, and thus also know the exact location of everyone they are trying to defeat.
This might feel out of left field, but the anime Attack on Titan does a really good job of illustrating the fear, paralysis, and disjointed battlefield action that happens when the fog of war is thick. It came to mind a few times reading this. Also, re: WC2...For the Horde!