Labor, Force Revisited
A look back at Tomorrow Wars, a look ahead at reader-directed coverage for Wars of Future Past
A targeting algorithm is a piece of code with two humans of indeterminate combatant status on either end of it.
Okay, that’s a clumsy adaptation, but in the spirit of Labor Day, I wanted to capture something of the old saw that ‘a bayonet is simply a tool with a worker on either end of it’.
While the relationship in combat between uniformed people remains largely the same, much of the design of the weapons of future war is about creating distance and obscurity between the making of weapons, the use of weapons, and the kind of people who die from those weapons.
Roughly a year ago, I wrote the sixth issue of Tomorrow Wars, and to match the holiday, I themed it “Labor, Force.” Topically, my piece was tied to the public debate over whether Silicon Valley would, in the face of worker protest, be willing to continue coding weapons or weapon-adjacent tools for the military. Workers at both Google and Microsoft authored and published petitions, and the companies have responded in part by explaining what kinds of collaboration are and aren’t acceptable.
Here is some of what I wrote at the time:
The moral dimensions of this debate are beyond the scope of this newsletter, but there is a fundamental challenge of technology that sets the moral conflict in place. Both Project Maven and HoloLens adapt image processing tools built for the commercial market to military ends, albeit with specific labor from workers making the adaptation possible. Without the skilled workers familiar with these projects, the technology would not exist at present, much less exist specifically for the military.
Keeping within the bounds of Tomorrow Wars, I focused especially on the way dual-use technologies complicate clear divisions between what is and isn’t technology built for war.
For one more recent example, consider the phenomenon of the selfie drone. A bigger part of the burgeoning commercial drone scene around 2015 than they are today, “selfie drones” were pitched as a very fancy version of a selfie stick. With a push of a button, they fly a distance away, track the human operator, and deliver a dramatic angle on a shot.
As the tech goes, it is innocuous until it maybe isn’t. Cheap robots that can follow people read as ominous, though the limits on flight time for electric quadcopters are still very real. But with a tweak to the distance and a little bit of improvement in the machine, a small drone that tracks a person moves from “a neat camera” to “a threat.”
That threat looks different if a uniformed military adapts the cheap drone into a weapon, if an irregular force does, or if it’s a lone person doing weird violence aided by robots. The technology facilitates all of the above, as well as perfectly fine selfies. The tech isn’t so much neutral as it is versatile, and so any story about the design, use, and potential of the tech should include the range of possibilities, and some nod toward who, at what point, is responsible for what it does.
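To make that versatility concrete, here is a minimal, entirely hypothetical sketch of the “follow me” loop at the heart of a selfie drone. The names and numbers are mine, not any vendor’s actual API, and real drones fuse GPS, vision, and inertial data in ways this glosses over. The point is that nothing in the control logic itself distinguishes a flattering camera angle from a pursuit; only the standoff parameter, and who sets it, changes.

```python
# Hypothetical sketch of a "follow me" control loop, simplified on purpose.

from dataclasses import dataclass

@dataclass
class Track:
    distance_m: float    # estimated range to the tracked person, in meters
    bearing_deg: float   # angle of the person off the drone's nose; negative = left

def follow_step(track: Track, standoff_m: float, gain: float = 0.5) -> tuple[float, float]:
    """Return (forward_speed, yaw_rate) commands that close on the standoff
    distance while keeping the tracked person centered in frame."""
    forward_speed = gain * (track.distance_m - standoff_m)  # positive = move closer
    yaw_rate = gain * track.bearing_deg                     # turn toward the person
    return forward_speed, yaw_rate

# "Selfie" mode: hold a photogenic three-meter standoff.
print(follow_step(Track(distance_m=5.0, bearing_deg=-10.0), standoff_m=3.0))

# The same function, re-parameterized to close the distance entirely,
# is the skeleton of something much less friendly.
print(follow_step(Track(distance_m=50.0, bearing_deg=-10.0), standoff_m=0.0))
```

The versatility lives in the parameters and the hands on the controller, not in the code itself.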
The work of designing technology, and of writing about technology, should come with some discomfort about possible consequences. That is all the more true for weapons, though the essence of dual-use tech is that it is hard to know how it may ultimately be used.
It is also why, I think, it is so important to cover worker voices in the process where I can, and to at least record worker discontent with the malicious end uses of the fruits of their labor. A company may wipe its hands clean of code that kills, and may have a legal team ensure it never faces public sanction, but the worker who wrote the code still has to live with any consequences.
So far, public worker unease over coding weapons has hardly diminished tech executives’ interest in contracting with the military. It remains to be seen, as some of the same technologies built for war are adapted to repressive purposes at home, whether that unrest can remain quieted by the gradual roll-out of ethical principles for AI.
Thanks, all of you, for subscribing to the paid special features of Wars of Future Past. I know the economy is in shambles for tens of millions right now, and anything you have contributed is deeply appreciated. Your support here not only makes it possible for me to write these newsletters, but also helps sustain me and my family through my freelance career more broadly.
To help make this not just a one-way exchange, I’ve turned on comments for paid subscribers on all my newsletters. The usual norms of commenting apply: to the extent that the platform allows me to moderate comments, I will, to keep this a space of worthwhile and respectful conversation. Within those bounds, you can now comment directly on the post and share what you think of what I’ve written. I hope to learn from you all.
Finally, I want to know what I can offer to make your paid subscriptions feel worthwhile.
I’ve created a short survey in Google Forms for paid subscribers. Please, fill it out so I can better make the kind of newsletter you want to sustain and support. You can take the survey here: https://forms.gle/96Q9Cksc1r2TurAE9.
In addition, if you want to email me directly about anything related to the newsletter, you can do so at warsoffuturepast@gmail.com.
Thank you, again, and may your alienation from your labor always be well compensated.