A response to Wendell Wallach

After attending Wendell Wallach's talk at UW on Feb. 10, I felt compelled to write a response. Not because I think that ethicists and social scientists should refrain from engaging in meaningful dialogues with engineers and scientists with the goal of a more ethical research trajectory, but because I am concerned with the language in which he couched his arguments and concerns.

Much like Stephen Hawking’s recent comments on the potential consequences of artificial intelligence research, Wallach’s talk revolved around the potential for future technologies to turn on their masters and usher in an apocalypse composed of grey goo, malevolent drones, and designer diseases.

Their solution to these future problems is what they refer to as “value alignment.” Simply put: imbuing machines with our own ethical sensibilities. This strikes me as odd considering the wild abandon with which we — as sentient, ethical beings — have engaged with technology. Framing the damaging consequences of our relationship with technology as only a potentiality — something that might confront us in the future — is a staggeringly ethnocentric claim. It is as though we are meant to see our current technological milieu as an epoch of unbridled success and innovation that will endure until the karmic balance of the universe shifts and we are confronted with global annihilation. 

Nowhere in his talk did Wallach mention the unequal scales along which the benefits of technological innovation are currently doled out. He did not mention how the drones (which may turn on us!) currently bomb and burn the bodies of the world’s poor and undesired. He did not mention how innovations in rapid disease testing are used to enforce bio-security measures along borders and make decisions about whose life is worth saving or even living. He did not discuss how the grey goo of the future exists now in our oceans, waterways, and in the bloodstreams of babies in Flint, Michigan, and around the world.

I think this discourse about the future of technology and the fear tied to it exists because we see our own reflection in the spectre of death lurking in the global periphery and within our own nations, among persons of colour and indigenous peoples. I think it is a way of sublimating our concerns about the present and projecting them onto a fictional landscape while still maintaining the self-congratulation that comes with having coordinated a preemptive intervention. I think that it’s high time we spoke about it now.

Brian Schram

PhD Student, Sociology and Legal Studies