Selected pieces of a twitter thread by Yonatan Zunger, posted verbatim here because Twitter is awful for long-form text:
I didn’t come up in computer science; I used to be a physicist. That transition gives me a rather specific perspective on this situation: that computer science is a field which hasn’t yet encountered consequences.
Chemistry had two reckonings, in the late 19th and early 20th centuries: first with dynamite, and then with chemical weapons. Physics had its reckoning with the Bomb. These events completely changed the fields, and the way people come up in them.
Before then, both fields were dominated by hope: the ways that science could be used to make the world a fundamentally better place. New dyes, new materials, new sources of energy, new modes of transport; everyone could see the beauty.
Afterwards, everyone became painfully, continuously aware of how things could be turned against everything they ever dreamed of.
I don’t know the stories from chemistry as well. In physics, I can tell you that everyone, from their first days as an undergrad (or often before), encounters this and wrestles with it. They talk about it in the halls or late at night, they worry about it.
For a long time, it frightened me that biology hadn’t yet had this moment of reckoning — that there hadn’t yet been an incident which seared the importance of ethics and consequences into the hearts of every young scientist. Today, it frightens me more about computer scientists.
Young engineers treat ethics as a speciality, something you don’t really need to worry about; you just need to learn to code, change the world, disrupt something. They’re like kids in a toy shop full of loaded AK-47’s.
The hard lesson which other fields had to learn was this: you can never ignore that for a minute. You can never stop thinking about the uses your work might be put to, the consequences which might follow, because the worst case is so much worse than you can imagine.
Short postscript: As several people have pointed out, many fields of biology have had these reckonings (thanks to eugenics and the like), and civil engineering did as well, with things like bridge collapses in the late 19th century.
Civil engineering responded to this by developing codes of ethics and systems of professional licensure which shape it to this day. I’ve been wondering about this a lot, recently: whether we should be doing the same in CS.
That is, ethical codes with teeth, and licensing boards with the real ability to throw someone out of the profession, the way boards can in engineering, medicine, or law.
The university I attended didn’t teach an ethics course as part of the Computer Science program. I’ve heard others describe such classes as an easy GPA boost for people who should instead be off learning how to build compilers, or network infrastructure, or some other Hard Thing. I don’t know why we programmers are such suckers for technical self-flagellation, especially when there’s so much moral self-flagellation to be had.
Zunger is right—software developers are writing the script that the future will run on, and we’re doing so while asleep at the wheel. Between Equifax, Cambridge Analytica, and whatever major breach happens next¹, we’ve set the world up to be taken advantage of by whichever player is willing to be the most malicious.
We can’t expect legislators to fix this, because they’ve shown that they don’t understand the technical side. We can’t rely on the market to fix this, because we’ve taught the world that software should be free, and so we desperately need advertising dollars (and all the tracking that comes with them) to keep the industry afloat. We can’t rely on a professional standards body to fix this, because there’s no way to keep a motivated kid from learning to code—and let’s be honest here, most of us were that kid at some point.
So that leaves us. We’re going to have to fix this mess on our own.
¹ Notice that it isn’t controversial when someone suggests there are more serious breaches coming?