I’ve joked a few times now that algorithms are just other people’s abstractions.

What I mean is that they are always other people’s abstractions, never your own: your own abstractions are simply how you objectively arrived at the solution.

If algorithms are, as Nick Seaver says, “collections of human practices that are in interaction with each other”, then much of the angst around the algorithms of the Fates comes from seeing beliefs and biases reflected in them that are alien or hostile to us. There’s a certain kind of violence in the weight of an entire system that disagrees with you, one that will be familiar to the disenfranchised.

If hell is other people, then what is other people crystallized at the titanic scale that our social platforms operate on?


There was a moment during the ’90s networking of personal computers when people in tech shifted their posture toward security. If you developed a computing platform with vulnerabilities, you could not simply throw up your hands and lay blame at the feet of those doing the exploiting. You had to assume that where there was motive, financial or political, there would be someone ready to take advantage, and it was on the platform makers to find and plug those holes faster than others could exploit them.

What I’m saying is, we’ll know that platforms like Facebook or Twitter or YouTube are serious about protecting their users when they have Red Teams for the abuse of their algorithms.