Interesting write-up on Vox about the political science of Facebook, and how it has been designed to avoid governance and accountability:
…Zuckerberg claims that precisely because he’s not responsible to shareholders, he is able instead to answer his higher responsibility to “the community.”
And he’s very clear, as he says in interview after interview and hearing after hearing, that he takes this responsibility very seriously and is very sorry for having violated it. Just as he’s been sorry ever since he was a first-year college student. But he’s never actually been held responsible.
I touched on this in my RSA presentation about driverless cars several years ago. My take was that Facebook's management model is a regression of many centuries (pre-Magna Carta). Their primitive risk-control concepts, and their executive team's opposition to modern governance, put us all on a path toward global catastrophe from automated systems, akin to the Cuban Missile Crisis.
I called it “Dar-Win or Lose: The Anthropology of Security Evolution”
It is not one of my most watched videos, that’s for certain.
It seems that, over the years, talks where I frame code as poetry and AI security failures as an ugly performance garner far more attention. If the language all programmers know best is profanity, who will teach their machines manners?
Meanwhile, my references to human behavioral science to describe machine learning security, such as this one about anthropology, fly below the radar (pun intended).