Answer: To crash on the other side.
Recently I was honored and privileged to give a keynote at the MindTheSec 2021 conference. Here are my slides:
This one seems to be especially popular:

And here are the numbers they shared with me afterwards:

A while ago I wrote about Stanford’s role in genocide, as well as Polk’s.
The California Historical Society has a webinar coming up on November 30th with more details: “Truth & Resistance: Mapping American Indian Genocide in San Francisco”
The American Indian Cultural District (AICD) in San Francisco is undertaking a project called Mapping Genocide to examine the intentional erasure of American Indian history and contributions. AICD’s Co-founder and Executive Director Sharaya Souza (Taos Pueblo, Ute, Kiowa) and Director of Community Development & Partnerships Paloma Flores (Pit River, Purhepecha) will discuss some of the individuals San Francisco has chosen to honor and their role in American Indian genocide. The panelists will also talk about how you can help create resistance against the systemic erasure of American Indian history throughout San Francisco.
Ivermectin research is plagued with data integrity failures, raising an important question for security and privacy professionals: what better data control options are available?
The latest news seems right on track to demand interoperability from technology that facilitates more individually controlled patient data stores:
“…calling for scientists to adopt a new standard for meta-analyses, where individual patient data, not just a summary of that data, is provided by scientists who conducted the original trials and subsequently collected for analysis”.
In other words, using the Solid protocol would enable patients to participate in a consensual study by opening access to their own data for research, while still maintaining the highest possible data integrity.
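As a rough sketch of my own (not from the article), patient-consented access with Solid could look something like the following, using Inrupt’s open-source @inrupt/solid-client libraries; the Pod URL, researcher WebID, identity provider, and client credentials below are all hypothetical placeholders:

```typescript
// Sketch: a patient grants a researcher read-only access to trial data
// stored in the patient's own Solid Pod. All URLs, WebIDs, and credentials
// below are hypothetical placeholders.
import { universalAccess } from "@inrupt/solid-client";
import { Session } from "@inrupt/solid-client-authn-node";

async function grantStudyAccess(): Promise<void> {
  // The patient (or an app acting on their behalf) logs in to their own
  // identity provider; the data itself never leaves the patient's Pod.
  const session = new Session();
  await session.login({
    oidcIssuer: "https://login.example-pod-provider.org",
    clientId: "registered-client-id",
    clientSecret: "registered-client-secret",
  });

  // Grant read-only access to one resource for one researcher's WebID,
  // keeping write and control rights with the patient.
  const granted = await universalAccess.setAgentAccess(
    "https://patient.example-pod-provider.org/health/trial-results.ttl",
    "https://researcher.example-university.edu/profile/card#me",
    { read: true, write: false, append: false },
    { fetch: session.fetch } // authenticated fetch proves the grant comes from the patient
  );

  console.log("Access now granted to researcher:", granted);
}

grantStudyAccess().catch(console.error);
```

Revoking consent is the mirror image: the same call with read set back to false.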
Saying “accuracy is still bad” has been the defining security story of the 2010s and now the 2020s as well… seriously holding back the usefulness of technology by undermining knowledge.
Integrity is lacking innovation and needs a completely new approach; it’s way too far behind where we are in terms of confidentiality and availability control engineering.
This paragraph caught my attention, as I’ve been trying to shift the discussion from surveillance to debt capitalism.
What does this algorithm-industrial complex look like and who is involved?
Perhaps our first glimpse of the catastrophic impact of algorithms on civil society was robodebt, deemed unlawful by the Federal Court in a blistering assessment describing it as a “massive failure in public administration” of Australia’s social security scheme.
Very well said.
The algorithm-industrial complex is characterised by a power and skills distortion: public sectors gutted of skills, and the influence of and outrageous expenditure on outsourcing, tech, and consultants.
…what defines the algorithm-industrial complex is the emergence of policy which can only be executed via algorithms.
That’s not quite right, since humans have used algorithms for thousands of years, but I get the point.
One of Khwārezm’s most famous residents was Muhammad ibn Mūsa al-Khwarizmī, an influential 9th century scholar, astronomer, geographer, and mathematician known especially for his contributions to the study of algebra. Indeed, the latinization of his name, which meant ‘the native of Khwārezm’ in Persian, gave English the word algorithm. He wrote a book in Arabic about Hindu-Arabic numerals; the Latin translation of the book title was Algoritmi de numero Indorum (in English Al-Khwarizmi on the Hindu Art of Reckoning).
Although the word algorithm can be traced to this man’s name in ancient Baghdad, it’s really the even more ancient Babylonians who started using algorithms 4,000 years ago, as computer scientist and mathematician Donald E. Knuth wrote in Ancient Babylonian Algorithms, Communications of the ACM 15, no. 7 (July 1972) 671-77:
One of the ways to help make computer science respectable is to show that it is deeply rooted in history, not just a short-lived phenomenon. Therefore it is natural to turn to the earliest surviving documents which deal with computation, and to study how people approached the subject nearly 4000 years ago. […] The calculations described in Babylonian tablets are not merely the solutions to specific individual problems; they are actually general procedures for solving a whole class of problems.
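To make the distinction concrete between a solved example and a general procedure, here is a small modern sketch of my own (not taken from Knuth’s paper) of the divide-and-average square-root method often attributed to Babylonian mathematicians:

```typescript
// Divide-and-average square-root approximation, a numerical procedure
// often attributed to Babylonian mathematicians. The point is not one
// answer to one problem but a recipe that solves the whole class: the
// same steps work for any positive input.
function babylonianSqrt(n: number, iterations: number = 6): number {
  if (n <= 0) throw new RangeError("n must be positive");
  let guess = n / 2;                  // any positive starting guess works
  for (let i = 0; i < iterations; i++) {
    guess = (guess + n / guess) / 2;  // average the guess with n / guess
  }
  return guess;
}

// One general procedure replaces a table of memorized individual answers.
console.log(babylonianSqrt(2));   // ≈ 1.4142136
console.log(babylonianSqrt(17));  // ≈ 4.1231056
```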
The problem today thus isn’t someone or something following instructions; it’s how people centralize instructions to be brittle and dictatorial (concentrating wealth through high exit barriers) instead of making them flexible enough to embrace the compromise of representative democracy.
Hitler’s devout followers are not much different from a descendant of those followers who sets up companies built on self-serving algorithms. So the real danger is policy dependent on an intentionally monopolistic (fascist) model of proprietary technology, such as Palantir.
David Hume warned of exactly this in the 1700s, which is very late when you think about how old algorithms really are.