Did Shannon Alone Invent Our Future?

There’s an important bit of knowledge buried in an interesting new article about the history of modern communication:

A portmanteau of “binary digit,” a bit could be either a 1 or a 0, and Shannon’s paper is the first to use the word (though he said the mathematician John Tukey used it in a memo first).

Shannon was clearly reporting on working alongside others and sharing attribution. Ironically, however, the author of the article opens by building a narrative of wide communication around a single person:

Mathematics searches for new theorems to build upon the old. Engineering builds systems to solve human needs. The three disciplines are interdependent but distinct. Very rarely does one individual simultaneously make central contributions to all three — but Claude Shannon was a rare individual. …more than 70 years ago, in a single groundbreaking paper, he laid the foundation for the entire communication infrastructure underlying the modern information age.

It reads to me as though the person asking us to celebrate the importance of communication links being simplified and standardized (to bridge any and all individuals together) is at the same time trying to create a super-human myth.

Was Shannon rare, or was he just a natural progression of an old and well-known theory that groups achieve more by working together and staying humble about the steps already made?

Take for example this analysis:

His theorems led to some counterintuitive conclusions. Suppose you are talking in a very noisy place. What’s the best way of making sure your message gets through? Maybe repeating it many times? That’s certainly anyone’s first instinct in a loud restaurant, but it turns out that’s not very efficient. Sure, the more times you repeat yourself, the more reliable the communication is. But you’ve sacrificed speed for reliability. Shannon showed us we can do far better.
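To be fair to the math behind that quote, the speed-versus-reliability tradeoff it describes is real and easy to sketch. Here is a minimal Python simulation of my own (an illustration, not anything from Shannon’s paper): a 3x repetition code spends 3 channel bits per data bit, while a Hamming(7,4) code, published by Richard Hamming shortly after Shannon’s paper, corrects any single flipped bit per block while spending only 7 channel bits per 4 data bits.

```python
import random

def bsc(bits, p, rng):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def rep3_encode(bits):
    """Repeat every bit three times (rate 1/3)."""
    return [b for b in bits for _ in range(3)]

def rep3_decode(bits):
    """Majority vote over each group of three (corrects one flip per group)."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit Hamming codeword (rate 4/7)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Use the syndrome to locate and fix at most one flipped bit, return the data."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3  # 1-indexed error position; 0 means no error
    if pos:
        c[pos - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

if __name__ == "__main__":
    rng = random.Random(42)
    p = 0.05  # channel flip probability (an arbitrary choice for this demo)
    data = [rng.randint(0, 1) for _ in range(40_000)]
    rep_out = rep3_decode(bsc(rep3_encode(data), p, rng))
    ham_out = []
    for i in range(0, len(data), 4):
        ham_out += hamming74_decode(bsc(hamming74_encode(data[i:i + 4]), p, rng))
    err = lambda out: sum(a != b for a, b in zip(out, data)) / len(data)
    print(f"rate 1/3 repetition: residual error {err(rep_out):.4f}")
    print(f"rate 4/7 Hamming:    residual error {err(ham_out):.4f}")
```

The printed residual error rates depend on the flip probability and the seed; the point is only that codes better than naive repetition exist, and Shannon’s theorem goes further, proving that sufficiently long codes can approach channel capacity with vanishing error.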

Sorry, but I don’t know anyone who thinks repeating the same message in a noisy place is the first instinct, nor that it makes communication more reliable. Quite the opposite, in fact: I know people who hate repeating messages and wisely give up after just one or two attempts fail.

What if his conclusions were more reflections of reality? What if his big contribution was to formalize and make acceptable things already known and practiced, codifying them in a way most easily digested by the communities he served?

And most importantly, perhaps, what if he thought the lack of fame and outsized reward for his work wasn’t such a bad thing at all? As the founder of the Internet precursor ALOHAnet purportedly once said, “I was too busy surfing to worry about that stuff.”

Another example is my earlier post on attempts to pin down a single inventor of the Roland TR-808.

Note how this plays out in a 2013 article about the commonality of humans combining things together, just like Shannon:

Alive and awake to the world, we amass a collection of cross-disciplinary building blocks — knowledge, memories, bits of information, sparks of inspiration, and other existing ideas — that we then combine and recombine, mostly unconsciously, into something “new.” From this vast and cross-disciplinary mental pool of resources beckons the infrastructure of what we call our “own” “original” ideas. The notion, of course, is not new — some of history’s greatest minds across art, science, poetry, and cinema have articulated it, directly or indirectly, in one form or another: Arthur Koestler’s famous theory of “bisociation” explained creativity through the combination of elements that don’t ordinarily belong together; graphic designer Paula Scher likens creativity to a slot machine that aligns the seemingly random jumble of stuff in our heads into a suddenly miraculous combination; T. S. Eliot believed that the poet’s mind incubates fragmentary thoughts into beautiful ideas; the great Stephen Jay Gould maintained that connecting the seemingly unconnected is the secret of genius; Gutenberg’s invention of the printing press embodied this combinatorial creativity; even what we call “intuition” is based on the unconscious application of this very mental faculty.

Of course, some cultures still can’t resist trying to focus credit onto one person, so that 2013 article also tries to make it seem as if Einstein’s version was the best:

The concept, in fact, was perhaps best explained by Albert Einstein, who termed it “combinatory play.” (Einstein famously came up with some of his best scientific ideas during his violin breaks.)

To be fair that’s giving credit to Einstein for working so hard at combinatory play that he can explain it well to others.

For a different take on credit and combinatory play as innovation, perhaps take into consideration how an ancient African culture was so successful for hundreds of thousands of years.

When a young man kills much meat, he comes to think of himself as a chief or a big man – and thinks of the rest of us as his servants or inferiors. We can’t accept this … so we always speak of his meat as worthless. This way, we cool his heart and make him gentle.

In other words, a young hunter who killed big meat would face insults when presenting it to those who would be eating it. Major credit went instead to the almost random person who had delivered the arrow (hunters swap arrows before the hunt), for example.

Leisure and innovation were prized, not infinite aggressive aspiration. Centralized credit was disfavored, given the emphasis on inter-communication and collaboration.

Some psychologists now describe such selfish attributes as a function of being disrupted, such that technology may create a domain shift that manifests in “a new selfishness, and ultimately to hierarchical societies, patriarchy and warfare”.

Are Coders Poets?

Everyone at some point reaches the obvious conclusion that putting keyboard to screen (pen to paper, brush to parchment, chisel to wood and marble, etc.) is very similar across disciplines.

For example, a long new article asks us whether coders are similar to poets:

I considered that, despite their difference in earnings, poets and coders followed similar processes in their work, playing with images and symbols to make something happen.

The problem in this article is that “make something happen” is a false equivalence.

That’s like asking whether a graphic designer on contract is making something the same way an unconstrained artist is.

Both are making something happen, yet one is tasked with a particular outcome on a particular schedule for someone else and the other can make whatever they like.

Does that difference in inherited versus controlled outcomes matter?

Of course! Who is that “something” for?

Unfortunately the article’s poet author summarizes it in a manner that misrepresents both poets and coders:

Poets aspire to use language to uncover intention and surprise, both secrets and revelation. Code, on the other hand, sticks to the program, arriving at a predicted end no matter what innovations have led there.

Consider that early in the learning phase a student sticks to the program… and only later, after mastering the predicted end (meeting a teacher’s lesson plan, like hitting a product manager’s backlog target), do both advanced poets and coders use their language to uncover secrets and revelation.

This adherence to a plan is, I realize, somewhat of a contradiction to the famous writings of H.P. Lovecraft and his statement:

Our amateurs write purely for love of their art, without the stultifying influence of commercialism.

Amateurs are not so pure, it can be said, hopefully for obvious reasons. Lovecraft seems to have underestimated modern commercialism. Some may choose to be poets or coders because they see others being successful and seek similar ends, whether for social entry, influence, money, etc.

Consider also that inherited systems imply someone can be judged right and wrong, whereas controlled systems can never be wrong. Big differences between people operating in one versus the other, whether coders or poets.

Support for Trump is the KKK’s Lost Cause

Update January 2021: “Robert E. Lee and Me: A Southerner’s Reckoning with the Myth of the Lost Cause”

“The Lost Cause” is the odd phrasing of white nationalists who believe they should still be allowed to continue their mission of slavery and genocide in America, yet don’t want to say so openly.

The Cult of the Lost Cause had its roots in the Southern search for justification and the need to find a substitute for victory in the Civil War. […] The propaganda the Lost Cause adherents were peddling was not only benign myth, it was a lie that distorted history, sought to rationalize lynching, and created a second class of citizenship for African-Americans.

Losers writing history need a “substitute for victory”.

Hate groups can’t just come out and say they believe in racist violence (without facing massive opposition and ridicule), so instead they fight unfairly by cooking up complex conspiracies that paint themselves as victims; they falsely narrate “law and order” claims and spin complex plot twists demanding America be run only by “their” man (Trump).

Psychologists suggest that not everyone who believes in conspiracies is alike, as some are affected more by closed-mindedness than others:

…psychology research has shown greater degrees of certain cognitive quirks among those who believe in conspiracy theories—like need for uniqueness; needs for certainty, closure, and control; and lack of analytical thinking. But the best predictor of conspiracy theory belief may be mistrust, and more specifically, mistrust of authoritative sources of information.

Certainty and control seem like what President Nixon was aiming for when he used phrases such as “War on Drugs”, “Interstate Highway System” and “Urban Renewal” instead of saying he planned to start an endless race war (which historians since have clearly documented, if you follow the links I just provided).

This goes back even earlier in America. When white supremacists in the early 1900s sought to murder someone, they didn’t say so overtly; instead they framed it as possession of an “illegal” substance to cook up a “legality” for racism:

The Oregon chapter began when the Klan salesman, Luther Powell, arrived from California looking for new recruits. He sized up the state of affairs in Oregon and decided he would make the lax enforcement of prohibition his first issue. Anti-Catholicism would later prove more productive, but for Powell’s first organizational meeting the prohibition issue was good for 100 new Klansmen, including lots of policemen.

This is the background for those believing that a complex series of “unfair” events is “evidence” that a democratic election was won by their unpopular white supremacist leader, despite incontrovertible proof to the contrary:

…appeal of this conspiracy theory to racists isn’t subtle. It’s a way to deny the legitimacy of Black voters without coming right out and saying it. This isn’t just a conspiracy theory about Trump’s fragile ego. It speaks directly to long-standing right-wing fury at minority voting rights. Historian Jeffrey Herf notes another historical precedent at play, comparing Trump’s conspiracy theory to the ones that rose up in Germany between the first and second world wars…

You might have also noticed that both Woodrow Wilson and Trump ran campaigns of “America First”, which is no coincidence. “The Lost Cause” was a KKK political platform that manifested under Wilson’s “America First” banner. Today it represents basically the same platform (remove blacks from government, create a nation ruled by white men only).

Ten U.S. Army bases are still named in honor of Confederate generals. Donald Trump has strenuously resisted any effort to rename these bases, saying that they are “part of a great American heritage.” But what heritage are they commemorating exactly?

Naming these bases was one of the crowning achievements of those who sought to perpetuate the Lost Cause. A revisionist history that gained popularity in the 1890s, the Lost Cause recast the Confederacy’s humiliating defeat in a treasonous war for slavery as the embodiment of the Framers’ true vision for America. Supporters pushed the ideas that the Civil War was not actually about slavery; that Robert E. Lee was a brilliant general, gentleman, and patriot; and that the Ku Klux Klan had rescued the heritage of the old South, what came to be known as “the southern way of life.”

A principal goal of the Lost Cause was to reintegrate Confederate soldiers into the honorable traditions of the very American military they had once fought against. Members of the Lost Cause movement had lobbied to have newly built military bases named after Confederate generals several times without success. But during Woodrow Wilson’s second term as president, they found a more hospitable reception. Thanks to Wilson, the Lost Cause ideology came fully into the mainstream, reaching the apex of its influence as America entered the First World War.

In other words, immediately removing Confederate general names from Army bases would help stop this madness of the white nationalists in America and their Lost Cause revisionism.

Hopefully someone like the American hero Silas Soule would have his name on a base instead, as I find few seem to have heard of him despite his amazing life and service to his country.

In this day and age it should not be hard to argue that slavery and genocide are wrong, yet the Trump family are a symptom of Americans who think of ways to bring them back… (how many Americans have died from COVID-19, and was it not an act of genocide?).


Those Americans of the Lost Cause ilk who long for a return to their causes of slavery and genocide are now peddling conspiracy theories as their political ticket back to power.

A Frederick Douglass speech of May 30, 1871 comes to mind, eloquently destroying the “Lost Cause” as the wrong side, in opposition to the Right Cause.

But we are not here to applaud manly courage, save as it has been displayed in a noble cause. We must never forget that victory to the rebellion meant death to the republic. We must never forget that the loyal soldiers who rest beneath this sod flung themselves between the nation and the nation’s destroyers. If today we have a country not boiling in an agony of blood, like France, if now we have a united country, no longer cursed by the hell-black system of human bondage, if the American name is no longer a by-word and a hissing to a mocking earth, if the star-spangled banner floats only over free American citizens in every quarter of the land, and our country has before it a long and glorious career of justice, liberty, and civilization, we are indebted to the unselfish devotion of the noble army who rest in these honored graves all around us.

And also let’s not forget when the Senate voted unanimously to simply expel its members who joined a pro-slavery rebellion against it.

January 10, 1862, the Senate voted unanimously to expel Missouri’s two senators, Waldo Johnson and Trusten Polk, for “sympathy with and participation in the rebellion against the Government of the United States.”

Social Media Protocol for Evidence of War Crimes

Berkeley Law has announced the first global protocol to aid in using social media for war crime investigations.

Launched in 2016, the center’s Investigations Lab is the first university-based open source investigations unit to discover and verify human rights violations and potential war crimes — with over 75 students who collectively speak 30 languages contributing verified information to international NGOs, news organizations, and legal partners. As the lab’s prominence and role helping human rights lawyers collect and analyze information from social media grew, more questions flowed in from organizations worldwide.

“We couldn’t answer most of them, including specifics about how user-generated content should be collected and stored to ensure its utility in future legal proceedings,” [HRC Executive Director Alexa] Koenig says. “We realized we weren’t alone in struggling to make sense of this relatively new field of practice and that we needed to meet the moment.”