First, Israel has confirmed using drone swarms in combat.
…in mid-May, the Israel Defense Forces (IDF) used a swarm of small drones to locate, identify and attack Hamas militants. This is thought to be the first time a drone swarm has been used in combat.
Second, a June 14th drone swarm in Shanghai suddenly fell apart and dozens crashed, causing injury and damage.
Source: “Dozens of drones on the Bund in Shanghai accidentally fall and hurt people?”, Kanzhaji.com
Third, on the subject of loitering munitions, a news story confirms the US Marines are adopting Israeli technology.
Manufactured by the Israeli company UVision Air, the system has been selected after the completion of several successful demonstrations, tests, and evaluation processes. The system will provide the Marine Corps with ISR and highly accurate, precision indirect fire strike capabilities.
Sand behaves like a fluid, which makes driving on it hard (pun not intended) even for humans.
It’s like driving on snow or mud, yet it seems to be far less well studied by car manufacturers, presumably because it comes up so infrequently for their customer base.
Traction control, for example, is a product designed for “slippery” conditions. That usually means winter conditions, or rain on pavement, where brakes are applied by an “intelligent” algorithm detecting wheel spin.
In sand there is always going to be some manner of wheel spin, causing the computer to go crazy and do the opposite of helping. Applying the brakes, let alone repeatedly, is about the worst thing you can do in sand.
On top of that, the tire pressure monitoring system has no concept of the “float” profile required for sand. When the usual algorithm equates roughly 40 psi with safe driving, deflating to the necessary 18 psi can turn the dashboard into a disco ball of warnings.
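To make the failure mode concrete, here is a toy sketch of a slip-triggered brake controller of the kind described above. It is purely illustrative: the threshold, signal names, and numbers are my own assumptions, not any manufacturer’s logic. The point is that a rule tuned for occasional slip on pavement fires on every cycle in sand.

```python
SLIP_THRESHOLD = 0.10  # assumed: brake when a wheel overruns vehicle speed by >10%

def slip_ratio(wheel_speed: float, vehicle_speed: float) -> float:
    """Fraction by which a wheel spins faster than actual ground speed."""
    if vehicle_speed <= 0:
        return 0.0
    return (wheel_speed - vehicle_speed) / vehicle_speed

def traction_control_step(wheel_speed: float, vehicle_speed: float) -> bool:
    """Return True if this naive controller would apply the brake this cycle."""
    return slip_ratio(wheel_speed, vehicle_speed) > SLIP_THRESHOLD

# Illustrative sensor readings (wheel speed, vehicle speed) in m/s:
# on pavement, slip is rare and brief; in sand, every wheel slips constantly.
pavement = [(10.2, 10.0), (10.1, 10.0), (11.5, 10.0), (10.0, 10.0)]
sand     = [(13.0, 10.0), (12.5, 10.0), (14.0, 10.0), (13.5, 10.0)]

print(sum(traction_control_step(w, v) for w, v in pavement))  # 1 — brakes once
print(sum(traction_control_step(w, v) for w, v in sand))      # 4 — brakes every cycle
```

A controller like this, fed the perpetual slip of sand, brakes continuously, which is exactly the “opposite of helping” behavior described above.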
The problem is that product manufacturers treat core safety competencies as nice-to-have features instead of requirements. And by the time they get around to developing core competencies for safety, they over-specialize and market them as expensive, fetishized “Rubicon” and “Racing Design” options (let alone “WordPress“).
In other words, complex or dangerous scenarios must be mastered before any primary path can be considered safe, yet they often get put onto a backlog in driverless development. Such a low bar of competency means driverless technology remains far, far below even basic human skill.
Imagine exception handling or negative testing being dismissed as unnecessary because driverless cars are expected to operate only in a perfect world. Why even install brakes or suspension if every car travels parallel to all other traffic at the same speed, like a giant herd? Or, an even better example, why design brakes for a car at all if the vast majority of the time people don’t have to deal with a stop sign?
Recently I put a new car with the latest driverless technology to the test with dry sand. I was not surprised when it became very easily confused and stuck, and it reminded me of the poem “Dans l’interminable” by Paul Verlaine (1844 – 1896).
In the interminable
tedium of the plain,
the uncertain snow
gleams like sand.
The sky is copper
without any gleam.
You would think you saw
the moon live and die.
Like storm clouds,
the oaks of the nearby forests
float grey
among the mists.
The sky is copper
without any gleam.
You would think you saw
the moon live and die.
Wheezing crows,
and you, lean wolves,
in these bitter winds
what is happening to you?
In the interminable
tedium of the plain,
the uncertain snow
gleams like sand…
File this post under… someone on the Internet is wrong.
I was reading a click-bait-titled article on Military.com called “‘The Father of Naval Special Warfare’ Almost Changed the History of the Vietnam War” when I ran into this eye-watering paragraph:
The seaborne infiltrations by communist forces went on for years. Despite the U.S. Navy’s patrols successfully intercepting communist supply runs for eight years, the North still stockpiled what it needed to launch the 1968 Tet Offensive. The surprise attack turned American public opinion against the war for the first time.
Had the United States prevented the Tet Offensive by choking its shallow water supply points, the entire history of the war might have been different from 1968 onward.
Let’s focus on one sentence in particular.
The surprise attack turned American public opinion against the war for the first time.
The Tet Offensive was January 30, 1968. Right?
In 1967 there were hundreds of thousands of Americans openly protesting against the Vietnam War.
At least a year before the Tet Offensive, nationwide protests and opposition were already in motion (and being documented).
The Spring Mobilization to End the War in Vietnam was organized on November 26, 1966, to sponsor antiwar demonstrations in the spring of 1967
I’m not sure how the title or the article made it past Military.com editors without someone realizing the entire premise of both is completely broken.
The Ford Pinto’s engineering design flaws are infamous; it was the car most associated with preventable fire risks until… TESLA (updated July 2nd):
The driver, identified as an “executive entrepreneur”, was initially not able to get out of the car because its electronic door system failed, prompting the driver to “use force to push it open,” Mark Geragos, of Geragos & Geragos, said on Friday. The car continued to move for about 35 feet to 40 feet (11 to 12 meters) before turning into a “fireball” in a residential area near the owner’s Pennsylvania home. “It was a harrowing and horrifying experience,” Geragos said. “This is a brand new model… We are doing an investigation. We are calling for the S Plaid to be grounded, not to be on the road until we get to the bottom of this,” he said.
Hot off the desk of the unprofessional PR department at Tesla is the related story that their cars have a serious acceleration bug forcing a massive recall:
The remote online software ‘recall’ — a first for Tesla cars built in China — covers 249,855 China-made Model 3 and Model Y cars, and 35,665 imported Model 3 sedans.
The 300,000 cars being flagged for a critical safety failure are at risk of sudden acceleration due to problems with Tesla’s self-proclaimed “autopilot” software.
Yes, you read that right, the safety recall is because the very product feature that was supposed to make these cars safer is actually making them more dangerous.
An even deeper read to this story is that Tesla is pushing software updates to cars using an allegedly insecure supply chain. Given that the bug appeared in the first place, what is to prevent an even worse bug from being deployed to cars on the road at any time and in any place?
While some obviously want to celebrate the ability to remotely deploy update code, it may be wishful thinking to believe the update will not make things worse (Tesla’s 2.0 “autopilot” was infamously worse at safety than its 1.0 release).
Indeed, the “Plaid” model in flames above is using a “new version” of the battery for the S/X, which obviously is unsafe.
Tesla seems to make a regular habit of deploying bad code (the official insurance rating now gives Tesla engineering a “P” for poor safety) and pushing the cost of its own failures onto others.
Also worth mentioning is that Tesla’s PR system has been promoting acceleration as its top feature at the very same time that acceleration issues (coupled with handling and braking issues) are being cited in recent deaths of its customers.
This reads to me like Ford promoting the heating capabilities of its Pinto while its customers are dying in gasoline fires from preventable design defects.
Killed in a Ford Pinto: 23 (estimated to be much higher)