There is so much proof by now that Tesla’s software is not intelligent, doesn’t learn, and is a scam built on short-cuts… it should come as no surprise the company is defining “good driver” with almost no data.
“If driving behavior is good for 7 days, beta access will be granted.” (The company began selling insurance in its home state of California in August 2019.)

In other words, after two years of selling insurance, Tesla will use only its own insurance data from the 7 days before a driver pushes a button to decide whether that driver is “good”.
This fails to use any independent evaluation, and it hands the driver an obvious way to avoid being judged accurately: just behave for one self-selected week. It’s yet more proof Tesla has no intention of keeping roads safe.
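A quick sketch makes the selection bias concrete. This is a hypothetical, highly simplified model (the event names and rates below are illustrative assumptions, not Tesla’s actual scoring formula): a careless driver who knows the evaluation window starts when they press the button can simply drive carefully for those 7 days.

```python
import random

random.seed(0)

# Illustrative assumptions only: daily "safety events" (hard braking, etc.)
# for one driver, before and after pressing the evaluation button.
TRUE_DAILY_EVENT_RATE = 3.0     # careless driver's long-run average per day
COACHED_DAILY_EVENT_RATE = 0.2  # same driver on best behavior for the window

def daily_events(rate):
    # Crude draw: 10 checks per day, each a potential event.
    return sum(1 for _ in range(10) if random.random() < rate / 10)

# A year of normal driving...
year = [daily_events(TRUE_DAILY_EVENT_RATE) for _ in range(365)]
# ...then the driver presses the button and behaves for 7 days.
window = [daily_events(COACHED_DAILY_EVENT_RATE) for _ in range(7)]

long_run_avg = sum(year) / len(year)
window_avg = sum(window) / len(window)

print(f"long-run events/day: {long_run_avg:.2f}")
print(f"7-day window events/day: {window_avg:.2f}")
```

The driver-chosen window drastically understates the true event rate, which is exactly why an evaluation the subject can schedule is not an evaluation at all.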
It should be called “autocratic” driving.
More to the point here, look at these quotes from Elon Musk (in my latest presentation).
Then look at this quote, which is obviously full of lies.
Tesla CEO Elon Musk, who called a previous version of FSD Beta software “not great,” cautioned Friday evening that FSD Beta now seems so good it can give drivers a false sense of security that they don’t need to pay attention to driving while FSD Beta is engaged, even though they do have to remain attentive and at the wheel.
Even within the small group of FSD Beta drivers, there has been substantial evidence that the car is getting worse and is manifestly unsafe. Multiple near-misses have been documented in which the Tesla pushes its driver toward a crash.
And this gets spun into a message from the CEO that the software “seems so good it can give drivers a false sense of security”? It is the CEO who gives them that sense, along with everyone who repeats his lies.
It’s completely disingenuous, and obviously negligent, for the company to even hint that the car is to blame for driver overconfidence. It also goes back to the CEO arguing that people will be killed if they are warned they might be killed.