Tesla is recalling 362,758 vehicles after a new report from the federal National Highway Traffic Safety Administration called out its Full Self-Driving Beta (FSD Beta) software for a reason that, when you think about it, is surprisingly human for a computer-based AI system: it sometimes breaks traffic laws.
Sure, we all do that on occasion, but the difference is that you and I aren’t a $15,000 option that’s often touted as being safer than human driving. Unfortunately, that doesn’t seem to be the case, especially according to NHTSA, which advised Tesla that it has “potential concerns related to certain operational characteristics of FSD Beta in four specific roadway environments.” Sounds like someone is in trouble.
The recall notice describes the “defect” as follows:
FSD Beta is an SAE Level 2 driver support feature that can provide steering and braking/acceleration support to the driver under certain operating limitations. With FSD Beta, as with all SAE Level 2 driver support features, the driver is responsible for operation of the vehicle whenever the feature is engaged and must constantly supervise the feature and intervene (e.g., steer, brake or accelerate) as needed to maintain safe operation of the vehicle.
In certain rare circumstances and within the operating limitations of FSD Beta, when the feature is engaged, the feature could potentially infringe upon local traffic laws or customs while executing certain driving maneuvers in the following conditions before some drivers may intervene: 1) traveling or turning through certain intersections during a stale yellow traffic light; 2) the perceived duration of the vehicle’s static position at certain intersections with a stop sign, particularly when the intersection is clear of any other road users; 3) adjusting vehicle speed while traveling through certain variable speed zones, based on detected speed limit signage and/or the vehicle’s speed offset setting that is adjusted by the driver; and 4) negotiating a lane change out of certain turn-only lanes to continue traveling straight.
Essentially, this all just adds up to sloppy driving: not obeying speed limits, rolling through stop signs, going through intersections with “stale” yellow lights (that’s NHTSA’s odd terminology there), and going straight out of turn lanes. The truth is, we’ve seen this sort of behavior (and sometimes significantly worse) from FSD Beta plenty of times, which is why, personally, I think this sort of recall may be overdue.
Because FSD Beta is an SAE Level 2 system, the human in the driver’s seat needs to remain vigilant at all times and be ready to take control at any moment. The fact that this recall exists at all suggests that people are not always able to perform that vigilance task properly, which is really the biggest flaw of all Level 2 systems, something I have definitely ranted about multiple times before.
Alongside this problem, there’s still the issue of how FSD Beta is marketed, portrayed, and understood by the people who use it: in general, it tends to be overhyped and its capabilities misunderstood and overestimated, sometimes thanks to seemingly intentional overstatement by Tesla itself.
Tesla and its Level 2 semi-automated driving systems have been under investigation by NHTSA for some time. This recall affects all Tesla models (Model 3, Model Y, Model X, and Model S, ranging from 2016 to 2023), and Tesla plans to correct the issue via an over-the-air update on April 15 that alters the FSD Beta software. According to the recall notice, the remedy plan is this:
Tesla will deploy an over-the-air (“OTA”) software update at no cost to the customer. The OTA update, which we expect to deploy in the coming weeks, will improve how FSD Beta negotiates certain driving maneuvers during the conditions described above.
Tesla does not plan to include a statement in the Part 577 owner notification about pre-notice reimbursement because there are no out of warranty repairs related to these conditions.
The remedy OTA software update will improve how FSD Beta negotiates certain driving maneuvers during the conditions described above, whereas a software release without the remedy does not contain the improvements.
Even with these fixes, there’s still the issue of the inherent, conceptual (rather than technical) flaws of all Level 2 semi-automated systems. NHTSA is reportedly assessing the interactions between humans and Level 2 systems like Autopilot and FSD Beta to see how driver attentiveness is affected, so perhaps we’ll see new NHTSA rulings on that in the future.
At least the fix is easy for Tesla owners: just be ready for that update on April 15, and you should be, if not good, at least better to go. Just pay attention to what your car is doing, FSD Beta or not, people.
Mixing “self driving” cars with cars driven by humans is never going to work. A pack of FSD-B Teslas driving down Interstate 10 between Houston and San Antonio would result in a 50-mile-long traffic jam.
Now I’m sort of interested in what would happen if you could somehow unleash a pack of 50 or even 100 Teslas on a closed-off stretch of highway, all in self-driving mode. Sure, it might end up being perfectly fine, but what if it isn’t? That could be interesting to watch from a safe distance, for sure.
A big question in the design of self-driving cars (even if they do work properly) is: should they be designed to follow the law to the letter, or to drive like everyone else does?
When encountering a stop sign, should they stop behind the sign and then move forward until they can see if it’s safe to go? When turning right on to a four-lane road, should they wind up in the right lane as they’re supposed to, or in the left lane like everyone else? When turning left on to that same road, should they assume it’s okay to turn into the left lane if a car is coming in the other direction and turning right, or should they assume the other car will wind up in the left lane?
Should the self-driving cars follow the posted speed limit? If they do that where I live, they’ll most certainly get rear-ended. Should that behavior change if they are operating with out-of-state plates? Or should known speed traps be programmed into the system?
In addition to all of this, I had a thought while reading this article. A thought so basic I can’t believe I haven’t thought of this yet: Good ol’ Elon has been promising autonomous vehicles for a few years now. What if they aren’t going beyond Level 2 automation because it keeps the onus of operation on the driver, and Tesla doesn’t have to account for the issues all these situations bring up?
It comes back to the basic idea of the Trolley Problem, but even these issues that may cause crashes would have to be insured by Tesla if/when they get to Level 3 or 4 and above.
I expected a much stronger “I told you so!” air to this piece, considering how outspoken Jason has been on this topic.
He knows that not being smug about it is just as satisfying.
As I see it, NHTSA has a *big* ethical problem; it can’t allow driverless taxis (up to four people) as it is currently doing, then point at Tesla or its drivers and say they can’t operate in the same manner. It’s an obvious conflict of interest. Sadly, in order to suck up to green political utopianism, NHTSA has given legal blessing where it should not, as a *safety*-based organization that is supposed to guard people first. This recall shows they are frantically scratching for any legal excuses to curtail and control it before it blows up and they get egg on their face. Sadly, the recall wording does little but cite a few failings in Tesla software to follow some road rules and won’t have much overall effect. It seems more like a vain “Look! We’re doing something!” release meant to placate critics, as a recent ad has them a bit tense.
The difference is that AV taxi companies apply for licenses, whereas Tesla just hands out FSD based on Musk’s whims.
I expect this issue and the lawsuits to be settled by the time the Cybertruck begins deliveries. Aka, never… Elon is a weasel and his kids have weird names.
I want to say congrats to NHTSA for forcing the issue, except they should have done this a year or more ago!
Also (and here I’m going on others’ comments) it sounds like Tesla is doing the recall as a workaround, and then it’s back to business as usual.
I wonder how many years Tesla can drag this out? Not just with the safety agencies, but also the lawsuits from FSD buyers.
We should bet on this* XD
What do you Americans call the all-in-one-pot betting method?
*um… if it’s legal
IIRC, most of Tesla’s software is built on the back of GPL software, which is why things are kept “Beta” and “Trade Secret”: if it gets out or becomes a final release, then a good chunk of it is Open Source under the applicable copyright licenses. Google walks this fine line with Android via AOSP releases. I believe Tesla will have to play the same game if the code goes public through a federal lawsuit.
IANAL nor do I play one on TV.
The Beta for the actual fix drops on April 1; April 15 is when the V2 official release comes. But all this could be delayed, because the Buffalo AI site is short on staff after firing pro-union workers.
It will be interesting to see how the system acts on April 16, and I’m taking bets on whether it’s better or worse at causing accidents.
This is not a forced recall. This distinction is very important.
Quote: “On February 7, 2023, while not concurring with the agency’s analysis, Tesla decided to administer a voluntary recall out of an abundance of caution.”
In other words, this is a bullshit ‘update’ that doesn’t address any of the very real safety issues, but attempts to cut the NHTSA’s investigation off by ‘recalling’ all of the cars. Which they will then rely on to scream “you can’t make us turn it off we already did a voluntary recall.”
Said voluntary recalls also can’t be as thoroughly or rigorously checked by the NHTSA. Which means they’ll likely just push an ‘update’ consisting of a version number change and removing the rolling stop checkbox from the UI.
What prevents NHTSA from saying: “That’s not good enough!” if the voluntary recall proves to be insufficient?
If a certain type of car had a propensity for blowing up unexpectedly and its manufacturer called a voluntary recall to apply a bumper sticker that read: “Be careful.” as a mitigation measure, you can bet that NHTSA would step in and make them try harder. The same should be true here as well.
Except those two aren’t comparable. In your example, the manufacturer is very transparently and obviously doing nothing at all. And everybody knows it.
With a bullshit software update, Tesla asserts that all of the code is a proprietary trade secret and the NHTSA can’t see it. But they made changes, trust them. And the ongoing investigations basically have to reset because the conditions aren’t the same. The software was ‘fixed’ per the manufacturer.
It’s like the notorious, known defective Ford PowerShit transmission. They knew it wasn’t fit for purpose. Know how Ford kept avoiding recalls and judgments? Bullshit software updates. Because the transmission software is a proprietary trade secret, and therefore, nobody can examine it to determine if they did anything more than change a version string.
It is the legal system. NHTSA gets what it wants (and would get anyway, because it decides) but without a 20-year legal fight. Tesla saves face because it’s voluntary, and as such there’s less chance of losing a class action lawsuit worth enough money that Elon would have to start taking public transit.
So this software update is supposedly going to be ready on April 15th. Which is two months away.
Two months to make fairly substantial changes, QC them, and approve them for release? For a system as complicated as FSD?
Yeah, Teslas are still gonna do the same thing on April 16th
Right after laying off a bunch of people to bust a unionizing attempt.
Yeah, I heard about that. That’s gonna be a good lawsuit to try and defend.
I do believe I heard that after some public backlash they backed down, under the agreement that the Scouts didn’t do it again. Not positive about that, though.
Sorry, but having a bunch of union humps is not going to accomplish anything on a deadline. I know people who work in Pittsburgh, PA on the government tit. You are required to use a union electrician to change a burnt-out light bulb; it takes at least two months, and you can be fired for replacing it yourself. Same union assholes who wouldn’t clean up a park because they were too busy, but sued the Boy Scouts for cleaning it up for free. Yeah, having owners in charge doesn’t work, but unions are just as corrupt and flat-out pieces of shit.
Absolutely. They’re kicking the can down the road by claiming to fix it. They hope no one notices for long enough for a real update.
I am amazed that, in this very litigious and highly regulated country, specifically as it relates to transportation and safety, it took us this long to get here.
Yeah, very litigious: no one does anything without a payoff for them and the people who bought them.
Stale yellow is a pretty common term. It may even be technical.
Also, where is the class action fraud lawsuit? $15k?! FFS!
“Stale” has been Driver’s Ed terminology since at least the late ’70s, and it is an absolutely perfect description.
A “stale green”, one with flashing don’t-walk signs, dates back to the Smith System of the 1950s.
If I catch an amber and it goes red part way through the intersection, I say to myself “the sun was setting on that one”. It is a little bit of a dangerous game with intersection cameras these days.
In many (most, I think) states, as long as the light was still yellow when the front wheels entered the intersection, you are allowed to continue through the intersection on the red.
Oregon (that one I know, and a few others probably) requires you to have excited the intersection before the light turns red.
“excited the intersection”
How Portland keeps it weird
The traffic light is the embodiment of the circle of life. What starts out as a fresh, beautiful baby light grows in brightness to adulthood, then becomes stale, losing freshness until death. Only for the universe to reassemble the electrons, which eventually come together to form life again.
So it’s a software update that prevents the cars from doing the things that they weren’t supposed to do in the first place with the software, but may still do after the update, because there’s no way to tell what they are actually “fixing”???
And thinking they might fix it is generous. It’s almost equally likely they make no real change, but release an update “fixing” the software to kick the can down the road and thumb their nose at the NHTSA. After all, it takes time for crash and other reporting to get to the NHTSA, be analyzed, and warrant a recall.
They should force Tesla to refund the money paid for FSD, since it’s always been vaporware
“They should force Tesla to refund the money paid, since it’s always been vaporware”
I think this version makes sense too.
Especially as those promises now hinge on new hardware, meaning current owners will never get their oft-promised self-driving car.