
Newly Released Video Of Thanksgiving Day Tesla Full Self-Driving Crash Demonstrates The Fundamental Problem Of Semi-Automated Driving Systems


I’m not sure how much you keep up with bridge-related holiday car crashes, but there was a huge one this past Thanksgiving on the San Francisco Bay Bridge. This was a genuine pileup, with eight vehicles involved and nine people injured. That’s already big news, but what makes this bigger big news is that the pileup seems to have been caused by a Tesla that was operating under the misleadingly-named Full Self-Driving beta software, according to the driver. As you likely already know, the inclusion of the nouns “Tesla” and the string of words “full self-driving” is internet click-catnip, but that’s not really what I want to focus on here. What this crash really demonstrates are the inherent conceptual – not technological – problems of all Level 2 semi-automated driving systems. Looking at what happened in this crash, it’s hard not to see it as an expensive, inconvenient demonstration of something called “the vigilance problem.” I’ll explain.

First, let’s go over just what happened. Thanks to a California Public Records Act request from the website The Intercept, video and photographs of the crash are available, as is the full police report of the incident. The crash happened on I-80 eastbound, on the lower level of the Bay Bridge. There are five lanes of traffic there, and cars were moving steadily at around 55 mph; there appeared to be no obstructions, and visibility was good. Nothing unusual at all.

A Tesla was driving in the second lane from the left, and had its left turn signal on. The Tesla began to slow, despite no traffic anywhere ahead of it, then pulled into the leftmost lane and came to a complete stop — on the lower level of a bridge, with traffic all around it going between 50 and 60 mph or so. The results were grimly predictable, with cars stopping suddenly behind the now-immobile Tesla, leading to the eight-car crash.


Here’s what it looked like from the surveillance cameras:

[Surveillance camera footage of the crash]


…and here’s the diagram from the police report:

[Crash diagram from the police report]

According to the police report, here’s what the driver of the Tesla (the car is referred to in the report as V-1) said about what happened:

“He was driving V-1 on I-80 eastbound traveling at 50 miles per hour in the #1 lane. V-1 was in Full Auto mode when V-1 slowed to 20 miles per hour when he felt a rear impact… He was driving V-1 on I-80 eastbound in Full Self Driving Mode Beta Version traveling at approximately 55 miles per hour…When V-1 was in the tunnel, V-1 moved from the #2 lane into the #1 lane and started slowing down unaccountably.”

So, the driver’s testimony was that the car was in Full Self-Driving (FSD) mode, and it would be easy to simply blame all of this on the demonstrated technological deficiencies of FSD Beta. This could be an example of “phantom braking,” where the system becomes confused and attempts to stop the car even when there are no obstacles in its path. It could be that the system disengaged for some reason and attempted to get the driver to take over, or it could have been any number of other technological issues, but that’s not really what the underlying problem is.

This is the sort of wreck that, it appears, would be extremely unlikely to happen to a normal, unimpaired driver (unless, say, the car depleted its battery, though the police report states that the Tesla was driven away, so it wasn’t that) because there was really no reason for it to happen at all. It’s about the simplest driving situation possible: full visibility, moderate speed, straight line, light traffic. And, of course, none of it would have happened if the driver had been using this Level 2 system as intended – remember, even though the system is called Full Self-Driving, it is still only a semi-automated system that requires a driver’s full, nonstop attention and a readiness to take over at any moment – which is something the “driver” of this Tesla clearly did not do.


Of course, Tesla knows this, we all technically know this, and the police even included a screengrab from Tesla’s site that says as much in their report:

[Screengrab from Tesla’s site included in the police report]

 

We all know this basic fact about L2 systems, that they must be watched nonstop, but what we keep seeing is that people are just not good at doing this. This is a drum I’ve been banging for years and years, and sometimes I think to myself: “Enough already, people get it,” but then I’ll see a crash like this, where a car just does something patently idiotic and absurd and entirely, easily preventable if the dingus behind the wheel would just pay the slightest flapjacking bit of attention to the world outside, and I realize that, no, people still don’t get it.

So I’m going to say it again. While, yes, Tesla’s system was the particular one that appears to have failed here, and yes, the system is deceptively named in a way that encourages this idiotic behavior, this is not a problem unique to Tesla. It’s not a technical problem. You can’t program your way out of the problem with Level 2; in fact, the better the Level 2 system seems to be, the worse the problem gets. That problem is that human beings are simply no good at monitoring systems that do most of the work of a task and remaining ready to take over that task with minimal to no warning.


This isn’t news to people who pay attention. It’s been proven since 1948, when N.H. Mackworth published his study The Breakdown of Vigilance During Prolonged Visual Search, which defined what has come to be known as the “vigilance problem.” Essentially, the problem is that people are just not great at paying close attention to monitoring tasks, and if a semi-automated driving system is doing most of the steering, speed control, and other aspects of the driving task, the job of the human in the driver’s seat changes from one of active control to one of monitoring for when the system may make an error. The results of the human not performing this task well are evidenced by the crash we’re talking about.

I think it’s not unreasonable to think of Level 2 driving as potentially impaired driving, because the mental focus of a driver who approaches the driving task as a monitor is impaired compared to that of an active driver.

I know lots of people claim that systems like these make driving safer – and they certainly can, in a large number of contexts. But they also introduce significant new points of failure that simply do not need to be introduced. The same safety benefits could be had if the Level 2 paradigm were flipped, so that the driver was always in control but the semi-automated driving system was doing the monitoring, ready to take over if it detected dangerous choices by the human driver. This would help in situations of a tired or distracted or impaired driver, but would be less sexy, in that the act of driving wouldn’t feel any different from normal human driving.
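
Just to make that flipped arrangement concrete, here’s a rough sketch of what the logic could look like. To be clear, this isn’t any real carmaker’s system; the names, signals, and thresholds are made up purely to show the shape of the idea, which is that the human does all the driving and the software only watches and steps in:

```python
# Hypothetical sketch of a "flipped" Level 2: the human steers and controls
# speed at all times, and the software only monitors and intervenes.
# All names and thresholds here are illustrative, not from any real ADAS.
from dataclasses import dataclass

@dataclass
class DriverState:
    eyes_on_road: bool           # e.g. from a driver-monitoring camera
    hands_on_wheel: bool         # recent steering torque detected
    lane_offset_m: float         # distance from lane center, in meters
    time_to_collision_s: float   # from forward radar/camera

def monitor(state: DriverState) -> str:
    """The computer never drives by default; it only warns or intervenes."""
    if state.time_to_collision_s < 1.5:
        return "emergency_brake"             # imminent collision: intervene
    if abs(state.lane_offset_m) > 1.0:
        return "corrective_steer_and_alert"  # drifting out of the lane
    if not state.eyes_on_road or not state.hands_on_wheel:
        return "escalating_alert"            # distracted or hands-off driver
    return "stay_silent"                     # normal driving: do nothing

# Example: a distracted driver drifting toward the lane edge
print(monitor(DriverState(eyes_on_road=False, hands_on_wheel=True,
                          lane_offset_m=1.2, time_to_collision_s=8.0)))
# -> corrective_steer_and_alert
```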

If we take anything away from this wreck, it shouldn’t be that Tesla’s FSD Beta is the real problem here. It’s technically impressive in many ways though certainly by no means perfect; it’s also not the root of what’s wrong, which is Level 2 itself. We need to stop pretending this is a good approach, and start being realistic about the problems it introduces. Cars aren’t toys, and as much fun as it is to show off your car pretending to drive itself to your buddies, the truth is it can’t, and when you’re behind the wheel, you’re in charge — no question, no playing around.

If you want to read about this even more, for some reason, I might know of a book you could get. Just saying.







86 Comments
Uncle D
1 year ago

The SAE J3016 Levels of Driving Automation spec clearly states “You ARE driving” in its description of levels 0 through 2. There is nothing FSD about Level 2, which is what Tesla currently has in its vehicles. Unfortunately, many Tesla drivers treat their Level 2 system as if it is Level 3 or better (You ARE NOT driving).

Several manufacturers have started adding lane changing to their Level 2 systems, which makes them seem like they are operating at a higher level and encourages people to engage less. Level 2 features are for “support.”

Tesla is perfectly happy with the public’s perception that their ADAS can operate at a higher level than it is capable of. Unfortunately, misuse of their system coupled with beta testing their s/w on public roads has made things more dangerous for everyone, rather than fulfilling the stated purpose of making things safer.

The only true Level 3 (You ARE NOT driving) system available in the US is from Mercedes-Benz (certified for public roads in NV and CA with more states to come), but that system will only engage below 40 mph. Anything above 40 mph and it’s back to Level 2 (You ARE driving).

Jeff C-C
1 year ago

The lane tracking in my 2020 Kia Forte can follow curves on the highway. It will slow down if there is a car in front. I can be hands off for 30 seconds before it starts yelling at me to stop being a slacker and retake the wheel. Do that with the cruise control on, and it’s pretty autonomous.

Somewhat Like a Rock
1 year ago

100% this driver was asleep, woke up, took the weight off the steering wheel, and lied to the cops about it.

archimus
1 year ago

This is the whole logical fallacy of L2 – manufacturers and their defenders claim that you will be safer because of driver assists but then say that you cannot rely on driver assists and must be constantly prepared to take over in order to keep yourself safe. Imagine if other safety equipment were like this:

“Full Seat-Belt (FSB) has many advanced safety functions, but you cannot rely on Full Seat-Belt to keep you restrained – it may unexpectedly disengage at any time, so you should always keep your hands ready to brace yourself or be thrown from the vehicle in the event of an accident. Full Seat-Belt is a driver safety-assistance system. It does not provide seat belt functionality. Also, the manufacturer bears no responsibility if Full Seat-Belt should suddenly disengage during regular driving or in an emergency event or if it performs unexpected maneuvers – the driver should be constantly monitoring how FSB is behaving and be ready to immediately disengage and take over Full Seat-Belt’s safety functions.”

Oh, also there’s a whole toxic sub-group of Full Seat-Belt drivers who deliberately stage unapproved and unintended scenarios on public roads in order to back up the claim that FSB is actually a fully-functional seatbelt system, with the tacit approval of the manufacturer, who cannot legally make those claims themselves.

I’ve said it before and I’ll say it again – if a seatbelt can disengage at any time, unexpectedly, then it isn’t ready yet and it certainly shouldn’t be tested on public roads with other unsuspecting humans around.

nlpnt
1 year ago

1. “Autopilot with Full Self-Driving” is terrible branding, just asking for trouble. There’s a reason why legacy automakers with CEOs who follow legal advice and aren’t edgelords all brand their L2 systems as something implying it’s cruise control and a little bit more.

2. Beta testing by the general public in their private road vehicles is dumber still.

Brummbaer
1 year ago

Last year I bought a 2019 Cadillac CT6 3.0 litre twin turbo, primarily for the performance. In the last six months I have become more aware of much of its electronic frippery. Part of that awareness was the group of things that made it comparable to the Tesla. Lane centering? Check. Auto braking? Check. Positional awareness? Check. I am truly afraid of all these “helpers”. What assurances do I have that they won’t overrule my perfectly aware decision to change lanes for example?

I still love the performance of this magnificent beast!

hugh crawford
1 year ago

Not to excuse the Tesla, but the drivers in the other cars, or at least some of them, including the rearmost driver, screwed up too. You should drive prepared for any damn stupid thing to happen. I’ve seen a couple of ladders and a couch fall off pickups right about there. Further along in Oakland the car in front of me swerved to avoid a complete V8 engine in the center lane, and there were cars on either side so I just slammed on the brakes. Cars break all the time. Weird random crap happens all the time.

Short version of the story: the car suffers some sort of mechanical failure, perhaps the computer diagnostics indicated it was not safe to continue, the car pulled to the side, and a bunch of cars hit it.

Richard Caywood
1 year ago
Reply to  hugh crawford

But from the video stills, it is unclear if the Tesla’s brake lights are on. I am not familiar enough with one-pedal driving to know if the brake lights light up when the car is coasting. If the drivers behind didn’t have that visual signal, it can take longer to realize the car in front of you is slowing and then react to it appropriately. I still don’t see enough to definitively say the fault lies with one side or the other.

Uncle D
1 year ago

There’s no coasting with one pedal driving. If you (or the ADAS) let up on the accelerator, regen starts and the brake lights come on to let other drivers know you are slowing. If the car slowed from 50 to 20 mph, the brake lights should have been on. If not, something failed.
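
As a rough illustration of that kind of rule (the threshold here is a made-up number for the example, not Tesla’s or any regulator’s actual figure), the brake lights are keyed to deceleration rather than to the pedal:

```python
# Illustrative only: brake lights keyed to deceleration, not pedal position.
# The 1.3 m/s^2 threshold is an assumed example value.
def brake_lights_on(decel_mps2: float, brake_pedal_pressed: bool) -> bool:
    return brake_pedal_pressed or decel_mps2 >= 1.3

# Lifting off the accelerator in one-pedal mode: strong regen, no pedal
print(brake_lights_on(decel_mps2=2.0, brake_pedal_pressed=False))  # True
```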

The Ultracrepidarian
1 year ago

Technology just for its own sake is useless. I have some audiobooks, and when I start playing them through the headphones, they keep on going even when I remove the headphones. Now the book has advanced a couple of chapters and I don’t know what happened. So it’s all craptastic stuff.

shneeplepool
1 year ago

When you remove all the hype from L2 systems, you realize that in addition to driving, you now also have to babysit some crappy AI. So it is more work, not less. And people pay extra for this?

SYKO
1 year ago

Garbage

Black_Peter
1 year ago

FoLoWInG DiStAnCe!! Really? The car changed lanes and went from 55 to 20. Pick any other car, ANY, other than a Tesla, and would anyone be talking “following distance”? No. You would all be saying the driver “was on the phone” or fell asleep, or something all pointed directly at the car/driver, not any of the other cars. If you are not, at the very least, concerned people are BETA TESTING land based missiles and failing, you’re either stanning for Tesla or being willfully ignorant.

“Enough already, people get it,” Obviously not Jason, obviously not..

nemebean
1 year ago

For all the people blaming following distance for this, I wonder if any of us actually follow far enough behind the car in front to deal with a situation like this. Because it’s not just a sudden stop, it’s a sudden stop that doesn’t make any sense and will require an indeterminate amount of time to process that they didn’t just tap their brakes, as often happens on the highway, they’re panic braking. IIRC, the recommended following distances are based on the assumption that you’ll recognize the need to stop in a pretty short timeframe, and I’m just not sure that’s going to be valid in this case.

I’m not sure how you would study this, but it would be interesting to know what the difference in reaction times between “deer ran in front of you” and “random and unexpected hard braking from the car in front” is. I’d be willing to bet the latter is much, much worse.
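
To put some rough, assumed numbers on just the reaction-time part of that (these are illustrative figures, not measured ones):

```python
# Back-of-the-envelope: distance a following car covers during the driver's
# reaction time alone, before any braking starts. Reaction times are assumed.
MPH_TO_MPS = 0.44704
speed_mps = 55 * MPH_TO_MPS  # ~24.6 m/s

for label, reaction_s in [("expected hazard", 1.5),
                          ("baffling 'why is it stopping?' hazard", 2.5)]:
    distance_m = speed_mps * reaction_s
    print(f"{label}: {distance_m:.0f} m (~{distance_m * 3.28:.0f} ft) "
          f"before braking even begins")

# expected hazard: ~37 m (~121 ft); the extra second of confusion adds
# roughly another 25 m (~80 ft) on top of that.
```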

Trust Doesnt Rust
1 year ago
Reply to  nemebean

Now, I’m not a fancy, big-city researcher, but a person’s reaction time to a vehicle stopping suddenly depends on environmental context. For example:
Vehicle stops because of blowout: Noise, visual indication of a blowout, weaving
Vehicle stops because of a stall: Gradual slowdown, hazards, movement towards the right shoulder
Vehicle stops because of road hazard: Multiple cars slow down, weave around obstacle.
In these situations, as the other driver, you have environmental clues that sudden braking is about to happen and why. Your brain is already trying to figure out what to do next.
In this situation, to paraphrase Jason, the car just stopped for no reason. As the other driver, while you should be prepared for anything, your reaction time is going to be slower while your brain assesses the situation and figures out what to do next.

strangek
1 year ago
Reply to  nemebean

I always want to follow at a safe distance, but it doesn’t really work that way in congested city driving. If I leave a space, another car will simply move into it. If I leave a space behind that car, then a third car will simply move into it. On and on it goes like that so I’m constantly on someone’s ass whether I want to be or not.

Harmanx
1 year ago

FWIW, a Tesla doesn’t/can’t currently use the FSD beta when driving on a highway/freeway. It might start out using FSD on regular roads leading to the freeway, but it switches to Autopilot during the onramp (and then switches back when exiting to regular roads). If you’ve ever heard mention of their future “full stack” software (which is supposedly going to be released in the coming weeks), that’s the version where all autonomous features are supposed to use the FSD software.

Dan Finkelstein
1 year ago
Reply to  Harmanx

Yes, this. What would have been active at the time of this incident would be Autopilot or Navigate on Autopilot, which is a 5+ year old software stack at this point. Certainly doesn’t excuse why the driver allowed the car to do what it did, but the plan, as you said, is to upgrade the highway driving to use the same codebase as the FSD beta, which is the new, actively developed stack.

jb996
1 year ago
Reply to  Harmanx

I’m confused by this point, as it just sounds like semantics. I don’t have a Tesla, and haven’t studied their features. But you’re saying it wasn’t one type of Level 2 autonomy, it was another type of Level 2 autonomy. Not “Full Self-Driving” but only “Autopilot.” I think you’re just clarifying terminology, but am I right in saying that in either case, the car screwed up and the driver wasn’t paying attention? Right?

marmau
1 year ago
Reply to  jb996

Elon stan detected

Harmanx
1 year ago
Reply to  marmau

Who was detected? Elon or Stan?

Harmanx
1 year ago
Reply to  jb996

Right. I was clarifying which product was likely engaged, since they’re different. Autopilot has only ever been intended for L2 autonomy — and only for highways. (It’s old code that doesn’t get major updates, since it’s intended to be replaced.) It’s much like what is used with other companies’ L2 highway systems. FSD is what the article says was engaged — and which probably wasn’t. It’s the intended replacement to Autopilot. By current standards it’s a true AI, intended to do much more complex driving, learning and improving over time — and to handle all driving situations, not just highways. Aspirationally, it is intended to become as good or better than decent human drivers. (It may well never get there, but that’s been the goal!)
