
That Anti-Tesla Full Self-Driving Super Bowl Ad Wasn’t Fair To Tesla, But Not For The Reasons You Think


Like many, many other Americans, yesterday I parked myself on a couch and watched the gripping and often brutal action of America’s greatest sporting event, the Puppy Bowl. This year, the 19th Puppy Bowl proved a dazzling spectacle as always, though I lost a crippling amount of money when those miserable hounds of team Ruff metaphorically and possibly literally shit on the carpet and lost to team Fluff. Dammit. There was some sort of follow-up to the Puppy Bowl, a scaled-up version played by humans called the Super Bowl. I’m told this is a popular event as well, so I sat through it; about halfway through, it was interrupted by what looked like a live-action Super Smash Brothers game. Also interrupting the action were a number of commercials, including one very interesting one: an anti-Tesla Full Self-Driving ad. The ad was interesting because I think it was actually a bit unfair to Tesla; not because anything it showed was particularly wrong, as such, but because the real target of the ad shouldn’t be just Tesla. It should be a bit bigger.

For those of you that turned off your large CRT television sets and closed the large oaken doors on the furniture-like cabinet it is built into immediately after the Puppy Bowl and didn’t see the ad, here you go:


Wow, that’s a lot of fake kid-smacking going on there, isn’t it? This all feels pretty alarmist, though it’s not like any of those things don’t happen; for example, Teslas were one of a number of cars that smacked into kid-shaped mannequins in emergency automatic braking system tests about a year ago. And yes, there are plenty of examples of Tesla’s FSD doing dumb things like driving the wrong way down roads or ignoring/mistaking warning signs, or just stopping in traffic, or any number of other errors, small and large. Getting a car to drive itself is a wildly complex undertaking, so it’s not surprising at all that these systems aren’t close to perfect yet.

But you know what is surprising? Not that Tesla is getting away with any of this, but that any of this is okay at all, and by “this” I mean Level 2 semi-automated driving systems, because that’s the real issue here. Just look at all of the issues pointed out in that commercial about the horrible things Tesla FSD is capable of: running over kids and babies, ignoring Do Not Enter signs, ignoring school bus stop warnings, driving on the wrong side of the road – all of these things are bad, sure, but they’re mitigated by the fact that Tesla FSD (and Tesla Autopilot, and GM SuperCruise, and Nissan ProPILOT and Ford BlueCruise and so on) are all Level 2 systems, which means that a human being must be alert and monitoring everything the car is doing at all times.


The problem isn’t that Tesla’s Full Self-Driving has, as the commercial states, “woefully inept engineering,” because it doesn’t. There’s a lot of incredible engineering that has gone into that system, but the problem of automotive self-driving is so complex and involved that it’s simply not finished yet. Which is why it’s a Level 2 system: it requires a human to oversee it as it works, because it will make mistakes, sometimes severe ones, no question.

Now, I’m not really giving Tesla an out here, but I do think they shouldn’t necessarily be singled out. I mean, maybe a bit, because they are among the worst offenders when it comes to confusing the general public about what their system is capable of, sometimes even deliberately so, as was the case with the deceptive Autopilot demonstration video that remains on their site to this day. That video implied that the car could drive itself, and the human in the seat was only there for “legal reasons,” which is very much not the case.

So, sure, Tesla is by no means innocent, but it’s not productive to focus on Tesla and their technical failings regarding FSD, because the real problem is that we let systems like this, systems that require a human to monitor a system that does most of the work, on the road at all. Level 2 systems are the problem, for conceptual, rather than technical reasons.

I’ve gone over this many times before: humans are simply bad at monitoring systems like these, and it’s been shown and tested since at least 1948, when N.H. Mackworth’s study The Breakdown of Vigilance during Prolonged Visual Search came out and described what we know today as the “vigilance problem.” Essentially, it says we suck at having to monitor things that do almost everything on their own, and are bad at leaping in to take over when the automated system requires us. This is compounded in the case of a self-driving car, which may be moving at speeds of well over a mile a minute when it suddenly needs a human to intervene. It’s just an inherently flawed and confusing approach to this technology.

The technology behind Tesla’s FSD and any of the other Level 2 systems could be very readily adapted into something that could genuinely improve safety, by making cars with a system that works sort of the opposite way current Level 2 systems do. Instead of the computer driving and the human watching, the roles would be reversed, with the car only operable under full human control, but with the AI watching, ready to make corrections if the human makes an error, due to distraction, inebriation, fatigue, health reasons, or whatever.


Such a reversed-Level 2 setup wouldn’t be able to call itself self-driving, but it would eliminate the problems of the human not paying attention, and would add a level of safety to driving never previously seen. Currently, Volvo is the only automaker I know of working on such a system, but this should become the default moving forward. All of the safety benefits, none of the bullshit that can get people hurt. Sure, you’d have to give up pretending you live in the future and can argue with people on Reddit while your car whisks you to the finest Taco Bell in the Bay Area, but that’s a sacrifice I’m willing to make.

Here’s the hard truth: there are really only two acceptable “levels” of self-driving. Those levels are yes or no. What we call Level 2 is a no. Level 3 is so confusing it should only be considered no, if it should exist at all (it shouldn’t), and Level 4 (full autonomy in pre-selected areas) and 5 (magic driving robot) are both yes. But I’m not really sure we’ll ever actually see full Level 5, drive-anywhere-anytime automation, anytime soon. But that’s fine; a good Level 4 network of vetted roads would be close enough.

Dan O’Dowd, the leader behind the Dawn Project that produced this commercial, has been an outspoken critic of Tesla and Full Self-Driving for quite a while, and while I think there are certainly plenty of good reasons to criticize Tesla FSD and Autopilot, the near-exclusive focus on Tesla I think is misguided. The Dawn Project’s time would be better spent looking at the issues of Level 2, conceptually, and lobbying for something like the reverse-Level 2 concept.

Of course, there’s plenty of speculation among Tesla fans about O’Dowd’s motives, and now, thanks to a certain new owner of Twitter, his tweets are appended with many qualifiers and counterarguments. I can’t speak to what O’Dowd’s motives are, but I can say that from where I sit, while he has some good points, the focus on Tesla is the wrong approach. Level 2 itself is the real problem, and it’s not about technology. It’s about humans, and how humans interact with technology.


This is something that needs to be solved for all automakers working in the driving automation space, not just the loudest and most obvious one.


69 Comments
LongCoolLincoln
1 year ago

“the finest Taco Bell in the Bay Area”

This exists, canonically: https://www.atlasobscura.com/places/linda-mar-taco-bell

Chaostrophy
1 year ago

I fully agree with you. We are distractable monkeys, wildly unsuited to monitoring something that almost works. Either level 4, or things to keep us focused on driving and keep us from being dumb.

MazdaLove
1 year ago

Perhaps there should be a financial penalty for crashing while FSD is engaged.
Humans respond to incentives. Maybe insurance companies should incentivize remaining in control of your vehicle by decreasing the payout they are willing to make, and shifting the financial burden back to the responsible party: the driver with FSD engaged. Not sure how popular this would be, or if it has ever even been considered. If insurance companies can refuse to insure entire generations of Korean cars due to increased possibility of theft, why not increase insurance premiums or decrease payouts for a system that makes such a mess of public and personal safety? Make FSD the social equivalent of texting and driving.

Happy Walters
1 year ago
Reply to  MazdaLove

I don’t know why it couldn’t be a legal penalty, including a criminal charge. I think there’s an argument that not paying attention with a Level 2 system is reckless, which is criminal in my state. I think financial penalties are less likely to get anyone’s attention.

Supporters might say that the insurance companies would RATHER have Level 2 in use, because overall there would be fewer accidents. That might turn out to be true, and they’d eventually have the data, but a lot of bad things could happen before there’s a sufficient sample size.

nemebean
1 year ago
Reply to  MazdaLove

This would just incentivize the car companies to ensure that their L2 system shuts off shortly before impact. I strongly suspect that’s already happening so they can claim amazing safety records by ignoring all of the times their system caused an accident but had technically handed back control to the driver.

Harmanx
1 year ago

I own a Tesla with the Autopilot/FSD software — so, my two cents, if anyone is interested in hearing from an actual user of the products. (Specifically — I’ve been regularly using Autopilot for over 4.5 years, and FSD beta for 1.5 years.)

I’m not positive about Autopilot (since there haven’t yet been instances of people walking in front of my car on the highway), but with FSD, the car absolutely recognizes humans and is usually overly cautious with them. It slows down if it looks like they’re heading into the street from the sidewalk, slows way down and veers around them if they’re on the shoulder of the street, and stops altogether if they’re in the lane beyond the shoulder. I’ve never had my car be completely oblivious to people like in the Dan O’Dowd demonstrations. (FSD has been too cautious on some occasions while not quite cautious enough on others, however.) O’Dowd’s demos have been dissected with results that call them into serious question (as have his motives, given ties to competing interests). People have recreated O’Dowd’s videos and found that the cars stop before hitting people or human-like stand-ins — as well as some other objects in the path of driving (although, in my experience, the software still needs work reliably identifying various non-human obstacles — but it often sees animals, fwiw).

FSD beta has been used by thousands of cars for years — and was expanded to about 160,000 cars a half year ago. There have been a lot of miles driven with it now — and no injuries or fatalities that I’ve heard of. As for Autopilot, the well-publicized accidents actually show the software in a favorable light when seen relative to miles driven — especially comparing to accidents vs. miles driven by cars not using it. Both products keep getting better.

Human drivers can be tired or under the influence or distracted by all manner of things. Some serious work is definitely still needed by Tesla’s software — but it is always observing in all directions with its singular driving objective, and without distractions.

ddavison512
1 year ago
Reply to  Harmanx

Software developer, been at it for a while.
I won’t attempt to contest your data.
I get the arguments about the human element in driving.

Consider, though. I’m sure you have been driving along and you think you see something ahead. Think night time, not a straight road, weird stuff going on on the sides. You use your senses and your mind to figure out what it is. Oh, it is a black cat, got an image of the eyes reflecting back for a moment, better slow down and keep an eye on it. Or it is nothing, so full steam ahead. You have a system in your head that has evolved over a long time to detect and interpret what “is” from all your eyes see. And a lifetime of learning to use it.
The programmers coming up with this stuff are doing their best, and I think it is amazing what they have managed to do. But. The machine is building a model in memory of what reality is. If a sensor has things wrong, is misreading, missing something, etc., that model may not be reality. The computer does not have your biological systems and imperative to avoid injury and pain. If it misses the mark, it could run you into something. And I don’t know how well the sensors and software define and detect problems. I hope they have sensors cross-checking each other, but what if they do, and it determines that the working sensor is wrong and the bad one is right?
Consider further, the developers are continually working on the software and system.
It is not outside the realm of possibility that they take something that works OK and make it work worse, at least in some cases. They can’t QA all edge cases. I work on a commercial system that our retail offices use to track business stuff, and I get worked up about what we missed when we roll a new version out.
Which is what I should be taking care of instead of typing this, so, I leave you with the above to consider.

amberturnsignalsarebetter
1 year ago
Reply to  Harmanx

Harmanx, thanks for the feedback from an actual user – as someone who’s never driven a Tesla it’s good to hear both sides of the argument. With that said, a couple of things you said in your comment concern me:

Saying that your car was “not quite cautious enough” is in my opinion analogous to saying “I had a near miss that would have been avoided if I was in control”. Obviously I don’t know the circumstances and maybe there wasn’t really a high probability that your Tesla put you (or anyone else) in danger, but if my car has a different risk threshold than me I consider that to be a problem. (Imagine if your brakes didn’t actually engage when you decide you need to step on the pedal, but instead waited until the car decides that you need to slow down – I would find that a little unnerving.) At the other end of the spectrum, I hear a lot of folks complain that FSD is sometimes “too cautious”, but in essence it’s the same issue – the decisions the car is making are not consistent with the ones I would make in the same circumstances – at the very least that’s inconvenient, but overly cautious driving can be a safety issue too.

The second point is the observation that FSD/Autopilot have lower accident-per-mile rates than human drivers. That may well be objectively and statistically true, but the problem I see is a lack of accountability. With all L2 systems, it’s not the software that caused the crash, it’s the driver who was not able to retake control safely when the software couldn’t handle the circumstances that led to the accident. Humans frequently make terrible decisions when driving that lead to accidents, and they should be held accountable for those decisions – in my opinion the same is true if your car happens to have a L2 driving aid that made a bad decision for you.

As Torch has said in this article (and many times before), L2 systems have the potential to be hugely valuable driving aids that make roads safer and save lives, but if they’re advertised as something they are not (i.e. self-driving) then we put all of those benefits in jeopardy and encourage folks to misuse the technology and create new unnecessary risks.

Harmanx
1 year ago

Torch’s hypothesis that L2 or L3 ADAS are dangerous (since the driver becomes inattentive while those are engaged and therefore isn’t prepared to take over at a moment’s notice) really does sound valid at the outset. However, lots of hypotheses that sound reasonable or valid as described don’t stand up to actual numbers. I have not, in fairness, read his book (I hear it’s very good), so I don’t know to what extent he gives real data to support the assertions (and if the data available when written is still similar to the stats now available). I have read a lot of his excellent articles, though — and don’t recall any data deep-dives when talking about ADAS. I have heard Tesla’s take on the numbers — which could well be skewed to make their systems look favorable, of course. Those numbers say that there are fewer accidents when using Autopilot. I’m not sure if much data has been made available about FSD beta — although given how well-broadcast any Tesla accidents have been (sometimes when ADAS was in use, sometimes presuming it was used without any info to support that), I think we would have heard something by now if there were any injuries or deaths with FSD engaged. So I’m comparing what data I have seen to my personal experience using the products — which is generally supportive of it not being more dangerous than when not using Tesla’s ADAS. In fairness, if it wasn’t made very clear (by Tesla and others) that ADAS systems require full alertness and preparation to take over, it could well be more dangerous for me and anyone else — at least until drivers had experienced it long enough to learn potential dangers for themselves.

shneeplepool
1 year ago

Sad, we had the whole L4 thing figured out 100 years ago. And then they got rid of the horse.

Nayrbflat6aficionado
1 year ago

I have the answer:

Take a bus, taxi, Uber, donkey cart, bicycle, or walk to your destination.

Screw the self-driving bullshit that automakers are attempting to foist onto the masses.

tacotruckdave
1 year ago

This is a direct result of some president, probably Republican, overturning truth-in-advertising laws that used to allow lying advertisers’ ads to be pulled, hit with big fines and forced refunds. Before you yell, I vote R as much as D.
But now advertising is like a big swimming pool. Everyone pisses in it. But they accuse others while claiming they don’t. Or defend it because it is for a good cause. A la: you can’t blame a baby for peeing, he doesn’t know better, but his hippy parents do. Or: my pee is pure, it improves the water tainted by my enemies’ pee.

OnboardG1
1 year ago
Reply to  tacotruckdave

We still have that in the UK. It’s the one regulation Maggie didn’t shred.

ghostpedalsyndrome
1 year ago

I didn’t catch that commercial.
I think pretty much every modern car commercial is chock full of dangerous, bloated BS.
Did anyone catch the Jeep 4xe hybrid “Electric Boogie” spot?
That was disturbing.
A bunch of morons driving their “Eco Friendly” 4×4s at high rates of speed, tearing up ground through deserts and jungles while the supposed animal inhabitants of those ecosystems dance around to the shitty music the Jeep people are blasting at an obnoxious volume.
So stupid.
Every car commercial I see comes off as an ad for terrible, unnecessarily aggressive and destructive driving.
The few that don’t are the ones touting their safety features by showing them save the clueless, inattentive type of people I call “doorway standers” from an accident because they aren’t paying attention.
They are all bad. All catering to the “Me Monsters” with social media rotted brains.

GranulatedMiscarriage
1 year ago

“High rate of speed” is fluffy cop language that doesn’t mean what they think it means.

If you mean “fast,” say “at high speeds.”

A rate is a ratio, and a “rate of speed” might mean acceleration, but it doesn’t mean plain old speed.

almatic
1 year ago

Tesla is asking to get called out, because they called it “Autopilot,” something that it definitely isn’t. And levels of self-driving? That’s a made-up term also; it should be levels of driver assistance, because unless the car is actually self-driving, it’s just driver assistance.

The public understands that. The NHTSA should force Tesla to recall the name Autopilot, and automobile journalists should stop using the term “self-driving” for cars that can’t drive without a human in the driver’s seat.

Mercedes Streeter
1 year ago

We once again request readers refrain from resorting to personal attacks in their responses to one another. There’s enough of that at other sites and honestly, conversations are no longer productive when mud starts slinging.

If you have a strong argument, the point itself will do the legwork.

2manybikes
1 year ago

Being nice is objectively more persuasive anyway. Sugar and vinegar etc.

Frank Carter
1 year ago

Agreed, Mercedes. Also, when, if ever, has anyone actually swayed the opinion of another commenter with their words? Seems like everyone is in “if you don’t agree with me you’re an idiot” mode. Plus I doubt my comments are complete enough to actually represent my actual opinion because there are always variables that need to be accounted for. This comment is an actual example… lol.

Frank Carter
1 year ago
Reply to  Frank Carter

Or as ChatGPT says:
Engaging in arguments in the comments section can be a bad idea for a number of reasons.

First, arguing with people in the comments section can be a time-consuming and emotionally draining activity. Online arguments can often be unproductive and unfulfilling, and can leave you feeling frustrated and upset.

Second, arguments in the comments section are often public, and can reflect poorly on your online reputation. Other users may see your comments and judge you based on how you conduct yourself in online discussions. If you engage in aggressive or disrespectful behavior, it may damage your personal or professional relationships.

Third, arguments in the comments section rarely lead to a resolution or a change in someone’s beliefs. In many cases, people are set in their ways and are not open to changing their opinions based on what someone says in an online comment. Instead of trying to change someone’s mind, it may be more productive to simply state your opinion and move on.

Fourth, engaging in arguments in the comments section can distract you from other, more important activities, such as work or spending time with friends and family. Spending too much time arguing online can lead to a loss of productivity and can harm your relationships with those around you.

In general, it is usually best to avoid engaging in arguments in the comments section. If you have a strong opinion or a disagreement with someone, it may be more productive to have a private conversation with them, or to simply express your opinion in a respectful and non-confrontational manner.

Irving Warden
1 year ago

Perhaps the answer is for the government to buy a train load of Jason’s book, “Robot, Take the Wheel,” and distribute a copy to everyone involved in manufacturing, selling, regulating, or operating a car with a system of this type. It was published four years ago, but my recollection is that the issues raised are the same as those involved in this discussion. Jason could use the extra money to buy buckets of electrons for his Changli.

GranulatedMiscarriage
1 year ago
Reply to  Irving Warden

Nice try, Jason, but we’re on to you

sentinelT
1 year ago

What are the chances someone involved with this is shorting Tesla stock?

RidesBicyclesLovesCars
1 year ago

Yes, level 2 overall is a big problem. It works just well enough that people develop too much confidence in it.

Both my current vehicles have adaptive cruise. One radar and one vision based. Both are susceptible to phantom braking and accelerating around corners while following other vehicles. Neither system sees far enough ahead to react appropriately to a traffic back-up. I’ve learned when to hover my foot over the accelerator or the brake. I also know when it’s better to just drive the car myself since there are some serious system limitations.

I take delivery of my Tesla this week. It comes standard with the basic autopilot, which is really just adaptive cruise with lane centering. I don’t expect any better or worse performance than my other two cars. I’m just happy that the FSD is locked behind a $15k paywall! That feature is nowhere near ready for mass adoption.

my goat ate my homework
1 year ago

Hot take… Tesla makes misleading statements about its technology all the time. I’m fine with someone making misleading statements refuting them.

Also, Torch, you are too reasonable in your suggestions. They can’t sell reasonable stuff to investors. You just don’t understand, they need to break things.

JaredTheGeek
1 year ago

I daily a Tesla and I agree. Calling it Autopilot is bad and Full Self Driving is dangerous and misleading.

rootwyrm
1 year ago

It’s not misleading to state actual test results. That’s the kind of take you get from someone who fires people over their not being the main character any more.

Sure. There’s some sensationalism in the presentation and cinematography. Because that’s how you get attention. But nothing they said or did hasn’t already been proven, repeatedly, in other similar testing conditions. It’s not like they threw balls into streets or launched glitter cannons at LIDAR before pulling the NIST standard dummy in front of it.

Here’s a Tesla cultist – being cheered by a bunch of other cultists – who posted videos of the exact same type of failures.
Watch as FSD ignores a do not enter sign, a road closed sign, a metal pole, and then drives off the road and attempts to steer straight into a boulder on a simple roundabout: https://www.youtube.com/watch?v=EHljMEQA5WA
And that’s a totally different source, one which is well beyond friendly to Melon.

nlpnt
1 year ago

“Go fast and break things” was never meant to be literal, and it was certainly never meant to be “go fast and break kids”.

my goat ate my homework
1 year ago
Reply to  nlpnt

That’s one of those sayings that we’ll look back on and say “wow, what a bunch of morons”

ddavison512
1 year ago
Reply to  nlpnt

“Go fast and break things” is a development mantra.
It means: don’t stop and think about the effects of the change you are making.
Make it. Accept that things (the software’s functionality) will be broken, for a while.
Intended to get developers over the analysis paralysis we sometimes get into.
Then QA is supposed to step in, determine what is broken, and send it back to get it fixed.
So, within that context, yes, it is literal.

mber
1 year ago

There’s a reason that the ad buyer targeted Tesla. It’s because the other automakers- in particular, GM and Ford- limit Level 2 autonomy to roadways that don’t have pedestrians, traffic signals, or driveways. In other words, Interstate (or comparable) highways.
Since Tesla’s management insists on playing up “self-driving” in order to manipulate stock price, I think the ad buyer was spot-on.

Frank Carter
1 year ago

This was only aired in certain markets. It’s easy to understand why he is doing this… just like everything else in the world… follow the money. He wants to SELL his system as the only one to use. It’s pretty obvious to owners of Teslas that these systems aren’t perfect. Hell, regular “cruise control” would have the same results given the same scenarios. Ultimately the “driver” of a vehicle is responsible for its safe operation. I’ve seen cars without FSD or Autopilot being driven worse than the Teslas as people become more and more distracted.

Lady at a convenience store: “You feel safe letting that car drive itself?”
Me: (looks around) “I feel safer with this car driving than letting 90% of the people in this parking lot drive! At least the car is not staring into a phone as it attempts to stay in the lane.”

Frank Carter
1 year ago
Reply to  Frank Carter

BTW – First time caller, long time lurker.

rootwyrm
1 year ago
Reply to  Frank Carter

“He wants to SELL his system as the only one to use.”

Wow, spout Muskovite reactionary bullshit much there son?

Editor note: There is a way to debate without insulting people, we recommend giving it a try.

Dan O’Dowd doesn’t have a goddamn thing to do with ‘self-driving’ snake oil.
Green Hills Software makes general operating systems and development tools for embedded systems. Like Nintendo consoles. And military satellites.
They don’t make the software. They don’t make the ‘AI’ bullshit. They make operating systems and tools for others to build software on and with.

So they sure as shit are not trying to sell you a product that they do not fucking make.

Frank Carter
1 year ago
Reply to  rootwyrm

So he spent 600k to air commercials to help the public? Sure. I guess now he is our savior?

Canucksalaryman
1 year ago
Reply to  Frank Carter

He also rented a booth at the Transportation Research Board Conference in DC last month. Didn’t see many people actually stopping to talk to him.

UK2TX2CA
1 year ago
Reply to  rootwyrm

Green Hills Platforms for Automotive:
Automated Driving Systems

https://www.ghs.com/products/auto_adas.html

Frank Carter
1 year ago
Reply to  UK2TX2CA

UK2TX2CA – Well played…

rootwyrm
1 year ago
Reply to  Frank Carter

Yes, played himself like the pair of disingenuous trolls you are.

Editor note: Being a fan of Tesla doesn’t automatically make you a troll.

Hey Melon, you wanna come collect your employees? Aren’t they supposed to be fixing your Twitter view count?

MaxPoodling
1 year ago
Reply to  rootwyrm

Ew. I come here, and am a member here, because it is very much NOT this kind of community. Be better.

rootwyrm
1 year ago
Reply to  UK2TX2CA

Before opening one’s mouth to try and look smart, don’t. Because you aren’t. Also, try this crazy new concept we have called “READING.”

Editor note: Again, please don’t insult your fellow readers. It’s entirely possible to debate without doing that.

Here, I’ll help you.
“The Green Hills Platform for Safe and Secure Automated Driving Systems enables OEMs and their suppliers to achieve their business and technology goals in designing and manufacturing Levels 1-5 systems”
Since you seem to be struggling:
ENABLES OEMS AND THEIR SUPPLIERS TO ACHIEVE … GOALS IN DESIGNING AND MANUFACTURING
Still unclear on the concept?
THEY ARE SELLING A SOFTWARE DEVELOPMENT PLATFORM THAT CAN BE USED TO DEVELOP ADAS SYSTEMS. WHICH IS NOT FUCKING SELF-DRIVING SOFTWARE.

Frank Carter
1 year ago
Reply to  rootwyrm

Geez, grandpa, you don’t have to be an ass about it. But I guess this is the only place you get to be a badass anymore. lol.

tacotruckdave
1 year ago
Reply to  Frank Carter

C’mon, 2 aholes don’t create a set of titties.

Duke of Kent
1 year ago
Reply to  tacotruckdave

That expression made me lol. I will be stealing it and using it.

Ruivo
1 year ago
Reply to  tacotruckdave

That is horrible and gross. I loved it!

Trinch888
1 year ago
Reply to  tacotruckdave

Oh my! I never thought of that- but you are correct.

Frank Carter
1 year ago
Reply to  rootwyrm

So he is selling “something else”?

Ruivo
1 year ago
Reply to  Frank Carter

Welcome to the commentariat! And, I’m sorry, but your first answer is a disagree, on every level.
Tesla owners are NOT as well educated as you presume.
The argument here is exactly about why the driver isn’t in control (the vigilance problem) even though they are responsible.
“Worse drivers” is not really applicable – we’re talking about two entirely different kinds of driving. You are comparing a railway with an airline. Completely different challenges to the same end result.
I’m in the “not ready for prime time just yet (and maybe never)” camp. I do not consent to sharing a road with a drunk driver. I do not consent to sharing a road with a beta program that is still buggy but is driving around anyway to line the pockets of one guy. Both of these can swerve in front of me and kill me. Only one of them is actually illegal. Not great!

Ruivo
1 year ago
Reply to  Ruivo

*but the first answer you will get to a comment will be a disagree.
You see, our commenting system is still at Level 2, of course!

Manwich
1 year ago

I’m gonna guess “The Dawn Project” is mostly funded by oil companies, dealers and probably legacy car companies.

Supposedly their mission is “Making Computers Safe for Humanity
We Demand Software that Never Fails and Can’t Be Hacked”

But I’m calling BS on that.

As you said, their criticisms can be applied to all the other car makers offering systems like this.

And if they are soooo concerned about hacked software, why don’t I see anything on their website about the exceptionally poor security that makes Hyundai/Kia products so easy to steal?

Nah… this “project” is just another group of shills… just like the “CNW Marketing” group was (the jackasses who claimed a Hummer was more green than a Prius).

I predict that “The Dawn Project” will eventually go away, just like “CNW Marketing” did, once it starts getting too much heat for being shills spreading FUD, lies, half-truths and general bullshit.

Frank Carter
1 year ago
Reply to  Manwich

Agreed.

SquareTaillight2002
1 year ago

I have written weapons-aiming software currently operating in fighter jets that makes potentially life or death decisions so I know what kind of rigor must be applied to ensure it works every time. I just don’t trust the ‘move fast and break things’ bros to build things to that standard. Not to mention these AI-trained models are amazing and adaptable but also unpredictable.

I would love a self-driving car for mind-numbing commutes and long drives. Then I get in a modern vehicle with adaptive cruise control. At the first delayed decision point I shut it off. I just don’t trust it to stop or follow the road.

rootwyrm
1 year ago

I have experience in life-critical systems. (The inverse of that; if our shit goes down, people die. Obviously not what you want in a system for killing people.) Also in machine learning. And there isn’t a single ‘best practice’ at any of the ‘most brilliant innovative genius 20-somethings who don’t need to learn anything from history’ shops that passes even the lowest bar.

First, the ‘AI-trained models’ aren’t amazing or adaptable. They’re just GIGO – Garbage In, Garbage Out. Shit data, shit ‘models,’ and a whole lot of bullshit. ChatGPT isn’t ‘amazing’ or ‘adaptable.’ It’s literally just a Bayesian plagiarism engine. ‘Self-driving’ is even worse.
And those adaptive cruise systems? They’re not ‘AI.’ They’re not ‘smart.’ It’s literally “if sensor detects X under Y.min, reduce Z or apply A until Y over Y.min.” That’s it.
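Editor note: the threshold logic described above really is about that simple. Here’s a toy sketch in Python – all names and numbers are made up for illustration, not any manufacturer’s actual code:

```python
# A toy version of the "if sensor detects X under Y.min, reduce Z" logic.
# Purely illustrative; real systems tune these values per vehicle.

MIN_GAP_S = 2.0  # minimum acceptable time gap to the lead car, in seconds

def target_speed(gap_s: float, set_speed: float, current_speed: float) -> float:
    """Classic (non-AI) adaptive cruise: back off when the gap is too
    small, otherwise creep back up toward the driver's set speed."""
    if gap_s < MIN_GAP_S:
        # Too close: reduce speed until the gap opens back up.
        return max(0.0, current_speed - 5.0)
    # Gap is fine: resume the driver's set cruise speed gradually.
    return min(set_speed, current_speed + 2.0)
```

No model, no inference, no “understanding” of the scene – just a sensor reading compared against a threshold, exactly as the comment says.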

bertfrog
1 year ago

I agree with you 100% on this, Jason. My fear is that the vast majority of people driving wouldn’t give a damn about reverse Level 2, i.e. human driving with extra protection.

I don’t know how many times I’ve heard people say they are just waiting for “self-driving cars.”

What people really want is to ride in their own car (not a cab, tram or bus) and have essentially a robot chauffeur. They want to be completely checked out in the ride yet get there safely. Maybe they should hire that unemployed neighbor kid to drive them to the store or home from the bars. It would still be way cheaper than a mystical Level 4 Tesla (or other).

nickex55
1 year ago
Reply to  bertfrog

Lincoln will send you someone to chauffeur you around for $30/hr! If you have a Lincoln. And live in Miami.

TriangleRAD
1 year ago
Reply to  nickex55

Did somebody watch the latest Donut video today?

Nayrbflat6aficionado
1 year ago
Reply to  bertfrog

[embedded image]

ColoradoFX4
1 year ago
Reply to  bertfrog

My wife is one of those people just waiting for self-driving cars. She says she wants a box to pick her up and take her places. I tell her it already exists: a bus.

Beer-light Guidance
1 year ago

Unless the reverse Level 2 system is constrained to only intervene when the driver has been incapacitated in some way, I’m not sure it really is much better. Having the computer decide it knows better halfway through the moose test isn’t very helpful.
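Editor note: the gating this comment is asking for – intervene only when the human has clearly checked out, never mid-maneuver – could be sketched like this (hypothetical signal names, purely illustrative):

```python
def should_intervene(eyes_on_road: bool, hands_on_wheel: bool,
                     collision_imminent: bool) -> bool:
    """A 'reverse Level 2' gate: the computer may only take over when the
    driver appears incapacitated AND a crash is imminent, never to
    second-guess an engaged driver partway through, say, a moose test."""
    incapacitated = not eyes_on_road and not hands_on_wheel
    return incapacitated and collision_imminent
```

The point of the sketch is the AND: an attentive driver mid-swerve never trips the gate, no matter what the computer thinks of the maneuver.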

jesterspawn
1 year ago

I was going to say the same thing regarding a “reverse Level 2” system. If I’m fully engaged in driving, with my eyes on the road, the last thing I want is my vehicle spontaneously deciding to do something different, which already happens to some degree with standard safety systems.

I already have minor annoyances with my Odyssey occasionally trying to gently nudge my steering because it thinks I’m drifting out of my lane when I’m not. (Always on the same road, by the way; I suspect it may have something to do with the shadows from the powerlines at certain times of day being interpreted as a curve in the road.)

And I’ve had one case when I was driving (at an already reduced speed) around a parked bucket truck with its stabilizers protruding partially into my lane (while watching very carefully for any workers who might pop out from behind it) when my van braked hard, like full ABS braking, for no obvious reason. I assume it freaked out about the truck’s stabilizers, though I had plenty of room to get around; I’m just glad the vehicle behind me wasn’t following closely enough to rear-end me.

GranulatedMiscarriage
1 year ago
Reply to  jesterspawn

Then there are the Chicxulub-sized potholes in my area that require not being entirely in your lane, lest you destroy your suspension.

Mr. Asa
1 year ago

I was a bit concerned with this commercial, solely because the road in the lead-in shot is a crappy road.
If you are going to do this, you need to come at them 100% correct. Roads with lines down the middle and sides, clean edges, no gravel, etc. Don’t give them one single ounce of ability to sidestep it.

That being said, the videos from other points in the commercial were fantastic. But this is your lead-in shot. Do better, Mr. O’Dowd, for all our sakes.

rootwyrm
1 year ago
Reply to  Mr. Asa

Or accept that no test results, no matter how valid, controlled, over-generous, or audited, will change the minds of the cultists. And show the car on the roads you actually drive on.

Rest assured, if the video showed a completely uncut shot displaying the FSD version, an NHTSA test course, a dozen independent observers verifying the course measurements and pulley speeds, and the car plowing into the test dummy? The Cult of Musk would endlessly scream that it’s doctored, that it’s faked, they’re out to get Melon, etcetera.

MP81
1 year ago

This didn’t even play for us – watched the whole game, never saw that…

Data
1 year ago
Reply to  MP81

I took a nap during halftime, woke up just in time to see the teams coming back onto the field prior to the second-half kick-off. I also never saw this commercial, unless it aired during halftime while I was in a queso-induced coma.

TxJeepGuy
1 year ago
Reply to  MP81

It was a local market buy, so it didn’t air everywhere.

MaximillianMeen
1 year ago
Reply to  MP81

Huh, I saw it twice, once in each half. I’m not sure how local stations decide when to play local ads, but your station may be run by a Tesla fanboy who ran a local ad in its place. I know we missed the Willie Nelson Bic lighter ad, which is kind of surprising, as Willie is revered here (central Texas) on the same level as God, football, and pickups.

jgustin113
1 year ago
Reply to  MP81

It wasn’t a national ad but did appear in Washington, D.C., Austin, Tallahassee, Albany, Atlanta, and Sacramento (according to CNN).

Dave_Hudson
1 year ago

Excellent. Enjoyed your book, by the way.
