1/ Another FSD thread

Several $TSLAQ have DM'd me asking for my thoughts on this video, so let's do this.

This is a video of a THEORY of how $TSLA might get to level 4 self driving.

But it's just that: a theory. A hypothesis. A test.

And it will fail.
https://twitter.com/ThemeTeamWP/status/1260416170060120064
2/ Skip the first 6 minutes. It's all just PR spin and nothingness. $TSLAQ has seen it before.

The next 8 minutes are detailing why stop signs (and computer vision overall) are hard.

And yeah, stop signs are pretty hard. They're surprisingly not uniform and have lots of edge cases.
3/ 14 minutes in is where things get interesting.

Sorry, did I say "interesting"?

I meant "fluffy."

These numbers are not impressive.
4/ One funny thing he says on this slide is that the predictions can "never regress," which is a huge red flag: how do you know you're not at a local maximum?

Another funny thing he says:

"a few dozen"

that's how many people $TSLA has working on FSD
5/ So "a few dozen" people who are years away from a product in a competitive space are somehow worth billions in enterprise value. LOL.

Another objection: throughout this video, it's weird that MOST of his examples deal with STATIONARY objects. Stop signs, curbs, etc.

Moving on
6/

This slide is yet another absurd way to approach Full Self-Driving.

He says here that they've standardized how someone might make a "new task," such as detecting that this police car has its caution lights on.

The flaw?

Scale. There are too many tasks.
7/

This means that $TSLA must come up with an exhaustive list of every single thing they want to detect when operating the vehicle.

Again, THIS IS WHY LIDAR IS NEEDED.

Imagine an object that $TSLA has never seen before is in the middle of the road.

Like a deer.
8/

Camera: "Oh, $TSLA has never trained me to see deer, so ima keep driving and hit the deer mmmmk?"

LIDAR: "FUCK THERES SOMETHING BIG IN THE MIDDLE OF THE ROAD RIGHT IN FRONT OF US FUCK FUCK FUCK FUCK FUCK HARD STOP RIGHT NOW"
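Sidebar for the nerds: the reason LiDAR doesn't care WHAT the object is can be sketched in a few lines. LiDAR hands you raw 3D points, so "something big directly ahead" needs no trained deer detector. Toy class-agnostic check over a made-up point cloud, not anyone's actual stack:

```python
import numpy as np

def obstacle_in_path(points, lane_half_width=1.5, max_range=30.0, min_points=20):
    """Class-agnostic check: are enough LiDAR returns inside the
    corridor directly ahead of the car? No deer detector required.
    points: (N, 3) array of (x forward, y left, z up) in meters."""
    ahead = points[:, 0] > 0
    in_lane = np.abs(points[:, 1]) < lane_half_width
    close = points[:, 0] < max_range
    above_ground = points[:, 2] > 0.3   # ignore returns from the road surface
    hits = points[ahead & in_lane & close & above_ground]
    return len(hits) >= min_points

# A deer-sized cluster 15 m ahead trips the check even though
# nothing was ever trained to recognize "deer".
deer = np.random.normal(loc=[15.0, 0.0, 0.8], scale=0.3, size=(100, 3))
road = np.column_stack([np.random.uniform(0, 30, 500),
                        np.random.uniform(-5, 5, 500),
                        np.zeros(500)])
print(obstacle_in_path(np.vstack([deer, road])))  # True
print(obstacle_in_path(road))                      # False
```

A camera-only system has to have LEARNED the thing to flag it. The point cloud just has to be dense enough.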
9/

Camera: "Oh I see a new lane opening up between these two lines. No way for my vision system to detect the crash attenuator / median but I see the two lane lines so ima keep going."

RIP Walter Huang
10/

Other challenging things he conveniently skips over:

1) What do you do when vision is occluded?

2) What do you do with camera blind spots?

3) How much compute power are you using TODAY with the tasks you have trained TODAY and how will you scale this?
11/

This slide has absolutely no content.

"what's funny about software 2.0 is you can take code from your software 1.0 codebase and put it in to your 2.0 codebase"

LOL? ARE YOU FOR FUCKIN' REAL?

SHOCKER. You mean I can REUSE CODE? FOR REAL? OMG!
12/

Neat. You're making a BEV model. Welcome to every other FSD stack.

How much compute power did you use to generate this?

How'd the pedestrian at 22:02 feel when you failed to yield? LOL ARE YOU FUCKING KIDDING ME?
13/

Seriously, $TSLAQ, look at this

This guy almost died.
14/

Hey $TSLAQ

See how many pedestrians you can spot in this image.

LIDAR sees ALL of these pedestrians. Their camera doesn't.

And HOLY FUCK THE CAR IS EVEN MAKING A LEFT TURN.

You'd think seeing the pedestrian in the crosswalk would be REALLY important.
15/

And before you go "hey it's just a car demo," the pedestrians do briefly light up as pedestrians.

Notice that the pedestrian crossing the crosswalk is visibly marked just a few frames before.

So clearly, they've got a busted pedestrian "task model."
16/

God I just love this demo.

The car is ACTIVELY rolling into the intersection and still doesn't see the pedestrian that's DIRECTLY IN ITS PATH.
17/

He prefaces this piece saying "anyone can learn how to get depth from images. Easy stuff. Learn it in CS class."

Which raises the question...

... if this is AI 101, why is EVERY OTHER COMPANY using LiDAR for depth projection?

What, did they all fail their CS masters?
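And to be fair, he's right that the math is textbook. Pinhole stereo gives depth Z = f·B/d. Toy sketch with made-up focal length and baseline values — the point is how badly a single pixel of disparity error hurts at range, which is exactly why "easy stuff" doesn't mean "reliable stuff":

```python
def stereo_depth(disparity_px, focal_px=1000.0, baseline_m=0.12):
    """Textbook pinhole stereo: depth Z = f * B / d.
    focal_px and baseline_m are made-up example values."""
    if disparity_px <= 0:
        raise ValueError("no match / infinite depth")
    return focal_px * baseline_m / disparity_px

# One pixel of disparity error swings the estimate by meters at range:
print(stereo_depth(4))  # 30.0 m
print(stereo_depth(3))  # 40.0 m -- one pixel off, 10 m of error
```

LiDAR measures that depth directly instead of inferring it, which is the whole trade everyone else is making.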
18/

Alright, what's funny about this segment (around 23:00 - 24:00) is the presumption that other autonomous driving companies WANT the LiDAR.

News flash:

No one LIKES LiDAR. No business executive, at least.
19/

LiDAR is expensive. It's expensive to build. It's expensive to calibrate. It's expensive to label and train on.

If Waymo, Cruise, Zoox, etc. could get rid of LiDAR, they absolutely would.

And, I absolutely GUARANTEE that all of these companies regularly R&D it.
20/

It's not hard for them. As mentioned in previous threads, all these companies take a segmented approach. Karpathy himself even says "further down the stream" in here.

Just train models without LiDAR, unplug the LiDAR, see how the car works.

This is easy simulation too.
21/

FSD systems, even $TSLA's, work this way:

- See the world around you
- Predict what will happen next
- Use those predictions to drive

Just input real world raw data of what the car "saw" and save as simulation. See how car performs.
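That replay idea, as a toy sketch (hypothetical names, nobody's real pipeline): record frames, drop the LiDAR channel, re-run perception, diff the outputs.

```python
def replay(log_frames, perception, use_lidar=True):
    """Open-loop log replay: run perception over recorded frames,
    optionally stripping the LiDAR channel, and collect detections."""
    results = []
    for frame in log_frames:
        sensors = dict(frame)          # copy so the log stays intact
        if not use_lidar:
            sensors.pop("lidar", None)  # "unplug" the LiDAR
        results.append(perception(sensors))
    return results

# Toy perception: "sees" an obstacle only if some channel reports it.
def perception(sensors):
    return any(ch.get("obstacle", False) for ch in sensors.values())

log = [{"camera": {"obstacle": False}, "lidar": {"obstacle": True}},
       {"camera": {"obstacle": True},  "lidar": {"obstacle": True}}]

print(replay(log, perception, use_lidar=True))   # [True, True]
print(replay(log, perception, use_lidar=False))  # [False, True]
```

Diff the two runs and you know exactly what the LiDAR was buying you. That's a hackathon project, not a moonshot.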
22/

I absolutely guarantee that every FSD company has a regular R&D hackathon/investigation to see if a camera-only approach would work.

ALL of them want to get rid of the LiDAR. Doing so would be MAGICAL from a cost/business perspective.
23/

They don't because it won't work.

Moving on.

At 24:45ish, Karpathy calls human drivers "data labelers."

This is a lie.

I'm biased, but I think I see it in his body language too. He knows this isn't factually accurate.
24/

But really, it's just not true. Again, the data isn't coming off the routers. Anyone with a $TSLA can easily monitor their car's MAC address and upload/download bandwidth usage.

Most of it is boring log files.

The car has 8 GB of storage.

There just isn't enough data here.
25/

Another objection to this presentation:

At no point does the video demonstrate predictions about an object's movement.

For example, there's never an arrow pointing out of a car showing which way it's moving and how fast.

There's never an arrow showing the FSD car's plan.
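And the missing arrow isn't exotic. A constant-velocity extrapolation from two tracked positions is the baseline every prediction stack starts from — toy sketch, obviously not $TSLA's code:

```python
def predict_positions(p_prev, p_now, dt, horizon_steps=3):
    """Constant-velocity baseline: the 'arrow' is a velocity estimated
    from two tracked positions, extrapolated forward in time."""
    vx = (p_now[0] - p_prev[0]) / dt
    vy = (p_now[1] - p_prev[1]) / dt
    return [(p_now[0] + vx * dt * k, p_now[1] + vy * dt * k)
            for k in range(1, horizon_steps + 1)]

# Car tracked at (0,0) then (1,0) over 0.1 s -> 10 m/s heading +x.
print(predict_positions((0.0, 0.0), (1.0, 0.0), dt=0.1))
# [(2.0, 0.0), (3.0, 0.0), (4.0, 0.0)]
```

If your "AI for Full Self-Driving" deck can't show even this much, what exactly is the AI driving on?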
26/

So the presentation is on "AI for Full Self-Driving"

And yet all they've seriously demo'd is "object identification in a single frame"

No demo of how the car predicts the world around it

No demo of how the car actually navigates itself

A complete fluff demo.
27/

So even if I grant $TSLA that they can do sufficient image recognition (which they fucking can't)

but even if I grant them that

all FSD companies got that part. All of them obviously have immense perception capabilities.

Where's the demo of the next step?
28/

Where is the car predicting how the objects around it will move?

Might be something you want to know at a stop sign: is the car to my left moving? Are they stopping? Can I go?
29/

And where is the car evaluating what maneuvers it can do?

You say that you're getting tons of human driver data labelers:

where are the results of training on that data?
30/

$TSLAQ, or any $TSLA investor, or anyone seriously looking for a robust financial analysis of this,

go to one of those firms that gets you an hour on the phone with an AI expert

show them this video

ask them how confident they are in $TSLA's approach
31/

Because I guarantee they will go

"ok, so they have a deep understanding of computer vision

how are they going to get the car to drive?"
32/

I find it so funny, because I really think @karpathy is in over his head.

He might be a smart guy, and maybe he is making some breakthroughs in computer vision that look impressive to Elon

but computer vision is one piece of the FSD monster

where's the rest of it?
33/

Christ, just look at how much time they're wasting just to reliably show they can recognize objects and the world around them.

Meanwhile, other companies are racking up the miles on ordinary streets.
34/

I suppose "local maximum" is a good way to describe $TSLA's approach, and why they're just so screwed.

Elon is attached to eking out tiny performance gains at a local maximum.
35/

Zoox is testing in Las Vegas
Waymo is testing in Mountain View + Phoenix
Cruise is testing in San Francisco
Nuro is testing in Houston

All of these companies have ACTUAL self-driving cars on ACTUAL roads TODAY.

Why doesn't $TSLA?
36/

After all, @elonmusk LOVES hyping up the stock price and fucking $TSLAQ in the ass.

Stock would rally 500 points tomorrow if $TSLA announced "we are testing 100 FSD cars at location X today."

But a claim of that scale is impossible to fake.
37/

$TSLA's approach to FSD is fundamentally flawed.

The contractual promise of full self-driving capability they've made over the last 5 years can never be fulfilled.

It's Theranos right before they pulled the trigger and started testing real patients.
38/38

It's a hole Elon can't get out of.

It was fraud the moment he recognized FSD revenue.

He will get caught.

(end)
You can follow @ValueDissenter.