Why This Self-Driving Tesla Car Hit That Truck | Bumper 2 Bumper | Donut Media - YouTube
Channel: Donut Media
(upbeat music)
- In 2019, at Tesla's Autonomy Day for Investors, Elon Musk made a bold declaration.
- LiDAR is a fool's errand. Anyone relying on LiDAR is doomed.
- For context, LiDAR is a type of object detection sensor, and it's used by almost every manufacturer that's developing a self-driving car. Every manufacturer, that is, except Tesla. But just six months ago, this video was released showing a Tesla Model 3 barreling straight into a completely stationary overturned semi truck. The driver wasn't harmed, but claimed that the car was on autopilot before and during the crash. So why did the Tesla fail to detect the semi truck? How does an autonomous vehicle actually see the world around us? And would a car using LiDAR sensors, the sensors that Mr. Musk called a fool's errand, have been more likely to avoid the same crash? (music ends) We're gonna get into it. Let's go.
(upbeat music)
(electricity buzzes)
- Thanks to Omaze for sponsoring this week's episode of "Bumper 2 Bumper." Not yet, Doug. Not yet. Next week. Love you, though. You're my homie. If you guys couldn't tell already, we love teaming up with Omaze (soft music) because they give you, the fans, chances to win once-in-a-lifetime dream cars, all while supporting amazing causes, like the Ronald Reagan UCLA Medical Center, the same place that saved our very own Kentucky Cobra Mr. James Pumphrey's life, so we love them over there. The cars that Omaze offers are sick. I'm talking about a Porsche Cayenne GTS Coupe, a Ford F-250 that's fully customized by LGE-CTS, and how about this sweet Dodge Demon? And you could win. Just ask Sebastian, who won the Corvette Stingray we helped Omaze give away earlier this year. Hey, Sebastian. So don't miss out on the chance to win your dream car and support a great cause at the same time. Head on over to omaze.com/cars to check out some of the sickest cars. And while you're there, make a donation, 'cause who knows? You could win the car of your dreams. Let's get back to some "B2B."
(soft upbeat music)
- There are four main sensors that autonomous cars use to detect and analyze their surroundings. Before we dive into exactly what might have caused the Tesla accident, we need to understand how each of these sensors works. Probably the most common object-detecting sensor found in cars today is the ultrasonic sensor. Now, these sensors work by emitting a pulse of sound waves (device beeping) and measuring the time it takes for that pulse to reflect back off an object and return to the sensor. The more time it takes for the sound to return, the further away the object is. It's literally how bats work. (device beeping)
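The round-trip math just described can be sketched in a few lines. This is the generic time-of-flight relation; the speed of sound and the sample timing below are round illustrative numbers, not specs from any particular sensor:

```python
# Ultrasonic time-of-flight: the pulse travels to the object and back,
# so the one-way distance is half the round trip.
SPEED_OF_SOUND_AIR = 343.0  # m/s at roughly 20 degrees C (an assumption)

def distance_from_echo(round_trip_seconds: float) -> float:
    """Estimate distance to an object from an echo's round-trip time."""
    return SPEED_OF_SOUND_AIR * round_trip_seconds / 2

# An echo that returns after ~17.5 ms puts the object about 3 m away,
# right at the edge of typical parking-sensor range.
print(round(distance_from_echo(0.0175), 2))
```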
- We, of course, don't hear these sound waves because they're outside of the human audible spectrum. And ultrasonic sensors, they're cheap and often reliable, so it's probably the first type of detection system you would opt for if you were building a car. However, they do have one major drawback: they don't have a very long range. The reason sonar is so popular for marine applications is because sound travels much more effectively through water than it does through air. It's like this line of pool balls. If they're tightly packed together like water molecules, when you hit the ball on one end, that energy is quickly and efficiently transferred to the ball on the opposite end. However, if you space them out like molecules in air, when you try the same thing, (balls clinking) that energy is quickly dispersed. The energy from our initial hit can't travel very far. For this reason, ultrasonic sensors are most useful for detecting objects within about three meters of a car. Great for parking and blind-spot detection and understanding immediate surroundings, but not so great for seeing a car slam on its brakes 100 meters in front of you. (metal clanking)
- If only there were something like ultrasonics, but instead of sound, it used a signal that could travel through air over further distances. (whooshes) Hello? It's called radar? (letters crash) Oh, thanks, Mom. Radar, or radio detection and ranging, works a lot like ultrasonic sensors, but it uses radio waves in place of sound waves. Because radio waves have long wavelengths, they can cut through fog, dust, and rain with little interference, allowing radar systems to work no matter the weather conditions.
- Now, the systems are a bit more expensive than ultrasonics, but they can detect objects from a very far distance, which is why you'll usually see them on the front of cars detecting objects further down the road. Radar is great at determining an object's location and velocity, but it's not the most accurate in determining its size or composition. Because of the nature of radio waves, something highly reflective and small, like an aluminum can, can generate a similar signal to something larger but not so bright, like your mom. (record scratches) A radar sensor can be like, "Hey, there's something over there," or, "Oh, there's something over there, something down there." But it can never be like, "Hey, that's a car, that's a guy on a bike." It's just not possible. Radar just doesn't have the resolution to differentiate objects to that level of accuracy.
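To make "location and velocity" concrete: range comes from the return's round-trip time, and speed comes from the Doppler shift of the return. A minimal sketch, assuming a 77 GHz carrier (a common automotive radar band); the sample numbers are made up for illustration:

```python
C = 3.0e8  # speed of light, m/s

def target_range(round_trip_s: float) -> float:
    """Distance from the round-trip time of a radar return."""
    return C * round_trip_s / 2

def radial_speed(carrier_hz: float, doppler_shift_hz: float) -> float:
    """Closing speed from the Doppler shift of the return (two-way path)."""
    return doppler_shift_hz * C / (2 * carrier_hz)

# A return after ~1.07 microseconds is about 160 m out (the edge of a
# typical forward radar's range), and a ~14.2 kHz shift on a 77 GHz
# carrier is roughly 27.7 m/s of closing speed (about 100 km/h).
print(round(target_range(1.0667e-6)))
print(round(radial_speed(77e9, 14.2e3), 1))
```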
- If only there was a system like radar that used such precise signals of electromagnetic waves that it could recreate an accurate three-dimensional reading of its entire surroundings. (whooshes) Oh, hello? My insurance rates are about to go up? That's a scam. I don't have insurance. Thought that was gonna be my mom, huh? Well, luckily, there is a system that does just that: LiDAR. (lasers buzzing)
- LiDAR is a combination of the words light and radar, but it is now also accepted to mean light detection and ranging. It basically substitutes the radio waves of radar with...
- [Both] Lasers.
- Yeah, actual lasers, for real. A LiDAR sensor usually sits on the roof of the vehicle, and it emits millions of pulses of light in a radial pattern to build a 3D model of its surroundings.
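Each of those pulses comes back as two beam angles plus a measured distance, and stacking millions of them into a 3D point cloud is just a spherical-to-Cartesian conversion. A minimal sketch; the axis convention and the sample return are assumptions for illustration, not any specific sensor's format:

```python
import math

def lidar_return_to_point(azimuth_deg: float, elevation_deg: float,
                          range_m: float) -> tuple:
    """Convert one LiDAR return (beam angles + distance) into an x, y, z point."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # left
    z = range_m * math.sin(el)                 # up
    return (x, y, z)

# A pulse that comes back from 20 m dead ahead at sensor height
# lands at (20, 0, 0) in the cloud:
x, y, z = lidar_return_to_point(0.0, 0.0, 20.0)
print(round(x, 3), round(y, 3), round(z, 3))
```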
- This high-resolution model can help decipher objects in a way that would be impossible with radar systems. So while a radar or ultrasonic system can recognize that there's an object alongside you, a LiDAR system can recognize that it's a motorcycle and whether or not the rider is even wearing a helmet. However, because the lasers must use electromagnetic waves with shorter wavelengths, the light can't cut through things like heavy fog or rain. I also have to say they kinda look pretty ugly on top of a car. I mean, I don't know if you've seen them, but they're an expensive sensor that is not pretty to look at.
(melancholy music)
- Kinda look pretty ugly on top of a car. An expensive sensor that is not pretty to look at.
(melancholy music continues)
(soft upbeat music)
- But probably the biggest drawback of all three of these systems so far is that they can't actually see anything. If your car is going to drive itself, it needs to be able to read signs and tell if a light is red or green. If only there could be some sort of device that could... (whooshes) Hello?
- [Jerry] What are you talking to right now?
- Well, I'm talking to you.
- No, not me. What are you talking to right now? You're looking at it.
- Well, I guess I'm talking to a video camera. Oh ho!
- There you go.
- That was good.
- Oh, my gosh. How are we related?
- That was good, Uncle Jerry. Yeah, thanks for that. Okay, bye.
- Cameras. (letters crash) Almost every autonomous vehicle integrates some sort of camera system. The reason cameras are so useful is that they're very similar to the human eye, which is what our current road network is built around. We don't use sound to tell us when to yield, we don't use radio waves to indicate where the turning lane is, and we don't use different 3D shapes to tell us when a light is about to turn red. And because of this, cameras are the first step when it comes to seeing our road systems in a very human way. The computer can use camera footage to detect lane lines, street signs, (♪ I like that ♪) and if it's smart enough, just about anything else. But getting from a 2D image to a 3D interpretation takes a lot of work.
- Remember, an image has no three-dimensional data on its own. However, there are a couple tricks we can use to get us some three-dimensional data out of a bunch of two-dimensional images. Look at these two images. (camera shutter clicking) They were both taken by two cameras offset from each other by one meter. And notice as we switch between the two images that the objects in the foreground move more than the objects in the background. This is called stereo vision, and it's how humans use both of our eyes to perceive depth. (♪ I like that ♪) And it also shows how autonomous cars with multiple cameras can tell how far away an object is.
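That foreground-shifts-more effect is exactly what stereo depth estimation measures: the pixel offset (the disparity) of the same feature between the two images. Here's the standard pinhole-camera relation using the one-meter offset mentioned above; the focal length and pixel shifts are made-up numbers for illustration:

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth of a point seen by two parallel cameras: z = f * B / d.
    Larger disparity (more apparent shift) means the point is closer."""
    return focal_px * baseline_m / disparity_px

# With an assumed 800-pixel focal length and the 1 m camera offset:
# a feature that shifts 40 px between the images is about 20 m away,
# while one that shifts only 8 px is about 100 m away.
print(depth_from_disparity(800, 1.0, 40))  # nearer object
print(depth_from_disparity(800, 1.0, 8))   # farther object
```

Note the inverse relationship: halving the disparity doubles the estimated distance, which is one reason stereo depth gets noisy for far-away objects.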
- Now, look at these two images. These were taken by the same camera. However, in the second image, the camera has moved forward a bit. Notice how objects closer to the camera once again moved a greater distance than the objects further away? Well, this form of motion parallax can be used by a single camera as it travels through space.
- But these tricks alone can only get you so far. They won't help you read a street sign or tell the difference between a plastic bag and a tire, which is, I guess, a common problem in autonomous cars. Making those types of interpretations requires something you've probably heard of called machine learning. (letters trilling) And we don't have enough time to get into the nitty-gritty of how machine learning works, but it allows a computer program to learn and evolve over time. And if you've ever wondered why those little CAPTCHA tests always involve street signs and different types of vehicles, it's because you're helping train these AI systems. (♪ I like that ♪) Frickin' stealing your data, dude, and you didn't even know it. I honestly just found out (laughing) about this.
- The fact that camera systems rely so heavily on machine learning and are more difficult for computers to analyze in general is where the whole debate between LiDAR and cameras really kicks off, and where Tesla and seemingly everyone else disagree. A LiDAR sensor generates data that doesn't require a ton of interpretation to be useful. It immediately can inform the car's computer of an object's size and distance (fingers snap) right off the bat. And because of this, most autonomous cars in development are using LiDAR as their primary means to interpret the car's surroundings, and hope to rely on the cameras only to interpret signs, lane markers, and traffic signals. Now, Elon Musket, (slurping) on the other hand, he's banking that, with machine learning, the car's cameras can essentially do most of the heavy lifting, with some radar and ultrasonic sensors to help with general surroundings. It seems like his belief is that we are trying to replace human drivers who have two eyes and a brain, so we might as well use the technological equivalent of two cameras and a neural network.
(soft music)
- So back to that accident that we talked about in the intro. Why did this Tesla crash? And if it had a LiDAR system, would it have stopped in time? So the car in question here is a Tesla Model 3, and it has 12 ultrasonic sensors, eight cameras, and one forward-facing radar system. With a range of 160 meters, it is unlikely that the forward-facing radar failed to produce a detection. The issue was more related to how the computer interpreted that detection.
- Cars using radar have some issues with stationary objects. One theory suggests this is because we fly past stationary objects on the freeway all the time. Usually, they're side barricades or overpasses or signs. So the car's computer might have interpreted the overturned truck as consistent with one of these common unmoving objects. I mean, I can see how, with the low resolution of radar, that truck would generate a signal similar to an overpass. But as long as you have another reliable system to cross-reference, the computer should be able to determine if the approaching object has the potential for collision.
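One way to picture that cross-referencing is a toy fusion rule: trust radar on its own for moving objects, but act on a stationary return only when a second system confirms an obstacle in the lane. To be clear, this is a made-up sketch of the general idea, not Tesla's actual logic; every name and threshold here is invented:

```python
def should_brake(object_ground_speed_mps: float, range_m: float,
                 camera_confirms_obstacle: bool) -> bool:
    """Toy fusion rule for a forward radar detection.

    object_ground_speed_mps: the object's speed over the ground (radar
    closing speed with the car's own speed subtracted out).
    """
    STATIONARY_THRESHOLD_MPS = 0.5  # below this, treat the return as static
    RADAR_MAX_RANGE_M = 160.0       # the Model 3 radar range quoted above

    if range_m > RADAR_MAX_RANGE_M:
        return False  # no detection to act on yet
    if abs(object_ground_speed_mps) > STATIONARY_THRESHOLD_MPS:
        return True   # a moving object ahead: trust radar alone
    # Stationary returns (overpasses, signs... or an overturned truck)
    # are only acted on if the camera system also flags an obstacle.
    return camera_confirms_obstacle

# The failure mode described above: a stationary return the cameras
# can't make sense of gets filtered out, and the car never brakes.
print(should_brake(0.0, 100.0, camera_confirms_obstacle=False))  # False
print(should_brake(0.0, 100.0, camera_confirms_obstacle=True))   # True
```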
- And in this case, that system should have been the car's cameras. So why didn't Tesla's computer analyze the camera footage and realize that there was an overturned semi truck in the road?
- That's the million-dollar question, and I can't tell you. Maybe it just hadn't been trained in many situations that involved an overturned truck, so it couldn't make sense of what it was seeing. Now, if Tesla had been using a system that more precisely detected objects, like LiDAR, might it have been able to tell that the motionless object was actually a threat? I think so. LiDAR's pretty frigging good. There's a reason people are using it. But I really hate to make any of this sound like Tesla's fault. When you're on autopilot mode, you're supposed to still have your eyes on the road. And there are way more videos out there of self-driving cars actually saving people from accidents than there are of these very rare hiccups. So I think it's up in the air whether LiDAR will come out on top, or whether machine learning will advance enough that just a couple of cameras and a powerful computer will be able to navigate any road or scenario you throw at it. It's like iPod versus Zune. (crew laughing) Zune.
- So let me know what you guys think in the comments below. Thank you guys so much for watching this episode of "B2B." You can follow us on Instagram here at Donut, all the Donut guys, all the Donut fun, @donutmedia. You can follow me on Instagram @jeremiahburton. If there's a topic you guys are interested in that you want to see here on "B2B," put a comment down below. We'll see if we can make it happen, cap'n. And until then, bye for now.