Autonomous driving technology is still in its infancy, but it is evolving rapidly, and more companies than ever are testing autonomous vehicles. Tesla and Waymo are two prominent names competing to become the industry leader in the self-driving future, but they employ very different technologies to get there.
One of the major differences between Tesla and Waymo is the sensor suite they use for self-driving: Tesla cars rely on computer vision, while Waymo uses LiDAR.
LiDAR stands for Light Detection and Ranging, a system that uses lasers for detection. On-board sensors detect the reflected laser light and measure the distance to a target from the pulse's time of flight.
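The distance measurement itself is simple time-of-flight arithmetic: the sensor times the laser pulse's round trip and halves it. A minimal sketch (the 200 ns pulse time is an illustrative value, not a real sensor reading):

```python
# Time-of-flight ranging: the laser pulse travels to the target and back,
# so distance = (speed of light * round-trip time) / 2.
SPEED_OF_LIGHT = 299_792_458  # meters per second

def lidar_distance(round_trip_seconds: float) -> float:
    """Distance to the target in meters, given the pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after about 200 nanoseconds indicates a target ~30 m away.
print(round(lidar_distance(200e-9), 2))  # → 29.98
```

Because light is so fast, the sensor must resolve nanosecond-scale timing, which is part of what makes these units expensive.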
A computer vision system instead uses cameras to capture the environment and processes the images in real time through machine learning neural networks.
Advantages of Computer Vision
The main benefit of a camera-based system is that cameras cost far less to install on a car, and the system can be trained through machine learning to recognize objects. You can drop the car in an unfamiliar environment and it can still find its way.
Can read traffic signs
Another benefit of a camera-based system is that it allows the car to read signs. The current road infrastructure is designed for human vision, and the camera system works with that same visual information: it reads signs in real time and responds accordingly, as a human would.
Looks like a regular car
There are no odd-looking domes popping out of the body of the car in a camera-based setup. The built-in cameras allow for a very sleek design that preserves the aesthetics of the car.
Perhaps the only disadvantage of this system is that it takes far more machine learning and software development, and requires much more computing power.
Waymo currently uses LiDAR sensors on its cars to navigate the environment, with sensors on the front, top, and back of the car. There is also a vision system that aids in detecting the surroundings, plus an audio detection system that can hear police and emergency vehicle sirens.
Waymo's LiDAR-based system must first map a complete route before the car can drive on it, which requires detailed 3D maps of the area. Currently its driverless territory is limited to the Phoenix metro area, though the company hopes to expand in the future.
The cost of LiDAR is, of course, the major concern. Whether Waymo deploys these sensors in its taxi network or decides to sell the vehicles to the public, the cost will be huge. For example, the retail price of the Chrysler Pacifica Hybrid minivan that the company uses for autonomous conversion is $40,245. Add the cost of the LiDAR sensors, at around $7,500 per sensor, plus the computer system and all the rest of the tech needed for the conversion, and the end price would be over $100,000.
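A rough back-of-the-envelope tally using the figures above shows how quickly the price climbs. The sensor count and the catch-all conversion cost are illustrative assumptions, not published numbers:

```python
# Back-of-the-envelope cost of a Waymo conversion, using the figures above.
base_vehicle = 40_245   # Chrysler Pacifica Hybrid retail price
lidar_unit = 7_500      # approximate cost per LiDAR sensor
lidar_count = 4         # assumed: units covering front, top, and back
other_tech = 30_000     # assumed: compute, cameras, radar, integration

total = base_vehicle + lidar_unit * lidar_count + other_tech
print(total)  # → 100245, i.e. over $100,000
```

Even with conservative assumptions, the conversion roughly two-and-a-half times the base vehicle's sticker price.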
Tesla Model 3 vs Waymo’s Chrysler Pacifica
Compare that with Tesla: the Model 3 Standard Range Plus currently retails for $40,000, almost the same price as the Chrysler Pacifica Hybrid, but the Autopilot system is included. It comes standard with eight cameras, twelve ultrasonic sensors, and a front radar. The ultrasonic sensors measure the distance to a target using ultrasonic waves. A $100,000 Waymo versus a $40,000 Tesla Model 3: this huge price difference alone should make it clear who is on the right path.
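Ultrasonic ranging works on the same round-trip principle as LiDAR, only with sound instead of light, which is why it is limited to near-field duties like parking. A minimal sketch, assuming the speed of sound in air at roughly room temperature:

```python
# Ultrasonic ranging: emit a sound pulse, time the echo, halve the round trip.
SPEED_OF_SOUND = 343.0  # meters per second in air at ~20 °C (assumed)

def ultrasonic_distance(round_trip_seconds: float) -> float:
    """Distance to the target in meters, given the echo's round-trip time."""
    return SPEED_OF_SOUND * round_trip_seconds / 2

# An echo arriving after 10 milliseconds puts the obstacle about 1.7 m away,
# which is near-field range: useful for parking, not for highway speeds.
print(round(ultrasonic_distance(0.010), 3))  # → 1.715
```

The million-fold difference between the speed of sound and the speed of light is why ultrasonic sensors are cheap: millisecond timing is trivial compared to the nanosecond timing LiDAR needs.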
Waymo currently has around 600 self-driving cars on the road, with an order in for 62,000 more Chrysler Pacifica Hybrid vehicles and another for 20,000 Jaguar I-PACE vehicles.
Tesla has well over 600,000 cars on the road right now with hardware version 2 and above, all capable of full autonomy once Tesla releases those features through a software update. That figure is based on the third quarter of 2019; when fourth-quarter sales figures are released, the on-road count is expected to exceed 700,000 cars with v2 hardware. The number of cars on the road matters because more cars means more data collected from the fleet, which is vital for machine learning.
Autonomous miles driven
Waymo has recently publicized that it has driven well over ten million autonomous miles on public roads since 2009. While that seems impressive, particularly in comparison to other autonomous driving companies, Tesla cars have reached around two billion miles driven on Autopilot. That is a huge amount of data, and it is very beneficial for the Tesla engineers developing the computer vision software.
Tesla recently released a software update that enables navigation features on Autopilot, allowing vehicles on the highway to change lanes and take exits without any user input.
Another recently enabled feature, called Smart Summon, lets the driver summon the car to and from a garage or parking spot: the car drives itself to the owner's location with no one inside. This is a pretty extraordinary feature, and it also paves the way toward full self-driving.
Data is king
So in the race toward autonomous driving, data is king. The real world is very unpredictable: people do irrational things and sometimes ignore traffic laws, traffic signs, or traffic lights. Because of this unpredictability, the more real-world data available, the better the computer algorithms can perform.
Waymo has to pay its engineers to drive all those miles, map them out, and get areas geo-fenced, which is very expensive and time-intensive. So based on all these data points, who is going to win the race? It becomes fairly obvious when you consider the data, the features, and the number of cars on the road with the hardware necessary for autonomous driving. It's highly likely that Tesla gets there quicker than any other company.
Waymo currently operates at Level 4 according to SAE International, the organization that defines the autonomous driving levels, which run from 0 to 5. Tesla officially operates at Level 2 at this time. Since Waymo is at Level 4, it may look like Waymo is ahead of the game, but that Level 4 applies only in geo-fenced areas that have been intensively mapped using very expensive sensors. The cost of mapping out all such areas, even just in major cities, will be substantial.
Once Tesla solves computer vision, all they have to do is simply flip the switch and the entire fleet becomes autonomous.
Elon Musk on LiDAR vs Computer Vision
This is what Elon Musk said on Autonomy Day earlier this year:
“They’re all gonna dump LiDAR is my prediction, mark my words. I should point out that I don’t actually super hate LiDAR as much as may sound, but SpaceX Dragon [a reusable cargo spacecraft] uses LiDAR to navigate to the space station to dock.”
“SpaceX developed its own LiDAR from scratch and I spearheaded that effort personally, because in that scenario LiDAR makes sense, but in cars it’s frigging stupid. It’s expensive and unnecessary and as [Tesla AI director] Andrej [Karpathy] was saying, once you solve vision, it’s worthless. So you have expensive hardware that’s worthless on the car.”
“We do have a forward radar which is low-cost and is helpful especially for occlusion situations. So if there’s like fog, dust or snow, the radar can see through that. If you’re going to use active photon generation, don’t use visible wavelength because with passive optical you’ve taken care of all visible wavelength and stuff. You want to use a wavelength that is occlusion penetrating, like radar.”
“So LiDAR is just active photon generation in the visual spectrum. If you’re going to do active photon generation, do it outside the visual spectrum, in the radar spectrum. So like 3.8 millimeters versus 400 to 700 nanometers is gonna be a much better occlusion penetration and that’s why we have a forward radar.”
“And then we also have 12 ultrasonic sensors for near field information in addition to the eight cameras and the forward radar. You need the radar in the forward direction because that’s the only direction you’re going real fast. I mean we’ve got over this multiple times, like are we sure we have the right sensor suite? Should we add anything more? No.”
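The wavelengths Musk quotes above differ by roughly four orders of magnitude, and converting them to frequency with f = c/λ makes the comparison concrete. A rough sketch (the wavelength values are the ones from the quote):

```python
SPEED_OF_LIGHT = 299_792_458  # meters per second

def frequency_ghz(wavelength_m: float) -> float:
    """Frequency in GHz for a given wavelength: f = c / wavelength."""
    return SPEED_OF_LIGHT / wavelength_m / 1e9

radar = frequency_ghz(3.8e-3)        # ~3.8 mm, the forward radar's wavelength
visible_red = frequency_ghz(700e-9)  # 700 nm, the red end of visible light

print(round(radar, 1))     # → 78.9 (GHz, consistent with automotive radar bands)
print(round(visible_red))  # → 428275 (GHz, thousands of times higher)
```

The much longer radar wavelength is what lets it pass through fog, dust, and snow that scatter visible light, which is the occlusion-penetration point Musk is making.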