LIDAR as Secret Sauce for Self-Driving Cars


By Dr. Lance B. Eliot, the AI Insider for AI Trends and a regular contributor

Is LIDAR the secret sauce for self-driving cars? I’ll explain what LIDAR is, and also offer insights about the two camps that fervently believe either that LIDAR is an absolute must for the advent of self-driving cars, or that LIDAR is optional and likely too expensive to be worth including. Besides the technology underlying LIDAR and what it does, I’ll also bring you into the world of mystery, spying, and intrigue that has recently been emerging around LIDAR, as evidenced by the lawsuit between Google’s Waymo and Uber, along with the recent ranking of self-driving car makers that put Tesla in 12th position, a much lower ranking than most would assume, and rated low due to Elon Musk’s posturing that LIDAR is not needed. Get ready for a wild and engaging ride on the story of LIDAR.

Have you ever seen a picture of a Google self-driving car? If so, you’ll notice that there is a kind of “hat” on the top of the car that looks like a flashing beacon or siren light, akin to what you might see on top of a fire truck or maybe an ambulance. Most people assume that this beacon or cylinder is there to warn other drivers that a self-driving car is in their midst. It seems almost like the kind of warning sign you see on a car being driven by a teenager who is learning to drive. Watch out, get out of the way, neophyte driver is here on the roads! Well, in the case of the Google self-driving car, you’d be wrong that the beacon is there for you. It is there to provide a crucial sensory capability to the self-driving AI: it is a device that emits a laser light beam and then receives the return, which helps the system identify nearby objects out and around the car.

Called LIDAR, the beacon is an essential sensory device for most self-driving cars. It sits purposely on the top or roof of the car so as to have an unobstructed view. Originally coined as a mashup of the words Light and Radar, LIDAR eventually also became known as Light Detection And Ranging, and some refer to it as Light Imaging Detection And Ranging. You will also see it spelled in different ways: some people use LIDAR, while others use Lidar, LiDAR, LADAR, and other variations. Whatever way you want to spell it or say it, the end result is that it is a sensory device that emits a laser light beam, gets back a reflection, and from the timing of that reflection tries to ascertain the distance between itself and whatever objects the light bounces off.

It is a range detector.
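To make that concrete, here is a minimal sketch in Python (purely illustrative, not drawn from any actual sensor vendor’s firmware) of the time-of-flight calculation at the heart of a range detector:

```python
# Time-of-flight range calculation, the core of what a LIDAR range detector does.
# Purely illustrative; real units also correct for sensor-specific timing offsets.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def range_from_echo(round_trip_seconds):
    """Distance to an object, given how long the laser pulse took to come back.
    The pulse travels out and back, so the round trip covers twice the distance."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A reflection that arrives 200 nanoseconds after the pulse was emitted:
print(round(range_from_echo(200e-9), 1))  # -> roughly 30.0 meters
```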

This range detector can be used to create a 3-dimensional mapping of what surrounds a self-driving car. The laser beam can be rotated in a circle, 360 degrees, and as it does so it is detecting the distances to nearby objects. The system can then reconstruct each of these range detections to try and create a kind of mental map of the surrounding area. There’s a large standing object over to the right, and a squat object over to the left of the car. Piecing together a jigsaw of these puzzle pieces, the system figures out that the large standing object on the right is a telephone pole, and the squat object to the left is a fire hydrant. Once the system figures this out, it can use higher-level logic to determine that it should avoid hitting the telephone pole and the fire hydrant, and it also “knows” those objects are there in case the self-driving car needs to take a sudden evasive maneuver and wants to go off-the-road to avoid a head-on collision.
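As a rough illustration of how a single sweep of range readings becomes points on a map, here is a simplified Python sketch (the angular resolution and data layout are assumptions for the example, not a real sensor’s format):

```python
import math

def sweep_to_points(ranges_m, angle_step_deg=1.0):
    """Convert one 360-degree sweep of range readings (in meters) into (x, y)
    points relative to the sensor. ranges_m holds one reading per angular
    step; None means no reflection came back at that angle."""
    points = []
    for i, r in enumerate(ranges_m):
        if r is None:
            continue
        theta = math.radians(i * angle_step_deg)
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# A crude example sweep: a cluster of returns about 10 meters away near 90 degrees
sweep = [None] * 360
sweep[85:95] = [10.0] * 10
print(len(sweep_to_points(sweep)))  # -> 10 points to hand off to the mapping logic
```

A real unit produces vastly more readings per second and in three dimensions, but the basic idea of turning angle-plus-distance into positions around the car is the same.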

Rather than the laser beam itself rotating, modern versions use rotating mirrors instead. This can speed up the range detection and also allow for gathering more data at once. There are single-lens LIDARs and there are multi-lens LIDARs; the former is less expensive and easier to process data from, while the latter is more expensive and requires greater data processing to handle. The amount of algorithmic processing of the collected data is tremendous. You need to take the raw data and reduce the noise and distortions, you need to do feature extraction to identify the skeletons of objects, you need to deal with the geometric facets and cope with the reflections from the objects, and so on.
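To give a flavor of what two of the earliest pipeline steps look like, here is a deliberately toy Python sketch (the thresholds and the naive clustering rule are made-up assumptions; production systems use far more sophisticated filtering and segmentation):

```python
import math

def filter_and_cluster(points, max_range_m=100.0, gap_m=0.5):
    """Toy versions of two early pipeline steps: drop implausible returns,
    then group neighboring points into candidate objects.

    points: list of (x, y) tuples from one sweep, in sweep order.
    max_range_m: returns farther than this are treated as noise.
    gap_m: a jump bigger than this between consecutive points starts a new cluster."""
    clean = [p for p in points if math.hypot(p[0], p[1]) <= max_range_m]

    clusters, current = [], []
    for p in clean:
        if current and math.hypot(p[0] - current[-1][0], p[1] - current[-1][1]) > gap_m:
            clusters.append(current)
            current = []
        current.append(p)
    if current:
        clusters.append(current)
    return clusters  # each cluster is a candidate object (pole, hydrant, car, ...)
```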

LIDAR has been around since the 1960s. This is a surprise to many in the self-driving car field, since they seem to think that LIDAR was invented just for self-driving cars. Nope. It has been used for all kinds of purposes, and for a great deal of that time it was used in airborne applications. There are lots of terrestrial applications too, including, for example, archaeology and farming. This is pretty much tried and true technology. That being said, there are continual advances taking place. Let’s discuss the impact of those advances.

One advancing aspect of LIDAR is that it is getting less expensive as a sensory device. The early versions on self-driving cars like Google’s were typically around $100K in cost (they were using a now older model from Velodyne, a vendor that makes LIDARs; it was the HDL-64E LIDAR sensor at the time). As you can imagine, we are not going to have self-driving cars for the masses if one sensor alone on a self-driving car costs $100K. This would push the cost of a self-driving car into the hundreds of thousands of dollars, after adding up all the other sensory devices and specialized software involved. Only the very wealthy could afford such a car. Furthermore, from the perspective of the car makers, they would only have a tiny market to sell the self-driving car into. The Holy Grail of self-driving cars is to sell into the masses. There are currently around 250 million cars in the United States and about 1 billion cars worldwide. Car makers are eyeing that they could ultimately replace all those cars with self-driving cars, and so that’s a huge market. Game on!

Another advancing aspect of LIDAR is that it is getting better and faster. If you are dependent upon LIDAR as a means to guide a self-driving car, you need the LIDAR to work very quickly. Realizing that a car is moving along at say 80 mph, you need a sensory device that can grab the range detections in real-time, and accurately, so that the AI of the self-driving car can figure out what is going on. With each second that passes, your car has moved forward about 117 feet. Think about that for a moment. In one second, your car has moved forward over one hundred feet. As your car moves along, it needs to rapidly ascertain what is ahead of it, what is to the right, what is to the left, and what is behind it.
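The arithmetic is easy to check, and it also shows how far the car travels between one sweep and the next (the sweep rate below is an assumed, illustrative figure, since actual rates vary by sensor model):

```python
# How far does a car travel between LIDAR sweeps at highway speed?
speed_mph = 80
feet_per_second = speed_mph * 5280 / 3600   # about 117 feet every second
sweeps_per_second = 10                      # assumed rotation rate of the sensor
feet_per_sweep = feet_per_second / sweeps_per_second

print(f"{feet_per_second:.0f} ft/s, roughly {feet_per_sweep:.1f} ft between sweeps")
# -> 117 ft/s, roughly 11.7 ft between sweeps
```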

Keep in mind too that the other objects around the self-driving car are not necessarily stationary, and thus you need the LIDAR to detect that another car is coming at you or veering toward you. The speed at which the LIDAR detects objects is crucial, since otherwise your self-driving car is “blind” as to what is happening. Suppose the LIDAR hiccups for even a brief second; it would be like you are driving your car and suddenly closing your eyes or looking away from the road. This split-second diversion could be the difference between life and death, with your car hitting someone else or going into a ditch.

LIDAR is notorious for not being able to reliably detect close-in objects, and so Google even mounted conventional radar black-boxes on the front and rear of their self-driving car. The LIDAR can also be obscured by other areas of the roof of the self-driving car, and so if you were to mount ski racks or something else on your self-driving car, you need to make sure that the LIDAR still has an unobstructed view. Moisture in the air has often been troubling for LIDAR too. If there is rain, snow, or fog, it can cause the laser light to bounce oddly, and so you won’t get back clear and usable reflections from objects. This is gradually being dealt with in newer versions of LIDAR.

The speed of processing is also being enhanced. Some believe that conventional silicon-based chips can’t handle the huge volume of range detections in a speedy manner. Researchers and startup high-tech firms are exploring the use of Gallium nitride (GaN) transistors, which can potentially process at faster speeds than silicon. Price is a factor again, and so if you get faster in one tech but the cost goes up, you need to balance that against slower tech that is less expensive. Indeed, there are LIDARs that are down into the mere hundreds of dollars cost range, but those are slower and tend to be such low-resolution that few believe they are tenable for use in a true self-driving car scenario.

Now that I’ve covered the fundamentals about LIDAR and its use for self-driving cars, we can shift into the intrigue part of the story.

You might assume that everyone believes that LIDAR is necessary for self-driving cars. It is usually used in combination with cameras and other sensory devices such as conventional radar. You might think of this as akin to a human who combines a multitude of their sensory capabilities for driving a car, such as the eyes, the ears, and so on. The self-driving car fuses together the data from a multitude of sensors, tries to map the world around the car, and the AI then figures out what the car should be doing. Get ready to be shocked when I tell you who isn’t using LIDAR.
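Before the reveal, here is a highly simplified Python sketch of what combining distance estimates from several sensors might look like. The readings and confidence weights are made-up numbers, and a naive weighted average stands in for the far more sophisticated fusion filters that real systems use:

```python
def fuse_distance_estimates(estimates):
    """Naive fusion sketch: combine distance estimates for the same object
    from several sensors, weighting each by an assumed confidence.
    estimates: list of (distance_m, confidence) pairs, e.g. from LIDAR,
    radar, and camera-based depth estimation."""
    total_weight = sum(conf for _, conf in estimates)
    if total_weight == 0:
        return None
    return sum(dist * conf for dist, conf in estimates) / total_weight

# Hypothetical readings for one object ahead of the car
readings = [(42.1, 0.9),   # LIDAR: precise range, high confidence
            (41.5, 0.6),   # radar: good at range and velocity, coarser position
            (44.0, 0.3)]   # camera depth estimate: least certain of the three
print(round(fuse_distance_estimates(readings), 1))  # -> 42.2
```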

Are you sitting down? Tesla is not using LIDAR. Furthermore, Tesla appears to have no interest in using LIDAR. According to Elon Musk, he doesn’t believe that LIDAR is a capability needed for self-driving cars. His comments about LIDAR have drawn both criticism and praise. Those that praise his views believe that LIDAR is a misleading path and that we don’t need it for self-driving cars. We can do what is needed with the other sensory devices, they say, and using LIDAR is unnecessary. Why bother with something that you don’t need and will only increase the cost of the self-driving car?  On the other hand, the camp that says LIDAR is essential is just about all the other self-driving car makers.  Yes, Tesla is pretty much alone in their view that LIDAR is unnecessary.

Notice that Elon Musk has not said that LIDAR is bad or wrong. He believes that LIDAR is applicable for other kinds of applications, such as for his spaceships. He just doesn’t think it is worthy for self-driving cars. Cynics say that he wants to avoid having to retrofit all of the existing Teslas with LIDAR, which would be quite costly. Supposedly, he claims that LIDAR is not needed because he didn’t use it at the start, and now that he’s far along on his self-driving cars, adopting it would be costly and would also make it look like he was “wrong” not to adopt LIDAR earlier.

A recent ranking of self-driving car makers even put Tesla into the lowly position of 12th place, primarily because Tesla is not using LIDAR. The camp that believes in the importance of LIDAR has cheered this ranking and kind of thumbed their nose at Elon Musk. The camp that believes LIDAR is not needed has suggested that the ranking was biased by techies that favor LIDAR and so it is an unfair ranking. If you were ranking baseball batters and believed that the use of an aluminum bat was better than a wood bat, and your ranking was based on the type of bat used, you can imagine the rancor that would erupt after the ranking was published. Does it really make a difference which bat you use? Shouldn’t the batter be judged based on the outcome of their batting? Some believe that a ranking that treats the use or non-use of LIDAR as a crucial factor ought to be tossed out, by the same logic that judging batters on the type of bat they swing seems questionable.

How important is LIDAR? You might recall that in a previous AI Insider column I mentioned the lawsuit of Google’s Waymo against Uber. In that lawsuit, Google contends that a self-driving car executive left Google and founded a self-driving truck company, Otto, which was then bought up by Uber, and furthermore Google alleges that the former executive downloaded a bunch of documents before he left Google. Those documents were purportedly about LIDAR. Google is doing their own proprietary research into LIDAR and trying to advance LIDAR technology, which, as I’ve mentioned here, is an especially crucial element of the Google self-driving car strategy.

Recently, Uber went into court and denied that they have used anything that might have been taken from Google. Uber seems to be claiming that they could only find one document that might have been taken from Google, out of the 14,000 that were allegedly taken. Uber also indicates that the Google research was about single-lens LIDAR, while Uber is forging ahead with multi-lens LIDAR, and so, Uber says, it has not tried to leverage the Google proprietary LIDAR, even if it had it. Uber has also tried the classic “their lawsuit is baseless” tactic by throwing other aspects into the mix. So far, the judge doesn’t seem to be buying into Uber’s positions and it appears that Uber is going to have a lot more explaining to do.

Beyond the intrigue, the point is that LIDAR is a secret sauce for some self-driving car makers. In fact, pretty much for nearly all of the self-driving car makers. The potential for LIDAR is gigantic: if the preponderance of self-driving cars are built to require LIDAR, it will mean that LIDAR devices will ultimately be needed on, say, 250 million cars in the United States and maybe 1 billion cars worldwide. For those that see big dollars ahead, many are investing in LIDAR makers right now. This is a bit of a bet that you are taking, though. If Tesla is right that we really don’t need LIDAR for self-driving cars, the market will likely want to keep the cost of self-driving cars as low as possible, and so maybe chuck out the LIDAR due to its added cost. This is reminiscent of the 1980s, when there was a war between the Beta and VHS formats. Those that bet on VHS won, while those that bet on Beta took a hit. Should you load up your stock portfolio with LIDAR makers? You decide, and about five to ten years from now, we’ll know if you were right in your decision.

This content is original to AI Trends.