State of government reporting on AI self-driving cars: we need a test


By Dr. Lance B. Eliot, AI Insider for AI Trends and regular contributor

The new phonebook is here! The new phonebook is here! You might recall that comedian Steve Martin made that famous exclamation in his now-classic movie The Jerk. He was referring to seeing his name in print (well, his movie character’s name), and being excited to know that he had finally made it to the big time. Likewise, last week the California Department of Motor Vehicles (DMV) released its eagerly awaited collection of so-called “disengagement” reports that were filed by companies that had registered to test their self-driving cars in California. The collection covered the time period of December 2015 to November 2016, and eleven companies filed reports: BMW, Bosch, GM’s Cruise Automation, Delphi Automotive Systems, Ford, Google – Waymo, Honda, Nissan North America, Mercedes-Benz, Tesla Motors, and Volkswagen Group of America.

So, I’ll say it, the new disengagement reports are here! The new disengagement reports are here! The reason these are worthy of rapt attention is that they help reveal the latest status of self-driving cars. The numbers provided are self-reported by the companies, and presumably forthright (else they’d be violating the DMV regulations), though there isn’t any independent third-party verification per se of their reported claims. I am not suggesting that any of the reports are false. Actually, the bigger issue is that the DMV opted to allow for flexibility in reporting, and so it is not readily feasible to compare the numbers reported by the companies. It’s a shame that a regulator (the California DMV) on the one hand has insisted on annual reporting, which I argue is handy, but at the same time did not provide clear-cut standards for doing the reporting. It is like a football game involving teams that pretty much get to make up the rules: you cannot compare them on the basis of touchdowns, since some of the teams count a touchdown as 6 points while others count it as, say, 4 points.

The California DMV needs to tighten up the reporting requirements. That being said, California’s DMV should be applauded for even requiring such reporting, while other states such as Michigan, Florida, Arizona, and Nevada do not. I don’t want to seemingly besmirch California for failing to offer clearer reporting requirements and somehow lose sight of the fact that most of the other states don’t require any reporting at all. To me, those states deserve an even harsher lashing. California has done the right thing, forcing the companies that want to test their self-driving cars on the public roadways to come forth with how well the testing is coming along and how much of it they are doing. I realize that some might argue that this is an over-intrusive requirement on these companies, and some might say that it discourages those companies from testing. Given that some states aren’t requiring the reporting, companies that want to keep their testing secret are sliding over to those states and avoiding having to publish their status. Or, in California, they stay off the public roads and do their testing only on private lands (a somewhat clever or, some say, sneaky way around the rules). I won’t focus further on the public policy implications here, and will just note that it is a factor to keep in mind about how self-driving cars are evolving and what the states are doing about it.

What do the numbers show? Google’s Waymo reported that their self-driving cars logged about 636,000 miles in California on public roadways during the reporting time period. That’s a staggeringly high number in comparison to the other ten companies, which combined were a mere fraction of that number of miles. For total miles logged and reported, Waymo accounted for about 97% of the miles and the other ten for about 3%. GM’s Cruise Automation came in second, reporting about 9,850 miles during the reporting time period. Some companies, such as Honda and Volkswagen, reported zero miles and indicated that they had not done testing on California’s public roadways during the reporting time period (at least as far as they interpret what the DMV regulation encompasses).
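As a quick sanity check on those percentages, here is a minimal sketch in Python. The 19,000-mile combined figure for the other ten companies is my own rough assumption, chosen only to be consistent with the approximately 3% share mentioned above, not a figure from the reports:

```python
# Reported California test miles, Dec 2015 - Nov 2016, per this article.
waymo_miles = 636_000   # Waymo's reported total
other_miles = 19_000    # combined total for the other ten companies (rough assumption)

total_miles = waymo_miles + other_miles
waymo_share = waymo_miles / total_miles

# Waymo's share works out to roughly 97% of all reported miles.
print(f"Waymo's share of reported miles: {waymo_share:.0%}")
```

The exact share shifts a bit depending on the assumed combined total for the other companies, but Waymo's dominance of the reported mileage is not in doubt.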

This brings us to one of the reporting metrics problems. The number of miles driven is an imperfect measure: a mile of open highway does not involve the same driving complexity as a mile of street-clogged, bumper-busting inner-city traffic. Not all miles were created equal, some aptly say. If I have my self-driving car use open highways where there is little human-like navigation required, this is a far cry from the human-like capability needed in more onerous traffic conditions. For example, much of Waymo’s mileage is apparently in the Mountain View area, along suburban streets. I would contend that these kinds of miles are a lot “easier” than, say, downtown San Francisco driving.

I’ll also point out something else that I’ve observed while in the Mountain View area. I’ve seen the Waymo cars quite often. I have also noticed that other human-driven cars are tending to give a lot of latitude to the Waymo cars nowadays (which are standouts due to their now well-known iconic shape and sensory gear). I mention this aspect because the next metric I am going to discuss is the disengagement counts. Waymo reported that they had a relatively tiny number of disengagements, 124 in total, which seems pretty darned good when you also consider that their self-driving cars went 636,000 miles or so. It is a good sign, but at the same time, I wonder how much of this is due to the fact that human drivers are changing their behavior to allow the self-driving car to drive without having to deal with true traffic conditions (i.e., humans not driving as wildly as they normally do). And, these cars are driving repeatedly on the same roads, over and over. This is vastly different from having to navigate more unfamiliar territory and figure out the idiosyncrasies of roads you’ve not been on before.

What does the word “disengagement” actually mean, you might be wondering? According to the California DMV, a disengagement involves the test driver taking over immediate manual control of the vehicle during a testing activity on a public road in California, and doing so because the test driver believed the vehicle had a technological failure or because they thought the self-driving car was not operating in a safe manner.  This might seem like an airtight definition.  It is actually full of tons of loopholes. Let’s start with taking over immediate manual control. For some companies, their viewpoint is that if the test driver waits say more than a few seconds then this is not considered an “immediate” circumstance and so is not counted as a disengagement. Is this a fair or unfair interpretation? Again, it should not be open to interpretation and a clearer standard should exist.

I can also imagine that there might be pressure placed on the human test drivers to avoid doing a disengagement. In essence, if you are a self-driving car company and you know that ultimately the whole world will be examining your number of disengagements, you would probably want to seek the minimal number that you can. This is not to suggest that anyone is telling test drivers to allow themselves to be put in jeopardy or jeopardize others on the road. It is simply another element to consider that the test drivers will each vary as to why and when they think a disengagement is warranted. Thus, again, the disengagement metric is not a reliably standardized metric. Tesla reported 550 miles in total, and 182 disengagements, which suggests that they had about 1 disengagement every 3 miles driven.  This would seem at first glance like a scary number. But, you need to keep in mind the number of miles, the conditions involved, how the disengagements were counted, etc.
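To make the arithmetic behind that “one every 3 miles” figure concrete, here is a small sketch using the Waymo and Tesla numbers reported in this article. Bear in mind, as argued above, that differing definitions of a disengagement mean these rates are not directly comparable between companies:

```python
# Self-reported figures for the Dec 2015 - Nov 2016 period, per this article.
reports = {
    "Waymo": {"miles": 636_000, "disengagements": 124},
    "Tesla": {"miles": 550, "disengagements": 182},
}

for company, r in reports.items():
    # Miles driven per disengagement: Waymo lands in the thousands,
    # Tesla at roughly 3 miles per disengagement.
    miles_per_disengagement = r["miles"] / r["disengagements"]
    print(f"{company}: one disengagement roughly every "
          f"{miles_per_disengagement:,.0f} miles")
```

Even this simple rate shifts dramatically depending on how “immediate” takeovers are counted and what kinds of roads were driven, which is exactly why a standardized definition matters.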

I have been especially irked by some of the national and worldwide reporting about the California DMV disengagement reports. One bold headline was that numbers don’t lie and that the reports presumably prove that self-driving cars are nearly ready to hit the roads unattended. This claim that numbers don’t lie is a sadly simplistic suggestion, and I think that all readers should recall the famous line of British Prime Minister Benjamin Disraeli: “There are three kinds of lies: lies, damned lies, and statistics.”  Another irksome headline was that the disengagement reports show that there are 2,500 or so problems with self-driving cars. This is a total of the number of disengagements reported, but it is disingenuous to suggest that somehow the count implies that self-driving cars had 2,500 “problems” per se.  Yes, a disengagement might have occurred because a self-driving car had a system failure, but it also could be that the test driver felt uncomfortable entrusting the self-driving car in a dicey traffic condition and so decided to take over manual control.

Please be cautious in interpreting the disengagement reporting. I will end on a more macroscopic note about self-driving cars, namely that we need to establish the equivalent of a Turing Test for self-driving cars. In AI, we all know that the Turing Test is a handy means to try and ascertain whether a system appears to embody human intelligence, doing so by trying to see whether humans can distinguish whether or not a system is interacting in a human intelligence-like fashion. Though imperfect as a test, it nonetheless is a means to assess AI-based systems. For a Level 5 self-driving car, considered the topmost level and implying that the car can be driven by automation in any manner that a human could drive it, there isn’t any specific testing protocol. Do we just let a self-driving car drive for a thousand miles and, if it does not need any disengagements, safely conclude it is a true Level 5? Or do we let it drive for 100,000 miles? As I’ve mentioned herein, miles alone are not sufficient, and disengagement is imperfect too. We need a more comprehensive, standardized test that could be applied to self-driving cars. If someone else doesn’t do it, I will, and I’ll probably call it the Eliot Test. You could be immortalized if you come up with one instead.