Are Airplane Autopilot Systems the Same as Self-Driving Car AI?


By Dr. Lance B. Eliot, the AI Trends Insider

As a frequent speaker at AI automated vehicle and self-driving car conferences, and as Executive Director at the Cybernetic Self-Driving Car Institute, I often get asked about the nature of airplane autopilot systems and how they compare to what is going on with self-driving car AI systems. I like the questions since they give me a chance to explain the similarities and differences between the two, plus an opportunity to burst some bubbles about the myths associated with both.

Here are the types of questions that I get asked, and which I will answer herein:

  •       Is an airplane autopilot the same as a self-driving car AI system?
  •       Can’t we just clone an airplane autopilot and use it to have ourselves a self-driving car?
  •       Flying a plane takes years of training and experience, so surely the airplane autopilot must be many times more sophisticated than what is needed for a self-driving car?
  •       Anybody can drive a car, so it must be much easier to develop AI for a self-driving car than for an airplane?

Let’s take a look at these questions and figure out what’s what.

First, let’s begin by reviewing what an airplane autopilot system does. There are a number of myths involved and the public perception is a far cry from the reality of what plane automation actually achieves.

A plane has various sensors to help gauge its speed, altitude, and other flight-related factors. You could say this is somewhat similar to the need for sensors on a self-driving car (see my article about sensor fusion on self-driving cars).

A self-driving car has perhaps radar, LIDAR (see my article about LIDAR), cameras, ultrasonic sensors, and various other sensory devices around it. Airplane sensors collect data during the flight, just as the sensors of a self-driving car collect data during a driving journey. So far, a plane and a self-driving car seem pretty much alike. Planes do have some sensors that a self-driving car lacks, and a self-driving car has some sensors not normally found on a plane, but we’ll set that difference aside and agree that both have sensors to collect important data while underway. That seems fair.

The sensory data is collected, and computer processors perform sensor fusion, using the data to guide and control the plane, and likewise for a self-driving car. In a self-driving car, there is a need to control the accelerator and brake pedals, the steering wheel, and the like, thus directing the car. Similarly, the airplane autopilot needs to be able to control and direct the plane, adjusting its direction, altitude, speed, and so on. Once again, the two seem about the same.
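To make the parallel concrete, here is a minimal sketch of that shared sense-fuse-act loop. Everything in it is hypothetical and simplified for illustration: the sensor names, the averaging stand-in for real sensor fusion, and the gain value are my own inventions, not any actual autopilot or self-driving car implementation.

```python
# A minimal sketch of the shared sense -> fuse -> act loop. All names,
# sensor fields, and gain values here are hypothetical, invented purely
# for illustration; real systems use far more elaborate sensor fusion
# (e.g., Kalman filtering) and control laws.

from dataclasses import dataclass

@dataclass
class StateEstimate:
    speed: float     # meters per second
    heading: float   # degrees
    altitude: float  # meters; effectively fixed for a car

def fuse(readings: dict) -> StateEstimate:
    """Combine raw sensor readings into one state estimate.
    Averaging two redundant speed sources stands in for real fusion."""
    speed = (readings["radar_speed"] + readings["wheel_speed"]) / 2.0
    return StateEstimate(
        speed=speed,
        heading=readings["compass_heading"],
        altitude=readings.get("altimeter", 0.0),
    )

def throttle_command(state: StateEstimate, target_speed: float) -> float:
    """Proportional correction toward the target speed."""
    K_P = 0.1  # arbitrary illustrative gain
    return K_P * (target_speed - state.speed)
```

The point of the sketch is only that the loop has the same shape for both vehicles; what goes on inside each step differs enormously in practice.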

Currently, even if an airplane autopilot is available on a plane, a pilot or flight-certified crew member must be present in the cockpit at all times. The human pilot is considered ultimately responsible for the operation of the plane. The same is true for Levels 1 through 4 of self-driving cars (see my article on the Richter scale for self-driving cars). At those levels, there must be a human driver present, and the human driver must be properly qualified to drive the car. The human driver needs to be ready to intervene if the self-driving car AI asks them to do so, or if they perceive the need to take over the controls themselves.

Now, for a Level 5 self-driving car, the rules change. A true self-driving car is a Level 5, meaning a self-driving car that can entirely drive itself, without ever needing human intervention. Simply stated, it is not necessary to have a human driver available. There is no equivalent right now for airplanes; airplanes are expected always to be under the watch of a human pilot.
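If it helps to see the rule spelled out, here is a tiny sketch encoding the human-driver requirement by level, per the scale just described (the function name and error handling are mine, purely for illustration):

```python
# Hedged sketch: the human-driver rule by automation level, per the
# Level 1 through Level 5 scale described above. The function name and
# error handling are mine, for illustration only.

def human_driver_required(level: int) -> bool:
    """Levels 1-4 need a qualified human ready to intervene; Level 5 does not."""
    if not 1 <= level <= 5:
        raise ValueError("automation level must be between 1 and 5")
    return level < 5
```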

Will we someday change that rule? Maybe, but it will probably happen well after we have Level 5 self-driving cars. The reason, perhaps, is that flying a plane carrying 300 passengers is considered a much more serious task than driving a car with a single occupant or a few more. Right or wrong in our perception of what the airplane autopilot can or cannot do, we will likely continue to insist for a long time that a human pilot be ready to take over the controls.

Things start to get more interesting as we move further into the details about what an airplane autopilot currently does.

Let’s begin by identifying what steps occur when we want to have a plane take us on a flight. Normally, the plane is parked at a terminal, and it needs to move away from the terminal and taxi to a runway position where it is ready for takeoff. Once at that position, it needs to take off from the ground and get airborne. Once airborne, the plane needs to climb and reach a desired altitude. After achieving that altitude, the airplane will usually stay there for a period of time, in what is considered cruising or level flight. Eventually, the plane will need to start its descent. Once the descent has brought the plane low enough and near a runway, the plane is taken into its approach. At the conclusion of the approach comes the landing, and the plane then usually needs to taxi to a place where it will be parked.

In recap: Taxi -> Takeoff -> Climb -> Cruise -> Descend -> Approach -> Land -> Taxi.
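That sequence is essentially a simple state machine, and a sketch like the following captures it. The phase names mirror the recap above; real flights can of course deviate, such as a go-around that loops from Approach back to Climb, which this toy version omits.

```python
# Sketch: the flight phases as a simple ordered state machine. Phase
# names mirror the recap above. Real flights can deviate (e.g., a
# go-around loops from Approach back to Climb), which this toy omits.

from enum import Enum, auto
from typing import Optional

class Phase(Enum):
    TAXI_OUT = auto()
    TAKEOFF = auto()
    CLIMB = auto()
    CRUISE = auto()
    DESCEND = auto()
    APPROACH = auto()
    LAND = auto()
    TAXI_IN = auto()

SEQUENCE = list(Phase)  # definition order is the nominal flight order

def next_phase(current: Phase) -> Optional[Phase]:
    """Advance to the next phase, or None once parked at the gate."""
    i = SEQUENCE.index(current)
    return SEQUENCE[i + 1] if i + 1 < len(SEQUENCE) else None
```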

Today’s airplane autopilots rarely handle the taxiing; it is expected that the human pilot will do so. This is kind of interesting because a self-driving car is, of course, all about “taxiing,” in that a Level 5 self-driving car must drive on a road without human intervention. Some say that after we’ve perfected self-driving cars, we should port the same AI capability over to airplanes.

Most airplane autopilots are not able to land the plane, and in those that do have a landing capability, it is rarely used. Normally, a human pilot will land the plane. The exceptions typically involve very adverse weather conditions. This at first seems counter-intuitive, since you would assume that the automation would handle the easy landings and the human would handle only the tricky landings in bad weather. The reason the autopilot might be used in bad weather is that it has instruments or sensors that can tell it things the human pilot cannot as readily ascertain by looking outside the plane or at the gauges. This, though, is a judgment call, and I’d wager that most experienced pilots would rather be at the controls than use the autopilot in adverse weather conditions.

Let’s now review what the airplane autopilot situation is:

  •       Taxi: Not today
  •       Takeoff: Can do, but rare
  •       Climb: Can do, but rare
  •       Cruise: Most usage
  •       Descend: Can do, but rare
  •       Approach: Can do, but rare
  •       Land: Can do, but very, very seldom
  •       Taxi: Not today

In essence, the bulk of airplane autopilot use occurs when the plane is cruising along in level flight. When nearly anything else is happening, the human pilot takes over the controls. Even during cruise, the human pilot might take over if there is a lot of turbulence or anything out of the ordinary happening.

I know that many movies, and public perception generally, suggest that a human pilot pretty much sits back and simply lets the autopilot fly the plane from one end of a flight journey to the other, but this is a myth. Another myth is that once the autopilot is engaged during the cruising part of the flight, the pilot can read a newspaper or otherwise do something that leaves them completely unobservant of the plane’s status. This is forbidden; the human pilot is fully expected to remain aware of the status of the plane at all times and be instantly ready to take over the controls.

In theory, the same is true for self-driving cars at Levels 1 through 4. Though some people falsely think that at those levels the human driver can be playing cards, that is not what the definitions indicate. The human driver is still responsible for the car and must be ready to intervene in the driving task. The only viable way to be able to intervene is to pay attention to the driving journey and the status of the self-driving car. We won’t be able to sit back and read the newspaper until we are making use of Level 5 self-driving cars, which are still a ways off in the future.

In fact, anyone who knows anything about airplane autopilots always says this: the airplane autopilot does not fly the plane; the human pilot flies the plane via the use of automation.

Notice the important distinction: the human pilot is always flying the plane, and he or she is merely using automation to assist. You need to think of Levels 1 through 4 of self-driving cars the same way. It is the human driver that is driving the car, using automation to do much of the driving task. Only once you get to Level 5 can you say that it is no longer the human driving the car, but instead the automation.

The airplane autopilot is mainly intended today for handling long stretches of a flight that are somewhat boring. Nothing unusual should be happening. In one sense, this is handy for a human pilot because they might become overly bored themselves during long stretches and begin to mishandle the plane. By allowing the autopilot to deal with the monotony, you pretty much know that the automation can remain alert and steady. This is good.

Of course, what can be bad are situations in which a plane cruising along a long stretch suddenly has an unexpected emergency, out of the blue. It can be tough for the human pilot to instantly re-engage in the flight. Many of the most famous crashes during cruise are due to the Human-Machine Interface (HMI) issues that arise when a plane abruptly asks the human pilot to intervene. It is easy for a human pilot to become inadvertently complacent during a long and mundane stretch of cruising.
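To illustrate the handoff problem in code form, here is a hedged sketch of a takeover request loop. The ten-second reaction budget and the callback names are invented for illustration, not drawn from any real avionics or automotive system.

```python
# Hedged sketch of a takeover request: alert the human, then wait a
# bounded time for them to take the controls. The ten-second budget and
# the callback interface are invented for illustration; real avionics
# and automotive handoff designs differ.

import time

REACTION_BUDGET_S = 10.0  # hypothetical allotted human reaction time

def request_takeover(alert, human_has_controls) -> bool:
    """Returns True if the human took control within the budget."""
    alert("Automation disengaging: take the controls now")
    deadline = time.monotonic() + REACTION_BUDGET_S
    while time.monotonic() < deadline:
        if human_has_controls():
            return True
        time.sleep(0.1)  # poll at 10 Hz
    return False  # escalate: fallback maneuver, louder alarms, etc.
```

The hard part, of course, is not the loop itself but what the reaction budget should be when the human has been disengaged for hours, which is precisely the complacency problem described above.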

Pilots that I know are often upset to hear the public say things like the autopilot is better than the human pilot, or that the autopilot flies the plane entirely and the human pilot is nothing more than an overpaid, glorified babysitter for the automation. If you want to get a human pilot really angry, go ahead and say this to their face. I dare you.

In fact, pilots often prefer to refer to the autopilot as an auto flight system. They think that the word “pilot” in autopilot misleads the public into believing that the automation is more far-reaching than it really is. I would argue that most autopilot systems aren’t even much in terms of AI. We’ve had the basics of autopilots for many years; these systems predate the latest advances in AI, and few of the more complex AI capabilities are currently involved in autopilots.

Tesla has gotten itself into some hot water by deciding to call its self-driving car capability “Autopilot” (I’ll capitalize it to distinguish the brand name from the common use of the word). Elon Musk seems to think the phrasing is apt because he wants people to leverage their myth-like understanding of airplane autopilots into assuming that his Tesla cars are equally impressive in their automation. Various agencies and governments have wanted Tesla to change the name of its automation, because Autopilot is felt to be a misleading moniker.

I’ve already predicted in my articles about product liability in self-driving cars that Tesla might eventually regret having used the Autopilot naming. At some point, once more self-driving car crashes happen (and I’m not saying that Tesla will be alone in having car crashes, since all self-driving car makers will have them once more self-driving cars are on the roads), a family member of someone harmed or killed is going to press the case that Tesla misled the public about what the automation could do. As evidence, the family could try to show that the brand name Autopilot and the word autopilot are purposely intended to confuse and mislead buyers and drivers of Teslas. I don’t know whether they can make that case stick, but I am sure that some lawyers will try.

Human pilots often affectionately refer to autopilot systems as “George,” a kind of wink-wink pet name. They know that the autopilot is rarely a single system, but rather a collection of several subsystems. The human pilot tends to act like an orchestra conductor, making sure that each subsystem is doing what it is intended to do. An analogy sometimes used by pilots is that they are like brain surgeons in a highly advanced and automated surgical operating room: the human doctor is still doing the operation, even if aided by highly sophisticated microscopes and biological cutting tools.

There are AI proponents who feel that pilots are keeping their heads in the sand and refusing to accept that airplane autopilot technology could be better. Or, some cynically say that the pilot unions are worried about pilot job losses; the unions supposedly would prefer that an autopilot not be able to completely handle a plane from end to end. Imagine the massive layoffs of pilots, and that we might eventually lose the skill to manually fly planes. That’s the doom-and-gloom picture often portrayed.

In terms of cloning an airplane autopilot to aid self-driving car AI, the answer is that we would not gain much from doing so. As mentioned, the autopilot generally handles the cruising aspects of the plane. Today’s self-driving cars are somewhat doing the same thing, in that most of them are only able to cruise down an open highway. They are simply doing the tricks of following lane markings and the car ahead of them (see my column on the pied piper approach to self-driving cars). Anything out of the ordinary requires human intervention. The plane autopilot and today’s self-driving cars match in that sense of simplicity of vehicle control.
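Those “tricks” amount to a couple of simple feedback rules, roughly sketched below. The gains and the 40-meter desired gap are arbitrary illustrative values; this toy proportional controller is my own simplification, not any automaker’s actual method.

```python
# A toy rendering of those lane-following "tricks": hold the lane center
# and keep a gap to the car ahead. The gains and the 40-meter desired
# gap are arbitrary illustrative values, not any automaker's method.

def steering_command(lane_offset_m: float) -> float:
    """Steer back toward the lane center; positive offset means the car
    has drifted right, so the command steers left (negative)."""
    K_STEER = 0.5
    return -K_STEER * lane_offset_m

def speed_adjustment(gap_m: float, desired_gap_m: float = 40.0) -> float:
    """Slow down when closer than the desired gap to the lead car,
    gently speed up when farther behind it."""
    K_GAP = 0.2
    return K_GAP * (gap_m - desired_gap_m)
```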

One could say that the plane even faces a less complex environment than a self-driving car does. Sure, an instrument panel on a plane is baffling and overwhelming to anyone not familiar with flying, but keep in mind that planes travel somewhat like trains. A train has tracks that force it to go certain ways. In the skies, for most flying and especially cruising, there are defined lanes. A plane is given coordinates to fly in a certain direction at a certain speed, and the air traffic controller tries to ensure that no other plane is in that same path.

When you drive your car, you aren’t given the same kind of clear path for where your car goes and what other cars around you are doing. A plane is normally kept clear of other planes, guided by the air traffic controllers. Cars are in a free-for-all most of the time. Yes, I realize that we have lanes on freeways, but there isn’t anything or anyone telling the car next to you to stay back, or opening up the lane to let you make a lane change. How often do planes crash into each other? Very rarely. When it happens, there is a big news blitz, so you might think it happens all the time, but it is actually very rare. Cars crash all the time.

Cars are faced with motorcyclists that can come within inches of your car. Pedestrians can jump in front of your car. Kids can throw bricks off an overpass, and the projectile can smash into your front windshield. Your tire can go over a nail and get punctured, causing rapid and sudden loss of steering control. On and on. Though things can go awry on planes, they are generally carefully maintained and flown away from areas that could harm them. We’ve all heard about the occasions when birds got sucked into a plane engine and the plane had to make an emergency landing, but these incidents are memorable precisely because they are rare.

In that sense, developing AI for a self-driving car is much harder right now than for an airplane autopilot. I know that some flight software developers will be irked by this statement, so let me qualify it. If we want an airplane to be more like a Level 5 true self-driving car, we definitely have an uphill battle creating software for planes that is that good. Having an autopilot that could do everything a human pilot might do, and cover all the permutations of things that any plane can encounter, is a very hard problem, I agree.

Human pilots require extensive training and experience so that they are ready for the 1% of the time when something goes awry. They need to save their own lives and the lives of the 300 passengers on the plane. For cars, we put teenagers through some pretty slim training and then toss them onto our roads. Heaven help us. They do, though, eventually seem to figure it out, and we are not overly beset with teenage mutant driver killers.

One aspect of autopilots that I really like is that the autopilot hardware and software are extensively designed and built for redundancy and resiliency. Planes often have multiple redundant hardware processors, allowing one processor to take over if another falters. There is redundant software too: the Space Shuttle’s flight software, for example, was developed in multiple versions, each double-checking the others. Few of the self-driving car makers are doing this.
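As a rough illustration of the multiple-version idea, here is a sketch of a voter that runs three independently developed channels and takes the median of their outputs, so a single faulty channel is outvoted. The channel interface is hypothetical.

```python
# Rough illustration of the multiple-version idea: three independently
# developed channels compute the same command, and a voter takes the
# median, so one wildly wrong channel is outvoted. The channel
# interface is hypothetical.

import statistics

def voted_command(channel_a, channel_b, channel_c, state) -> float:
    """Run all three versions on the same input and vote by median."""
    outputs = [channel_a(state), channel_b(state), channel_c(state)]
    return statistics.median(outputs)
```

The value of this design rests on the versions being developed independently, so that a bug in one implementation is unlikely to be repeated in the others.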

Self-driving car makers are not being as rigorous as those who developed autopilot systems. This is kind of crazy, since self-driving cars are going to be immersed in places and situations of greater complexity than what today’s autopilots handle. I realize the thinking is that if an autopilot falters, a plane falls out of the sky, whereas if a self-driving car AI falters, the car is not going to fall from the sky, plus the human driver can just take over the controls. But keeping in mind that we are heading toward Level 5 self-driving cars, we will need AI systems built with the rigor we expect of plane autopilots.

I hope this discussion about airplane autopilots and the AI of self-driving cars is helpful to you, and maybe when a friend or even a stranger asks you about the similarities and differences, you’ll now be ready to answer their questions. By the way, we don’t yet have a pet-like name for self-driving car AI. Recall that I mentioned autopilots are often referred to as George by insiders. I think we need to have a contest to determine a catchy insider name for self-driving car AI. I’ll start the voting and offer that we call it either Michael or Lauren.

This content is original to AI Trends.