“Safety drivers” are a sham and a scapegoat: Uber and Desert Bus

This past weekend, one of Uber's autonomous vehicles struck and killed a pedestrian, Elaine Herzberg, in Tempe, Arizona. Video reveals that even as Elaine crossed in front of the car, the car did not stop or even slow. She was well within range of the vehicle's LIDAR, and local residents have posted their own videos showing that there was plenty of light to see by as well.

It's clear that Uber's reckless and callous company culture has produced an unsafe vehicle; in retrospect, it was always going to be Uber that took the first life. But what of the so-called "safety driver" sitting in the driver's seat? Dashcam video shows that she was looking down at the time of the accident (perhaps at a phone). Is she not to blame?

I submit that blaming the safety driver is both useless and dangerous.

What is a safety driver for? We are told that they are there to take over control from the car if the car fails to act properly. Fair enough! But what does that actually entail? What is the actual, human experience of performing this task?

Admittedly I have no experience as a safety driver, but as a human who has done many human things including a fair bit of driving, I've got a pretty good guess.

Learning to do nothing

If the car performs reasonably well, the safety driver's correct behavior (when observed from the outside) is to do... nothing. And if the car is being allowed onto public streets, one would hope that the car does perform well 99.999+% of the time. So picture yourself at the wheel: The car is cruising along, braking for stoplights, merging, doing all the usual driving things. OK, you're coming up to a stoplight, and it's just turning yellow: Do you hit the brakes?

Maybe if you're brand new to this, you hit the brakes, because you're used to it. A few hours in, you've learned to suppress your reflexes. But sometimes you're not sure if the car has got it—you and the car are going to make different judgments and follow different speed-reduction curves. Sometimes the car will play it more conservatively, and there's no problem—each time this happens, you learn "the car's got this". At other times, you're more worried than the car is, responding to social cues it doesn't pick up on. So you brake, or steer, and then (presumably) have to take over and restart everything, and it turns out the car was actually doing just fine. "Maybe I'll just trust the car next time."

The experience of being a safety driver, if the car mostly works, must inevitably become "trust the car, do nothing, suppress your reflexes". Even if you spot a hazard, you're going to have to fight your learned passivity, costing precious fractions of a second.
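And how rare would the moment that matters be? Here's a back-of-envelope sketch; both numbers are entirely my own assumptions, not anything measured:

```python
# Back-of-envelope: how rarely does a 99.999%-reliable car need a human?
# Both figures below are illustrative assumptions, not measured data.

decisions_per_second = 1.0    # assumed rate of meaningful driving decisions
failure_rate = 1 - 0.99999    # car mishandles 1 in 100,000 of them

seconds_between_failures = 1 / (decisions_per_second * failure_rate)
print(f"One intervention needed every {seconds_between_failures / 3600:.1f} hours")
# -> One intervention needed every 27.8 hours of continuous driving
```

Whole shifts of doing nothing, punctuated by moments that demand a split-second response: that's the shape of the job.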

Desert Bus

But it gets worse. If you haven't heard of Desert Bus, allow me to quote from Wikipedia:

The objective of the game is to drive a bus from Tucson, Arizona, to Las Vegas, Nevada, in real time at a maximum speed of 45 MPH. The feat requires eight hours of continuous play to complete.

The bus contains no passengers, there is little scenery aside from an occasional rock or bus stop sign, and there is no traffic. The road between Tucson and Las Vegas is completely straight. The bus veers to the right slightly, and thus requires the player's constant attention. If the bus veers off the road it will stall and be towed back to Tucson, also in real time. If the player makes it to Las Vegas, one point is scored. The player then has the option to make the return trip to Tucson for another point, a decision which must be made in a few seconds or the game ends. [...] Although the landscape never changes, an insect splats on the windshield about five hours through the first trip, and on the return trip the light fades, with differences at dusk, and later a pitch black road where the player is guided only with headlights. The game cannot be paused.

Does playing Desert Bus sound like fun? No, of course not. It's not supposed to be fun; it's a joke game. Driving is exhausting. Highway hypnosis causes regular drivers to literally fall asleep at the wheel.

But somehow, safety drivers get paid to do something arguably even more mind-numbing than Desert Bus: Sure, the scenery might be more interesting the first few times around, but in Desert Bus you get to steer. The safety driver isn't even allowed that small liberty, that small engagement. It's Desert Bus, but if you look away, someone dies.

It's the worst of both worlds: As a regular driver, you're engaged, active, in a heightened state of awareness as you watch for dangers and plot a course; as a passenger, you're free to relax, read a book, make conversation, flip through your phone. The safety driver has neither outlet.

The safety driver is doomed to failure. No matter how carefully they watch the road at first, after the first hours, days, weeks... their attention will drift. The car seems to know what it's doing, but more importantly, the safety driver is dangerously bored and suffering from attention fatigue. Humans just don't work this way. We can't sustain attention for long periods, or even for relatively short periods when every moment is just like all the others: this is the well-documented "vigilance decrement", in which detection performance on monitoring tasks degrades measurably within the first half hour. [Update: A few people have noted a similarity to the TSA, whose screeners are expected to be vigilant for hours amid a continuous stream of false positives. And the results are abysmal.]

The safety driver is doomed to fail.

Who is to blame?

Is the safety driver at fault in this incident? Certainly. She was looking at her lap for 5 seconds, according to people who have viewed the rear-facing dashcam video; she was negligent as the nominal operator of the vehicle. But I would contend that, depending on her training and time already spent in the car, the alternative might have been her staring blankly into space through the windshield, and still failing to notice the pedestrian.

(What of the pedestrian? "She wasn't using the crosswalk," I've heard people say. First, there are no crosswalks in that area. Second, and in any case, the punishment for jaywalking is not the death penalty. Enough said on this.)

The safety driver was being paid by Uber to sit in the driver's seat of a car that she had been told was pretty damn good at driving. She had observed this for herself. The primary safety mechanism in place was the autonomous vehicle's software, and the safety driver was supposed to be just that: a safety. A backup. The car failed, and failed miserably. Whether or not it understood that Elaine was a pedestrian, or became confused by her outline because she was walking a bicycle, the fact remains: The car failed to stop or even slow for a large object crossing its path. Elaine was visible and on a road-crossing trajectory for (I hear) at least 1.4 seconds before the collision. Human drivers are frequently called upon to act in much less time than that. It's a tremendously long time at road speeds, and if a self-driving car can't slow to a sub-fatal speed or take evasive action in that time, it is a danger to the public and must not be permitted on the road.
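To make "a tremendously long time" concrete, here's a rough braking calculation. The speed and deceleration are my assumptions (early reports put the car at roughly 40 mph; ~7 m/s² is a typical full-braking figure on dry pavement), not official crash data:

```python
# Back-of-envelope: how much speed can hard braking shed in 1.4 seconds?
# All figures are illustrative assumptions, not official crash data.

MPH_TO_MS = 0.44704              # miles per hour -> meters per second
initial_speed = 40 * MPH_TO_MS   # ~40 mph, per early reports (assumption)
decel = 7.0                      # m/s^2, roughly full braking on dry pavement
delay = 0.2                      # s, assumed detection-to-brake latency
braking_time = 1.4 - delay       # seconds actually spent decelerating

final_speed = max(0.0, initial_speed - decel * braking_time)
print(f"Impact speed: {final_speed / MPH_TO_MS:.0f} mph, down from 40 mph")
# -> roughly 21 mph; pedestrian collisions are far more survivable at that speed
```

Even with generous allowances, braking alone would have cut the impact speed nearly in half. The car did not brake at all.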

From the little I know, I feel that criminal law has not adapted well to software issues, or even to corporations. (Civil law is doing better, I think, but seems largely to come down to monetary settlements.) I don't know what's actually feasible here. I don't know how deep the NTSB's investigation will go, or how much power they will have to dig into Uber's records, culture, and technology. But what I want is this: a full sociotechnical incident analysis and, as appropriate to the findings, for Uber to be fined; for Uber's autonomous vehicle program to be barred from public roads pending some condition; and for decision-makers within Uber to be charged with criminal negligence contributing to vehicular manslaughter. If we permit fault to come to rest solely on the safety driver, the corporate culture will not change, corporations will continue to literally get away with manslaughter, and more deaths will follow.

Sidebar: Liability

There's another role safety drivers can play, a way they can be useful to the company. Uber is known for liability shifting. Look at how they manage to shift liability onto their drivers, who must pay their own health insurance, car insurance, and fines for operating unlicensed taxis. I have a dark suspicion that part of the role of the safety driver, unbeknownst to them, is to be thrown under the bus (so to speak) in case of a fatality. We'll see what happens next.

Note that Uber moved their testing from California to Arizona for the low-regulation environment. Note also that the Tempe police department has been extremely quick to comment on the case and even to dismiss the idea that Uber was in any way to blame, breaking with the common practice of neither commenting on ongoing investigations nor opining publicly on fault. I believe it is likely that Arizona's government will find ways to take the heat off Uber so as to keep their business, and that these are just the early signs of it.

Update: Worse than I thought

Shortly after posting this, I found an NYTimes article: Uber’s Self-Driving Cars Were Struggling Before Arizona Crash. A salient quote:

Waymo, formerly the self-driving car project of Google, said that in tests on roads in California last year, its cars went an average of nearly 5,600 miles before the driver had to take control from the computer to steer out of trouble. As of March, Uber was struggling to meet its target of 13 miles per “intervention” in Arizona, according to 100 pages of company documents obtained by The New York Times and two people familiar with the company’s operations in the Phoenix area but not permitted to speak publicly about it.
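To put those two figures on the same timescale, here's a rough conversion; the average speeds are purely my assumptions:

```python
# Rough conversion of the two quoted intervention rates into time.
# The average speeds are assumptions on my part, purely illustrative.

def minutes_between_interventions(miles_per_intervention: float, avg_mph: float) -> float:
    return miles_per_intervention / avg_mph * 60

uber = minutes_between_interventions(13, 20)      # assume ~20 mph city driving
waymo = minutes_between_interventions(5600, 30)   # assume ~30 mph mixed driving

print(f"Uber: one intervention roughly every {uber:.0f} minutes")  # ~39 minutes
print(f"Waymo: one roughly every {waymo / 60:.0f} hours")          # ~187 hours
```

One intervention every two-thirds of an hour versus one every several weeks of full-time driving: whatever else these numbers mean, they are not the same job.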

Now, "average miles between interventions" is a very crude metric. It would be affected by driving conditions, driving setting (highway vs. city), and operator training. But holy cow, 13 miles? I don't even know what to do with that. It certainly undercuts my post, since Uber's safety drivers are certainly getting practice. And then the article goes on to talk more about inattention, and about Uber's decision to make one operator responsible for the duties of both safety driver and data gatherer—which often took the form of the safety driver poking around on an iPad while at the wheel. Fantastic. Of the dashcam video from the recent crash, « It also appeared that the driver’s hands were not hovering above the steering wheel, which is what drivers are instructed to do so they can quickly retake control of the car. » Here, try an experiment: Sit down, and pose your hands over an imaginary wheel. How long can you hold that posture? For me, fatigue started to set in after just a minute or two.

So maybe all of what I wrote above applies not so much to Uber as to the other companies. It sounds like Uber had far more immediate and unsubtle problems than safety-driver attentional fatigue.

Bonus link, which I haven't read (it's pretty long) but which discusses some of these issues: The Challenges of Partially Automated Driving (Communications of the ACM, 2016).

Update 2: Science

I'm also curious whether any of these companies have done research on what makes a safety driver effective. Have they run controlled experiments to measure what kind of training actually produces effective interventions? What sort of duty cycle or schedule allows the operator to remain sharp, focused, and able to perform an intervention? It seems like an important bit of research.

