NTSB Releases Preliminary Report on Fatal Uber Crash; Vehicle 'Saw' Victim 6 Seconds Before Impact

by Steph Willems

The Volvo XC90 that hit Elaine Herzberg on a darkened Tempe, Arizona street was traveling 43 mph when its sensors first picked her up. Guided by a combination of cameras, radar sensors, and lidar designed to cut through the gloom, the two-ton SUV “saw” the victim 6 seconds before impact, according to a preliminary report released by the National Transportation Safety Board.

The Volvo, operated by Uber Technologies, concluded 1.3 seconds before impact that emergency braking was needed, yet the brakes weren’t applied until just after the collision. And it wasn’t autonomous software that ended up sending pressure to the front and rear pistons. A human did that.

It isn’t the NTSB’s job to assign blame in its preliminary report. The agency simply wants to nail down the facts of what occurred in the lead-up to, and aftermath of, the fatal March 18th collision. Here are some key findings:

Herzberg entered the road 360 feet south of a marked crosswalk, dressed in dark clothing, in an area with no direct illumination. She didn’t turn her face towards the vehicle until the last moment, and the bicycle she was walking across the road had no side reflectors. A toxicology test on Herzberg turned up methamphetamine and marijuana use.

The vehicle blended Volvo’s own collision avoidance system (which includes automatic emergency braking) with Uber’s forward and side-facing cameras, radar, lidar, and navigation sensors. However, “The Volvo functions are disabled only when the test vehicle is operated in computer control mode,” the NTSB noted.

From the NTSB:

The report states data obtained from the self-driving system shows the system first registered radar and LIDAR observations of the pedestrian about six seconds before impact, when the vehicle was traveling 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path. At 1.3 seconds before impact, the self-driving system determined that emergency braking was needed to mitigate a collision. According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.

In the report, the NTSB said the self-driving system data showed the vehicle operator engaged the steering wheel less than a second before impact and began braking less than a second after impact. The vehicle operator said in an NTSB interview that she had been monitoring the self-driving interface and that, while her personal and business phones were in the vehicle, neither were in use until after the crash.

Three sentences here stand out: “According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.”

We’ve heard reports that Uber was concerned about the high number of “false positives” — objects in the vehicle’s path that don’t actually require sudden braking, such as a wind-blown plastic bag. If the vehicle was indeed programmed to initially ignore many objects on the roadway, why wouldn’t there at least be an alert sent to the human driver whose foot is hovering over the brake pedal? A simple rapid beep wouldn’t lead to erratic vehicle behavior, but it would increase safety.
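
To make that point concrete, here is a minimal sketch of the idea in Python. It is not Uber’s actual software; the names (PlannerDecision, respond) and the confidence threshold are invented for illustration. What it shows is that the brake command and the operator alert are independent outputs, so locking out one to guard against false positives doesn’t force you to lock out the other.

```python
# Hypothetical sketch only, not Uber's actual software. It illustrates the
# article's point: suppressing automatic emergency braking (to avoid reacting
# to false positives like a wind-blown bag) does not require suppressing an
# operator alert, because the two responses are independent outputs.

from dataclasses import dataclass


@dataclass
class PlannerDecision:
    emergency_brake_needed: bool  # planner concluded a collision is likely
    confidence: float             # detection confidence, 0.0 to 1.0


def respond(decision: PlannerDecision, computer_control: bool) -> dict:
    """Return the actions taken for one planning cycle."""
    actions = {"apply_brakes": False, "alert_operator": False}

    if not decision.emergency_brake_needed:
        return actions

    if computer_control:
        # Reported Uber policy: no automatic emergency braking under
        # computer control, to avoid erratic behavior on false positives.
        actions["apply_brakes"] = False
        # The article's argument: a warning could still be raised here
        # without touching the brakes. The 0.5 threshold is illustrative.
        actions["alert_operator"] = decision.confidence > 0.5
    else:
        actions["apply_brakes"] = True

    return actions


# A high-confidence detection while under computer control:
print(respond(PlannerDecision(emergency_brake_needed=True, confidence=0.9),
              computer_control=True))
# -> {'apply_brakes': False, 'alert_operator': True}
```

In this toy version, a detection confident enough to warrant emergency braking still produces a warning even while automatic braking stays disabled under computer control.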

Instead, we have a vehicle operator tasked with avoiding sudden emergencies whose eyes are monitoring a bright multimedia screen, not the darkened road. And no warning to alert the driver to impending danger.

An initial warning could have been sent out six seconds before impact. True, it’s possible the driver wouldn’t have been able to see Herzberg given the factors mentioned earlier, but at least the driver would be looking for something. Brake pressure might even have been applied prior to the vehicle’s conclusion that an emergency maneuver was necessary.
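
Some back-of-the-envelope arithmetic supports that. The deceleration and reaction-time figures below are assumptions chosen for illustration, not values from the NTSB report, but under any reasonable numbers the margins are large:

```python
# Back-of-the-envelope check of the timing discussed above. The deceleration
# and reaction-time values are assumptions for illustration, not figures from
# the NTSB report.

import math

MPH_TO_MS = 0.44704
v = 43 * MPH_TO_MS        # ~19.2 m/s, speed when the system first saw Herzberg
decel = 7.0               # assumed hard-braking deceleration on dry pavement, m/s^2
reaction = 1.5            # assumed human reaction time after an alert, s

stopping_distance = v ** 2 / (2 * decel)   # distance needed to stop from 43 mph
dist_at_6s = v * 6.0                       # distance to the pedestrian at first detection
dist_at_1p3s = v * 1.3                     # distance when the system called for braking

# Room left to brake if an alert had gone out at first detection:
room_after_alert = v * (6.0 - reaction)

# Rough impact speed if full braking had begun only at the 1.3-second mark:
remaining = max(v ** 2 - 2 * decel * dist_at_1p3s, 0.0)
impact_speed_mph = math.sqrt(remaining) / MPH_TO_MS

print(f"Stopping distance from 43 mph: {stopping_distance:.0f} m")
print(f"Distance to pedestrian at first detection: {dist_at_6s:.0f} m")
print(f"Braking room after a 6 s alert and {reaction} s reaction: {room_after_alert:.0f} m")
print(f"Rough impact speed if braking began 1.3 s out: {impact_speed_mph:.0f} mph")
```

Under those assumptions, a car at 43 mph needs roughly 26 meters to stop, a six-second warning leaves more than 80 meters of braking room even after human reaction time, and braking that begins only 1.3 seconds out still cuts the impact speed to roughly 10 mph.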

A video released by Tempe police shows the driver staring at the screen for four or five seconds immediately prior to the collision, then looking up and reacting to the sight of Herzberg in the car’s path, presumably less than a second before impact.

In the wake of the crash, Uber shut down on-road testing of its autonomous vehicle fleet, and just yesterday decided to pull out of Arizona altogether. The company recently hired a safety official to scrutinize its program.

“We have initiated a top-to-bottom safety review of our self-driving vehicles program, and we have brought on former NTSB Chair Christopher Hart to advise us on our overall safety culture,” Uber said earlier this month. “Our review is looking at everything from the safety of our system to our training processes for vehicle operators, and we hope to have more to say soon.”

The NTSB’s investigation into the crash continues.

[Image: Uber Technologies, via Tempe Police Department]

Comments
  • Wheatridger on May 24, 2018

    In this case you had a car's sophisticated, multi-sensored systems detecting a hazard ahead, but it had no way to alert the driver/monitor and no way to apply the brakes itself. So what good is it doing? Functionally, this sounds like the Google vehicle had nothing but the typical lane-following and speed control operating, plus a presumptive illusion of infallibility.

  • THX1136 on May 25, 2018

    Many mentions here - and from prior articles - about the lighting in the area and the pedestrian's dark clothes. What is being tested? The human safety driver or the vehicle? If it is the safety driver, was she told to monitor the vehicle, the road and vehicle, or the road? It changes the way the safety driver behaved in the situation. If it's the vehicle, the lighting of the area and the color of the pedestrian's clothes is moot. The original video and the later video posted by a driver in the same area (at the same hour of the evening with a camera set to better represent the actual conditions) show two very different views of the area in question. Again, if we're testing the vehicle the lighting conditions in the area are largely moot. Its systems do not primarily rely on visibility for their correct operation. My guess based on how this is being reported is the safety driver was told to monitor the vehicle's "read outs" primarily and the actual road secondarily. That would be the mindset of someone testing a vehicle in my mind. Observe the vehicle first as it will alert you to when you need to shift your attention to the road being traveled. I believe the human driver, had she been monitoring the road primarily or received inputs from the vehicle's systems, could have avoided the accident or, at the very least, would have changed the severity of the accident.
