This past Sunday night, an Uber autonomous vehicle struck and killed a pedestrian in Tempe, Arizona. The local police have reported that the victim was 60 yards from the nearest crosswalk when she stepped out of the median into the vehicle’s lane, leaving no time for the vehicle, traveling about 40 mph, to avoid her. Nor did the human backup driver intervene.

While Uber is likely to settle any resulting lawsuits — fast — to protect its proprietary data, this accident raises significant questions about the safety of autonomous vehicles, and those questions won’t go away as easily. One way or another, there’s bound to be a considerable collection of data from the vehicle. How can ediscovery lawyers prepare for the onslaught?

Expect a dizzying array of data from the vehicle’s sensors. In addition to standard video cameras, Uber’s cars may also use nonvisual sensors, such as microphones or thermal imaging cameras, to detect objects that could intersect the car’s path. Radar, lidar, and GPS sensors also record important data points. Much of this will be familiar: most new vehicles already record their direction, speed, acceleration, and braking rates and, in many cases, detect nearby objects to warn of or prevent accidents.
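
To make the scale concrete, consider what even one moment of driving could generate. The sketch below is purely illustrative; the field names, structure, and values are assumptions for discussion, not Uber’s actual logging format or any manufacturer’s standard.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical illustration only: the field names and structure are assumed,
# not taken from Uber's (or any manufacturer's) actual logging format.

@dataclass
class DetectedObject:
    object_type: str          # e.g., "pedestrian", "vehicle", "unknown"
    distance_m: float         # distance from the vehicle, in meters
    closing_speed_mps: float  # relative speed toward the vehicle, in m/s
    source_sensor: str        # e.g., "lidar", "radar", "camera"

@dataclass
class TelemetryFrame:
    timestamp_utc: str            # when the frame was recorded
    gps_lat: float
    gps_lon: float
    heading_deg: float            # direction of travel
    speed_mph: float
    brake_pressure_pct: float     # 0 means no braking input
    camera_frame_ids: List[str]   # pointers to stored video frames
    lidar_scan_id: Optional[str]  # pointer to a stored lidar point cloud
    detected_objects: List[DetectedObject] = field(default_factory=list)

# A vehicle may emit many frames like this every second, each one referencing
# large binary sensor files (video, point clouds, audio) stored elsewhere.
frame = TelemetryFrame(
    timestamp_utc="1970-01-01T00:00:00Z",  # placeholder values throughout
    gps_lat=0.0, gps_lon=0.0,
    heading_deg=0.0, speed_mph=40.0,
    brake_pressure_pct=0.0,
    camera_frame_ids=["cam_front_000123"],
    lidar_scan_id="lidar_000123",
    detected_objects=[DetectedObject("unknown", 35.0, 17.9, "lidar")],
)
```

Even this simplified record hints at the preservation problem: the structured log itself is small, but it points to large volumes of video, lidar, and audio that would also have to be collected, reviewed, and produced.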

The bigger concern is how to collect, review, analyze, and interpret the data and algorithms that underlie these autonomous vehicles’ driving decisions. Autonomous vehicles learn not only from their own experiences but also from each other’s. Their decisions are often based on patterns gleaned from hours of collective driving, not on rules that anyone explicitly ‘told’ them. It will be daunting just to figure out what combination of driving experience and programmed rules this vehicle drew on in deciding not to slow down when approaching a person in the median (assuming, of course, that the vehicle sensed her presence). Collecting and reviewing the potentially relevant data will pose an unprecedented challenge for ediscovery professionals.
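
The difficulty is easier to see side by side. A hand-written rule can be read and explained; a learned policy’s behavior is encoded in numeric weights fitted from training data. The toy sketch below, written in plain Python with made-up numbers and bearing no resemblance to a production driving system, illustrates why ‘what the car was thinking’ may not be answerable by reading code alone.

```python
# Toy contrast (not Uber's software): a programmed rule whose logic is legible
# on its face versus a learned policy whose behavior lives in fitted weights.
from typing import List

def rule_based_braking(object_distance_m: float, speed_mps: float) -> bool:
    """Explicit rule: brake if time-to-collision falls below two seconds."""
    # Every part of this decision is readable: someone chose the 2.0-second threshold.
    return object_distance_m / max(speed_mps, 0.1) < 2.0

def learned_braking(features: List[float], weights: List[float]) -> bool:
    """Learned policy: the decision is a weighted combination of sensor features.

    The weights come from training on hours of collective driving; nothing in
    this function explains why they have the values they do.
    """
    score = sum(f * w for f, w in zip(features, weights))
    return score > 0.0

# Made-up weights standing in for the output of a training process.
weights = [0.8, -1.3, 0.05, 2.1]
features = [0.4, 1.0, 0.7, -0.2]  # hypothetical normalized sensor readings

print(rule_based_braking(35.0, 17.9))      # the reasoning is visible in the code
print(learned_braking(features, weights))  # the reasoning lives in the training data
```

Real systems replace that simple weighted sum with deep neural networks over high-dimensional sensor data, which only deepens the problem: the discoverable ‘decision logic’ is spread across training data, model versions, and fleet-wide updates rather than sitting in a file someone can print and read.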

After the accident, Uber immediately suspended its autonomous vehicle testing and announced that it is cooperating with the local police. This incident will likely serve as an early model for collecting and analyzing autonomous vehicle data, for assessing legal culpability, and for shaping regulatory oversight.