
Autonomous Vehicle Accidents: Who's Liable When Self-Driving Cars Crash?

Bond Legal · February 18, 2026 · 10 min read

Autonomous vehicles (AVs) are no longer a futuristic concept — they're on public roads across the United States right now. Tesla's Autopilot and Full Self-Driving (FSD), Waymo's robotaxis, and Cruise's autonomous fleet have collectively logged billions of miles. But with that mileage has come a growing number of accidents, injuries, and fatalities that raise unprecedented legal questions.

The Current State of Autonomous Vehicles

The Society of Automotive Engineers (SAE) defines six levels of driving automation, from Level 0 (no automation) to Level 5 (full automation). Most vehicles currently on the road operate at Level 2 (partial automation — like Tesla Autopilot) or Level 4 (high automation in limited areas — like Waymo robotaxis). No Level 5 vehicles are commercially available.
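For readers who think in code, the SAE taxonomy above can be sketched as a simple enumeration. This is an illustrative simplification, not the SAE J3016 standard text; the level names are paraphrased and the monitoring rule reflects the blog's summary (human supervision required at Level 2 and below):

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Illustrative paraphrase of the six SAE J3016 driving-automation levels."""
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2      # e.g., Tesla Autopilot: driver must supervise
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4         # e.g., Waymo robotaxis: driverless in limited areas
    FULL_AUTOMATION = 5         # no Level 5 vehicles are commercially available

def driver_must_monitor(level: SAELevel) -> bool:
    """At Level 2 and below, the human driver remains responsible for monitoring."""
    return level <= SAELevel.PARTIAL_AUTOMATION
```

The distinction this sketch captures is the legally significant one: at Level 2 there is a human whose failure to monitor can ground a negligence claim, while at Level 4 there is not.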

The NHTSA's Standing General Order (SGO) requires manufacturers to report crashes involving vehicles equipped with automated driving systems. Since 2021, hundreds of crashes involving AVs have been reported, including fatal incidents. Tesla alone has been linked to hundreds of crashes involving Autopilot or FSD, with NHTSA investigations ongoing.

Who Is Liable in an Autonomous Vehicle Accident?

This is the central legal question — and it's genuinely complex. Potential liable parties include:

- The vehicle manufacturer — for defects in the autonomous driving system's hardware or software. This is a product liability claim under strict liability, negligence, or breach of warranty theories.
- The software developer — if the AV software failed to detect a hazard, misinterpreted sensor data, or made a dangerous driving decision.
- The vehicle owner/operator — if they were supposed to be monitoring the system (Level 2) and failed to intervene. Tesla's terms require drivers to maintain attention even when Autopilot is engaged.
- The sensor/component manufacturer — if a LiDAR sensor, camera, or radar unit was defective.
- The mapping/data provider — if outdated or inaccurate map data caused the AV to navigate incorrectly.
- Government entities — if road infrastructure (signage, lane markings) was inadequate for AV systems to interpret safely.

Tesla Autopilot: A Special Case

Tesla occupies a unique position because its marketing aggressively promotes 'Full Self-Driving' capabilities while simultaneously requiring drivers to remain attentive. This creates a tension: if the system is truly capable of self-driving, why must the driver pay attention? And if it's not, is 'Full Self-Driving' false advertising?

Multiple lawsuits and NHTSA investigations have focused on this gap. For victims of Tesla Autopilot crashes, the legal strategy typically involves: product liability claims against Tesla for a defective autonomous system, negligence claims against the Tesla driver for failing to monitor the system, and potentially consumer fraud claims based on Tesla's marketing of FSD capabilities.

Waymo and Cruise Robotaxis

Fully autonomous robotaxis operating without a human driver present even starker liability questions. When a Waymo vehicle causes an accident, there is no human driver to blame — liability falls squarely on the technology company. Waymo and Cruise carry substantial insurance policies (typically $5M+) specifically to cover autonomous vehicle accidents.

For victims, this can actually simplify the claims process: there's no dispute about driver negligence, no comparative fault argument against the victim for the AV's actions, and a well-funded corporate defendant with ample insurance coverage.

Critical Evidence in AV Accident Cases

Autonomous vehicles generate massive amounts of data that can be pivotal in your case:

- Sensor logs — LiDAR, camera, radar, and ultrasonic sensor data showing exactly what the vehicle 'saw' before the crash.
- Decision logs — records of the AI's decision-making process, including what actions it considered and why it chose a particular response.
- Software version history — which version of the autonomous system was running, and whether known bugs or updates were pending.
- Over-the-air (OTA) update records — Tesla and others frequently update their software remotely. Was the vehicle running outdated software with known safety issues?
- Disengagement reports — records of when the autonomous system handed control back to the human driver, or when the driver took over.
- Telemetry data — speed, acceleration, steering inputs, and braking data in the seconds before the crash.
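Because manufacturers control this data and retention windows can be short, attorneys often track preservation requests category by category. As a purely illustrative sketch (the category names mirror the list above, not any manufacturer's actual data schema), such a checklist might look like:

```python
from dataclasses import dataclass, field

# Hypothetical preservation checklist for AV crash data; categories are
# taken from the list in this article, not from any real discovery schema.
EVIDENCE_CATEGORIES = [
    "sensor logs", "decision logs", "software version history",
    "OTA update records", "disengagement reports", "telemetry data",
]

@dataclass
class EvidenceChecklist:
    preserved: set = field(default_factory=set)

    def mark_preserved(self, category: str) -> None:
        """Record that a preservation letter covered this data category."""
        if category not in EVIDENCE_CATEGORIES:
            raise ValueError(f"unknown category: {category}")
        self.preserved.add(category)

    def outstanding(self) -> list:
        """Categories still needing a preservation demand, in original order."""
        return [c for c in EVIDENCE_CATEGORIES if c not in self.preserved]
```

The point of the sketch is the workflow, not the code: every category left in `outstanding()` is evidence that may be overwritten or lost if it is not demanded promptly.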

What to Do If You're Hit by a Self-Driving Car

1. Call 911 and specifically report that an autonomous or self-driving vehicle was involved.
2. Document the vehicle — photograph the AV's make, model, license plate, and any visible sensors or cameras. Note the company name (Waymo, Cruise, etc.) if it's a robotaxi.
3. Identify the vehicle's autonomy level — was there a human driver present? Were they attentive?
4. Preserve dashcam or phone footage of the AV's behavior before the crash.
5. Contact an attorney immediately — AV cases require significant experience, and manufacturers will move quickly to control the evidence.

Bond Legal stays at the forefront of autonomous vehicle litigation. As AV technology expands across the states where we practice, we're prepared to hold manufacturers accountable when their technology fails and innocent people are hurt.

Tags: autonomous vehicle, self-driving car, Tesla Autopilot, Waymo, product liability, car accident

Injured? Get a Free Case Review

Our attorneys are ready to fight for you. Contact us today for a free, no-obligation consultation.