FRIDAY, MAY 15, 2026 · VOL. XXVI · NO. 17
Cars

Autonomous Vehicles Keep Failing the Unscripted Parts

Three stories, three different failure modes — and one industry still pretending edge cases are exceptions.

By Chasing Seconds · MAY 15, 2026 · 3 minute read

Photo · Carscoops

There's a version of the self-driving future that exists only in press releases. Clean intersections, orderly merges, polite handoffs between machine and road. Then there's the version playing out in real American streets right now: a Waymo getting swallowed by a Texas flood, dozens of empty robotaxis circling a confused Atlanta neighborhood like they forgot why they came, and Tesla quietly unredacting seventeen crash reports it had been calling confidential business information.

That last one is worth sitting with for a second. Confidential business information. Crashes. Filed with federal safety regulators. Hidden.

Three Failures, One Pattern

According to Carscoops, Waymo has issued a recall affecting thousands of vehicles after one of its robotaxis drove into a flooded road and was carried away by the water. The company has since paused its Texas operations while it works on a fix. The vehicle apparently couldn't reliably judge whether a road was passable — one of those scenarios that a sixteen-year-old with a learner's permit would handle by instinct, by the look of the water, the angle of the current, the way other drivers were behaving around it. The machine read none of that. It drove in.

Meanwhile, Jalopnik has been tracking something stranger and somehow funnier in suburban Atlanta, where an endless loop of empty Waymo vehicles keeps returning to the same neighborhood. No passengers. No apparent destination. Just dozens of autonomous cars appearing, circling, and reappearing in a residential area that didn't ask for them. The image is almost too good as metaphor — software confident enough to navigate but not wise enough to know it has nowhere to go.

And then there's Tesla. Electrek reports that the company has finally unredacted all seventeen of the crash narratives it filed with NHTSA related to its autonomous driving system. Until now, Tesla was the only ADS operator to have blacked out every single one. The newly visible data is mixed: most incidents apparently weren't the fault of the autonomous system. But some of what's in there is genuinely concerning, and the fact that it took this long to see any of it is its own kind of story.

What the Industry Keeps Calling Edge Cases

The phrase you hear most often when autonomous vehicles fail is edge case — as if the world is mostly tidy and only occasionally weird. But flooded roads aren't rare in Texas. Traffic loops around residential neighborhoods aren't exotic. And crashes, wherever the fault lies, are exactly the kind of thing regulators exist to examine without a company's redaction stamp on top.

What these three stories share isn't incompetence, exactly. Waymo's technology is genuinely impressive in controlled conditions. Tesla's system handles the majority of scenarios without incident. The engineering behind all of it represents decades of serious work by serious people. That's not the issue.

The issue is the gap between what these systems can do and what the roads actually demand — and the industry's persistent tendency to treat that gap as a communications problem rather than a technical one. Pause the service. Unredact the reports. Tell the neighborhood you're working on it. The response to each failure is managed, measured, careful. The failures themselves keep arriving unmanaged, unmeasured, unannounced.

A car that drives itself into a flood isn't a software bug. It's a reminder that the road doesn't negotiate. It doesn't wait for the next update. It doesn't care how good the demo looked.

The unscripted parts of driving — the weird intersection, the water you can't gauge, the neighborhood that suddenly looks like every other neighborhood — are not the exception. They are the whole job.

End — Filed from the desk