Federal Agency Opens Safety Review Into Self-Driving Car Crashes by Startup Avride
Federal regulators are investigating 16 crashes involving self-driving cars made by startup Avride. The vehicles had problems changing lanes safely, recognizing slower traffic ahead, and avoiding obstacles that partially blocked the road.

The National Highway Traffic Safety Administration, the federal agency that oversees car safety, has begun investigating 16 crashes involving self-driving vehicles made by a startup called Avride. The crashes happened in Texas. The investigation is looking at three specific problems: the vehicles changing lanes into other cars, failing to slow down or stop for cars ahead of them, and hitting stationary objects that partially block the road.
These are not minor issues. They represent basic skills that any self-driving car needs to perform safely. A car that cannot change lanes safely, cannot follow other vehicles properly, or cannot spot obstacles in its path should not be carrying passengers.
Why These Crashes Matter
Self-driving cars work by using cameras, radar, and other sensors to see the world around them, much like how you use your eyes and other senses to drive safely. The three crash types the federal investigators are examining suggest problems with different parts of that sensing and decision-making process.
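To make that pipeline a little more concrete, here is a minimal sketch of the sense-plan-act loop that most autonomous-driving software follows. Everything in it, from the class and function names to the distance threshold, is an illustrative simplification written for this article; it is not Avride's code.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """One nearby road user or obstacle, as estimated from sensor data."""
    distance_m: float          # distance ahead of our car, in meters
    closing_speed_mps: float   # how fast we are approaching it (m/s); negative = pulling away
    lane: str                  # "ours", "left", or "right"

def perceive(sensor_frame) -> list[TrackedObject]:
    """Fuse camera, radar, and other sensor data into tracked objects.
    Stubbed here; a real stack runs detection and tracking models."""
    return sensor_frame  # in this sketch, the "sensors" already hand us tracked objects

def plan(objects: list[TrackedObject]) -> str:
    """Pick a high-level maneuver for this instant."""
    lead = min((o for o in objects if o.lane == "ours"),
               key=lambda o: o.distance_m, default=None)
    if lead is not None and lead.distance_m < 15 and lead.closing_speed_mps > 0:
        return "brake"
    return "keep_lane"

def act(maneuver: str) -> None:
    """Turn the maneuver into steering, throttle, and brake commands (stubbed)."""
    print(f"command: {maneuver}")

# One tick of the loop, with made-up sensor data: a car 12 m ahead that we are closing on.
scene = [TrackedObject(distance_m=12.0, closing_speed_mps=4.0, lane="ours")]
act(plan(perceive(scene)))  # -> command: brake
```

A real system runs a loop like this many times per second, and the crash types under investigation map onto its stages: misjudging another car during a lane change is a perception problem, while failing to brake or steer around an obstacle is a planning and control problem.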
When a self-driving car changes lanes into another vehicle, it means the system did not accurately know where the other car was or how fast it was moving. This is a failure in understanding the immediate surroundings.
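As a rough illustration of what "knowing where the other car is" means in software, here is a toy lane-change check. The gaps, speeds, and thresholds are invented for the example; real systems track many vehicles at once, predict their paths, and account for sensor uncertainty.

```python
def lane_change_is_safe(gap_behind_m: float, gap_ahead_m: float,
                        rear_car_closing_mps: float, time_gap_s: float = 2.0) -> bool:
    """Very simplified check before moving into the next lane.

    Illustrative only: the distances and the two-second time gap are made up."""
    # Require clear space ahead of and behind us in the target lane.
    if gap_ahead_m < 10.0 or gap_behind_m < 5.0:
        return False
    # If a car behind is closing in, make sure it cannot reach us within the time gap.
    if rear_car_closing_mps > 0 and gap_behind_m / rear_car_closing_mps < time_gap_s:
        return False
    return True

# A car 8 m behind, closing at 6 m/s, would reach us in about 1.3 seconds: not safe.
print(lane_change_is_safe(gap_behind_m=8.0, gap_ahead_m=25.0, rear_car_closing_mps=6.0))  # False
```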
When a self-driving car fails to slow down for a car ahead, it suggests a problem with recognizing slower-moving traffic or deciding to brake. Most modern self-driving systems have backup safety features designed to slam on the brakes if the car ahead gets too close, but something appears to have failed in Avride's vehicles.
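The backup braking idea is usually built around time-to-collision: how many seconds until impact if neither car changes speed. The sketch below shows that calculation in its simplest form; the two-second threshold and the function name are assumptions made for illustration, not details from the investigation.

```python
def should_emergency_brake(distance_m: float, closing_speed_mps: float,
                           ttc_threshold_s: float = 2.0) -> bool:
    """Trigger hard braking when time-to-collision with the car ahead gets too short.

    Simplified illustration: real systems also weigh braking distance,
    road conditions, and how confident the sensors are."""
    if closing_speed_mps <= 0:        # not closing in on the car ahead
        return False
    time_to_collision = distance_m / closing_speed_mps
    return time_to_collision < ttc_threshold_s

# Closing at 10 m/s on a car 15 m ahead leaves 1.5 seconds to impact: brake now.
print(should_emergency_brake(distance_m=15.0, closing_speed_mps=10.0))  # True
```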
When a self-driving car hits something partially blocking a lane — like debris or a broken-down vehicle — it faces a choice: change lanes around it or stop and wait. The system needs to decide quickly and correctly. If it cannot, crashes happen.
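Reduced to its bare bones, that choice looks something like the toy rule below. Real planners score many candidate paths against predicted traffic rather than checking three yes-or-no questions; the inputs and labels here are deliberate simplifications.

```python
def obstacle_response(obstacle_blocks_lane: bool, adjacent_lane_clear: bool,
                      can_stop_in_time: bool) -> str:
    """Choose between steering around a partial blockage and stopping behind it.

    A toy decision rule for illustration only."""
    if not obstacle_blocks_lane:
        return "continue"
    if adjacent_lane_clear:
        return "change_lane"
    if can_stop_in_time:
        return "stop_and_wait"
    return "emergency_brake"   # last resort: shed as much speed as possible

# Lane blocked, no clear lane to move into, but enough room to stop: wait it out.
print(obstacle_response(obstacle_blocks_lane=True, adjacent_lane_clear=False,
                        can_stop_in_time=True))  # -> stop_and_wait
```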
What Happens Next
The federal investigation is the standard first step when regulators suspect a serious safety problem. The agency will ask Avride for detailed technical information and data from the crashes. If the evidence shows a widespread safety defect, the investigation could expand and potentially lead to a recall — an order requiring the company to fix or remove the vehicles from service.
Avride, which operates some of its self-driving vehicles through Uber's ride-hailing service, said it welcomes the investigation. This is the typical response from companies under federal scrutiny — it signals willingness to cooperate.
Other self-driving companies, including Tesla and Waymo, have also faced similar federal investigations in recent years. As more self-driving vehicles move from test tracks onto public roads with real passengers, regulators are paying closer attention.
The Real-World Challenge
Avride's vehicles in Texas are doing something harder than traditional testing. They are picking up and dropping off passengers in unpredictable places, navigating varied traffic conditions, and operating under real-world time pressure. That is fundamentally different from a self-driving car that operates only on a carefully mapped route or in controlled conditions.
This is something we have encountered before with other technologies. Early voice assistants in smartphones worked well when you spoke clearly in a quiet room, but stumbled when you had an accent, spoke over background noise, or phrased your command differently. The leap from a controlled environment to the messy real world is where many technologies reveal their weaknesses.
The difference with self-driving cars is stark: if a voice assistant fails to understand you, you repeat yourself. If a self-driving car fails, people can get hurt.
The technical challenges Avride faces are not unique to the company. Deciding when to change lanes safely, following other cars at the right distance, and spotting obstacles remain difficult for the entire industry. Some companies like Waymo have tackled these problems by using detailed maps, limiting where their cars can operate, and being more cautious overall. Avride's crashes suggest the company may not have solved these problems as thoroughly.
Looking Ahead
The federal investigators will examine Avride's sensors, software, testing procedures, and safety measures. The company will need to provide detailed documentation and explain how it ensures its vehicles are safe.
For the broader self-driving car industry, this investigation sends a message: regulators expect companies to thoroughly test self-driving systems before putting passengers inside them. The crashes Avride's vehicles have had — failing to change lanes safely, missing slower traffic, hitting obstacles — should have been caught and fixed before the cars ever drove a paying customer.
This investigation also shows how federal regulators are becoming more skilled at evaluating self-driving car claims. As the technology moves from experimental projects to commercial services, government oversight is becoming more systematic and demanding.


