
The promise of autonomous vehicles has always revolved around the same message: removing humans from behind the wheel to reduce accidents. However, the first official data on Tesla's robotaxis in the United States points in the opposite direction and suggests a much more uncomfortable scenario for Elon Musk's brand and for the future of this technology.
Public records from the U.S. highway safety authority show that Tesla's robotaxi fleet in Austin, Texas, is involved in accidents significantly more often than the average driver. After just a few months of service, the vehicles have accumulated 14 recorded accidents and a collision rate that, by the company's own parameters, quadruples the risk associated with a human-driven car.
A worrying tally: 14 accidents in just a few months of service
Tesla's robotaxi experiment in Austin began in June 2025, when the company put into circulation a small group of modified Model Y vehicles operating as autonomous taxis with a safety driver on board or a chase vehicle. From then until early 2026, the files submitted to the National Highway Traffic Safety Administration (NHTSA) record 14 incidents involving these vehicles.
The last five accidents occurred between December 2025 and January 2026, a very short period that has set off alarm bells among regulators and analysts. In several cases these were low-speed collisions with stationary objects such as posts, trees or road fixtures, including reversing maneuvers in which the car hit an obstacle at just a few kilometers per hour.
In another recent accident, a robotaxi crashed into a bus while the Tesla vehicle was stopped, and on another occasion a collision with a heavy truck was recorded at about 6 km/h. There is also a record of an impact with a fixed object at 27 km/h on a straight road. On paper these are minor impacts, but their repetition points to possible systemic failures in the software's decision-making.
Exact mileage figures vary depending on the source, but data compiled by specialized outlets such as Electrek and Futurism, based on Tesla documentation and the NHTSA database, puts the fleet's accumulated mileage at between 800,000 and 1.2 million kilometers. Translated into accident rates, that means approximately one accident every 92,000 kilometers.
For comparison, Tesla's own safety reports state that the average American driver suffers a minor collision approximately every 368,000 kilometers, and other statistics from the regulator put the distance between incidents even higher. The robotaxis would thus be running an accident risk four times that of a standard human driver, and even higher if more conservative metrics are used.
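The arithmetic behind that fourfold figure is easy to verify. The following sketch uses only the per-accident distances cited above (the article's reported numbers, not raw NHTSA data):

```python
# Back-of-the-envelope check of the accident rates cited in the article.
# Both per-accident distances are the reported figures, not verified data.

TESLA_ACCIDENTS = 14
KM_PER_TESLA_ACCIDENT = 92_000    # reported rate for the Austin fleet
KM_PER_HUMAN_ACCIDENT = 368_000   # Tesla's own figure for an average US driver

# Fleet mileage implied by the reported rate
implied_fleet_km = TESLA_ACCIDENTS * KM_PER_TESLA_ACCIDENT

# How many times riskier the robotaxi rate is versus the human baseline
risk_ratio = KM_PER_HUMAN_ACCIDENT / KM_PER_TESLA_ACCIDENT

print(f"Implied fleet mileage: {implied_fleet_km:,} km")
print(f"Risk vs. average driver: {risk_ratio:.1f}x")
```

The implied mileage of about 1.29 million kilometers is also consistent with the roughly 800,000 miles reported for the fleet elsewhere in the filings.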
Musk versus Waymo: two very different models of autonomous driving
The contrast with Waymo, Alphabet's self-driving subsidiary, is particularly significant. While Elon Musk has spent years arguing that his camera- and computer-vision-based approach is sufficient to achieve full automation, Waymo has opted for a more traditional system that combines cameras, radar and LiDAR, and has gradually rolled it out in cities like San Francisco, Phoenix, and Austin itself.
According to data cited in US media, Waymo has already surpassed 200 million kilometers traveled in fully autonomous mode, with an average of one accident every 157,000 kilometers. While not a perfect figure, the proportion is significantly more favorable than Tesla's, which has a higher incident rate despite far fewer accumulated kilometers.
In Austin, the contrast is also reflected in usage levels. Reports indicate that Waymo has recorded 51 incidents in the city over a similar period, but with more than 6.3 million miles (about 10 million kilometers) completed without a human driver. Meanwhile, Tesla would have traveled around 800,000 miles (about 1.2 million kilometers) with only 42 robotaxis operating intermittently, and yet it has accumulated 14 crashes.
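Normalizing those Austin figures by distance makes the gap explicit. A rough sketch, using the approximate mileage numbers above:

```python
# Rough miles-per-incident comparison for the two Austin fleets.
# All figures are the approximate numbers reported in the article.

waymo_incidents, waymo_miles = 51, 6_300_000
tesla_incidents, tesla_miles = 14, 800_000

# Distance driven per recorded incident for each operator
waymo_rate = waymo_miles / waymo_incidents
tesla_rate = tesla_miles / tesla_incidents

print(f"Waymo: one incident every {waymo_rate:,.0f} miles")
print(f"Tesla: one incident every {tesla_rate:,.0f} miles")
print(f"Tesla's per-mile incident rate is {waymo_rate / tesla_rate:.1f}x Waymo's")
```

Even though Waymo logged more absolute incidents, its fleet covers roughly twice the distance between incidents.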
This difference in scale makes Waymo's relative accident rate per kilometer much lower. Furthermore, accident analyses indicate that its cars are involved in 80% fewer injury crashes and 91% fewer serious-injury incidents compared with human driving, exactly the kind of statistic you would expect from a technology created to improve safety.
For Tesla, the comparison comes at a delicate time, because the company is strongly pushing the narrative that its Cybercab, the first production robotaxi without a steering wheel or pedals, is on the verge of starting commercial operations. In fact, the first production Cybercab has already rolled off the assembly line at the Gigafactory in Texas. However, mass production will not begin for several months, and there are still doubts about whether the company can even use the name due to branding issues.
On paper, Musk insists that his solution will be more scalable and efficient than his rivals', but the data from Austin paints a picture in which, at least for now, Waymo appears to be a more reliable and transparent operator when it comes to safety and incident reporting.
Redacted reports, revised filings, and regulatory reprimands
Beyond the raw figures, one of the aspects that most worries regulators is the way Tesla presents information about its accidents. In the NHTSA database, many of the narratives associated with robotaxi accidents are heavily redacted or marked as "confidential business information," making it impossible to know precisely what happened in each case.
This approach contrasts with that of other operators, such as Waymo or Zoox, which provide fairly detailed descriptions of incidents, including road conditions, vehicle behavior, and possible causes of the crash. This openness allows independent investigators and the public to better evaluate the technology's actual behavior and detect problematic patterns.
In Tesla's case, moreover, at least one striking revision of an accident's severity has been detected. An incident from July 2025 was initially reported as a collision involving "only property damage" after a low-speed impact with an SUV. Months later, in December, the company updated the report and reclassified it as an incident with "minor injuries requiring hospitalization". In other words, someone had to be treated at a medical center, something not reflected in the first version.
This late change has fueled suspicions that the company is not communicating the real impact of its tests with the desired transparency. The NHTSA itself has opened investigations into Tesla's repeated delays in reporting some accidents and its tendency to hide details under the umbrella of trade confidentiality.
From a regulatory perspective, these practices complicate oversight and raise further doubts about the wisdom of deploying a robotaxi service in complex urban environments with vulnerable road users such as pedestrians, cyclists and schoolchildren, without a clear and timely flow of information.
Another element that stands out in the files is that all the latest reported incidents took place during the day and in good weather. In other words, they occurred in a context that, in theory, should be the easiest for an autonomous driving system. This reinforces criticism of Tesla's strategy of relying almost exclusively on cameras, without LiDAR and with very limited use of other sensors.
A deployment that arrives in Europe amid a heated debate on safety
Although the incidents are occurring in the United States, the implications are being felt across Europe. Tesla's Full Self-Driving (FSD) system is taking its first steps on the Old Continent, for now in very limited modes and under constant driver supervision, while the company speaks openly about future robotaxi services outside North America as well.
In the EU, regulation of autonomous vehicles is stricter and more fragmented than in the United States. Each country maintains its own testing and authorization framework, and Brussels is working on common safety and liability standards. The data emerging from Austin therefore arrives at a crucial moment, because it fuels European regulators' skepticism about the true maturity of Tesla's technology.
For markets like Spain, where the deployment of autonomous vehicles is still at a very preliminary stage, figures such as one accident every 92,000 kilometers, and the fact that an operator hides details in its official reports, can weigh heavily when it comes to authorizing more ambitious tests in urban environments. The stated priority of the authorities is that automated systems prove clearly safer than human drivers before widespread commercial use is allowed.
Furthermore, the Tesla incidents are occurring alongside ongoing investigations into other robotaxi operators. The NHTSA has taken action, for example, in a case in which a Waymo vehicle struck a child near a school in Santa Monica, and it is studying whether these cars respond adequately to school buses and other particularly sensitive situations. The implicit message is that no actor in the sector is exempt from scrutiny.
In this context, the Austin data could become a compelling argument for European regulators to demand more evidence, greater transparency, and stricter safety protocols before allowing a fleet of driverless taxis to circulate freely in their cities.
Transparency, reputation, and the challenge of convincing the public
Beyond the charts and graphs, the problem with Tesla's robotaxis is also one of public trust. When a technology is presented as safer than humans and the numbers show the exact opposite, the narrative suffers. And if, in addition, information about incidents arrives late, is incomplete, or is significantly revised afterward, that distrust multiplies.
In the United States, the controversy surrounding FSD and robotaxis is not new, but the accumulation of incidents in Austin and the way they have been reported have opened an additional front. Tesla maintains that this data is part of a continuous learning process and that system performance improves with each software update, but for now the official records do not fully support that claim.
Other companies in the sector are aware that their image is at stake with every kilometer traveled. Waymo and Zoox, for example, have opted for a more open communication policy, giving researchers, the media and the public access to detailed crash descriptions. This strategy doesn't eliminate accidents, but it helps create the perception that there is less to hide.
For Tesla, the situation is even more delicate because its brand is closely linked to the figure of Elon Musk and his very ambitious messages about the immediate future of the autonomous car. When official data shows that its robotaxis crash four times more often than human drivers, the discourse of technological superiority loses steam and opens the door for regulators, competitors and users to question the timelines and the way the company is deploying its services.
In the end, the discussion about robotaxis in Austin ends up being a stress test for the entire industry. Real-world safety, transparency in reporting, and the ability to correct errors quickly will determine whether driverless taxis become an everyday reality in Europe as well, or remain in a supervised experimental phase for longer. For now, the data published by the NHTSA puts Tesla in an awkward position: its autonomous vehicles crash more often than human drivers, and its incident reporting does not exactly help dispel doubts.

