Getting stuck inside a glitching robotaxi sounds like sci-fi filler until you line up the incident reports, safety manuals, and federal investigations. Then it looks less hypothetical. In San Francisco, Phoenix, and Las Vegas, autonomous ride services have expanded while regulators keep tracking immobilizations, hard braking, and emergency-access questions. The real issue is not just whether a robotaxi crashes. It is what happens when the vehicle stops, the doors do not behave as expected, and the passenger suddenly has to trust software, remote support, and a hidden manual release.
Why “trapped in a robotaxi” is becoming a real consumer fear
The fear is simple: you are no longer just riding in a car. You are riding inside a software system with locks, sensors, remote operators, and fallback protocols. When that system glitches, the experience changes fast. Federal regulators have already documented cases where autonomous vehicles became immobilized or behaved unpredictably on public streets. NHTSA says it monitors automated driving safety through a Standing General Order that requires crash and incident reporting for vehicles equipped with automated driving systems, a framework the agency highlighted again on its automated vehicle safety guidance page as of November 2025. That matters because immobilization is not a fringe issue. It is a known category of failure.
Cruise became the clearest warning sign. NHTSA’s preliminary evaluation, later summarized by Ars Technica in August 2024, examined whether Cruise robotaxis could engage in “inappropriately hard braking” or become immobilized while driving. The same report said investigators reviewed 7,632 hard-braking events in Cruise data. That figure is not specifically about passengers being locked in, but it shows how often autonomous systems can create abrupt, confusing situations for riders and surrounding traffic. In August 2023, California regulators also moved against Cruise after recurring problems involving unexpected stops and erratic behavior, according to the Associated Press.
Then there is the public psychology of it. A stalled human-driven taxi is annoying. A stalled robotaxi feels different because there may be no driver to reassure you, no obvious explanation, and no immediate sense of control. That gap between technical redundancy and human confidence is where this fear lives.
What the official manuals reveal about being stuck inside
The strongest evidence that this is a real design concern comes from the companies themselves. Zoox’s rider manual, published in September 2025, explicitly tells passengers that the robotaxi may stop because of a brief delay and instructs riders to wait for the vehicle to resume driving or for Zoox Support to contact them. It also says that if riders need to exit, they should use the Emergency Door Handle. That is a crucial detail. You do not write emergency door instructions into a rider manual unless the possibility of a non-routine stop is part of the operating reality.
Zoox’s law-enforcement guide from September 2025 goes further. It references emergency access to robotaxi doors, says the emergency release handle will illuminate its location, and notes that powered door operation will be maintained until the system can no longer support it. It also instructs responders to ensure autonomous mode is deactivated by contacting Zoox remote operators. In plain English: if something goes wrong, there is a designed chain of escalation involving the vehicle, the rider, and a remote human team.
Waymo has taken a similar approach, though with different public documentation. Its first responder materials, available on its official site and updated through late 2024 and early 2025, emphasize emergency manual disengagement protocols and coordination with public safety officials. Waymo also disclosed in February 2024 that it issued a voluntary software recall after two low-speed collisions with the same towed pickup scenario, noting that neither vehicle was carrying riders and that there were no injuries. The company said at that time it had driven more than 10 million fully autonomous miles and served more than 1 million ride-hail trips. Those scale figures are reassuring on one level. They are also a reminder that even edge-case failures become meaningful once fleets get large enough.
The incidents that turned a niche risk into a mainstream one
The most dramatic robotaxi stories have involved collisions, not passengers physically unable to exit. But immobilization and access problems sit right next to those headline events. In December 2025, a San Francisco power outage knocked out traffic lights across large parts of the city and paralyzed Waymo robotaxis, according to Ars Technica. The report said the outage at times affected as much as a third of the city and more than 130,000 homes. Waymo vehicles stopped in place and clogged traffic. That is not the same as riders being locked in, but it shows how quickly external infrastructure failures can freeze autonomous mobility.
Cruise offered another lesson. After an October 2, 2023 pedestrian incident in San Francisco, California suspended Cruise’s robotaxi operations, and NHTSA opened a probe on October 16, 2023, according to Ars Technica and AP reporting. AP later reported in December 2024 that NHTSA closed its preliminary Cruise investigation without further action, while GM simultaneously retreated from the robotaxi business. The regulatory closure did not erase the reputational damage. For consumers, the takeaway was harsher: if a robotaxi gets confused in a crisis, the passenger may be the last person in the chain to understand what is happening.
Zoox has faced scrutiny too. AP reported that NHTSA investigated Zoox after two rear-end crashes involving sudden braking. A separate NHTSA recall report tied to a May 8, 2025 San Francisco incident said a stopped, unoccupied Zoox robotaxi operating in autonomous mode was involved, and the event was reported to the agency on May 9, 2025. Again, no trapped rider there. But the pattern is consistent. Sudden stops. Immobilized vehicles. Software updates after edge cases. That is the ecosystem in which rider entrapment anxiety grows.
What riders should actually worry about, and what they should not
The biggest risk is not that robotaxis are secretly designed to imprison passengers. There is no evidence of that. The more realistic concern is a temporary loss of clarity during a system fault: doors that stay powered but do not respond normally, a vehicle that stops in traffic, a rider who does not know where the emergency release is, or a remote support interaction that takes longer than the passenger expects. In other words, the danger is procedural confusion under stress.
That distinction matters because the safety debate around autonomous vehicles often gets flattened into one question: are they safer than human drivers? On some metrics, Waymo has argued yes. Ars Technica reported in March 2025 that after 50 million miles, Waymo’s crash rate appeared lower than that of human drivers in several categories. Even if that broad claim holds, it does not answer the rider-experience problem. A statistically safer system can still feel terrifying if a passenger cannot immediately open a door or understand why the car has frozen.
There is a useful comparison here with Tesla door investigations. In October 2025, AP reported that NHTSA opened an investigation into Tesla door issues after parents said faulty handles trapped children in the back seat. NHTSA noted that manual releases existed but that a child might not be able to reach or understand them. Different technology, same human-factors lesson: emergency exits only work if ordinary people can find and use them under pressure.
How to protect yourself before you ever sit down
If you use a robotaxi, do three things before the ride starts. First, locate the manual or emergency door release. Do not assume you will figure it out later. Second, identify the in-app or in-cabin support option for contacting a remote operator. Third, pay attention during the safety briefing, even if it feels repetitive. Those steps sound basic. They are exactly the kind of basics people skip until the cabin goes quiet and the vehicle does something unexpected.
There is also a policy lesson here. Regulators have spent years focusing on crashworthiness, braking, pedestrian detection, and reporting obligations. They should keep doing that. But the next phase of robotaxi trust may hinge on something more mundane: whether a first-time rider can exit quickly, confidently, and without instruction during a glitch. That is not a side issue anymore. It is part of the product.
Frequently Asked Questions
Can a robotaxi really trap a passenger inside?
There is public evidence that robotaxi companies design for non-routine stops and emergency exits, which means the risk is taken seriously. Zoox’s September 2025 rider manual tells passengers to use an Emergency Door Handle if they need to exit during a stop. That does not prove widespread entrapment incidents, but it does confirm the scenario is operationally relevant.
Have regulators investigated robotaxi immobilization problems?
Yes. NHTSA and state regulators have examined autonomous vehicle issues involving hard braking, unexpected stops, and immobilization. Cruise drew especially intense scrutiny after incidents in San Francisco in 2023, and federal investigators reviewed thousands of hard-braking events before closing the preliminary probe in late 2024.
Are Waymo and Zoox less safe than human-driven taxis?
That is not a simple yes-or-no question. Some company-backed and third-party reporting suggests Waymo performs better than human drivers on several crash metrics over large mileage samples. But rider safety is not only about crash frequency. It also includes whether passengers can understand and control the situation during a software or hardware fault.
What should I do if a robotaxi stops and will not open normally?
Stay calm, use the in-vehicle support or app-based help function, and locate the emergency door release immediately. Follow the vehicle’s posted safety instructions. If there is an urgent hazard, use the manual release rather than waiting for the system to recover on its own.
Why does this fear feel stronger than getting stuck in a normal car?
Because a robotaxi removes the human driver, who usually acts as the first layer of explanation and reassurance. In a glitch, the passenger may feel alone with software, sensors, and remote support. That loss of visible human control makes even a temporary stop feel more threatening.
Conclusion
Robotaxis are not just another ride-hailing upgrade. They are a new kind of enclosed, software-mediated public space. That changes the fear profile. The nightmare scenario is no longer only a crash. It is being inside a machine that stops making sense while you are still inside it. Official manuals, federal investigations, and real-world immobilization events all point to the same conclusion: the industry knows this is a live issue. Riders should know it too.