Cruise, GM’s autonomous driving subsidiary, has been struggling with some growing pains as of late. Although Cruise AVs have covered more than 500,000 driverless miles and counting, the vehicles have been involved in a few mishaps, including a collision that resulted in injuries and instances of stopping in the middle of an intersection. Now, a few Cruise AVs have once again been captured blocking traffic in San Francisco.
A friend of mine took a video in San Francisco tonight of 6 @Cruise self-driving cars stopped at an intersection for 20 min. Traffic came to a standstill + people didn’t know what to do. Two of the cars were on wrong side of the road. Doesn’t seem particularly safe. #Autonomous pic.twitter.com/tKZrHgrdEi
— Jose Fermoso (@fermoso) January 21, 2023
As shared in a few Tweets, pedestrians captured footage of a number of Cruise AVs blocking traffic at an intersection in San Francisco. One AV was sitting at the stoplight, while another was halfway into the opposing lane. As reported in the Tweets, it took 20 minutes before the situation was resolved. According to Cruise, one of the vehicles suffered a technical issue that required it to be moved by the company’s field team. Once the malfunctioning AV was cleared, the other units proceeded as normal.
According to the comments on the Tweet, this type of fiasco is a common sight in San Francisco, with some people pointing out that one AV eventually took a left turn at a red light.
Here’s the other video, when one of the cars decided to finally move on. pic.twitter.com/VSj1NlML2Q
— Jose Fermoso (@fermoso) January 21, 2023
Thankfully, no injuries or collisions were reported in the incidents, although the dysfunction indicates that more testing and development may be required before these autonomous vehicles can be safely deployed. However, a recent study suggests that full autonomy may never be achieved, and that AVs like those in Cruise’s fleet will need human supervision in order to function.
thanks for sharing. One of our cars had a technical issue and needed to be moved by our field team (who were quickly dispatched and took ~15-20 min to clear the car). All other Cruise vehicles were able to proceed autonomously. We apologize to anyone who was inconvenienced.
— cruise (@Cruise) January 21, 2023
As a reminder, the National Highway Traffic Safety Administration (NHTSA) has opened a formal safety probe into Cruise based on reports of the autonomous robotaxi units excessively braking or blocking traffic. To date, no major injuries or fatalities directly related to Cruise have been reported.
Meanwhile, Cruise AVs have started to roam the streets of Phoenix, Arizona and Austin, Texas, as the company seeks to expand its service to several more markets this year. In addition, the Cruise Origin robotaxi has begun testing in San Francisco, but with a driver on board, while Cruise waits for the city to green-light testing of driverless Origin units.
Subscribe to GM Authority for more GM Cruise news, GM business news, GM safety news, GM technology news, and around-the-clock GM news coverage.
Comments
A few incidents after 500,000 miles is not that bad. Some human drivers get into accidents several times a year while covering far fewer miles.
Some judge in San Francisco needs to have some balls and impound these vehicles when an event like this happens and fine gm. gm can then explain why “Poppy” keeps screwing up.
Drug addicted homeless bums are a common sight in San Fran too.
Leave them in CA, they are a menace!
The cars or the drug addled homeless bums?
I’ve never understood why NHTSA would sign on to such a thing. Well, maybe I do…NHTSA is part of the government, and gov’t is comprised of people who are susceptible to influence peddling.
Well, anyway…
Could there be anything more dangerous and stupid than having a driverless car? What the ??
How about that “driverless” airplane you most likely flew on. Pretty much all commercial airplanes fly via autopilot for most of their journey, the exceptions being takeoff, landing, and any anticipated turbulence. A pilot / co-pilot is simply sitting in their cockpit seat monitoring performance rather than actually being hands-on the controls.
These Cruise AV incidents have been little more than annoying situations. Can’t say the same for Tesla’s AutoPilot mistakes. Just a week ago not too far from this Cruise incident, a woman was killed when the Tesla she was driving veered off the road, crashing through a fence and ending up in a swimming pool.
Is it possible to hack these vehicles? Not saying that is what happened in these two instances, just asking as a point of information.
Looks like a saturation of pedestrians near the lanes of travel confuses the AI, causing it to stop. But I am not an engineer, just stayed at a Holiday Inn Express last night :)