Autonomous vehicle advocates argue that the technology has the potential to offer a wide variety of benefits, from improved safety and convenience, to greater efficiency. However, according to one recent study, U.S. drivers are less than keen on the idea of widespread autonomous vehicle adoption.
In a recent study conducted jointly by the American Automobile Association (AAA) and the Technology and Public Purpose Project at the Harvard Kennedy School’s Belfer Center for Science and International Affairs, U.S. drivers voiced concern over sharing the road with fully autonomous vehicles.
Per the study, the type of autonomous vehicle (semi-trucks, delivery vehicles) and the driving scenario (highway/freeway driving, local or neighborhood driving) did not have a significant impact on the unease felt by those surveyed.
Sharing the road with… | Safer | No Difference | Less Safe | Unsure |
---|---|---|---|---|
Self-driving semi-trucks | 11 percent | 12 percent | 53 percent | 24 percent |
Small, self-driving delivery vehicles | 12 percent | 19 percent | 43 percent | 26 percent |
Self-driving vehicles on a highway or freeway | 12 percent | 15 percent | 47 percent | 24 percent |
Self-driving vehicles on local or neighborhood roads | 13 percent | 17 percent | 44 percent | 26 percent |
Interestingly, the study also identified a few measures that would help to alleviate some of the anxiety, with 62 percent of survey respondents saying they would feel safer if autonomous vehicles were clearly marked as self-driving. A further 60 percent said they would feel safer with designated lanes for self-driving vehicles only, and another 31 percent said they would feel safer with restrictions on the time of day and/or days of the week when autonomous vehicles would be allowed to operate on public roads.
The survey was conducted between January 15th and 17th, 2021, using a probability-based panel representative of the U.S. household population overall. Most surveys were completed online, while some surveys were completed by phone. A total of 1,010 interviews were completed among U.S. adults.
As AAA points out, 38 U.S. states currently have active programs that allow autonomous vehicle testing on public roads. According to the recent survey, support for autonomous vehicle test programs was split, with 34 percent in favor, 36 percent opposed, and 31 percent unsure.
Those opposed to the programs were primarily concerned with safety. Who would be responsible in a self-driving vehicle crash was also a major concern, as was lack of clarity over who was overseeing the testing.
General Motors unveiled its first fully driverless production vehicle, the Cruise Origin, in January of 2020. The Cruise Origin is intended to provide all-electric, driverless transportation.
Subscribe to GM Authority for more General Motors-related autonomous vehicle news and around-the-clock General Motors news coverage.
Source: AAA
Comments
That logic is inverted. Accidents are caused by bad drivers. I would feel much safer if I were driving alongside AVs every day.
Exactly! More like autonomous vehicle owners should worry about barely-over-80-IQ terrible drivers hitting them while texting at the wheel. They’re the ones who don’t obey the traffic rules, taking drugs and alcohol. These people act like humans are perfect driving machines who are never distracted and never make a mistake. Around 40K Americans die every year in traffic accidents caused by human drivers.
I think the existence of AVs will force human drivers to drive more cautiously too.
It can’t be any worse than the half of the idiots already on the road who can’t put down their FN phones or refuse to get out of the passing lane while driving under the speed limit.
The problem with autonomy is that you are stuck with what is programmed. If the vehicle is set to go the speed limit, it will become a rolling roadblock. If it is programmed to keep up with traffic, it is breaking the law.
This is just one of many issues that would be faced.
Also autonomy would be faced with non rational things like road rage. People will make sport to make the other car react to their actions if it is predictable.
Even that Domino’s auto delivery, I will wager, will still end up at the wrong house, the same one every GPS-guided driver arrives at when he is looking for the house behind mine. How do I send this one away?
There will be applications out there for this but not for everything as long as humans have to interact.
Humans are the wild card you can’t program for. Or trucks lying on their side across a highway. That was a real mess.
Even the auto landing systems for planes and even the shuttle are not perfect.
They had to go manual on the shuttle. It had a tailwind that kept the speed up, so the gear nearly did not come down; the pilot had to put the gear down manually. They were 3 seconds from a belly landing.
All of this is why AVs should not be allowed on the road. Sooner or later, situations will arise that the programming wasn’t written to handle. Case in point: the shuttle landing scenario you mentioned. The auto-pilot programming for landing the shuttle most likely worked 99 percent of the time. But when landing, if the programmer coded the auto-pilot to deploy the landing gear only below a certain speed, and failed to consider a tailwind that would not allow the shuttle’s speed to drop below that coded threshold, and/or also failed to consider distance above the ground, you crash and burn without human intervention.
Also, your speed limit point. Unless each and every vehicle on the road is doing the speed limit, the auto-pilot cannot be coded to do the speed limit. Regardless of speed, it will always be safer when all vehicles are doing the same speed at a proper safe distance. Coding the auto-pilot to do the speed limit, when all the other vehicles around it are not doing the speed limit, is a road hazard. No one drives the speed limit. Now, I’m sure some people do, but most everyone on the road does not. Most are either under or over. The unders are causing annoying roadblocks, which causes tailgating and lane changing, which in turn causes accidents. The overs are tailgating and/or weaving in and out of lanes, which in turn causes accidents. Speed limits should also not be adhered to in inclement weather.
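The trade-off the comment describes can be stated as a toy policy sketch. This is purely illustrative (the function name, the `max_over` allowance, and the idea of clamping to traffic flow are assumptions for the sake of example, not any manufacturer’s actual control logic):

```python
def target_speed(speed_limit: float, traffic_speed: float,
                 max_over: float = 5.0) -> float:
    """Toy speed policy: follow the flow of traffic, but never
    exceed the posted limit by more than max_over mph.

    - If traffic is slower than the limit, match traffic.
    - If traffic is faster, cap at speed_limit + max_over,
      which illustrates the dilemma: any max_over > 0 is
      technically breaking the law, while max_over == 0 turns
      the vehicle into the 'rolling roadblock' described above.
    """
    return min(traffic_speed, speed_limit + max_over)
```

With a 65 mph limit and traffic flowing at 75 mph, this policy would pick 70 mph: slower than traffic (a hazard by the comment’s argument) yet over the limit (illegal). There is no value of `max_over` that resolves both objections at once, which is the commenter’s point.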
I could go on but won’t.
Things like this can only do what a programmer teaches them. I am dealing with an AI speech program now at work that is good but still can’t account for everything.
Also, there will be moral dilemmas, like if given a choice of hitting two human targets, which one is the least destructive, and which one could you live with?
I was in an accident a few years back with a cell phone driver who cut off my path. I was able to turn the car into the gravel and slide it to where I avoided oncoming traffic, his car, a guardrail and a stop sign. It was a move that no computer could have done. It would never have compensated for the loss of antilock brakes and steering. It would have pegged the guardrail on the end.
I went back later and looked and I put the car right where I wanted it. Even then it was a risk as I had no idea what the gravel would do.
The worst part of autonomy is that even in use, a human should stay engaged. As we have seen with Tesla, people disengage if they are not active in the car’s control. It takes discipline most people lack.
Will be a little unnerving seeing it for the first time with no driver, but some of the videos on GMA lately show some stupid driving.
I hope it has better sense than the idiot in the Traverse who backed into that woman the other day.
The personal injury lawyers are going to have a field day with this one!
As will hackers.
Having been in the IT/computer and electronics field for nearly 30 years, I see the other major elephant in the room nobody seems to consider: poor quality control from car companies that outsource to the lowest supply bidder, wildly varying temperature conditions that can plague electronics, programming mistakes, age-related wear on sensors and electronic items, poor-quality soldering techniques that cause instability, and just good old-fashioned wonky behavior caused by lockups, voltage issues along the various bus circuits, battery wear, etc.
All of these items would need to be in perfect harmony along with the roads, the signs, the painted portions of the roadways themselves, and the other vehicles that will be talking to each other and interacting with infrastructure. I’m not even going to bring in weather conditions and other drivers’ habits, because that has been discussed to death. We are not ready for this to go prime time yet!
Liability laws will force them not to go with the low bidder. They would pay more just in lawsuits.
Even recently, GM is learning, as they are using LG to develop and build their OLED dashes. That would not have happened 15 years ago.
They will still skimp on some areas but there has been much improvement in other areas.
Or ever…..
We are not yet at the point where it would be safe to implement widespread autonomous vehicle use. I base this opinion on my own experience with the automatic braking feature in my car. The other week, a reckless driver decided to slam on his brakes in the fast lane and then cross all lanes of traffic to take an exit ramp. I was traveling behind this driver at a safe distance, matching his speed at 65 mph. When he did this, the automatic braking in my vehicle engaged, slamming on the brakes needlessly with cars behind me. I had started to move left into the emergency lane to drive around as this happened (far safer than just slamming on your brakes), which meant I then had to maintain control of my car as the weight shifted around from the sudden and unexpected braking.
Other things to consider: car radar systems don’t work when snow covers the sensor and/or camera. I have also had at least a dozen instances of the car randomly flashing “BRAKE BRAKE BRAKE” on my screen despite absolutely nothing being in front of me (thus far it has not actually braked in these situations; it just beeps and tells you to brake). And then there’s the time I was turning a corner behind another car when it automatically braked for no reason (the car in front of me did not brake, and I was matching his speed).
Of course there’s also Lane Keep Assist, an overzealous system I had to disable. Yeah, there’s nothing safer than your vehicle nudging the steering wheel and changing the direction of your car if you even think about getting near the line. There have been times when I have NEEDED to cross the line, in a construction zone or to avoid an object in the road for example, and the car actively tried to stop me by attempting to turn itself back into the lane. I have not missed this feature one bit since disabling it. Unfortunately automatic braking turns itself back on every time I start the car.
When weighing the pros and cons of vehicles as I shop for my next one, “safety” features such as these are a con for me. Yes, there are bad drivers out there. Horrible ones. Perhaps these people are ideal candidates for autonomous vehicle use; they shouldn’t be driving to begin with. Type in your destination and let the car do the work. But the technology is not there yet. There are too many things a computer cannot currently comprehend better than a half-decent human driver. I’m not opposed to the technology, generally speaking, but I share and understand the concerns people have over fully autonomous vehicles when even the most basic autonomous technology doesn’t work properly right now.
If people really wanted safer vehicles, the first thing they would do is take that phone they are holding to their head and put it down.
The call can wait.
They may do more good with a cell phone jammer inside the car when in drive.
There will be NO road “sharing” with numerous AVs on the road. EVER. WHEN the controlling Authoritarian Progressive Left Government of the future (that most of you millennials are voting into absolute power; i.e. current admin) “decides” for all of us that the AV Tech is “ready” it will shut down entire highways & city blocks to have AV ONLY traffic corridors. Future Generations will find driving a friggin car a giant hassle when they have to put down their handhelds for a time to travel somewhere. Isn’t this so plainly obvious sheeple???