
Scientists Say Self-driving Cars May Be Dangerous Because They Don’t Understand Social Cues

Researchers say self-driving cars can cause jams and anger other road users.

Self-driving cars annoy other road users, cause jams and could be dangerous because they don’t understand human interaction, a new study claims.

Researchers found that the vehicles, touted as the future of transport, can’t pick up on the subtle human social cues that inform driving.

The most obvious is the decision of whether to give way or to go in traffic, which humans typically make quickly and intuitively.

But self-driving cars fail to read the humans around them, and their reactions can cause jams and anger other road users, according to award-winning research from the University of Copenhagen.

Researchers analyzed 18 hours of footage from 70 videos, uploaded by YouTube users, of self-driving cars in various traffic situations.

The results show that self-driving cars have little of the social intelligence needed to understand when to yield and when to drive, which is necessary for traffic to flow efficiently.

Professor Barry Brown, at the University’s Department of Computer Science, who has studied the evolution of self-driving car road behavior for the past five years, listed several questions that self-driving cars would have difficulty answering.

Three different self-driving car systems in action: Alphabet/Google’s Waymo, Tesla, and Intel’s Mobileye. (University of Copenhagen via SWNS)


“It is one of the most basic questions in traffic, whether merging in on a motorway or at the door of the metro. […] The ability to navigate in traffic is based on much more than traffic rules. Social interactions, including body language, play a major role when we signal each other in traffic. This is where the programming of self-driving cars still falls short. That is why it is difficult for them to consistently understand when to stop and when someone is stopping for them, which can be both annoying and dangerous.”

Companies like Waymo and Cruise have launched taxi services with self-driving cars in parts of the United States. Tesla has rolled out its FSD (full self-driving) model to about 100,000 volunteer drivers in the US and Canada.

But according to Professor Brown and his team, their actual road performance is a well-kept trade secret that few have insight into.

Therefore, the researchers performed in-depth analyses using 18 hours of YouTube footage filmed by enthusiasts testing cars from the back seat.

One of their video examples shows a family of four standing by the curb of a residential street in the United States.

There is no pedestrian crossing, but the family would like to cross the road. As the driverless car approaches, it slows, causing the two adults in the family to wave their hands as a sign for the car to drive on.

Instead, the car stops right next to them for 11 seconds. Then, as the family begins walking across the road, the car starts moving again, causing them to jump back onto the pavement.

Brown said: “The situation is similar to the main problem we found in our analysis and demonstrates the inability of self-driving cars to understand social interactions in traffic.

Self-driving car visualizations of the path ahead. (University of Copenhagen via SWNS)


“The driverless vehicle stops so as to not hit pedestrians, but ends up driving into them anyway because it doesn’t understand the signals. Besides creating confusion and wasted time in traffic, it can also be downright dangerous,” Professor Brown said.

In tech-centric San Francisco, driverless cars have been deployed in several parts of the city as buses and taxis, navigating the hilly streets among pedestrians and other road users.

“Self-driving cars are causing traffic jams and problems in San Francisco because they react inappropriately to other road users,” Brown said. “Recently, the city’s media wrote of a chaotic traffic event caused by self-driving cars due to fog. Fog caused the self-driving cars to overreact, stop and block traffic, even though fog is extremely common in the city.”

“I think that part of the answer is that we take the social element for granted. We don’t think about it when we get into a car and drive – we just do it automatically. But when it comes to designing systems, you need to describe everything we take for granted and incorporate it into the design. The car industry could learn from having a more sociological approach. Understanding social interactions that are part of traffic should be used to design self-driving cars’ interactions with other road users, similar to how research has helped improve the usability of mobile phones and technology more broadly,” he said.

The study was presented at the 2023 CHI Conference on Human Factors in Computing Systems, where it won the conference’s best paper award.

Produced in association with SWNS Talker

Edited by Kyana Jeanin Rubinfeld and Sterling Creighton Beard
