
To build trust in self-driving cars, communication is key, Intel says




Intel conducted its Car and Rider Trust study on its Arizona campus. (Intel)


For all the attention autonomous cars are getting, you’d think more people actually want them. But they don’t. According to the American Automobile Assn., three out of four people say they’re afraid to be a passenger in a car without a driver. Trust is a major barrier.

To figure out what it would take to build that trust, computer chip maker Intel conducted a Car and Rider Trust study, the results of which were revealed Thursday.

Ten people participated in the study, held on Intel's Arizona campus. The participants, who represented a range of ages, ethnicities, genders and backgrounds, were given a cell phone loaded with an app that allowed them to summon a ride in a self-driving car. Once the car arrived, each passenger got a ten-minute ride on a closed course, during which they were asked to verbalize everything they were thinking as the car did things like deviate from its planned route and react to a pedestrian stepping in front of it. They were also interviewed about the experience afterward.

The key findings ...

Humans trust other humans more than machines.

People worry that robots can't make moral decisions, that machines are only capable of calculating simple costs and benefits. But driving is a complicated task. It requires complex decision making to deal with things like road construction or someone unexpectedly walking into the street. So even though thousands of people lose their lives in traffic crashes each year as a result of human error, most people think they're good drivers and place a lot of confidence in human judgment. Self-driving cars may well be safer, but machines are held to a higher standard.

A steering wheel that moves on its own provokes anxiety.

For this study, the passenger rode in the back seat. There was a person in the driver's seat as a safety precaution, but the so-called safety driver didn't touch the controls; the car itself was doing the driving. Seeing a steering wheel spinning around with no one touching it freaked people out, the study found. As long as the controls are there, the human instinct is to reach for them, so the study suggests that standard operator controls should be removed, because their presence makes passengers anxious.

Passengers want proof the technology works.

The Intel study found that passengers felt more confident about getting in the self-driving car after they were briefed on how the sensors work. And once they were in the car, they appreciated the car itself talking to them to communicate what it was doing, such as re-routing. Intel says the car's voice had a huge impact on making passengers feel comfortable. But the car also needed to know when to stop narrating, because constant updates got annoying. The study also found that some passengers wanted to be able to converse with the car the way they ordinarily would with a human Uber or taxi driver. Intel says this conversational aspect will be a very important part of autonomous vehicles going forward.
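To make that finding concrete, here is a minimal sketch, in Python, of the kind of announcement policy the study points toward: brief the rider up front, speak up for meaningful events like re-routing, and throttle repetitive chatter. Everything here is a hypothetical illustration, not Intel's implementation; the RideAnnouncer class, the event names and the 60-second repeat window are all assumed.

import time

class RideAnnouncer:
    """Hypothetical voice-announcement policy for an autonomous ride."""

    REPEAT_WINDOW = 60.0  # assumed: seconds before the same message may repeat
    IMPORTANT_EVENTS = {"rerouting", "pedestrian_stop", "pickup", "dropoff"}

    def __init__(self, speak):
        self.speak = speak        # callback into a text-to-speech system
        self.last_spoken = {}     # message -> timestamp of last utterance

    def brief_rider(self):
        # Riders in the study felt more confident after a sensor briefing.
        self.speak("This car uses cameras, radar and other sensors to see the road.")

    def on_event(self, event, detail=""):
        # Stay quiet for routine telemetry; constant narration annoyed riders.
        if event not in self.IMPORTANT_EVENTS:
            return
        message = f"{event}: {detail}" if detail else event
        now = time.monotonic()
        # Throttle repeats so the voice doesn't wear out its welcome.
        if now - self.last_spoken.get(message, float("-inf")) < self.REPEAT_WINDOW:
            return
        self.last_spoken[message] = now
        self.speak(message)

# Example usage: print stands in for a real text-to-speech system.
announcer = RideAnnouncer(speak=print)
announcer.brief_rider()
announcer.on_event("rerouting", "construction ahead, taking an alternate route")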

Passengers feel safer in a car that always follows the rules.

Like other self-driving cars, Intel's followed all road rules to a T, stopping at stop signs and traveling at the speed limit. It didn't drive like a human, which in theory sounds like it would be annoying, since human driving is what we're used to. But the passengers in the Intel study said that knowing the car would follow the rules at all times helped them feel safe.

Building trust with a machine is similar to building trust between humans.

Building trust requires communicating openly, getting questions answered and staying informed about what's happening.