Go for a ride in a Google autonomous car

What's it like to ride in a driverless car? Google's Autonomous Car Team explains the features on a test ride with Sacramento Bee transportation writer Tony Bizjak.

Back-Seat Driver

Tony Bizjak writes about traffic and travel in the Sacramento region

Sacramento Bee’s transportation reporter puts Google’s self-driving car to the test

By Tony Bizjak

tbizjak@sacbee.com

February 06, 2016 04:01 PM

Mountain View

I’m sitting in the back seat of a car with no driver, watching the steering wheel spin one way, then the other, as the car winds through suburban neighborhood streets. Yet I have no fear. Well, almost none.

It’s a self-driving car built by Google, and I’m getting a glimpse of the future.

That future – where drivers become mere passengers, sitting back, perhaps taking a nap, as a computer pilots their vehicle – could be closer than you think. Each month, it seems, another car or technology company announces it’s dispatching a platoon of self-driving cars onto the streets for tests, mingling with human drivers.

Last week, as The Bee’s Back-Seat Driver transportation writer, I joined two Google researchers on one of those tests, a 25-minute ride in a fully autonomous car, guided by lasers, radar and video cameras.

My report: It wasn’t scary or even exciting. (Eerie might be the word, watching that steering wheel turn without human touch.)

There were no crashes or close calls. In fact, our Google car was a goody-two-shoes. It drove like someone who’d read the DMV handbook cover to cover, and liked it so much it read it again.

The car refused to make a single rolling “California stop,” no matter what other cars around it did. It even stopped for a second at a “yield” sign when it couldn’t quite see around a hedge.


That’s the point, though. Engineers for technology and automobile companies are pitching self-driving cars as a safe and sane alternative to human drivers. The cars obey the rules. They will not drive drunk or distracted or angry. Someday, they may even communicate with other cars, electronically announcing their intention to change lanes or take the next exit.

Some of the biggest names in cutting-edge technology – Google, Uber and Tesla – as well as legacy automobile makers such as BMW, Ford, Volkswagen, Mercedes-Benz, Nissan, Honda and others, are pumping billions of dollars into the effort to bring robot cars to life. So far, 11 companies have permits to test advanced-technology cars in California.

Some, such as BMW and Tesla, are incrementally adding self-driving technology, such as lane-tracking systems and self-parking, to cars already on the market. A BMW executive said in an interview that his company isn’t rushing to bring a fully self-driving car to market – although it is testing fully autonomous cars now on German autobahns.

He pointed out that BMW is, after all, “The Ultimate Driving Machine,” and that some drivers will want their own hands on the wheel on a mountain road.

“We have the opportunity to give it as much time as it needs,” said Frank Breust, a BMW government affairs executive. “This is not something which will happen all of a sudden. This will be a continuous process.”

Google, on the other hand, is full throttle. The Silicon Valley-based technology giant wants to get a fully autonomous car on the market in the next few years. Google self-driving car chief Chris Urmson has said he has a personal goal: He doesn’t want his 12-year-old son to get a driver’s license.

“We’ve been improving very quickly,” said Nathaniel Fairfield, Google’s autonomous car program software lead, who hosted my ride last week. “We really want to get self-driving vehicles out in the world, making a difference for people, improving safety.”

‘Put safety first’

Safety is a big question right now, pitting regulators against innovators.

California’s Department of Motor Vehicles is scrambling to catch up with the technology, and it hopes to finalize a three-year set of rules of the road by the end of the year covering research and early public use. So far, the DMV says all self-driving test cars must have a steering wheel and a licensed driver in the front seat to take control.

“We want to get there, but we want to get there in a way that is safe for everybody,” said DMV deputy director and chief attorney Brian Soublet.


Google, which built its prototype cars without a steering wheel or gas pedal, opposes that requirement. Its attitude is that you don’t build tomorrow with yesterday’s tools. In deference to the current regulatory reality, the company has retrofitted its prototypes with steering wheels for street tests.

At a state hearing in Sacramento two weeks ago, John Simpson of Consumer Watchdog told the DMV to stand firm. He pointed out that robot car technology isn’t quite there yet. “I think this technology, down the road, will offer potential safety (improvements), but it is not ready yet for prime time,” Simpson said. “Put safety first.”

Some advocates for blind, elderly and disabled people are eager to see the technology move quickly, and don’t want the DMV to require a licensed driver in the car when the cars go public. They fear non-drivers will be shut out of what could be a technology that gives them a freedom they’ve never had.

“We look at the potential as nothing short of revolutionary for the disability community,” said Teresa Favuzzi of the California Foundation for Independent Living Centers. “Don’t keep us out of the revolution.”

An early look at that revolution can be found on the streets of Mountain View in Silicon Valley. There, Google has been testing robot cars for six years, logging 1.4 million road miles, supplemented by simulated laboratory driving tests.

The day before The Bee’s test ride, I asked University of Southern California engineering professor Jeffrey Miller, an expert on autonomous technology, what to expect. At first, “you’ll have a feeling of helplessness,” he said. “Your fate is in the hands of the vehicle.” In a few minutes, he said, you’ll start to relax and may even trust the car.

Google’s Fairfield sat in the “driver” seat during our drive, to take the wheel if the car ran into trouble. I took the back seat on the passenger side. A Google test driver sat in the front passenger seat, holding a laptop that showed a 3-D computer image of the car’s route and surroundings, recording what the car was “seeing” on the road and how it reacted.

Fairfield hit a button marked “ON” on the steering wheel. A chipper female voice announced, “auto driving,” and the car took off, but not before stopping at the edge of the Google parking lot and signaling a right turn, even though there wasn’t another car in sight.

Right away, I felt competitive with the car. For some reason, I viewed the car as a thinking being with its own personality, a solid citizen who was, I admit, probably smarter than me (at least in a wonky Silicon Valley way). After all, it has laser beams. But I drive with more panache. I can stick an elbow out the window. So there.

Confused by rain

Google’s test cars are easy to spot. The snow-white prototypes look like little gumdrop candies on wheels. Google designed them that way so they would appear non-threatening. Our ride, however, was in one of several Lexus RX 450h vehicles Google also tests.

It has a black plastic dome on the roof that holds a constantly spinning 360-degree laser device, essentially “seeing” everything around the car. A series of radar devices and video cameras positioned on the sides and back of the car gather more data. Notably, the radar can see past the vehicle directly in front, detecting what’s happening ahead of the car ahead of you. Google also has intricately mapped the streets of Mountain View – the stop signs, signals, crosswalks – and given that information to the car’s computer brain as its cheat sheet.

Every test drive provides data that helps the cars recognize and categorize other objects on the road – cars, pedestrians, bicyclists – which the car views as clumps of data points. Certain clumps, the car’s brain has learned, tend to act certain ways.
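The “clumps of data points” idea can be pictured with a toy sketch: raw sensor returns are grouped into nearby clusters, and each cluster gets a label. The grouping rule, thresholds and labels below are invented for illustration – real perception systems work on dense lidar point clouds and use learned models, not two-line heuristics.

```python
def cluster(points, gap=1.5):
    """Group 2-D points into clumps: a point within `gap` meters
    (along x) of the previous point joins the current clump."""
    clusters = []
    for p in sorted(points):
        if clusters and abs(p[0] - clusters[-1][-1][0]) <= gap:
            clusters[-1].append(p)
        else:
            clusters.append([p])
    return clusters

def label(clump):
    """Label a clump by its spatial extent -- a crude stand-in for
    what the car's brain has 'learned' from past drives."""
    width = max(x for x, _ in clump) - min(x for x, _ in clump)
    return "pedestrian" if width < 1.0 else "car"
```

A narrow clump of returns reads as a pedestrian; a wide one as a car. The real system also tracks how clumps move over time, since motion history is what lets it predict that a tricycle-shaped clump may veer into the street.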

Some objects are hard to figure out, though. Heavy rain, for instance, confuses the cars. Google postponed The Bee’s first planned test ride because heavy rain had left puddles in the road, and the cars aren’t sure how to react when the car ahead splashes water into the air. If confused, the cars are programmed to just pull over and stop.

But they are learning, Google says. On our ride, we passed a man walking next to a child on a tricycle on the sidewalk. In early test years, the car might have slowed down, just in case. Our car didn’t, but it monitored those objects as it passed, knowing from experience that data points similar to those sometimes change direction and move into the street.

Google executives like to tell the story about the Google car that came around a corner to find a woman in a wheelchair waving a broom and chasing a duck in circles in the middle of a residential street. The scenario was new. But it wasn’t hard. The car stopped and waited until the pair got out of the way. “The cars are very patient,” Fairfield said.

Our car did not stop, however, when a gardener with a leaf blower walked into the street about 50 feet ahead of us. Instead, it angled wide and passed as the gardener glanced our way and walked back to the sidewalk.

The car sometimes did seem timid. At one point on a narrow residential street with cars parked on both sides, it slowed to about 5 miles per hour, as if uncertain. Then it seemed to make up its mind, choosing a line down the middle of the street and speeding up.

At another point, though, the car was surprisingly assertive. Making a right turn onto a busy, high-speed street, the car stopped, signaled, then hit the gas hard and bolted into the near traffic lane a few lengths ahead of a car that was bearing down from a hill.

I could tell that we had enough space to scoot in front of that car, but it did seem uncharacteristic, in my limited experience, for the car to show that much verve.

Fairfield explained that the Google car knew it had the room to make its move and had been programmed to go for it. “If you never go until there is a gap that’s so big that the vehicle behind you won’t have to react at all, you are not going to get into traffic and ... the people behind you will get all mad.”
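Fairfield’s explanation amounts to a gap-acceptance rule: merge when the approaching car, reacting with at most a gentle slowdown, would not close the gap before the merge is complete. The sketch below illustrates that trade-off; every number in it (merge time, comfortable deceleration) is an assumption for demonstration, not Google’s actual parameters.

```python
def safe_to_merge(gap_m: float, closing_speed_mps: float,
                  merge_time_s: float = 3.0,
                  comfort_decel_mps2: float = 1.5) -> bool:
    """Accept the gap if the approaching car, easing off at a
    comfortable rate, would not close it before we finish merging."""
    # Distance the approaching car closes during the merge, assuming
    # it decelerates gently (rather than braking hard) the whole time.
    closed = (closing_speed_mps * merge_time_s
              - 0.5 * comfort_decel_mps2 * merge_time_s ** 2)
    return closed < gap_m
```

With a 60-meter gap and a 15 m/s closing speed, the check passes; shrink the gap to 30 meters and it fails. Tuning `comfort_decel_mps2` upward is exactly the trade Fairfield describes: demand that following drivers never react at all, and you never get into traffic.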

A few seconds later, fresh off that triumph of assertiveness, our car hit the brakes harder than it needed to as it approached waiting cars at a red light. I couldn’t see why. Neither could Fairfield. “That’s probably something we want to look at,” he said.

Slow driver

Google cars have been in 16 crashes since they hit the roads a few years ago, according to reports the company files with the state. One involved a minor injury. In almost all of the accidents, cars driven by humans hit the Google car from behind at intersections. In one case, the Google car was struck from behind when it braked hard for a pedestrian who had stepped into the crosswalk.

“We feel we made the right decision to brake,” Fairfield said. “On the other hand, maybe there is something we could have done a little differently to give the person behind us a little room.”

A Google test car also was famously pulled over by a traffic officer a few months ago for going 24 in a 35 mph zone. Google noted it isn’t allowing its prototype cars to go faster than 25 mph for now. The Lexus test cars are allowed to go faster.

Like California, federal regulators say they will step up their efforts this year to get a handle on the new technology, and also to encourage it. Last month, the Obama administration said it will ask Congress to approve $4 billion over the next decade for research supporting driverless car technology.

Federal Transportation Secretary Anthony Foxx recently rode in a Google test car. “Holy smokes; it’s really interesting technology,” Foxx said in an interview Friday.

He said advanced vehicle technology can help stem the country’s traffic fatality rate. “We do have a significant amount of fatalities that happen as a result of human error,” Foxx said. “This is technology that we need to advance.”

But the technology can’t make driving completely safe, said Miller of USC, a member of the Institute of Electrical and Electronics Engineers. “Technology has flaws in it. It is created by humans.”

He and others pose this question: What do you program the car to do when it is presented with a no-win, imminent crash scenario? There will be times when a self-driving car cannot avoid a crash. Does it run into the object in front of it – say, a pedestrian – or veer right into a motorcyclist, or steer left into a wall? In each case, someone will be hurt or killed. Which one?

“We need to have this discussion before driverless vehicles hit the consumer market,” Miller said. “I guarantee you we will get into one of these situations. Will it be frequent? No. But of course it is going to happen.”

That example is extreme, but it serves notice that the new technology poses social questions that go beyond engineering. How do you keep the cars’ computers from being hacked by someone who wants to do harm? Who is responsible if a self-driving car causes a crash? And this big one: Will a day come when humans are banned from driving?

As our Google car glided into the company parking lot at the end of our ride, Fairfield pivoted in his seat, grinning. There is work still to be done, but a lot already has been accomplished, he said. He’s optimistic.

“The day I’m very excited about is the day where we are just as safe as human drivers,” he said. “The next day, guess what? I come in and work on making it a bit safer and the next day after that a bit safer and it just adds up.”