The day is approaching when commuters stuck in soul-crushing traffic will be freed from the drudgery of driving. Companies are investing billions to devise sensors and algorithms so motorists can turn their attention to where they like it these days: their phones.
But before the great promise of multitasking on the road can be realized, we need to overcome an age-old problem: motion sickness. “The autonomous-vehicle community understands this is a real problem it has to deal with,” said Monica Jones, a transportation researcher at the University of Michigan. “That motivates me to be very systematic.”
So starting in 2017, Ms. Jones led a series of studies in which more than 150 people were strapped into the front seat of a 2007 Honda Accord. They were wired with sensors and sent on a ride that included roughly 50 left-hand turns and other maneuvers.
Each subject was tossed along the same twisty route a second time, this time while also completing a set of 13 simple cognitive and visual tasks on an iPad Mini. About 11 percent of the riders got nauseated or, for other reasons, asked that the car be stopped. Four percent vomited.
Ms. Jones takes no joy in documenting her subjects’ getting dizzy, hyperventilating or losing their lunch. She feels their pain. Ms. Jones, a chronic sufferer of motion sickness, has experienced those discomforts in car back seats all her life. “I don’t remember not experiencing it,” she said. “As I’m getting older, it’s getting worse.”
It’s also getting worse for the legions of commuters hailing Ubers or taxis and hopping in, barely lifting their gaze from a screen in the process.
The University of Michigan subjects were recruited to represent not only those with histories of getting carsick, like Ms. Jones, but also passengers along a spectrum of susceptibility. An equal number of men and women were tested.
The first 20-minute test drives were conducted at Mcity, an ersatz city managed by the University of Michigan’s Transportation Research Institute. But more recently, the Accord merged with local traffic for one-hour drives. Test riders will eventually be relocated to the back seat, where Americans increasingly find themselves.
In the study, subjects narrated their levels of nausea during the route. Video cameras and wired sensors captured facial expressions, heart rate, skin temperature and changes in body and head posture. Those were indexed against precise metrics about the vehicle’s movement.
Ms. Jones wants to help people avoid and treat motion sickness. But at this early stage of her research, she’s merely aiming to better understand the “fundamentals of human response.” For example, there might be clues in how riders who get carsick hold their heads, maintain their posture or position the mobile devices they’re using. “I’m not out for the engineering solution directly,” Ms. Jones said.
But Florian Dauth, an automated-driving engineer for the ZF Group of Germany — one of the world’s largest automotive suppliers — is in the business of devising engineering solutions. He has been working for more than two years on strategies to reduce motion sickness in autonomous vehicles.
“We are developing algorithms that self-learn based on bodily reactions,” he said, referring to the machine-generated code that determines the vehicle’s path. To navigate the road safely, automated vehicles already receive and combine data from an arsenal of radar, laser, video and ultrasonic sensors. ZF says data about the passenger’s well-being should be added to the algorithm.
Mr. Dauth is collecting passengers’ biological data via cabled inputs, like measurements of brain activity from electrodes placed on a rider’s scalp and similar monitoring of the heart. When put into production, the self-driving biofeedback system would most likely be reduced to cameras powered by facial-detection software or perhaps wearable devices.
“Let’s say the car takes a strong left curve and then brakes very roughly at a red traffic light. We are recording all the vehicle movements and the passenger’s reactions in parallel,” Mr. Dauth said. “If you react in a way that gives you symptoms, then in the future we will avoid these maneuvers.” In other words, the self-driving car’s A.I. learns how to drive in a way that doesn’t make you sick.
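The feedback loop Mr. Dauth describes — log the vehicle’s maneuvers alongside the passenger’s reactions, then steer future driving away from the maneuvers that caused symptoms — can be sketched in a few lines. This is only an illustrative toy, not ZF’s actual system; the maneuver labels, the 0-to-1 discomfort score and the `ComfortModel` class are all assumptions made for the example.

```python
# Toy sketch of comfort learning: pair each recorded maneuver with the
# passenger's measured reaction, keep a running discomfort average per
# maneuver type, and let the planner prefer lower-scoring alternatives.
# All names here are illustrative, not ZF's interface.
from collections import defaultdict

class ComfortModel:
    def __init__(self):
        self.totals = defaultdict(float)  # summed discomfort per maneuver type
        self.counts = defaultdict(int)    # number of observations per type

    def record(self, maneuver: str, discomfort: float) -> None:
        """Log one maneuver (e.g. 'hard_left', 'hard_brake') with a
        0.0-1.0 discomfort reading derived from biosensor data."""
        self.totals[maneuver] += discomfort
        self.counts[maneuver] += 1

    def score(self, maneuver: str) -> float:
        """Average observed discomfort; unseen maneuvers score 0."""
        n = self.counts[maneuver]
        return self.totals[maneuver] / n if n else 0.0

    def pick(self, options: list[str]) -> str:
        """Choose the candidate maneuver with the least learned discomfort."""
        return min(options, key=self.score)

model = ComfortModel()
model.record("hard_left", 0.9)    # strong curve -> strong bodily reaction
model.record("hard_brake", 0.8)   # rough stop at the light
model.record("gentle_left", 0.1)  # mild curve -> barely any reaction
model.pick(["hard_left", "gentle_left"])  # -> "gentle_left"
```

A production system would replace the running averages with a learned model over continuous trajectory features, but the shape of the loop — observe, score, avoid — is the same.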
ZF might want automated cars to become calmer drivers, but back in Michigan, Ms. Jones’s research places some of the responsibility for avoiding motion sickness on a rider’s common sense. As you might expect, not reading a book, or Twitter, helps one avoid motion sickness.
But Brian Lathrop, a technologist at Volkswagen with a doctorate in cognitive psychology, doesn’t harbor hope that passengers will put down their phones. “If you’re talking about a Level 4 autonomous vehicle, you have to ask yourself what are people going to be doing in the car,” he said. In a so-called Level 4 car, passengers don’t need to pay any attention to a steering wheel or the road.
“The easy answer is they’ll still use their smartphones,” he said. “But you also have to anticipate the high probability that they will be using some sort of virtual reality or augmented-reality system.” That’s right. We’re facing a brave new automotive world in which people zoom down the road in a self-driving vehicle while wearing fully immersive virtual-reality headgear.
Mr. Lathrop, working with fellow technologists at Volkswagen’s Innovation and Engineering Center California in the heart of Silicon Valley, is trying to eliminate motion sickness when using V.R. in a moving automobile. Mr. Lathrop said the unease happens when there’s a disconnect between the signals sent to your brain from your inner ear and what you’re seeing. “I wanted to look at how could you address that disconnect between the visual signals and the stimulus signal,” he said.
Before long, Volkswagen and its luxury brand, Audi, were developing original V.R. content for the car. “You can coordinate the optic flow of visual information inside the V.R. headset such that it’s correlated with the actual motion of the vehicle,” he said.
A few months ago, I took one of those virtual-reality experiences for a test ride in the back seat of a sport utility vehicle with a game created by Holoride — a company spun off from Audi to develop V.R. entertainment in cars. I slipped on an Oculus Rift headset and was immediately transported to the sights and sounds of a submarine pod surrounded by schools of jellyfish, gently gliding blue whales and undulating aquatic creatures.
The streets of San Jose, where the ride occurred, vanished from my consciousness. Yet every movement of the virtual submarine — speeding up, turning, avoiding sharks — was informed by the Audi S.U.V.’s actual movements. Using Mr. Lathrop’s vernacular, my optic flow was perfectly aligned with my stimulus signal: so, no motion sickness.
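The motion-matching idea behind that ride — drive the virtual submarine directly from the real car’s telemetry, so the optic flow agrees with the vestibular signal — can be sketched as a simple mapping. The telemetry fields, scaling factors and type names below are assumptions for illustration, not Holoride’s actual interface.

```python
# Illustrative sketch of in-car V.R. motion matching: the virtual
# vehicle's motion is a (possibly scaled) copy of the real car's, so
# what the eyes see agrees with what the inner ear feels.
from dataclasses import dataclass

@dataclass
class Telemetry:
    speed_mps: float      # car's forward speed, m/s
    yaw_rate_dps: float   # car's turn rate, degrees/second (+ = left)
    accel_mps2: float     # longitudinal acceleration, m/s^2 (- = braking)

@dataclass
class SubmarineMotion:
    forward: float  # virtual forward speed
    turn: float     # virtual turn rate
    pitch: float    # + = nose down, mimicking the car diving under braking

def match_motion(t: Telemetry, scale: float = 1.0) -> SubmarineMotion:
    """Map real car motion onto the virtual submarine one-to-one
    (optionally scaled), keeping optic flow correlated with motion."""
    return SubmarineMotion(
        forward=t.speed_mps * scale,
        turn=t.yaw_rate_dps * scale,
        pitch=-t.accel_mps2 * 0.5,  # braking (negative accel) dips the nose
    )

# The S.U.V. cruises at 10 m/s, turns gently left, and brakes:
m = match_motion(Telemetry(speed_mps=10.0, yaw_rate_dps=5.0, accel_mps2=-2.0))
```

The content on top of this motion — jellyfish, whales, sharks — is free to vary; what must stay locked to the car is the motion itself.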
“You could apply that same logic to productivity applications, social activity, shopping and exploration,” Mr. Lathrop said.
This doesn’t mean that we all need to don V.R. headsets to avoid nausea while being productive in a self-driving vehicle. “I anticipate that we might have a very low-profile, lightweight type of V.R. platform that’s like putting on a pair of sunglasses,” he said.