“The Human Machine Interface.” It sounds very Brave New World. And yet those man-machine interactions happen even as I type this into my computer and you read it from an electronic, Internet-accessible device. In the world of automobiles, the human machine interface means understanding how people interact with technology in order to develop user-friendly ways to organize and display information in vehicles to make the driving experience safer and more efficient.

At General Motors’ Human Machine Interface lab, this is Dr. Thomas Seder’s job. Seder gave me a peek into a few potentially game-changing technologies under his lab’s purview. Some of these advancements are already here, or about to be – and others are more distant.

Much of the technology centers on understanding the driver’s alertness or level of distraction, such as drowsiness detection systems. Already in use, these systems use sensors and cameras to determine if the driver is entering a micro-sleep state, often by measuring “PERCLOS,” the percentage of time the eyes are closed. When a drowsy state is detected, the driver is alerted by some combination of audible and haptic (sense of touch) cues. “In this case, it wouldn’t be subtle,” Seder says. “We want to make the driver know he or she is falling asleep.”
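To make the PERCLOS idea concrete, here is a minimal sketch of how such a metric could be computed from per-frame eyelid-closure estimates. The class name, window size, and alert threshold are all illustrative assumptions, not GM’s actual parameters.

```python
# Hypothetical sketch: a PERCLOS-style drowsiness metric.
# PERCLOS is the percentage of time, over a sliding window, that the
# eyes are mostly closed. All thresholds here are made-up stand-ins.

from collections import deque

class DrowsinessMonitor:
    def __init__(self, window_frames=900, closed_threshold=0.8,
                 perclos_alert=0.15):
        # window_frames: e.g. 30 seconds of video at 30 fps
        self.window = deque(maxlen=window_frames)
        self.closed_threshold = closed_threshold  # closure fraction counted as "closed"
        self.perclos_alert = perclos_alert        # PERCLOS level that triggers an alert

    def update(self, eye_closure):
        """eye_closure: 0.0 (fully open) .. 1.0 (fully closed), per camera frame."""
        self.window.append(eye_closure >= self.closed_threshold)

    def perclos(self):
        if not self.window:
            return 0.0
        return sum(self.window) / len(self.window)

    def is_drowsy(self):
        return self.perclos() >= self.perclos_alert

monitor = DrowsinessMonitor()
for closure in [0.1, 0.2, 0.9, 0.95, 0.1]:  # simulated camera frames
    monitor.update(closure)
print(monitor.perclos())  # 2 of 5 frames counted as closed -> 0.4
```

A production system would feed this from a camera-based eyelid tracker and combine it with other signals (head pose, steering behavior) before issuing the audible and haptic alerts Seder describes.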

Enhanced Vision Systems (EVS) are filtering down from aviation. As an application of “augmented reality,” these types of systems understand the focus of the driver’s attention based on the position of the eyes and then use visual cues to bring the driver’s attention to unrecognized potential threats in the roadway.

This prototype heads-up display (HUD) highlights road signage (image 1) and road curvature in poor visibility conditions (image 2).


For instance, the sensors may detect that the driver is focused on a traffic light about to turn red, and not on a kid on a bicycle in his periphery. The EVS would then highlight the kid by tracing a digital outline on the windshield. Conversely, if the driver’s eyes were trained on the kid and not the traffic signal, the EVS would highlight the traffic signal. An EVS could also highlight the road through fog or project a thermal outline of a deer crossing the road at night, Seder says.
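The attention logic in that example can be sketched in a few lines: given an estimate of where the driver is looking and a set of detected hazards, flag the hazards that fall outside the driver’s foveal region for highlighting. The function name, coordinate scheme, and foveal radius below are all assumptions for illustration.

```python
# Illustrative sketch of EVS attention logic: highlight hazards the
# driver is NOT currently looking at. Coordinates are normalized
# windshield positions; the foveal radius is an arbitrary stand-in.

import math

def hazards_to_highlight(gaze_xy, hazards, foveal_radius=0.15):
    """Return hazards far enough from the gaze point to need a visual cue.

    gaze_xy: (x, y) driver gaze estimate in normalized windshield coords
    hazards: dict mapping hazard name -> (x, y) detected position
    """
    unattended = {}
    for name, (hx, hy) in hazards.items():
        dist = math.hypot(hx - gaze_xy[0], hy - gaze_xy[1])
        if dist > foveal_radius:
            unattended[name] = (hx, hy)  # trace a digital outline on the HUD
    return unattended

# Driver is watching the traffic light; the cyclist is in the periphery.
gaze = (0.8, 0.6)
hazards = {"traffic_light": (0.82, 0.62), "cyclist": (0.2, 0.4)}
print(hazards_to_highlight(gaze, hazards))  # only the cyclist is flagged
```

Swapping the gaze point toward the cyclist would flip the result, flagging the traffic light instead — mirroring the symmetric behavior Seder describes.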


These types of systems can also be used for navigation in conjunction with GPS and a planned map route. An EVS could overlay an electronic arrow on the road to indicate a turn or an off-ramp, or highlight a street sign or a physical landmark such as an office tower.

Seder says partial EVS applications could be available in model years 2015 or 2016, while a fully realized system could be available at the end of this decade.

Another alertness detection system might measure brain waves using an EEG (electroencephalogram) to determine whether the driver is in a “high workload” state. While an EEG traditionally requires probes on the scalp, in theory an EEG could be mounted in a car’s headliner to monitor the driver without contact. However, while the electric fields that emanate from the head extend about a half inch into the air, the technology is not yet sensitive enough to pick them up, Seder says. This is one of those “horizon possibilities,” according to Seder.

Television makers such as Samsung and LG are sinking billions into AMOLED (Active Matrix Organic Light Emitting Diode) technology, which promises flexible, lightweight and power-efficient screens. In autos, AMOLED technology could be used for digital rearview “mirrors” or as a transparent heads-up display. Because AMOLED is flexible, it could potentially alleviate blind spots by projecting onto a vehicle’s B or C pillars the objects they block from view.

There are a few kinks to work out for autos, Seder says: OLED pixels suffer fairly rapid “luminance depreciation” and a form of image burn-in when subjected to repeated fixed patterns, such as those in dashboard display gauges.

Persuasive design is a nascent area of research that extends well beyond automotive engineering, Seder says. The technology already exists to promote eco-driving, such as the Chevy Volt’s floating-ball display that “rewards” drivers for more efficient driving habits. When pushed on where GM is going with this, Seder couldn’t reveal any secrets, except to say the company has a whole ecosystem of displays and techniques to prod drivers to drive more ecologically, if they so desire.

Automakers are also keen to understand the decline of the human visual system, and of cognitive ability in general, in order to design displays accordingly. These designs account for age-related changes: reduced perception of blue light, loss of high-frequency hearing, and a slowing of our ability to process information and make decisions. The key today is to achieve a universal design that works for 18- to 85-year-olds, though customized, prescription-based display formats tailored to individual characteristics could be on the horizon.

While some GPS navigation systems currently offer rudimentary routing options to save fuel (such as UPS’ routing technology, which favors routes with the fewest left-hand turns), the next level of eco-routing would factor in present and historical traffic data, road-grade data, the characteristics of your make and model, and even historical data for your specific vehicle, Seder says.
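As a toy illustration of that eco-routing idea, a router could score each candidate route with an estimated energy cost that folds in traffic, road grade, and a per-vehicle efficiency factor. Every coefficient below is a made-up stand-in for the real traffic, grade, and vehicle models the article describes.

```python
# Toy eco-routing sketch: pick the route with the lowest estimated
# energy cost. Coefficients are arbitrary illustrative values.

def route_energy_cost(distance_km, avg_traffic_delay, net_climb_m,
                      vehicle_efficiency=1.0):
    base = distance_km * 0.7                            # nominal energy per km
    congestion = distance_km * 0.3 * avg_traffic_delay  # stop-and-go penalty
    grade = max(net_climb_m, 0) * 0.01                  # climbing costs energy
    return (base + congestion + grade) * vehicle_efficiency

routes = {
    # longer but free-flowing vs. shorter but congested
    "highway": route_energy_cost(12.0, avg_traffic_delay=0.1, net_climb_m=20),
    "surface": route_energy_cost(9.0, avg_traffic_delay=0.6, net_climb_m=5),
}
best = min(routes, key=routes.get)
print(best)  # the shorter surface route wins despite heavier traffic
```

The `vehicle_efficiency` factor is where the per-make, per-model, and even per-vehicle historical data Seder mentions would plug in, scaling the estimate to how your particular car actually consumes energy.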

But with all this data at our fingertips, I ask Seder, could the car send information to the DMV to be used to evaluate fitness to drive? That would certainly be possible, Seder admits, noting that privacy is a key concern. “It’s hard to quantify and politically dangerous to hand this [type of information] over to someone else,” Seder says. However, Seder noted the data is certainly useful, and less of a privacy issue, for fleets in which the vehicles are company property.

The idea of autonomous vehicles has taken hold of the imagination recently with the publicity of Google’s self-driving car. Seder predicts some form of semi-autonomous driving by 2015 and full autonomous driving by 2025. Seder tracks autonomous vehicle technology as it relates to situational awareness in an autonomous vehicle. “It’s about understanding the state of the driver,” he says. “If you have to take control [of the autonomous car], how ready are you?”

But Seder says his lab is only tangentially involved in autonomous vehicles. At General Motors, that’s another department with a separate group of very smart people trying to figure out the future.  

Originally posted on Business Fleet

About the author
Chris Brown

Associate Publisher

As associate publisher of Automotive Fleet, Auto Rental News, and Fleet Forward, Chris Brown covers all aspects of fleets, transportation, and mobility.
