
Harman International Industries (Stamford, Conn.), for example, was at the Geneva International Motor Show last week to demonstrate its own “eye and pupil tracking” technology. The company’s in-cabin camera “continually captures the driver’s pupil dilation, and a proprietary software algorithm analyzes the pupil reflex using advanced filtering and signal processing,” according to Harman.
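Harman has not published the algorithm, but the general approach it describes, smoothing a noisy pupil-diameter signal and flagging departures from a rolling baseline, can be sketched in a few lines of Python. The smoothing factor, baseline window, and threshold below are hypothetical illustrations, not Harman's implementation:

```python
import numpy as np

def smooth_pupil_signal(diameters_mm, alpha=0.2):
    """Exponentially smooth a noisy pupil-diameter time series (in mm).

    alpha is a hypothetical smoothing factor; Harman's actual filtering
    and signal-processing pipeline is proprietary.
    """
    smoothed = np.empty(len(diameters_mm), dtype=float)
    smoothed[0] = diameters_mm[0]
    for i in range(1, len(diameters_mm)):
        smoothed[i] = alpha * diameters_mm[i] + (1 - alpha) * smoothed[i - 1]
    return smoothed

def dilation_events(smoothed, baseline_window=60, threshold=0.15):
    """Flag samples where the pupil is dilated well beyond a rolling baseline.

    threshold=0.15 means 15% above baseline; the value is purely illustrative.
    """
    events = []
    for i in range(baseline_window, len(smoothed)):
        baseline = smoothed[i - baseline_window:i].mean()
        if smoothed[i] > baseline * (1 + threshold):
            events.append(i)
    return events
```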
FotoNation (San Jose), an imaging algorithm specialist for smartphones, came to the Mobile World Congress last month and talked about its own driver monitoring system – which, for a change, doesn’t use eye-gazing technology. Youssef Benmokhtar, senior director of marketing and business development at FotoNation, told EE Times, “At FotoNation, we see automotive as the next opportunity for our growth.”
Other technology companies moving into the driver monitoring market include Smart Eye (Gothenburg, Sweden), whose algorithms search for both the iris and pupil, and Tobii Tech (Stockholm, Sweden), designer of an eye-tracking system composed of sensors and algorithms.
Level 3 Vehicle Automation
So, what’s driving the auto industry to monitor drivers?
Roger Lanctot, associate director of automotive practice at Strategy Analytics, said that it’s “Level 3 autonomous driving.”
The U.S. Department of Transportation’s National Highway Traffic Safety Administration (NHTSA) defines vehicle automation in five different levels. Level 3 implies “Limited Self-Driving Automation.”
Under Level 3, Lanctot said, “there is an implied need to monitor the driver to ensure he or she is available to/capable of taking control of the car as it transitions from automated driving back to being driven.”
Jeremy Carlson, senior analyst, automotive technology at IHS Automotive, believes that automakers need this step before they advance their vehicles’ functions from ADAS to automation and ultimately autonomy.
Carlson said, “We know very little about the driver since the only contact points we have today are the steering wheel, pedals and transmission, with only the first two being relatively ‘constant’ points of feedback.”
Carlson added, “It is critical to understand as best we can what the driver is doing—and hopefully avoiding the pitfalls of intuiting what they’re doing simply by inferring from how they interact with the steering wheel, etc.”
But beyond safety, and insurance companies with a vested interest in knowing how drivers drive, what can carmakers gain from monitoring drivers?
Car sharing, car as a service
FotoNation’s Benmokhtar rattled off a few factors motivating carmakers to install a driver monitoring system. They want, for example, to build a car that knows the personal preferences of the driver and passengers, so that the car can tailor comfort settings such as seat position and entertainment content. Auto companies are also eager to design cars for ride-sharing or for use as a service. In either case, a car, for security reasons, should be able to identify the driver and authorize him or her to drive, while allowing the driver to pre-load choices in navigation and entertainment.
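As a rough illustration of that identify-then-personalize flow, consider the minimal Python sketch below. The profile fields, face IDs, and vehicle-interface stubs are all hypothetical stand-ins for whatever enrollment database and in-vehicle interfaces an OEM would actually use:

```python
from dataclasses import dataclass, field

@dataclass
class DriverProfile:
    name: str
    authorized: bool
    seat_preset: int                      # memory-seat preset number
    nav_favorites: list = field(default_factory=list)
    playlist: str = ""

# Hypothetical database of enrolled drivers, keyed by a face-recognition ID.
PROFILES = {
    "face_id_001": DriverProfile("Alice", True, 3, ["Home", "Office"], "Morning Mix"),
    "face_id_002": DriverProfile("Bob", False, 1),  # recognized, but not authorized
}

# Stand-ins for vehicle interfaces; a real system would talk to body
# electronics, navigation, and infotainment over the in-vehicle network.
def apply_seat_preset(n): print(f"Seat moved to preset {n}")
def load_nav_favorites(favs): print(f"Navigation favorites loaded: {favs}")
def queue_playlist(name): print(f"Playing: {name}")

def on_driver_recognized(face_id):
    """Authorize the recognized driver and pre-load his or her preferences."""
    profile = PROFILES.get(face_id)
    if profile is None or not profile.authorized:
        return None  # unknown or unauthorized driver: no personalization, no start
    apply_seat_preset(profile.seat_preset)
    load_nav_favorites(profile.nav_favorites)
    queue_playlist(profile.playlist)
    return profile

on_driver_recognized("face_id_001")
```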
IHS’s Carlson added, “OEMs can also tailor experiences to better suit the driver, leveraging eye tracking.” He said, “This can allow the vehicle to provide information in a different venue (head-up display vs. instrument cluster, for example) depending on where they’re looking.” Eye tracking could also show carmakers where drivers look most often, he noted, which in turn could reveal which displays cause distractions and help avoid, for instance, constantly changing the head-up display.
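In code, the venue choice Carlson describes might reduce to a lookup from the gaze-estimation engine’s reported region to a display. The region names and mapping below are assumptions for illustration only:

```python
from enum import Enum

class Venue(Enum):
    HEAD_UP_DISPLAY = "HUD"
    INSTRUMENT_CLUSTER = "cluster"
    CENTER_STACK = "center stack"

# Hypothetical gaze regions, as a gaze- or head-tracking engine might report them.
GAZE_TO_VENUE = {
    "road_ahead": Venue.HEAD_UP_DISPLAY,
    "cluster": Venue.INSTRUMENT_CLUSTER,
    "center_console": Venue.CENTER_STACK,
}

def route_notification(gaze_region, default=Venue.INSTRUMENT_CLUSTER):
    """Present information where the driver is already looking,
    rather than forcing a glance toward a fixed display."""
    return GAZE_TO_VENUE.get(gaze_region, default)

print(route_notification("road_ahead"))  # Venue.HEAD_UP_DISPLAY
```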
The automotive industry’s interest in monitoring drivers appears real.
Benmokhtar noted that FotoNation, which licenses face recognition/driver monitoring algorithms and a hardware accelerator, has been getting traction for its monitoring system from Tier Ones and OEMs. The company has responded to several RFQs, and its technology has been benchmarked against others. “We think we are getting close [to being designed in],” he said.
Lanctot, however, made it clear that “the immediate market opportunity” for driver monitoring systems is “in commercial vehicles across a wide range of applications.” He noted that Seeing Machines (Canberra, Australia) reported achieving profitability thanks to an existing contract with Caterpillar.
In his opinion, it’s still not at all clear whether “car companies will in fact try to bring systems to market with Level 3 automation.”
He noted, “Google obviously eschews this level of automated driving and multiple organizations have suggested the wider automotive industry may also seek to leapfrog directly to Level 4 – given the challenges and potential failures associated with the process of transition.”
In NHTSA’s definition, Level 4 means “full self-driving automation,” where a driver is not expected to be available for control at any time during the trip.
In contrast to Google, however, Japanese automakers, such as Toyota and Nissan, are reportedly eager to embrace Level 3 autonomous driving now.
Tracking eyes without using eye-gazing tech
Eye-tracking, no doubt, is a complex technology, as the human eye is in constant motion, Carlson noted.
Although the technology is at an early stage, a handful of vendors have made progress with eye-gazing systems.
But FotoNation believes eye-tracking/eye-gazing technologies come with limitations — especially related to cost and implementation issues for OEMs and Tier Ones.
FotoNation claims a technology that can do away with high-cost cameras. “We can use a VGA camera – instead of a mega-pixel sensor – to track a driver’s head position/orientation,” which in turn helps determine eye location, said Benmokhtar.
Technically speaking, FotoNation’s system isn’t an eye-tracking system. Rather, it tracks heads. The technology tracks 50 points on the driver’s face, enabling the monitoring system to locate features such as the eyebrows and even accurately measure how open or closed an eye is, according to the company.

Combined with other data points such as mouth opening (which indicates that a driver is yawning) and head orientation (whether the head is nodding), the system can determine driver drowsiness, for example.
The system works even when a driver wears a mouth mask (as often observed in Asia) and with all types of glasses and lighting conditions. Benmokhtar said that it also works with NIR-blocking sunglasses.
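FotoNation has not disclosed its algorithm, but the cues described above (eye openness, mouth opening, head nodding) can be fused along the lines of the widely published eye-aspect-ratio technique. The landmark layout, thresholds, and two-of-three voting rule in this sketch are illustrative assumptions, not FotoNation’s method:

```python
import numpy as np

def aspect_ratio(pts):
    """Openness ratio from six contour points ordered around an eye or a
    mouth, following the convention of the eye-aspect-ratio literature.
    pts: array of shape (6, 2)."""
    vertical = (np.linalg.norm(pts[1] - pts[5]) +
                np.linalg.norm(pts[2] - pts[4])) / 2.0
    horizontal = np.linalg.norm(pts[0] - pts[3])
    return vertical / horizontal

def is_drowsy(eye_pts, mouth_pts, head_pitch_deg,
              eye_closed_thresh=0.20, yawn_thresh=0.60, nod_thresh=20.0):
    """Fuse three per-frame cues into a coarse drowsiness flag.
    All thresholds are hypothetical; a production system would track
    these cues over time rather than judging a single frame."""
    eyes_closed = aspect_ratio(eye_pts) < eye_closed_thresh  # lids nearly shut
    yawning = aspect_ratio(mouth_pts) > yawn_thresh          # mouth wide open
    nodding = head_pitch_deg > nod_thresh                    # head tipping forward
    # Require at least two of the three cues to reduce false alarms.
    return sum([eyes_closed, yawning, nodding]) >= 2
```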

Because the system doesn’t depend on eye-gaze, automakers can save on cost and gain freedom in camera placement. A frontal camera position is not necessary, according to the company. “They can put it either in A or B pillar in a vehicle, or they can even put it on a console system or a rear view mirror,” said Benmokhtar.
For driver identification, FotoNation uses its own “biometrics-grade facial recognition with deep learning algorithm,” according to the company. FotoNation also offers iris recognition technology for extra security.
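Layering iris recognition on top of face recognition amounts to a two-factor biometric gate. Below is a minimal sketch, assuming the recognition engine returns match confidences in [0, 1]; both thresholds are hypothetical:

```python
from typing import Optional

def authorize_driver(face_score: float,
                     iris_score: Optional[float] = None,
                     face_thresh: float = 0.90,
                     iris_thresh: float = 0.95,
                     require_iris: bool = False) -> bool:
    """Gate vehicle access on face recognition, optionally hardened
    with iris recognition for extra security."""
    if face_score < face_thresh:
        return False  # face match too weak: deny regardless of iris
    if require_iris:
        return iris_score is not None and iris_score >= iris_thresh
    return True

print(authorize_driver(0.97))                           # face only -> True
print(authorize_driver(0.97, 0.80, require_iris=True))  # weak iris -> False
```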
360-degree around-vehicle sensing
FotoNation’s interest in the automotive sector isn’t limited to an in-cabin imaging solution.
The company aspires to use its imaging technology for “around-vehicle sensing” for urban driving. Benmokhtar distinguished this from ADAS features such as forward collision and lane departure warning systems often used on highways.
“We aren’t going after Mobileye,” said Benmokhtar. “But we want to add some intelligence – around-vehicle sensing – to see if any objects [are] coming closer to a car, for example.”
Lanctot called city driving “one of the greatest challenges for automated driving,” as cars need to avoid pedestrians and navigate intersections.
“Nearly everyone is already working in this ‘around-vehicle sensing’ space,” said IHS’s Carlson, “because it’s needed to support the driver (ADAS and automation) or replace the driver (autonomy).” Carlson noted that even Mobileye is getting into 360-degree sensing. “In short, there is going to be plenty of competition.”
CE heritage
Originally coming from the consumer electronics world, FotoNation has a long history of supplying critical imaging algorithms to vendors of smartphones and digital still cameras. The company supplied the panoramic features of Sony’s Cyber-shot camera (used in TV commercials featuring Taylor Swift). Its imaging technology has also focused on people. FotoNation, for example, has developed its own face detection/tracking technology and algorithms for face beautification to smooth skin and enhance the user’s eyes (used in Huawei’s smartphones), Benmokhtar explained. “We also offer image stabilization technology.”
The company’s consumer heritage has worked well for FotoNation as it moves into the automotive sector. “Our imaging algorithms are small enough to run on embedded vision SoCs,” said Benmokhtar, “such as Texas Instruments’ TDA3x SoC for ADAS.”
FotoNation also makes hardware acceleration available for licensing.
— Junko Yoshida is Chief International Correspondent, EE Times
