An extra pair of eyes on the world

Opinion | By Matthew Cockerill & Carsten Eriksen



Smart glasses integrated with automotive systems can usher in a new era of enhanced human perception and safety on the road, say Matthew Cockerill and Carsten Eriksen. 

The recent announcement of Meta’s partnership with Oakley to deliver durable, long-lasting AI-enabled smart glasses designed for athletes marks the next step for this emerging product category, beyond Meta’s initial Ray-Ban smart glasses. As the category begins to move from initial product-market fit towards potentially widespread adoption, it echoes the early days of the smartphone, a period marked by experimentation and gradual refinement.

What began as a device just for phone calls and text messages spawned innovations like Instagram, Uber, and countless others. Smartphones didn’t just add features; they changed how we live. Smart glasses could follow the same path.

We’re currently witnessing the smart glasses “phone calls & texting” phase. They offer basic hands-free smartphone features – capturing memories, streaming music, and AI-powered visual description of what you’re looking at – enough for adoption in some people’s day-to-day lives.

Today’s AI can detail a historic monument or translate a foreign menu, but this visual description capability is just the foundation. The future lies in contextual intelligence that doesn’t just describe what you see, but understands what you need. Picture glasses that guide you through a bicycle repair step-by-step, or instantly highlight your departing train on a crowded station board so you can run and catch it.

This evolution toward contextual intelligence is what we call “Physical AI” – systems that actively perceive the environment, interpret it spatially, and deliver precisely timed insights. Rather than simply overlaying digital information, these systems extend our human senses by understanding the physical world around us. Forward-thinking businesses are already planning for these more capable, integrated possibilities.

But amidst the excitement surrounding this new category, it’s worth pausing to consider the broader context. The recent failure of Humane’s Ai Pin serves as a cautionary tale – an ambitious attempt to reinvent everything at once that missed the mark by trying to be too much, too soon. Early adoption thrives when technology meets people where they are – familiar products that do more than expected – rather than forcing a radical departure from established habits. That’s why early use cases – especially those that enhance safety and perception – will be key to smart glasses’ success.

Carsten and I have decades of experience helping global brands define what to build next. We have explored one possible future for smart glasses: once they are as widely adopted as smartphones are today, they could work seamlessly and more effectively with other systems in our homes, businesses and cars to enhance our perception and capabilities.

What if your smart glasses could work with your car’s systems?

Imagine glasses that draw on the sensors and onboard compute currently used for driver-assist features like lane assist and auto park, letting you see the road ahead more clearly and anticipate the actions of other road users. The key differentiator is the tight integration with the vehicle’s perception and compute systems. Rather than simply being a wearable display, these smart glasses become an extension of the car’s own sensors and AI. By combining data from 360-degree cameras, radar, LiDAR, and internal vehicle controls – like steering angle, brake pressure, and gear selection – we can create a richer, more context-aware augmented reality overlay. This symbiotic relationship unlocks new possibilities for both the smart glasses and the vehicle itself, creating a truly integrated driving experience.
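To make this more concrete, here is a minimal, purely illustrative sketch in Python. The types (DetectedObject, VehicleState) and the thresholds are assumptions made for the example, not any carmaker’s or glasses vendor’s actual API; it simply shows how fused perception data and vehicle state could be reduced to a handful of prioritised overlay cues for the glasses.

```python
# Illustrative sketch only: hypothetical data structures showing how fused
# vehicle sensor data might be turned into simple overlay cues for glasses.
from dataclasses import dataclass

@dataclass
class DetectedObject:
    kind: str           # e.g. "cyclist", "car", "pedestrian"
    bearing_deg: float  # angle relative to the car's heading, from 360° cameras/radar
    distance_m: float   # fused range estimate from radar/LiDAR

@dataclass
class VehicleState:
    speed_mps: float
    steering_angle_deg: float
    braking: bool

def overlay_cues(objects: list[DetectedObject], state: VehicleState) -> list[str]:
    """Turn fused perception data into prioritised overlay messages."""
    cues = []
    for obj in objects:
        # Flag anything close that sits outside a rough 90° forward field of view.
        in_blind_spot = abs(obj.bearing_deg) > 45 and obj.distance_m < 10
        # Flag anything directly ahead that we are closing on while not braking.
        closing_fast = (abs(obj.bearing_deg) < 15 and
                        obj.distance_m / max(state.speed_mps, 0.1) < 2)
        if in_blind_spot:
            cues.append(f"Highlight {obj.kind} at {obj.bearing_deg:.0f}° ({obj.distance_m:.0f} m)")
        elif closing_fast and not state.braking:
            cues.append(f"Warn: closing on {obj.kind} ahead ({obj.distance_m:.0f} m)")
    return cues

if __name__ == "__main__":
    scene = [DetectedObject("cyclist", bearing_deg=70, distance_m=6),
             DetectedObject("car", bearing_deg=5, distance_m=25)]
    print(overlay_cues(scene, VehicleState(speed_mps=14, steering_angle_deg=0, braking=False)))
```

In practice the heavy lifting – detection, tracking, sensor fusion – would stay on the vehicle’s own compute, with the glasses receiving only the lightweight cues to render.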

Let’s paint a picture of how this translates to real-world scenarios. Think about driving on a foggy rural road, visibility reduced to near zero. The smart glasses dynamically adapt, highlighting lane markings and outlining the edges of the road, preventing accidents that might otherwise occur. Or consider an urban environment – a cyclist suddenly appears in your blind spot, unnoticed by the human eye. The glasses instantly detect and alert you, providing crucial reaction time.

Anticipating other road users is equally important. AI prediction algorithms analyse the movements of surrounding vehicles, outlining their projected paths and giving drivers more time to respond. Imagine seeing a subtle “shadow” projection of another car’s intended lane change, allowing you to adjust your speed and position proactively. Or seeing a pedestrian step into the road from behind parked cars, giving you that extra fraction of a second to react.
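As a rough illustration of that “shadow” idea – again a hypothetical sketch under assumed inputs, not a production prediction algorithm – even a naive constant-velocity projection of a neighbouring car’s lateral drift is enough to flag a likely lane change a second or two before it happens.

```python
# Illustrative only: a naive constant-velocity projection of a neighbouring
# vehicle's lateral position, used to flag a likely lane change early.

LANE_WIDTH_M = 3.5  # typical motorway lane width (assumed)

def predicts_lane_change(lateral_offset_m: float,
                         lateral_speed_mps: float,
                         horizon_s: float = 2.0) -> bool:
    """Return True if the projected lateral offset crosses into our lane
    within the time horizon (offset measured from the other car's lane
    centre towards ours)."""
    projected = lateral_offset_m + lateral_speed_mps * horizon_s
    return projected > LANE_WIDTH_M / 2

if __name__ == "__main__":
    # A car one lane over, drifting towards us at 1.2 m/s: flag it.
    print(predicts_lane_change(lateral_offset_m=0.4, lateral_speed_mps=1.2))  # True
    # The same car holding its lane: no warning.
    print(predicts_lane_change(lateral_offset_m=0.4, lateral_speed_mps=0.0))  # False
```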

We’ve all experienced the stress of driving in challenging conditions, the moments when you accidentally pull out without spotting someone, or the surprise of another driver cutting in front of you. These future iterations of smart glasses would alleviate that stress, providing a constant layer of awareness and reassurance – the feeling that you have an extra set of eyes on the road.

While the applications we already use on our smartphones have utility in a wearable form and are important for early adoption, the true opportunity lies beyond mirroring existing functionality. It’s about identifying practical use cases that fundamentally enhance our ability to perceive, understand, and interact with the world by leveraging the developing capabilities of physical AI.

We believe we are witnessing the dawn of enhanced human perception. When smart glasses become as ubiquitous as smartphones, we’ll all gain an extra pair of eyes that reveal what was previously hidden. Physical AI will transform smart glasses from clever gadgets into genuine extensions of human capability. This isn’t just about better technology – it’s about fundamentally expanding what it means to see and understand the world around us. The future isn’t just smarter glasses; it’s smarter humans.

Matthew Cockerill is a design innovation consultant specialising in uncovering human and business value. He helps global companies develop opportunities around emerging technologies and human needs, through new use cases for their everyday products and experiences. His work accelerates R&D, shapes product development, and informs go-to-market strategies.

Currently, he is working with leading physical AI companies on making frontier AI capabilities valuable in physical environments. Previously, he has shaped pioneering products and experiences across diverse industries for Samsung, TikTok, Sky, Panasonic, OVO Energy, Ford, Logitech, and Fairphone.

Carsten Eriksen is founder and CEO of design house Swift Creatives in Denmark.
