But SuriCog is here to demonstrate a unique eye-tracking technology that could naturally turn your surroundings into a custom user interface.
A small inward-facing camera detects the reflections of two IR LEDs illuminating the right eyeball, and video image processing then determines the wearer's actual axis of vision. Also integrated on the right side of the spectacles is a special telemetry system that gives the wearer's distance to the nearest physical object in his or her line of sight.
Intersect that data with a 3D digital model of the environment and you can tell not only where the wearer is looking, but also where he or she is located within that environment.
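To illustrate the geometry involved, here is a minimal sketch of casting a gaze ray against a 3D model made of planar walls: the nearest intersection gives the looked-at point, and comparing its distance with the telemeter reading is one way such a position estimate could be checked. The room layout and numbers are invented for the example, and this is illustrative geometry only, not SuriCog's actual algorithm.

```python
import numpy as np

def first_hit(origin, gaze_dir, planes):
    """Cast the gaze ray against a model made of infinite planes.

    Each plane is (point_on_plane, normal). Returns the nearest
    intersection point and its distance, or (None, inf) if nothing
    lies in front of the wearer.
    """
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    best_t, best_point = np.inf, None
    for p0, n in planes:
        denom = np.dot(n, gaze_dir)
        if abs(denom) < 1e-9:              # ray parallel to this plane
            continue
        t = np.dot(n, p0 - origin) / denom
        if 1e-9 < t < best_t:              # keep the nearest hit ahead of us
            best_t, best_point = t, origin + t * gaze_dir
    return best_point, best_t

# A toy room: one wall at x = 4 m, one at y = 4 m (point, inward normal).
room = [(np.array([4.0, 0.0, 0.0]), np.array([-1.0, 0.0, 0.0])),
        (np.array([0.0, 4.0, 0.0]), np.array([0.0, -1.0, 0.0]))]

wearer = np.array([1.0, 1.0, 1.6])   # assumed position, 1.6 m eye height
gaze = np.array([1.0, 0.0, 0.0])     # looking straight at the x = 4 wall

point, dist = first_hit(wearer, gaze, room)
print(point, dist)  # hits the wall 3 m away, at (4, 1, 1.6)
```

If the measured distance disagrees with `dist`, the assumed wearer position must be wrong, which is how a distance reading can constrain localisation within the model.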
"We can track the user's location to within millimetres," Arthur Carrier, business engineer at SuriCog, told us, though he was tight-lipped about the actual technology used to perform the distance measurement.
Looking at the design of WEETSY (that's what they call the wearable device), one can spot a dark, slanted optical window on the slightly bulging right spectacle arm. It could host some sort of IR laser, but again Carrier declined to comment. Of course, we are not talking about absolute geolocation, but about relative user positions within a given environment.
"For now, we are looking for early adopters of our technology to identify the potential killer apps and help us refine our offering," he said. "So we perform feasibility studies and provide a demonstrator together with a software development kit."
The company started generating revenue in 2014 from applications in ergonomic studies and assisted training in aeronautics, where the instructor needs to check whether the trainee is looking at the right place.
Vision health is another promising market, specifically orthoptics, where the WEETSY spectacles could help detect oculomotor disturbances. Dedicated monitoring applications could then help wearers re-educate their eyes through specific motion exercises.
What makes SuriCog’s eye tracking different from other products on the market is that it is not limited to one screen but encompasses the whole space. Multiple users in the same room can also access different information depending on where they look. In fact, the company is experimenting with the Louvre museum in Paris to provide audio guides to its visitors. The experiment will take place in front of a wall-sized painting, with audio comments triggered by the elements of the painting being looked at.
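Once the gaze-hit point on the painting's plane is known, the Louvre scenario reduces to looking up which annotated area that point falls in and playing the matching comment. A minimal sketch, with entirely made-up region names, coordinates, and clip filenames:

```python
from dataclasses import dataclass

@dataclass
class Region:
    """A rectangular area of the painting (metres from its bottom-left
    corner) mapped to an audio comment. All values are hypothetical."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float
    audio_clip: str

# Hypothetical annotation of a wall-sized painting.
regions = [
    Region("central figure", 2.0, 1.0, 3.5, 3.0, "figure.mp3"),
    Region("background city", 0.0, 2.5, 2.0, 4.0, "city.mp3"),
]

def comment_for_gaze(x, y, regions):
    """Return the audio clip for the region being looked at, or None
    if the gaze falls outside every annotated area."""
    for r in regions:
        if r.x0 <= x <= r.x1 and r.y0 <= y <= r.y1:
            return r.audio_clip
    return None

print(comment_for_gaze(2.8, 2.0, regions))  # figure.mp3
```

The same lookup, keyed per wearer, is what would let several visitors in front of the same painting hear different commentary depending on where each one is looking.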
SuriCog has just raised a round of funding from private investors to further its product development. Among other things, the startup would like to shrink the signal-processing hardware from its current chewing-gum-box size to a USB stick, or even an ASIC. The demonstrator we saw was tethered to a small processing box, but in the future it could operate wirelessly over Bluetooth, Carrier hopes.
At first, SuriCog will focus on B2B projects with companies that already have a digital model of their environment, or at least the resources to recreate their workspace in 3D. But as 3D scanning technologies become more affordable, or even mainstream, the company could chase the consumer market, either with its own standalone product or by licensing its unique eye-tracking technology to other AR or VR headset manufacturers.
Initiatives such as Google’s Project Tango or Parrot’s recent 3D-mapping drone demonstrator could enable consumers to model their environment easily, creating a surrounding user interface in which a mere look at something could trigger an event. The real world would become a remotely accessible digital interface, and a suitable API could let users define the rules of interaction.
Visit SuriCog at www.suricog.net