A UK startup has developed a computational model of human vision that challenges the current state of the art in 3D graphics technology.
Cardiff-based Fovotec has developed the model using a non-linear approach that incorporates depth information and produces a more natural image, especially at wide fields of view.
This contrasts with the linear-perspective algorithms used by all current 3D graphics rendering engines and graphics processing units (GPUs), which support only a limited field of view before the image distorts.
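Fovotec's model itself is proprietary, but the underlying limitation of linear perspective can be illustrated numerically. In the sketch below, the function names and the equidistant (fisheye-style) mapping are illustrative assumptions, not Fovotec's algorithm: a pinhole projection places an off-axis ray at tan(θ), which diverges as the field of view approaches 180 degrees, while a non-linear mapping grows only linearly with the angle.

```python
import math

def linear_perspective_x(theta_deg):
    """Horizontal image coordinate under a pinhole (linear-perspective)
    projection for a ray theta degrees off the optical axis.
    Grows as tan(theta), diverging as the field of view nears 180 deg."""
    return math.tan(math.radians(theta_deg))

def equidistant_x(theta_deg):
    """The same ray under an equidistant (fisheye-style) non-linear
    mapping, where image distance is proportional to the angle itself."""
    return math.radians(theta_deg)

# Marginal stretch of linear perspective relative to the image centre,
# i.e. d(tan theta)/d(theta) = 1/cos^2(theta): this is the distortion
# that becomes visible at wide fields of view.
for theta in (10, 45, 75, 85):
    stretch = math.cos(math.radians(theta)) ** -2
    print(f"{theta:>2} deg off-axis: linear stretch x{stretch:.1f}")
```

At 45 degrees off-axis the stretch is already double that at the centre, and it grows without bound toward 90 degrees, which is why conventional renderers keep the field of view well below that.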
The computational model has been implemented as a path-tracing renderer that plugs into the Unreal graphics engine and can run on mobile phones to produce more realistic images in real time. It has been used for designing vehicle interiors and for architectural visualisations.
However, it also applies to large digital signage systems and to virtual reality (VR) systems with a 150- or 180-degree field of view. A non-linear approach maps more closely to how the eye works, so it can align what the VR display shows more accurately with where a user's hands are in space.
The non-linear computational model also allows the depth information in an image to be altered in real time through a series of sliders.
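The article does not describe how the sliders map onto the model, but the general idea of remapping depth in real time can be sketched as follows. The function name, the single `compression` parameter, and the gamma-style curve are all illustrative assumptions standing in for whatever mapping Fovotec's renderer actually uses:

```python
def remap_depth(depth, compression=1.0):
    """Remap a normalised scene depth value (0 = near plane, 1 = far
    plane) with a single slider-style parameter: compression > 1
    flattens distant depth, compression < 1 exaggerates it.
    A gamma curve is used here purely as a stand-in."""
    d = min(max(depth, 0.0), 1.0)  # clamp to the valid depth range
    return d ** compression

# A slider set to 2.0 pushes mid-scene depth toward the viewer:
print(remap_depth(0.5, compression=2.0))  # 0.25
```

Because the remapping is a cheap per-sample function, it could plausibly be evaluated every frame, which is what makes real-time adjustment feasible.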
The technology could be implemented through an open application programming interface (API) using hardware IP blocks in a chip, says Rob Pepperell, CEO and research director of Fovotec. He is also a Professor at Cardiff School of Art and leader of the multidisciplinary Fovolab at Cardiff Metropolitan University.
The inspiration for the work came from looking at the change from medieval art to the linear perspective used in more modern pictures. The researchers, led by Alistair Burleigh, Senior Research Fellow at Cardiff Metropolitan University, designed and built equipment to capture how people see the depth of images, and elements of the model have been published in peer-reviewed psychology journals.
The company has developed a prototype implementation for Vulkan, the cross-platform 3D graphics API, says Burleigh.
The company is also looking to work with IP providers on optimisations for ray tracing and rendering engines, and is seeking to raise £5m to develop that technology, with a focus on corporate partners.