By tracking the lens's focal length with microsecond time resolution, the lab prototype described in the paper could generate 1600 focal planes per second, fast enough to assign 40 sequential focal planes to each frame at 40 frames per second. The authors are confident that the optical module needed to track the focal length could be miniaturized for integration into wearable VR systems.
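As a quick sanity check of that plane budget, here is a trivial sketch using only the numbers quoted above:

```python
# Back-of-the-envelope check of the focal plane budget.
planes_per_second = 1600   # focal planes the prototype can generate
frames_per_second = 40     # target video frame rate

planes_per_frame = planes_per_second // frames_per_second
print(planes_per_frame)    # 40 sequential focal planes per frame
```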
Of course, at such rates, display luminosity would need to improve too, since each focal plane is illuminated for only a small fraction of the frame time. And for any given video, that is many more views to render, which will call for more processing power and possibly new ASICs to synchronize the rendering stack with the focal-length detection and the distribution of images across focal planes.
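To make the luminosity trade-off concrete, here is a rough sketch of the duty-cycle math; the 200-nit target is a hypothetical figure for illustration, not from the paper:

```python
# With N focal planes per frame, each plane is lit for only 1/N of the frame
# time, so peak luminance must rise roughly N-fold to keep the same
# time-averaged brightness. (Illustrative assumption, not a paper figure.)
def required_peak_luminance(target_nits: float, planes_per_frame: int) -> float:
    duty_cycle = 1.0 / planes_per_frame  # fraction of frame each plane is lit
    return target_nits / duty_cycle

print(required_peak_luminance(200.0, 40))  # ~8000 nits to average 200 nits
```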
Interestingly, the viewer can accommodate freely at arbitrary depths from 25cm to infinity (the authors established that covering the maximum possible depth range, 7cm to infinity, would effectively require 147 focal planes).
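Those plane counts can be roughly sanity-checked if one assumes the planes are spaced uniformly in diopters (the usual convention for multifocal displays); note that the 0.1 D spacing below is inferred from 40 planes over the 25cm-to-infinity range, not stated in the paper:

```python
# Hedged sanity check of the plane counts, assuming uniform diopter spacing
# (diopters = 1 / distance in meters); the spacing value is an inference.
def diopters(distance_m: float) -> float:
    return 1.0 / distance_m

range_25cm = diopters(0.25)        # 25 cm to infinity -> 4.0 D
spacing = range_25cm / 40          # 40 planes over 4 D -> 0.1 D per plane

range_7cm = diopters(0.07)         # 7 cm to infinity -> ~14.3 D
print(round(range_7cm / spacing))  # ~143 planes, near the paper's 147
```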
Carnegie Mellon University - www.cmu.edu