Qualcomm unveils dedicated ‘extended reality’ platform
The company pointed to original equipment manufacturers Meta, Vive, Vuzix and Pico as its first partners developing on the XR1 platform. XR1 integrates Qualcomm Technologies’ heterogeneous compute architecture, including an Arm-based multicore Central Processing Unit (CPU), vector processor, Graphics Processing Unit (GPU) and the Qualcomm AI Engine.
Other key features include an advanced XR software service layer, machine learning, the Snapdragon XR Software Development Kit (SDK) and Qualcomm Technologies’ connectivity and security technologies.
The XR1 platform also provides an AI engine for on-device processing. This engine lets customers run high-performing, power-efficient, machine-learning-based computer vision algorithms for key AR use cases such as improved pose prediction and object classification.
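As an illustration only (the Snapdragon XR SDK exposes its own APIs, which are not shown here), the following minimal NumPy sketch captures the shape of such an on-device classification step: a camera frame is reduced to a feature vector and scored against hypothetical pre-trained class weights.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over class scores.
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def classify_frame(frame, weights, bias, labels):
    """Score one camera frame against a hypothetical pre-trained linear head.

    frame   : (H, W) grayscale image as a NumPy array
    weights : (num_classes, num_features) matrix
    bias    : (num_classes,) vector
    labels  : list of class names
    """
    # Crude feature-extraction stand-in: downsample and flatten the frame.
    features = frame[::8, ::8].astype(np.float32).ravel()
    features /= (np.linalg.norm(features) + 1e-8)   # normalize
    probs = softmax(weights @ features + bias)
    return labels[int(probs.argmax())], float(probs.max())

# Toy usage with random data standing in for a real model and camera frame.
rng = np.random.default_rng(0)
labels = ["hand", "controller", "table", "wall"]
frame = rng.random((480, 640))
num_features = frame[::8, ::8].size
weights = rng.normal(size=(len(labels), num_features)).astype(np.float32)
bias = np.zeros(len(labels), dtype=np.float32)
print(classify_frame(frame, weights, bias, labels))
```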
The XR1 platform will let consumers immerse themselves in their favorite movies, programs and sports by supporting Ultra HD 4K video at up to 60 frames per second for high-quality VR head-mounted displays (HMDs). New dedicated hardware and software algorithms within the Qualcomm Spectra Image Signal Processor (ISP) can help significantly reduce unwanted noise in snapshots, producing a substantially improved final picture in high-quality AR headsets.
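One common way image pipelines cut snapshot noise is multi-frame (temporal) averaging; the sketch below is a generic illustration of that principle with synthetic data, not a description of the Spectra ISP’s actual algorithms.

```python
import numpy as np

def temporal_denoise(frames):
    """Average a burst of aligned frames; noise falls roughly as 1/sqrt(N)."""
    stack = np.stack(frames).astype(np.float32)
    return stack.mean(axis=0)

# Toy burst: the same scene plus independent Gaussian sensor noise per frame.
rng = np.random.default_rng(2)
scene = rng.random((480, 640)) * 255
burst = [scene + rng.normal(scale=10.0, size=scene.shape) for _ in range(8)]
denoised = temporal_denoise(burst)
print(np.abs(burst[0] - scene).mean(), np.abs(denoised - scene).mean())
```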
The integrated display processor provides a range of display options with hardware-accelerated composition, dual-display support, 3D overlays and support for leading graphics and compute Application Programming Interfaces (APIs), including OpenGL, OpenCL and Vulkan. The platform also features advanced vision-processing capabilities fundamental to technologies like Visual Inertial Odometry (VIO), which lets users move around in the virtual world or interact with augmented objects in an AR experience.
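To make the role of VIO concrete, here is a deliberately simplified sketch (not Qualcomm’s implementation) of the core idea: dead-reckon motion from IMU acceleration between camera frames, then blend in an absolute position estimate from the vision pipeline whenever one arrives.

```python
import numpy as np

class SimpleVIO:
    """Toy visual-inertial fusion: IMU propagation plus periodic visual correction.

    This is a conceptual sketch; production VIO uses full 6DoF state estimation
    (e.g. an extended Kalman filter or sliding-window optimization).
    """

    def __init__(self, blend=0.2):
        self.position = np.zeros(3)
        self.velocity = np.zeros(3)
        self.blend = blend  # how strongly a visual fix corrects the IMU estimate

    def propagate_imu(self, accel, dt):
        # Integrate linear acceleration (gravity already removed) to update pose.
        self.velocity += accel * dt
        self.position += self.velocity * dt

    def correct_visual(self, visual_position):
        # Pull the drifting IMU estimate toward the camera-based position fix.
        self.position += self.blend * (visual_position - self.position)

# Usage: 1 kHz IMU samples with a visual fix roughly every 33 ms.
vio = SimpleVIO()
dt = 0.001
for step in range(1000):
    vio.propagate_imu(accel=np.array([0.0, 0.1, 0.0]), dt=dt)
    if step % 33 == 0:
        vio.correct_visual(visual_position=np.array([0.0, 0.05, 0.0]) * step * dt)
print(vio.position)
```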
The chip uses Qualcomm Aqstic Audio Technologies and Qualcomm aptX Audio for high-fidelity audio, “always-on, always-listening” voice assistance and Bluetooth playback. Support for head-related transfer functions (HRTFs) lets the XR1 synthesize binaural sound that the listener perceives as coming from a specific point in space.
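The HRTF idea can be sketched in a few lines: a mono source is convolved with a left-ear and a right-ear head-related impulse response measured for the desired direction, yielding a two-channel signal the brain localizes in space. The impulse responses below are random placeholders, not real HRTF data.

```python
import numpy as np

def render_binaural(mono, hrir_left, hrir_right):
    """Convolve a mono signal with per-ear impulse responses (HRIRs)."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=0)  # shape: (2, num_samples)

# Placeholder data: in practice the HRIRs come from a measured HRTF set
# selected (or interpolated) for the source's azimuth and elevation.
rng = np.random.default_rng(1)
mono = np.sin(2 * np.pi * 440 * np.arange(48000) / 48000)  # 1 s of a 440 Hz tone
hrir_left = rng.normal(size=128) * np.hanning(128)
hrir_right = np.roll(hrir_left, 8)  # crude interaural-delay stand-in
stereo = render_binaural(mono, hrir_left, hrir_right)
print(stereo.shape)
```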
The XR1 also packs three- and six-degree-of-freedom (3DoF, 6DoF) head-tracking and controller capabilities for XR devices, with an integrated sensor hub and optimized sensor fusion that keep motion-to-photon latencies well below the roughly 20 ms commonly cited as the threshold for comfortable XR.
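As a rough illustration of the sensor-fusion step (a simplification of what an integrated sensor hub does), the complementary filter below fuses fast but drifting gyroscope rates with noisy but gravity-referenced accelerometer tilt to track 3DoF head orientation; the 20 ms budget refers to the full pipeline from this kind of tracking update through rendering to the display.

```python
import math

def complementary_filter(pitch, roll, gyro, accel, dt, alpha=0.98):
    """Fuse gyro rates (rad/s) with accelerometer tilt to estimate pitch and roll.

    Gyro integration is smooth but drifts; the accelerometer gives an absolute
    (gravity-referenced) tilt but is noisy, so each update blends the two.
    """
    ax, ay, az = accel
    gx, gy, _ = gyro

    # Absolute tilt from the measured gravity direction.
    pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll_acc = math.atan2(ay, az)

    # Blend integrated gyro (weight alpha) with accelerometer tilt.
    pitch = alpha * (pitch + gy * dt) + (1 - alpha) * pitch_acc
    roll = alpha * (roll + gx * dt) + (1 - alpha) * roll_acc
    return pitch, roll

# Usage: stationary headset, gravity along +z, small gyro bias, 1 kHz updates.
pitch = roll = 0.0
for _ in range(500):
    pitch, roll = complementary_filter(
        pitch, roll,
        gyro=(0.001, -0.002, 0.0),
        accel=(0.0, 0.0, 9.81),
        dt=0.001,
    )
print(round(pitch, 4), round(roll, 4))
```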
Qualcomm – www.qualcomm.com