R&S, Nvidia team for AI-enabled 6G wireless receiver

Technology News
By Nick Flaherty

Rohde & Schwarz has worked with Nvidia on a wireless receiver that uses machine learning for complex signal processing in next-generation 6G networks.

The hardware-in-the-loop demonstration of a neural receiver at Mobile World Congress (MWC) in Barcelona next week will show the performance gains possible when using trained ML models compared to traditional signal processing.

The 5G NR uplink multi-user multiple input multiple output (MU-MIMO) being used is a blueprint for a possible 6G physical layer. The setup combines high-end test solutions for signal generation and analysis from Rohde & Schwarz with the Sionna open-source library for link-level simulations, which runs on Nvidia AI accelerator chips.

A neural receiver replaces signal processing blocks in the physical layer of a wireless communications system with trained machine learning models. A future 6G standard is likely to use AI/ML for signal processing tasks such as channel estimation, channel equalization, and demapping. Simulations suggest that a neural receiver will improve link quality and throughput compared to the high-performance deterministic software algorithms currently used in 5G NR.
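The block-replacement idea can be sketched numerically. The toy example below (a minimal numpy sketch, not the Sionna implementation; the network size, QPSK assumption, and random weights are all illustrative) shows a tiny untrained MLP mapping post-FFT received symbols directly to bit log-likelihood ratios, standing in for the equalization and demapping blocks a trained neural receiver would replace:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: 64 subcarriers, QPSK -> 2 bits per symbol (illustrative).
num_subcarriers, bits_per_symbol = 64, 2

# Post-FFT received symbols (complex), stacked as [real, imag] features.
y = rng.standard_normal(num_subcarriers) + 1j * rng.standard_normal(num_subcarriers)
features = np.stack([y.real, y.imag], axis=-1)   # shape (64, 2)

# A tiny untrained MLP standing in for the trained neural receiver:
# it maps per-subcarrier features directly to bit log-likelihood ratios,
# replacing equalization and demapping with one learned step.
W1 = rng.standard_normal((2, 16)); b1 = np.zeros(16)
W2 = rng.standard_normal((16, bits_per_symbol)); b2 = np.zeros(bits_per_symbol)

hidden = np.maximum(features @ W1 + b1, 0.0)     # ReLU hidden layer
llrs = hidden @ W2 + b2                          # one LLR per transmitted bit

print(llrs.shape)   # (64, 2): 2 LLRs per subcarrier
```

In a real system the weights would be trained end-to-end on data sets like those generated by the test setup described below.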

To train machine learning models, data sets are the key starting point. Often, access to the required data sets is limited or simply not available. For early 6G research, test and measurement equipment therefore provides a viable alternative, generating data sets with different signal configurations to train machine learning models for signal processing tasks.

The R&S SMW200A vector signal generator emulates two individual users transmitting an 80 MHz wide signal in the uplink direction with a MIMO 2×2 signal configuration. Each user is independently faded, and noise is applied to simulate realistic radio channel conditions.
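In simplified numerical form, the emulated scenario amounts to two independently faded single-antenna users received over a 2×2 MIMO channel with additive noise. The sketch below (a toy numpy model with illustrative parameters; the real signal generator applies far richer fading profiles) shows the y = Hx + n structure of such a channel:

```python
import numpy as np

rng = np.random.default_rng(1)
num_samples = 1000   # illustrative block of baseband symbols

# Two single-antenna users transmitting QPSK symbols simultaneously.
bits = rng.integers(0, 2, size=(2, num_samples, 2))
x = ((1 - 2 * bits[..., 0]) + 1j * (1 - 2 * bits[..., 1])) / np.sqrt(2)

# Independent flat Rayleigh fading per user/antenna path (2x2 MIMO),
# standing in for the faded channels the signal generator emulates.
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)

# Received signal at the two receive antennas, plus additive noise.
noise_std = 0.1
n = noise_std / np.sqrt(2) * (rng.standard_normal((2, num_samples))
                              + 1j * rng.standard_normal((2, num_samples)))
y = H @ x + n

print(y.shape)   # (2, 1000): two receive antennas, 1000 samples each
```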

The R&S MSR4 multi-purpose satellite receiver acts as the receiver, capturing the signal transmitted at a carrier frequency of 3 GHz by using its four phase-coherent receive channels.

The data is then provided via the real-time streaming interface to a server where the signal is pre-processed using the R&S Server-Based Testing (SBT) framework, including R&S VSE vector signal explorer microservices. The VSE signal analysis software synchronizes the signal and performs fast Fourier transforms (FFT). This post-FFT data set serves as input for a neural receiver implemented using the Sionna software.
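The FFT preprocessing step can be illustrated with a minimal numpy sketch (the numerology below is a toy OFDM example, not the actual 5G NR configuration, and synchronization is assumed already done): the cyclic prefix is stripped from each OFDM symbol and a per-symbol FFT produces the post-FFT resource grid that feeds the neural receiver.

```python
import numpy as np

rng = np.random.default_rng(2)

fft_size, cp_len, num_symbols = 64, 16, 14   # illustrative OFDM numerology

# Captured time-domain baseband samples, assumed already synchronized:
# a stream of OFDM symbols, each preceded by a cyclic prefix.
total = num_symbols * (fft_size + cp_len)
samples = rng.standard_normal(total) + 1j * rng.standard_normal(total)

# Strip the cyclic prefix and apply an FFT per OFDM symbol, producing
# the post-FFT resource grid used as neural-receiver input.
symbols = samples.reshape(num_symbols, fft_size + cp_len)[:, cp_len:]
resource_grid = np.fft.fft(symbols, axis=-1)

print(resource_grid.shape)   # (14, 64): OFDM symbols x subcarriers
```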

The Sionna open-source library for link-level simulation enables rapid prototyping of complex communications system architectures and provides native support for integrating machine learning into 6G signal processing.

As part of the demonstration, the trained neural receiver is compared to the classical concept of a linear minimum mean squared error (LMMSE) receiver architecture, which applies traditional signal processing techniques based on deterministically developed software algorithms. These already high-performance algorithms are widely adopted in current 4G and 5G cellular networks.
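The LMMSE baseline has a closed form: for a received signal y = Hx + n with noise variance σ², the equalized estimate is x̂ = (HᴴH + σ²I)⁻¹Hᴴy. A minimal numpy sketch of this classical comparison point (toy 2×2 system with illustrative values):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 2x2 MIMO system: y = H x + n, noise variance sigma2.
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
x = np.array([1 + 1j, -1 - 1j]) / np.sqrt(2)   # one QPSK symbol per user
sigma2 = 0.01
n = np.sqrt(sigma2 / 2) * (rng.standard_normal(2) + 1j * rng.standard_normal(2))
y = H @ x + n

# LMMSE equalizer: x_hat = (H^H H + sigma2 * I)^-1 H^H y.
# This deterministic algorithm is the classical baseline the trained
# neural receiver is compared against in the demonstration.
G = np.linalg.inv(H.conj().T @ H + sigma2 * np.eye(2)) @ H.conj().T
x_hat = G @ y

print(np.round(x_hat, 2))
```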

“Signal processing in wireless communications using machine learning algorithms is a very hot topic in the industry right now, often controversially discussed among industry peers,” said Andreas Pauly, Executive Vice President of Rohde & Schwarz Test & Measurement Division.

“We are delighted to work with a partner like Nvidia on this test bed. It will enable researchers and industry experts to validate their models based on a data-driven approach and put them to the test in a hardware-in-the-loop experiment, using our leading test solutions for signal generation and analysis,” he said.

Ronnie Vasishta, Senior Vice President of Telecommunications at Nvidia, said, “Trained ML models open up considerable potential for increasing performance compared to conventional signal processing. This hardware-in-the-loop demonstration of a neural receiver from Rohde & Schwarz and Nvidia marks a milestone for the industry in demonstrating the utility of AI and machine learning in 6G technology.” 
