Nvidia 5G ML [shutterstock: 1376671043, BAIVECTOR]
Artificial Intelligence Press Release

Calling AI: Researchers Leverage Machine Learning For 5G

5G researchers from three top institutions have joined Nvidia in bringing AI to telecom.

The Heinrich Hertz Institute (HHI), the Technical University of Berlin (TU Berlin) and Virginia Tech are collaborating with Nvidia to harness the power of GPUs for next-generation (5G) cellular networks.

The journey began in October 2019 at MWC Los Angeles, where Nvidia and partners announced plans to enable virtual radio access networks (vRANs) for 5G with GPUs.

Nvidia also debuted Aerial, a software development kit for accelerating vRANs. And partners Ericsson, Microsoft and Red Hat are working with Nvidia to deliver 5G at the edge of the network powered by GPUs.


These vRANs will bring cellular network operators the kind of operational efficiencies that cloud service providers already enjoy. Carriers will program network functions in high-level software languages, easing the work of adding new capabilities and deploying capacity where and when it’s needed.

Forging wireless ties

The new research partnerships with HHI, TU Berlin and Virginia Tech will explore multiple ways to accelerate 5G with AI.

They’ll define novel techniques leveraging GPUs that help wireless networks use precious spectrum more efficiently. The work will span research in reinforcement learning and other techniques that build on the product plans announced in October.
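Reinforcement learning for spectrum use can be illustrated with a toy example (a generic sketch, not from the announcement): an epsilon-greedy agent learns, by trial and error, which of several frequency channels succeeds most often. The per-channel success rates below are made-up numbers purely for demonstration.

```python
import random

def run_bandit(success_prob, steps=5000, epsilon=0.1, seed=0):
    """Toy epsilon-greedy agent choosing among frequency channels.

    success_prob holds hypothetical per-channel transmission success
    rates (illustrative values, not real spectrum data).
    """
    rng = random.Random(seed)
    n = len(success_prob)
    counts = [0] * n        # times each channel was tried
    values = [0.0] * n      # running estimate of each channel's success rate
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n)                          # explore randomly
        else:
            arm = max(range(n), key=lambda i: values[i])    # exploit best estimate
        reward = 1.0 if rng.random() < success_prob[arm] else 0.0
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean
    return values, counts

# Three hypothetical channels; the agent should settle on the last one.
values, counts = run_bandit([0.2, 0.5, 0.8])
best = max(range(3), key=lambda i: values[i])
```

After a few thousand trials the agent concentrates its transmissions on the most reliable channel, having learned the rates purely from feedback rather than from a prior model.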

HHI, founded in 1928, is part of Germany’s Fraunhofer Society and has a history of pioneering technologies in mobile and optical networking as well as video compression. The collaboration with TU Berlin includes a new 5G test bed with participation from a number of wireless companies in Germany.

“I want to redesign many algorithms in radio access networks (RAN) so we can perform tasks in parallel, and the GPU is a good architecture for this because it exploits massive parallelism,” said Slawomir Stanczak, a professor at TU Berlin and head of HHI’s wireless networking department.
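The kind of parallelism Stanczak describes can be sketched in miniature (an illustrative example, not Nvidia's Aerial code): per-subcarrier channel equalization performs one independent arithmetic operation per data element, which is exactly the structure that maps onto a GPU's massive parallelism. Here NumPy on the CPU stands in for a GPU kernel.

```python
import numpy as np

rng = np.random.default_rng(0)
num_subcarriers = 4096

# QPSK-like symbols sent on each subcarrier (hypothetical test signal)
symbols = rng.choice(np.array([1+1j, 1-1j, -1+1j, -1-1j]), size=num_subcarriers)

# Random per-subcarrier channel coefficients plus a little receiver noise
h = rng.normal(size=num_subcarriers) + 1j * rng.normal(size=num_subcarriers)
received = h * symbols + 0.01 * (rng.normal(size=num_subcarriers)
                                 + 1j * rng.normal(size=num_subcarriers))

# Zero-forcing equalization: divide out the channel, element-wise,
# for every subcarrier at once -- each division is independent of the rest.
equalized = received / h

# Hard decision back to the nearest constellation point
decided = np.sign(equalized.real) + 1j * np.sign(equalized.imag)
error_rate = np.mean(decided != symbols)
```

Because no subcarrier depends on any other, all 4,096 divisions can run simultaneously; that independence is what makes such RAN tasks good candidates for GPU execution.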

Stanczak’s teams will explore use cases such as adapting AI to deliver improved 5G receivers. “If we are successful, they could offer a breakthrough in dramatically increasing performance and improving spectral efficiency, which is important because spectrum is very expensive,” he said.

Hitting 5G’s tight timing targets

Work at Virginia Tech is led by Tom Hou, a professor of computer engineering whose team specializes in solving some of the most complex and challenging problems in telecom.

“Using GPUs transformed our research group; now we are looking at AI techniques on top of our newly acquired parallel techniques,” he said.

Specifically, Virginia Tech researchers will explore how AI can automatically find and solve, in real time, thorny optimization problems in 5G networks. For instance, AI could uncover new ways to weave multiple services onto a single frequency band, making much better use of spectrum.

“We have found that for some very hard telecom problems, there’s no math formulation, but AI can learn the problem models automatically, enhancing our GPU-based parallel solutions,” said Hou.
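Learning a model where no formulation exists can be sketched as surrogate modeling (a generic illustration, not Virginia Tech's actual method): sample a black-box system and fit a cheap approximation from the observations alone. The link-quality function below is hypothetical, standing in for a system with no known closed form.

```python
import numpy as np

rng = np.random.default_rng(1)

def link_quality(load):
    # Stand-in black box: the learner never sees this formula,
    # only its input/output samples.
    return np.exp(-2.0 * load) * (1.0 + 0.5 * np.sin(6.0 * load))

# Observe noisy samples of the black box at random operating points
x = rng.uniform(0.0, 1.0, size=200)
y = link_quality(x) + 0.01 * rng.normal(size=200)

# Fit a degree-6 polynomial surrogate model by least squares
coeffs = np.polyfit(x, y, deg=6)
surrogate = np.poly1d(coeffs)

# The learned surrogate can now be evaluated (and optimized) cheaply
grid = np.linspace(0.0, 1.0, 101)
err = np.max(np.abs(surrogate(grid) - link_quality(grid)))
```

Once learned, the surrogate replaces the missing math formulation, giving a GPU-friendly model that an optimizer can query many times in parallel.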

Groundswell starts in AI for 5G

Other researchers are starting to explore the potential for AI in 5G.

Addressing one of 5G’s top challenges, researchers at Arizona State University showed a new method for directing millimeter wave beams, leveraging AI and the ray-tracing features in Nvidia Turing GPUs.

And Professor Terng-Yin Hsu described a campus network at Taiwan’s National Chiao Tung University that ran a software-defined cellular base station on Nvidia GPUs.

