Machine learning is a necessity in 5G

A fully functional 5G system is not going to happen without machines that can learn and make decisions by themselves. Machine learning allows 5G wireless networks to be predictive and proactive, which is fundamental in making the 5G vision possible.

“5G and machine learning go hand in hand,” remarks Mehdi Bennis, adjunct professor and Academy of Finland Research Fellow at the CWC (Centre for Wireless Communications, University of Oulu). “Without machine learning, networks simply cannot predict. Without the ability to predict, applications like remote surgery and wireless virtual reality cannot happen.”

Existing 4G networks are reactive. “Reacting takes you nowhere,” Bennis states. Instead, 5G networks have to be predictive, proactive and anticipatory to enable low-latency, high-reliability applications.

Bennis’ international collaborations and his team at the CWC have put forward the concept of proactive edge caching to serve predictable user demands in wireless networks. This is done by using an array of machine learning algorithms.

Bennis’ team is currently developing solutions that allow base stations (fixed points of communication) to predict what content nearby users may request in the near future. Base stations have limited storage capacity, so they have to learn to predict user needs by applying a variety of machine learning tools. With these tools, every base station will be able to store a judicious set of files.
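The idea can be sketched in a few lines. This is a minimal illustration, not Bennis’ actual method: it assumes a simple popularity-based rule in which a base station pre-stores the files that nearby users have requested most often. The function name and the toy request history are hypothetical.

```python
from collections import Counter

def proactive_cache(request_history, capacity):
    """Pick the files a base station should pre-store, given past requests.

    request_history: list of file IDs requested by nearby users.
    capacity: number of files the base station can hold.
    Returns the `capacity` most frequently requested file IDs.
    """
    counts = Counter(request_history)
    return [file_id for file_id, _ in counts.most_common(capacity)]

# A station that can hold 2 files caches the two most popular ones.
history = ["news", "video_a", "news", "video_b", "news", "video_a"]
print(proactive_cache(history, capacity=2))  # ['news', 'video_a']
```

Real systems would replace the raw frequency count with a learned predictor of future demand, but the trade-off is the same: limited storage forces the station to bet on what users will ask for next.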

“If the local base station has the desired content, bingo. If it does not, the user will have to download it from a cloud server very far away. That takes forever,” Bennis explains. In the 5G world, even a few milliseconds of latency can make a difference.

Machines have to learn to predict and make decisions by themselves.

According to Bennis, machine learning has to evolve as a discipline. “So far it has been focusing on running algorithms in a centralized manner, without caring about latency issues.” He suggests that instead of a centralized cloud server, there should be distributed mini-servers with storage and computing capabilities, referring to the base stations that serve users proactively.

The base stations have to run machine learning techniques in a very reliable way. “Can you guarantee that 99.9999% of the time the requested files will be in the base stations near users?” Bennis asks. “And if the user is not requesting a file cached for him, you waste resources. It’s as if I prepare a dish for you, and you come home and you say you want something else.”

One of the main challenges with 5G and machine learning is where to actually run the demanding algorithms and computations. A reliable 5G system requires extremely low latency, which is why everything cannot be stored in remote cloud servers far away. Latency increases with distance and congestion of network links. “This is why our networks must be predictive. Machine learning becomes crucial in optimizing them,” Bennis explains.
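The claim that latency grows with distance can be made concrete with a back-of-envelope propagation-delay calculation. The numbers below are illustrative assumptions (signals in fibre travel at roughly two-thirds the speed of light, and queuing and processing delays are ignored):

```python
SPEED_IN_FIBRE_M_S = 2e8  # roughly two-thirds of the speed of light

def round_trip_ms(distance_m):
    """Round-trip propagation delay in milliseconds over a given distance."""
    return 2 * distance_m / SPEED_IN_FIBRE_M_S * 1000

# A base station 1 km away vs. a cloud server 3000 km away:
edge = round_trip_ms(1_000)       # about 0.01 ms
cloud = round_trip_ms(3_000_000)  # about 30 ms
```

Even before congestion is taken into account, the distant server costs tens of milliseconds on physics alone, which is why content must already sit at the edge when a latency-critical request arrives.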

Latency and reliability are extremely important for 5G scientists and engineers alike. With 5G there is no room for unbounded delay. For example, remote surgery and autonomous vehicles make no sense with high latencies. “Take self-driving cars for example: as the car is driving, it must recognize pedestrians and other objects in real-time, not tomorrow,” Bennis elucidates.

Drawing inspiration from the brain

Bennis and other scientists at the CWC research the fundamentals of 5G, where machine learning plays a big role. “In my team, we have obtained significant improvements in the context of edge caching just by applying off-the-shelf machine learning algorithms, like k-means clustering and non-parametric Bayesian learning.”
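To show how an off-the-shelf algorithm like k-means fits edge caching, here is a minimal sketch (not the team’s actual code): users are described by hypothetical content-preference vectors, and clustering them tells each base station which content type its local audience favours. For simplicity the first k points seed the centroids, so the toy data lists one user from each taste group first.

```python
def kmeans(points, k, iters=10):
    """Minimal k-means on lists of floats; the first k points seed the centroids."""
    centroids = [list(p) for p in points[:k]]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        # Assign each point to its nearest centroid (squared Euclidean distance).
        for p in points:
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[dists.index(min(dists))].append(p)
        # Move each centroid to the mean of its assigned points.
        for i, members in enumerate(clusters):
            if members:
                centroids[i] = [sum(vals) / len(members) for vals in zip(*members)]
    return centroids, clusters

# Each vector is a user's request share across [news, sports, movies].
users = [[0.9, 0.1, 0.0],   # news-heavy user
         [0.1, 0.1, 0.8],   # movie-heavy user
         [0.8, 0.2, 0.0],
         [0.0, 0.2, 0.8]]
centroids, clusters = kmeans(users, k=2)
# Each cluster's centroid tells a base station which content type to cache.
```

With this grouping, a station serving mostly news-heavy users would prioritise news content in its cache, while one serving the movie cluster would cache video.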

In addition to applying well-known machine learning algorithms, Bennis wants to focus on something deeper. He is just starting to investigate deep learning from a fundamental perspective. Basically, he wishes to emulate how the human brain does its computations. “The amount of information the brain processes – it’s just amazing. Understanding it to even some extent will undoubtedly spearhead unforeseen applications.”

Because of the massive amounts of data in future 5G networks, Bennis says it is much more efficient to fragment a big server into multiple smaller ones to run computations in parallel. This is analogous to the brain with its different layers and endless connections.

In the end it’s all about going back to the fundamentals. “The end goal for us in this lab is not machine learning. The goal is to leverage techniques, such as algorithms and fundamentals of machine learning, to improve 5G networks,” Bennis concludes.

Who’s Mehdi Bennis?

Academy of Finland Research Fellow (2015–2020), adjunct professor Mehdi Bennis has received several prestigious IEEE awards for his work, including the Best Tutorial Paper Award (2016) from the IEEE Communications Society and the EURASIP Best Paper Award (2017) for the Journal on Wireless Communications and Networking (JWCN). He has also received the IEEE ComSoc Fred W. Ellersick Prize for his work on the integration of LTE and WiFi.

Bennis is currently working on:
- Ultra dense networks (UDNs)
- Edge and fog computing
- Ultra reliable low latency communication (URLLC)
- Drones
- Vehicle-to-vehicle communication

Mehdi Bennis speaks highly of the CWC. “CWC is by far the best research institute in Finland. This is in fact simply due to the very talented researchers we have here, and by the mere fact that this is the only research institute in Finland with one Academy Professor and three Academy Research Fellows.”
 

Read more about 5G research at the CWC:

Welcome to 5G Test Network

 

Text: Antti Miettinen

 

Last updated: 21.2.2018