AT MWC 2024 we caught up with Nokia to discuss the smaller language models it is working on designed for more specific telco AI needs, and how AI in general is being integrated into networks.

Andrew Wooden

February 28, 2024

4 Min Read

It’s no surprise that AI in general and generative AI in particular were key talking points for Nokia at MWC 2024 – it has clearly emerged as the primary theme of the show for most of the firms we’ve spoken to this year.

Generative AI is where the glamour is right now, and it’s what a lot of firms are looking to hitch their wagons to. But as Matthieu Bourguignon, Senior Vice President, Network Infrastructures at Nokia, told us, the business of automating elements of networks with AI has been going on in the background as well, which he considers ‘essential’.

“We are looking at a wide range of AI and ML technologies and solutions. Because we consider that their integration, whether internally for Nokia to develop our own solutions, but also for customers – especially for automation and optimization of their operations – is essential for industry transformation. So of course as you say, AI is everywhere. Last year we were talking about the metaverse, but we didn't know too much what [the] metaverse was in fact. AI is really a reality now, so it's something that we've integrated in our solution for years – we were not waiting for Mobile World Congress 2024. It's integrated in our solutions for automation, [and] for service provisioning on our systems.”

ChatGPT started the boom in interest in generative AI, but it’s a generalist platform based on scraping data in the public domain. Thierry E. Klein, President of Nokia Bell Labs Solutions Research, told us how Nokia is playing around with more bespoke solutions, either for its own purposes or as a customer product.

“GPT is great, but it is trained on public data. It doesn't know anything about Nokia products. So if you ask it specific questions about Nokia products, it doesn't do very well. It hallucinates and it doesn't have the correct answers, because it doesn't have access to Nokia proprietary documents. So we're working on a much smaller language model, [which is] doing much better from an accuracy perspective because it's trained on Nokia documents, and I can do the same thing for customers.

“I can train a [system] that understands the specificities of the Nokia solution, the customer environment, and I don't need a huge language model that can write poems and draw pictures and give me cooking recipes. That's not relevant to me. Now I have this huge model. It takes a lot of time and resources to train and run inferences on it when a lot of it is not needed. But it's missing the critical questions I'm trying to ask.

“So I think that's where we've seen a lot of progress in the last year, and I'm quite encouraged by the number of conversations – we hear a lot of people talking about private models trained on private data, and I think we'll see a lot more, because that's where you're really addressing your problems in the way that gives you the accuracy you need, but without a lot of overhead that you don't really care about.”

When asked what role AI will play in building and running networks by the time we get round to 6G, and whether the rollout will be fundamentally different to what we saw with 5G, Klein added:

“I think it will be fundamentally different in how we build, deploy, and operate the networks. It doesn't necessarily mean that the end customer will see AI in action, but you should see the benefit of AI optimizing networks in a way that we could not do it before because it's too time consuming, we can’t get to these optimization problems, the search space is too large... we should only use AI because it creates tangible benefits. And that's either on the operation of the network, which gives you operational benefits for the service providers, or because you're unlocking performance that then benefits the customers – we shouldn't do it just for the sake of it. But you might see that you have fewer dropped calls as a customer because we managed to optimize the beamforming.

“Or the operator might say I'm still able to meet all my SLAs, my customers are happy, but now my energy consumption is reduced because I've optimized my configuration. So you will see the benefit, but I don't think you necessarily see it as a consumer, like you're interfacing with ChatGPT. But certainly from the operation side, you would expect the operations team to be having a ChatGPT-like interface to query the network, or things are automatically detected and you get prompted – do you agree we should make this change? So that’s where the human will still be in the loop.”

