A Turning Point for Monetising Telco Data


Of late TelcoForge has been steeped in lessons about the changes afoot in telecoms data, and especially how to make it more usable, more monetisable and more valuable for telecoms providers.

Gavyn Britton from data security provider Protegrity summarised the status quo as follows:

“I think the problems business leaders face at the moment is, one, they don’t know where it all is. They have enormous systems. Some of it’s on-premise, some of it’s in the cloud, some of it’s in old, antiquated machines that they just don’t want to touch. It’s kind of like asbestos. You know it’s in the building, you don’t want to touch it. You heard it’s dangerous, no one wants to go there.

“So number one is, where does it all exist? And then how do we protect it and how do we make sure we can use it? And that’s especially prevalent now in the age of AI, because AI thrives on data – all of it, lots of it. And the problem is when you start doing AI projects and you start pointing it to data sources, are you 100% sure the CEO’s compensation package isn’t lurking in some spreadsheet on some guy’s hard drive?”

Earlier this year TelcoForge spoke to a BSS vendor who observed that, in an average six-week project integrating a service into a telco, five weeks would be spent simply finding the data. That’s the kind of conversation that could have taken place twenty years ago, and to outside observers it might sound absurd that it still happens. However:

  • Telecoms providers are never the same thing from one year to the next. Acquisitions, divestments, new systems and capabilities change the data landscape constantly.
  • The sheer quantity of data that telcos produce is immense. They’re responsible for transporting every packet of the internet, and just the metadata around delivering those IP packets is a huge overhead. Add on measurements to do with users’ service quality, signal strength, handover management and so on. Bringing everything together from the different touchpoints in the network would be a massive undertaking, making a single ‘corporate data lake’ unfeasible.
  • At the same time, telcos can’t risk the services which are already working. New systems therefore tend to layer on top of old ones rather than replacing them, which is where the problems associated with ‘legacy’ come from.

Data has been called ‘the new oil’ for well over a decade now, and telcos have always been data-rich. They have just struggled with refining it; and, with critical information about users, their activity, their bank details and more, they’ve steered on the conservative side of how they use it.

Can this change, though? There have been a few actions recently that feel like stones rolling down a slope. There’s a good chance they’ll just come to rest in a new equilibrium… but there’s also the possibility that they trigger all the turmoil and change of a landslide.

Data Aggregation

Cloudera recently held their Evolve event in London, where they discussed the impact of several acquisitions that will be making their way through into product in the coming months. One of these in particular will have telecoms providers cheering; their incorporation of Octopai provides an automated data discovery function, mapping out what data exists where in a company or network.

Cloudera CEO Charles Sansbury told the story of a company they’d been brought in to assist, where consultants had been working on a data mapping exercise for six months. “24 hours later we gave them a map of all their data. They sacked the consultants,” he observed.

While this won’t be commercially available from Cloudera until early 2026, it’s an exciting step: it means data can be engaged with where it is, easing the pressure to build data lakes or undertake major upheavals to simplify data holdings.
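To make the idea of automated discovery a little more concrete, here’s a toy sketch of the general pattern (nothing to do with Octopai’s actual internals; the sources below are fabricated in-memory databases): crawl every source you can reach and build a catalogue of what lives where.

```python
# A toy sketch of automated data discovery in the spirit of Octopai-style
# tooling: crawl each data source you can reach and catalogue what lives where.
# Purely illustrative -- the sources below are fabricated in-memory databases,
# and real products also trace lineage across ETL jobs, reports and BI tools.
import sqlite3

def make_demo_source(schema: str) -> sqlite3.Connection:
    """Stand in for a real system we would connect to over the network."""
    conn = sqlite3.connect(":memory:")
    conn.executescript(schema)
    return conn

SOURCES = {
    "legacy_billing": make_demo_source(
        "CREATE TABLE invoices (cust_no TEXT, amount REAL, tariff_cd TEXT);"),
    "crm_cloud": make_demo_source(
        "CREATE TABLE accounts (account_id TEXT, name TEXT, dob TEXT);"),
}

def catalogue(conn: sqlite3.Connection) -> dict[str, list[str]]:
    """Return {table: [columns]} for one source."""
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    return {t: [c[1] for c in conn.execute(f"PRAGMA table_info('{t}')")]
            for t in tables}

# The resulting map answers Britton's first question: where does it all exist?
for source, conn in SOURCES.items():
    for table, columns in catalogue(conn).items():
        print(f"{source}.{table}: {', '.join(columns)}")
# legacy_billing.invoices: cust_no, amount, tariff_cd
# crm_cloud.accounts: account_id, name, dob
```

Even a map this crude answers the ‘where does it all exist?’ question; commercial tooling adds classification, lineage and sensitivity tagging on top.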

Once you have that data visibility and can engage with it where it lies, things can change fast. Recently Totogi held a demonstration at MVNO Nation which should give pause for thought. Their ‘BSS Magic’ platform accesses data by hooking into existing BSS systems, layers a telecoms ontology over the top and then puts an AI over that, with which people can build the BSS elements they need.

In their demonstration, one engineer was able to create and roll out over 600,000 lines of code, consisting of multiple modules, in a twelve-hour period, effectively custom-building a BSS using AI to do the hard work of coding. The ontology gives context and abstracts the complexity of the underlying systems. Again, the lesson seems to be less about replacing systems than about masking the underlying complexity. BSS Magic does this in a different way from Cloudera’s solution, but there’s a similar outcome.
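To see why an ontology layer helps, here’s a deliberately tiny sketch of the general pattern (the systems, fields and mappings are invented, and this is not how BSS Magic is built): a shared vocabulary maps canonical telecoms concepts onto the differently-named fields of each underlying system, so whatever sits on top, human or AI, never needs to know the raw schemas.

```python
# A toy sketch of an ontology layer over heterogeneous BSS systems.
# Illustrative only -- systems, field names and records here are invented,
# and this is not how BSS Magic is actually implemented.

# The ontology: one canonical concept -> the field it maps to in each system.
ONTOLOGY = {
    "subscriber_id": {"legacy_billing": "CUST_NO", "crm_cloud": "accountId"},
    "plan_name":     {"legacy_billing": "TARIFF_CD", "crm_cloud": "productName"},
}

# Stand-ins for records fetched from the underlying systems.
RECORDS = {
    "legacy_billing": {"CUST_NO": "100042", "TARIFF_CD": "PREPAY_5G"},
    "crm_cloud":      {"accountId": "100042", "productName": "Prepay 5G"},
}

def read_concept(system: str, concept: str) -> str:
    """Resolve a canonical concept to the system-specific field and fetch it."""
    field = ONTOLOGY[concept][system]
    return RECORDS[system][field]

# Callers work with ontology concepts, never raw schemas:
print(read_concept("legacy_billing", "subscriber_id"))  # "100042"
print(read_concept("crm_cloud", "plan_name"))           # "Prepay 5G"
```

The point of the pattern is that adding or retiring an underlying system means updating mappings, not rewriting everything built on top of them.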

Data Value

The other piece of the puzzle lies in what to do with the data once it’s been found. Data analytics company Ocient recently produced a research paper “Beyond Big Data: From Roadmap to Reality” which brings together insights from 500 of their customers across 15 industries, including telecoms providers. The results paint a compelling picture.

While it’s still early days for proving out AI’s return on investment, IT organisations are becoming increasingly confident in addressing it. However, while good data is foundational to getting good results from AI, data processing requirements have been growing fast; 29% of respondents said they have grown beyond their organisation’s ability to keep pace. Meanwhile, 55% highlighted data quality and preparation as one of their biggest challenges in making the most of their AI/ML investment.

That said, where people are on top of their data game it appears to be paying off in AI ROI: 43% of respondents noted that they had seen significant measurable returns on their investments. Where had they seen those returns? Operational efficiency, process automation and customer service sound like obvious areas; risk management and decision-making perhaps less so… but apparently so.

We’ve all heard nightmare statistics in the past few months saying that very few companies are making money from their AI investments, and diving into the data a little further this seems to be true so far. Returns are definitely being realised, but not directly in creating new revenues… yet.

Telco, Compliance and the Red Queen

Ocient’s report tells us that compared to survey averages, telco respondents are less likely to say they feel:

  • Prepared for the increased data processing demands from AI
  • Very confident in their data governance solutions for AI data pipelines

The data governance challenge, of course, is nothing new for telecoms providers. The trick will be to ensure data is used in ways which don’t open up the company (or its AI) to risk.

This is an issue which Protegrity has been working on.

“There is a feeling of ‘I don’t want to get left behind’, so organisations will probably start taking more risks than they should do,” Britton commented.

“They’ll tend to go, ‘You know what? We’ll take the risk because the benefits are huge.’ Until they get breached.”

Ocient’s report refers to a similar phenomenon – the Red Queen effect, where companies are running as fast as they can yet feel they’re making no progress. In that situation, the temptation for business leaders to get ahead by ‘moving fast and breaking things’ is very real, while the temptation on the IT side, which is acutely aware of the scale of the downside, is to lock down sensitive data.

Data Money

For some, though, there is a different approach to governance: embedding the right rules in the operational structure. At the TM Forum’s Innovate Americas event, Martin Kessler, Verizon’s VP, Deputy CISO and Head of Enterprise Cybersecurity Services, noted that “Operationalized governance isn’t a barrier — it’s how we go from pilot to production with confidence.”

That sounds good in theory, but what does it mean in practice? For Protegrity, it means data tokenisation. Britton explained it as follows:

“If your name is Ian, we’ll take 3 letters to represent your name, but it won’t be I, A and N. So I can still see, okay, that’s a name. I can see your date of birth; the first couple of digits are rubbish, and then I can see your year, which I might want to use for my data scientists. So you then have the ability to interchange letters or tokens with any personally identifiable information or information field so that it still looks like an address and it still looks like a name, but it’s actually not identifiable. I can’t work out that this is Ian, but I can see it’s a data field and I can then potentially still use that.”
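As a rough sketch of the idea (not Protegrity’s implementation; production platforms use vaulted or cryptographically managed tokenisation, and the key below is a placeholder), each character of a sensitive field can be deterministically swapped for another of the same type, so the shape of the data survives while the identity does not:

```python
# A minimal sketch of format-preserving tokenisation as Britton describes it.
# Illustration of the concept only, not Protegrity's implementation: real
# systems use vaulted or cryptographic tokenisation with managed keys.
import hmac
import hashlib

SECRET_KEY = b"replace-with-a-managed-secret"  # placeholder key for the sketch

LETTERS = "abcdefghijklmnopqrstuvwxyz"
DIGITS = "0123456789"

def _token_char(value: str, position: int, alphabet: str) -> str:
    """Deterministically map one character to another from the same alphabet."""
    digest = hmac.new(SECRET_KEY, f"{value}:{position}".encode(),
                      hashlib.sha256).digest()
    return alphabet[digest[0] % len(alphabet)]

def tokenise_name(name: str) -> str:
    """Swap letters for other letters, so the field still looks like a name."""
    return "".join(
        _token_char(ch.lower(), i, LETTERS).upper() if ch.isupper()
        else _token_char(ch, i, LETTERS) if ch.isalpha()
        else ch  # keep spaces, hyphens etc. so the format is preserved
        for i, ch in enumerate(name)
    )

def tokenise_dob(dob: str) -> str:
    """Scramble day and month but keep the year usable (DD-MM-YYYY input)."""
    day_month, year = dob[:5], dob[6:]
    scrambled = "".join(
        _token_char(ch, i, DIGITS) if ch.isdigit() else ch
        for i, ch in enumerate(day_month)
    )
    return f"{scrambled}-{year}"

print(tokenise_name("Ian"))        # e.g. "Qzv" -- still reads as a name field
print(tokenise_dob("14-03-1985"))  # e.g. "72-81-1985" -- year preserved
```

Because the mapping is deterministic, the same input always produces the same token, so records can still be joined, grouped and counted without anyone ever seeing the real values.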

Protegrity have put this forward as a way to enable in-house analysis of data in ways which comply with GDPR, for example to improve customer service. However, it also opens up the possibility of using tokenised telco data to support third parties, for instance by training third-party AI models. Finally, all that rich data can be leveraged in ways that regulators can’t object to and which don’t open the company up to risk.

So… are we on the verge of a step-change in how telecoms providers think of and use their data? Possibly. There are a host of operational reasons why the answer could be ‘no’, but all it will take is one service provider to take the capabilities that have been on show, or something like them, and run with them.

There are technology solutions out there now which can mask the complexity that has slowed service providers down before; and can enable the sharing and use of all that rich telco data in responsible ways. Monetising telecoms data through network APIs is certainly a good start, but it’s possible that we could see this going significantly further.

Note – we’ll be going into this more on 4th December in a webinar featuring Jesse Lynch from Ocient. More on this soon!

