27 Sep 2021

How service providers can profitably ride the data wave and improve customer experience

Data growth is rapid and shows no sign of abating. According to recent studies, global data creation will exceed 180 zettabytes by 2025, a 127% increase on this year's projected output. This leap in data traffic, only exacerbated by 5G rollouts, makes it a real challenge for Service Providers to capture new opportunities while maintaining a high-quality customer experience that remains profitable. Conventional wisdom is simple: more data traffic means more network probes are needed to stay on top of it, maintain a positive user experience and prevent customer churn. Yet this conventional wisdom has a critical flaw: the expense of additional network probes can easily result in a massive unbudgeted overspend, with no guarantee of improving the customer experience and a possible reduction in ARPU.

The difficulty for Service Providers is therefore how to reduce overspend without degrading the end-user experience. Existing network visibility technologies and probes are not coping with current data traffic levels; instead, significant amounts of traffic are being dropped, with a knock-on impact on the operator's ability to analyse subscriber data. What is required is a rethink of how to make network monitoring both more effective and more cost-effective.

Breaking the cost curve

Currently, probing costs are directly proportional to throughput processing capabilities: the more data travelling across the network, the higher the investment in probes. Allowing tooling expenditure to keep rising alongside data volumes is unsustainable, especially as Service Providers' revenues remain largely flat. However, flattening the cost curve is possible, and doing so offers a clear competitive advantage: SPs can focus on the most profitable subscribers and traffic types, optimise operations, and prepare networks for the exponential data growth of the coming years.

Packet reduction for network optimisation

Let’s look at one example. For one global Telecommunications Service Provider (TSP), investment in a high-quality customer experience was a top priority, originally budgeted at €117m. Yet the organisation eventually racked up a further unbudgeted €11.7m on network probes as data volumes grew, a 10% overrun. This was not only unsustainable but also hampered the team’s ability to deliver projects efficiently and guarantee customer satisfaction.

To overcome this issue, the operator needed an innovative approach to infrastructure visibility. By utilising patented packet reduction techniques, it became possible to retain critical data that might previously have been dropped from the network, and to send only relevant traffic to the existing probing infrastructure. This not only reduced the volume of traffic reaching the probes but also improved operational efficiency and the quality of data available to the customer service team. Better data ultimately means a better customer experience, in turn improving ARPU. For this particular TSP, filtering data through packet reduction cut traffic volumes by 60%, and total cost savings came to €2.1 million per site.
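The scale of those savings follows from the proportional cost model described earlier: if probe spend scales linearly with the traffic the probes must process, removing traffic before it reaches them cuts spend in the same proportion. A back-of-the-envelope sketch (the per-site spend figure is a hypothetical assumption, chosen only to be consistent with the quoted €2.1M saving):

```python
# Illustrative only: assumes probe spend scales linearly with the traffic
# volume the probes must process, as described above. The per-site spend
# figure is hypothetical, not a figure from the case study.
probe_spend_per_site_eur = 3_500_000   # hypothetical pre-optimisation spend
traffic_reduction = 0.60               # 60% less traffic reaching the probes

# Under a linear cost model, required probe capacity (and therefore spend)
# falls in proportion to the traffic removed upstream of the probes.
savings_per_site_eur = probe_spend_per_site_eur * traffic_reduction
print(f"Savings per site: €{savings_per_site_eur:,.0f}")  # €2,100,000
```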

This network aggregation approach is built on the following four-stage process:

  • Filtering or flow mapping to pass just header data through to the probe, reducing payload and storage requirements
  • De-duplication to further reduce traffic volumes
  • GTP correlation to make the GTP traffic more consumable
  • Tool load balancing to enable the probes to see whole conversations but prevent oversubscription and dropped packets
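As a concrete illustration, the four stages above can be sketched as a simple pipeline. This is a toy model only: the packet fields, TEID values and session-hashing scheme are assumptions for illustration, not Gigamon's implementation.

```python
import hashlib

# Hypothetical packet records; field names are illustrative, not a real API.
packets = [
    {"flow": ("10.0.0.1", "8.8.8.8", 443), "header": b"H1", "payload": b"x" * 1400, "teid": 0x1001},
    {"flow": ("10.0.0.1", "8.8.8.8", 443), "header": b"H1", "payload": b"x" * 1400, "teid": 0x1001},  # duplicate
    {"flow": ("10.0.0.2", "1.1.1.1", 53),  "header": b"H2", "payload": b"y" * 120,  "teid": 0x1002},
]

def filter_headers(pkts):
    # Stage 1: filtering / flow mapping - forward only header data,
    # dropping payloads to cut probe processing and storage load.
    return [{k: p[k] for k in ("flow", "header", "teid")} for p in pkts]

def deduplicate(pkts):
    # Stage 2: de-duplication - drop repeats of the same packet
    # (e.g. the same traffic seen at two tap points).
    seen, out = set(), []
    for p in pkts:
        key = (p["flow"], p["header"])
        if key not in seen:
            seen.add(key)
            out.append(p)
    return out

def correlate_gtp(pkts):
    # Stage 3: GTP correlation - group packets by tunnel ID (TEID) so a
    # probe receives a subscriber's traffic as one coherent session.
    sessions = {}
    for p in pkts:
        sessions.setdefault(p["teid"], []).append(p)
    return sessions

def load_balance(sessions, n_probes):
    # Stage 4: tool load balancing - hash each session to one probe so
    # whole conversations stay together while load spreads across tools.
    assignment = {}
    for teid, session in sessions.items():
        probe = int(hashlib.sha256(str(teid).encode()).hexdigest(), 16) % n_probes
        assignment.setdefault(probe, []).extend(session)
    return assignment

reduced = deduplicate(filter_headers(packets))
assigned = load_balance(correlate_gtp(reduced), n_probes=2)
```

Session-aware hashing is the key design choice in the last stage: a plain round-robin spread would split a subscriber's conversation across probes, defeating the correlation done in stage three.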

This ultimately reduces operational expenditure and allows the service provider to direct those savings towards delivering a best-in-class customer experience. Further benefits include extending the operational life of the existing probe infrastructure, gaining greater visibility into network traffic and freeing the business to refocus its technology spend on higher-value areas.

Traffic intelligence is becoming indispensable for Service Providers, and visibility is central to ensuring the right intelligence can be leveraged to boost customer experience and optimise operations. While conventional thinking, adding ever more probing tools, is a flawed answer to unmanageable traffic growth, innovation through packet reduction can transform the way TSPs operate in an era dominated by the data deluge. To find out more about how service providers can break the cost curve with next generation visibility platforms, click here.


Guest blog by Matt Percival, Senior Director Service Provider, Gigamon. Follow Gigamon on Twitter and LinkedIn