22 Nov 2023
by David Clarke

How the public sector can avoid 1990s excess at the cloud’s edge (Guest blog from NEC Software Solutions)

Author: David Clarke, Head of Infrastructure Technical Services, NEC Software Solutions 

The global cloud computing market is currently growing at a rate of 18% per year. The speed of adoption picked up in the 2000s, when Google and Amazon jumped in. And it was turbo-charged at the start of the 2020s by the pandemic, when many organisations found themselves disconnected from their key systems. 

For our clients – which range from government agencies running critical national infrastructure to the smallest local authorities – it’s keeping the 1990s in mind that will help them get the most from the cloud at the lowest possible cost.  

A quick introduction to edge computing 

To understand why the 90s matter, we need to look at what’s happening today. Let’s start with clouds. We use them to connect to different services over the internet, and those services can range from basic storage to serious computing power. But if every service calls on the same cloud continually, sending and requesting data over long distances, there’s a high risk of both latency and excessive cost, not to mention complete reliance on the network. 

The principle of edge computing is that you reduce latency and cost by processing data closer to where it’s used and generated. In video streaming, for example, edge devices compress content before transmission to use less bandwidth. And if you imagine an enormous cloud mothership sending out smaller cloud servers to where they’re needed, that’s an edge computing network. 
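To make that concrete, here's a minimal Python sketch of the pattern (the sensor, its sample rate and the payload format are all invented for illustration): an edge node summarises a batch of raw readings locally and sends only the summary upstream, so far less data has to cross the network.

```python
import json
import statistics

# Hypothetical one-minute batch from a 10 Hz sensor (values invented).
raw_readings = [20.1 + 0.01 * i for i in range(600)]

def summarise_at_edge(readings):
    """Reduce a raw batch to a small summary before it leaves the site."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(statistics.mean(readings), 3),
    }

raw_payload = json.dumps(raw_readings).encode()
edge_payload = json.dumps(summarise_at_edge(raw_readings)).encode()

print(f"raw upload:  {len(raw_payload):,} bytes")   # several kilobytes
print(f"edge upload: {len(edge_payload):,} bytes")  # well under 100 bytes
```

Running it shows the summary is a tiny fraction of the raw upload, which is the same trade-off the video streaming example makes with compression.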

What lessons can we learn from the 1990s? 

We’ve been here before, of course. Processing data close to its users is exactly what happened with the client-server model 30 years ago. We were obsessed with the rich graphical user interfaces that replaced the simple terminals connected to large servers (not dissimilar to the cloud). By introducing rich client devices, we moved data processing right next to the user - their own PC. After a while, those PCs needed big memory upgrades to cope with more complex applications. Later still, the applications themselves became costly to update, because the number of machines they were running on jumped from tens to thousands.  

For the public sector bodies in NEC’s markets, I think there’s a risk of repeating this mistake and throwing everything into the cloud (or its edge) without considering the long-term implications. 

Why clouds are full of unintended consequences 

The very existence of the phrase ‘hybrid cloud’ proves it’s not a simple question of in or out. The set-up is infinitely variable. Choose well and you can close the gaps in your cybersecurity and disaster recovery, enjoying 24/7 surveillance and advanced redundancy, with servers, data and applications replicated across different geographies. But choose badly, and you could find out where you went wrong far too late.  

Then there’s the cost. Firstly, you don’t need to do something just because you can. Backups can get expensive, for example, if they’re re-saving static data every day rather than securing it once. We’ve also seen organisations leave the public cloud and ask their software vendor to switch them to SaaS once they discover the hidden costs and complexities of managing their applications.  
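As a back-of-envelope illustration (all the figures below are invented, not drawn from a real deployment), compare retaining a full copy every day with keeping one full copy plus daily incrementals:

```python
# Back-of-envelope storage comparison (all figures invented for illustration).
dataset_gb = 500        # total data held
daily_change = 0.02     # share of the data that actually changes each day
retention_days = 30

full_every_day = dataset_gb * retention_days
one_full_plus_incrementals = dataset_gb + dataset_gb * daily_change * (retention_days - 1)

print(f"Full backup every day:       {full_every_day:,.0f} GB retained")
print(f"One full + daily increments: {one_full_plus_incrementals:,.0f} GB retained")
```

On these assumptions the daily-full approach retains roughly twenty times as much data, which is exactly the kind of hidden cost that prompts a rethink.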

How to match the solution to the problem 

The organisations we support aren’t just in ‘our’ cloud. We often take a hybrid approach, managing some services in Azure and others in the Oracle cloud, because that’s the most cost-effective solution for a particular client. Some older applications simply weren’t ready for cloud deployment, so interim solutions were needed to bridge the gap. Understanding an application’s future roadmap is crucial to timing things right: forcing a square peg into a round hole just stores up problems for later.  

In our experience, public bodies of all sizes are looking for control, certainty and security. Control over upgrade timings is particularly important, because demand for their services can peak sharply and they need all their integrations working. They also want to be certain of availability, particularly for 24hr services, and to be clear on the ongoing costs. Cybersecurity is critical too, particularly when smaller organisations are struggling to recruit and retain experienced teams. 

Getting ready for AI and IoT 

A careful approach to edge computing is also vital for the future of public services. AI and IoT may seem further away for the public sector, but sensors are already sending live data on air quality to housing providers and it’s an area that’s set to grow. Getting the right mix of cloud vs on-premise, and core vs edge, is going to be vital to the sustainability of key services because it will have a material effect on cost. But throwing everything at the edge to save money is no panacea, just as it wasn’t in the 1990s with client-server, so we need to keep that lesson in mind. 
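One sketch of what "the right mix" can look like in practice - the threshold and field names below are hypothetical, not from a live system - is to filter at the edge, forwarding only the readings that matter rather than streaming every sample to the cloud:

```python
# Hypothetical edge filter: only readings that breach an alert level
# are forwarded, instead of streaming every sample to the cloud.
ALERT_PM25 = 35.0  # invented fine-particulate alert threshold, in µg/m³

def on_reading(pm25, send_to_cloud):
    """Called for every local sample; only exceptional ones leave the site."""
    if pm25 >= ALERT_PM25:
        send_to_cloud({"metric": "pm2.5", "value": pm25, "alert": True})

sent = []
for sample in (12.0, 18.5, 41.2, 9.8):  # four local samples, one breach
    on_reading(sample, sent.append)

print(sent)  # only the 41.2 reading used any bandwidth
```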



