Edge Computing: Pushing Cloud to the Brink
Cloud computing has become one of the most common business technologies in use today – and for good reason. Via the cloud, even small businesses have access to resources they could never have dreamed of before, delivered from anywhere in the world at a price they can afford. From enormous data storage capacity to cutting-edge cybersecurity and a million different business apps, the cloud has enabled a new age of digital business innovation – and it’s just getting started.
And just as the cloud has rewritten the rule book on digital business optimisation, new technologies are optimising the way we use the cloud – the most notable of them being edge computing.
While accessing data and processing power from across the planet at any time of day or night has been cloud computing’s main appeal for years, it does have one drawback: it’s a bandwidth-intensive process. Even with a decent Internet connection, the vast distances the data must travel mean it’s still not quite fast enough to create a seamless experience in applications like voice telephony or virtual assistants such as Amazon’s Alexa, where the delay is often quite noticeable. Reducing this delay will also be essential for cloud-based processing that must occur in real time – for the AIs, robotics and self-driving cars that are set to change the way we live and work in the coming years. Put simply, the cloud is good, but it could be better. And that’s where edge computing comes in.
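To get a rough sense of why distance alone matters, consider the physics: a signal in optical fibre travels at roughly two-thirds the speed of light, or about 200 km per millisecond. The sketch below is a simplification – real networks add routing, queueing and processing delays on top – but the best-case arithmetic already shows the gap between a distant data centre and a nearby edge node.

```python
# Illustrative back-of-the-envelope calculation. Assumption: signals in
# fibre travel at ~2/3 the speed of light (~200 km per millisecond);
# real links are slower once routing and queueing are included.
SPEED_IN_FIBRE_KM_PER_MS = 200

def min_round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time imposed by distance alone."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_PER_MS

# A data centre on another continent vs. an edge node in the same city:
print(min_round_trip_ms(8000))  # transatlantic-scale: 80.0 ms before any processing
print(min_round_trip_ms(50))    # nearby edge node: 0.5 ms
```

Eighty milliseconds may sound small, but it is a floor, not a total – and for voice, robotics or vehicle control, delays stack up quickly once real-world network overheads are added.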
So, what exactly is edge computing?
The term “edge computing” refers to the physical geography of where the actual computing takes place. The advent of cloud computing centralised these processes in the data centres of cloud service providers. These servers and data centres are where all the action happens, before the information is delivered via the Internet to devices at the “edge” of the network – PCs, tablets, smartphones and other IoT devices.
Edge computing brings these processes closer to where the data is generated and ultimately needed. While much of the work still happens in cloud data centres, edge computing shifts a large share of the processing onto the devices and gadgets already at the “edge” of the network – significantly improving speed and efficiency.
Will edge computing ever replace the cloud?
The short answer: probably not.
As more businesses adopt edge computing, it will be up to them to decide how much of their computing can be performed on edge devices, and how much should remain in the core. For example, an AI-powered virtual assistant in your home or a manufacturing robot in a smart factory might take on some of its own processing for added speed and a better user experience, while still delivering the data it gathers each day to a centralised server or data centre, where it can be analysed and used to drive improvements in the long run.
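That hybrid pattern – act locally, report centrally – can be sketched in a few lines. The class and threshold below are purely illustrative, not taken from any particular platform: the device makes latency-sensitive decisions on the spot, and only a compact summary travels to the cloud for long-term analysis.

```python
from statistics import mean

class EdgeDevice:
    """Illustrative sketch of the hybrid model: time-sensitive work is
    handled on the device; raw readings are batched for the cloud."""

    def __init__(self):
        self.pending_upload = []

    def handle_reading(self, value: float) -> str:
        # Latency-sensitive decision made locally, with no round trip...
        action = "alert" if value > 90.0 else "ok"
        # ...while the raw reading is kept for later central analysis.
        self.pending_upload.append(value)
        return action

    def daily_summary(self) -> dict:
        # What actually gets sent to the central data centre: a compact
        # summary rather than every raw reading, saving bandwidth.
        batch, self.pending_upload = self.pending_upload, []
        return {"count": len(batch), "mean": mean(batch), "max": max(batch)}

device = EdgeDevice()
for reading in (72.0, 95.5, 60.0):
    print(device.handle_reading(reading))  # immediate local responses
print(device.daily_summary())              # one small upload per day
```

The design choice is the point: the decision that cannot wait happens at the edge, while the data that benefits from scale and history still flows to the core.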
While edge computing does mean loading more and more IoT devices with added processing power, it’s far more likely to form a symbiotic relationship with the way we use cloud computing and storage today. It’s a supplement – not a replacement – and a relatively new one at that. Only time will tell exactly what advantages the edge might hold, but ultimately, achieving greater consistency between the services you use in the cloud and the devices you run them on can never be a bad thing.