What Is Edge Computing & Why Does It Matter?
In an edge computing model, data is processed close to the source. It doesn't need to travel up to a central location and back again. Instead, the work happens very close to (or even on) the device.
Let's give a quick example.
You own a refrigerator that can connect to the internet and send you a reminder to pick up eggs when the tray is empty. Every few minutes, that device weighs the egg cups to determine if the reminder should go out or not.
In a traditional setup, the refrigerator would record each cup weight and send it to the cloud for long-term storage. In an edge computing model, it would record only empty cup alerts.
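The difference can be sketched in a few lines of code. This is a minimal illustration, not a real appliance API: the function names (`check_egg_tray`, and the `read_cup_weights` / `send_to_cloud` calls mentioned in comments) and the 5-gram threshold are all assumptions made for the example.

```python
# Edge-side filtering sketch: the check runs locally, and only the
# rare "tray empty" event ever leaves the device.

EMPTY_THRESHOLD_GRAMS = 5.0  # assumed weight below which a cup counts as empty


def check_egg_tray(cup_weights):
    """Return an alert payload if every cup is empty, else None."""
    if all(weight < EMPTY_THRESHOLD_GRAMS for weight in cup_weights):
        return {"event": "tray_empty", "cups": len(cup_weights)}
    return None  # nothing worth sending; raw weights stay on the device


# Traditional model: every raw reading is uploaded.
#     send_to_cloud(read_cup_weights())
# Edge model: only the occasional alert is uploaded.
#     alert = check_egg_tray(read_cup_weights())
#     if alert:
#         send_to_cloud(alert)
```

The cloud never sees the thousands of routine weight readings; it sees only the handful of alerts that actually matter.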
Let's dig deeper and give you even more information about what an edge network is and what it can do.
What is edge computing?
The simplest edge computing definition sounds something like this: A connected device processes the data it generates close to the point of origin, not in the cloud or on a faraway server.
It's easiest to understand what edge computing is by understanding what it is not.
In a traditional system, all monitoring data heads to the cloud or a corporate server for analysis and processing. The cloud or server stores the useful data and discards the rest. The device and the server are in constant contact.
A system like this comes with plenty of drawbacks. Organizations can end up with far more data than their systems can handle, and processing in faraway locations can lead to performance issues, such as choppy video or lost alerts.
Edge computing puts the processing very close to the point of origin. It eliminates that constant connection, and the processing happens quickly with few latency problems. Companies can focus on delivering a great experience rather than figuring out how to deal with mountains of data.
How does edge computing work?
Everyone has a slightly different definition of edge computing, and most disagreements concern exactly where the processing happens. But all experts agree that edge computing involves processing close to the point of origin.
Tools used in edge computing include:
- Wireless sensor networks
- Mobile devices
- Local servers
A connected device generates data. That data is examined quickly, either on the device itself or on a server close to it. Only the important parts head to the cloud or a central server.
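The generate-examine-forward flow above can be sketched in a few lines. This is an illustrative sketch only: the `Reading` type, the `is_important` rule, and the threshold value are assumptions, not part of any real edge framework.

```python
# Edge pipeline sketch: a local node examines each reading and
# returns only the slice worth forwarding upstream.

from dataclasses import dataclass


@dataclass
class Reading:
    sensor_id: str
    value: float


def is_important(reading, threshold=100.0):
    # Local rule applied at the edge; the threshold is assumed for
    # illustration. Real rules might detect motion, anomalies, etc.
    return reading.value > threshold


def process_at_edge(readings):
    """Keep routine data local; return only what should go to the cloud."""
    return [r for r in readings if is_important(r)]


batch = [Reading("door-1", 12.0), Reading("door-1", 140.0)]
upstream = process_at_edge(batch)  # only the 140.0 reading is forwarded
```

The routine 12.0 reading never leaves the device; only the notable one travels on to central storage.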
Why you should care about edge computing
In the 1990s, the computer sitting on your desk created data. In the 2020s, your refrigerator, dryer, mobile device, car, and doorbell could all create data. This web of interconnected devices, called the Internet of Things (IoT), makes edge computing crucial.
In 2017, Gartner predicted a jump in connected devices: by 2020, the firm estimated, 20.4 billion connected devices would be up and running, each generating mountains of data.
5G technology has only accelerated that growth. As 5G spreads around the globe, more companies build devices that can connect, and consumers expect impressive speed from those connected devices.
For example, if your neighborhood offers 5G and you install a connected doorbell, you likely expect it to:
- Stay in touch. Every time you look at your phone's app, you want to see current video.
- Remain alert. Whenever something suspicious happens, such as someone lurking at your door, you want your doorbell to notify you.
- Offer proof. If something does happen, you want video footage you can share with the authorities.
Supporting one doorbell might not be a problem for the average company. But what happens when there are hundreds or thousands of these devices, and they all shoot information to servers across the globe?
In this environment, edge computing can:
- Scale. If a company adds thousands of devices, and they're all capable of pre-processing the data they generate, the company won't need to constantly upgrade storage capacity.
- Sustain. A system that isn't overwhelmed with data tends to stay online rather than crashing.
- Speed. Interconnected devices need low latency; lag makes them much less effective. Local processing makes that low latency possible.
As we add more connected devices to our world and become accustomed to using them around the clock, we will need edge computing's power.
3 drawbacks associated with edge computing
While edge networks have plenty of benefits, they also have some problems.
Common concerns associated with edge systems include:
- Privacy risks. If connected devices that aren't secure handle sensitive data, theft is possible. Consumers may also end up sharing information with manufacturers that they wish they'd kept to themselves.
- Reliability. If devices don't connect with long-term storage regularly, and they suffer some kind of catastrophic failure, customers could lose some of the data they need.
- Security. Attackers may be able to hack into an edge node, and if they do, they could tap into sensitive information.
These risks may not outweigh the benefits of edge computing. You may read through them and decide that you'd like to take the plunge anyway. But it's wise to know the tradeoffs you face.
At Okta, we keep a close eye on topics concerning IoT and security. We wrote an interesting blog post about the trends that affect cloud identity in an IoT environment. Check it out.
The Edge Computing Model for Storage Is Looking Better All the Time. (July 2020). IT Pro Today.
How to Explain Edge Computing in Plain English. (November 2020). The Enterprisers Project.
Gartner Says 8.4 Billion Connected Things Will Be in Use in 2017, Up 31 percent from 2016. (February 2017). Gartner.
What 5G Promises for IoT. (October 2020). Network World.
Why We Need Low-Power, Low-Latency Devices. IEEE Innovation at Work.
Edge Computing Security and Privacy. (June 2020). Journal of Cloud Computing.