Edge Computing Definition and Examples

edge computing: [ej com-pew-ting] noun.

According to the National Institute of Standards and Technology (NIST), edge computing is “the network layer encompassing the end-devices and their users, to provide, for example, local computing capability on a sensor, metering or some other devices that are network-accessible. This peripheral layer is also often referred to as (Internet of Things / IoT) network.” In other words, edge computing puts computing power closer to the edge devices where it’s needed, in order to reduce latency.

For example, instead of centralizing server processing power in a data center in a faraway location, an edge computing implementation might employ many smaller, closer-in data centers set up to serve individual neighborhoods.
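
To make the latency motivation concrete, here’s a minimal Python sketch (with made-up endpoint URLs) of a client that probes a distant central data center and a nearby edge node and picks whichever answers fastest. Real deployments usually steer traffic with DNS or anycast routing rather than client-side probing, so treat this purely as an illustration.

```python
import time
import urllib.request

# Hypothetical endpoints for illustration: a faraway central data center
# versus a nearby neighborhood edge node.
ENDPOINTS = [
    "https://central.example.com/process",       # distant data center
    "https://edge-node-12.example.com/process",  # nearby edge site
]

def fastest_endpoint(endpoints):
    """Probe each endpoint and return the one with the lowest round-trip time."""
    best_url, best_rtt = None, float("inf")
    for url in endpoints:
        start = time.monotonic()
        try:
            urllib.request.urlopen(url, timeout=2).close()
        except OSError:
            continue  # skip endpoints we can't reach
        rtt = time.monotonic() - start
        if rtt < best_rtt:
            best_url, best_rtt = url, rtt
    return best_url

print(fastest_endpoint(ENDPOINTS))
```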

For context, edge computing is not a new term, but according to Google, interest in the concept spiked in 2016. The term’s recent popularity has to do with our increasing reliance on sensors and IoT devices, which often generate vast amounts of data. And as more IoT devices come online, there will be more and more data that needs to be processed quickly.

[Chart: Google Trends search interest in “edge computing” over time. Source: Google Trends]

And while cloud computing has many advantages, edge computing is increasingly seen as a way to address some of the cloud’s shortcomings, such as potentially low bandwidth, high latency, and disruptions to service.

Many in the industry expect edge computing to become increasingly important as more everyday objects contain computer chips. (Consider the recent smart-device boom, which has included cars, drones, autonomous robots, sensors, and more.)

Edge Computing Examples

Now that we know the concept behind edge computing, let’s examine some real-world use cases that help explain when edge computing implementations make sense.

Autonomous vehicles — Modern self-driving cars are basically huge computers on wheels, packed full of sensors. When you’ve got a vehicle flying down the road at high speed, constantly analyzing its surroundings and communicating with other cars, a lot of data-crunching has to happen, and time is of the essence. If a hazard arises, the autopilot system needs to respond instantaneously. There’s no time to send data to a cloud server for processing when a split second could make a world of difference.

Real-time facial-recognition systems — Advanced security solutions that use cameras to spot suspicious people or potentially dangerous behaviors on the fly need to identify threats quickly. By processing information on the device, or on a server located very close to the source, such a system might gain insights in time to prevent security incidents before they occur. Additionally, due to the sensitive nature of this type of data, some organizations might hesitate to process or store it in a public cloud provider’s data center.
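
As a rough illustration of the “process on the device” idea, here’s a hedged Python sketch. The names are invented for the example, and detect_threat stands in for whatever local inference a real camera system would run; the point is that raw footage stays on site and only a small alert record crosses the network.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    camera_id: str
    pixels: bytes  # raw image data; in this design it never leaves the site

def detect_threat(frame: Frame) -> bool:
    """Stand-in for an on-device model; a real system would run a local
    inference engine against the frame here."""
    return False  # no threat detected in this placeholder

def process_on_device(frame: Frame, send_alert) -> None:
    # All heavy analysis happens on (or right next to) the camera.
    # Only a compact alert record ever crosses the network, which addresses
    # both the latency and the data-sensitivity concerns described above.
    if detect_threat(frame):
        send_alert({"camera": frame.camera_id, "event": "possible_threat"})

process_on_device(Frame("lobby-cam-1", b""), send_alert=print)
```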

Remote safety monitoring — In some far-flung industrial centers, cloud computing might not make sense, due to inadequate internet connectivity. For example, say you’ve got sensors monitoring an oil rig in the middle of nowhere. If you rely on processing this data in a centralized data center hundreds of miles away, there’s going to be a lot of latency. But if you process and analyze data on site, you might be able to react to a situation as it develops, potentially preventing a safety disaster.
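
Here’s a minimal sketch of that on-site pattern, again in Python with invented names and an illustrative threshold: readings are checked locally so the safety response never waits on the uplink, and only a compact summary is queued for the central data center.

```python
import statistics

PRESSURE_LIMIT_KPA = 900  # illustrative threshold, not a real rig spec

def monitor_on_site(readings_kpa, shut_valve, queue_for_upload):
    """Evaluate sensor readings locally so safety actions never depend on
    a distant data center or a flaky satellite link."""
    for reading in readings_kpa:
        if reading > PRESSURE_LIMIT_KPA:
            shut_valve()  # immediate local reaction to the hazard
            break
    # Only a small summary is queued for the central data center, to be
    # transmitted whenever connectivity happens to be available.
    queue_for_upload({
        "mean_kpa": round(statistics.mean(readings_kpa), 1),
        "max_kpa": max(readings_kpa),
        "samples": len(readings_kpa),
    })

# Example: the second reading trips the local safety response.
monitor_on_site([850, 910, 870],
                shut_valve=lambda: print("valve closed"),
                queue_for_upload=print)
```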

Edge Computing Challenge: Comment below!

Now it’s your turn. How would you explain edge computing simply, to others who aren’t necessarily tech savvy?

What experiences have you had with edge computing, and how do you feel about this term? Does it accurately match the concept it describes?

82 Spice ups

Umm… Edge Computing: The way it was?

So, for most of my career, processing happened onsite, with transmission of end-of-day (or, at best, hourly) results to a data center.

Then came “The Cloud” - Someone else’s computer. Folks rushed to that. No need for onsite processing, or those nasty servers.

Now “Edge Computing” is here. Processing happens onsite with transmission of results to a data center.

Welcome to the marketing hamster wheel.

53 Spice ups

I just tell them I’m an Edgelord.

16 Spice ups

Edge Computing, let’s see if we can make humans even lazier and more dependent on technology.

OR, we really don’t have enough attack vectors on our critical infrastructure so let’s increase the number of devices that can communicate by what your mathematical types call An Order of Magnitude.

Remember, computing started out consolidated because there was no way to justify the cost of anything else. Then we went to a distributed model, with computing power on every desktop. Cloud took us back to consolidated, and now we swing back to distributed again.

3 Spice ups

^^^ this. ^^^

You see the same lifecycle with facial hair popularity.

10 Spice ups

It’s a fancy buzzword whose purpose & meaning is more academic than practical.

3 Spice ups

Laziness pays off now, so I have a beard.

I always thought edge computing was just going to a Bitcoin-mining site in Microsoft’s latest browser and having your CPU get maxed out by a dodgy script on the page, so you start making coins for them.

4 Spice ups

Wake me up when we reach the state of singularity computing.

2 Spice ups

I just figured Edge computing had to do with “The Edge” from the band U2 LOL!

[image attachment: download.jpg]

31 Spice ups

I was expecting the article to be a bit more edgy.

8 Spice ups

Edge is the biggest headache in security right now. Trying to find ways to secure IoT and peripheral devices in a healthcare setting… keeps us on edge.

4 Spice ups

Oh, I thought “edging” was…um, nevermind.

11 Spice ups

So let me see if I have this history of computing right:

  1. Centralized Mainframes
  2. Time Share on those mainframes
  3. Semi-Distributed computing with the rise of Unix
  4. Widespread distributed computing with the IBM PC
  5. Centralized Servers connecting PCs (generic term here) to share files - Servers are frequently mainframes.
  6. Centralized Client/Server with “thin” clients - the bulk of the computing is centralized
  7. Widespread client distribution with centralized computing (early smartphones)
  8. Decentralized computing with smartphones and cars, alongside client/server architectures that use centralized web services but put more computing power on the client

Looks to me like we keep going back and forth between “core” and “edge” computing. We’ve finally gotten to the point where the edge devices have enough horsepower to take over a lot of the services previously relegated to servers and mainframes.

5 Spice ups

I should probably clarify a little more. Another interpretation of edge computing is having much smaller data centers located on the neighborhood level to serve IoT devices, instead of having enormous facilities that cover vast regions. Some have even suggested that edge computing might evolve into some kind of peer-to-peer or mesh model where more powerful IoT devices talk to each other and solve problems together.

Fog computing is another concept … it’s like a more advanced form of edge computing that interacts with the public cloud depending on workload, need, and geographic location.

So, you could use your fog machine for that too.

Oh yeah … there’s mist computing too. So, you’ll get your money’s worth.

1 Spice up

Don’t forget Vapor Edge Computing.

2 Spice ups

Why do we create fancy names for simple things…

“The Cloud” - someone else’s servers/crap.

I wonder what it’s going to be called when the Cloud is old and everyone starts putting all the infrastructure back into their environments…

2 Spice ups

MR.Burnz …

I have to say, that graphic alone made reading all these posts worth it.

Update: I’m working on getting company approval for a fog machine.

4 Spice ups