

What is Edge Computing? Explained

When we think about cloud computing and the Internet of Things (IoT), we tend to treat edge computing as an afterthought. That is understandable: edge processing was first conceived simply as a way to work around the limits of bandwidth.

Edge computing is increasingly popular among organizations of all sizes and across verticals such as manufacturing, healthcare, automotive, and telecom. The global edge computing market is projected to grow at a significant compound annual growth rate (CAGR) over the 2017–2023 forecast period.

Edge computing is made up of several key elements that improve storage efficiency and processing speed while enabling real-time data collection across networks such as RFID (radio-frequency identification) systems and the Internet of Things (IoT). It can also monitor the data collected from edge devices, edge computers, and edge servers in real time, detecting changes in conditions that may affect the end-user experience.
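
To make that concrete, here is a minimal sketch (our own illustration, with invented names such as `THRESHOLD` and `send_alert`) of the kind of edge-side monitoring described above: each reading is checked locally, in real time, and only notable changes leave the device.

```python
# Minimal sketch of real-time monitoring at the edge (illustrative only).
# The edge device inspects each sensor reading locally and forwards an
# alert upstream only when a change of interest is detected, instead of
# streaming every raw value to a central server.

THRESHOLD = 75.0  # hypothetical alert threshold (e.g., degrees Celsius)

def send_alert(reading: float) -> None:
    # Placeholder for the upstream call (MQTT, HTTP, etc.).
    print(f"ALERT: reading {reading} exceeded {THRESHOLD}")

def monitor(stream) -> None:
    for reading in stream:
        if reading > THRESHOLD:   # the decision is made at the edge
            send_alert(reading)   # only anomalies leave the device

monitor([70.1, 72.4, 76.3, 71.0])  # example readings; one alert fires
```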

Image Credit: Wikipedia/HABytes

The concepts behind edge computing are still developing as the edge moves beyond IoT into areas such as augmented reality (AR). AR is “an enhanced version of reality where the computer graphics are merged with an incoming video stream”.

The primary software used for AR development is Unity 3D. Unity uses a game-engine architecture that allows developers to create 3D games using elements within the same environment. Edge computing has much to offer here: it improves response times and minimizes lag by distributing work to remote or mobile units rather than relying on central computing facilities.

Edge computing can be seen as part of the “edge of everything”, with the edge defined as “the place where two or more networks connect”. Edge computing, then, is the practice of storing and processing data at the edge of a network on devices referred to as edge servers.

Edge servers are edge devices designed with cloud computing in mind; they typically have their own storage, which allows developers to host edge computing applications on them.
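
As a rough illustration (the names below, such as `origin_fetch`, are made up for this sketch), an edge server can answer requests from its own local storage and contact the central origin only when it does not already hold the data:

```python
# Toy edge-server cache (an illustrative sketch, not a real product API).
# Data is stored at the edge; the central origin is contacted only when
# the edge server does not already hold the requested item.

cache: dict[str, bytes] = {}  # the edge server's local storage

def origin_fetch(key: str) -> bytes:
    # Placeholder for a (slow) request to the central data center.
    return f"payload-for-{key}".encode()

def edge_get(key: str) -> bytes:
    if key not in cache:              # cache miss: one trip to the core
        cache[key] = origin_fetch(key)
    return cache[key]                 # later hits never leave the edge

edge_get("video-chunk-42")  # fetched from the origin once
edge_get("video-chunk-42")  # now served entirely from the edge
```

Every request after the first stays at the edge, which is exactly the bandwidth saving edge computing was originally conceived for.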

The History of Edge Computing

The history of edge computing can be traced back to the late 1970s and early 1980s, when power requirements changed constantly along with fluctuating network speeds. This led engineers to introduce edge servers into their infrastructure, which allowed for edge processing and edge storage.

Complex applications, such as a large-scale stock exchange, required computing power that was not feasible on edge devices or edge servers alone, and this tension pushed companies toward edge computing as a complement to centralized systems.

Image Credit: Gigabyte/HABytes

Edge computing was made possible by the introduction of edge servers, which allowed edge data to be collected and stored; edge devices were developed in response, providing edge processing. The edge device concept can also be traced back to a predecessor: distributed computing using task splitting. Task splitting is the process of dividing a problem into smaller tasks that are assigned to different computers, and edge devices serve exactly this purpose in edge computing environments, as the sketch below illustrates.
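
As a hedged sketch of that idea (every name here is invented for illustration), the snippet below splits one large job into chunks and hands them to separate worker processes, the same pattern edge devices follow when they share a workload:

```python
# Task splitting: divide a problem into smaller tasks and assign them to
# different workers; local processes stand in for edge devices here.
from multiprocessing import Pool

def subtask(chunk: list[int]) -> int:
    # Each worker handles one piece of the overall problem.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000))
    chunks = [data[i:i + 250] for i in range(0, len(data), 250)]  # split
    with Pool(processes=4) as pool:
        partials = pool.map(subtask, chunks)  # tasks run in parallel
    print(sum(partials))                      # combine the partial results
```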

Edge computing can also be traced back to the early days of the internet, when similar techniques were used to manage traffic. The term “edge computing” itself, however, did not come into wide use until around 2010, when a Cisco Systems paper used it to outline a new way to manage data traffic and improve performance.

Since then, edge computing has become an important part of the internet infrastructure. It is used to manage data traffic and improve performance for a variety of applications, including:

– Streaming media

– Online gaming

– Autonomous vehicles

– Industrial IoT

– Smart cities

The Future of Edge Computing

The future of edge computing may come faster than expected.

More people are beginning to realize that the internet has become an integral part of everyday life. We rely on it more and more for work, education, entertainment, communication, socializing, shopping, politics… the list goes on. It has become close to a necessity.

So, by consequence, has the cloud. And with the amount of data sent over this infrastructure growing rapidly every year, innovation will eventually be needed at both ends: the core (the backbone), where most chief information officers are currently putting their energy and capital expenditure, and the edge, the home networks and data centers closest to users’ devices, which is where the future of edge computing may lie.

The so-called “Internet of Things” (IoT) will require intense computing power, and companies are ‘anticipating more data than ever before, especially as more users come online through 4G networks’.


As a result, gaining intelligence at the edge, whether at home or in business, will become imperative for companies that want to remain competitive.

The major players in this new field are Samsung Electronics Corp, ARM Holdings PLC, Qualcomm Inc, Alphabet’s Google unit, and International Business Machines Corp, all currently selling some type of chip technology to drive innovation at the edge.

They are attempting to make it possible for high-bandwidth activities such as video streaming and augmented reality to take place without buffering or delay, which can be caused by round trips to the cloud.
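
A back-of-envelope example makes the point (the latency figures below are assumptions, not measurements): at 60 frames per second an AR application has roughly 16.7 ms per frame, and a round trip to a distant cloud region can exceed that budget on its own.

```python
# Rough latency check with assumed, illustrative numbers.
FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps
cloud_rtt_ms = 80.0          # assumed round trip to a far-away cloud region
edge_rtt_ms = 5.0            # assumed round trip to a nearby edge node

print(cloud_rtt_ms <= FRAME_BUDGET_MS)  # False: the cloud blows the budget
print(edge_rtt_ms <= FRAME_BUDGET_MS)   # True: the edge fits within it
```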

Edge computing will play a role in reducing traffic on cellular networks and improving response time for users.

Samsung is making moves to put more of its hardware and software capabilities at the edge. For instance, it has developed an edge computing platform called Artik that is being marketed to companies in a range of industries, including transportation, health care, and manufacturing. It also offers a service that helps customers manage their devices and data on the edge.

Image Credit: Shale Gas Outrage/HABytes

ARM Holdings PLC, which designs microprocessors, has been expanding its capabilities in this area as well. Qualcomm Inc., whose processors are used in many smartphones, is also looking to make headway in edge computing.

Google, for its part, has been making moves to put more of its services and applications, including Search, Maps, YouTube, and advertising, at the edge. The company has introduced a tool called Cloud IoT Core that helps customers manage devices and data at the edge.

And it has teamed up with companies such as Samsung and ARM Holdings to develop chips and other technology specifically for edge computing. International Business Machines Corp., which has long been a player in the cloud-computing market, is also focusing on edge computing. It recently acquired The Weather Company, which operates a network of weather sensors that can be used to collect data closer to where it is needed.


This post was originally published on 6 December 2021 and is updated frequently as new information becomes available.


