Edge computing is a philosophy in the field of networking that focuses on bringing computation as close as possible to the source of the data, reducing both bandwidth use and latency. In simpler terms, edge computing moves processes out of the cloud and runs them in local places such as a user's computer or device.
The name itself comes from the idea that computation is performed at the edge of the network, close to where the data is produced or consumed.
The growth of IoT devices gave birth to the concept of edge computing. These devices connect to the internet, gather large amounts of data, and deliver it to the cloud or receive data back from it.
Benefits of Edge Computing
The major benefits of edge computing are as follows:
- Cost Saving
Server resources for any product or project cost a considerable amount of money. By moving processing onto local devices, edge computing reduces the server resources required, leading to significant cost savings.
- Improved Performance
Whenever a device needs to communicate with a distant server, a delay is introduced. By running processes at the network edge, edge computing removes much of that round trip and improves an application's responsiveness.
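The effect of that round trip can be sketched with a toy timing model. The latency figures below are illustrative assumptions (not measurements), simulated with `time.sleep`:

```python
import time

def process_request(network_latency_s: float) -> float:
    """Simulate handling one request and return the total response time.

    network_latency_s models a single round trip to the server;
    both delay values used below are illustrative assumptions.
    """
    start = time.perf_counter()
    time.sleep(network_latency_s)  # travel to the server and back
    time.sleep(0.002)              # the computation itself
    return time.perf_counter() - start

cloud_time = process_request(0.120)  # assumed ~120 ms to a distant region
edge_time = process_request(0.005)   # assumed ~5 ms to a nearby edge node

print(f"cloud: {cloud_time * 1000:.0f} ms, edge: {edge_time * 1000:.0f} ms")
```

Even in this crude model, the network round trip dominates the total response time, which is exactly the cost edge computing avoids.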
- New Functionality
Edge computing has also made new functionality possible. For instance, a company can analyse data in real time at the point where it is generated and feed the results straight back into a process.
Drawbacks of Edge Computing
One major drawback of edge computing is that it widens the attack surface. When data sits at the edge and is handled by many devices, each of those devices becomes a potential target for malicious attackers.
Another limitation of edge computing is that it requires a lot of local hardware. Reliability is also a concern: an edge device depends on local electricity and network connectivity, both of which can vary greatly from site to site.