By Andrew Gunder, DCMME Graduate Assistant


Edge computing is an emerging trend in the world of technology, yet its concepts aren’t completely new. Edge computing is defined as the practice of processing data near the “edge” of your network, where the data is being generated, instead of at a centralized data-processing warehouse. It gathers the data at its closest point to make that data actionable in the least amount of time. For example, smart thermostats use edge computing to determine when to adjust the temperature throughout the day, as sketched below.
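To make the idea concrete, here is a minimal sketch of the thermostat example. The function names (read_sensor, adjust_hvac) and the setpoint values are hypothetical placeholders; the point is that the decision is made locally on the device, and only a small summary would ever be sent upstream.

```python
from statistics import mean

TARGET_TEMP_F = 70.0   # assumed desired setpoint
TOLERANCE_F = 1.5      # how far readings may drift before acting

def read_sensor():
    """Placeholder for the thermostat's onboard temperature sensor."""
    return 72.4

def adjust_hvac(action):
    """Placeholder for the thermostat's local HVAC control signal."""
    print(f"HVAC command: {action}")

def control_loop(recent_readings):
    """Decide locally, at the edge, whether heating or cooling is needed."""
    current = mean(recent_readings)
    if current > TARGET_TEMP_F + TOLERANCE_F:
        adjust_hvac("cool")
    elif current < TARGET_TEMP_F - TOLERANCE_F:
        adjust_hvac("heat")
    else:
        adjust_hvac("hold")
    # Only a compact summary (not the raw sensor stream) is reported upstream,
    # which is how edge processing cuts latency and bandwidth.
    return {"avg_temp_f": round(current, 1)}

if __name__ == "__main__":
    readings = [read_sensor() for _ in range(5)]
    summary = control_loop(readings)
    print("Summary to report to cloud:", summary)
```

Because the raw readings never leave the device, the response is immediate and the central data center only sees occasional summaries rather than a constant stream.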

Edge computing is beneficial because it moves the computing workload closer to the consumer, thus reducing latency, bandwidth, and overhead for the centralized data center. Content delivery networks (CDNs) are a prime example of these benefits, such as reduced latency and higher uptime, achieved by storing information closer to the end user. Edge computing can also increase security and reduce the risk of a breach, since the data remains at its point of creation rather than being consolidated in a centralized location such as a server.

Edge computing has emerged as a result of the increased prevalence of the Internet of Things (IoT) and IoT devices. Where the network “edge” sits depends on the use case: cell towers, smartphones, and automated vehicles can all function as micro data centers for the network. As IoT and its related devices continue to expand into mainstream use, the edge will extend to more devices and locations around the globe.

What are some examples of devices that already use edge computing?

How is the IoT necessary for the success of edge computing?

How can businesses across different industries effectively use edge computing?

Source: https://www.cloudwards.net/what-is-edge-computing/