Existing on-premises and centralized cloud infrastructure can't support the vast computing needs of these powerful applications, which require low latency (minimal data-transfer delay) for smooth, real-time access to data. To reduce latency
and bandwidth use, as well as rein in costs, computing power and processes must sit closer to the physical location of the data. The solution? Move computing power to local infrastructure at the "edge" of the network, rather than relying on distant, centralized data centers.
A whopping 90% of industrial enterprises will use edge computing technology by 2022, according to Frost & Sullivan, while a recent IDC report (registration required) found that 40% of all organizations will invest in edge computing over the next year. “Edge computing is necessary
to enable the next-generation industrial revolution,” says Bike Xie, vice president of engineering at AI technology vendor Kneron. The future of AI and other automation technologies depends on the decentralized edge, he explains, whether it is
by connecting internet-of-things and other devices to distributed network nodes or implementing AI-enabled chips that can build algorithmic models autonomously.
“Edge computing is complementary to the cloud,” Xie says. “Like cloud, edge technology enables applications manufacturers need to both gain and apply the data-driven knowledge that will power smart factories and products.”
Manufacturing moves to the edge
The move toward edge computing is the result of a sea change in manufacturing over the past two decades. Manufacturers, whether they make industrial products, electronic equipment, or consumer goods, have transitioned slowly but steadily to increased
automation and self-monitoring of systems and processes to drive greater efficiency in producing products, maintaining equipment, and optimizing every link in the supply chain.
As manufacturers implement more sensor-based, automation-driven devices, they also produce more data than ever before. But the streams of data flowing from sensor-based devices to centralized systems can quickly grow unwieldy, slowing down automation and making real-time analysis difficult.
Edge computing allows manufacturers to make flexible choices about processing data to eliminate time lags and decrease bandwidth use, as well as about which data can be destroyed right after it is processed, says Xie. “Manufacturers can process data quickly
at the edge if data transmission to the cloud is a bottleneck, or move certain data to the cloud if latency and bandwidth are not an issue.” Not only does processing data closer to where it’s used save bandwidth and reduce costs, he adds,
but data is more secure because it’s processed right away.
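The edge-or-cloud decision Xie describes can be sketched as a simple routing policy. The thresholds, record fields, and function names below are illustrative assumptions for this example, not part of any vendor's API:

```python
# Minimal sketch of an edge-vs-cloud routing policy (illustrative only).
# All thresholds and record fields are assumptions made for the example.

LATENCY_SENSITIVE_MS = 10   # deadlines tighter than this are processed at the edge
CLOUD_UPLINK_BUSY = 0.8     # treat the cloud uplink as a bottleneck above 80% use

def route(record, uplink_utilization):
    """Return where to process a record and whether to keep it afterward."""
    deadline = record["deadline_ms"]
    # Tight deadlines, or a saturated uplink, mean processing locally.
    if deadline < LATENCY_SENSITIVE_MS or uplink_utilization > CLOUD_UPLINK_BUSY:
        destination = "edge"
    else:
        destination = "cloud"
    # Transient sensor readings can be discarded right after processing;
    # only records flagged for retention need long-term storage.
    keep = record.get("retain", False)
    return destination, keep

print(route({"deadline_ms": 5}, 0.3))                    # tight deadline -> edge
print(route({"deadline_ms": 200, "retain": True}, 0.3))  # relaxed -> cloud, kept
```

The same policy also captures the bandwidth case: even a latency-tolerant record stays at the edge when the uplink to the cloud is saturated.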
IDC predicts that by 2023 more than 50% of new enterprise IT infrastructure deployed will be at the edge rather than in corporate data centers, up from less than 10% in 2020.
An example of toggling from cloud to edge comes from Paul Savill, senior vice president for product management and services at Lumen, a technology company that offers an edge computing platform.
Lumen recently did an installation at a newly built, million-square-foot factory. Robotic systems from about 50 different manufacturers relied on edge computing "because they needed to be within 5 milliseconds of latency to accurately control the robotics," Savill says.
The deployment provides secure connectivity from the edge applications to the robotics manufacturers’ data centers, “where they collect information on a real-time basis.”
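A latency budget like the 5 milliseconds Savill cites is typically verified empirically before placing a workload. The following is a minimal sketch under assumed, simulated measurements; the sample values and function names are hypothetical:

```python
# Sketch of checking a control-loop latency budget (illustrative only).
import time

LATENCY_BUDGET_MS = 5.0  # assumed worst-case budget, per the robotics example

def round_trip_ms(call):
    """Time one request/response cycle in milliseconds."""
    start = time.perf_counter()
    call()
    return (time.perf_counter() - start) * 1000.0

def meets_budget(samples_ms, budget_ms=LATENCY_BUDGET_MS):
    """A placement is usable only if its worst observed latency fits the budget."""
    return max(samples_ms) <= budget_ms

# Simulated round-trip measurements: a nearby edge node vs. a distant cloud region.
edge_samples = [1.2, 1.8, 2.4]     # hypothetical values, in ms
cloud_samples = [38.0, 41.5, 45.2]

print(meets_budget(edge_samples))   # True  -> the control loop can run here
print(meets_budget(cloud_samples))  # False -> too far away for robotics control
```

Judging by the worst observed sample rather than the average is the conservative choice for a control loop, where a single late response can cause a fault.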
But long-term data storage and machine-learning and analytics applications all go in the public cloud, says Savill. Other, larger workloads are processed in big data centers "with vast computational power" that can process enormous amounts of data quickly.
“That chain from the public cloud to the edge compute to on-premises is very important,” says Savill. “It gives customers the ability to leverage the latest advanced technologies in a way that saves them money and drives tremendous efficiency.”