IoT trends continue to push processing to the edge for artificial intelligence (AI)
As connected devices proliferate, new ways of processing have come to the fore to accommodate the explosion of devices and data.
For years, organizations have moved toward centralized, off-site processing architecture in the cloud and away from on-premises data centers. Cloud computing enabled startups to innovate and expand their businesses without requiring huge capital outlays on data center infrastructure or ongoing costs for IT management. It enabled large organizations to scale quickly and stay agile by using on-demand resources.
But as enterprises move toward remote work models, video-intensive communications and other demanding processes, they need an edge computing architecture that can handle these data-hogging tasks.
These data-intensive processes need to happen within fractions of a second: Think self-driving cars, video streaming or tracking shipping trucks in real time along their routes. Sending data on a round trip to the cloud and back to the device takes too much time. It can also add cost and compromise data in transit.
“Customers realize they don’t want to pass a lot of processing up to the cloud, so they’re thinking the edge is the real target,” said Markus Levy, head of AI technologies at NXP Semiconductors, in a piece on the rise of embedded AI.
In recent years, edge computing architecture has gained prominence to accommodate not only the proliferation of data and devices but also the velocity at which that data moves.
To read the complete article, visit IoT World Today.