TinyML could democratize AI programming for IoT
Much of the development in machine learning (ML) implementations follows a “bigger is better” path: more data, more storage, more compute power, more bandwidth all equal better results – or so the mantra goes.
But a community-based effort is moving in the opposite direction, building “TinyML” implementations for use on low-powered devices with scarce compute and memory resources — like the millions of sensors deployed in Internet of Things (IoT) implementations.
TinyML downsizes the technology to deploy neural networks on low-cost microcontrollers, where they can operate fully or semi-independently on long-lasting, low-power batteries. While semiconductor companies have created chipsets to exploit this technology, proponents say TinyML software can also be deployed to existing microcontrollers already in the field.
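To give a flavor of the downsizing involved, the sketch below shows symmetric 8-bit weight quantization, a technique commonly used to shrink neural networks so they fit in microcontroller memory. This is an illustrative example only, not code from any specific TinyML toolchain; all function names are hypothetical.

```python
# Illustrative sketch only: symmetric 8-bit quantization of the kind TinyML
# toolchains apply so model weights fit in a microcontroller's memory budget.
# Hypothetical helper names, not any specific TinyML library's API.

def quantize_int8(weights):
    """Map float weights to int8 values plus a single scale factor."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127.0 if max_abs else 1.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return [v * scale for v in q]

if __name__ == "__main__":
    weights = [0.42, -1.27, 0.05, 0.91, -0.33]
    q, scale = quantize_int8(weights)
    # Each int8 weight occupies 1 byte instead of 4 bytes for float32,
    # a 4x reduction before any further compression.
    print(q, scale)
    print(dequantize(q, scale))
```

The trade-off is a small loss of precision (each weight is off by at most half a scale step) in exchange for a roughly fourfold reduction in model size and cheaper integer arithmetic on the device.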
The TinyML effort coalesced in early 2019 at a meeting convened to formalize the fast-growing movement, which drew nearly 200 attendees and led to the formation of the tinyML Foundation.
“We are really exploring machine learning for low-power inexpensive applications, rather than big machine learning algorithms running in a data center,” said Zach Shelby, then an executive with the Arm Ltd. semiconductor and software design company, who left to co-found Edge Impulse, which provides TinyML developer resources.
Making Sense of Noisy Sensor Data With TinyML for IoT
TinyML started as a hashtag from Pete Warden of Google, one of the proponents of the movement. In a book he co-authored for TinyML developers, Warden wrote, “It became clear to me that there was a whole new class of products emerging, with the key characteristics that they used ML to make sense of noisy sensor data, could run using a battery or energy harvesting for years, and cost only a dollar or two.”
To read the complete article, visit IoT World Today.