Jan Jongboom
Make your IoT device feel, hear and see things with TinyML (2020)
Status: Available Now
Many IoT devices are very simple: just a radio sending raw sensor values to the cloud. But this limits the usefulness of a deployment. A sensor can report that it saw movement in front of it, but not what it saw. Or a sensor might notice that it's being moved around, but not whether it's attached to a vehicle or just being carried. The reason is simple: to know what is happening in the real world you need lots of data, and sending all that data over your IoT network quickly drains your battery and racks up your network bill.
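To make the bandwidth argument concrete, here is a minimal back-of-the-envelope sketch. The sample rate, resolution, and event counts below are illustrative assumptions, not figures from the talk:

```c
#include <stdio.h>

/* Back-of-the-envelope comparison: streaming raw sensor data vs.
 * sending only on-device classification results. All numbers are
 * illustrative assumptions. */
int main(void) {
    /* Assumed: 3-axis accelerometer, 100 Hz, 16-bit samples. */
    const double raw_bytes_per_sec = 3 * 100 * 2;            /* 600 B/s */
    const double raw_bytes_per_day = raw_bytes_per_sec * 86400;

    /* Assumed: 50 classified events per day, ~8-byte payload each. */
    const double event_bytes_per_day = 50 * 8;

    printf("Raw streaming: %.1f MB/day\n", raw_bytes_per_day / 1e6);
    printf("Events only:   %.0f B/day\n", event_bytes_per_day);
    printf("Reduction:     ~%.0fx\n", raw_bytes_per_day / event_bytes_per_day);
    return 0;
}
```

Even with these modest assumptions, classifying on device and transmitting only the result cuts the payload by several orders of magnitude.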
How can we do better? In this talk we'll look at ways to draw conclusions from raw sensor data right on the device, from signal processing to running neural networks at the edge. It's time to add some brains to your IoT deployment. In this talk you'll learn:
- What is TinyML, and how can your sensors benefit from it?
- How signal processing can help make your TinyML deployment more predictable and better performing (see the sketch after this list).
- How you can start making your devices feel, hear and see things - all running in real time on Cortex-M-class devices.
- Hands-on demonstrations: from initial data capture from real devices, through building and verifying TinyML models, to deployment on device.
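As a sketch of the kind of signal processing the second bullet refers to, the C example below extracts two cheap window features (RMS energy and zero-crossing rate) from raw accelerometer samples before they would reach a classifier. The window size, feature choice, and synthetic input are illustrative assumptions, not the talk's actual pipeline:

```c
#include <math.h>
#include <stddef.h>
#include <stdio.h>

/* Assumed window: 2 seconds of one accelerometer axis at 100 Hz. */
#define WINDOW_SIZE 200

/* Root-mean-square energy of the window: a cheap measure of how
 * intense the motion is. */
static float rms(const float *x, size_t n) {
    float sum = 0.0f;
    for (size_t i = 0; i < n; i++) sum += x[i] * x[i];
    return sqrtf(sum / (float)n);
}

/* Zero-crossing rate: a cheap proxy for dominant frequency, useful
 * for separating e.g. walking from riding in a vehicle. */
static float zero_crossing_rate(const float *x, size_t n) {
    size_t crossings = 0;
    for (size_t i = 1; i < n; i++)
        if ((x[i - 1] < 0.0f) != (x[i] < 0.0f)) crossings++;
    return (float)crossings / (float)(n - 1);
}

int main(void) {
    /* Synthetic stand-in for a real sensor buffer: a 5 Hz sine wave. */
    float window[WINDOW_SIZE];
    for (size_t i = 0; i < WINDOW_SIZE; i++)
        window[i] = sinf(2.0f * 3.14159265f * 5.0f * (float)i / 100.0f);

    /* These two numbers, not the 200 raw samples, would be fed to a
     * classifier (or transmitted), shrinking the data ~100-fold. */
    printf("rms = %.3f, zcr = %.3f\n",
           rms(window, WINDOW_SIZE),
           zero_crossing_rate(window, WINDOW_SIZE));
    return 0;
}
```

Compact, hand-picked features like these keep the downstream model small and its behavior predictable, which is what makes them attractive on Cortex-M-class hardware.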
Live Q&A: Make your IoT device feel, hear and see things with TinyML (2020)
Status: Available Now
Live Q&A with Jan Jongboom following his talk titled 'Make your IoT device feel, hear and see things with TinyML'.