Android Things can serve up M&M candies, and recognize dogs
While Project Brillo was based on Android, feedback from developers who were uncertain about which SDKs and APIs could be used prompted Google to shelve it and create Android Things.
Android Things currently works on several low-power computer boards: the NXP Pico, NXP Argon, Intel Edison, Intel Joule, and Raspberry Pi 3, meaning that a good mix of Intel and ARM-based CPUs is supported (32 and 64 bit). Those who follow these lines of gear will notice that none has less than 512MB of RAM, and all have Bluetooth and Wi-Fi connectivity.
Google is no stranger to IoT; Nest is a testament to that. While Android Things is still very much in a preview phase of sorts, progress continues, and it is leveraging bits of the AI that Google has been showcasing.
We got to see a couple of small examples of that in action at Google I/O 2017. One set-up showed a connected camera that could identify what it was looking at. They had several examples set up, but hearing Android Things respond with more than a simple “dog” is pretty impressive.
Another set-up could recognize whether or not you smiled for the camera, then offer you some M&M candies. Our experimental attendee, Morgane Lustman, a developer who specializes in machine learning, wasn’t ready for the M&Ms to come so quickly, but it shows perfectly how Android Things will be able to facilitate simple transactions with specific types of facial recognition (like smiling).
We are looking forward to seeing where Android Things takes us at next year’s Google I/O.