The company is releasing Developer Preview 2 (DP2) of Android Things, which brings new features and bug fixes to the fledgling IoT platform. Most eye-catching is the added support for USB audio in the Hardware Abstraction Layer (HAL) for the Raspberry Pi 3 and Intel Edison.
The idea is that developers can build Android-friendly smart devices using familiar Android APIs and Google services: specifically, tools such as the Android Studio IDE, the Android Software Development Kit (SDK), Google Play Services, and the Google Cloud Platform.
The company writes:
Android Things supports a System-on-Module (SoM) architecture, where a core computing module can be initially used with development boards and then easily scaled to large production runs with custom designs, while continuing to use the same Board Support Package (BSP) from Google.
Thanks to great developer feedback from our Developer Preview 1, we have now added support for USB Audio to the Hardware Abstraction Layer (HAL) for Intel Edison and Raspberry Pi 3. NXP Pico already contains direct support for audio on device. We have also resolved many bugs related to Peripheral I/O (PIO). Other feature requests such as Bluetooth support are known issues, and the team is actively working to fix these. We have added support for the Intel Joule platform, which offers the most computing power in our lineup to date.
Native I/O and user drivers
Other features of DP2 include support for the standard Android NDK for native C++ development. Google has also released a library providing native access to the Peripheral I/O (PIO) API, along with a sample and documentation.
There is also support for user drivers.
Developers can create a user driver in their APK, and then bind it to the framework. For example, your driver code could read a GPIO pin and trigger a regular Android KeyEvent, or read in an external GPS via a serial port and feed this into the Android location APIs. This allows any application to inject hardware events into the framework, without customizing the Linux kernel or HAL. We maintain a repository of user drivers for a variety of common hardware interfaces such as sensors, buttons, and displays. Developers are also able to create their own drivers and share them with the community.
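In practice, the button example from the quote can be wired up with very little code using the ButtonInputDriver from Google's contrib drivers repository. The sketch below is illustrative rather than definitive: the pin name "BCM21" (a Raspberry Pi 3 GPIO) and the choice of KEYCODE_SPACE are assumptions, and would need adjusting for a different board or use case.

```java
import android.app.Activity;
import android.os.Bundle;
import android.view.KeyEvent;

import com.google.android.things.contrib.driver.button.Button;
import com.google.android.things.contrib.driver.button.ButtonInputDriver;

import java.io.IOException;

public class ButtonActivity extends Activity {
    private ButtonInputDriver mButtonInputDriver;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        try {
            // Bind a physical button on a GPIO pin to a standard Android key event.
            // "BCM21" is a Raspberry Pi 3 pin name; adjust for your board.
            mButtonInputDriver = new ButtonInputDriver(
                    "BCM21",
                    Button.LogicState.PRESSED_WHEN_LOW,
                    KeyEvent.KEYCODE_SPACE);
            mButtonInputDriver.register();
        } catch (IOException e) {
            // The GPIO pin could not be opened.
        }
    }

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        // Once registered, the framework delivers presses as ordinary KeyEvents,
        // so any application can consume them without kernel or HAL changes.
        if (keyCode == KeyEvent.KEYCODE_SPACE) {
            return true;
        }
        return super.onKeyDown(keyCode, event);
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        if (mButtonInputDriver != null) {
            mButtonInputDriver.unregister();
            try {
                mButtonInputDriver.close();
            } catch (IOException e) {
                // Ignore errors on shutdown.
            }
        }
    }
}
```

This is exactly the pattern described above: the driver lives inside the APK, registers with the framework, and injects hardware events through the standard input pipeline.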
One of the interesting aspects of Android Things is its aim to support machine learning and computer vision.
Pictured right is one sample application:
We have created a highly requested sample that shows how to use TensorFlow on Android Things devices. This sample demonstrates accessing the camera, performing object recognition and image classification, and speaking out the results using text-to-speech (TTS). An early-access TensorFlow inference library prebuilt for ARM and x86 is provided for you to easily add TensorFlow to any Android app with just a single line in your build.gradle file.
The TensorFlow sample identifies a dog’s breed (American Staffordshire terrier) on a Raspberry Pi 3 with a camera.
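The "single line" mentioned in the quote is a Gradle dependency. The exact coordinates for the early-access build were distributed with the sample and may have differed; the snippet below assumes the Maven artifact the TensorFlow Android inference library was later published under.

```groovy
dependencies {
    // Illustrative: the early-access prebuilt library shipped with the sample;
    // the inference library was later available on Maven as
    // org.tensorflow:tensorflow-android.
    compile 'org.tensorflow:tensorflow-android:+'
}
```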