The device was created by DeepLocal, built using a development kit from Technexion, and is powered by Android Things, Google’s IoT platform, running on an NXP Pico Pi i.MX7D board.
In short, it draws a portrait from a photograph it captures. Check out the Hackster.io project for all the source code, schematics, and 3D models.
There are six options when it comes to hardware support for Android Things: the Raspberry Pi 3 Model B, the NXP i.MX7D, the NXP i.MX8M, the Qualcomm SDA212 and SDA624, and the MediaTek MT8516 hardware platforms. Support for NXP i.MX6UL devices has since been discontinued, as was support for Intel Edison back in 2017.
Drawbot on NXP Pico i.MX7D
The project in question runs on an NXP Pico i.MX7D System-on-Module, and the bill of materials is estimated at $160 (plus an Android Things starter kit, £24 for the Rainbow HAT for a Raspberry Pi 3).
Rather optimistically, I think, the build time is estimated as five hours, or “a rainy afternoon”.
Oscar Prom and Robert Rudolph, of Deeplocal, write:
Our DrawBot is built like a sandwich. The top and bottom plates are cut from plywood, and everything else sits inside. Each wheel is driven by its own stepper motor. Two ball casters under the front and back of the robot keep it balanced.
The DrawBot draws using a brush tip marker. A servo motor raises and lowers the marker to draw lines of different thickness on the paper. The DrawBot can turn in place around the marker, and can drive forward in straight lines, just like the original Logo Turtle programming game! Using only basic commands (TURN and FORWARD), we’re able to generate complex drawings.
We use two USB power banks to power our DrawBot. One battery powers the Pico.iMX7 board, which runs Android Things. The other battery powers the motors.
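The turtle-style command model the authors describe can be sketched in a few lines. The following is a minimal, hypothetical illustration (not DeepLocal’s actual code; the class and method names are invented) of how TURN and FORWARD updates to a heading and position are enough to trace shapes on paper:

```java
// Minimal sketch of a Logo-style turtle: TURN changes the heading,
// FORWARD moves the pen along it. Names here are illustrative only.
public class TurtleSketch {
    double x = 0, y = 0;      // pen position
    double headingDeg = 0;    // direction the robot faces, in degrees

    void turn(double degrees) {
        headingDeg += degrees;
    }

    void forward(double distance) {
        double rad = Math.toRadians(headingDeg);
        x += distance * Math.cos(rad);
        y += distance * Math.sin(rad);
    }

    public static void main(String[] args) {
        TurtleSketch t = new TurtleSketch();
        // Four FORWARD/TURN pairs trace a square and return the pen home.
        for (int i = 0; i < 4; i++) {
            t.forward(10);
            t.turn(90);
        }
        System.out.printf("pen at x=%.1f y=%.1f%n", t.x, t.y);
    }
}
```

On the real robot, of course, FORWARD drives both stepper motors together and TURN rotates the wheels in opposite directions so the robot pivots around the marker; the geometry above is the same either way.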
Software
When it comes to the firmware, you are in the territory of Android Studio, using OpenCV for facial detection and photo manipulation, and Android Things hardware drivers to control the button, LED, drive motors, and pen servo.
We are told there are four steps:
- Flash Android Things onto the NXP development board using this guide.
- Clone the software repository onto your computer.
- Import the project into Android Studio. (File > New > Import Project)
- Deploy the application. (Run > Run ‘app’)
Note that you will then need to calibrate the DrawBot before it can actually generate pictures.
HandBot
There’s also a HandBot you can build, if you are really interested!
It’s a robotic hand that learns and reacts to hand gestures; Google describes it as visually recognising gestures and applying machine learning.