Getting Started with Sipeed M1 based Maixduino Board & Grove AI HAT for Raspberry Pi

Last year we discovered the Kendryte K210 processor, a chip with a RISC-V core featuring AI accelerators for machine vision and machine hearing. Soon after, the Sipeed M1 module was launched with the processor for around $10.

Then this year we started to get more convenient development boards featuring the Sipeed M1 module, such as Maixduino or the Grove AI HAT. Seeed Studio sent me those two boards for review. So I’ll start by showing the items I received, before showing how to get started with MicroPython and Arduino code. Note that I’ll be using Ubuntu 18.04, but development in Windows is also possible.

Unboxing

I received two packages: one with a Maixduino kit, and the other with the “Grove AI HAT for Edge Computing”.

Maixduino & Grove AI HAT

Grove AI HAT for Edge Computing

Let’s start with the second one. The board is a Raspberry Pi HAT with the Sipeed M1 module, a 40-pin Raspberry Pi header, six Grove connectors, as well as connectors for a camera and a display. The USB-C port is used for programming, and for power in standalone mode.

Grove-AI-Hat Board
Click to Enlarge

There are also pin headers for the board, but I failed to see them during the unboxing. We’ll see them later when I insert them into the board.

Grove AI Hat for Edge Computing

The bottom side does not have much except numbering for the I/O header pins, and another mount point for the camera.

grove AI kit camera

Later on I received the camera for the board as well.

Maixduino kit

The second package includes a Maixduino Arduino compatible board with Sipeed M1 module and an ESP32 module. The camera is already attached to the board, and a 2.4″ color display (QVGA resolution) is also part of the package.

Maixduino kit with camera, LCD display
Click to Enlarge

The bottom side of the board just includes silkscreen for the header pin names, as well as a block diagram for the board.

Maixduino board block diagram

Getting Started with Maixduino Kit using MicroPython

I’ll start with Maixduino first, trying to follow some of the instructions from the Wiki, complemented by other resources from the Internet, since the Wiki is somewhat incomplete or inaccurate at times.

There are 4 methods of development / SDKs:

  • MaixPy (MicroPython)
  • Maixduino (Arduino)
  • Kendryte SDK – the official Kendryte SDK, providing the basic APIs
  • RT_Thread – RT-Thread support

I’ll use the first one for the purpose of this tutorial / getting started guide.

But first let’s connect a USB type-C cable and see what happens.

The board does start with “Welcome to MaixPy” written over a red background.

Firmware upgrade

Let’s upgrade the firmware as per the Wiki instructions. We could either build the firmware from source or get a binary.

I just went with the latest version available the first time I tried.


There are four files in the directory:

  • elf_maixpy_v0.4.0_39_g083e0cc.7z – elf file, ordinary users do not care, used for crash debugging
  • maixpy_v0.4.0_39_g083e0cc_m5stickv.bin – For M5Stack’s M5StickV AI camera
  • maixpy_v0.4.0_39_g083e0cc_minimum.bin – MaixPy firmware minimum set, not supported by MaixPy IDE, does not contain OpenMV related algorithms
  • maixpy_v0.4.0_39_g083e0cc.bin – Full version of MaixPy firmware (MicroPython + OpenMV API + LittlevGL embedded GUI framework)

I’ll use the latter, but before going further, let’s add ourselves to the dialout group and set some udev rules to be able to access /dev/ttyUSB? as a normal user:


Create an /etc/udev/rules.d/50-usbuart.rules file with the following line:
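I don’t have the article’s exact rule handy, but a broad rule along these lines grants normal users read/write access to USB serial adapters (you may want to narrow the match to your adapter’s idVendor/idProduct):

```
# /etc/udev/rules.d/50-usbuart.rules
# Broad match: makes every USB-to-UART device node world read/write
KERNEL=="ttyUSB[0-9]*", MODE="0666"
```

Reload the rules (udevadm control --reload-rules) or replug the board for the change to take effect.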


Now download the latest kflash_gui program for your operating system (Linux or Windows). I got version 1.5.3, extracted it, and started it:


kflash_gui

Now we can open the firmware file, and select Sipeed Maixduino, as well as the serial port. To find the latter, I looked into the kernel log (dmesg):


I looked a bit too fast and only noticed the last line with ttyUSB1, so I got some errors…

greeting fail check serial port maixduino

I also tried with the command-line Python utility, with the same results:


I asked Seeed Studio, and I was told to use the other port, and it worked!

kflash_gui maixduino

Looking at the kernel log above, I can actually see both ttyUSB0 and ttyUSB1 assigned to Maixduino, and with hindsight, that’s probably because we can access both ESP32 and Kendryte K210 processors over different serial ports through a single USB cable.
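On a Linux host, a quick way to double-check which USB serial device nodes are currently present (a generic check, not specific to Maixduino) is a one-liner like this:

```python
import glob

# USB-to-UART adapters show up as /dev/ttyUSB0, /dev/ttyUSB1, ...
# With Maixduino connected you should see two entries (ESP32 + K210).
ports = sorted(glob.glob("/dev/ttyUSB*"))
print(ports)
```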

However, when I connected to /dev/ttyUSB0 via minicom (115200 8N1), there seemed to be an issue with the firmware:


The MaixPy 0.4.0 firmware I had just used is a pre-release, so I switched to MaixPy 0.3.2, which is considered an actual stable release. This time I used the command line tool:


Let’s see the output from the serial console:


All good. Let’s try to get more information with help():


This gives a list of modules, and we can run help(‘<module_name>’) to get a list of functions for each module. That’s not the most user-friendly way to learn, so let’s get back to the Wiki, and check the sample to capture a photo and show it on the display:
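A sketch of that sample, based on the MaixPy documentation (this is board-side code, so the article’s exact listing may differ slightly):

```python
import sensor, lcd

lcd.init()                           # initialize the attached LCD
sensor.reset()                       # initialize the camera
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)    # 320x240, matching the display
sensor.run(1)                        # start the camera

while True:
    img = sensor.snapshot()          # capture one frame
    lcd.display(img)                 # push it to the display
```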


Actual output in terminal:


We can indeed see the camera output on the attached display.

Maixduino Camera Display

As your program grows, the REPL interface where you type commands in the serial console becomes impractical. But the good news is that the company also designed the MaixPy IDE to more conveniently write and load code to the board.

I wanted to use the latest MaixPy IDE 0.2.4, which requires firmware 0.4.0 or greater. I had no luck with 0.4.0_39, but I could see a new release had just been built on August 16th, so I flashed it:


This time, the board could boot. Good. Time to install and launch MaixPy IDE in Ubuntu 18.04:


The last line launches the IDE with the default hello world program.

MaixPy IDE
Click to Enlarge

We should now select Maixduino in Tools->Select Board:

MaixPy IDE Maixduino

Click on the green chain link icon on the bottom left of the IDE and select /dev/ttyUSB0 to connect to the board, and once it’s done, press the “Play” button underneath to load and run the Hello World sample on Maixduino.

MaixPy IDE Hello World
Click to Enlarge

The top right zone shows the actual frame buffer, and the three zones underneath show the R, G & B histograms.

That’s all good, but the highlight of the K210 processor is its KPU AI / machine vision accelerator. There’s documentation to help us use it.

First let’s download the models:


There are three:


“Face model” is a pre-trained tiny Yolo-v2 model to detect (human) faces, and the other two are MobileNet models. I’ll run the Yolo-v2 demo.

load face model maixduino

We can get back to kflash_gui, load the model, and click Download to flash it to the board.

Now we can write (copy/paste) Yolo2 code in the IDE:


The first part, up until sensor.run(1), sets up the camera as we’ve done in the previous example, and the KPU part is initialized with three lines:
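Based on the MaixPy KPU documentation, those three lines look roughly like this (the anchor values below are the ones I recall from the face-detection demo, so double-check them against the Wiki):

```python
import KPU as kpu                    # MaixPy's KPU module (board-side)

task = kpu.load(0x300000)            # load the model flashed at 0x300000
anchor = (1.889, 2.5245, 2.9465, 3.94056, 3.99987,
          5.3658, 5.155437, 6.92275, 6.718375, 9.01025)
kpu.init_yolo2(task, 0.5, 0.3, 5, anchor)   # threshold, nms_value, anchor_num, anchor
```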


The first line loads the model we’ve just flashed at address 0x300000. The second line defines the anchors for Yolo V2. Those are predefined widths & heights for Yolo “boxes”. More details about what Yolo anchors mean can be found on Github. The third line initializes Yolo on the KPU with five parameters:

  • kpu_net: kpu network object
  • threshold: probability threshold
  • nms_value: box_iou threshold
  • anchor_num: number of anchors
  • anchor: anchor parameters previously defined
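To make the nms_value (box IoU) threshold more concrete, here is a small plain-Python illustration of intersection-over-union — my own sketch, not MaixPy code:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x, y, w, h)."""
    ix = max(0, min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1]))
    inter = ix * iy
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union else 0.0

# Two heavily overlapping detections: IoU is about 0.68, above a 0.3
# nms_value, so non-maximum suppression keeps only the stronger box.
print(round(iou((10, 10, 100, 100), (20, 20, 100, 100)), 2))
```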

The while loop is pretty much self-explanatory: capture image, run yolo, if a face is detected draw a rectangle, push the result to the display, and loop again.
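Sketched against the MaixPy API (assuming the task object and camera setup from the Wiki sample), the loop looks like this:

```python
while True:
    img = sensor.snapshot()                  # capture a frame
    objects = kpu.run_yolo2(task, img)       # run face detection on the KPU
    if objects:
        for obj in objects:
            img.draw_rectangle(obj.rect())   # box each detected face
    lcd.display(img)                         # push the result to the display
```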

But does it actually work?

maixduino face detection
Click to Enlarge

Yes, it does. Note that recognition is done at around 11 fps in the IDE, but it will be faster when running standalone. So I’ve added code I found in Tiziano Fiorenzani’s MaixPy Dock demo to show the fps in the top left of the display:
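The fps overlay boils down to timing each loop iteration; in plain Python the logic looks something like this (the board code uses MaixPy’s clock helpers instead, so this is just an illustration):

```python
import time

class FPS:
    """Estimate frames per second by timing successive tick() calls."""
    def __init__(self):
        self.last = time.monotonic()

    def tick(self):
        # Return the instantaneous FPS since the previous call.
        now = time.monotonic()
        dt, self.last = now - self.last, now
        return 1.0 / dt if dt > 0 else 0.0
```

On the board, you would call tick() once per captured frame and draw the returned value onto the image before sending it to the display.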


I’ve then asked an associate to help test multiple-face detection, and shot a video.

Face detection/tracking is done at around 15 to 18 fps, and my panda friend did not get detected, but it’s probably more of a feature than a bug, since the model must have been trained with human faces. In earlier attempts, the panda would get detected from time to time, but not quite as reliably as my face.

To go further, you may want to use ESP32 features like WiFi and/or Bluetooth connectivity, but sadly there’s no documentation for it. I tried my ESP32 MicroPython tutorial to set up WiFi, but it did not work:


I guess that’s normal since I’m running this on the K210 instead of the ESP32, although I was hoping they might have adapted it. Looking at the help output, there are only a few functions implemented:


It’s not possible to use /dev/ttyUSB1 to connect to the ESP32 either, so it may require a bit more work…

If you want to train and run your own models on Maixduino using Keras, a forum post provides the guidelines to do just that.

Getting Started with Grove AI HAT

Standalone mode

Let’s switch to the other board: Grove AI HAT which works both in standalone mode or connected to a Raspberry Pi. I’ll mostly follow the Wiki that shows how to use the board in standalone mode with the Arduino IDE, skipping the GPIO / Grove stuff, and jumping directly to the camera, display, and computer vision section.

I’ve connected the fisheye camera, inserted the pin headers, and borrowed the display from the Maixduino kit. Note that you could also use the Maixduino camera with the Grove AI HAT; I’ve tried, but noticed face detection and tracking did not work as well.

Grove AI HAT Camera Display

I’ll assume you have already installed the Arduino IDE on your host computer. For reference, I’m using Arduino 1.8.9 in Ubuntu 18.04.

First we need to go to File->Preferences, and edit the “Additional Boards Manager URLs” with the following URL to add support for Seeed Studio boards:


seeeduino boards url arduino
Click to Enlarge

Please also take note of the path mentioned under “More preferences can be edited directly in the file”, since we’ll need to access this directory later on.

Now go to Tools->Board->Boards Manager and search for “k210”. Click Install when you see “Grove AI HAT for Edge Computing by Seeed Studio”.

Arduino Boards Manager K210
Click to Enlarge

After a little while you should be good, and can select Seeed K210 Pi, and make sure “k-flash” programmer and /dev/ttyUSB0 are selected as illustrated below.

Arduino K210 Pi & k-flash programmer
Click to Enlarge

Now install the Kendryte standalone SDK in your working directory:


The model file needs to be manually flashed to the board (that’s where we use the path in Preferences as mentioned above):


Here’s the output of the last command if everything goes according to plan:


Now go back to your working directory and create a folder and empty sketch with the same name, in our case “face_detect”:


Copy the content of face_detect from the SDK:


Now open face_detect.ino in Arduino IDE. This will open a bunch of other files automatically, including board_config.h which we need to modify to enable OV2640 sensor and LICHEEDAN board as illustrated below.

Arduino OV2640 camera and LICHEEDAN Board

You can go to main.c to study the code, but it will require a bit more effort to understand than Maixduino’s MicroPython sample. If you’d just like to try it out, you can check whether it compiles. I had a few warnings due to unused variables:


But it compiled fine:


We can now upload the sketch right from the IDE:


The board will now capture video and show a red rectangle when a human face is detected, in a similar fashion to the Maixduino MicroPython demo.

The display was upside-down though. A small change to the code fixed that (see line 5 below):


Grove AI HAT Face Detection

That’s basically like the Maixduino demo above, except we did it in the Arduino IDE instead of MicroPython.

Grove AI HAT and Raspberry Pi 4

Since we have a HAT board, it would be a shame not to also use it with a Raspberry Pi board. So I inserted the board into a Raspberry Pi 4 SBC, after disconnecting the power supply, obviously. Talking about power, please make sure the RPI 5V jumper on the top of the Grove AI HAT board is set to ON.

Grove AI HAT connected to Raspberry Pi 4

One of the first things I thought would be interesting was to develop code directly on the Raspberry Pi 4. So I installed Arduino 1.8.9 for Arm 32-bit, and loaded Seeed Studio board URL based on the instructions above.

Raspberry Pi 4 Arduino IDE
Click to Enlarge

But when I tried to install “Grove AI HAT for Edge Computing by Seeed Studio” I quickly encountered a roadblock:

tool kflash not available for arm

The message above is “Tool kflash is not available for your operating system”, but I also got “Tool kendryte-standalone-sdk is not available for your operating system”. It should not be surprising that Kendryte K210 tools are not (yet) available for Arm.

As a side note, I’m using a Raspberry Pi 4 with 1GB of RAM, and I quickly found out it was not a good idea to run Chromium, even with just one tab, and the Arduino IDE side-by-side, as you quickly run out of memory, and it even completely filled the 512MB swap on the microSD card. Even with ZRAM there’s no hope, so if you ever considered running the Arduino IDE on Raspberry Pi 4 (e.g. for Arduino boards) while also checking out documentation, do yourself a favor and get the 2GB or 4GB RAM version.

Nevertheless, another way to use the Grove AI HAT with a Raspberry Pi is to display data returned by the expansion board, as Seeed Studio explains in their face count demo tutorial. Let’s try it.

We’re off to a good start since we already set up the Arduino IDE above, and installed the face demo firmware. Now we can just install the Face Count program on the Raspberry Pi:


However, when I run it, a segfault occurs:


Let’s have a look at setup.sh:


We can see the Grove AI HAT communicates with the Raspberry Pi board over SPI, and the script enables it by modifying the /boot/config.txt file. So I decided to reboot, but the problem remained. The installer directory contains many binary libraries which were tested on a Raspberry Pi 3 B+ with Debian Wheezy, so switching to a Raspberry Pi 4 with Debian Buster probably broke things. I’ve let Seeed Studio know about it, and they are looking into it.
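For reference, enabling SPI in /boot/config.txt is normally a one-line change — this is the standard Raspberry Pi setting, so an assumption about what setup.sh actually writes:

```
# /boot/config.txt — enable the SPI interface used by the Grove AI HAT
dtparam=spi=on
```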

Let’s complete the review by showing a video of the face detection demo running on Grove AI HAT, again joined by a few “friends”.

It works roughly as expected, with the demo filtering out most of my non-human friends, and catching up to 3-4 faces simultaneously from a photo.

Purchasing the Boards, Display and Cameras

I’d like to thank Seeed Studio for sending the boards and letting me experiment with the Kendryte K210 / Sipeed M1 platforms using MicroPython and Arduino. If you’d like to reproduce my setup, you can buy the Grove AI HAT for $28.90, the OV2640 fisheye camera for $7.60, and/or the Maixduino kit with the board, a display and a camera for $23.90.




8 Replies to “Getting Started with Sipeed M1 based Maixduino Board & Grove AI HAT for Raspberry Pi”

  1. Good to see the software support is getting there for this now. I have a bunch of sipeed stuff I haven’t touched since buying them a year or so ago.

  2. I can confirm that the latest firmware release (24/08/2019) maixpy_v0.4.0_46_gf46e4c4.bin works with the IDE AND the face detection example on the Maixduino board. So there is no need anymore to downgrade to v0.3.2.

  3. firmware: maixpy_v0.4.0_46 (latest)

    I tried to convert movies with ffmpeg, but they don’t work. Even their own supplied ‘badapple.avi’ movie fails with OSError: [Errno 22] EINVAL

    The lvgl library (graphics) is not there. When I try the examples, I get: ImportError: no module named ‘lvgl’

    Did any of these work before?

  4. Hello, Nice tutorials.
    I am using Arduino and trying to run a selfie example to capture the image. However, I am getting a snap fail in the serial window and nothing on the LCD.
    I have to capture an image for my project, but can’t seem to do so. I can’t find any solutions anywhere, and I also contacted Sipeed but got no response from them.
    Any help will be appreciated.
    Thank you.

    1. Just to make sure there’s no issue with the hardware, does the face_demo sample used in the tutorial above work?

      I haven’t tried, but if you want to capture the image you may also need to insert a MicroSD card.

