BeagleY-AI SBC review with Debian 12, TensorFlow Lite, other AI demos


Today I’ll be reviewing the BeagleY-AI open-source single-board computer (SBC) developed by BeagleBoard.org for artificial intelligence applications. It is powered by a Texas Instruments AM67A SoC with a quad-core Arm Cortex-A53 processor running at 1.4 GHz for general-purpose tasks, along with an Arm Cortex-R5F core running at 800 MHz for device management and low-latency I/O operations. The SoC is also equipped with two C7x DSP units and a Matrix Multiply Accelerator (MMA) to enhance AI performance and accelerate deep learning tasks. Each C7x DSP delivers 2 TOPS, for a total of up to 4 TOPS. Additionally, it includes an Imagination BXS-4-64 graphics accelerator that provides 50 GFLOPS of performance for multimedia tasks such as video encoding and decoding. For more information, refer to our previous article on CNX Software or visit the manufacturer’s website.

BeagleY-AI unboxing

The BeagleY-AI board was shipped from India in a glossy-coated, printed corrugated cardboard box. Inside, the board is protected by foam padding for shock absorption with the BeagleY-AI placed in an additional plastic bag. The board comes with the antenna already attached, but no other accessories are included.

BeagleY-AI package.
Inside the BeagleY-AI package.
Ethernet and USB ports.
USB power port and HDMI port.

For comparison, I took photos of the BeagleY-AI SBC alongside other boards I have that share a similar form factor, including the Purple Pi OH and Raspberry Pi 4 Model B, as shown in the images below.

Purple Pi OH (left), Raspberry Pi 4 Model B (center), and BeagleY-AI (right).
BeagleY-AI (left), Raspberry Pi 4 Model B (center), and Purple Pi OH (right).

Debian 12 operating system installation

I installed the Debian 12 operating system by following the instructions in the Quick Start guide. The setup process is similar to that of a Raspberry Pi or any other SBC, requiring a microSD card to load the OS image. The manufacturer recommends a 32 GB microSD card, so I used a 32 GB SanDisk microSD card for this review.

The Quick Start guide suggests using either the bb-imager program or Balena Etcher to flash the OS. However, after downloading bb-imager, I found that the program did not run on my computer (Windows 10). I then switched to Balena Etcher, which worked without any problems.

To flash the operating system for the BeagleY-AI board, I started by downloading the OS image from the manufacturer’s website: https://www.beagleboard.org/distros/beagley-ai-debian-12-5-2024-06-19-xfce. The image file I used was approximately 1.4 GB. After downloading, I extracted the .xz file to a .img file, making it ready for flashing. Next, I opened Balena Etcher, selected the .img file, and clicked the ‘Flash!’ button. The flashing process took about 10 minutes, followed by another 10 minutes for validation, and it completed successfully.
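Extracting the .xz archive can also be done from the command line; the filename below is illustrative and depends on the release you download:

```shell
# -d decompresses, -k keeps the original .xz archive alongside the .img
xz -dk beagley-ai-debian-12.5-xfce.img.xz
```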

Preparing the microSD card with BalenaEtcher.
Flashing the operating system to the microSD card.

After flashing, I needed to edit the default username and password for logging into the OS. I did this by opening the sysconf.txt file in the BOOT partition and modifying the user_name and user_password fields to my desired credentials. Once saved, I inserted the microSD card into the BeagleY-AI board, and it was ready to use.
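For reference, the relevant entries in sysconf.txt look like this (the values here are examples of my own, not defaults):

```
user_name=beagle
user_password=FooBar
```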

Accessing the BeagleY-AI board

The manufacturer recommends three ways to use the BeagleY-AI board: direct USB tethering via a USB Type-C cable, a headless connection through the UART port, and a standalone setup with a monitor, mouse, and keyboard. They also recommend a 5V power supply capable of delivering at least 3A, especially when connecting peripherals such as cooling fans or under heavy load. Alternatively, if using a USB Type-C to Type-C cable, it should support a current of at least 1000 mA. For this review, I experimented with both USB tethering and the standalone setup with my BeagleY-AI sample, as detailed below.

Testing with USB tethering

I connected the BeagleY-AI board to my computer, and when it powered up, the red LED initially stayed on. After a moment, the LED changed to a pattern of two slow blinks followed by two quick blinks in green, indicating that the board was operational. Once the board was ready, I was able to connect to it via SSH through the virtual wired interface created by the board, which appeared as ‘Ethernet 2’ on my system. I then opened Windows PowerShell and connected to the BeagleY-AI over SSH with the username and password I had set earlier. The connection was established immediately. Below is an example of the htop output over the USB tethering connection.
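The connection step itself is a plain SSH login; a sketch of what I ran is below. The 192.168.7.2 address is the default BeagleBoard images assign to the USB network gadget, and the username is whatever was set in sysconf.txt — adjust both to your setup:

```shell
# SSH into the board over the USB virtual network interface
ssh beagle@192.168.7.2
```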

Connecting to BeagleY-AI via USB tethering.

Testing with a standalone connection

In this test, I needed to connect the BeagleY-AI to external devices. I used a micro HDMI to full-size HDMI cable to connect to a BenQ EL2870U monitor and a Logitech wireless keyboard and mouse. Additionally, I connected a LAN cable for Ethernet. For power, I used my smartphone’s power adapter, which supplies 5V at 3A. Once powered on, the system booted up and was ready to use after a short while.

Default desktop of the BeagleY-AI.

For internet access via Ethernet, the connection worked immediately. For Wi-Fi, however, I first had to enable and start the NetworkManager service. After that, I used nmtui to configure the Wi-Fi settings, which I was able to do without any issues.
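The Wi-Fi setup amounted to the following two commands:

```shell
# Enable NetworkManager now and at every boot
sudo systemctl enable --now NetworkManager
# Text UI: Activate a connection -> pick the Wi-Fi network and enter its password
sudo nmtui
```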

Checking BeagleY-AI board’s information

Check hardware with inxi

I installed inxi by running the command sudo apt install inxi. When I executed it, inxi correctly detected the CPU as a 4-core Cortex-A53, as expected. Additional system details are provided in the output below.

Benchmarking with sbc-bench

I tested the performance of the BeagleY-AI board by running Thomas Kaiser’s sbc-bench v0.9.67 script, and the results are detailed below. However, I encountered an issue with the temperature reporting section.
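When sbc-bench’s temperature reporting fails, the SoC’s thermal zones can still be read directly from sysfs; this is a generic Linux approach rather than anything BeagleY-AI-specific:

```shell
# Print each thermal zone's name and its reading (sysfs reports millidegrees Celsius)
for z in /sys/class/thermal/thermal_zone*; do
  echo "$(cat "$z/type"): $(($(cat "$z/temp") / 1000))°C"
done
```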


For reference, the 7-zip multi-core result puts the BeagleY-AI roughly on par with a Raspberry Pi 3 Model B+.

Testing the network performance

During this part of the testing, I was unable to find a router that supports Gigabit Ethernet. Therefore, I used a TP-Link TL-MR100 4G LTE Router, which has a maximum Ethernet speed of 100 Mbps and a Wi-Fi 2.4 GHz communication speed of around 300 Mbps. The results of the iperf3 test are as follows.
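The iperf3 runs follow the usual client/server pattern; the PC address below is an assumption, so substitute your own:

```shell
# On the PC (server side):
iperf3 -s

# On the BeagleY-AI (replace 192.168.1.10 with the PC's address):
iperf3 -c 192.168.1.10        # send (board -> PC) test
iperf3 -c 192.168.1.10 -R     # receive (PC -> board) test
```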

Ethernet: send…


Ethernet: receive…


Wi-Fi 2.4GHz: send…


Wi-Fi 2.4GHz: receive…

Testing YouTube video playback on the BeagleY-AI board

I tested video playback performance on YouTube using the web browsers included with the BeagleY-AI operating system image, which are Firefox ESR 111.15.0 (64-bit), Firefox Nightly 129.0a1 (2024-06-18) (64-bit), and Chromium 129.0.6668.58 (64-bit). I experimented with YouTube video playback in full-screen mode at various resolutions, from 144p to 2160p. The BeagleY-AI board performs best at 480p, with a dropped frame rate of about 10–15%. Lowering the resolution to 144p results in a slight increase in stuttering.

When I changed the video resolution to 1440p or 2160p, smooth playback became almost impossible, most likely due to the limited connection speed (100 Mbps Ethernet), as the buffer frequently ran empty. This was especially true for 2160p videos, which could not be played at all, as shown in the screenshot examples from my tests with Chromium.

Playing YouTube video (144p).
Playing YouTube video (240p).
Playing YouTube video (360p).
Playing YouTube video (480p).
Playing YouTube video (720p).
Playing YouTube video (1080p).
Playing YouTube video (1440p).
Playing YouTube video (2160p).

Testing web browsers with Speedometer 2.0 and 3.0

For this test, I connected to the Internet via Ethernet and tested the performance of web applications in various web browsers using Speedometer 2.0 and Speedometer 3.0. The results are shown in the images below. Chromium and Firefox Nightly produced similar scores, while Firefox ESR consistently had the lowest scores, regardless of whether it was tested with Speedometer 2.0 or 3.0.

Testing with Speedometer 2.0.

Testing Chromium with Speedometer 2.0.
Testing Firefox ESR with Speedometer 2.0.
Testing Firefox Nightly with Speedometer 2.0.

Testing with Speedometer 3.0.

Testing Chromium with Speedometer 3.0.
Testing Firefox ESR with Speedometer 3.0.
Testing Firefox Nightly with Speedometer 3.0.

Testing WebGL rendering on web browsers

When testing 3D graphics performance in web browsers using WebGL, I opened the browser in full-screen mode with one fish displayed. The results were consistent across all browsers: the processing struggled to keep up. Chromium rendered at only 1 FPS, while both Firefox versions rendered at approximately 3 FPS.

Testing WebGL on Chromium.
Testing WebGL on Firefox ESR.
Testing WebGL on Firefox Nightly.

Testing 3D graphics rendering with glmark2-es2

In this test, I evaluated GLES performance by running the glmark2-es2 command four to five times. The average glmark2 score was around 32, as shown in the example of the first run below. Performance was fairly consistent, with rendering mostly smooth and only minor stuttering. However, during the terrain rendering test there were significant stutters, with the frame rate dropping to just 1–2 FPS. The score is low because the drivers for the Imagination GPU are not enabled (llvmpipe indicates software rendering):

Object detection on the BeagleY-AI SBC with TensorFlow Lite

The next test of the review involved using artificial intelligence for object detection on the BeagleY-AI with TensorFlow Lite. I followed the sample steps provided by the manufacturer’s website, which were straightforward and easy to follow. I was able to copy, paste, and run the commands without any issues. The general steps are as follows. First, I installed the lightweight version of Conda using Miniforge/Mambaforge 24.3.0.0.
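The Miniforge installer is distributed through the conda-forge GitHub releases; the generic "latest" URL below is the pattern documented in the Miniforge README:

```shell
# Download and run the installer matching the current OS and architecture
wget "https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-$(uname)-$(uname -m).sh"
bash "Miniforge3-$(uname)-$(uname -m).sh"
```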


Next, I created a new virtual environment, as suggested in the manufacturer’s example, using Python version 3.9 before installing numpy and OpenCV with the following command:
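The environment setup amounts to something like the following. The environment name is my own choice, and I've also listed the tflite-runtime package, which provides the TFLite interpreter used later:

```shell
conda create -n tflite-demo python=3.9
conda activate tflite-demo
pip install numpy opencv-python tflite-runtime
```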


Next, I downloaded pre-trained models from Google. For this example, I selected the COCO SSD MobileNet model.
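The quantized COCO SSD MobileNet v1 model has long been hosted by Google for the TensorFlow Lite examples; at the time it could be fetched as follows (verify the URL is still live):

```shell
wget https://storage.googleapis.com/download.tensorflow.org/models/tflite/coco_ssd_mobilenet_v1_1.0_quant_2018_06_29.zip
unzip coco_ssd_mobilenet_v1_1.0_quant_2018_06_29.zip   # contains detect.tflite and labelmap.txt
```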


I initially tested the functionality with still images by modifying the code provided by the manufacturer, and it worked well. To experiment with real-time operation, I connected a RAPOO C260 1080p USB webcam to the USB port of the BeagleY-AI board. I then ran the following commands to check the device and retrieve camera information. In the manufacturer’s example, they specified the device as number 3, and in my test, I obtained the same result, allowing me to proceed with the setup immediately.
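Checking which /dev/video node the webcam landed on can be done with v4l-utils; device numbers vary between setups, so the /dev/video3 below simply mirrors the number used in this test:

```shell
ls /dev/video*                               # raw device nodes
sudo apt install v4l-utils                   # provides v4l2-ctl
v4l2-ctl --list-devices                      # map camera names to /dev/videoN
v4l2-ctl -d /dev/video3 --list-formats-ext   # supported formats and resolutions
```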


Aside from preparing the camera class, the provided Python sample includes code for loading and preparing the model:


After that, it calls functions to detect objects in the image. This part of the code converts the image into a float32 format and normalizes the data range to [-1, 1] before detecting objects in the image and drawing the obtained results.
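A minimal sketch of that normalization step, using names of my own choosing (the resize to the model’s 300×300 input happens separately, e.g. with cv2.resize):

```python
import numpy as np

def preprocess(frame_rgb: np.ndarray) -> np.ndarray:
    """Cast an RGB frame to float32, normalize to [-1, 1], and add a batch axis."""
    x = frame_rgb.astype(np.float32)
    x = (x - 127.5) / 127.5           # uint8 [0, 255] -> float32 [-1, 1]
    return np.expand_dims(x, axis=0)  # shape (1, H, W, 3), as TFLite expects
```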


I ran the code and obtained the output shown in the image below. The BeagleY-AI displayed a frame rate of only 3 FPS, which is similar to the results reported by other users. I found that some users recommend using the Texas Instruments Deep Learning (TIDL) library for faster processing on the BeagleY-AI. However, some users have encountered issues with the TI image, which is still not fully functional, and there are no official examples from the manufacturer at this time. Therefore, I stopped my experiment at this point.

Testing TensorFlow Lite in real time (1280×720).
Testing TensorFlow Lite in real time (640×480).

Additionally, I opened htop to monitor CPU usage while running the object detection code with TensorFlow Lite. From visual observation, it utilized about 50% of the CPU on average.

TensorFlow Lite CPU usage on the BeagleY-AI board.

Image processing with MediaPipe

Since I couldn’t find other official examples from the BeagleY-AI manufacturer related to artificial intelligence or image processing, I decided to test the BeagleY-AI with the MediaPipe machine learning library from Google. In this section, I experimented with using Python for tasks such as object detection, image segmentation, face landmark detection, and gesture recognition.

To set up MediaPipe, I followed the examples in the MediaPipe Guide for each topic I was interested in. First, I installed MediaPipe using the following command. Next, I prepared to use the selected modules by properly configuring the options for the classes I wanted to use. After setting the options, I created instances of those classes before invoking them.
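For reference, the installation itself is a single pip command inside the Conda environment:

```shell
pip install mediapipe
```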


The example below demonstrates setting up object detection, where I needed to select and download the desired model. In this case, I chose the EfficientDet-Lite0 model and configured the required options to prepare it for use.


After that, the object detector can be called as needed. In my case, since I was reading video frames using OpenCV, I needed to convert the image data into a mp.Image format so that MediaPipe could process it. Then, I called the detect method to obtain the results and display them, as shown in the sample code below:
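The flow described above (OpenCV capture → mp.Image conversion → detect → draw) can be sketched as follows. The model filename and camera index are assumptions of mine, so adjust them to your setup; call run_demo() on the board itself:

```python
import numpy as np

def bgr_to_rgb(frame_bgr: np.ndarray) -> np.ndarray:
    """OpenCV captures BGR frames; MediaPipe expects contiguous RGB data."""
    return np.ascontiguousarray(frame_bgr[..., ::-1])

def run_demo(model_path="efficientdet_lite0.tflite", camera_index=3):
    """Run object detection on webcam frames; both arguments are illustrative."""
    import cv2
    import mediapipe as mp
    from mediapipe.tasks import python as mp_tasks
    from mediapipe.tasks.python import vision

    options = vision.ObjectDetectorOptions(
        base_options=mp_tasks.BaseOptions(model_asset_path=model_path),
        score_threshold=0.5)
    detector = vision.ObjectDetector.create_from_options(options)

    cap = cv2.VideoCapture(camera_index)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Wrap the RGB frame for MediaPipe, then run detection on it
        mp_image = mp.Image(image_format=mp.ImageFormat.SRGB, data=bgr_to_rgb(frame))
        for det in detector.detect(mp_image).detections:
            box = det.bounding_box
            cv2.rectangle(frame, (box.origin_x, box.origin_y),
                          (box.origin_x + box.width, box.origin_y + box.height),
                          (0, 255, 0), 2)
        cv2.imshow("MediaPipe object detection", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()
```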


I found that using MediaPipe for object detection resulted in a display rate of about 1 FPS, while gesture recognition had an average display rate of around 1.5 FPS. The face landmark detector performed faster, reaching approximately 4.3 FPS. For image segmentation, I experimented with separating the person in the image from the background and slightly blurring the background. With a kernel size of 25×25 pixels for the GaussianBlur function, the rendering performance was about 7 FPS. When I reduced the kernel size to 3×3 pixels, the display rate increased to 9.5 FPS.

Testing BeagleY-AI with MediaPipe’s Object Detection.
Testing BeagleY-AI with MediaPipe’s Face Landmark Detection.
Testing BeagleY-AI with MediaPipe’s Gesture Recognition.
Testing BeagleY-AI with MediaPipe’s Image Segmentation, with 25×25 pixel Gaussian blur.
Testing BeagleY-AI with MediaPipe’s Image Segmentation, with 3×3 pixel Gaussian blur.

Testing SIFT feature extraction

Another aspect I wanted to test is the Scale-Invariant Feature Transform (SIFT), an algorithm that extracts significant features from images. These features are useful for various applications, such as image matching, mosaicking, and epipolar geometry computation.

Using SIFT in modern OpenCV is straightforward, and it runs on the BeagleY-AI without any issues. In the sample code below, I used the same USB webcam and initialized the SIFT feature extractor by calling cv2.SIFT_create(). I then used it to extract features, or keypoints, from each video frame by calling the detect() method and used cv2.drawKeypoints() to visualize the detected keypoints. The program’s display rate was approximately 0.1 FPS at a video resolution of 1920×1080 pixels, around 0.5 FPS at 1280×720 pixels, and about 7.5 FPS at 320×240 pixels, as shown in the example images below.


Real-time SIFT feature extraction on a 1920×1080 video frame.
Real-time SIFT feature extraction on a 1280×720 video frame.
Real-time SIFT feature extraction on a 320×240 video frame.

Other tests

Measuring temperature

For testing the temperature of the BeagleY-AI board, I conducted experiments during the daytime when the room temperature was approximately 31°C. Using a FLIR E4 thermal camera, I measured the board’s temperature three times. First, I measured right after a fresh boot without running any programs and found that the maximum temperature of the board was around 42–45°C.

Next, I tested the board under full load by playing 4K YouTube videos at 2160p in full-screen mode while simultaneously running the object detection code with TensorFlow Lite. I let the BeagleY-AI board run in this state for about 10 minutes before measuring the temperature again. The maximum temperature reached approximately 61°C and continued to rise slowly. I then used a desktop fan to cool the board by directing airflow over it; the temperature dropped almost immediately, settling at about 51°C after a short while.

Running the BeagleY-AI at 100% CPU usage.
Measuring the BeagleY-AI board temperature while running at 100% CPU usage.
Reducing the board temperature with a desktop fan.

BeagleY-AI power consumption

I measured the power consumption of the BeagleY-AI board using a USB Power Meter. I found that when the BeagleY-AI board is in an idle state, it consumes approximately 4.7–4.8W. During full-screen playback of 4K YouTube videos, the board consumes about 5.5–5.8W, as shown in the example images below.

Power consumption in idle state.
Power consumption when running the object detection script with TensorFlow Lite.

Checking other open-source files

For additional information about the BeagleY-AI board, you can visit https://openbeagle.org/beagley-ai/beagley-ai, which provides access to resources such as 3D files, certificates, board photos, and PCB files. The 3D files are available in STP (STEP) format, which I was able to open using the Autodesk Viewer website. The schematic is provided in PDF format, and the manufacturer supplies PCB information as BRD files. Since I don’t have a program to open BRD files, I used KiCad’s Gerber Viewer to open the Gerber files instead, as shown in the images below.

3D model of the BeagleY-AI board (top).
3D model of the BeagleY-AI board (bottom).

The following images show some of the provided Gerber files of the BeagleY-AI, viewed in KiCad’s Gerber Viewer.

BeagleY-AI hardware Gerber view 01.
BeagleY-AI hardware Gerber view 02.

Conclusion

Overall, our review showed that the BeagleY-AI is a promising SBC with CPU performance similar to the Raspberry Pi 3 Model B+, plus built-in 3D graphics, a video encoder/decoder, and DSPs for AI acceleration. The problem is that more work is needed on the software side, since support for all of those accelerators does not appear to be fully implemented. I also found it challenging to locate official tutorials on the manufacturer’s website covering more advanced programming and testing of the AI processing units. At the time of writing, I only found examples using TensorFlow Lite, and the rest rely on software processing on the CPU only.

I would like to thank the manufacturer for providing the BeagleY-AI for review. Those interested can purchase the board for about $70 on Seeed Studio or Newark/Element14.
