ArduCam Mega – A 3MP or 5MP SPI camera for microcontrollers (Crowdfunding)

Arducam Mega SPI camera

ArduCam Mega is a 3MP or 5MP camera specifically designed for microcontrollers with an SPI interface, and the SDK currently supports Arduino UNO and Mega2560 boards, ESP32/ESP8266 boards, Raspberry Pi Pico and other boards based on the RP2040 MCU, BBC Micro:bit V2, as well as STM32 and MSP430 platforms. Both cameras share many of the same specifications including their size, but the 3MP model is a fixed-focus camera, while the 5MP variant supports autofocus. Potential applications include asset monitoring, wildfire monitoring, remote meter reading, TinyML applications, and so on. ArduCam Mega specifications: Camera Type 3MP with fixed focus 5MP with auto-focus from 8cm to infinity Optical size – 1/4-inch Shutter type – Rolling Focal ratio 3MP – F2.8 5MP – F2.0 Still Resolutions 320×240, 640×480, 1280×720, 1600×1200, 1920×1080 3MP – 2048×1536 5MP – 2592×1944 Output formats – RGB, YUV, or JPEG Wake-up time 3MP – […]
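For a rough idea of what driving the camera from a microcontroller looks like, here is a minimal Arduino-style sketch that grabs a JPEG still over SPI. The Arducam_Mega class, method names, and constants are assumptions chosen for illustration and may differ from the actual ArduCam Mega SDK.

```cpp
// Illustrative sketch: capture one JPEG still from an SPI camera and dump
// it over the serial port. Class, method, and constant names are assumed
// for the example and should be checked against the real ArduCam Mega SDK.
#include <SPI.h>
#include <Arducam_Mega.h>          // assumed header name

const int CS_PIN = 7;              // chip select wired to the camera

Arducam_Mega camera(CS_PIN);       // assumed constructor

void setup() {
  Serial.begin(115200);
  SPI.begin();
  camera.begin();                                  // probe/reset the sensor
  camera.takePicture(CAM_IMAGE_MODE_VGA,           // 640x480 still
                     CAM_IMAGE_PIX_FMT_JPG);       // JPEG output
  while (camera.getReceivedLength() > 0) {         // stream the compressed frame
    Serial.write(camera.readByte());
  }
}

void loop() {}
```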

Bee Motion S3 – An ESP32-S3 board with a PIR motion sensor (Crowdfunding)

ESP32 S3 board PIR Motion Sensor

The Bee Motion S3 is an ESP32-S3 WiFi and Bluetooth IoT board with a PIR motion sensor besides the more usual I/Os, Qwiic connector, USB-C port, and LiPo battery support. It is at least the third wireless PIR motion board from Smart Bee Designs, as the company previously introduced the ESP32-S2 powered Bee Motion board and the ultra-small Bee Motion Mini with an ESP32-C3 SoC. The new Bee Motion S3 adds a few more I/Os and a light sensor, and the ESP32-S3’s AI vector extensions could potentially be used for faster and/or lower-power TinyML processing. Bee Motion S3 specifications: Wireless module – Espressif Systems ESP32-S3-MINI-1 module (PDF datasheet) based on ESP32-S3 dual-core Xtensa LX7 microcontroller with 512KB SRAM, 384KB ROM, WiFi 4 and Bluetooth 5.0 connectivity, and equipped with 8MB of QSPI flash and a PCB antenna USB – 1x USB Type-C port for power and programming Sensors PIR sensor S16-L221D […]
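A typical usage pattern for a board like this is PIR-triggered wake-up from deep sleep. The minimal sketch below, written against the Arduino-ESP32 core, shows the idea; the PIR output pin (GPIO 5 here) is an assumption and should be replaced with the actual pin used on the Bee Motion S3.

```cpp
// Minimal sketch (Arduino-ESP32 core): sleep until the PIR sensor pulls its
// output high, then wake, do some work, and go back to sleep.
#include <Arduino.h>
#include "esp_sleep.h"

const gpio_num_t PIR_PIN = GPIO_NUM_5;   // assumed PIR output pin, check the schematic

void setup() {
  Serial.begin(115200);

  if (esp_sleep_get_wakeup_cause() == ESP_SLEEP_WAKEUP_EXT0) {
    Serial.println("Motion detected - woke from deep sleep");
    // ... read the light sensor, send a WiFi/BLE notification, etc. ...
  }

  // Arm the PIR output as the wake-up source (high level = motion detected)
  esp_sleep_enable_ext0_wakeup(PIR_PIN, 1);
  esp_deep_sleep_start();                // never returns
}

void loop() {}                           // not reached, the board sleeps between events
```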

Sipeed M1s & M0sense – Low-cost BL808 & BL702 based AI modules (Crowdfunding)

Sipeed M1s & M0Sense

Sipeed has launched the M1s and M0Sense AI modules. Designed for AIoT applications, the Sipeed M1s is based on the Bouffalo Lab BL808 32-bit/64-bit RISC-V wireless SoC with WiFi, Bluetooth, and an 802.15.4 radio for Zigbee support, as well as the BLAI-100 (Bouffalo Lab AI engine) NPU for video/audio detection and/or recognition. The Sipeed M0Sense targets TinyML applications with the Bouffalo Lab BL702 32-bit microcontroller offering BLE and Zigbee connectivity. Sipeed M1s AIoT module The Sipeed M1s is an update to the Kendryte K210-powered Sipeed M1 introduced several years ago. Sipeed M1s module specifications: SoC – Bouffalo Lab BL808 with CPU Alibaba T-head C906 64-bit RISC-V (RV64GCV+) core @ 480MHz Alibaba T-head E907 32-bit RISC-V (RV32GCP+) core @ 320MHz 32-bit RISC-V (RV32EMC) core @ 160 MHz Memory – 768KB SRAM and 64MB embedded PSRAM AI accelerator – NPU BLAI-100 (Bouffalo Lab AI engine) for video/audio detection/recognition delivering up […]

TinyML-CAM pipeline enables 80 FPS image recognition on ESP32 using just 1 KB RAM

TinyML-CAM image recognition microcontroller boards

The challenge with TinyML is to extract the maximum performance/efficiency at the lowest footprint for AI workloads on microcontroller-class hardware. The TinyML-CAM pipeline, developed by a team of machine learning researchers in Europe, demonstrates what’s possible to achieve on relatively low-end hardware with a camera. More specifically, they managed to reach over 80 FPS image recognition on the sub-$10 ESP32-CAM board with the open-source TinyML-CAM pipeline taking just about 1KB of RAM. It should work on other MCU boards with a camera, and training does not seem complex since we are told it takes around 30 minutes to implement a customized task. The researchers note that solutions like TensorFlow Lite for Microcontrollers and Edge Impulse already enable the execution of ML workloads on MCU boards using Neural Networks (NNs). However, those usually take quite a lot of memory, between 50 and 500 kB of RAM, and take 100 to 600 ms […]
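The memory savings come from avoiding a neural network altogether: the frame is reduced to a small handcrafted feature vector that is fed to a compact classifier exported as plain C++. The sketch below is not the actual TinyML-CAM code, just an illustration of why such a pipeline can fit in about 1 KB of RAM.

```cpp
// Rough sketch of feature-based image recognition on an MCU: shrink the
// frame into a tiny feature vector, then run a compact classifier that was
// trained on a PC and exported as plain C++. Thresholds are placeholders.
#include <stdint.h>

// Down-sample a grayscale QVGA frame (320x240) into an 8x6 grid of block
// averages: a 48-byte feature vector instead of a 76,800-byte image.
void extract_features(const uint8_t *frame, uint8_t features[48]) {
  const int W = 320, H = 240, GX = 8, GY = 6;
  for (int gy = 0; gy < GY; gy++) {
    for (int gx = 0; gx < GX; gx++) {
      uint32_t sum = 0;
      for (int y = gy * (H / GY); y < (gy + 1) * (H / GY); y++)
        for (int x = gx * (W / GX); x < (gx + 1) * (W / GX); x++)
          sum += frame[y * W + x];
      features[gy * GX + gx] = sum / ((W / GX) * (H / GY));
    }
  }
}

// An exported decision-tree classifier boils down to a cascade of
// comparisons like this, returning a class index.
int classify(const uint8_t f[48]) {
  if (f[10] < 97) return (f[35] < 140) ? 0 : 1;
  return (f[3] < 60) ? 1 : 2;
}
```

A 48-byte feature vector plus a handful of decision-tree nodes is orders of magnitude smaller than the tensor arena a typical neural network needs.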

AI, computer vision meet LoRaWAN with SenseCAP K1100 sensor prototype kit

Wio Terminal Grove Vision AI LoRaWAN module

CNXSoft: This is another tutorial using the SenseCAP K1100 sensor prototype kit, translated from CNX Software Thai. This post shows how computer vision/AI vision can be combined with LoRaWAN using the Arduino-programmable Wio Terminal, a Grove camera module, and the LoRa-E5 module connecting to a private LoRaWAN network using open-source tools such as Node-RED and InfluxDB. In the first part of the SenseCAP K1100 review/tutorial, we connected various sensors to the Wio Terminal board and transmitted the data wirelessly through the LoRa-E5 LoRaWAN module after setting the frequency band for Thailand (AS923). In the second part, we’ll connect the Grove Vision AI module, part of the SenseCAP K1100 sensor prototype kit, to the Wio Terminal in order to train models to capture faces, display the results from the camera on the computer, and evaluate how accurate the face detection model is. Finally, we’ll send the data (e.g. confidence) using […]
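As a preview of that last step, here is a hedged sketch of how the Wio Terminal might forward a confidence value to the LoRa-E5 module over a UART using AT commands. getConfidence() is a placeholder for the actual Grove Vision AI read, and the serial port and AT framing should be checked against your wiring and firmware version.

```cpp
// Illustrative Wio Terminal sketch: push a face-detection confidence value
// to a LoRaWAN network through the LoRa-E5 module's AT command interface.
#include <Arduino.h>

void setup() {
  Serial.begin(115200);      // USB console
  Serial1.begin(9600);       // assumed UART wired to the LoRa-E5 module
}

uint8_t getConfidence() {
  // Placeholder for a real read from the Grove Vision AI module
  return 87;                 // e.g. 87% confidence that a face was seen
}

void loop() {
  uint8_t confidence = getConfidence();

  // Send one byte as a hex-encoded confirmed uplink (LoRa-E5 AT syntax)
  char cmd[32];
  snprintf(cmd, sizeof(cmd), "AT+CMSGHEX=\"%02X\"\r\n", confidence);
  Serial1.print(cmd);
  Serial.print("Uplink sent: ");
  Serial.println(cmd);

  delay(60000);              // one uplink per minute, mind the duty cycle
}
```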

Embedded World 2022 – June 21-23 – Virtual Schedule

Embedded World 2022

Embedded World 2020 was a lonely affair with many companies canceling attendance due to COVID-19, and Embedded World 2021 took place online only. But Embedded World is back in Nuremberg, Germany in 2022, albeit with the event moved from the traditional month of February to June 21-23. Embedded systems companies and those that service them will showcase their latest solutions at their respective booths, and there will be a conference with talks and classes during the three-day event. The programme is up, so I made my own little Embedded World 2022 virtual schedule as there may be a few things to learn, even though I won’t be attending. Tuesday, June 21, 2022 10:00 – 13:00 – Rust, a Safe Language for Low-level Programming Rust is a relatively new language in the area of systems and low-level programming. Its main goals are performance, correctness, safety, and productivity. While still ~70% of […]

Seeed XIAO BLE – A tiny nRF52840 Bluetooth 5.0 board with (optional) IMU sensor and microphone

Seeed XIAO BLE Sense

Seeed Studio has just introduced two new members to their XIAO board family with the Seeed XIAO BLE and XIAO BLE Sense boards equipped with a Nordic Semi nRF52840 Bluetooth 5.0 microcontroller, as well as an IMU sensor and microphone on the “Sense” model. Just like the earlier XIAO RP2040 board, the tiny Seeed XIAO BLE board can be programmed with Arduino, MicroPython, and CircuitPython, and offers two 7-pin headers for GPIOs. What’s really new is the wireless connectivity, the sensors, and the battery charging circuitry. Seeed XIAO BLE specifications: Wireless MCU – Nordic nRF52840 Arm Cortex-M4F microcontroller @ up to 64 MHz with 1 MB flash, 256 KB SRAM, Bluetooth 5.0, NFC, Zigbee connectivity Storage – 2 MB QSPI flash Expansion I/Os 2x 7-pin headers with 1x UART, 1x I2C, 1x SPI, 1x NFC, 1x SWD, 11x GPIO (PWM), 6x ADC 3.3V I/O voltage (not 5V tolerant) Sensors […]
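On the “Sense” variant, the onboard IMU can be read with a few lines of Arduino code. The sketch below assumes Seeed’s LSM6DS3 library and the IMU’s default I2C address (0x6A); check the board’s wiki for the exact library and address.

```cpp
// Minimal sketch for the XIAO BLE Sense: print accelerometer readings from
// the onboard 6-axis IMU. Library name and I2C address (0x6A) are taken
// from Seeed's documentation and may differ on your setup.
#include <LSM6DS3.h>
#include <Wire.h>

LSM6DS3 imu(I2C_MODE, 0x6A);   // onboard IMU on the internal I2C bus

void setup() {
  Serial.begin(115200);
  while (!Serial);
  if (imu.begin() != 0) {
    Serial.println("IMU init failed");
  }
}

void loop() {
  Serial.print("aX=");  Serial.print(imu.readFloatAccelX(), 3);
  Serial.print(" aY="); Serial.print(imu.readFloatAccelY(), 3);
  Serial.print(" aZ="); Serial.println(imu.readFloatAccelZ(), 3);
  delay(500);
}
```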

Benchmarking TinyML with MLPerf Tiny Inference Benchmark

MLPerf Tiny Inference Benchmark

As machine learning moves to microcontrollers, something referred to as TinyML, new tools are needed to compare different solutions. We’ve previously posted some TensorFlow Lite for Microcontrollers benchmarks (for single board computers), but a benchmarking tool specifically designed for AI inference on resource-constrained embedded systems could prove useful for consistent results and cover a wider range of use cases. That’s exactly what MLCommons, an open engineering consortium, has done with the MLPerf Tiny Inference benchmarks designed to measure how quickly a trained neural network can process new data on tiny, low-power devices, and the suite also includes optional power measurement. MLPerf Tiny v0.5, the first inference benchmark suite designed for embedded systems from the organization, consists of four benchmarks: Keyword Spotting – Small vocabulary keyword spotting using a DS-CNN model. Typically used in smart earbuds and virtual assistants. Visual Wake Words – Binary image classification using MobileNet. In-home security […]
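At its core, the latency portion of such a benchmark boils down to timing repeated inferences on the device, as in the hedged sketch below. run_inference() is a stand-in for the actual runtime call (e.g. a TensorFlow Lite for Microcontrollers invocation) and is not part of the MLPerf Tiny code.

```cpp
// Sketch of a basic on-device latency measurement: time many invocations of
// the model under test and report the average per-inference time.
#include <Arduino.h>

// Placeholder for one forward pass of the model under test; replace with
// the real runtime call (e.g. interpreter invocation and result readout).
void run_inference() {
  volatile float acc = 0;
  for (int i = 0; i < 1000; i++) acc += i * 0.5f;  // dummy work
}

void setup() {
  Serial.begin(115200);

  const int RUNS = 100;
  unsigned long start = micros();
  for (int i = 0; i < RUNS; i++) {
    run_inference();
  }
  unsigned long elapsed = micros() - start;

  Serial.print("Average latency: ");
  Serial.print(elapsed / (float)RUNS / 1000.0f, 2);
  Serial.println(" ms per inference");
}

void loop() {}
```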
