Elephant Robotics has launched its most advanced 6 DoF robot arm so far, the myCobot Pro 600, equipped with a Raspberry Pi 4 SBC, offering a maximum 600mm working range and support for payloads of up to 2kg. We have previously covered Elephant Robotics' myCobot robotic arms based on Raspberry Pi 4, ESP32, Jetson Nano, or Arduino, and even reviewed the myCobot 280 Pi using both Python and visual programming. The new Raspberry Pi 4-based myCobot Pro 600 provides roughly the same features, but its much larger design allows it to cover a wider working area and handle heavier objects.

myCobot Pro 600 specifications:
- SBC – Raspberry Pi 4 single board computer
- MCU – 240 MHz ESP32 dual-core microcontroller (600 DMIPS) with 520KB SRAM, Wi-Fi & dual-mode Bluetooth
- Video Output – 2x micro HDMI 2.0 ports
- Audio – 3.5mm audio jack, digital audio via HDMI
- Networking – Gigabit Ethernet, dual-band WiFi […]
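For reference, the smaller myCobot arms we reviewed can be scripted in Python with Elephant Robotics' pymycobot library, and the snippet below is a minimal sketch in that style. The serial port, baud rate, and joint angles are illustrative assumptions, and the myCobot Pro 600 may expose a different connection method or API than the 280 Pi.

```python
# Minimal sketch based on the pymycobot library used with the myCobot 280 Pi.
# Port, baud rate, and angles are illustrative assumptions; the Pro 600 may
# require a different connection method or API.
from pymycobot.mycobot import MyCobot
import time

mc = MyCobot("/dev/ttyAMA0", 1000000)   # serial port and baud rate assumed

# Read the current angles (in degrees) of the six joints
print("Current angles:", mc.get_angles())

# Move all six joints to a neutral pose at 50% speed (values are examples)
mc.send_angles([0, 0, 0, 0, 0, 0], 50)
time.sleep(3)

# Move joint 1 to 45 degrees at 40% speed
mc.send_angle(1, 45, 40)
```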
XGO 2 – A Raspberry Pi CM4 based robot dog with an arm (Crowdfunding)
XGO 2 is a desktop robot dog using the Raspberry Pi CM4 as its brain, an ESP32 as the motor controller for its four legs, and an additional robotic arm that allows the quadruped robot to grab objects. An evolution of the XGO mini robot dog with a Kendryte K210 RISC-V AI processor, the XGO 2 robot offers 12 degrees of freedom, and the more powerful Raspberry Pi CM4 module enables faster AI edge computing applications, as well as features such as omnidirectional movement, six-dimensional posture control, posture stability, and multiple motion gaits. The XGO 2 robot dog is offered in two variants – the XGO-Lite 2 and the XGO-Mini 2 – with the following key features and specifications: The company also says the new robot can provide feedback on its own posture thanks to its 6-axis IMU and joint sensors reporting position and electric current. A display […]
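The posture feedback the company describes is typically derived from the 6-axis IMU's accelerometer and gyroscope. The sketch below is a generic, hypothetical illustration of estimating roll and pitch with a complementary filter; it is not XGO's firmware code, and the read_imu() helper is a placeholder.

```python
# Hypothetical illustration of posture estimation from a 6-axis IMU
# (accelerometer + gyroscope) using a complementary filter.
# read_imu() is a placeholder, not an actual XGO API.
import math, time

def read_imu():
    """Placeholder: return (ax, ay, az) in g and (gx, gy, gz) in deg/s."""
    return (0.0, 0.0, 1.0), (0.0, 0.0, 0.0)

ALPHA = 0.98          # weight given to the integrated gyro estimate
roll = pitch = 0.0
last = time.time()

while True:
    (ax, ay, az), (gx, gy, gz) = read_imu()
    now = time.time()
    dt, last = now - last, now

    # Accelerometer-only estimate (valid when the robot is not accelerating)
    acc_roll = math.degrees(math.atan2(ay, az))
    acc_pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))

    # Complementary filter: integrate the gyro, correct drift with the accelerometer
    roll = ALPHA * (roll + gx * dt) + (1 - ALPHA) * acc_roll
    pitch = ALPHA * (pitch + gy * dt) + (1 - ALPHA) * acc_pitch
    time.sleep(0.01)
```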
ultraArm P340 Arduino-based robotic arm draws, engraves, and grabs
Elephant Robotics ultraArm P340 is a robot arm built around an Arduino-compatible ATmega2560 control board, with a 340mm working radius and an arm that can be fitted with different accessories for drawing, laser engraving, and grabbing objects. We have previously written about and reviewed the myCobot 280 Pi robotic arm with a built-in Raspberry Pi 4 SBC, but the lower-cost ultraArm P340 works a little differently since it only contains the electronics for controlling the servos and attachments, and needs to be connected over USB to a host computer running Windows, or to a Raspberry Pi.

ultraArm P340 specifications:
- Control board based on Microchip ATmega2560 8-bit AVR microcontroller @ 16MHz with 256KB flash, 4KB EEPROM, 8KB SRAM
- DOF – 3 to 4 axes depending on accessories
- Working radius – 340mm
- Positioning Accuracy – ±0.1 mm
- Payload – Up to 650 grams
- High-performance stepper motor
- Maximum speed – 100mm/s
- Communication interfaces – RS485 and USB serial
- Attachment […]
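Since the ultraArm P340 is driven from a host over USB serial, a controlling script essentially opens a serial port and exchanges commands with the Arduino-compatible board. The sketch below uses pyserial purely for illustration; the port name, baud rate, and G-code-style command are assumptions and not the documented ultraArm protocol.

```python
# Illustrative only: open the USB serial link to an Arduino-based controller
# with pyserial. The port, baud rate, and command string are assumptions,
# not the documented ultraArm P340 protocol.
import serial

with serial.Serial("/dev/ttyUSB0", 115200, timeout=2) as port:  # assumed settings
    # Hypothetical move command in a G-code-like style; the real firmware
    # may expect a different syntax (check Elephant Robotics' documentation).
    port.write(b"G0 X100 Y0 Z50\n")
    reply = port.readline()
    print("Controller replied:", reply.decode(errors="replace").strip())
```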
Sipeed MetaSense RGB ToF 3D depth cameras are made for MCUs & ROS Robots (Crowdfunding)
We’ve just written about the Arducam ToF camera to add depth sensing to Raspberry Pi, but there are now more choices, as Sipeed has just introduced its MetaSense ToF (Time-of-Flight) camera family for microcontrollers and robots running ROS, with two models offering different sets of features and capabilities. The MetaSense A075V USB camera offers 320×240 depth resolution plus an extra RGB sensor, an IMU unit, and a CPU with a built-in NPU that makes it ideal for ROS 1/2 robots, while the lower-end MetaSense A010 ToF camera offers up to 100×100 resolution, integrates a 1.14-inch LCD display to visualize depth data in real-time, and can be connected to a microcontroller host, or even a Raspberry Pi, through UART or USB.

MetaSense A075V specifications:
- SoC – Unnamed quad-core Arm Cortex-A7 processor @ 1.5 GHz with 0.4 TOPS NPU
- System Memory – 128 MB RAM
- Storage – 128MB flash
- Cameras
  - 800×600 @ 30 […]
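Since the A075V targets ROS 1/2 robots, depth frames would typically be consumed as sensor_msgs/Image messages. Below is a minimal rclpy subscriber sketch; the topic name /metasense/depth/image_raw is a placeholder, as the actual topic names published by Sipeed's driver are not given in this excerpt.

```python
# Minimal ROS 2 (rclpy) subscriber sketch for a depth camera topic.
# The topic name is a placeholder; check the driver's documentation
# for the actual topics published by the MetaSense ROS package.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image


class DepthListener(Node):
    def __init__(self):
        super().__init__("depth_listener")
        self.create_subscription(Image, "/metasense/depth/image_raw",
                                 self.on_depth, 10)

    def on_depth(self, msg: Image):
        # Log the resolution and encoding of each incoming depth frame
        self.get_logger().info(
            f"depth frame: {msg.width}x{msg.height}, encoding={msg.encoding}")


def main():
    rclpy.init()
    rclpy.spin(DepthListener())
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```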
AAEON launches UP Xtreme i11 & UP Squared 6000 robotic development kits
AAEON’s UP Bridge the Gap community has partnered with Intel to release two robotic development kits based on either the UP Xtreme i11 or the UP Squared 6000 single board computer in order to simplify robotics evaluation and development. Both robotic development kits come with a four-wheeled robot prototype that can move omnidirectionally, sense and map its environment, avoid obstacles, and detect people and objects. Since those capabilities are already implemented, users can save time and money by working on further customization for specific use cases.

Key features and specifications:
- SBC
  - UP Squared 6000 SBC – Recommended for power efficiency and extended battery life
    - SoC – Intel Atom x6425RE Elkhart Lake processor clocked at up to 1.90 GHz with Intel UHD Graphics
    - System Memory – 8GB LPDDR4
    - Storage – 64GB eMMC flash, SATA port
  - UP Xtreme i11 SBC – Recommended for higher performance
    - SoC – Intel Core i7-1185GRE clocked at up to 4.40 GHz and […]
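Omnidirectional (holonomic) bases like the included prototype are usually commanded with a velocity vector carrying independent forward, lateral, and rotational components; in ROS 2 that is conventionally a geometry_msgs/Twist message on /cmd_vel. The sketch below is a generic illustration rather than code from AAEON's kits, and the topic name is an assumption.

```python
# Generic illustration of driving a holonomic base in ROS 2 by publishing
# geometry_msgs/Twist messages. Not AAEON-specific; /cmd_vel is assumed.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist


class StrafeDemo(Node):
    def __init__(self):
        super().__init__("strafe_demo")
        self.pub = self.create_publisher(Twist, "/cmd_vel", 10)
        self.create_timer(0.1, self.tick)  # publish at 10 Hz

    def tick(self):
        cmd = Twist()
        cmd.linear.x = 0.2   # forward speed in m/s
        cmd.linear.y = 0.2   # sideways speed, only possible on holonomic bases
        cmd.angular.z = 0.0  # no rotation
        self.pub.publish(cmd)


def main():
    rclpy.init()
    rclpy.spin(StrafeDemo())
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```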
$349 AMD Kria KR260 Robotics Starter Kit takes on NVIDIA Jetson AGX Xavier devkit
AMD Xilinx Kria KR260 Robotics Starter Kit features the Kria K26 Zynq UltraScale+ XCK26 FPGA MPSoC system-on-module (SoM) introduced last year together with the Kria KV260 Vision AI Starter Kit. Designed as a development platform for robotics and industrial applications, the KR260 is said to deliver a nearly 5x productivity gain, up to 8x better performance per watt, and 3.5x lower latency compared to the NVIDIA Jetson AGX Xavier or Jetson Nano kits. We’ll have a better look at the details below.

Kria KR260 Robotics Starter Kit specifications:
- SoM – Kria K26 module with:
  - MPSoC – Xilinx Zynq UltraScale+ custom-built XCK26 with quad-core Arm Cortex-A53 processor up to 1.5GHz, dual-core Arm Cortex-R5F real-time processor up to 600MHz, Mali-400 MP2 GPU up to 667MHz, 4Kp60 VPU, 26.6Mb on-chip SRAM, 256K logic cells, 1,248 DSP slices, 144 Block RAM blocks, 64 UltraRAM blocks
  - System Memory – 4GB 64-bit DDR4 (non-ECC)
  - Storage – 512 Mbit […]
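One common way to exercise the programmable logic on Kria SoMs is through PYNQ, which is also offered for the Kria starter kits. The snippet below is only a sketch: the overlay file name and the axi_dma_0 IP name are hypothetical, and a real design would come from a Vivado/Vitis build or a prebuilt example.

```python
# Sketch of loading an FPGA overlay on a Kria board with PYNQ.
# "my_design.bit" and the "axi_dma_0" IP name are hypothetical; they depend
# entirely on the bitstream you build or download for the KR260.
from pynq import Overlay, allocate
import numpy as np

overlay = Overlay("my_design.bit")      # programs the PL with the bitstream
print(overlay.ip_dict.keys())           # list the IP blocks exposed by the design

# Example: stream a buffer through a (hypothetical) AXI DMA in the design
dma = overlay.axi_dma_0
in_buf = allocate(shape=(1024,), dtype=np.uint32)
out_buf = allocate(shape=(1024,), dtype=np.uint32)
in_buf[:] = np.arange(1024, dtype=np.uint32)

dma.sendchannel.transfer(in_buf)
dma.recvchannel.transfer(out_buf)
dma.sendchannel.wait()
dma.recvchannel.wait()
print("first results:", out_buf[:8])
```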
OASIS – ROS 2 based Smart Home operating system integrates with Kodi
OASIS is a Smart Home operating system based on ROS 2 that currently implements computer vision, input streaming, and general automation features, and can be integrated into the Kodi media center. The operating system was recently released by Garrett Brown (a.k.a. garbear or eigendude), who is also known as the RetroPlayer developer from Team Kodi/XBMC, and provides a complete implementation of the Firmata protocol for communicating with Arduino boards, plus additional support for temperature and humidity sensors, I2C, servos, sonar, SPI, stepper motors, and 4-wire CPU fans. The two main use cases at this time are computer vision and input streaming. The illustration above shows the former with the Kinect 2 driver ported to ROS 2, a background subtractor running on all camera feeds using the bgslibrary C++ background subtraction library, and Kodi as the visual interface. The second, input streaming, can be seen below with a Lego train (including a Falcon spaceship!) […]
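Firmata is a standard serial protocol for driving Arduino pins from a host computer, which is what OASIS builds on. As a generic illustration (not OASIS's own stack), the snippet below uses the pyFirmata client to toggle a pin and read an analog input; the serial port and pin numbers are assumptions.

```python
# Generic Firmata example using the pyFirmata client library; OASIS ships its
# own Firmata implementation, so this is only an illustration of the protocol.
# The serial port and pin numbers are assumptions.
import time
from pyfirmata import Arduino, util

board = Arduino("/dev/ttyACM0")          # Arduino running StandardFirmata

# Start an iterator thread so analog readings get updated in the background
it = util.Iterator(board)
it.start()

led = board.get_pin("d:13:o")            # digital pin 13 as output
sensor = board.get_pin("a:0:i")          # analog pin A0 as input
sensor.enable_reporting()

for _ in range(10):
    led.write(1)                          # LED on
    time.sleep(0.5)
    led.write(0)                          # LED off
    time.sleep(0.5)
    print("A0 reading:", sensor.read())   # normalized 0.0-1.0, or None at first

board.exit()
```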
Mini Pupper – Raspberry Pi 4-based robot dog teaches ROS, SLAM, navigation, computer vision (Crowdfunding)
Mini Pupper is a Raspberry Pi 4-powered robot dog inspired by the Stanford Pupper open-source quadruped robot, and designed in “light collaboration” with Nathan Kau, the original creator of Stanford Pupper. Just like the original design, MangDang’s Mini Pupper is open-source, based on Ubuntu and ROS (Robot Operating System), and designed for robotics education in schools, homeschooling families, enthusiast groups, and elsewhere, notably allowing students to learn how to use ROS, SLAM, navigation, and OpenCV computer vision through online courses that will come with the robot.

Mini Pupper key features and specifications:
- SBC – Raspberry Pi 4 Model B with 2GB RAM
- Storage – 2GB microSD card
- Display – 320×240 LCD for facial animation
- Camera – Support for OpenCV AI Kit Lite
- 12 DOF via MangDang’s custom servos
- Optional Lidar module for SLAM (Simultaneous Localization and Mapping)
- Battery – 800 mAh
- Charger
  - Input voltage – 100-240V AC 50/60Hz, […]
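As a taste of the OpenCV side of that curriculum, the short script below captures frames from a camera and runs Canny edge detection. It is a generic OpenCV example rather than material from MangDang's courses, and the camera index is an assumption.

```python
# Generic OpenCV example: grab frames from a camera and show Canny edges.
# Not taken from MangDang's course material; camera index 0 is assumed.
import cv2

cap = cv2.VideoCapture(0)                 # first camera device
if not cap.isOpened():
    raise SystemExit("Could not open camera")

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 80, 160)      # thresholds chosen arbitrarily
    cv2.imshow("edges", edges)
    if cv2.waitKey(1) & 0xFF == ord("q"): # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```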