Femto Mega 3D depth camera review – OrbbecViewer and Orbbec SDK in Ubuntu 22.04, body tracking in Windows 11

Orbbec Femto Mega Review

We had a quick look at the Orbbec Femto Mega 3D depth and 4K RGB camera at the end of last year, covering the hardware and giving the OrbbecViewer program a quick try in Ubuntu 22.04. I’ve now had the time to test OrbbecViewer in more detail, check out the Orbbec SDK and its various samples in Linux, and finally test the Femto Mega with a body tracking application using Unity in Windows 11.

A closer look at OrbbecViewer program and settings

As noted in the first part of the review, the OrbbecViewer program provides color, depth, and IR views for the camera. Let’s go into details for each.

OrbbecViewer Color

The Color mode works like a standard USB color camera. The Femto Mega supports resolutions from 1280×720 at 30 fps to 3840×2160 at 25 fps using MJPG, H.264, H.265, or RGB (converted from MJPG). We can flip/mirror the image, and there are the usual camera parameters such as white balance, contrast, saturation, and so on.

OrbbecViewer Depth Mode Options

The Depth mode – which I would call the “Fun” mode – represents depth data with colors ranging from red (far) to dark blue (close), although this can be reversed with the “Jet Inv” colormap. The resolution can be set from 320×288 at 30 fps to 1024×1024 at 15 fps using the Y16 format. Other parameters include the selection of the color map, optional preprocessing options, and the visual range, which can be set from 0 to 12,000 mm, with the default being 100 mm to 5,000 mm.

(Embedded video: OrbbecViewer depth mode screencast)

You can see how the color changes within a room and when an object (yours truly) moves elegantly inside the room in the screencast above.

OrbbecViewer Infrared IR mode

The Infrared (IR) view also supports resolutions from 320×288 at 30 fps to 1024×1024 at 15 fps using the Y16 format. IR mode can be set to active or passive, and the default color map is gray, but you can also switch it to “Jet” or “Jet Inv” for an effect similar to the depth mode. The visual range can also be set from 0 to 12,000 mm.

The IMU section did not work in OrbbecViewer 1.8.3, but once I updated the firmware to version 1.2.8 and the application to version 1.9.3 (see below), both the accelerometer and gyroscope data would show properly.

Orbbec Femto Mega IMU data

This data could be useful when the camera is mounted on a robot or to detect vibration when mounted on a machine.

Femto Mega Pointcloud XYZ
Pointcloud style: xyz

The Pointcloud is part of the Advanced settings and shows both color and depth data with X, Y, and Z coordinates. There are 2D and 3D controls, an ROI (region of interest) range, and the ability to record and play back 2D data.

Femto Mega Pointcloud XYZRGB
Pointcloud style: xyzrgb

Switching to xyzrgb style will show objects within the D2C ROI range.

Testing the Femto Mega with the Orbbec SDK in Ubuntu 22.04

The OrbbecViewer is nice to evaluate and play with the Femto Mega camera, but if you’re going to integrate the camera into your application, you’ll need to use the Orbbec SDK.

The documentation is distributed as Markdown (MD) files, which may be fine for Windows users, but it’s not the smartest move for Linux users. I had to spend close to an hour finding a program that renders the documentation properly with its illustrations, and Typora does the trick (to some extent):


That part of the documentation is somewhat OK, but I’d consider the overall documentation to be mediocre at best. There’s plenty of it, but it’s not always correct, and sometimes clearly incomplete or confusing.

I’ll be using the Khadas Mind mini PC with Ubuntu 22.04 to check out the Orbbec SDK. The documentation also mentions:

Note: supported Arm platforms: Jetson nano (arm64), AGX Orin(arm64), Orin NX (arm64), Orin Nano (arm64), A311D (arm64), Raspberry Pi 4 (arm64), Raspberry Pi 3 (arm32), rk3399 (arm64), other Arm systems may need to Cross-compile.

So I had planned to test it on the Raspberry Pi 5 as well, but I wasted a lot of time on one specific part of this Femto Mega review (the body tracking sample), and due to time constraints I had to skip that part.

Before installing the Orbbec SDK, we’ll need to install dependencies (some not listed in the docs):
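As a rough guide, the usual suspects on Ubuntu 22.04 are the build toolchain, libusb, and OpenCV (used by the viewer samples); the exact list may differ on your system:

sudo apt update
sudo apt install -y build-essential cmake git libusb-1.0-0-dev libopencv-dev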


After downloading the Orbbec C/C++ SDK 1.8.3, I unzipped it and ran a script to install udev rules so that normal users can access the camera:
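That boils down to something like this; the archive name and the exact location of the udev rules script vary between SDK releases, so adjust the paths accordingly:

unzip OrbbecSDK_C_C++_v1.8.3_*.zip && cd OrbbecSDK*/
sudo bash ./install_udev_rules.sh
sudo udevadm control --reload-rules && sudo udevadm trigger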


The USBFS buffer size needs to be set to 128MB to process high-resolution images from the camera, but by default, it is set to 16MB:
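You can check the current value with:

cat /sys/module/usbcore/parameters/usbfs_memory_mb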


This can be changed to 128MB temporarily (until the next reboot):
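For example:

sudo sh -c 'echo 128 > /sys/module/usbcore/parameters/usbfs_memory_mb'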


or permanently by adding parameters to grub, for example (the docs have the complete procedure to follow):
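Concretely, this means appending usbcore.usbfs_memory_mb=128 to the kernel command line in /etc/default/grub and regenerating the GRUB configuration:

# In /etc/default/grub:
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash usbcore.usbfs_memory_mb=128"
# then apply it:
sudo update-grub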


Now we can prepare the build for the examples:
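That’s the standard out-of-tree CMake build, run from the samples folder of the extracted SDK:

cd Examples        # or Example/, depending on the SDK release
mkdir build && cd build
cmake ..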


Output:


If you get errors here, you may have to install extra dependencies.

Let’s build the samples:
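A parallel build makes quick work of it:

make -j$(nproc)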


It just took 5 seconds to compile on my system, and we have a range of samples:


The description for each is available in the README, and the documentation has more details explaining each function:

  • HelloOrbbec – Demonstrates connecting to a device to get the SDK version and device information
  • DepthViewer – Demonstrates using the SDK to get depth data, get/set the resolution, and display the depth image
  • ColorViewer – Demonstrates using the SDK to get color data, get/set the resolution, and display the color image
  • InfraredViewer – Demonstrates using the SDK to get infrared data, get/set the resolution, and display the infrared image
  • DoubleInfraredViewer – Demonstrates obtaining the left and right IR data of binocular cameras
  • SensorControl – Demonstrates device operations and sensor control commands
  • DepthWorkMode – Demonstrates getting the current depth work mode, obtaining the list of supported depth work modes, and switching between them
  • Hotplugin – Demonstrates device hot-plug monitoring: depth streaming is automatically opened when the device comes online, and the device is automatically disconnected when it goes offline
  • PointCloud – Demonstrates generating a depth point cloud or RGBD point cloud and saving it as a PLY file
  • NetDevice – Demonstrates the acquisition of depth and color data over the network
  • FirmwareUpgrade – Demonstrates upgrading the device firmware
  • CommonUsages – Demonstrates setting and getting commonly used control parameters
  • IMUReader – Gets IMU data and displays it
  • SyncAlignViewer – Demonstrates operations on sensor data stream alignment

Most of the samples are available in both C and C++, but some have only been written in one of the two languages.

Let’s run the most basic sample:
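The sample binaries end up in the build output directory. The path and name below are an assumption, so use whatever the build actually produced:

./bin/HelloOrbbec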


This shows the device name, firmware version, serial number, connection type, and sensors.

For reference, here’s the C source code for the “hello orbbec” program in Example/c/Sample-HelloOrbbec:
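It boils down to something like the minimal sketch below. Treat it as an approximation: the function names follow the SDK 1.8.x C headers as I remember them, so double-check them against the actual sample before reusing any of it.

#include <stdio.h>
#include <stdlib.h>
#include <libobsensor/ObSensor.h>

/* Print and bail out on any SDK error */
static void check_error(ob_error *error) {
    if(error) {
        printf("Error: %s\n", ob_error_message(error));
        ob_delete_error(error);
        exit(EXIT_FAILURE);
    }
}

int main(void) {
    ob_error *error = NULL;

    /* SDK version compiled into the library */
    printf("SDK version: %d.%d.%d\n", ob_get_major_version(), ob_get_minor_version(), ob_get_patch_version());

    /* Create a context and enumerate connected devices */
    ob_context *ctx = ob_create_context(&error);
    check_error(error);
    ob_device_list *dev_list = ob_query_device_list(ctx, &error);
    check_error(error);
    if(ob_device_list_device_count(dev_list, &error) == 0) {
        printf("No Orbbec device found!\n");
        return EXIT_FAILURE;
    }

    /* Open the first device and print its basic information */
    ob_device *dev = ob_device_list_get_device(dev_list, 0, &error);
    check_error(error);
    ob_device_info *dev_info = ob_device_get_device_info(dev, &error);
    check_error(error);
    printf("Device name: %s\n", ob_device_info_name(dev_info, &error));
    printf("Firmware version: %s\n", ob_device_info_firmware_version(dev_info, &error));
    printf("Serial number: %s\n", ob_device_info_serial_number(dev_info, &error));

    /* Release SDK objects */
    ob_delete_device_info(dev_info, &error);
    ob_delete_device(dev, &error);
    ob_delete_device_list(dev_list, &error);
    ob_delete_context(ctx, &error);
    return 0;
}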


And the C++ source code can be found in Example/cpp/Sample-HelloOrbbec:


I wanted to test the NetDevice application next since I didn’t try the Ethernet port with OrbbecViewer. The IP address still needs to be set in OrbbecViewer since DHCP did not work in version 1.8.3, so I had to use a fixed IP address. Note that this issue is fixed after the firmware update described below.

We can run the application and enter the IP address we’ve defined in OrbbecViewer:


This will start a new window called “MultiDeviceViewer” with a representation of depth data.

Femto Mega Ethernet NetDevice demo

Firmware upgrade

Since the camera runs an older firmware (version 1.1.5), I decided to upgrade it to the more recent version 1.2.8 released on GitHub just a few days ago.


We get a directory with a bunch of files, but the sample in the SDK does not seem to be designed for this as it’s expecting a bin file instead of a directory…


Playing it safe, I decided to do the update with the latest version of OrbbecViewer (1.9.3) and follow the instructions provided on GitHub. We need a micro USB to USB-A cable to do the update. It’s not provided with the kit, so I used the charging cable that came with my headphones. I connected the micro USB cable to the Khadas Mind, inserted a needle in the “Registration Pin” pinhole, and connected the USB-C cable for power, before removing the needle. But nothing happened.

After several tries, I decided to look for another micro USB cable, and this time OrbbecViewer “found a device in upgrade mode”. It probably failed the first time because my headphones’ micro USB cable likely lacks data wires… I wasted a little over 30 minutes on this.

OrbbecViewer Firmware Upgrade

I then selected “one firmware file”, which really means “one firmware directory”, and could complete the update successfully in two minutes.

Orbbec Femto Mega Firmware Upgrade FW Version 1.2.8

After the update, I could confirm the camera was still working normally, and noted some improvements such as DHCP now working and the IMU being properly supported.

Femto Mega’s Body Tracking sample with Unity (Windows 11)

I was initially expecting to be able to run high-level samples such as body segmentation with the Orbbec SDK, but the samples are pretty much all low-level ones showing how to get data from the Femto Mega camera, rather than doing something useful with it.

The Femto Mega, and some other Orbbec cameras, are compatible with Microsoft Azure Kinect, and Microsoft provides several samples including a Body Tracking demo that works with Unity and can enable applications such as people counting, fall detection, and so on. But we’re told the following:

Due to the samples being provided from various sources they may only build for Windows or Linux. To keep the barrier for adding new samples low we only require that the sample works in one place.

The body tracking sample looks like it only works in Windows since it involves running a bat script, and no other script is present. So I’ll reboot the Khadas Mind into Windows 11… and give it a try.

The first step of the instructions read as follows:

Open the sample_unity_bodytracking project in Unity. Open the Visual Studio Solution associated with this project. If there is no Visual Studio Solution yet you can make one by opening the Unity Editor and selecting one of the csharp files in the project and opening it for editing. You may also need to set the preferences->External Tools to Visual Studio

So I installed the latest versions of Visual Studio (2022 CE) and the Unity Editor (2022). But when I tried to add the Body Tracking demo project to Unity Hub, a “Missing Editor Version” warning popped up telling me to install Unity Editor 2019.1.2f1 instead…

Unity Hub Missing Editor Version

Oh well. So I did that, and it also installed Visual Studio 2017 automatically. I then removed both Unity 2022 and Visual Studio 2022 CE, which I had just installed, to avoid potential conflicts.

Femto Mega Body Tracking Unity Open Project

I could open the project in Unity. The next step was to find a C# file and click on it to open Visual Studio. But there was no “Solution” in Visual Studio, and it took me a little while to figure out what that meant. I had to go to the top menu to create a New Project, select “Visual Studio Solutions”, and give it a name.

Visual Studio Create Solution

Once done, I could add the body tracking sample to the Solution. At this point, I’m supposed to go to Tools->NuGet Package Manager->Package Manager Console, but there’s nothing called NuGet Package Manager. That’s because it needs to be installed via Tools->Get Tools and Features. All good. I could then start the console and type:
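The command pulls in Microsoft’s Azure Kinect Body Tracking NuGet package; the version below is a placeholder, so use whichever version the sample’s README asks for:

Install-Package Microsoft.Azure.Kinect.BodyTracking -Version 1.1.0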


The first time, it failed because I had not created a Solution. Then something failed again (not sure whether in Unity or Visual Studio, that was 10 days ago…) because of path length limitations in Windows… I had checked out the GitHub repo in Documents/Orbbec, so I extracted the sample project into a directory in C:\ to shorten the path, and that part then worked…

The next part of the instructions is a bit insane. You have to copy a bunch of files (around 30) from various directories, but luckily the “MoveLibraryFile.bat” script is supposed to help with that. The Visual C++ Redistributable must be installed too, and some other files must be copied manually.

The “\sample_unity_bodytracking\Assets\Scripts\SkeletalTrackingProvider.cs” file supports several tracking modes for rendering on the host machine. It is set to TrackerProcessingMode.GPU, which means DirectML in Windows, and I did not change that initially. The final step is to open the Unity project, select Kinect4AzureSampleScene under Scenes, and click on the play button at the top. In theory, I can now dance in front of the camera and the system will track my movements.

Unity K4A_RESULT_FAILED

But it failed with the error “catching exception for background thread result = K4A_RESULT_FAILED” and nothing happened. The error message is pretty much useless, and I failed to find a solution, so I asked on the Orbbec forums and was told to use the CPU renderer instead:
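In practice, that means editing the tracker configuration in SkeletalTrackingProvider.cs along these lines (the surrounding code is paraphrased rather than copied; only the ProcessingMode value is the actual change):

// SkeletalTrackingProvider.cs: run the body tracker on the CPU instead of DirectML/GPU
TrackerConfiguration trackerConfiguration = new TrackerConfiguration()
{
    ProcessingMode = TrackerProcessingMode.Cpu,   // was TrackerProcessingMode.Gpu
    SensorOrientation = SensorOrientation.Default
};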


I did that, but still no luck. So I printed the list of files from the documentation and checked them one by one. All were there. I eventually contacted the PR company that arranged the sample and was put in contact with Orbbec engineers. After a bit of back-and-forth, I received the project folder from one of the engineers and could make it work.

Orbbec Femto Mega Unity Body Tracking Success

I was initially confused because of the Timeout error shown above. After clicking on Play, I waved my hand in front of the camera and nothing happened, so I assumed it did not work. But it turns out the timeout message can be safely ignored: it takes a while for the program to start (about 5 seconds on the Khadas Mind Premium), and you need to be at an adequate distance from the camera for tracking to work properly.

Orbbec Femto Mega Body Tracking

CPU rendering is a bit slow (so there’s a lag), but Orbbec confirmed GPU rendering is not working right now. You can see some of my Kung Fu moves and a Wai greeting in the video below.

(Embedded video: Femto Mega body tracking demo in Unity)

The issue with my project was that I had clicked the wrong link in Orbbec’s documentation, which led me to Microsoft’s official Azure Kinect samples instead of the fork by Orbbec. The fun part is that I had installed the Microsoft sample in Windows while reading the documentation from the right repo in Linux… Anyway, it’s working now. I just wish the overall documentation were clearer and the error messages in Unity more relevant… The body tracking sample can serve as a starting point to develop fall detection solutions or games based on the Femto Mega 3D depth camera.

I’d like to thank Orbbec for sending the Femto Mega 3D depth camera for review. The camera can be purchased on Amazon for $909, but at this time it’s cheaper to purchase it on Orbbec’s website for $694.99.
