reServer Jetson-50-1-H4 is an AI Edge server powered by NVIDIA Jetson AGX Orin 64GB

reServer Jetson-50-1-H4 is an AI inference edge server powered by the Jetson AGX Orin 64GB module with up to 275 TOPS of AI performance. It is based on the same form factor as Seeed Studio’s reServer 2-bay multimedia NAS introduced last year, which was built around an Intel Core Tiger Lake single board computer.

The 12-core Arm server comes with 64GB LPDDR5, a 256GB NVMe SSD pre-loaded with the JetPack SDK and the open-source Triton Inference Server, two SATA bays for 2.5-inch and 3.5-inch drives, up to 10Gbps Ethernet, dual 8K video output via HDMI and DisplayPort, USB 3.2 ports, and more.

Jetson AGX Orin 64GB AI inference server

reServer Jetson-50-1-H4 (preliminary) specifications:

  • SoM – Jetson AGX Orin module with
    • CPU – 12-core Arm Cortex-A78AE v8.2 64-bit processor with 3MB L2 + 6MB L3 cache
    • GPU / AI accelerators
      • NVIDIA Ampere architecture with 2048 NVIDIA CUDA cores and 64 Tensor Cores @ 1.3 GHz
      • DL Accelerator – 2x NVDLA v2.0
      • Vision Accelerator – PVA v2.0 (Programmable Vision Accelerator)
      • AI Performance – Up to 275 TOPS (INT8) @ 60W
    • Video Encode – 2x 4K60 | 4x 4K30 | 8x 1080p60 | 16x 1080p30 (H.265)
    • Video Decode – 1x 8K30 | 3x 4K60 | 7x 4K30 | 11x 1080p60 | 22x 1080p30 (H.265)
    • System Memory – 64GB 256-bit LPDDR5 @ 204.8 GB/s
    • Storage – 64GB eMMC 5.1 flash
  • Storage – 256GB M.2 NVMe SSD, 2x SATA ports and bays for 2.5-inch and 3.5-inch drives
  • Video Output – HDMI 2.1 and DisplayPort 1.4a up to 8Kp60 resolution
  • Networking
    • 10Gbps Ethernet RJ45 port
    • Gigabit Ethernet RJ45 port
    • WiFi and Bluetooth
    • M.2 B-Key for LoRa, 4G LTE, 5G
  • USB – 3x USB 3.2 Type-A ports, 1x USB 3.2 Type-C port
  • Debugging – UART header for serial console
  • Misc – Passive heat dissipation with a large vapor chamber heat sink, fan connector (a fan appears to be included at the bottom), buzzer, RTC, AutoPwrOn jumper
  • Power Supply – 24V via DC power jack
  • Dimensions – 233 x 132 x 124 mm

reServer Jetson-50-1-H4 hard drives

The reServer Jetson-50-1-H4 inference server will ship with the Jetson AGX Orin 64GB module as well as a 24V power adapter.

I had never heard of the Triton Inference Server before, and here’s how NVIDIA describes it:

Triton Inference Server streamlines AI inference by enabling teams to deploy, run and scale trained AI models from any framework on any GPU- or CPU-based infrastructure. It provides AI researchers and data scientists the freedom to choose the right framework for their projects without impacting production deployment. It also helps developers deliver high-performance inference across cloud, on-prem, edge, and embedded devices.

(It) supports all major frameworks, such as TensorFlow, NVIDIA TensorRT, PyTorch, MXNet, Python, ONNX, RAPIDS™ FIL (for XGBoost, scikit-learn, etc.), OpenVINO, custom C++, and more.
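
For readers like me who haven’t used it, here is a minimal sketch of what a client-side request to a Triton server could look like with the official tritonclient Python package. The server address, model name (“resnet50”), and tensor names (“input__0”, “output__0”) are hypothetical placeholders for illustration, not details published for the reServer Jetson-50-1-H4.

# Minimal sketch of a Triton HTTP inference request; model and tensor names are hypothetical.
import numpy as np
import tritonclient.http as httpclient

# Connect to a Triton server assumed to listen on its default HTTP port.
client = httpclient.InferenceServerClient(url="localhost:8000")

# Build a dummy input batch for a hypothetical image classification model.
data = np.random.rand(1, 3, 224, 224).astype(np.float32)
infer_input = httpclient.InferInput("input__0", list(data.shape), "FP32")
infer_input.set_data_from_numpy(data)

# Request the output tensor by name and run inference on the server.
infer_output = httpclient.InferRequestedOutput("output__0")
response = client.infer("resnet50", inputs=[infer_input], outputs=[infer_output])

# The result comes back as a NumPy array.
print(response.as_numpy("output__0").shape)

The same client code works regardless of whether the model behind the server runs on a TensorRT, PyTorch, TensorFlow, or ONNX backend, which is the “any framework” point in NVIDIA’s description.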

Seeed Studio lists several applications, including SOHO servers, industrial automation, robotics, healthcare, smart agriculture, and NAS. I can’t imagine anybody using it as an office server or NAS, as it will be pricey. The company has not provided pricing yet and says it is currently beta hardware, but considering the NVIDIA Jetson AGX Orin Developer Kit sells for about $2,000, and the module itself for $1,599 (in 1K quantities), the reServer Jetson-50-1-H4 should cost around $2,500 or more at launch. More details may soon become available on the Seeed Studio store.




2 Replies to “reServer Jetson-50-1-H4 is an AI Edge server powered by NVIDIA Jetson AGX Orin 64GB”

  1. I wondered what “Edge” means in this context. I found the below. Is that it? So “Edge” just means a local computer, and not cloud?

    “Edge computing is the process of making computation and data storage more accessible to users. It runs operations on local devices such as a computer, an Internet of Things (IoT) device, or a dedicated edge server. Unlike the cloud, it is unaffected by latency and bandwidth issues that can hamper performance.”

    1. Yes, that’s correct. Edge devices are usually gateways, SBCs, or computers handling data from sensors.

      You may also see reference to the “very edge”, albeit less frequently for now, which means processing happens directly on sensors instead of Edge devices.

