NVIDIA Jetson Orin Nano Creates New Standard For Entry-Level Edge AI & Robotics

Jason R. Wilson

NVIDIA announced the Jetson Orin Nano series of system-on-modules (SOMs) at this year's GTC 2022. The Jetson Orin Nano sets a new bar for entry-level robotics and AI applications, delivering up to 80 times the AI performance of the original NVIDIA Jetson Nano series.
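
For a rough sense of where a headline multiplier like that can come from, here is a back-of-the-envelope sketch in Python. The basis of the comparison is an assumption on our part, not something NVIDIA spells out here: it pits the Orin Nano 8GB's peak sparse INT8 TOPS against the original Jetson Nano's widely quoted ~0.472 TFLOPS of FP16 compute, so the two figures are not measured at the same precision.

```python
# Back-of-the-envelope check of the "up to 80x" headline figure.
# Assumption (not stated by NVIDIA here): the comparison is peak Orin Nano 8GB
# sparse INT8 throughput vs. the original Jetson Nano's ~0.472 TFLOPS (FP16).
original_jetson_nano_tflops = 0.472   # original Jetson Nano, FP16 (widely quoted)
orin_nano_8gb_sparse_tops = 40.0      # Jetson Orin Nano 8GB, INT8 with sparsity

speedup = orin_nano_8gb_sparse_tops / original_jetson_nano_tflops
print(f"~{speedup:.0f}x peak throughput")   # roughly 85x, in line with "up to 80x"
```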

NVIDIA Jetson Orin Nano brings developers entry-level edge AI at lower cost and with better performance

The NVIDIA Jetson Orin lineup now spans the company's Orin-based modules from the Jetson Orin Nano up to the Jetson AGX Orin, allowing more customers to scale their projects with ease using the company's Jetson AGX Orin Developer Kit.


As AI evolves and more industries require real-time processing, NVIDIA recognizes the demand for capable edge computing that delivers lower latency while remaining efficient, inexpensive, and compact.

The company intends to sell Jetson Orin Nano production modules starting in January 2023, at an entry-level price of $199. The new modules deliver up to 40 TOPS of AI performance in the smallest Jetson form factor, with configurable power consumption between 5W and 15W, and come in two memory configurations: the Jetson Orin Nano 4GB and 8GB.
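
To put those power envelopes in context, the small sketch below works out rough peak efficiency (sparse TOPS per watt) from the figures NVIDIA quotes. The per-module power ranges follow the spec table later in this article, and these are peak ratings rather than measured efficiency.

```python
# Rough peak efficiency from the quoted figures: sparse INT8 TOPS divided by
# the configurable power range of each module (peak ratings, not measurements).
modules = {
    "Jetson Orin Nano 4GB": {"sparse_tops": 20, "power_w": (5, 10)},
    "Jetson Orin Nano 8GB": {"sparse_tops": 40, "power_w": (7, 15)},
}

for name, spec in modules.items():
    low_w, high_w = spec["power_w"]
    tops = spec["sparse_tops"]
    print(f"{name}: {tops / high_w:.1f} to {tops / low_w:.1f} TOPS/W (peak, sparse)")
```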

The NVIDIA Jetson Orin Nano is built around an Ampere-architecture GPU with up to eight streaming multiprocessors, totaling 1,024 CUDA cores and 32 Tensor Cores on the 8GB module, for processing AI workloads. The Ampere Tensor Cores improve performance per watt and add support for structured sparsity, which can double Tensor Core throughput (see the sketch after the list below). The module also carries a six-core Arm Cortex-A78AE CPU alongside a video decode engine, an image compositor, an ISP, an audio processing engine, and a video input block, plus:

  • As many as seven PCIe Gen3 lanes
  • Three 10Gbps USB 3.2 Gen2 connection ports
  • Eight MIPI CSI-2 camera port lanes
  • Numerous sensor inputs and outputs
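
The doubled Tensor Core throughput comes from 2:4 structured sparsity, which an application has to opt into. As a minimal sketch, assuming TensorRT 8.x on JetPack and a model whose weights have already been pruned to the 2:4 pattern (the file names below are placeholders), enabling sparse kernels at engine-build time might look like this:

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.INFO)

def build_sparse_engine(onnx_path: str, engine_path: str) -> None:
    """Build a TensorRT engine that may use 2:4 structured-sparse kernels."""
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)

    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError("failed to parse ONNX model")

    config = builder.create_builder_config()
    # Let TensorRT pick 2:4 structured-sparse Tensor Core kernels where the
    # (already pruned) weights allow it, and permit FP16 kernels as well.
    config.set_flag(trt.BuilderFlag.SPARSE_WEIGHTS)
    config.set_flag(trt.BuilderFlag.FP16)

    engine_bytes = builder.build_serialized_network(network, config)
    if engine_bytes is None:
        raise RuntimeError("engine build failed")
    with open(engine_path, "wb") as f:
        f.write(engine_bytes)

if __name__ == "__main__":
    # Placeholder file names; substitute a real 2:4-pruned model.
    build_sparse_engine("model_pruned_2to4.onnx", "model_sparse.engine")
```

Whether TensorRT actually selects sparse kernels depends on the layer shapes and on the weights genuinely matching the 2:4 pattern; otherwise it silently falls back to dense kernels.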

Another benefit is that the Jetson Orin Nano and Jetson Orin NX modules share the same form factor and are pin-compatible.

| Specification | Jetson Orin Nano 4GB | Jetson Orin Nano 8GB |
| --- | --- | --- |
| AI Performance | 20 Sparse TOPS / 10 Dense TOPS | 40 Sparse TOPS / 20 Dense TOPS |
| GPU | 512-core NVIDIA Ampere architecture GPU with 16 Tensor Cores | 1,024-core NVIDIA Ampere architecture GPU with 32 Tensor Cores |
| GPU Max Frequency | 625 MHz | 625 MHz |
| CPU | 6-core Arm Cortex-A78AE v8.2 64-bit CPU, 1.5 MB L2 + 4 MB L3 | 6-core Arm Cortex-A78AE v8.2 64-bit CPU, 1.5 MB L2 + 4 MB L3 |
| CPU Max Frequency | 1.5 GHz | 1.5 GHz |
| Memory | 4GB 64-bit LPDDR5, 34 GB/s | 8GB 128-bit LPDDR5, 68 GB/s |
| Storage | (Supports external NVMe) | (Supports external NVMe) |
| Video Encode | 1080p30 supported by 1-2 CPU cores | 1080p30 supported by 1-2 CPU cores |
| Video Decode | 1x 4K60 (H.265), 2x 4K30 (H.265), 5x 1080p60 (H.265), 11x 1080p30 (H.265) | 1x 4K60 (H.265), 2x 4K30 (H.265), 5x 1080p60 (H.265), 11x 1080p30 (H.265) |
| Camera | Up to 4 cameras (8 via virtual channels*), 8 lanes MIPI CSI-2, D-PHY 2.1 (up to 20 Gbps) | Up to 4 cameras (8 via virtual channels*), 8 lanes MIPI CSI-2, D-PHY 2.1 (up to 20 Gbps) |
| PCIe | 1 x4 + 3 x1 (PCIe Gen3, Root Port & Endpoint) | 1 x4 + 3 x1 (PCIe Gen3, Root Port & Endpoint) |
| USB | 3x USB 3.2 Gen2 (10 Gbps), 3x USB 2.0 | 3x USB 3.2 Gen2 (10 Gbps), 3x USB 2.0 |
| Networking | 1x GbE | 1x GbE |
| Display | 1x 4K30 multimode DisplayPort 1.2 (+MST) / eDP 1.4 / HDMI 1.4* | 1x 4K30 multimode DisplayPort 1.2 (+MST) / eDP 1.4 / HDMI 1.4* |
| Other I/O | 3x UART, 2x SPI, 2x I2S, 4x I2C, 1x CAN, DMIC & DSPK, PWM, GPIOs | 3x UART, 2x SPI, 2x I2S, 4x I2C, 1x CAN, DMIC & DSPK, PWM, GPIOs |
| Power | 5W – 10W | 7W – 15W |
| Mechanical | 69.6 mm x 45 mm, 260-pin SO-DIMM connector | 69.6 mm x 45 mm, 260-pin SO-DIMM connector |
| Price | $199 | $299 |
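
The memory-bandwidth rows follow directly from bus width and data rate. The sketch below reproduces them, with the ~4266 MT/s LPDDR5 data rate inferred from the quoted 34 GB/s and 68 GB/s figures rather than stated by NVIDIA in this announcement.

```python
# Sanity check of the memory-bandwidth rows above: peak bandwidth equals
# bus width (in bytes) times the per-pin data rate.
def peak_bandwidth_gbps(bus_width_bits: int, data_rate_mtps: float) -> float:
    """Peak DRAM bandwidth in GB/s for a given bus width and data rate."""
    return (bus_width_bits / 8) * data_rate_mtps / 1000

# LPDDR5 data rate inferred from the quoted figures (an assumption).
print(peak_bandwidth_gbps(64, 4266))    # ~34 GB/s  -> Jetson Orin Nano 4GB
print(peak_bandwidth_gbps(128, 4266))   # ~68 GB/s  -> Jetson Orin Nano 8GB
```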

The Jetson AGX Orin Developer Kit can emulate any of the Jetson Orin modules, including the Orin Nano series, so developers can start working in the new environment with NVIDIA JetPack before the production modules ship.

As you can see in the following two charts, the Jetson Orin Nano series was pitted against its predecessors in demanding AI inference workloads to show the difference in performance and efficiency. The first chart shows the generational difference in frames per second, while the second shows inferences per second for the four configurations tested. The Orin Nano 8GB delivers up to a 30x performance increase over the original Jetson Nano, and NVIDIA says it expects that figure to climb to roughly 45x with future software updates.

[Charts: generational FPS comparison and AI inference performance (inferences per second)]

Various frameworks will be available for developers, including:

  • NVIDIA Isaac (robotics)
  • NVIDIA DeepStream (vision AI)
  • NVIDIA Riva (conversational AI)
  • NVIDIA Omniverse Replicator (Synthetic Data Generation, or SDG)
  • NVIDIA TAO Toolkit (optimizing pre-trained AI models)

Developers wishing to learn more can visit the Jetson AGX Orin Developer Kit page for additional details and available resources.

News Source: NVIDIA Developer blog
