Key Features
- High Performance: Equipped with an NVIDIA Ampere GPU featuring 1,024 CUDA cores and 32 Tensor cores, delivering up to 67 TOPS (trillion operations per second) of AI performance.
- Enhanced Memory Bandwidth: Offers a 50% increase in memory bandwidth (102 GB/s) over its predecessor, enabling smoother AI workloads.
- Affordable Pricing: Priced at US$249, making it accessible to hobbyists, students, and developers.
- Comprehensive Software Support: Works with NVIDIA's AI software ecosystem, including frameworks like NVIDIA Isaac for robotics and NVIDIA Metropolis for vision AI.
The Jetson Orin Nano Super pairs its GPU with fast unified memory, ensuring high-speed AI model execution:
- GPU: NVIDIA Ampere architecture, 1,024 CUDA cores, 32 Tensor cores.
- Memory: 8GB LPDDR5 (shared between the CPU and GPU), 102GB/s memory bandwidth.
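Those memory numbers directly bound which models you can run. The sketch below (a hypothetical helper, not an NVIDIA tool; the 2 GiB system reservation is an assumption) estimates whether a model's weights fit at different precisions:

```python
# Rough sizing sketch: estimate whether an LLM's weights fit in the
# Jetson's unified memory at different precisions. The 2 GiB reserved
# for the OS and CUDA context below is an assumption, not a measurement.
def model_footprint_gib(params_billion: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GiB for a given parameter count."""
    return params_billion * 1e9 * bytes_per_param / (1024 ** 3)

TOTAL_MEMORY_GIB = 8               # Orin Nano Super developer kit
BUDGET_GIB = TOTAL_MEMORY_GIB - 2  # leave headroom for the OS and CUDA

for name, params in [("1.5B", 1.5), ("7B", 7.0), ("13B", 13.0)]:
    fp16 = model_footprint_gib(params, 2.0)  # 16-bit weights
    int4 = model_footprint_gib(params, 0.5)  # 4-bit quantized weights
    print(f"{name}: fp16={fp16:.1f} GiB, int4={int4:.1f} GiB, "
          f"int4 fits: {int4 <= BUDGET_GIB}")
```

In practice this is why small (1.5B-8B) or 4-bit-quantized models are the sweet spot for this board: a 7B model needs roughly 13 GiB at fp16 but only about 3.3 GiB at 4-bit.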
Now that we understand its power, let’s get hands-on by setting it up from scratch!
Setting Up
Step 1: Preparing a Client Machine
To set up your Jetson Orin Nano Super, you need a client machine to flash the OS. The recommended setup:
- Operating System: Ubuntu 20.04 or 22.04 (a native Ubuntu machine is recommended)
- Installed Software:
  - NVIDIA SDK Manager (for flashing the JetPack OS)
  - Balena Etcher (alternative method for flashing SD cards)
- Note: SDK Manager is required to install essential drivers, CUDA, TensorRT, and AI software.
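A quick way to confirm the client machine meets this requirement is to parse /etc/os-release, a standard file on Linux systems (the helper below is a small hypothetical script, not part of SDK Manager):

```python
# Sanity check for the client machine: confirm it runs Ubuntu 20.04 or
# 22.04 by parsing the standard /etc/os-release file.
def parse_os_release(text: str) -> dict:
    """Turn /etc/os-release KEY=VALUE lines into a dict."""
    info = {}
    for line in text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            info[key] = value.strip('"')
    return info

SUPPORTED_VERSIONS = {"20.04", "22.04"}

if __name__ == "__main__":
    try:
        with open("/etc/os-release") as f:
            info = parse_os_release(f.read())
    except FileNotFoundError:
        info = {}
    ok = (info.get("ID") == "ubuntu"
          and info.get("VERSION_ID") in SUPPORTED_VERSIONS)
    print("Supported client OS:" if ok else "Unsupported client OS:",
          info.get("PRETTY_NAME", "unknown"))
```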
Step 2: Download JetPack SDK
- Visit the NVIDIA Jetson Developer Site.
- Download the JetPack SDK (Ubuntu-based OS).
- Install NVIDIA SDK Manager on your client machine.
Step 3: Put Jetson Orin Nano Super into Recovery Mode
To flash the OS, set the Jetson Orin Nano Super into recovery mode:
- Power Off the Jetson device.
- Connect a USB-C cable from the Jetson to your client machine.
- Bridge pins 9 & 10 of the button header with a jumper (on the developer kit) to force recovery mode.
- Press and hold the Recovery button (Force Recovery Mode), if your carrier board has one.
- Power On the device while holding the Recovery button.
- Release the button after a few seconds.
Your Jetson Orin Nano Super is now in recovery mode and ready to be flashed.
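You can verify this from the client machine: a Jetson in recovery mode enumerates as an NVIDIA "APX" USB device. The helper below is a small sketch assuming a Linux client with `lsusb` (usbutils) installed:

```python
# Check `lsusb` output for the NVIDIA APX device that a Jetson exposes
# while in recovery mode. Assumes a Linux client with usbutils installed.
import shutil
import subprocess

def in_recovery_mode(lsusb_output: str) -> bool:
    """Return True if an NVIDIA APX recovery-mode device is listed."""
    return any("NVIDIA Corp. APX" in line
               for line in lsusb_output.splitlines())

if __name__ == "__main__" and shutil.which("lsusb"):
    out = subprocess.run(["lsusb"], capture_output=True, text=True).stdout
    print("Recovery device found!" if in_recovery_mode(out)
          else "No recovery device - re-check the jumper and USB cable.")
```

If no APX device appears, re-seat the jumper and try a different USB-C cable before starting the flash.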
Step 4: Flash the OS
- Open the NVIDIA SDK Manager on your client machine.
- Select Jetson Orin Nano Super as the target device.
- Choose the latest JetPack release offered for the board (an Ubuntu-based OS with CUDA, cuDNN, and TensorRT; the Super's boosted performance modes require a recent JetPack 6 release).
- Click Flash and wait for the process to complete.
- Once done, reboot the Jetson device.
Step 5: Initial Setup
- On first boot, complete the Ubuntu setup wizard (remember to remove the jumper wire before first boot).
- Ensure internet connectivity (via Ethernet or WiFi).
- Open a terminal and update the system:
sudo apt update && sudo apt upgrade -y
- Install essential dependencies:
sudo apt install python3-pip
Running Large Language Models (LLMs)
To run AI models like DeepSeek, install the required AI frameworks:
Step 1: Install CUDA, cuDNN, and TensorRT
NVIDIA Jetson requires CUDA, cuDNN, and TensorRT for AI workloads:
sudo apt install nvidia-jetpack
Step 2: Install PyTorch with CUDA Support
Download a PyTorch wheel built for your JetPack release from NVIDIA's Jetson download server (the example below is the JetPack 5.1 build; pick the wheel matching your installed JetPack version):
wget https://developer.download.nvidia.com/compute/redist/jp/v51/pytorch/torch-2.0.0-cp38-cp38-linux_aarch64.whl
pip install torch-2.0.0-cp38-cp38-linux_aarch64.whl
Step 3: Run an AI Model
Try running a simple AI model inference:
import torch
print(torch.cuda.is_available())
If it returns True, CUDA is working!
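Beyond the availability check, a quick way to confirm the GPU actually executes work is a small matrix multiply. This is a minimal sketch with a CPU fallback so it runs on any machine with PyTorch installed:

```python
# Smoke test for the new PyTorch install: run a small matrix multiply on
# the GPU if CUDA is visible, falling back to the CPU otherwise.
import torch

def matmul_demo(n: int = 256) -> torch.Tensor:
    """Multiply two random n-by-n matrices on the best available device."""
    device = "cuda" if torch.cuda.is_available() else "cpu"
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    return a @ b

if __name__ == "__main__":
    c = matmul_demo()
    print(f"matmul ran on {c.device}, result shape {tuple(c.shape)}")
```

On the Jetson the printed device should be `cuda:0`; if it says `cpu`, the wheel does not match your JetPack version.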
Step 4: Run DeepSeek
A standalone `deepseek` pip package with an `LLM` class is not an official DeepSeek release; a more reliable route is Hugging Face Transformers, which hosts the DeepSeek model weights. Install it alongside Accelerate:
pip install transformers accelerate
Run a simple test (the 1.5B DeepSeek-R1 distill below is small enough for the board's memory; swap in any variant that fits):
from transformers import AutoModelForCausalLM, AutoTokenizer
model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
inputs = tokenizer("Hello, Jetson!", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
Conclusion
The NVIDIA Jetson Orin Nano Super is an incredible platform for edge AI development. This guide walked you through setting up the device, flashing the OS, and running AI models like DeepSeek.
With a powerful GPU, fast unified memory, and full AI software support, you can now dive into robotics, deep learning, and generative AI projects right from your Jetson board.