The NVIDIA Jetson AGX Thor Developer Kit became generally available in August 2025. It is the most powerful Jetson ever built, and it marks a clear architectural turning point for the platform. Where previous Jetson kits were general-purpose edge AI computers, the AGX Thor was designed from the ground up for a specific and demanding purpose: running generative AI models on physical robots in real time.
Across ThinkRobotics' last 90 days of sales data, the AGX Thor ranks fourth by revenue. For a product that only became available in the second half of 2025, that volume of orders reflects serious institutional interest. This is not a hobbyist purchase. The teams buying it are building advanced robotic systems for the years ahead.
What Is in the Box
The Jetson AGX Thor Developer Kit ships with the Jetson T5000 module mounted on a reference carrier board, a 140W power supply, a Wi-Fi 6E module, a 1TB NVMe SSD preloaded with Ubuntu 24.04 LTS via JetPack 7, and a quick start guide. The included SSD is a WD/SanDisk SN5000S, as confirmed by ServeTheHome's teardown review.
Unlike the AGX Orin Developer Kit, the Thor includes storage from the start. The 1TB drive provides enough headroom for large model weights, training data, and containers without an immediate upgrade purchase.
Key Specifications
What Makes the Blackwell GPU Different
The AGX Orin used an Ampere GPU and achieved 275 TOPS of INT8 performance. The AGX Thor uses NVIDIA's Blackwell GPU architecture and introduces FP4 precision, a new lower-precision datatype that roughly doubles throughput over INT8 for models that support it. At FP4, the Thor delivers 2070 TFLOPS, which is where NVIDIA's 7.5x performance claim over the Orin originates.
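To make the precision tradeoff concrete, here is a minimal sketch of FP4 quantization, assuming the common E2M1 layout (one sign bit, two exponent bits, one mantissa bit, giving magnitudes 0 through 6). The `quantize_fp4` helper is hypothetical and illustrative only, not NVIDIA's implementation; it shows why FP4 halves storage and data movement per value at the cost of a very coarse value grid.

```python
# Positive magnitudes representable in FP4 E2M1 (illustrative, not
# NVIDIA's kernel-level implementation).
FP4_E2M1 = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0]

def quantize_fp4(x: float) -> float:
    """Round x to the nearest signed FP4 (E2M1) value."""
    sign = -1.0 if x < 0 else 1.0
    mag = min(FP4_E2M1, key=lambda v: abs(v - abs(x)))
    return sign * mag

print(quantize_fp4(2.4))   # -> 2.0
print(quantize_fp4(-5.1))  # -> -6.0
print(quantize_fp4(0.3))   # -> 0.5
```

Each weight occupies 4 bits instead of 8, so memory traffic per parameter is halved, which is where the roughly 2x throughput gain over INT8 comes from for models whose accuracy survives the coarser grid.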
Multi-Instance GPU (MIG): Why It Matters for Robotics
MIG allows the GPU to be partitioned into up to seven isolated slices, each running a separate model independently. This is a genuinely significant change for robotics system design. A robot can now run four distinct AI models simultaneously on one chip, with no context-switching overhead between them.
The 128GB of unified LPDDR5X memory is a shared pool across CPU and GPU with no static partitioning. This is similar in concept to Apple Silicon's unified memory: the full 128GB is available to any workload regardless of whether it runs on the CPU or GPU.
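A quick back-of-the-envelope calculation shows why the 128GB pool matters for generative workloads. The `weight_gb` helper below is a hypothetical sizing sketch that counts weights only, ignoring KV cache, activations, and runtime overhead, so real footprints will be larger.

```python
# Hypothetical sizing helper: weights-only memory footprint of a model
# at a given precision (1 GB = 1e9 bytes). Ignores KV cache, activations,
# and runtime overhead, so treat the result as a lower bound.
def weight_gb(params_billion: float, bits: int) -> float:
    return params_billion * 1e9 * bits / 8 / 1e9

# A 70B-parameter model at different precisions:
for bits in (16, 8, 4):
    print(f"FP{bits:>2}: {weight_gb(70, bits):.0f} GB")
# FP16 -> 140 GB (over the 128GB pool), FP8 -> 70 GB, FP4 -> 35 GB
```

At FP4, even a 70B-parameter model leaves most of the pool free for perception models, sensor buffers, and the rest of the robot stack, which is exactly the multi-model scenario the Thor targets.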
Real-World Performance from Hands-On Reviews
Who Is Already Using the Jetson AGX Thor
The list of early adopters confirmed at the AGX Thor general availability announcement in August 2025 reflects the scale of institutional adoption across robotics, healthcare, and industrial automation.
Humanoid and Advanced Robotics
LEM Surgical uses NVIDIA Isaac for Healthcare and Cosmos Transfer to train the autonomous arms of its Dynamis surgical robot. XRlabs uses Thor and Isaac for Healthcare to guide surgeons with real-time AI analysis through surgical scopes. Franka Robotics uses the GR00T N model to power its dual-arm manipulator. NEURA Robotics launched a Gen 3 humanoid at CES 2026 powered by Jetson Thor.
Software and Ecosystem
The AGX Thor runs JetPack 7, based on Ubuntu 24.04 LTS. This is the first Jetson platform to ship on Ubuntu 24.04, bringing updated package versions that reduce the need to compile dependencies from source for most standard AI frameworks.
- Isaac GR00T: The primary reason most buyers at this price point are choosing the Thor over the AGX Orin. GR00T N1.5 and N1.6 are vision-language-action models that allow robots to learn from human demonstrations, generalize tasks across environments, and reason about language instructions during operation. The AGX Thor is the reference compute platform for GR00T.
- NVIDIA Isaac: Complete robotics platform covering perception, manipulation, navigation, and Omniverse-based simulation.
- NVIDIA Holoscan: Real-time sensor processing for surgical systems, industrial cameras, and high-frequency sensor data ingestion via the new Holoscan Sensor Bridge.
- NVIDIA Metropolis: Visual AI agents for smart-city and industrial-monitoring applications.
- NVIDIA Cosmos Reason: Vision-language model for building video analytics AI agents at the edge. The Video Search and Summarization blueprint runs on Thor for edge video intelligence applications.
- ROS 2 Humble: Runs natively on Ubuntu 24.04 with JetPack 7, integrating cleanly into the standard robotics software stack.
⚠ Container workflow note: Container support is functional but relies primarily on Docker and NVIDIA's Jetson-containers project. Teams with Kubernetes, Podman, or systemd-based container workflows will encounter friction and may need to adapt configurations or build dependencies from source. Reviewers noted that documentation for non-Docker runtimes is fragmented as of initial release, with some important repositories having moved locations.
Primary Use Cases
The Jetson AGX Thor is purpose-built for a narrower but more demanding set of applications than the AGX Orin.
- Humanoid robotics: The combination of 2070 FP4 TFLOPS, MIG support, 128GB unified memory, and Isaac GR00T integration makes Thor the only Jetson capable of running full humanoid robot stacks with simultaneous locomotion control, dexterous manipulation, multimodal perception, and language reasoning.
- Surgical robotics and medical devices: LEM Surgical and XRlabs are both in production or development with Thor-powered systems. The 100Gb/s of aggregate Ethernet bandwidth from the 4x 25GbE ports supports the high-speed sensor feeds required by surgical systems.
- Industrial automation and inspection: Thor with Holoscan Sensor Bridge is designed to ingest high-speed data from cameras, LiDARs, IMUs, and encoders for real-time processing pipelines in demanding environments.
- Agricultural robotics: Noted explicitly by NVIDIA, particularly for systems that need to identify, navigate to, and manipulate individual crops using visual and spatial reasoning.
- Agentic AI at the edge: The Video Search and Summarization blueprint enables teams to build video analytics agents that reason over camera feeds without sending data to the cloud.
What Reviewers and Developers Are Saying
The AGX Thor received substantial hands-on coverage from HotHardware, ServeTheHome, Hackster.io, and TechRadar at its August 2025 launch. The feedback reflects a platform with strong robotics performance and a few practical rough edges.
Pricing: India and Global
Why buy from ThinkRobotics? ThinkRobotics is an authorized NVIDIA distributor in India, which means manufacturer warranty, authentic hardware, and local technical support. For institutional buyers, enterprise customers, and research institutions, ThinkRobotics can provide custom volume pricing and advice on deployment configurations.
How It Compares
Who Should Buy This?
Practical Notes Before You Buy
The NVIDIA Jetson AGX Thor Developer Kit is the right platform for teams that have outgrown what the AGX Orin can offer, specifically because their AI workloads now involve generative reasoning, VLA models, multi-model concurrent pipelines on a single device, or humanoid robot development. The Blackwell GPU, 128GB unified memory pool, MIG support, and Isaac GR00T integration collectively define a new category of robotic compute. Boston Dynamics, Figure, Franka, Amazon Robotics, and LEM Surgical are building production systems on it. That adoption record reflects a capable platform, not just a specification announcement. The price of $3,499 is significant. The documentation for non-Docker container environments is a known rough edge. But for teams whose robotics applications genuinely require this level of compute, there is currently no comparable alternative at this price point.
Frequently Asked Questions
Does the Jetson AGX Thor Developer Kit include storage?
The kit ships with a 1TB NVMe SSD preloaded with Ubuntu 24.04 LTS via JetPack 7. This is different from the AGX Orin Developer Kit, which ships without onboard storage beyond eMMC. You can get started with the Thor immediately after unboxing without purchasing additional storage.
What is Multi-Instance GPU (MIG), and why does it matter for robotics?
MIG allows the Blackwell GPU to be divided into up to seven isolated partitions, each of which can run a separate AI model independently. In a robotic system, this means you can assign one partition to locomotion control, another to grasp planning, another to perception, and another to a VLA policy, all running simultaneously on the same chip. Previous Jetson platforms required context-switching between models, which added latency. MIG eliminates that overhead for multi-model robotic pipelines.
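The latency benefit of running those models on isolated partitions can be sketched with a toy calculation. The per-model latencies below are invented for illustration, not benchmarks, and real MIG behavior depends on the slice profiles you configure; the point is only that a time-sliced GPU pays the sum of the model latencies per cycle, while isolated concurrent slices pay roughly the slowest one.

```python
# Toy illustration with hypothetical per-model latencies (not benchmarks).
# Time-slicing runs models back to back on one GPU; MIG gives each model
# an isolated slice, so they run concurrently and pipeline latency is
# bounded by the slowest model rather than the sum.
latencies_ms = {
    "locomotion": 4.0,
    "grasp_planning": 7.0,
    "perception": 5.0,
    "vla_policy": 12.0,
}

time_sliced = sum(latencies_ms.values())   # one model at a time
mig_parallel = max(latencies_ms.values())  # isolated concurrent slices

print(f"time-sliced: {time_sliced} ms, MIG: {mig_parallel} ms")
```

In this sketch the four-model cycle drops from 28 ms to 12 ms, and just as importantly, a slow VLA policy can no longer delay the locomotion controller sharing the chip.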
Can I run NVIDIA Isaac GR00T on the AGX Thor out of the box?
Yes. The AGX Thor is the reference compute platform for NVIDIA Isaac GR00T. JetPack 7 includes the necessary CUDA and TensorRT stack, and GR00T N1.5 and N1.6 models are available through the NVIDIA Isaac GR00T developer portal. Multiple reviewers confirmed that GR00T workflows run correctly out of the box. Teams building humanoid or generalist robots can begin using GR00T workflows directly on the developer kit without custom setup beyond the standard NVIDIA software installation.
When is the AGX Orin a better choice than the Thor?
For standard computer vision pipelines, multi-sensor fusion, high-frame-rate object detection, or LLM inference with models up to 13B parameters, the AGX Orin 64GB at $1,999 remains a capable and cost-effective option. The Thor's primary advantages over the Orin come from MIG support, FP4 precision for large generative models, Isaac GR00T compatibility, and the 128GB unified memory pool. If your workload does not require any of these, the AGX Orin 64GB is a more practical starting point at $1,500 less.
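The selection criteria above can be condensed into a small decision sketch. The `recommended_jetson` function is a hypothetical helper that encodes only the four differentiators named in this answer; a real evaluation would weigh power budget, carrier-board I/O, and software support as well.

```python
# Hypothetical decision sketch encoding the four Thor differentiators
# named above. Real platform selection involves more factors (power,
# I/O, budget); this only mirrors the FAQ's rule of thumb.
def recommended_jetson(needs_mig: bool, needs_fp4: bool,
                       needs_groot: bool, memory_gb_required: int) -> str:
    if needs_mig or needs_fp4 or needs_groot or memory_gb_required > 64:
        return "AGX Thor"
    return "AGX Orin 64GB"

print(recommended_jetson(False, False, False, 32))  # -> AGX Orin 64GB
print(recommended_jetson(True, False, False, 32))   # -> AGX Thor
print(recommended_jetson(False, False, False, 128)) # -> AGX Thor
```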
Does ThinkRobotics support institutional and bulk purchases?
Yes. ThinkRobotics is an authorized NVIDIA distributor in India and supports enterprise, research institution, and university purchases of the AGX Thor. For organizations evaluating the platform for humanoid robotics programs, advanced AI research, or surgical robotics development, the team can assist with volume pricing, deployment guidance, and integration advice. Contact ThinkRobotics directly for bulk or institutional pricing on the AGX Thor Developer Kit.
Shop the Jetson AGX Thor in India
Authorized NVIDIA distributor. Manufacturer warranty, local support, and competitive pricing on the most powerful Jetson ever built.