Omniperf bundles two things for performance work across Omniverse, Isaac Sim, and Isaac Lab:
- Performance AI Skills — a library of Claude Code agent skills that install, benchmark, profile, and diagnose Kit-based applications. See Agent Skills below.
- Performance Dashboards — a static site that tracks benchmark results (FPS, GPU utilization, memory, startup times) across GPUs and historical CI runs.
Live dashboard: https://nvidia.github.io/omniperf/
The dashboards are a lightweight, zero-dependency static site driven by JSON benchmark data generated by internal CI/CD pipelines and committed to this repo. The AI skills live under .agents/skills/ and encode the install, benchmark, and profile-analysis workflows used to produce and interpret that data.
- Multi-GPU comparison — switch between GPU datasets (e.g. NVIDIA L40, RTX PRO 6000)
- Overview tab — sparkline grid of all benchmarks grouped by workflow (Non-RL, RL Games, RSL RL)
- Detail tab — full time-series charts, per-run summaries, and an all-benchmarks overview table
- Search & filter — autocomplete search to quickly find specific benchmarks
- Metric selection — explore runtime metrics (FPS, GPU/CPU utilization, memory) and startup metrics
```bash
git clone https://github.com/NVIDIA/omniperf.git
cd omniperf
./serve.sh
# Open http://localhost:8765
```

Requires Python 3 (for the built-in HTTP server). No build step or dependencies needed.
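If `serve.sh` is unavailable (e.g. on Windows), the same result can be had with Python's standard-library HTTP server. This is a sketch based on the quick-start above — the `docs/` root and port 8765 are taken from those instructions, not from the script itself:

```python
# Serve the static dashboard from docs/ on port 8765 using only the
# Python standard library. Directory and port are assumptions drawn
# from the quick-start instructions; serve.sh may differ in detail.
import functools
from http.server import HTTPServer, SimpleHTTPRequestHandler

def make_server(directory="docs", port=8765):
    """Build an HTTP server rooted at `directory` (port 0 picks a free port)."""
    handler = functools.partial(SimpleHTTPRequestHandler, directory=directory)
    return HTTPServer(("127.0.0.1", port), handler)

# Usage: make_server().serve_forever()  # then open http://localhost:8765
```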
Note: The default clone skips the benchmark preview images (~1.5 GB) to keep clones lightweight. To also download them, run `git lfs pull --exclude=""` after cloning.
Benchmark data lives in `docs/data/`. The `manifest.json` file lists available GPU datasets:
```json
{
  "generated_at": "2026-03-04T10:47:26Z",
  "datasets": [
    {
      "file": "ada_l40.json",
      "gpu_display_name": "NVIDIA L40",
      "total_runs": 228
    }
  ]
}
```

Each GPU data file contains an array of benchmark runs with commit metadata, runtime metrics, and startup metrics.
- Generate the benchmark JSON using the omniperf tooling
- Place the JSON file in `docs/data/`
- Update `docs/data/manifest.json` to reference the new file
- Commit and push — GitHub Pages will automatically deploy
The site is deployed automatically via GitHub Actions to GitHub Pages on every push to main. The workflow:
- Checks out the repo (with LFS)
- Uploads `docs/` as a Pages artifact
- Deploys to GitHub Pages
To enable GitHub Pages for a fresh fork:
- Go to Settings → Pages in the GitHub repo
- Under Source, select GitHub Actions
- Data files are tracked with Git LFS — install it before cloning
- Chart.js 4.4.7 is loaded from CDN (no npm install needed)
- Python 3 for local development server
This repo ships a set of Claude Code agent skills for working with Omniverse, Isaac Sim, Isaac Lab, and their profiling tools. They encode install steps, benchmark recipes, and profile-analysis workflows validated from live testing.
- install-isaacsim — install Isaac Sim via pip or source build
- install-isaaclab — install Isaac Lab and link it to Isaac Sim
- install-profilers — set up Nsight Systems, Tracy, and related tooling
- benchmark-isaacsim — run Isaac Sim benchmarks
- benchmark-isaaclab — run Isaac Lab RL and environment benchmarks
- profiling — capture traces with Tracy and Nsight Systems
- nsys-analyze — analyze Kit-based `.nsys-rep` profiles and compare versions
- diagnose-perf — first-responder triage for slow FPS, stutter, or latency
- Vulnerability disclosure: see `SECURITY.md`
See CONTRIBUTING.md for guidelines, including the Developer Certificate of Origin (DCO) requirement.
This project is licensed under the Apache License 2.0.