
NVlabs / instant-ngp

Instant neural graphics primitives: lightning fast NeRF and more

17,315 stars
2,055 forks
503 issues
Cuda · C++ · Python

AI Architecture Analysis

This repository is indexed by RepoMind. By analyzing NVlabs/instant-ngp in our AI interface, you can instantly generate complete architecture diagrams, visualize control flows, and perform automated security audits across the entire codebase.

Our Agentic Context Augmented Generation (Agentic CAG) engine loads full source files into context on-demand, avoiding the fragmentation of traditional RAG systems. Ask questions about the architecture, dependencies, or specific features to see it in action.

Source files are only loaded when you start an analysis to optimize performance.

Embed this Badge

Showcase RepoMind's analysis directly in your repository's README.

[![Analyzed by RepoMind](https://img.shields.io/badge/Analyzed%20by-RepoMind-4F46E5?style=for-the-badge)](https://repomind.in/repo/NVlabs/instant-ngp)

Repository Overview (README excerpt)


# Instant Neural Graphics Primitives

Ever wanted to train a NeRF model of a fox in under 5 seconds? Or fly around a scene captured from photos of a factory robot? Of course you have!

Here you will find an implementation of four __neural graphics primitives__: neural radiance fields (NeRF), signed distance functions (SDFs), neural images, and neural volumes. In each case, we train and render an MLP with multiresolution hash input encoding using the __tiny-cuda-nn__ framework.

> __Instant Neural Graphics Primitives with a Multiresolution Hash Encoding__
> Thomas Müller, Alex Evans, Christoph Schied, Alexander Keller
> _ACM Transactions on Graphics (__SIGGRAPH__), July 2022_
> __Project page / Paper / Video / Presentation / Real-Time Live / BibTeX__

For business inquiries, please submit the NVIDIA research licensing form.

## Installation

If you have Windows, download one of the following releases corresponding to your graphics card and extract it. Then, start .

- **RTX 5000 series** and other Blackwell cards
- **RTX 3000 & 4000 series, RTX A4000–A6000**, and other Ampere & Ada cards
- **RTX 2000 series, Titan RTX, Quadro RTX 4000–8000**, and other Turing cards
- **GTX 1000 series, Titan Xp, Quadro P1000–P6000**, and other Pascal cards

Keep reading for a guided tour of the application or, if you are interested in creating your own NeRF, watch the video tutorial or read the written instructions.

If you use Linux, or want the developer Python bindings, or if your GPU is not listed above (e.g. Hopper, Volta, or Maxwell generations), you need to build __instant-ngp__ yourself.
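The multiresolution hash input encoding mentioned above is implemented in CUDA inside __tiny-cuda-nn__. As a rough illustration of the idea only (not the actual implementation; function names and default parameters here are hypothetical), the following NumPy sketch hashes each point's grid cell at several resolutions and trilinearly interpolates trainable features:

```python
import numpy as np

# Illustrative sketch of the multiresolution hash encoding (paper, Sec. 3).
# Per-dimension primes used by the paper's spatial XOR hash.
PRIMES = np.array([1, 2654435761, 805459861], dtype=np.uint64)

def hash_coords(coords, table_size):
    """Spatially hash integer grid coordinates into [0, table_size)."""
    h = np.zeros(coords.shape[:-1], dtype=np.uint64)
    for d in range(coords.shape[-1]):
        h ^= coords[..., d].astype(np.uint64) * PRIMES[d]
    return h % np.uint64(table_size)

def encode(x, tables, base_res=16, growth=1.5):
    """Encode points x in [0,1]^3 against one hash table per level.

    tables: list of (T, F) arrays of trainable feature vectors.
    Returns an (N, L*F) array of concatenated interpolated features.
    """
    feats = []
    for level, table in enumerate(tables):
        res = int(np.floor(base_res * growth ** level))
        pos = x * res
        lo = np.floor(pos).astype(np.int64)   # lower corner of the cell
        frac = pos - lo                        # position inside the cell
        acc = np.zeros((x.shape[0], table.shape[1]))
        # Trilinearly blend the 8 corner features of each cell.
        for corner in range(8):
            offset = np.array([(corner >> d) & 1 for d in range(3)])
            w = np.prod(np.where(offset, frac, 1.0 - frac), axis=-1)
            idx = hash_coords(lo + offset, table.shape[0])
            acc += w[:, None] * table[idx]
        feats.append(acc)
    return np.concatenate(feats, axis=-1)
```

In the real system the tables are trained jointly with the MLP by backpropagating through this interpolation; the sketch only shows the forward lookup.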
## Usage

__instant-ngp__ comes with an interactive GUI that includes many features:

- comprehensive controls for interactively exploring neural graphics primitives,
- VR mode for viewing neural graphics primitives through a virtual-reality headset,
- saving and loading "snapshots" so you can share your graphics primitives on the internet,
- a camera path editor to create videos,
- conversion,
- camera pose and lens optimization,
- and many more.

### NeRF fox

Simply start and drag the folder into the window. Or, alternatively, use the command line:

You can use __any__ NeRF-compatible dataset, e.g. from the original NeRF, the SILVR dataset, or the DroneDeploy dataset. **To create your own NeRF, watch the video tutorial or read the written instructions.**

### SDF armadillo

Drag into the window or use the command:

### Image of Einstein

Drag into the window or use the command:

To reproduce the gigapixel results, download, for example, the Tokyo image and convert it to using the script. This custom format improves compatibility and loading speed when resolution is high. Now you can run:

### Volume renderer

Download the nanovdb volume for the Disney cloud, which is derived from here (CC BY-SA 3.0). Then drag into the window or use the command:

## Keyboard shortcuts and recommended controls

Here are the main keyboard controls for the __instant-ngp__ application.

| Key | Meaning |
| :-: | --- |
| WASD | Forward / pan left / backward / pan right. |
| Spacebar / C | Move up / down. |
| = or + / - or _ | Increase / decrease camera velocity (first-person mode) or zoom in / out (third-person mode). |
| E / Shift+E | Increase / decrease exposure. |
| Tab | Toggle menu visibility. |
| T | Toggle training. After around two minutes, training tends to settle down, so it can be toggled off. |
| { } | Go to the first/last training image camera view. |
| [ ] | Go to the previous/next training image camera view. |
| R | Reload network from file. |
| Shift+R | Reset camera. |
| O | Toggle visualization of the accumulated error map. |
| G | Toggle visualization of the ground truth. |
| M | Toggle multi-view visualization of layers of the neural model. See the paper's video for a little more explanation. |
| , / . | Show the previous / next visualized layer; hit M to escape. |
| 1-8 | Switch among various render modes, with 2 being the standard one. You can see the list of render mode names in the control interface. |

There are many controls in the __instant-ngp__ GUI. First, note that this GUI can be moved and resized, as can the "Camera path" GUI (which must first be expanded to be used).

Recommended user controls in __instant-ngp__ are:

- __Snapshot:__ use "Save" to save the trained NeRF and "Load" to reload it.
- __Rendering -> DLSS:__ toggling this on and setting "DLSS sharpening" to 1.0 can often improve rendering quality.
- __Rendering -> Crop size:__ trim back the surrounding environment to focus on the model. "Crop aabb" lets you move the center of the volume of interest and fine-tune it. See more about this feature in our NeRF training & dataset tips.

The "Camera path" GUI lets you create a camera path for rendering a video. The button "Add from cam" inserts keyframes from the current perspective. Then, you can render a video of your camera path or export the keyframes to a file. There is a bit more information about the GUI in this post and in this video guide to creating your own video.

## VR controls

To view the neural graphics primitive in VR, first start your VR runtime. This will most likely be either

- __OculusVR__ if you have an Oculus Rift or Meta Quest (with link cable) headset,
- __SteamVR__ if you have another headset, or
- any other OpenXR-compatible runtime.

Then, press the __Connect to VR/AR headset__ button in the __instant-ngp__ GUI and put on your headset. Before entering VR, we **strongly** recommend that you first finish training (press "Stop training") or load a pre-trained snapshot for maximum performance.
In VR, you have the following controls.

| Control | Meaning |
| :-: | --- |
| Left stick / trackpad | Move |
| Right stick / trackpad | Turn camera |
| Press stick / trackpad | Erase NeRF around the hand |
| Grab (one-handed) | Drag neural graphics primitive |
| Grab (two-handed) | Rotate and zoom (like pinch-to-zoom on a smartphone) |
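The two-handed grab above maps the relative motion of the two hands to a rotate-and-zoom, like pinch-to-zoom. A minimal NumPy sketch of that kind of mapping (illustrative only, not the actual instant-ngp VR code; `two_handed_transform` is a hypothetical name) derives a zoom factor from the change in hand separation and a rotation angle from the change in the hand-to-hand direction:

```python
import numpy as np

def two_handed_transform(a0, b0, a1, b1):
    """Zoom factor and rotation angle (about the view axis) implied by two
    hand positions moving from (a0, b0) to (a1, b1).

    Hands moving apart -> scale > 1 (zoom in); the signed angle between the
    old and new hand-to-hand vectors, projected to the xy-plane, gives the
    rotation, as in pinch-to-zoom on a touchscreen.
    """
    v0, v1 = b0 - a0, b1 - a1
    scale = np.linalg.norm(v1) / np.linalg.norm(v0)
    ang0 = np.arctan2(v0[1], v0[0])
    ang1 = np.arctan2(v1[1], v1[0])
    return scale, ang1 - ang0
```

For example, if one hand stays put while the other moves from one unit away to two units away along a perpendicular direction, the primitive is scaled by 2 and rotated by 90 degrees.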