Massive thanks to @GradeEterna for the beautiful scenes
Brush is a 3D reconstruction engine using Gaussian splatting. It works on a wide range of systems: macOS/Windows/Linux, AMD/Nvidia/Intel cards, Android, and in a browser. To achieve this, it uses WebGPU-compatible tech and the Burn machine learning framework.
Machine learning for real-time rendering has tons of potential, but most ML tools are a poor fit for it: rendering requires real-time interactivity, usually involves dynamic shapes and computations, most platforms aren't supported, and it can be cumbersome to ship apps with large CUDA dependencies. Brush, on the other hand, produces simple, dependency-free binaries that run on nearly all devices without any setup.
Try the web demo
NOTE: Only works on Chrome and Edge. Firefox and Safari will hopefully be supported soon.
Brush takes in COLMAP data or datasets in the Nerfstudio format. Training is fully supported natively, on mobile, and in a browser. While training, you can interact with the scene, watch the training dynamics live, and compare the current rendering to the input views as training progresses.
It also supports masking images:
- Images with transparency. This will force the final splat to match the transparency of the input.
- A folder of images called 'masks'. This ignores parts of the image that are masked out.
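As an illustration, a dataset using a mask folder might be laid out as follows. The dataset and image names are placeholders, and the exact mask convention (which pixel value marks the masked-out region) is an assumption; only the masks folder name comes from the text above:

```
my_dataset/
├── images/
│   ├── 0001.png
│   └── 0002.png
└── masks/
    ├── 0001.png
    └── 0002.png
```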
Brush also works well as a splat viewer, including on the web. It can load .ply & .compressed.ply files. You can stream in data from a URL (for a web app, simply append ?url=).
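As a sketch, a viewer link for a streamed splat could be composed like this. The hosts below are placeholders, not the real demo URL; only the ?url= parameter comes from the text above:

```shell
# Append the splat's URL as a `url` query parameter to the viewer page.
VIEWER="https://example.com/brush/"
SPLAT="https://example.com/scene.compressed.ply"
echo "${VIEWER}?url=${SPLAT}"
```

Note that a real splat URL may need percent-encoding if it contains query parameters of its own.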
Brush can also load a .zip of splat files to display them as an animation, or a special ply that includes delta frames (see cat-4D and Cap4D!).
Brush can be used as a CLI. Run brush --help to get an overview. Every CLI command also accepts --with-viewer, which opens the UI as well, for easy debugging.
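A hypothetical session might look like the following. The dataset path is a placeholder, and only --help and --with-viewer are flags confirmed above:

```
brush --help                       # list available commands and options
brush ./my_dataset --with-viewer   # hypothetical: process a dataset with the UI open
```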
While training, additional data can be visualized with the excellent rerun. To install rerun on your machine, please follow their instructions. Open the ./brush_blueprint.rbl in the viewer for best results.
First, install Rust 1.88+. You can run tests with cargo test --all. Brush uses the wonderful rerun for additional visualizations while training; run cargo install rerun-cli if you want to use it.
Use cargo run --release from the workspace root to make an optimized build. Use cargo run to run a debug build.
Brush can be compiled to WASM. Run npm run dev to start the demo website using Next.js; see the brush_nextjs directory.
Brush uses wasm-pack to build the WASM bundle. You can also use it without a bundler, see wasm-pack's documentation.
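As an unverified sketch of a bundler-free build (the invocation below uses wasm-pack's documented --target web mode, but the exact flags and crate to build are assumptions; check wasm-pack's documentation for the precise command):

```
wasm-pack build --target web
```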
WebGPU is still an upcoming standard, and as such, only Chrome 134+ on Windows and macOS is currently supported.
As a one-time setup, make sure you have the Android SDK & NDK installed.
- Copy `.env.example` to `.env` and fill in the Android/OpenCV paths you want Android Studio to use, or set `sdk.dir`/`ndk.dir` in `local.properties`
- Add the Android target to rust: `rustup target add aarch64-linux-android`
- Install cargo-ndk to manage building a lib: `cargo install cargo-ndk`
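For example, a minimal local.properties might look like this. The paths are placeholders; only the sdk.dir/ndk.dir keys come from the text above:

```
sdk.dir=/path/to/Android/Sdk
ndk.dir=/path/to/Android/Ndk
```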
Building OpenCV Native Libraries

Brush uses custom OpenCV native libraries for Android cross-compilation. You must clone and build OpenCV 4.13.0 from source before building the main application.
git clone https://github.com/opencv/opencv.git
cd opencv && git checkout 4.13.0
mkdir build && cd build
export ANDROID_NDK_HOME=/path/to/your/ndk/directory
export ANDROID_SDK_HOME=/path/to/your/sdk/directory
cmake -GNinja \
-DCMAKE_MAKE_PROGRAM=ninja-build \
-DCMAKE_TOOLCHAIN_FILE=$ANDROID_NDK_HOME/build/cmake/android.toolchain.cmake \
-DANDROID_ABI="arm64-v8a" \
-DANDROID_PLATFORM=android-30 \
-DANDROID_SDK=$ANDROID_SDK_HOME \
-DBUILD_SHARED_LIBS=ON \
-DBUILD_opencv_java=OFF \
-DBUILD_opencv_js=OFF \
-DBUILD_ANDROID_PROJECTS=OFF \
-DBUILD_ANDROID_EXAMPLES=OFF \
-DBUILD_opencv_videoio=OFF \
-DBUILD_opencv_video=OFF \
-DBUILD_opencv_dnn=OFF \
-DBUILD_opencv_ml=OFF \
-DBUILD_opencv_photo=OFF \
-DBUILD_opencv_gapi=OFF \
-DBUILD_opencv_objdetect=OFF \
-DWITH_TBB=ON \
-DBUILD_TESTS=OFF \
-DBUILD_PERF_TESTS=OFF \
-DBUILD_EXAMPLES=OFF \
..
ninja-build

Android Studio now checks configuration in this order, where applicable:
- Gradle project properties / `local.properties`
- workspace `.env`
- regular process environment variables
Each time you change the rust code, run
cargo ndk -t arm64-v8a -o crates/brush-app/app/src/main/jniLibs/ build
Note: for best performance, build in release mode. This is separate from the Android Studio app build configuration.
cargo ndk -t arm64-v8a -o crates/brush-app/app/src/main/jniLibs/ build --release
You can now either run the project from Android Studio (Android Studio does NOT build the rust code), or run it from the command line:
./gradlew build
./gradlew installDebug
adb shell am start -n com.splats.app/.MainActivity
- Choose MP4: Select the drone video file.
- Choose CSV: Select the corresponding telemetry log (DJI CSV format).
- Choose Config: (Optional) Select a JSON configuration file for SfM/Training parameters.
- Extract: Choose between "Uniform" extraction or "Telemetry" based extraction.
- Train: Start the full on-device SfM and Splatting pipeline.
You can also open this folder as a project in Android Studio and run things from there. Note: running in Android Studio does not rebuild the rust code automatically.
Rendering and training are generally faster than gsplat. You can run benchmarks of some of the kernels using cargo bench.
- gSplat, for their reference version of the kernels.
- Peter Hedman, George Kopanas & Bernhard Kerbl, for the many discussions & pointers.
- The Burn team, for help & improvements to Burn along the way.
- Raph Levien, for the original version of the GPU radix sort.
- GradeEterna, for feedback and their scenes.
This is not an official Google product. This repository is a forked public version of the google-research repository.