Stars
cuTile is a programming model for writing parallel kernels for NVIDIA GPUs
[NeurIPS 2025] MaterialRefGS: Reflective Gaussian Splatting with Multi-view Consistent Material Inference
A Fast and Robust Glue for Joint Point-Line Matching
EfficientSAM3 compresses SAM3 into lightweight, edge-friendly models via progressive knowledge distillation for fast promptable concept segmentation and tracking.
LLM Council works together to answer your hardest questions
The repository provides code for running inference with the SAM 3D Body Model (3DB), links for downloading the trained model checkpoints and datasets, and example notebooks that show how to use the…
Quick illustration of how one can easily read books together with LLMs. It's great and I highly recommend it.
A real-time 3D digital map of Tokyo's public transport system
[ICLR 2026] RF-DETR is a real-time object detection and segmentation model architecture developed by Roboflow, SOTA on COCO, designed for fine-tuning.
Code for "FlashWorld: High-quality 3D Scene Generation within Seconds" (ICLR 2026 Oral)
Easily and securely send things from one computer to another 🐊 📦
SAM-PT: Extending SAM to zero-shot video segmentation with point-based tracking.
Lightning fast C++/CUDA neural network framework
Thermodynamic Hypergraphical Model Library in JAX
Track-Anything is a flexible and interactive tool for video object tracking and segmentation, based on Segment Anything, XMem, and E2FGVI.
Framework-agnostic sliced/tiled inference + interactive UI + error analysis plots
Native Multimodal Models are World Learners
Martingale posterior neural networks for fast sequential decision making @ NeurIPS 2025
FlinkSketch: Democratizing the Benefits of Sketches for the Flink Community
GPU-optimized version of the MuJoCo physics simulator, designed for NVIDIA hardware.
An open-source, GPU-accelerated physics simulation engine built upon NVIDIA Warp, specifically targeting roboticists and simulation researchers.
Geometry Meets Vision: Revisiting Pretrained Semantics in Distilled Fields