MrNeRF (@janusch_patas)

2025-10-23 | โค๏ธ 187 | ๐Ÿ” 20


MoE-GS: Mixture of Experts for Dynamic Gaussian Splatting

Contributions:

• MoE-GS: the first dynamic Gaussian splatting framework employing a Mixture-of-Experts architecture, enabling robust and adaptive reconstruction across diverse dynamic scenes.

• A novel Volume-aware Pixel Router integrates expert outputs through differentiable weight splatting, achieving spatially and temporally coherent adaptive blending.

• MoE-GS efficiency is improved through single-pass multi-expert rendering and gate-aware Gaussian pruning. Separately, a knowledge distillation strategy trains individual experts with pseudo-labels from the full MoE model, improving quality without changing the architecture.
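To make the pixel-router idea concrete, here is a minimal sketch of per-pixel expert blending with softmax gate weights. This is an illustration only: the function name, array shapes, and the assumption that gate logits are already available in screen space are all mine; in MoE-GS the per-pixel weights come from differentiable weight splatting of per-Gaussian gate values, which is not reproduced here.

```python
import numpy as np

def pixel_router_blend(expert_renders, gate_logits):
    """Blend per-expert renderings with per-pixel softmax gate weights.

    expert_renders: (E, H, W, 3) array, one RGB image per expert.
    gate_logits:    (E, H, W) array of unnormalized per-pixel routing
                    scores (assumed given; in MoE-GS these would be
                    produced by splatting per-Gaussian gate values).
    Returns the blended (H, W, 3) image.
    """
    # Numerically stable per-pixel softmax over the expert axis.
    logits = gate_logits - gate_logits.max(axis=0, keepdims=True)
    weights = np.exp(logits)
    weights /= weights.sum(axis=0, keepdims=True)
    # Convex combination of expert outputs at every pixel.
    return (weights[..., None] * expert_renders).sum(axis=0)
```

Because the blend is a softmax-weighted convex combination, it stays differentiable end to end, which is what lets the router be trained jointly with the experts.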



Tags

Vision-3D AI-ML Dev-Tools Web-Graphics