Official implementation of "Ultra-LSNT: An Efficient Sparse Mixture-of-Experts Network for Edge-Native Wind Power Forecasting"
Wind power forecasting is critical for grid stability but faces dual challenges: the inherent stochasticity of wind data and the strict computational constraints of edge devices.
Ultra-LSNT is a noise-robust Sparse Mixture-of-Experts (MoE) framework designed for edge deployment. Unlike standard Transformers ($O(L^2)$), Ultra-LSNT achieves linear complexity in the sequence length $L$.
- ⚡ Linear Complexity: $O(L)$ inference speed, optimized for edge devices (e.g., Raspberry Pi, smart meters).
- 🧠 Sparse MoE: Top-2 dynamic routing activates only the necessary experts, saving 50% of compute.
- 🛡️ Noise Robustness: Superior stability under 20% sensor noise compared to DLinear and Transformer baselines.
- 📉 Plug-and-Play: Single-file implementation (`ultra_lsnt_timeseries.py`) with no complex dependencies.
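The Top-2 routing idea above can be sketched as follows. This is a minimal illustrative implementation in NumPy, not the code from `ultra_lsnt_timeseries.py`; the function name `top2_moe` and the renormalization scheme are assumptions for illustration.

```python
import numpy as np

def top2_moe(x, gate_w, experts):
    """Top-2 sparse MoE routing (illustrative sketch, not the paper's code).

    x:       (batch, d) inputs
    gate_w:  (d, n_experts) gating weights
    experts: list of callables, each mapping (d,) -> (d,)
    """
    logits = x @ gate_w                        # (batch, n_experts) gate scores
    top2 = np.argsort(logits, axis=1)[:, -2:]  # indices of the 2 best experts
    out = np.zeros_like(x)
    for i, row in enumerate(x):
        idx = top2[i]
        # renormalize the two selected gate scores with a softmax
        z = logits[i, idx]
        w = np.exp(z - z.max())
        w /= w.sum()
        # only these 2 of the n experts run for this input
        out[i] = w[0] * experts[idx[0]](row) + w[1] * experts[idx[1]](row)
    return out

# toy usage: 4 linear experts, only 2 run per input,
# i.e. half the compute of a dense 4-expert pass
rng = np.random.default_rng(0)
experts = [lambda v, W=rng.standard_normal((8, 8)): v @ W for _ in range(4)]
x = rng.standard_normal((3, 8))
y = top2_moe(x, rng.standard_normal((8, 4)), experts)
print(y.shape)  # (3, 8)
```

Because each input activates only 2 of the 4 experts, the per-input expert compute is halved relative to running all experts, which is the source of the 50% saving claimed above.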
Figure 1: Overview of Ultra-LSNT Framework. The model decomposes time series into trend and seasonal components, processed by a Sparse MoE layer.
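The trend/seasonal split in Figure 1 is commonly done with a moving-average decomposition (as in DLinear-style models). A minimal sketch under that assumption; the kernel size of 25 and edge padding are illustrative choices, and the paper's exact decomposition may differ:

```python
import numpy as np

def decompose(series, kernel=25):
    """Moving-average decomposition (illustrative, assumed scheme).

    Returns (trend, seasonal) where trend is a centered moving average
    and seasonal is the residual, so trend + seasonal == series.
    """
    pad = kernel // 2
    # pad edges by repeating the boundary values to keep the output length
    padded = np.concatenate([np.full(pad, series[0]),
                             series,
                             np.full(pad, series[-1])])
    trend = np.convolve(padded, np.ones(kernel) / kernel, mode="valid")
    seasonal = series - trend
    return trend, seasonal

# toy signal: slow linear trend plus a sinusoidal seasonal component
t = np.linspace(0, 8 * np.pi, 200)
series = 0.05 * t + np.sin(t)
trend, seasonal = decompose(series)
print(trend.shape, seasonal.shape)  # (200,) (200,)
```

Each component can then be fed to the Sparse MoE layer separately, so the experts see smoother, more predictable inputs than the raw series.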
- Clone the repository:

```bash
git clone https://github.com/b1ue13e/Ultra-LSNT.git
cd Ultra-LSNT
```