
Ultra-LSNT: Efficient Sparse Mixture-of-Experts for Edge-Native Wind Power Forecasting


Official PyTorch implementation of "Ultra-LSNT: An Efficient Sparse Mixture-of-Experts Network for Edge-Native Wind Power Forecasting"

📖 Abstract

Wind power forecasting is critical for grid stability but faces dual challenges: the inherent stochasticity of wind data and the strict computational constraints of edge devices.

Ultra-LSNT is a noise-robust Sparse Mixture-of-Experts (MoE) framework designed for edge deployment. Unlike standard Transformers, whose self-attention scales as $O(L^2)$ in sequence length, Ultra-LSNT achieves linear complexity $O(L)$ and reduces inference latency by 45x. It uses multi-scale decomposition to route continuous temporal patterns to specialized experts, maintaining high accuracy ($R^2 > 0.83$) even under 20% sensor noise, where linear baselines collapse.

🚀 Key Features

  • ⚡ Linear Complexity: $O(L)$ inference speed, optimized for edge devices (e.g., Raspberry Pi, Smart Meters).
  • 🧠 Sparse MoE: Top-2 dynamic routing activates only necessary experts, saving 50% compute.
  • 🛡️ Noise Robustness: Superior stability under sensor noise (20%) compared to DLinear and Transformers.
  • 📉 Plug-and-Play: Single-file implementation (ultra_lsnt_timeseries.py) with no complex dependencies.
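The top-2 routing in the feature list above can be sketched as follows. This is a minimal illustration, assuming a feed-forward expert MLP and a linear gate; the class name, expert count, and hidden width are assumptions, not the layer in ultra_lsnt_timeseries.py.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2MoE(nn.Module):
    """Illustrative sparse MoE layer with top-2 gating (not the repo's exact code)."""

    def __init__(self, d_model, num_experts=4, d_hidden=64):
        super().__init__()
        self.gate = nn.Linear(d_model, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(num_experts)
        )

    def forward(self, x):
        # x: (batch, length, d_model)
        scores = self.gate(x)                       # (B, L, E) router logits
        top_w, top_idx = scores.topk(2, dim=-1)     # keep only the 2 best experts per step
        top_w = F.softmax(top_w, dim=-1)            # renormalize the kept weights
        out = torch.zeros_like(x)
        for slot in range(2):
            idx = top_idx[..., slot]                # (B, L) chosen expert ids
            w = top_w[..., slot].unsqueeze(-1)      # (B, L, 1) mixing weights
            for e, expert in enumerate(self.experts):
                mask = idx == e                     # time steps routed to expert e
                if mask.any():
                    out[mask] += w[mask] * expert(x[mask])
        return out
```

With four experts and top-2 routing, only half the experts run per time step, which is where a roughly 50% compute saving over a dense MoE comes from.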

🏗️ Architecture

Figure 1: Overview of the Ultra-LSNT framework. The model decomposes the input time series into trend and seasonal components, which are then processed by a Sparse MoE layer.
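The trend/seasonal split shown in Figure 1 is commonly realized as a moving-average decomposition. The sketch below assumes replication padding and a kernel size of 25; neither is confirmed by the paper, so treat it as a shape-level illustration only.

```python
import torch

def decompose(x, kernel=25):
    """Split (batch, length) series into trend + seasonal via moving average.

    Illustrative sketch only: kernel size and padding are assumptions,
    not the settings used by Ultra-LSNT.
    """
    pad = (kernel - 1) // 2
    front = x[:, :1].repeat(1, pad)               # replicate the first value
    back = x[:, -1:].repeat(1, kernel - 1 - pad)  # replicate the last value
    padded = torch.cat([front, x, back], dim=1)   # (B, L + kernel - 1)
    trend = padded.unfold(1, kernel, 1).mean(-1)  # moving average, (B, L)
    seasonal = x - trend                          # residual fluctuations
    return trend, seasonal
```

By construction the two components sum back to the original series, so the split is lossless and each branch can be modeled separately.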

🛠️ Installation

  1. Clone the repository:

     git clone https://github.com/b1ue13e/Ultra-LSNT.git
     cd Ultra-LSNT
