A Python library offering scalable and user-friendly implementations of state-of-the-art neural forecasting models.
NeuralForecast is a Python library that provides a large collection of state-of-the-art neural network models for time series forecasting. It focuses on delivering scalable, accurate, and user-friendly implementations to improve forecasting pipeline efficiency and performance. The library includes models ranging from classic RNNs to the latest transformers, addressing the practical challenges of applying neural methods to forecasting tasks.
Data scientists, machine learning engineers, and researchers working on time series forecasting who need robust, production-ready neural models with support for probabilistic forecasts, exogenous variables, and automatic hyperparameter optimization.
Developers choose NeuralForecast for its extensive, battle-tested model implementations, seamless integration with the Nixtla ecosystem (like StatsForecast), and emphasis on usability without sacrificing accuracy or scalability.
Scalable and user-friendly neural 🧠 forecasting algorithms.
Includes over 30 state-of-the-art models, such as NHITS, NBEATSx, and PatchTST, covering a wide range of forecasting scenarios.
Provides uncertainty quantification through quantile losses and parametric distributions, essential for risk-aware decision-making in production pipelines.
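The quantile losses mentioned above can be illustrated with a small sketch. This is a plain-Python pinball loss for a single quantile, not NeuralForecast's internal implementation; the function name and data are hypothetical, chosen only to show why the loss is asymmetric and therefore useful for risk-aware forecasts.

```python
def pinball_loss(y_true, y_pred, q):
    """Pinball (quantile) loss for quantile level q in (0, 1).

    Under-prediction (y > y_hat) is penalized with weight q,
    over-prediction with weight (1 - q), so minimizing it yields
    the q-th conditional quantile rather than the mean.
    """
    total = 0.0
    for y, y_hat in zip(y_true, y_pred):
        diff = y - y_hat
        total += max(q * diff, (q - 1) * diff)
    return total / len(y_true)


y_true = [10.0, 12.0, 14.0]
y_high = [11.0, 13.0, 15.0]  # forecast one unit too high everywhere

# A high forecast is cheap at q = 0.9 (we asked for an upper quantile) ...
print(pinball_loss(y_true, y_high, 0.9))  # 0.1
# ... but expensive at q = 0.1 (we asked for a lower quantile).
print(pinball_loss(y_true, y_high, 0.1))  # 0.9
```

Training one model against several quantile levels at once is what turns a point forecaster into an uncertainty-aware one.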
Supports both static and temporal exogenous regressors, allowing incorporation of external factors like weather or prices, with dedicated documentation examples.
Integrates with Ray and Optuna for distributed hyperparameter tuning, simplifying model optimization at scale.
Uses .fit and .predict methods similar to scikit-learn, reducing the learning curve for users already versed in Python ML libraries.
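The fit/predict convention can be sketched with a toy forecaster. This is a hypothetical, library-free illustration of the interface style, not NeuralForecast code: a seasonal-naive model that stores the last season on `.fit` and repeats it on `.predict`.

```python
class SeasonalNaive:
    """Toy forecaster showing the sklearn-style fit/predict pattern.

    Hypothetical class for illustration only; NeuralForecast's own
    models and wrapper class expose the same two-step workflow.
    """

    def __init__(self, season_length, h):
        self.season_length = season_length  # length of one seasonal cycle
        self.h = h                          # forecast horizon

    def fit(self, y):
        # "Training" here is just remembering the last full season.
        self.last_season_ = y[-self.season_length:]
        return self

    def predict(self):
        # Repeat the stored season until the horizon is covered.
        season = self.last_season_
        return [season[i % self.season_length] for i in range(self.h)]


model = SeasonalNaive(season_length=4, h=6).fit([1, 2, 3, 4, 1, 2, 3, 4])
print(model.predict())  # [1, 2, 3, 4, 1, 2]
```

Keeping model construction, fitting, and prediction as three separate steps is what lets users swap models without rewriting their pipeline.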
Training neural models, especially with automatic tuning, often requires GPUs and significant memory, making it costly and inaccessible for resource-constrained teams.
Despite the tuning integrations, neural models remain highly sensitive to hyperparameters, and finding good settings can be complex and time-consuming.
Heavily integrated with the Nixtla ecosystem (e.g., StatsForecast), which can limit flexibility and make it harder to switch to other forecasting libraries later.