
sktime / pytorch-forecasting

Time series forecasting with PyTorch

4,830 stars
841 forks
715 issues
Python · Shell

AI Architecture Analysis

This repository is indexed by RepoMind. By analyzing sktime/pytorch-forecasting in our AI interface, you can instantly generate complete architecture diagrams, visualize control flows, and perform automated security audits across the entire codebase.

Our Agentic Context Augmented Generation (Agentic CAG) engine loads full source files into context on-demand, avoiding the fragmentation of traditional RAG systems. Ask questions about the architecture, dependencies, or specific features to see it in action.

Source files are only loaded when you start an analysis to optimize performance.

Embed this Badge

Showcase RepoMind's analysis directly in your repository's README.

[![Analyzed by RepoMind](https://img.shields.io/badge/Analyzed%20by-RepoMind-4F46E5?style=for-the-badge)](https://repomind.in/repo/sktime/pytorch-forecasting)

Repository Overview (README excerpt)


_PyTorch Forecasting_ is a PyTorch-based package for forecasting with state-of-the-art deep learning architectures. It provides a high-level API and uses PyTorch Lightning to scale training on GPU or CPU, with automatic logging.

**Documentation** · **Tutorials** · **Release Notes**

[![Code Coverage][coverage-image]][coverage-url]

[coverage-image]: https://codecov.io/gh/sktime/pytorch-forecasting/branch/main/graph/badge.svg
[coverage-url]: https://codecov.io/github/sktime/pytorch-forecasting?branch=main

---

Our article on Towards Data Science introduces the package and provides background information.

PyTorch Forecasting aims to ease state-of-the-art timeseries forecasting with neural networks for real-world cases and research alike. The goal is to provide a high-level API with maximum flexibility for professionals and reasonable defaults for beginners. Specifically, the package provides:

• A timeseries dataset class which abstracts handling variable transformations, missing values, randomized subsampling, multiple history lengths, etc.
• A base model class which provides basic training of timeseries models along with logging in TensorBoard and generic visualizations such as actuals vs. predictions and dependency plots
• Multiple neural network architectures for timeseries forecasting that have been enhanced for real-world deployment and come with in-built interpretation capabilities
• Multi-horizon timeseries metrics
• Hyperparameter tuning with optuna

The package is built on pytorch-lightning to allow training on CPUs and on single or multiple GPUs out of the box.

Installation

If you are working on Windows, you need to install PyTorch first; otherwise, you can install the package directly with pip. Alternatively, you can install the package via conda, in which case PyTorch Forecasting is installed from the conda-forge channel while PyTorch is installed from the pytorch channel.
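The concrete commands were dropped from this excerpt; as a rough sketch, the usual pip and conda routes look like the following (the CPU wheel index URL for Windows is one option among several — adjust it for your CUDA version):

```shell
# On Windows, install PyTorch first (CPU wheel shown as an example;
# pick the index URL matching your CUDA setup from pytorch.org)
pip install torch --index-url https://download.pytorch.org/whl/cpu

# Then install the package from PyPI
pip install pytorch-forecasting

# Or via conda: PyTorch Forecasting from conda-forge, PyTorch from the pytorch channel
conda install pytorch-forecasting pytorch -c pytorch -c conda-forge
```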
To use the MQF2 loss (multivariate quantile loss), also install the corresponding optional dependencies.

Documentation

Visit https://pytorch-forecasting.readthedocs.io to read the documentation with detailed tutorials.

Available models

The documentation provides a comparison of available models.

• Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting, which outperforms DeepAR by Amazon by 36-69% in benchmarks
• N-BEATS: Neural basis expansion analysis for interpretable time series forecasting, which (when used as an ensemble) has outperformed all other methods, including ensembles of traditional statistical methods, in the M4 competition. The M4 competition is arguably the most important benchmark for univariate time series forecasting.
• N-HiTS: Neural Hierarchical Interpolation for Time Series Forecasting, which supports covariates and has consistently beaten N-BEATS. It is also particularly well-suited for long-horizon forecasting.
• DeepAR: Probabilistic forecasting with autoregressive recurrent networks, which is one of the most popular forecasting algorithms and is often used as a baseline
• Simple standard networks for baselining: LSTM and GRU networks as well as an MLP on the decoder
• A baseline model that always predicts the latest known value

To implement new models or other custom components, see the How to implement new models tutorial. It covers basic as well as advanced architectures.

Usage example

Networks can be trained with the PyTorch Lightning Trainer on pandas DataFrames, which are first converted to a TimeSeriesDataSet.
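A minimal sketch of what that conversion expects as input: a long-format DataFrame with an integer time index, a group identifier per series, and a target column. The data and column names below are made up for illustration; the `TimeSeriesDataSet` parameter names in the comments follow the library's documented API.

```python
import numpy as np
import pandas as pd

# Hypothetical long-format data: one row per (series, time step).
n_steps = 30
groups = ["store_a", "store_b"]
data = pd.DataFrame(
    {
        "group": np.repeat(groups, n_steps),
        # TimeSeriesDataSet requires a contiguous integer time index per series
        "time_idx": np.tile(np.arange(n_steps), len(groups)),
        "value": np.random.default_rng(0).normal(size=n_steps * len(groups)),
    }
)

# The conversion step described above (assumes pytorch-forecasting is
# installed; commented out here so the sketch runs without the library):
#
# from pytorch_forecasting import TimeSeriesDataSet
# dataset = TimeSeriesDataSet(
#     data,
#     time_idx="time_idx",
#     target="value",
#     group_ids=["group"],
#     max_encoder_length=24,    # history length fed to the encoder
#     max_prediction_length=6,  # forecast horizon
# )
# dataloader = dataset.to_dataloader(batch_size=32)
# ...then pass a model and the dataloader to a PyTorch Lightning Trainer.

print(data.shape)  # (60, 3)
```

The long format (rather than one column per series) is what lets a single dataset hold many related time series and sample training windows from each.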