PSML: A Multi-scale Time-series Dataset for Machine Learning in Decarbonized Energy Grids

Overview


The electric grid is a key enabling infrastructure for the ambitious transition towards carbon neutrality as we grapple with climate change. With deepening penetration of renewable energy resources and electrified transportation, the reliable and secure operation of the electric grid becomes increasingly challenging. In this paper, we present PSML, a first-of-its-kind open-access multi-scale time-series dataset, to aid in the development of data-driven machine learning (ML) based approaches towards reliable operation of future electric grids. The dataset is generated through a novel transmission + distribution (T+D) co-simulation designed to capture the increasingly important interactions and uncertainties of the grid dynamics, containing electric load, renewable generation, weather, voltage and current measurements at multiple spatio-temporal scales. Using PSML, we provide state-of-the-art ML baselines on three challenging use cases of critical importance to achieve: (i) early detection, accurate classification and localization of dynamic disturbance events; (ii) robust hierarchical forecasting of load and renewable energy with the presence of uncertainties and extreme events; and (iii) realistic synthetic generation of physical-law-constrained measurement time series. We envision that this dataset will enable advances for ML in dynamic systems, while simultaneously allowing ML researchers to contribute towards carbon-neutral electricity and mobility.

Dataset Navigation

The full dataset is hosted on Zenodo. Please download and unzip it, and keep the extracted folder for reproducing the benchmark results, loading data, and evaluating proposed methods.

wget https://zenodo.org/record/5130612/files/PSML.zip?download=1
7z x 'PSML.zip?download=1' -o./
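
If you prefer to script the download, the optional sketch below does the same with only the Python standard library; the URL is the Zenodo record above, and extracting into the current directory is assumed to produce the PSML folder referenced by the examples further down.

import urllib.request
import zipfile

# Download the archive from the Zenodo record above (large file; this may take a while)
url = "https://zenodo.org/record/5130612/files/PSML.zip?download=1"
urllib.request.urlretrieve(url, "PSML.zip")

# Extract into the current directory (assumed to yield the PSML/ folder used below)
with zipfile.ZipFile("PSML.zip") as archive:
    archive.extractall("./")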

Minute-level Load and Renewable

  • File Name
    • ISO_zone_#.csv: For example, CAISO_zone_1.csv contains minute-level load, renewable and weather data from 2018 to 2020 in zone 1 of CAISO.
  • Field Description
    • Field time: Timestamp at minute resolution.
    • Field load_power: Normalized load power.
    • Field wind_power: Normalized wind turbine power.
    • Field solar_power: Normalized solar PV power.
    • Field DHI: Diffuse horizontal irradiance.
    • Field DNI: Direct normal irradiance.
    • Field GHI: Global horizontal irradiance.
    • Field Dew Point: Dew point in degrees Celsius.
    • Field Solar Zenith Angle: The angle between the sun's rays and the vertical direction, in degrees.
    • Field Wind Speed: Wind speed (m/s).
    • Field Relative Humidity: Relative humidity (%).
    • Field Temperature: Temperature in degrees Celsius.
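
As a quick sanity check after download, the minute-level files can be inspected with pandas. The snippet below is a minimal sketch that assumes the column names listed above and a parseable time column; adjust the file path to wherever CAISO_zone_1.csv was extracted.

import pandas as pd

# Placeholder path: point it at the extracted CAISO_zone_1.csv
df = pd.read_csv("CAISO_zone_1.csv", parse_dates=["time"])

# Confirm that the fields listed above are present
print(df.columns.tolist())
print(df[["time", "load_power", "wind_power", "solar_power"]].head())

# Example: hourly mean of the normalized load over the whole record
hourly_load = df.set_index("time")["load_power"].resample("1H").mean()
print(hourly_load.head())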

Minute-level PMU Measurements

  • File Name
    • case #: For example, the folder case 0 contains all data of scenario setting #0.
      • pf_input_#.txt: Selected load and renewable (wind and solar) generation used as input to the simulation.
      • pf_result_#.csv: Voltage at nodes and power on branches in the transmission system via the T+D co-simulation.
  • Field Description
    • Field time: Timestamp at minute resolution.
    • Field Vm_###: Voltage magnitude (p.u.) at bus ### in the simulated model.
    • Field Va_###: Voltage angle (rad) at bus ### in the simulated model.
    • Field P_#_#_#: For example, P_3_4_1 is the active power flowing on branch #1 from bus 3 to bus 4.
    • Field Q_#_#_#: For example, Q_5_20_1 is the reactive power flowing on branch #1 from bus 5 to bus 20.
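
The column naming convention above can be parsed programmatically. The sketch below is a minimal example using pandas that assumes a pf_result_#.csv file from one case folder (the path is a placeholder).

import pandas as pd

# Placeholder path: one pf_result_#.csv from a case folder
df = pd.read_csv("pf_result_0.csv")

vm_cols = [c for c in df.columns if c.startswith("Vm_")]   # voltage magnitudes
va_cols = [c for c in df.columns if c.startswith("Va_")]   # voltage angles
p_cols  = [c for c in df.columns if c.startswith("P_")]    # active branch flows

# Recover (from_bus, to_bus, circuit) from a branch column name such as "P_3_4_1"
def parse_branch(col):
    _, from_bus, to_bus, ckt = col.split("_")
    return int(from_bus), int(to_bus), int(ckt)

print(vm_cols[:3])
print([parse_branch(c) for c in p_cols[:3]])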

Millisecond-level PMU Measurements

  • File Name
    • Forced Oscillation: The folder contains all forced oscillation cases.
      • row_#: The folder contains all data of disturbance scenario #.
        • dist.csv: Three-phase voltages at nodes in the distribution system via the T+D co-simulation.
        • info.csv: The start time, end time, location and type of the disturbance.
        • trans.csv: Voltage at nodes and power on branches in the transmission system via the T+D co-simulation.
    • Natural Oscillation: The folder contains all natural oscillation cases.
      • row_#: The folder contains all data of disturbance scenario #.
        • dist.csv: Three-phase voltages at nodes in the distribution system via the T+D co-simulation.
        • info.csv: The start time, end time, location and type of the disturbance.
        • trans.csv: Voltage at nodes and power on branches in the transmission system via the T+D co-simulation.
  • Field Description

    trans.csv

    • Field Time(s): Timestamp at millisecond resolution.
    • Field VOLT ###: Voltage magnitude (p.u.) at bus ### in the transmission model.
    • Field POWR ### TO ### CKT #: For example, POWR 151 TO 152 CKT '1 ' is the active power flowing on branch #1 from bus 151 to bus 152.
    • Field VARS ### TO ### CKT #: For example, VARS 151 TO 152 CKT '1 ' is the reactive power flowing on branch #1 from bus 151 to bus 152.

    dist.csv

    • Field Time(s): Timestamp at millisecond resolution.
    • Field ####.###.#: For example, 3005.633.1 is the per-unit voltage magnitude of phase A at bus 633 of the distribution grid connected to bus 3005 in the transmission system.
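
Putting the two file layouts together, a single disturbance case can be loaded as follows. This is a minimal sketch with a placeholder case path that assumes the file and column names described above.

import os
import pandas as pd

# Placeholder: one case folder under Forced Oscillation/ or Natural Oscillation/
case_dir = "row_0"

trans = pd.read_csv(os.path.join(case_dir, "trans.csv"))  # transmission-side measurements
dist  = pd.read_csv(os.path.join(case_dir, "dist.csv"))   # distribution-side three-phase voltages
info  = pd.read_csv(os.path.join(case_dir, "info.csv"))   # disturbance start/end time, location, type

print(info)

# Inspect a few transmission bus voltage magnitudes over time
volt_cols = [c for c in trans.columns if c.startswith("VOLT")]
print(trans[["Time(s)"] + volt_cols[:3]].head())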

Installation

  • Install PSML from source.
git clone https://github.com/tamu-engineering-research/Open-source-power-dataset.git
  • Create and activate an Anaconda virtual environment
conda create -n PSML python=3.7.10
conda activate PSML
  • Install required packages
pip install -r ./Code/requirements.txt
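
As an optional check that the environment is ready, the imports below should succeed when run from the repository root (this assumes PyTorch is pulled in by requirements.txt).

# Run from the repository root after installing requirements.txt
import torch
from Code.dataloader import TimeSeriesLoader
from Code.evaluator import TimeSeriesEvaluator

print(torch.__version__)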

Package Usage

We provide standard data loader and evaluator interfaces for all three time-series tasks:

(1) Data loaders

We provide the following PyTorch data loaders, with data processing and splitting included. You can load data for different tasks with just a few lines by changing the task parameter.

from Code.dataloader import TimeSeriesLoader

loader = TimeSeriesLoader(task='forecasting', root='./PSML') # suppose the raw dataset is downloaded and unzipped under Open-source-power-dataset
train_loader, test_loader = loader.load(batch_size=32, shuffle=True)
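
The exact batch structure depends on the chosen task, so a safe first step is to inspect one batch rather than assume a fixed layout; a minimal sketch:

# Inspect the first training batch; its structure depends on the task parameter
for batch in train_loader:
    print(type(batch))
    break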

(2) Evaluators

We also provide evaluators to support fair comparison among different approaches. The evaluator receives a dictionary input_dict (the expected key and value format for each task is given in evaluator.expected_input_format) and returns another dictionary storing the performance measured by task-specific metrics (the keys and values are explained in evaluator.expected_output_format).

from Code.evaluator import TimeSeriesEvaluator
evaluator = TimeSeriesEvaluator(task='classification', root='./PSML') # suppose the raw dataset is downloaded and unzipped under Open-source-power-dataset
# learn the appropriate format of input_dict
print(evaluator.expected_input_format) # expected input_dict format
print(evaluator.expected_output_format) # expected output dict format
# prepare input_dict: classification, localization and detection hold the model
# outputs for each sub-task, in the format given by expected_input_format
input_dict = {
    'classification': classification,
    'localization': localization,
    'detection': detection,
}
result_dict = evaluator.eval(input_dict)
# sample output: {'#samples': 110, 'classification': 0.6248447204968943, 'localization': 0.08633372048006195, 'detection': 42.59349593495935}

Code Navigation

Please see detailed explanation and comments in each subfolder.

  • BenchmarkModel
    • EventClassification: baseline models for event detection, classification and localization
    • LoadForecasting: baseline models for hierarchical load and renewable point forecasting and prediction intervals
    • Synthetic Data Generation: baseline models for synthetic generation of physical-law-constrained PMU measurement time series
  • Joint Simulation: Python code for joint steady-state and transient simulation between transmission and distribution systems
  • Data Processing: Python code for collecting the real-world load and weather data

License

The PSML dataset is published under the CC BY-NC 4.0 license, which allows everyone to use it for non-commercial research purposes.

Suggested Citation

  • Please cite the following paper when you use this data hub:
    X. Zheng, N. Xu, L. Trinh, D. Wu, T. Huang, S. Sivaranjani, Y. Liu, and L. Xie, "PSML: A Multi-scale Time-series Dataset for Machine Learning in Decarbonized Energy Grids." (2021).

Contact

Please contact us if you need further technical support or are interested in cooperation. Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
Email contact: Le Xie, Yan Liu, Xiangtian Zheng, Nan Xu, Dongqi Wu, Loc Trinh, Tong Huang, S. Sivaranjani.
