
Natural-gas storage modelling by deep reinforcement learning

Abstract

We introduce GasRL, a simulator that couples a calibrated representation of the natural gas market with a model of storage-operator policies trained with deep reinforcement learning (RL). We use it to analyse how optimal stockpile management affects equilibrium prices and the dynamics of demand and supply. We test various RL algorithms and find that Soft Actor-Critic (SAC) exhibits superior performance in the GasRL environment: multiple objectives of storage operators, including profitability, robust market clearing and price stabilisation, are successfully achieved. Moreover, the equilibrium price dynamics induced by SAC-derived optimal policies have characteristics, such as volatility and seasonality, that closely match those of real-world prices. Remarkably, this adherence to the historical distribution of prices is obtained without explicitly calibrating the model to price data. We show how the simulator can be used to assess the effects of EU-mandated minimum storage thresholds. We find that such thresholds have a positive effect on market resilience against unanticipated shifts in the distribution of supply shocks. For example, with unusually large shocks, market disruptions are averted more often if a threshold is in place.
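GasRL itself is not published here, so the following is a minimal, hypothetical sketch of the kind of training setup the abstract describes: a gymnasium-style storage environment paired with a Soft Actor-Critic agent from stable-baselines3. The environment name (GasStorageEnv), its state and action definitions, reward terms, and all numeric parameters are illustrative assumptions, not the paper's calibrated market model.

```python
# Hypothetical sketch only: GasStorageEnv is an illustrative toy environment,
# not the GasRL simulator described in the paper.
import numpy as np
import gymnasium as gym
from gymnasium import spaces
from stable_baselines3 import SAC


class GasStorageEnv(gym.Env):
    """Toy single-operator storage environment.

    State:  [inventory share, seasonal demand index]
    Action: injection (+) / withdrawal (-) rate, as a fraction of capacity
    Reward: trading profit (sell when prices are high, buy when low) minus a
            penalty for hitting the storage bounds (proxy for failed clearing).
    """

    def __init__(self, horizon=365):
        super().__init__()
        self.horizon = horizon
        self.observation_space = spaces.Box(low=0.0, high=1.0, shape=(2,), dtype=np.float32)
        self.action_space = spaces.Box(low=-1.0, high=1.0, shape=(1,), dtype=np.float32)

    def reset(self, *, seed=None, options=None):
        super().reset(seed=seed)
        self.t = 0
        self.inventory = 0.5  # start half full
        return self._obs(), {}

    def step(self, action):
        flow = 0.05 * float(action[0])           # at most 5% of capacity per day
        season = np.sin(2 * np.pi * self.t / 365.0)
        price = 1.0 - 0.3 * season               # higher price in high-demand season
        profit = -flow * price                   # withdrawing (flow < 0) sells at `price`
        self.inventory = float(np.clip(self.inventory + flow, 0.0, 1.0))
        penalty = 0.5 if self.inventory in (0.0, 1.0) else 0.0
        self.t += 1
        truncated = self.t >= self.horizon
        return self._obs(), profit - penalty, False, truncated, {}

    def _obs(self):
        season = 0.5 * (1.0 + np.sin(2 * np.pi * self.t / 365.0))
        return np.array([self.inventory, season], dtype=np.float32)


env = GasStorageEnv()
model = SAC("MlpPolicy", env, verbose=0)
model.learn(total_timesteps=50_000)
```

In this toy setup the agent is rewarded for buying and storing gas when seasonal prices are low and selling when they are high, which is the intuition behind the profitability and price-stabilisation objectives described above; the paper's actual environment models market equilibrium rather than an exogenous price path.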

Key Contributions

Introduces GasRL, a simulator that couples a calibrated natural-gas market model with storage-operator policies trained by deep RL. Shows that SAC outperforms the other algorithms tested, jointly achieving profitability, robust market clearing, and price stabilization, and that the resulting equilibrium prices reproduce real-world volatility and seasonality without explicit calibration to price data. Also uses the simulator to evaluate EU-mandated minimum storage thresholds, which improve market resilience to unusually large supply shocks.
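As one hedged illustration of how a minimum-storage mandate could be imposed in a simulator of this kind, the sketch below wraps the toy environment from the earlier example and penalises inventory shortfalls after a fill deadline. The wrapper, the 90% threshold, the deadline, and the penalty size are assumptions chosen to echo EU-style rules, not GasRL's actual implementation.

```python
# Hypothetical sketch: impose a minimum-fill mandate via a reward penalty.
# Assumes the toy GasStorageEnv above, where obs[0] is the inventory share
# and the wrapped env exposes a day counter `t`.
import gymnasium as gym


class MinimumStorageWrapper(gym.Wrapper):
    def __init__(self, env, threshold=0.9, deadline=305, penalty=1.0):
        super().__init__(env)
        self.threshold = threshold   # e.g. 90% fill, echoing EU storage rules
        self.deadline = deadline     # day of year by which the target must be met
        self.penalty = penalty

    def step(self, action):
        obs, reward, terminated, truncated, info = self.env.step(action)
        inventory, t = obs[0], getattr(self.env, "t", 0)
        # Penalise shortfalls once the deadline has passed.
        if t >= self.deadline and inventory < self.threshold:
            reward -= self.penalty * (self.threshold - inventory)
        return obs, reward, terminated, truncated, info


# Training and evaluating agents with and without this wrapper, under
# progressively larger supply shocks, mirrors the kind of threshold
# comparison the paper reports for EU minimum-storage mandates.
```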

Business Value

Gives energy companies and traders a tool to optimize natural gas storage operations for profitability, market stability, and better resource allocation, and lets regulators stress-test storage mandates such as EU minimum-fill thresholds.