
Awesome Energy LLM papers

An awesome collection of papers on LLMs for the energy domain


This repository collects papers, code, and resources related to applying Large Language Models (LLMs) in the energy domain. The goal is to help researchers, engineers, and students stay up-to-date with recent advances, implementation practices, and emerging trends in this interdisciplinary field.


🔍 Scope

We focus on applications of LLMs in areas including but not limited to:

  • Load forecasting
  • Prediction
  • Optimal Power Flow (OPF)
  • Deep Reinforcement Learning (DRL)-related applications
  • ⭐️Bonus⭐️: LLM-based Time Series Analysis

📄 Papers

Survey

  • [arXiv] From Transformers to Large Language Models: A systematic review of AI applications in the energy sector towards Agentic Digital Twins. 2025.06

Forecasting and Predicting

  • [Applied Energy] MMGPT4LF: Leveraging an optimized pre-trained GPT-2 model with multi-modal cross-attention for load forecasting. 2025.08

  • [Expert Systems with Applications] Prompting large language model for multi-location multi-step zero-shot wind power forecasting. 2025.06

  • [arXiv] A Novel Distributed PV Power Forecasting Approach Based on Time-LLM. 2025.05

  • [Applied Energy] Implementing a provincial-level universal daily industrial carbon emissions prediction by fine-tuning the large language model. 2025.04

  • [Applied Energy] Toward Large Energy Models: A comparative study of Transformers’ efficacy for energy forecasting. 2025.04

  • [IEEE TPS] Empower Pre-Trained Large Language Models for Building-Level Load Forecasting. 2025.03

  • [arXiv] Zero-shot Load Forecasting for Integrated Energy Systems: A Large Language Model-based Framework with Multi-task Learning. 2025.02

  • [Applied Energy] Domain-specific large language models for fault diagnosis of heating, ventilation, and air conditioning systems by labeled-data-supervised fine-tuning. 2025.01

  • [Applied Energy] TimeGPT in load forecasting: A large time series model perspective. 2025.01

  • [Applied Energy] Applying fine-tuned LLMs for reducing data needs in load profile analysis. 2025.01

  • [Applied Energy] STELLM: Spatio-temporal enhanced pre-trained large language model for wind speed forecasting. 2024.11

  • [arXiv] Zero-Shot Load Forecasting with Large Language Models. 2024.11

  • [arXiv] A General Framework for Load Forecasting based on Pre-trained Large Language Model. 2024.09

  • [Applied Energy] EPlus-LLM: A large language model-based computing platform for automated building energy modeling. 2024.08

  • [TechRxiv] LFLLM: A Large Language Model for Load Forecasting. 2024.01

Optimal Power Flow (OPF)

  • [Scientific Reports] A large language model for advanced power dispatch. 2025.03

  • [arXiv] SafePowerGraph-LLM: Novel Power Grid Graph Embedding and Optimization with Large Language Models. 2025.01

  • [IEEE TPS] Real-Time Optimal Power Flow With Linguistic Stipulations: Integrating GPT-Agent and Deep Reinforcement Learning. 2023.11

Deep Reinforcement Learning Related

  • [Applied Energy] Adaptive infinite-horizon control of hybrid EV/FCEV charging hubs: A large-model based deep reinforcement learning approach. 2025.07

  • [IEEE TPS] Real-Time Optimal Power Flow With Linguistic Stipulations: Integrating GPT-Agent and Deep Reinforcement Learning. 2023.11

Other Topics

  • [IEEE TSG] A Large Language Model for Determining Partial Tripping of Distributed Energy Resources. 2025.01

  • [IEEE TSG] Large Language Model for Smart Inverter Cyber-Attack Detection via Textual Analysis of Volt/VAR Commands. 2024.09

  • [arXiv] GAIA -- A Large Language Model for Advanced Power Dispatch. 2024.08

  • [IEEE TSG] Applying Large Language Models to Power Systems: Potential Security Threats. 2024.03

⭐️BONUS⭐️: LLM-based Time Series Analysis

Survey

Benchmark

  • [arXiv] Intervention-Aware Forecasting: Breaking Historical Limits from a System Perspective. 2025.05 code1 code2 dataset WeChat article

  • [arXiv] Time-MQA: Time Series Multi-Task Question Answering with Context Enhancement. 2025.02 code

  • [arXiv] Context is Key: A Benchmark for Forecasting with Essential Textual Information. 2025.02 code

Technique Papers

  • [arXiv] Time Series Representations for Classification Lie Hidden in Pretrained Vision Transformers. 2025.07

  • [arXiv] TSRating: Rating Quality of Diverse Time Series Data by Meta-learning from LLM Judgment. 2025.06

  • [arXiv] Time Series Forecasting as Reasoning: A Slow-Thinking Approach with Reinforced LLMs. 2025.06

  • [arXiv] Efficient Multivariate Time Series Forecasting via Calibrated Language Models with Privileged Knowledge Distillation. 2025.05

  • [VLDB 2025] ChatTS: Aligning Time Series with LLMs via Synthetic Data for Enhanced Understanding and Reasoning. 2025.04 code WeChat article

  • [arXiv] Position: Empowering Time Series Reasoning with Multimodal LLMs. 2025.02 code WeChat article

  • [ICLR 2025] Towards Neural Scaling Laws for Time Series Foundation Models. 2025.01 code WeChat article

  • [WWW 2025] Exploiting Language Power for Time Series Forecasting with Exogenous Variables (EXOLLM). 2025.01 code WeChat article

  • [MDPI] TS-HTFA: Advancing Time-Series Forecasting via Hierarchical Text-Free Alignment with Large Language Models. 2025.01

  • [TMLR] Chronos: Learning the Language of Time Series. 2024.11 code blog

  • [ICLR 2025] Timer-XL: Long-Context Transformers for Unified Time Series Forecasting. 2024.11 code WeChat article

  • [NeurIPS 2024] AutoTimes: Autoregressive Time Series Forecasters via Large Language Models. 2024.10 code WeChat article

  • [NeurIPS 2024] Are Language Models Actually Useful for Time Series Forecasting? 2024.10 code

  • [arXiv] Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts. 2024.10

  • [WWW 2024] UniTime: A Language-Empowered Unified Model for Cross-Domain Time Series Forecasting. 2024.05 code

  • [arXiv] Unified Training of Universal Time Series Forecasting Transformers (Moirai). 2024.05 code

  • [ICLR 2024] MOMENT: A Family of Open Time-series Foundation Models. 2024.05 code

  • [ICLR 2024] TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting. 2024.05 code

  • [ICML 2024] A decoder-only foundation model for time-series forecasting. 2024.04 code

  • [arXiv] TEST: Text Prototype Aligned Embedding to Activate LLM's Ability for Time Series. 2024.02

  • [ICLR 2024] Time-LLM: Time Series Forecasting by Reprogramming Large Language Models. 2024.01 code

  • [NeurIPS 2023] One Fits All: Power General Time Series Analysis by Pretrained LM (FPT). 2023.10 code

  • [NeurIPS 2023] Large Language Models Are Zero-Shot Time Series Forecasters (LLMTime). 2023.10 code

  • [TKDE] PromptCast: A New Prompt-Based Learning Paradigm for Time Series Forecasting. 2022.09


📌 Other related repo

Awesome-Multimodal-LLMs-Time-Series

Multi-modal-Time-Series-Analysis


🚧 Contributions

Contributions are welcome! If you have papers, code, or datasets to share, feel free to open a pull request or submit an issue.


📬 Contact

Maintained by Weilong Chen. If you find this project useful, feel free to ⭐️ star and share it.

Star History

Star History Chart
