LLM x HPC
2025 International Workshop on Large Language Models (LLMs) and HPC
September 2-5 (exact date TBA), Edinburgh, United Kingdom. Held in conjunction with the IEEE CLUSTER 2025 conference.

Call for Papers
High-Performance Computing (HPC) systems have become critical for meeting the computational and data-intensive needs of training Large Language Models (LLMs). Simultaneously, in the domain of HPC research, LLMs are emerging as transformative tools to understand and improve HPC system productivity and efficiency. There are clear synergies between these areas and meaningful coordination of efforts holds great promise. This workshop brings together researchers and developers to explore the intersection of HPC and LLMs, offering a comprehensive look at how these two domains can mutually benefit and drive each other's advancement.
The workshop has two key focus areas: (i) co-design and deployment of HPC systems to support LLM training, and (ii) using LLMs to understand and optimize/tune HPC systems. The program will combine paper presentations, a panel discussion, and a keynote to highlight salient research and development activities, promote diverse perspectives and visions, and stimulate discussion in the community.
Topics to be covered in this workshop include, but are not limited to:
- The computational and data needs of LLM training
- Exploring architectural advancements that support LLM training, such as:
  - GPU-accelerated computing
  - High-bandwidth memory systems
  - Advanced networking capabilities
- LLM-HPC co-design efforts
- Utilizing LLMs to improve HPC deployment and operations, for example:
  - Analyzing extensive system logs for performance, energy efficiency, and reliability
  - Fine-tuning complex HPC hardware and software stacks
- HPC design space exploration using LLMs
Important Dates
- Submission deadline: June 20, 2025
- Author notification: July 18, 2025
- Camera-ready: August 6, 2025
How to Submit
Workshop papers will be included in the IEEE Cluster 2025 proceedings.
Papers must be in IEEE format and may be submitted as either:
- Full (or invited) paper: 8 pages, plus 2 additional pages to address reviewers' comments in the camera-ready version
- Short (or invited) paper: 4 pages, plus 1 additional page to address reviewers' comments in the camera-ready version
Guidelines for Artificial Intelligence (AI)-Generated Text
The use of content generated by artificial intelligence (AI) in a paper (including but not limited to text, figures, images, and code) shall be disclosed in the acknowledgments section of any paper submitted to an IEEE publication. The AI system used shall be identified, and specific sections of the paper that use AI-generated content shall be identified and accompanied by a brief explanation regarding the level at which the AI system was used to generate the content.
The use of AI systems for editing and grammar enhancement is common practice and, as such, is generally outside the intent of the above policy. In this case, disclosure as noted above is recommended.
Full IEEE submission policies can be found on the IEEE website.
Paper submission: coming soon.
Please direct any inquiries to llmhpc-workshop@lists.anl.gov
Accepted papers will be included in the IEEE Cluster 2025 proceedings and published in the IEEE Xplore digital library.
Program
TBA
Organizers
- Tanwi Mallick (Argonne National Laboratory)
- Aleksandr Drozd (RIKEN Center for Computational Science)
- Matthieu Dorier (Argonne National Laboratory)
- Rosa Filgueira (University of Edinburgh)
Steering Committee
- Ian Foster (Argonne National Laboratory)
- Kevin A. Brown (Argonne National Laboratory)