Session
Second Workshop on Enabling Predictive Science with Optimization and Uncertainty Quantification in HPC (EPSOUQ-HPC 2024)
Session Chairs
Description
Thrust 1 – Methods driven
Uncertainty quantification (UQ) theory and algorithms play a pivotal role in reducing the impact of uncertainties during both optimization and decision-making processes. Bayesian approximation and ensemble learning techniques are among the most widely used UQ methods. The workshop seeks to broaden this scope by welcoming work from researchers representing a variety of UQ methods, particularly work that examines performance with a view to general applicability across a wide range of applications, including those mentioned in the applications section as well as computer vision, image processing, medical image analysis, natural language processing, and bioinformatics.
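As a minimal sketch of the ensemble-learning flavor of UQ mentioned above (with placeholder models and data rather than any particular method from the workshop), the spread across an ensemble of independently perturbed predictors can serve as a simple predictive-uncertainty estimate:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_member(rng):
    # Placeholder "trained model": the true trend plus a member-specific
    # offset standing in for training-data/initialization variability.
    offset = 0.1 * rng.standard_normal()
    return lambda x: np.sin(x) + offset

# An ensemble of 20 such members (in practice: independently trained models).
ensemble = [make_member(rng) for _ in range(20)]

x = np.linspace(0.0, np.pi, 50)
predictions = np.stack([member(x) for member in ensemble])

# The ensemble mean gives the point prediction; the standard deviation
# across members is a simple estimate of predictive uncertainty.
mean = predictions.mean(axis=0)
std = predictions.std(axis=0)
print(f"max predictive std over the domain: {std.max():.3f}")
```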
Thrust 2 – Model/Data driven
With the advent of the Exascale era, we expect to see more of these existing uncertainty propagation methodologies embedded within or wrapped around deterministic simulation codes. For example, sampling-based methods repeatedly call a deterministic simulation code for different values of the model inputs. The independent instantiations of this method make it a good fit for parallel programming paradigms and for deployment across compute nodes on different HPC systems. However, it is unlikely that an Exascale computer will provide enough concurrency for a thousand-fold increase in sample evaluations for uncertainty propagation applied in this manner. A key topic in multi-scale simulations is the handling of uncertainty in coupled codes, where two or more separate codes work together; coupled codes typically combine particle-based and continuum (PDE-based) models to bridge scales. This thrust will also include methods and applications that bring together AI/ML and physics-based methods to enable UQ and optimization. Example topics in this area include in-situ surrogate construction, intrusive reduced-order modeling, and feature/manifold learning in large data sets.
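To make the sampling-based propagation pattern concrete, here is a hedged sketch: `simulate` is a hypothetical stand-in for a deterministic simulation code, the input distribution and sample count are illustrative assumptions, and Python's standard process pool stands in for the MPI ranks or batch jobs one would use on an HPC system:

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def simulate(theta):
    """Stand-in for a deterministic simulation code: maps one input sample
    to a scalar quantity of interest. In practice this would invoke the
    actual solver."""
    x, y = theta
    return np.exp(-x) * np.sin(y)

def main():
    rng = np.random.default_rng(42)
    # Draw input samples from an assumed input distribution.
    samples = rng.normal(loc=[1.0, 0.5], scale=[0.1, 0.2], size=(1000, 2))

    # Each sample evaluation is independent, so the loop maps naturally
    # onto process pools, MPI ranks, or batch jobs on separate nodes.
    with ProcessPoolExecutor() as pool:
        qois = list(pool.map(simulate, samples, chunksize=50))

    qois = np.asarray(qois)
    # Propagated uncertainty summarized as moments of the output QoI.
    print(f"mean = {qois.mean():.4f}, std = {qois.std():.4f}")

if __name__ == "__main__":
    main()
```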
Thrust 3 – Use-inspired HPC challenges and solutions
Of particular interest are real-world experiences with HPC systems and articulations of active challenges in using these systems for optimization and UQ. Potential topic areas motivated by the desire to perform optimization and UQ include strategies for overcoming supercomputing bottlenecks such as the impact of queue policies and restrictions on scientific workflows, I/O limitations and methods for I/O coupling when using multiple distinct simulation/analysis tools, in-memory and file-based coupling strategies, mixed-language and mixed-precision computing, and general memory/compute limitations.
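As one small illustration of the file-based coupling strategies listed above (all file names and formats here are hypothetical), a simulation step can checkpoint its state to disk and a separate analysis tool can pick it up in a later process:

```python
import json
import pathlib
import subprocess
import sys
import tempfile

# Hypothetical file-based coupling: a "simulation" step writes its state to
# disk and a separate "analysis" process reads it back. Both steps here are
# placeholders; in practice they would be distinct simulation/analysis codes
# exchanging checkpoint or restart files.
workdir = pathlib.Path(tempfile.mkdtemp())
state_file = workdir / "state.json"

# Step 1: the simulation writes a checkpoint-like file.
state = {"step": 10, "temperature": [300.0, 301.5, 299.8]}
state_file.write_text(json.dumps(state))

# Step 2: a separate analysis process reads the file and reports a statistic.
analysis_snippet = (
    "import json, sys; "
    "d = json.load(open(sys.argv[1])); "
    "print(sum(d['temperature']) / len(d['temperature']))"
)
result = subprocess.run(
    [sys.executable, "-c", analysis_snippet, str(state_file)],
    capture_output=True, text=True, check=True,
)
print("mean temperature reported by the analysis tool:", result.stdout.strip())
```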
Event Type: Workshop
Time: Monday, 18 November 2024, 9am - 12:30pm EST
Location: B301
Applications and Application Frameworks
Algorithms
Performance Evaluation and/or Optimization Tools
Presentations
9:00am - 9:20am EST | Welcome and Introduction
9:20am - 9:40am EST | A Scalable Training-Free Diffusion Model for Uncertainty Quantification
9:40am - 10:00am EST | Ensemble Simulations on Leadership Computing Systems
10:00am - 10:30am EST | EPSOUQ-HPC 2024 Morning Break
10:30am - 11:30am EST | Panel Discussion
11:30am - 11:50am EST | Scrutinizing Variables for Checkpoint Using Automatic Differentiation
11:50am - 12:10pm EST | Optimizing Uncertainty Estimation on Scientific Visualizations Using Learning Models
12:10pm - 12:30pm EST | Closing Remarks, OLCF/SNL Updates, Best Paper Announcement