A warm welcome to the High-Performance Computing (HPC) Technical Committee's wiki page!
==Commitment==

As part of the OpenFOAM Governance structure, the HPC Technical Committee's commitment is to work together with the community to overcome the current HPC bottlenecks of OpenFOAM, and to demonstrate improvements in performance and scalability that take the code from its present near-petascale performance towards pre- and exascale class performance. An important part of the work is formulating code improvement proposals and recommendations to the [https://www.openfoam.com/governance/steering-committee.php Steering Committee], which makes the final decision.
 
Here you can find information about us, our current activities, and how to get in touch.
==Members of the Committee==

This is the list of the current active committee members, led by Ivan Spisso, who acts as chairman.
The members represent a well-balanced and geographically distributed mix of the Release and Maintenance Authority (ESI-OpenCFD, Wikki), HPC experts (CINECA, KAUST, ORNL, Shimizu Corp.), hardware OEMs (NVIDIA, Intel, ARM), HPC system integrators (E4, AWS), and domain-specific experts (FM Global, GM).
  
 
A brief description follows:
<gallery mode="packed" heights=160px>
File:ivan.jpg| Ivan Spisso, Committee chair
File:Olesen_cropped.jpg | Mark Olesen
File:bna_cropped.jpg | Simone Bna
File:rusche_cropped.jpg | Henrik Rusche
File:rossi.jpg | Giacomo Rossi
File:Magugliani_half.jpg | Fabrizio Magugliani
File:mathieu_amd.jpg | Mathieu Gontier
File:olly.jpg | Olivier Perks
File:stan.jpg | Stan Posey
File:william.jpg | William F. Godoy
File:Pham.jpg | Pham Van Phuc
File:fspiga_cropped.jpg | Filippo Spiga
File:Ashton.jpeg | Neil Ashton
File:Zampini_slim.jpg | Stefano Zampini
File:luwi_cropped.jpg | Oluwayemisi Oluwole
File:gregor_olenik.jpg | Gregor Olenik
File:Wasserman_cropped.JPG | Mark Wasserman
</gallery>
  
'''Chair: Ivan Spisso''', ''Senior HPC specialist for CFD applications'' [https://www.linkedin.com/in/ivan-spisso-324094a/], Leonardo Labs, Leonardo Finmeccanica (Italy)
  
 
'''Mark Olesen''': ''Principal Engineer'', ESI-OpenCFD (Germany)
 
'''Mark Olesen''': ''Principal Engineer'', ESI-OpenCFD (Germany)
  
'''Simone Bnà''': ''HPC developer'', SuperComputing Applications and Innovation (SCAI) Department, CINECA (Italy)
'''Henrik Rusche''', Wikki Ltd. (Germany)

'''Fabrizio Magugliani''', ''Strategic Planning and Business'', E4 (Italy)

'''Mathieu Gontier''', ''Field Support Manager, HPC CPU apps'', Santa Clara, CA (USA)

'''Giacomo Rossi''', ''Application Engineer'', Intel (Italy)

'''Oliver Perks''', ''Staff Field Application Engineer'', ARM (UK)

'''Stan Posey''', ''CFD Domain world-wide HPC Program Manager'', '''Filippo Spiga''', ''EMEA HPC Developer Relations'', NVIDIA (US/UK)

'''Neil Ashton''', ''Principal CFD Specialist'', Amazon Web Services (UK)
  
 
'''William F. Godoy''', ''Scientific Data Group'', Oak Ridge National Lab (US)
'''Pham Van Phuc''', ''Senior Researcher'', Institute of Technology, Shimizu Corporation (Japan)
  
 
'''Stefano Zampini''', ''Research Scientist'', Extreme Computing Research Center, KAUST (Saudi Arabia). Member of the PETSc development team.
'''Oluwayemisi Oluwole''', ''Lead Research Scientist'', Fire Dynamics Group, FM Global (USA)

'''Moududur Rahman''', '''Raman Bansal''', ''HPC SW Innovation Group'', General Motors (USA)

'''Gregor Olenik''', ''Post-doctoral researcher'', Karlsruhe Institute of Technology (KIT) (Germany)
  
'''Mark Wasserman''', ''Senior HPC Applications Architect'', Huawei Tel-Aviv Research Center (Israel)
  
 
== How to contact us ==

To contact the committee, a dedicated email address [mailto:openfoam-hpc-tc@googlegroups.com openfoam-hpc-tc@googlegroups.com] has been set up. This alias automatically forwards incoming requests to all current members of the committee. The chairman is responsible for processing incoming emails and providing an appropriate and timely answer.
  
 
==Our Workflow==

The Committee meets biannually:

* One physical meeting, at the annual ESI OpenFOAM Conference (typically in October)
* One virtual meeting, held six months after the physical one (around April)

In between the meetings, we carry out the planned activities and common on-going projects, and keep in touch via online collaboration tools.
==Remits of the Committee==

The recommendations to the Steering Committee in respect of the HPC technical area are:
 
*Work together with the Community to overcome the current HPC bottlenecks of OpenFOAM, to name a few:
**Scalability of linear solvers
**Adapt/modify data structures for SpMV (Sparse Matrix-Vector multiply) to enable vectorization/hybridization
**Improve memory access on new architectures
**Improve memory bandwidth
**Porting to new and emerging technologies
**Parallel pre- and post-processing, parallel I/O
**Load balancing
**In-situ visualization
 
*Strong co-design approach
*Identify algorithm improvements to enhance HPC scalability
*Interaction with the other [https://wiki.openfoam.com/Technical_Committees Technical Committees] (Numerics, Documentation, etc.)
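To make the SpMV remit concrete, the sketch below shows a minimal compressed sparse row (CSR) matrix-vector multiply in C++. This is a hypothetical illustration of the data-layout idea only, not OpenFOAM's actual lduMatrix format; the struct and function names are invented for the example.

```cpp
#include <cassert>
#include <vector>

// Minimal CSR (Compressed Sparse Row) layout: one contiguous array of
// non-zero values, their column indices, and per-row offsets into both.
// Hypothetical example, not OpenFOAM's lduMatrix addressing.
struct CsrMatrix {
    std::vector<int> rowPtr;      // size nRows+1: start of each row in colIdx/values
    std::vector<int> colIdx;      // column index of each non-zero
    std::vector<double> values;   // non-zero coefficients
};

// y = A*x. The inner loop streams contiguously through a row's non-zeros,
// which is what makes this layout amenable to vectorization, the property
// the remit above refers to.
std::vector<double> spmv(const CsrMatrix& A, const std::vector<double>& x) {
    const int nRows = static_cast<int>(A.rowPtr.size()) - 1;
    std::vector<double> y(nRows, 0.0);
    for (int i = 0; i < nRows; ++i) {
        double sum = 0.0;
        for (int k = A.rowPtr[i]; k < A.rowPtr[i + 1]; ++k) {
            sum += A.values[k] * x[A.colIdx[k]];
        }
        y[i] = sum;
    }
    return y;
}
```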
  
 
==Priorities==

The current priorities with respect to the aforementioned remits are:

*Improve scalability of linear algebra solvers
*HPC Benchmarks
*GPU enabling of OpenFOAM
*Parallel I/O
  
==Tasks==

* OpenFOAM HPC Benchmark Project ([[Media:OpenFOAM 2020 CINECA Spisso.pdf| recent presentation at the 8th OpenFOAM Virtual Conference]])
* HPC performance improvements for OpenFOAM linear solvers: [https://prace-ri.eu/hpc-access/preparatory-access/preparatory-access-awarded-projects/prace-preparatory-access-cut-off-34/#TypeC PRACE Preparatory Access Project]. '''News''': [https://prace-ri.eu/wp-content/uploads/WP294-PETSc4FOAM-A-Library-to-plug-in-PETSc-into-the-OpenFOAM-Framework.pdf white paper publicly available]
* GPU enabling of OpenFOAM: i) [[Media:OpenFOAM 2020 KAUST Zampini.pdf | acceleration using PETSc4FOAM]], ii) [[Media:OpenFOAM 2020 NVIDIA Martineau.pdf | AmgX GPU solver development]]
  
==Activity Log==

(Reverse chronological order)

* 2021-04-01: kick-off of the EU-funded project [https://exafoam.eu/ exaFOAM]
* 2020-10-13/15: several members presented relevant work at the HPC session of the [https://www.esi-group.com/company/events/2020/8th-openfoam-conference-2020 8th OpenFOAM Virtual Conference]
* 2020-01-14: submitted a proposal to the EuroHPC-03-2019 call: [https://ec.europa.eu/info/funding-tenders/opportunities/portal/screen/opportunities/topic-details/eurohpc-03-2019;freeTextSearchKeyword=;typeCodes=1;statusCodes=31094501,31094502,31094503;programCode=H2020;programDivisionCode=null;focusAreaCode=null;crossCuttingPriorityCode=null;callCode=H2020-JTI-EUROHPC-2019-1;sortQuery=openingDate;orderBy=asc;onlyTenders=false;topicListKey=callTopicSearchTableState Industrial software codes for extreme-scale computing environments and applications]
* 2019-10-17: Committee meeting held at the [https://www.esi-group.com/it/lazienda/eventi/2019/7th-openfoam-conference-2019 7th OpenFOAM Conference] in Berlin
* 2019-10-15/16: several members presented relevant work at the [https://www.esi-group.com/it/lazienda/eventi/2019/7th-openfoam-conference-2019 7th OpenFOAM Conference in Berlin]; [https://www.esi-group.com/it/lazienda/eventi/2019/7th-openfoam-conference-2019/agenda HPC Session] chaired by Ivan Spisso
* 2019-04-17: Committee officially ratified by the Steering Committee
  
==Repository==

The [https://develop.openfoam.com/committees/hpc code repository] for the HPC Technical Committee is an open and shared repository with HPC-relevant data-sets and terms of reference. '''Work in progress!'''
  
==Code contributions==

The code contributions of the HPC TC are available as regular [https://www.openfoam.com/documentation/guides/latest/api/modules.html modules]:

* Parallel I/O with [https://develop.openfoam.com/Community/adiosfoam OpenFOAM ADIOS2]: since the release of OpenFOAM v1912, the adiosWrite function object has been rewritten to use parallel I/O
* A collection of visualization [https://develop.openfoam.com/modules/visualization interfaces] for OpenFOAM, primarily VTK/ParaView based
* PETSc4FOAM [https://develop.openfoam.com/modules/external-solver library]: a library for OpenFOAM that provides a solver for embedding PETSc and its external dependencies (e.g. Hypre) into arbitrary OpenFOAM simulations
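As a sketch of how such an external solver is typically selected, the fragment below shows a <code>system/fvSolution</code> entry for the pressure equation. The keyword names (<code>petsc</code>, <code>options</code>, the PETSc option strings) follow the PETSc4FOAM white paper linked above, but should be checked against the module's own documentation before use; the library also has to be loaded in <code>controlDict</code> (e.g. via a <code>libs</code> entry), whose exact name is an assumption here.

```
// system/fvSolution -- hedged sketch: select the PETSc solver for p
// (keyword names per the PETSc4FOAM white paper; verify against the
// external-solver module README before use)
solvers
{
    p
    {
        solver          petsc;          // provided by the external-solver module

        petsc
        {
            options
            {
                ksp_type     cg;        // PETSc Krylov method: conjugate gradient
                pc_type      bjacobi;   // block-Jacobi preconditioner
                sub_pc_type  icc;       // incomplete Cholesky on each block
            }
        }

        tolerance       1e-06;
        relTol          0.01;
    }
}
```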
  
==Planned / Future activities==

*First Italian OpenFOAM User Meeting (around November 2020, TBD)
*Parallel I/O test at scale with ADIOS2
*Include GPU support in the ''PETSc4FOAM'' library

{{OpenFOAM Governance}}

''last modified {{CURRENTDAY2}} {{CURRENTMONTHNAMEGEN}} {{CURRENTYEAR}}''
