Supporting High-Performance and High-Throughput Computing for Experimental Science

10/06/2018
by E. A. Huerta, et al.

The advent of experimental science facilities, instruments and observatories, such as the Large Hadron Collider (LHC), the Laser Interferometer Gravitational Wave Observatory (LIGO), and the upcoming Large Synoptic Survey Telescope (LSST), has brought about challenging, large-scale computational and data processing requirements. Traditionally, the computing infrastructure supporting these facilities' requirements was organized into separate systems: those that supported their high-throughput computing needs and those that supported their high-performance computing needs. We argue that in order to enable and accelerate scientific discovery at the scale and sophistication that is now needed, this separation between High-Performance Computing (HPC) and High-Throughput Computing (HTC) must be bridged and an integrated, unified infrastructure must be provided. In this paper, we discuss several case studies where such infrastructures have been implemented. These case studies span different science domains, software systems, and application requirements, as well as levels of sustainability. A further aim of this paper is to provide a basis to determine the common characteristics and requirements of such infrastructures, as well as to begin a discussion of how best to support the computing requirements of existing and future experimental science facilities.
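To make the HPC/HTC distinction the abstract draws a little more concrete, the sketch below (not taken from the paper; the Task fields and queue names are illustrative assumptions) routes tightly coupled, multi-node jobs to an HPC batch system and embarrassingly parallel, single-node jobs to an HTC pool. In the integrated model the paper argues for, both kinds of workload would be served by one unified infrastructure rather than two separate ones.

    from dataclasses import dataclass

    # Hypothetical task description; the field names are illustrative only.
    @dataclass
    class Task:
        name: str
        nodes: int          # number of tightly coupled nodes required
        needs_mpi: bool     # True if ranks must communicate at low latency

    def route(task: Task) -> str:
        """Illustrative routing of a task to an HPC batch system or an HTC pool.

        A tightly coupled, multi-node job (e.g. a numerical-relativity
        simulation) belongs on an HPC scheduler, while an embarrassingly
        parallel, single-node job (e.g. scanning independent LIGO data
        segments) fits an HTC pool.
        """
        if task.needs_mpi or task.nodes > 1:
            return "hpc-batch"   # e.g. a supercomputer's batch queue
        return "htc-pool"        # e.g. a distributed high-throughput pool

    if __name__ == "__main__":
        # A parameter sweep of independent analyses goes to the HTC pool;
        # a coupled simulation goes to the HPC side of the same facility.
        for t in (Task("segment-scan", nodes=1, needs_mpi=False),
                  Task("nr-simulation", nodes=128, needs_mpi=True)):
            print(t.name, "->", route(t))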
