Ground-based gravitational-wave interferometers such as LIGO have detected dozens of gravitational-wave sources. Scientific progress would be much faster if AI frameworks for production-scale gravitational-wave detection were available.
A recent paper on arXiv.org introduces a solution that optimizes AI models for accelerated inference. The researchers combine AI and high-performance computing (HPC) to speed up the training of AI models, optimize them for inference, and scale their science output by distributing inference across dozens of GPUs.
Using accelerated signal detection with HPC at scale, one month of advanced LIGO strain data is processed within 50 seconds using 20 nodes of the ThetaGPU supercomputer. These models retain the same sensitivity as traditional AI models, report no misclassifications, and reduce time-to-insight by up to 3x.
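Distributing a month of strain data over 20 nodes implies splitting one long time series across many workers. One standard ingredient of any such pipeline (not a detail the paper spells out; the function name and parameters below are my own) is sharding the series with overlapping edges, so that a candidate signal straddling a chunk boundary is still seen whole by at least one worker. A minimal sketch:

```python
def shard_ranges(n_samples, n_shards, overlap):
    """Split a time series of length n_samples into n_shards index
    ranges, padding each interior boundary by `overlap` samples so no
    candidate at a boundary is missed by every worker."""
    base = n_samples // n_shards
    ranges = []
    for i in range(n_shards):
        start = max(0, i * base - overlap)
        # the last shard absorbs the remainder of the series
        stop = n_samples if i == n_shards - 1 else (i + 1) * base + overlap
        ranges.append((start, stop))
    return ranges

# Example: 100 samples over 4 workers with a 5-sample overlap.
print(shard_ranges(100, 4, 5))  # [(0, 30), (20, 55), (45, 80), (70, 100)]
```

Each worker then runs inference on its own range, and duplicate detections in the overlap regions are de-duplicated during post-processing.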
We introduce an ensemble of artificial intelligence models for gravitational wave detection that we trained in the Summit supercomputer using 32 nodes, equivalent to 192 NVIDIA V100 GPUs, within 2 hours. Once fully trained, we optimized these models for accelerated inference using NVIDIA TensorRT. We deployed our inference-optimized AI ensemble in the ThetaGPU supercomputer at Argonne Leadership Computing Facility to conduct distributed inference. Using the entire ThetaGPU supercomputer, consisting of 20 nodes, each of which has 8 NVIDIA A100 Tensor Core GPUs and 2 AMD Rome CPUs, our NVIDIA TensorRT-optimized AI ensemble processed an entire month of advanced LIGO data (including Hanford and Livingston data streams) within 50 seconds. Our inference-optimized AI ensemble retains the same sensitivity of traditional AI models, namely, it identifies all known binary black hole mergers previously identified in this advanced LIGO dataset and reports no misclassifications, while also providing a 3X inference speedup compared to traditional artificial intelligence models. We used time slides to quantify the performance of our AI ensemble to process up to 5 years worth of advanced LIGO data. In this synthetically enhanced dataset, our AI ensemble reports an average of one misclassification for every month of searched advanced LIGO data. We also present the receiver operating characteristic curve of our AI ensemble using this 5 year long advanced LIGO dataset. This approach provides the required tools to conduct accelerated, AI-driven gravitational wave detection at scale.
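The 5-year background study in the abstract rests on time slides: one detector's candidate-trigger times are shifted by non-physical offsets (far larger than the ~10 ms gravitational-wave travel time between Hanford and Livingston), so any coincidences that survive must be noise, and each shift yields an effectively independent background realization of the same data span. A hedged pure-Python sketch of the idea (the function names, trigger-list representation, and coincidence window are illustrative assumptions, not the paper's implementation):

```python
import bisect

def count_coincidences(h_times, l_times, window):
    """Count Hanford/Livingston trigger pairs closer than `window` seconds."""
    l_sorted = sorted(l_times)
    total = 0
    for t in h_times:
        lo = bisect.bisect_left(l_sorted, t - window)
        hi = bisect.bisect_right(l_sorted, t + window)
        total += hi - lo
    return total

def time_slide_background(h_times, l_times, shifts, window):
    """Estimate accidental coincidences by sliding the Livingston trigger
    times by each non-physical offset in `shifts` and recounting."""
    return [count_coincidences(h_times, [t + s for t in l_times], window)
            for s in shifts]

# Zero shift keeps the one genuine coincidence; a 100 s slide destroys it.
h = [10.0, 50.0]
l = [10.005, 200.0]
print(time_slide_background(h, l, shifts=[0.0, 100.0], window=0.01))  # [1, 0]
```

Summing the counts over many slides converts a month of real data into years of effective background, which is how a false-alarm rate as low as one misclassification per month can be measured at all.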
Research paper: Chaturvedi, P., Khan, A., Tian, M., Huerta, E. A., and Zheng, H., "Inference-optimized AI and high performance computing for gravitational wave detection at scale", 2022. Link: https://arxiv.org/abs/2201.11133