Fermilab builds and operates high-energy, high-intensity particle accelerators. These accelerators are a prime testbed for a diverse suite of machine learning applications, including multivariate time-series prediction for high-precision accelerator tuning, anomaly detection for equipment-failure prediction, and surrogate modeling to supplement detailed numerical simulations. Our strict demands for safety and scientific rigor drive innovation in machine learning.
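As an illustration of the anomaly detection idea, here is a minimal sketch, not Fermilab's production method: a rolling z-score rule (the window size, threshold, and synthetic sensor trace are all illustrative assumptions) that flags points deviating sharply from recent history.

```python
import numpy as np

def rolling_zscore_anomalies(signal, window=50, threshold=6.0):
    """Flag samples that deviate from the trailing rolling mean by
    more than `threshold` trailing standard deviations."""
    signal = np.asarray(signal, dtype=float)
    flags = np.zeros(len(signal), dtype=bool)
    for i in range(window, len(signal)):
        hist = signal[i - window:i]
        mu, sigma = hist.mean(), hist.std()
        if sigma > 0 and abs(signal[i] - mu) > threshold * sigma:
            flags[i] = True
    return flags

# Synthetic sensor trace: a steady reading with noise and one glitch
rng = np.random.default_rng(0)
trace = 1.0 + 0.05 * rng.standard_normal(500)
trace[300] += 2.0  # simulated equipment fault
print(np.flatnonzero(rolling_zscore_anomalies(trace)))  # → [300]
```

Real accelerator signals are far less tame than this toy trace, which is why learned models (rather than fixed thresholds) are the subject of active research.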
Fermilab researchers are using cutting-edge AI algorithms to bring astronomy research into the big-data era. Fermilab is deeply involved in the biggest current and future astronomical surveys, such as the Dark Energy Survey and surveys at the South Pole Telescope and the Vera C. Rubin Observatory. Researchers use AI to detect and study astronomical objects and related phenomena, as well as for automation and self-driving telescopes.
Detectors and Sensors
Reconfigurable detectors with dynamically selectable sensing and readout modes are highly desirable for implementing edge computing and for enabling intelligent data collection with efficient bandwidth usage. A detector system that simultaneously performs different functions in different regions, and reconfigures dynamically based on what it observes, would amount to an adaptive, autonomous detector with optimized spatial and temporal control.
Foundational AI Algorithms
Fermilab scientists are developing novel AI algorithms. Partnering with other laboratories, universities, and industry, Fermilab is driving forward the development of AI for high-energy particle physics and beyond. Examples include quantifying uncertainties in machine learning algorithms, carrying out computations on graphs, ultra-efficient AI optimization, and normalizing flows for phase-space integration.
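One of those threads, quantifying the uncertainty of a learned model, can be sketched with a bootstrap ensemble. This is a generic illustration under assumed toy data (a noisy quadratic), not a specific Fermilab algorithm: each ensemble member refits on a resampled dataset, and the spread of their predictions estimates the uncertainty.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy calibration data: a noisy quadratic response
x = np.linspace(-1, 1, 40)
y = 0.5 * x**2 + 0.1 * rng.standard_normal(x.size)

# Bootstrap ensemble: refit on resampled data; spread = uncertainty
n_models, grid = 200, np.linspace(-1, 1, 101)
preds = np.empty((n_models, grid.size))
for m in range(n_models):
    idx = rng.integers(0, x.size, x.size)       # resample with replacement
    coeffs = np.polyfit(x[idx], y[idx], deg=2)  # refit the model
    preds[m] = np.polyval(coeffs, grid)

mean, std = preds.mean(axis=0), preds.std(axis=0)
print(f"prediction at x=0: {mean[50]:.2f} +/- {std[50]:.2f}")
```

The same resampling idea carries over to neural networks, where ensembles (or Bayesian approximations) provide the error bars that physics analyses require.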
Fermilab is a leader in the CMS experiment at CERN’s Large Hadron Collider and is deploying AI techniques across a broad range of applications and technologies. Current developments include early data-processing tasks, particle-event reconstruction, pattern recognition, more efficient event generation and detector simulation with neural networks, and the analysis and extraction of physical observables.
Subatomic particles called neutrinos are among the most elusive in the particle kingdom. Fermilab is the premier U.S. laboratory for studying neutrinos and hosts the Deep Underground Neutrino Experiment, an international flagship experiment to unlock the mysteries of these particles, bringing together scientists from 30-plus countries. At Fermilab, researchers are using AI and developing state-of-the-art methods for detecting and studying nature’s most mysterious particles, including expediting experiment workflows and enhancing event reconstruction.
Quantum Machine Learning
Quantum machine learning (QML) is the use of quantum resources in machine learning problems, or the use of machine learning to control or optimize quantum resources. Most applications to date have studied QML applied to classical data, but the most promising applications involve quantum data, for example from a quantum simulation or a quantum sensing experiment.
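To make the "machine learning to optimize quantum resources" direction concrete, here is a minimal illustrative toy, not a Fermilab workflow: a one-qubit variational circuit simulated classically with NumPy, whose rotation angle is trained by gradient descent using the parameter-shift rule.

```python
import numpy as np

# Pauli-Z observable and RY rotation gate for a single qubit
Z = np.diag([1.0, -1.0])

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expval_z(theta):
    """<psi|Z|psi> for |psi> = RY(theta)|0>; analytically cos(theta)."""
    psi = ry(theta) @ np.array([1.0, 0.0])
    return psi @ Z @ psi

def grad(theta):
    # Parameter-shift rule: exact gradient from two circuit evaluations
    return 0.5 * (expval_z(theta + np.pi / 2) - expval_z(theta - np.pi / 2))

# Gradient descent drives <Z> toward its minimum of -1
theta = 0.3
for _ in range(100):
    theta -= 0.4 * grad(theta)
print(round(expval_z(theta), 3))  # → -1.0
```

The parameter-shift trick matters because on real quantum hardware gradients cannot be backpropagated through the circuit; they must come from extra circuit evaluations like the two shown here.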
High-energy physics experiments are among the biggest data-science endeavors in the world, with experiments generating datasets approaching exabytes. Fermilab is building infrastructure for cutting-edge deep learning workflows that enable physicists to perform high-performance computing at large scales. Active research includes developing tools and working with industry to integrate novel compute hardware into experimental physics software frameworks.
At the LHC, one of the bottlenecks for Monte Carlo event generation is the low unweighting efficiency for high-multiplicity events. The amount of computing resources required to reach the desired precision will soon exceed the compute budget allocated to event generation. To resolve this issue, the theory group at Fermilab has developed i-flow, a normalizing-flow neural network that improves the unweighting efficiency.
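To see why the proposal distribution matters, consider the unweighting efficiency eff = ⟨w⟩ / max(w), where w = target/proposal. The sketch below is illustrative, not i-flow itself: the peaked toy density stands in for a matrix element, and a hand-built Cauchy-style proposal stands in for what a trained flow would learn.

```python
import numpy as np

rng = np.random.default_rng(2)
g = 0.01  # width of the peak

# Sharply peaked toy "matrix element" on [0, 1] (unnormalized)
def target(x):
    return 1.0 / ((x - 0.5)**2 + g**2)

def unweighting_efficiency(samples, proposal_pdf):
    """eff = <w> / max(w) for weights w = target / proposal."""
    w = target(samples) / proposal_pdf(samples)
    return w.mean() / w.max()

# Uniform proposal: weights fluctuate wildly, efficiency is poor
u = rng.random(100_000)
eff_uniform = unweighting_efficiency(u, np.ones_like)

# Proposal adapted to the peak: a truncated Cauchy sampled by
# inverting its CDF, standing in for a trained normalizing flow
a, b = np.arctan(-0.5 / g), np.arctan(0.5 / g)
x = 0.5 + g * np.tan(a + rng.random(100_000) * (b - a))

def cauchy_pdf(x):
    return g / ((b - a) * ((x - 0.5)**2 + g**2))

eff_adapted = unweighting_efficiency(x, cauchy_pdf)
print(f"uniform: {eff_uniform:.3f}  adapted: {eff_adapted:.3f}")
# roughly: uniform ≈ 0.03, adapted ≈ 1.0
```

When the proposal matches the target, the weights become nearly constant and almost every generated event survives unweighting; learning that match automatically for high-dimensional phase spaces is the job of the normalizing flow.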