High Energy Physics
HEP researchers at SLAC are playing leading roles in ML application development for DOE flagship experiments, including the Deep Underground Neutrino Experiment (DUNE), the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC), and the ATLAS experiment at the Large Hadron Collider. Having ML leaders from all three HEP frontiers makes SLAC uniquely suited for a cross-frontier ML R&D program: HXML.
Advancements in detector technologies continue to produce large volumes of high-precision physics data at high rates. Our detectors (ATLAS, LSST, DUNE) record particle trajectories and galaxy deformations in images of millions to billions of pixels, at data rates of 100 GB/s to TB/s, approaching the scale of exascale imaging physics. Developing high-quality, fast data reconstruction and analysis techniques to maximally extract physics information is a common challenge across the HEP frontiers. We utilize modern image analysis techniques from the field of computer vision, in particular deep convolutional neural networks (CNNs), customized to improve performance in our science domains (e.g. sparse linear algebra, hierarchical modeling). Applications include data analysis (image classification, object detection, pixel segmentation, pixel/object clustering, etc.) as well as rapid, high-fidelity physics simulation using generative models.
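The sparse-linear-algebra customization mentioned above exploits the fact that detector images are mostly empty: only a small fraction of pixels record hits. A minimal sketch of the idea, assuming a toy 6x6 "detector readout" and a hypothetical `sparse_conv2d` helper (not a real library API), computes convolution outputs only at the non-zero pixels, in the spirit of submanifold sparse convolutions:

```python
import numpy as np

def sparse_conv2d(image, kernel):
    """Compute a 2D convolution only at pixels that are non-zero in the
    input, leaving all other outputs at zero. This is a toy illustration
    of why sparsity cuts compute for mostly-empty detector images."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)))
    out = np.zeros_like(image, dtype=float)
    active = np.argwhere(image != 0)          # only the "hit" pixels
    for r, c in active:
        patch = padded[r:r + kh, c:c + kw]    # receptive field around the hit
        out[r, c] = np.sum(patch * kernel)
    return out, len(active)

# A 6x6 readout with a short diagonal track of 3 hits (hypothetical data)
img = np.zeros((6, 6))
img[1, 1] = img[2, 2] = img[3, 3] = 1.0
kernel = np.ones((3, 3)) / 9.0                # simple smoothing filter
out, n_active = sparse_conv2d(img, kernel)    # work done at 3 of 36 pixels
```

Here only 3 of 36 pixels are processed; on real DUNE or ATLAS images with occupancies well below a percent, the same trick is what makes large sparse CNNs tractable.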
Model Optimization & Uncertainty
A measure of confidence and uncertainty is crucial for reporting any scientific measurement, and it requires a robust understanding of how uncertainties in the environment (detector state, assumptions in underlying physics models, etc.) affect the results. HEP researchers at SLAC use hierarchical Bayesian inference to encode physics dependencies in our ML algorithms, and deep generative models for per-object density estimation to measure and constrain the impact of uncertainty. The interface of these two techniques is an active area of research with applications across all HEP frontiers.
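The core mechanism of hierarchical Bayesian inference, propagating a population-level prior down to individual measurements, can be sketched with the simplest conjugate case. The numbers below (a hypothetical detector-channel calibration with a shared population prior) are illustrative, not from any experiment:

```python
import numpy as np

def posterior_normal(y, sigma, mu0, tau0):
    """Conjugate normal-normal update: observations y with known noise
    sigma, and a population-level prior N(mu0, tau0^2) on the true value.
    Returns the posterior mean and standard deviation."""
    n = len(y)
    prec = 1.0 / tau0**2 + n / sigma**2          # posterior precision
    mean = (mu0 / tau0**2 + np.sum(y) / sigma**2) / prec
    return mean, np.sqrt(1.0 / prec)

# Hypothetical repeated calibration measurements of one channel's gain
y = np.array([1.2, 0.8, 1.1])   # observed gains
sigma = 0.5                      # known per-measurement noise
mu0, tau0 = 0.0, 1.0             # population prior from all channels

post_mean, post_std = posterior_normal(y, sigma, mu0, tau0)
```

The posterior mean lands between the channel's own sample mean and the population prior (shrinkage), and the posterior width quantifies the remaining uncertainty; the full hierarchical versions used in practice apply the same logic with learned, non-conjugate components.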