The Linac Coherent Light Source (LCLS) fires powerful x-ray laser pulses onto a variety of targets: atoms, molecules, biological structures, and more.  Before target destruction, scattered x-ray photons and emitted photons, electrons, and ions reveal material structure on scales from microns to Angstroms, with dynamics resolved in time from microseconds to attoseconds.  Experiments divide into two x-ray photon energy regimes, soft (250 eV to 2 keV) and hard (6 keV to 21 keV), and the various experimental methods use specialized detectors that read out tremendous amounts of data for each of the 120 x-ray shots fired per second.  These detectors are traditionally charged-particle detectors (soft x-ray spectroscopies) and photon detectors (both hard and soft x-ray scattering and photon spectroscopy).  Such detectors currently produce on the order of terabytes of raw data per hour.

The upcoming LCLS-II(-HE) will increase the shot rate from 120 pulses per second to 100 thousand, and eventually up to 1 million, pulses per second.  At these rates, the transfer of raw data, even to local storage farms, is unreasonable if not impossible.  Scientists are therefore beginning to reconsider what they view as "raw" when it comes to data written to disk.  In the past, gain calibration, baseline subtraction, zero suppression, and even waveform shaping for charged-particle detectors were routinely considered standard pre-processing whose output was accepted as a sufficiently "raw" form of data.  We submit that no scientist has ever actually stored the number of electrons produced in each silicon well of an image pixel; there is always pre-processing, if only analog shaping before digitization.  Extending this notion of pre-processing to include a more sophisticated analysis chain, implemented as a pipelined data flow, is now a growing effort at LCLS.
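The classic pre-processing chain named above can be sketched in a few lines. This is a minimal NumPy illustration, not an LCLS implementation; the function name, threshold, and toy image are hypothetical:

```python
import numpy as np

def preprocess(raw, pedestal, gain, threshold=3.0, noise_sigma=1.0):
    """Classic detector pre-processing: baseline (pedestal) subtraction,
    per-pixel gain calibration, and zero suppression of sub-threshold pixels."""
    corrected = (raw - pedestal) * gain                    # baseline + gain calibration
    corrected[corrected < threshold * noise_sigma] = 0.0   # zero suppression
    return corrected

# Toy 4x4 "detector image" with a flat pedestal of 10 ADU and unit gain.
raw = np.array([[10.0, 10.2,  9.9, 10.1],
                [10.0, 25.0, 10.1, 10.0],
                [ 9.8, 10.0, 10.2, 40.0],
                [10.1, 10.0,  9.9, 10.0]])
pedestal = np.full((4, 4), 10.0)
gain = np.ones((4, 4))

img = preprocess(raw, pedestal, gain)
print(np.count_nonzero(img))  # → 2  (only the two real signal pixels survive)
```

Even this trivial chain discards most of the pixel stream, which is exactly why its output has historically been accepted as "raw" data.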

The development of Edge Machine Learning for use in the LCLS-II Data Reduction Pipeline will focus on deploying trained inference networks to FPGAs near, or ideally directly in, the detector electronics.  These inference networks will partially analyze the continuous stream of data, generating veto and categorization metadata that serve as on-the-fly analysis control switches for later-stage analysis before the data are sent to the local storage farm.
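The veto/categorization flow above can be sketched as a simple streaming gate. Here a plain intensity test stands in for the trained inference network; the function names, thresholds, and category labels are illustrative assumptions, not the deployed FPGA logic:

```python
import numpy as np

def classify_shot(frame, signal_threshold=100.0):
    """Stand-in for an edge inference network: return (veto, category)
    metadata for one detector frame based on its integrated intensity."""
    total = frame.sum()
    if total < signal_threshold:
        return True, "empty"                # veto: nothing hit the detector
    category = "bright" if total > 10 * signal_threshold else "normal"
    return False, category

def edge_pipeline(frames):
    """Stream frames, forwarding only non-vetoed ones, each tagged with
    its category for later-stage analysis."""
    for frame in frames:
        veto, category = classify_shot(frame)
        if not veto:
            yield frame, category           # only these reach the storage farm

frames = [np.zeros((8, 8)), np.full((8, 8), 2.0), np.full((8, 8), 20.0)]
kept = list(edge_pipeline(frames))
print([cat for _, cat in kept])  # → ['normal', 'bright']
```

The point of the design is that the metadata, not the raw frame, is what travels first: downstream stages switch their behavior on the category tag before any bulk data movement.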

TimeTool example image courtesy Dr. Ryan Coffee
Cartoon depiction of the data pipeline for the 2d-TimeTool project courtesy of Dr. Ryan Coffee
Cartoon of CookieBox project with human/AI collaboration control courtesy of Dr. Ryan Coffee
Scheme of EdgeML paradigm for user science courtesy of Dr. Ryan Coffee

We are using two internally driven R&D projects as exemplars of the EdgeML paradigm: the 2d-TimeTool project, which targets a spectrogram-based x-ray/optical relative delay measurement with sub-femtosecond precision, and the CookieBox angle-resolved electron detector project, which targets the attosecond-scale recovery of x-ray pulse shapes as well as angle-resolved photo-electron and Auger electron spectroscopy.  Both use-case examples target data ingestion rates in the 50 GB/s to 1 TB/s range.




CSPAD/protein image courtesy of Dr. Chun Yoon
Depiction of hard x-ray scattering experiment with diffraction peak identification based on CSPAD 16MPix detector.




PeakNet image courtesy of Dr. Chun Yoon
Hard x-ray diffraction peak identification and classification. Courtesy of Dr. Chun Yoon.

In the hard x-ray regime, detectors of many tens of megapixels report the scattered x-ray photons.  Here zero suppression is a must, but if diffraction peaks can be more intelligently identified and classified, the subsequent analysis can handle much higher throughput and even use accumulated statistics to decide whether image buffers should be aggregated or stored as individual anomalies.
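The aggregate-versus-store decision can be illustrated with a deliberately naive peak finder standing in for a trained network such as PeakNet. The thresholds, cutoffs, and routing labels below are hypothetical choices for the sketch, not the actual pipeline parameters:

```python
import numpy as np

def find_peaks(image, threshold=5.0):
    """Naive diffraction-peak finder: a pixel counts as a peak if it
    exceeds `threshold` and is the maximum of its 3x3 neighborhood."""
    peaks = []
    for i in range(1, image.shape[0] - 1):
        for j in range(1, image.shape[1] - 1):
            patch = image[i-1:i+2, j-1:j+2]
            if image[i, j] > threshold and image[i, j] == patch.max():
                peaks.append((i, j))
    return peaks

def route_frame(image, anomaly_cutoff=3):
    """Route a frame from its peak count alone: frames with many peaks
    are stored individually; sparse frames join an aggregate buffer."""
    n_peaks = len(find_peaks(image))
    return "store-individually" if n_peaks >= anomaly_cutoff else "aggregate"

img = np.zeros((16, 16))
img[3, 4] = img[8, 8] = img[12, 2] = 50.0   # three bright Bragg-like spots
print(route_frame(img))  # → store-individually
```

A real classifier would of course use peak shape and position, not just a count, but the control structure is the same: a cheap per-frame statistic drives the storage decision before any pixels leave the edge.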