AI Seminar: How can we use advanced decision-making Deep Learning for optimization of complex neutron experiments?
Heloisa N. Bordallo, Associate Professor in X-rays and Neutron Science at the Niels Bohr Institute, University of Copenhagen.
Science is rapidly pushing the frontier of materials research and technology towards controlling atomic and electronic interactions at the scale of tenths of a nanometre. Here quantum effects dominate, from charge transport in quantum materials to the interactions of molecules in complex biological systems. To image these complex interactions, we have constructed multi-billion large-scale research infrastructures such as the European X-ray Free-Electron Laser (XFEL) in Hamburg and the next-generation neutron source, the European Spallation Source (ESS), in Lund. These facilities operate in a regime where the paradigm of making measurements has shifted from scanning specific regions of k-space to imaging vast volumes of it, providing unprecedented access to the new science governing function in materials as well as to the physical laws governing the behavior of complex biological systems.
Traditionally, a huge number of human-based decisions is required in the collection, reduction, and analysis of scattering data, and over the years the scientific community has continuously contributed to the development of computational tools to model and analyse these data. However, with the paradigm shift resulting from the development of these new large-scale facilities, data volumes have increased and are reaching the limits of established data handling and processing. Despite these advances, serious bottlenecks remain: the extensive computational processing required for the acquired data is not yet fully available, and humans are limited in their ability to make timely and appropriate critical experimental decisions within a high-speed, high-throughput, information-rich environment.
In this talk I will put forward ideas and questions on how Deep Learning (DL) algorithms can be designed and trained for decision making in complex scientific experiments and analysis, in order to overcome the limitations of current experimental protocols and data handling at these large-scale facilities. To make this advance, we need to develop next-generation decision-making algorithms based on DL that can recognize the signatures of various physical phenomena in the data sufficiently early to modulate experimental parameters and optimize high-quality data collection. A key outcome of such an initiative will be advances in decision-making DL algorithms, as well as powerful tools to overcome data bottlenecks.
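The core idea, a closed loop in which incoming data steers the next acquisition step, can be sketched in miniature. The toy example below is purely illustrative: the detector model, the numbers, and the simple threshold rule are all assumptions, with the threshold standing in for a trained DL model that would recognize physical signatures in real scattering data. A cheap pilot measurement is taken at each scan point, and additional counting time is allocated only where a signal is detected.

```python
import random

def measure(x, counts):
    """Toy stand-in for a detector readout (assumption, not a real
    instrument): a Gaussian peak centred at x = 0 plus random noise,
    scaled by the counting time."""
    signal = 100.0 * counts * 2.718281828 ** (-x * x / 0.5)
    noise = random.gauss(0.0, (counts * 10.0) ** 0.5)
    return max(signal + noise, 0.0)

def adaptive_scan(points, base_counts=1, max_counts=8, threshold=50.0):
    """Closed-loop acquisition sketch: a cheap pilot measurement at each
    point is scored by a decision rule (here a fixed threshold; in the
    envisioned scheme, a DL model), and extra beam time is spent only
    where the rule flags an interesting region."""
    results = {}
    for x in points:
        pilot = measure(x, base_counts)
        if pilot > threshold:
            # Flagged as signal: re-measure with longer counting time.
            results[x] = measure(x, max_counts)
        else:
            # Flagged as background: keep the cheap pilot measurement.
            results[x] = pilot
    return results

random.seed(0)  # deterministic noise for the illustration
scan = adaptive_scan([-2, -1, 0, 1, 2])
```

The design choice being illustrated is that the decision happens during the experiment, not after it: beam time, the scarce resource at a facility like ESS, is reallocated on the fly based on what the data already show.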
To conclude, this DL approach to conducting experiments and to reducing and analysing data would represent a new paradigm shift in how we perform experiments at large-scale facilities, accelerating both data collection and analysis, as well as the subsequent publication of results. This would vastly increase the scientific value harnessed from these large-scale facilities, in particular ESS. In terms of Computer Science, this constitutes a highly demanding application scenario for DL, which is expected to stretch the state of the art towards high-impact solutions that are interpretable at the decision level (output) and at the functional level (intermediate values and hidden biases). This is non-trivial, and collaborations are essential to achieve this goal.
The seminar is free and open to everyone.
This seminar is part of the AI Seminar Series organised by the SCIENCE AI Centre. The series highlights advances and challenges in research within Machine Learning, Data Science, and AI. Like the AI Centre itself, the seminar series has a broad scope, covering new methodological contributions, ground-breaking applications, and impacts on society.