Schools and academies
Academies
With the biennial VDSP Academy, the Vienna Doctoral School in Physics provides a platform for international and local students at the end of their Bachelor's or the beginning of their Master's studies to delve into topical advances in the physical sciences. The lectures are delivered by VDSP Faculty Members and international guest speakers.
Schools
VDSP schools connect PhD students with internationally renowned experts. We particularly encourage hands-on formats to complement the keynote lectures. The schools are typically scheduled during the winter or summer breaks.
The VDSP-ESI Winter School on Machine Learning in Physics took place in February 2020.
We invited Master and PhD students of physics, chemistry and materials science, as well as early postdocs from all over the world, to join us for keynote talks by leading experts in the field, complemented by hands-on class activities.
The Winter School provided us with a better understanding of fundamental concepts and practical applications of machine learning, as well as the skills to develop and apply these techniques to concrete research tasks.

Between 10 and 20 February 2020, over 100 participants from all over the world gathered at the Faculty of Physics of the University of Vienna for the VDSP-ESI Winter School "Machine Learning in Physics".

Master and PhD students of physics, chemistry and materials science, as well as early postdocs, attended keynote talks by leading experts discussing how machine learning is currently changing science and technology, including physics.

The Winter School provided the participants with a better understanding of fundamental concepts and practical applications of machine learning.

After an introduction to machine learning, the participants explored the use of these novel techniques in materials science, quantum science and particle physics.

The morning sessions were complemented by hands-on class activities in the afternoon.

The tutorials equipped the participants with the skills to develop and apply ML techniques to concrete research tasks.

The Winter School "Machine Learning in Physics" was organised jointly by the Vienna Doctoral School in Physics (VDSP) and the Erwin Schrödinger International Institute for Mathematics and Physics (ESI). It was co-financed by the Vienna Doctoral Program on Complex Quantum Systems (CoQuS) and the Doctoral College Particles and Interactions (DKPI).

Christoph Dellago welcomed the speakers and participants, representing the local organizing committee, Markus Arndt, Massimiliano Procura and Christiane Maria Losert-Valiente Kroon.

The winter school was opened by Marylou Gabrié (New York University, USA), who focused on how statistical physics can be leveraged for theory and applications in machine learning.

The contribution by Philipp Grohs (University of Vienna, AT) introduced the audience to the foundations of machine learning and gave a quick overview of various machine learning techniques applied in physics research.

Jörg Behler (Göttingen, DE) provided an overview of the general methodology of machine learning potentials, focusing on high-dimensional neural network potentials for atomistic simulations.
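
To make the architecture concrete, here is a minimal sketch of a high-dimensional neural network potential in the Behler-Parrinello spirit, assuming precomputed symmetry-function descriptors; the descriptor dimension, layer sizes and the random training data below are illustrative assumptions, not Behler's actual implementation.

```python
# Minimal sketch of a high-dimensional neural network potential:
# per-atom descriptors -> per-atom energies -> total energy by summation.
# Descriptor choice and layer sizes are illustrative placeholders.
import torch
import torch.nn as nn

class AtomicNetwork(nn.Module):
    """Maps a descriptor of one atom's local environment to an atomic energy."""
    def __init__(self, n_descriptors: int, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_descriptors, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, g: torch.Tensor) -> torch.Tensor:
        return self.net(g)  # shape (n_atoms, 1)

class NNPotential(nn.Module):
    """Total energy = sum of atomic energy contributions."""
    def __init__(self, n_descriptors: int):
        super().__init__()
        self.atomic_net = AtomicNetwork(n_descriptors)

    def forward(self, descriptors: torch.Tensor) -> torch.Tensor:
        # descriptors: (n_atoms, n_descriptors) symmetry functions per atom
        return self.atomic_net(descriptors).sum()

# Toy training loop on random data, standing in for reference ab-initio energies.
n_atoms, n_desc = 8, 10
model = NNPotential(n_desc)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(200):
    g = torch.randn(n_atoms, n_desc)   # placeholder descriptors
    e_ref = torch.randn(())            # placeholder reference energy
    loss = (model(g) - e_ref) ** 2
    opt.zero_grad(); loss.backward(); opt.step()
```

Because the total energy is a sum over atomic contributions, forces can later be obtained by automatic differentiation once the descriptors are written as differentiable functions of the atomic coordinates.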

The main unsupervised learning techniques and their application to atomistic modelling were presented by Michele Ceriotti (EPFL Lausanne, CH), who also discussed how supervised and unsupervised machine-learning models can be used in tandem to achieve more effective and insightful analyses.
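
As a hedged illustration of using unsupervised and supervised models in tandem (not the workflow shown in the lecture), the sketch below reduces placeholder structure descriptors with PCA, clusters them with k-means, and fits a ridge regression in the reduced space; all dimensions and the random data are assumptions for demonstration.

```python
# Sketch: unsupervised dimensionality reduction + clustering of atomistic
# descriptors, followed by supervised regression in the reduced space.
# Data is random; in practice descriptors would come from e.g. SOAP features.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 60))   # placeholder structure descriptors
y = rng.normal(size=500)         # placeholder target property (e.g. energy)

# Unsupervised part: low-dimensional map + structural clustering.
pca = PCA(n_components=5).fit(X)
X_low = pca.transform(X)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_low)

# Supervised part: regression on the reduced representation,
# inspected cluster by cluster for a more insightful analysis.
model = Ridge(alpha=1.0).fit(X_low, y)
for k in range(4):
    mask = labels == k
    print(f"cluster {k}: {mask.sum()} structures, "
          f"R^2 = {model.score(X_low[mask], y[mask]):.2f}")
```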

Based on the talks by Hans Briegel (University of Innsbruck, AT), the participants built their own physics-inspired reinforcement learning (RL) environment in the tutorial, featuring simulations of a few ions manipulated by laser pulses.
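
A minimal, gym-style sketch of such a physics-inspired environment is given below: a toy two-level "ion" is rotated on the Bloch sphere by discrete laser pulses, and the reward is the overlap with a target state. The dynamics, reward and class names are placeholder assumptions, not the tutorial's actual code.

```python
# Minimal sketch of a physics-inspired RL environment: an agent applies
# discrete "laser pulses" that rotate a toy two-level ion toward a target
# state. Dynamics and reward are placeholders for illustration only.
import numpy as np

class IonPulseEnv:
    def __init__(self, pulse_angle=0.1, max_steps=50):
        self.pulse_angle = pulse_angle   # rotation per pulse (rad)
        self.max_steps = max_steps

    def reset(self):
        self.theta = 0.0                 # start in |0> (polar angle 0)
        self.steps = 0
        return np.array([self.theta])

    def step(self, action):
        # action 0: pulse "down", action 1: pulse "up"
        self.theta += self.pulse_angle if action == 1 else -self.pulse_angle
        self.theta = float(np.clip(self.theta, 0.0, np.pi))
        self.steps += 1
        # reward: population of the target state |1>, i.e. sin^2(theta/2)
        reward = np.sin(self.theta / 2.0) ** 2
        done = self.steps >= self.max_steps or reward > 0.99
        return np.array([self.theta]), reward, done, {}

# A random agent, standing in for the RL agents built in the tutorial.
env = IonPulseEnv()
obs, done, total = env.reset(), False, 0.0
while not done:
    obs, r, done, _ = env.step(np.random.randint(2))
    total += r
print("episode return:", round(total, 2))
```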

Florian Marquardt (MPI Erlangen, DE) examined the application of neural-network-based reinforcement learning to quantum physics, where a computer develops useful sequences of quantum operations from scratch; he showed, for example, how a network-based "agent" can discover complete quantum-error-correction strategies that protect a collection of qubits against noise.
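
As a far simpler stand-in for such deep-RL agents, the following sketch uses tabular Q-learning to learn the decoding of a 3-qubit repetition code, mapping each measured syndrome to the appropriate bit flip; every detail here is a toy assumption for illustration, not the method presented in the talk.

```python
# Toy sketch of reinforcement learning for error correction: a tabular
# Q-learning agent learns which correction to apply for each syndrome of a
# 3-qubit repetition code under single bit-flip errors.
import numpy as np

rng = np.random.default_rng(0)
n_syndromes, n_actions = 4, 4    # syndrome (s0,s1); actions: flip qubit 0/1/2 or none
Q = np.zeros((n_syndromes, n_actions))
alpha, eps = 0.1, 0.1

def episode():
    logical = rng.integers(2)
    qubits = np.array([logical] * 3)
    err = rng.integers(4)        # which qubit is flipped (3 = no error)
    if err < 3:
        qubits[err] ^= 1
    syndrome = (qubits[0] ^ qubits[1]) * 2 + (qubits[1] ^ qubits[2])
    return qubits, syndrome, logical

for _ in range(5000):
    qubits, s, logical = episode()
    a = rng.integers(n_actions) if rng.random() < eps else int(Q[s].argmax())
    if a < 3:
        qubits[a] ^= 1           # apply the chosen correction
    reward = 1.0 if np.all(qubits == logical) else -1.0
    Q[s, a] += alpha * (reward - Q[s, a])   # one-step (bandit-style) update

print("learned correction per syndrome:", Q.argmax(axis=1))
```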

Among the vast number of applications of machine learning in high-energy physics, Wolfgang Waltenberger (HEPHY, AT) discussed self-supervised learning techniques for the simulation of data and anomaly detection algorithms, the exploration of more exotic hardware, and reinforcement learning for detector control.

Gregor Kasieczka (Hamburg University, DE) highlighted that generative models offer a promising way to replace or speed up programs for synthetic data generation, and that anomaly, outlier or overdensity detection is a viable alternative to dedicated hypothesis testing.
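
One common incarnation of such data-driven anomaly detection is an autoencoder trained on background-like events, with the reconstruction error used as an anomaly score; the sketch below illustrates this on synthetic data and is an assumption-laden toy, not an analysis presented at the school.

```python
# Sketch of anomaly detection with an autoencoder: train only on "background"
# events and flag events with large reconstruction error. Data is synthetic.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_features = 16
background = torch.randn(2000, n_features)         # placeholder background events
signal = torch.randn(100, n_features) * 2.0 + 3.0   # placeholder anomalous events

model = nn.Sequential(
    nn.Linear(n_features, 8), nn.ReLU(),
    nn.Linear(8, 3), nn.ReLU(),                      # low-dimensional bottleneck
    nn.Linear(3, 8), nn.ReLU(),
    nn.Linear(8, n_features),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(200):                             # train on background only
    recon = model(background)
    loss = ((recon - background) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

with torch.no_grad():                                # reconstruction error = score
    score_bg = ((model(background) - background) ** 2).mean(dim=1)
    score_sig = ((model(signal) - signal) ** 2).mean(dim=1)
print("mean anomaly score, background:", float(score_bg.mean()))
print("mean anomaly score, signal:   ", float(score_sig.mean()))
```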

The public event “Case Study Day” welcomed everyone interested in the topic of machine learning in the natural sciences.

The plenary talks on “Learning Science from Data” saw four distinguished speakers from academia and the private sector provide their perspective on applying machine learning methods in science and technology beyond physics.

In the plenary “Democratizing Data Science - the human in the loop”, Torsten Möller (University of Vienna, AT) discussed the field of Visual Data Science. Creating visual tools for very specific applications can make modeling accessible to a broad audience, especially to people who have data and want to answer questions based on it but are not experts in statistics or computer science, thereby furthering the democratization process.

Peter Klimek (MedUni Vienna & CSH Vienna, AT) discussed how, in a nation-wide medical claims dataset, individual health care trajectories can be modeled as a diffusion process on temporal networks and how the health trajectories of patients can be characterized using a simple machine learning method.
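
A toy analogue of such a diffusion process on a temporal network is sketched below: a probability distribution over discrete states is propagated through a sequence of time-dependent transition matrices. The networks and states here are random placeholders, not medical claims data or the model discussed in the talk.

```python
# Sketch of diffusion on a temporal network: a distribution over "states"
# (e.g. diagnoses) spreads along a sequence of time-dependent transition graphs.
import numpy as np

rng = np.random.default_rng(1)
n_states, n_timesteps = 6, 4

# One row-stochastic transition matrix per time step (temporal snapshots).
snapshots = []
for _ in range(n_timesteps):
    A = rng.random((n_states, n_states))
    snapshots.append(A / A.sum(axis=1, keepdims=True))

p = np.zeros(n_states)
p[0] = 1.0                        # all trajectories start in state 0
for T in snapshots:
    p = p @ T                     # one diffusion step on the current snapshot
print("occupation probabilities after diffusion:", np.round(p, 3))
```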

Peter Wirnsberger (DeepMind, UK) briefly reviewed AlphaZero, a general reinforcement learning algorithm that masters chess, shogi and Go through self-play, and AlphaFold, which improves protein structure prediction using potentials from deep learning. He then presented recent progress on treating targeted free energy perturbation as a machine learning problem, in which the mapping in configuration space is parameterized as a neural network optimized to increase the overlap of the underlying distributions.
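
To illustrate the idea in the simplest possible setting, the sketch below treats targeted free energy perturbation in one dimension with an affine map (instead of a neural network) trained to increase the overlap between two Gaussian states; the potentials, parameters and estimator details are illustrative assumptions, not the setup presented in the talk.

```python
# Minimal 1D sketch of targeted free energy perturbation with a learned map:
# samples from state A are pushed through an invertible (affine) map optimized
# to overlap with state B; the free energy difference is then estimated from
# the generalized work. Potentials and parameters are illustrative.
import torch

torch.manual_seed(0)
beta = 1.0
U_A = lambda x: 0.5 * x**2                       # harmonic well, state A
U_B = lambda x: 0.5 * ((x - 2.0) / 0.5) ** 2     # shifted, narrower well, state B

# Invertible map M(x) = exp(log_a) * x + b, so log|det J| = log_a.
log_a = torch.zeros((), requires_grad=True)
b = torch.zeros((), requires_grad=True)
opt = torch.optim.Adam([log_a, b], lr=0.05)

for step in range(500):
    x = torch.randn(1024)                        # samples from A (standard normal)
    y = torch.exp(log_a) * x + b                 # mapped samples
    work = beta * (U_B(y) - U_A(x)) - log_a      # generalized work per sample
    loss = work.mean()                           # Jensen: <W> >= beta * dF
    opt.zero_grad(); loss.backward(); opt.step()

with torch.no_grad():
    x = torch.randn(100000)
    y = torch.exp(log_a) * x + b
    work = beta * (U_B(y) - U_A(x)) - log_a
    dF = -torch.logsumexp(-work, dim=0) + torch.log(torch.tensor(float(x.numel())))
    print("estimated beta*dF:", float(dF))       # exact value here: ln 2 ~ 0.693
```

With a well-trained map the generalized work becomes nearly constant, so the exponential-average estimator converges with far fewer samples than a direct perturbation between the two states would require.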

Representatives of artificial intelligence start-ups, namely Patrick Blies, AI Research Engineer at EnliteAI; Jason Hölscher-Obermaier, VP R&D at ONDEWO; Valentin Stauber, Senior Research Engineer at Iris.ai; and Paul Tiwald, Senior Data Scientist at Mostly AI, gave insights into the mission of their companies in flash talks and in personal conversations at the AI Start-Up World Café.

Taking inspiration from the talk “Is model transparency good or bad for scientific applications of deep learning?” by Cameron Buckner (University of Houston, US), all plenary speakers took questions from the audience in a joint panel discussion.