A lot of my research is applied to data collected by the Large Hadron Collider (LHC), the world’s largest particle physics experiment and a technological wonder of our time. The LHC operates at the highest energy frontier and pushes the boundaries of our knowledge of the Universe by studying matter under the extreme conditions that existed a few moments after the Big Bang. By recreating these conditions in the laboratory, we can study the nature of particles and forces and search for possible new physics.

The Large Hadron Collider, hosted by the CERN laboratory near Geneva, Switzerland, collides protons and heavy ions at an energy of 13 TeV (tera-electron-volts) at a rate of about a billion collisions per second. Only a tiny fraction of these collisions can reveal new physics, so powerful algorithms are needed to extract very rare but interesting physics gems from the myriad of collision debris.

My research focuses on building intelligent algorithms, based on machine learning and data science, that improve the detection capabilities of our measurement devices (particle detectors) in the search for new physics, such as dark matter.

As the founder and convener of the Inter-experimental LHC Machine Learning (IML) Working Group, I organize many machine learning activities in particle physics.

I additionally co-convene the CMS Machine Learning Forum and coordinate the machine learning software development effort of CERN’s Software for Experiments (EP-SFT) group.

For more details about my software development please consult the software section.


I am also one of the experimental contacts of the LHC Physics Centre Machine Learning Working Group.

Mentorship and Training

As one of the principal nodes of the INSIGHTS Marie-Curie Innovative Training Network, CERN provides research mentorship to early career scientists in the area of statistics and machine learning in particle physics. One early-stage researcher is based full-time at CERN (2018–2021) and works under my supervision. Six others will spend three to six months on temporary visits, implementing new statistical and machine learning methods in tools useful to the high-energy physics community.


I am a frequent lecturer in the area of machine learning both in and outside of particle physics. For more details, please consult my teaching page.


Consulting and Knowledge Transfer

I periodically consult for the CERN Knowledge Transfer group in the area of machine learning and data science.

I have consulted for various companies, from large enterprises to startups, on applying modern deep learning techniques to a range of problems. For example, I have advised the global pharmaceutical company Sanofi Pasteur on advanced machine learning methods for optimizing vaccine production, as well as several startups on applying the latest machine learning advances in areas such as biotechnology and medicine. If you are interested in consulting in these areas, please get in touch by email.


Public Engagement

I frequently participate in other outreach activities, in particular those of the International Particle Physics Outreach Group (IPPOG).

As part of Project CODER, I also co-organize the Open Data working group of CERN’s High School Teacher Program.



For more information on my outreach activities and interests, please see my outreach page.