Research

I am interested in methods that let us infer knowledge from data. A focus of my research is simulation-based (or likelihood-free) inference: statistical methods for the case in which we can model a phenomenon with a computer simulation, but we cannot evaluate its likelihood function. This may sound like an obscure special case, but it is actually extremely common in science, with examples ranging from neuroscience to epidemiology and from elementary particle physics to cosmology! New algorithms based on machine learning, active learning, and the tight integration of simulation and inference are rapidly changing what we can do in this field; read our opinionated review. We also introduced new inference methods that combine some understanding of the latent processes in the simulator, a dash of classical statistics, and a heavy dose of machine learning.
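
To give a flavor of how these techniques work, here is a minimal, self-contained sketch of the classifier-based "likelihood-ratio trick" that underlies many of them. The Gaussian toy simulator and all names in it are purely illustrative, not code from any of our papers:

    import numpy as np
    from scipy.stats import norm
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    def simulate(theta, n):
        # Toy "simulator": we can sample from it, but we pretend its
        # density is intractable, as for a real scientific simulator.
        return rng.normal(loc=theta, scale=1.0, size=(n, 1))

    # Simulate data at two parameter points, labeling each sample by origin
    theta0, theta1, n = 0.0, 1.0, 50_000
    x = np.vstack([simulate(theta0, n), simulate(theta1, n)])
    y = np.concatenate([np.zeros(n), np.ones(n)])

    clf = LogisticRegression().fit(x, y)

    # For an ideal classifier, s(x) = p(theta1 | x), so the likelihood ratio
    # is r(x) = p(x | theta1) / p(x | theta0) = s(x) / (1 - s(x)).
    s = clf.predict_proba([[0.5]])[0, 1]
    r_estimated = s / (1.0 - s)

    # The toy is simple enough that the true ratio is known, so we can check:
    r_true = norm.pdf(0.5, loc=theta1) / norm.pdf(0.5, loc=theta0)
    print(r_estimated, r_true)  # both close to 1.0

The same trick scales to high-dimensional data with neural network classifiers, and the "mining gold" idea in the publications below shows how additional latent information from the simulator can make the training far more sample-efficient.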

Applied to problems in particle physics, these methods allow us to measure fundamental physics properties more precisely than before. I am the lead developer of the MadMiner library, which automates these algorithms and makes it straightforward to apply them to almost any measurement problem at the LHC experiments. I also worked on forecasting methods based on information geometry for particle physics.
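
As a rough illustration of the information-geometry idea behind such forecasts (this is the textbook Fisher-information argument, not a summary of any specific construction in the papers): the Fisher information measures how strongly the data distribution responds to the parameters, and the Cramér-Rao bound turns it into the best precision any unbiased measurement can achieve,

    I_{ij}(\theta) = \mathbb{E}_{x \sim p(x \mid \theta)}\!\left[
        \frac{\partial \log p(x \mid \theta)}{\partial \theta_i}
        \frac{\partial \log p(x \mid \theta)}{\partial \theta_j}
    \right],
    \qquad
    \mathrm{cov}(\hat{\theta}) \succeq I(\theta)^{-1}.

Evaluating this quantity for a planned analysis tells us how precisely it could constrain the parameters of interest before any data are taken.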

But these methods are not limited to particle physics. We used the same techniques to learn about dark matter through gravitational lensing, and I am excited to find out how they can help us tackle problems in many other scientific domains.

Beyond scientific use cases, I am generally interested in probabilistic and generative models such as normalizing flows, (approximate) inference, and the quantification of uncertainty. I have led the development of manifold-learning flows, a new class of generative models that simultaneously learn the data manifold and a tractable probability density on that manifold.
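
For context, a standard normalizing flow gets its tractable density from the change-of-variables formula, which forces the invertible map f to preserve the data dimension; in a manifold-learning flow, f is instead an injective map from a lower-dimensional latent space, and the Jacobian term generalizes accordingly (these are textbook forms, lightly adapted, rather than equations quoted from the papers):

    \log p_x(x) = \log p_z\!\left(f^{-1}(x)\right)
        + \log \left| \det J_{f^{-1}}(x) \right| ,
    \qquad
    p_x(x) = p_z(z) \, \det\!\left( J_f(z)^\top J_f(z) \right)^{-1/2}
        \quad \text{for injective } f,\; x = f(z).

The first expression requires latent and data spaces of equal dimension; the second defines a density on the learned manifold itself.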

Publications

For an almost complete list of my publications, see Google Scholar, NASA/ADS, or INSPIRE.

I would like to highlight the following papers:

  • Johann Brehmer and Kyle Cranmer:
    Flows for simultaneous manifold learning and density estimation.
    [arXiv] [bibtex] [code]
  • Johann Brehmer and Kyle Cranmer:
    NOTAGAN: Flows for the data manifold.
    ICML workshop on Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models (2020).
  • Johann Brehmer, Kyle Cranmer, Siddharth Mishra-Sharma, Felix Kling, and Gilles Louppe:
    Mining gold: Improving simulation-based inference with latent information.
    NeurIPS workshop on Machine Learning and the Physical Sciences (2019). [workshop]
  • Kyle Cranmer, Johann Brehmer, and Gilles Louppe:
    The frontier of simulation-based inference.
    Proceedings of the National Academy of Sciences 117 (2020). [arXiv] [journal] [bibtex]
  • Johann Brehmer, Siddharth Mishra-Sharma, Joeri Hermans, Gilles Louppe, and Kyle Cranmer:
    Mining for Dark Matter Substructure: Inferring subhalo population properties from strong lenses with machine learning.
    The Astrophysical Journal 886 (2019). [arXiv] [journal] [bibtex] [code]
  • Johann Brehmer, Felix Kling, Irina Espejo, and Kyle Cranmer:
    MadMiner: Machine learning-based inference for particle physics.
    Computing and Software for Big Science 4 (2020). [arXiv] [journal] [bibtex] [code]
  • Markus Stoye, Johann Brehmer, Gilles Louppe, Juan Pavez, and Kyle Cranmer:
    Likelihood-free inference with an improved cross-entropy estimator.
    NeurIPS workshop on Machine Learning and the Physical Sciences (2019). [arXiv] [workshop] [bibtex]
  • Johann Brehmer, Gilles Louppe, Juan Pavez, and Kyle Cranmer:
    Mining gold from implicit models to improve likelihood-free inference.
    Proceedings of the National Academy of Sciences 117 (2020). [arXiv] [journal] [bibtex]
  • Johann Brehmer, Kyle Cranmer, Gilles Louppe, and Juan Pavez:
    Constraining Effective Field Theories with Machine Learning.
    Physical Review Letters 121 (2018). [arXiv] [journal] [bibtex] [code]
  • Isaac Henrion, Johann Brehmer, Joan Bruna, Kyunghyun Cho, Kyle Cranmer, Gilles Louppe, and Gaspard Rochette:
    Neural Message Passing for Jet Physics.
    NeurIPS workshop on Deep Learning for the Physical Sciences (2017). [workshop]
  • Johann Brehmer, Kyle Cranmer, Felix Kling, and Tilman Plehn:
    Better Higgs Measurements Through Information Geometry.
    Physical Review D 95 (2017). [arXiv] [journal] [bibtex]
  • Johann Brehmer, Ayres Freitas, David Lopez-Val, and Tilman Plehn:
    Pushing Higgs Effective Theory to its Limits.
    Physical Review D 93 (2016). [arXiv] [journal] [bibtex]

Code

Together with Felix Kling, Irina Espejo, and Kyle Cranmer, I have developed the MadMiner library, which automates machine learning–powered simulation-based inference techniques for particle physics experiments.

You may also be interested in my GitHub repository, which contains the code used for most of my papers, for instance the code for manifold-learning flows.

Talks

My GitHub repository also contains PDF versions of slides from most of my talks.

Contact

Write me at johann.brehmer@nyu.edu. In non-pandemic times, you can usually find me either in room 614 at the Center for Data Science, or in room 840 in the Department of Physics.

What else?

I like to travel and/or take pictures.