## Selected papers

You can find a more or less complete list of my papers on Google Scholar.
Here I want to highlight a few favorites:

### Geometry

**EDGI: Equivariant Diffusion for Planning with Embodied Agents**.

Johann Brehmer, Joey Bose, Pim de Haan, and Taco Cohen.

ICLR workshop on Reincarnating Reinforcement Learning (2023).
[arXiv]

**Flows for simultaneous manifold learning and density estimation**.

Johann Brehmer and Kyle Cranmer.

NeurIPS 2020.
[arXiv]
[conference]
[bibtex]
[code]

### Causality

**Weakly supervised causal representation learning**.

Johann Brehmer, Pim de Haan, Phillip Lippe, and Taco Cohen.

NeurIPS 2022.
[arXiv]
[bibtex]

**Deconfounded imitation learning**.

Risto Vuorio, Johann Brehmer, Hanno Ackermann, Daniel Dijkman, Taco Cohen, and Pim de Haan.

NeurIPS workshop on Deep Reinforcement Learning (2022).
[arXiv]

### Simulation-based inference

**The frontier of simulation-based inference**.

Kyle Cranmer, Johann Brehmer, and Gilles Louppe.

Proceedings of the National Academy of Sciences 117 (2020).
[arXiv]
[journal]
[bibtex]

**Simulation-based inference methods for particle physics**.

Johann Brehmer and Kyle Cranmer.

Book chapter in Artificial Intelligence for High-Energy Physics (World Scientific, 2022).
[arXiv]
[book]
[bibtex]

**Mining for Dark Matter Substructure: Inferring subhalo population properties from strong lenses with machine learning**.

Johann Brehmer, Siddharth Mishra-Sharma, Joeri Hermans, Gilles Louppe, and Kyle Cranmer.

The Astrophysical Journal 886 (2019).
[arXiv]
[journal]
[bibtex]
[code]

**MadMiner: Machine learning-based inference for particle physics**.

Johann Brehmer, Felix Kling, Irina Espejo, and Kyle Cranmer.

Computing and Software for Big Science 4 (2020).
[arXiv]
[journal]
[bibtex]
[code]

**Likelihood-free inference with an improved cross-entropy estimator**.

Markus Stoye, Johann Brehmer, Gilles Louppe, Juan Pavez, and Kyle Cranmer.

NeurIPS workshop on Machine Learning and the Physical Sciences (2019).
[arXiv]
[workshop]
[bibtex]

**Mining gold from implicit models to improve likelihood-free inference**.

Johann Brehmer, Gilles Louppe, Juan Pavez, and Kyle Cranmer.

Proceedings of the National Academy of Sciences 117 (2020).
[arXiv]
[journal]
[bibtex]

**Constraining effective field theories with machine learning**.

Johann Brehmer, Kyle Cranmer, Gilles Louppe, and Juan Pavez.

Physical Review Letters 121 (2018).
[arXiv]
[journal]
[bibtex]
[code]

**A guide to constraining effective field theories with machine learning**.

Johann Brehmer, Kyle Cranmer, Gilles Louppe, and Juan Pavez.

Physical Review D 98 (2018).
[arXiv]
[journal]
[bibtex]
[code]

### More deep learning and statistics for physics

**Hierarchical clustering in particle physics through reinforcement learning**.

Johann Brehmer, Sebastian Macaluso, Duccio Pappadopulo, and Kyle Cranmer.

NeurIPS workshop on Machine Learning and the Physical Sciences (2020).
[arXiv]
[workshop]
[bibtex]
[code]

**Better Higgs Measurements Through Information Geometry**.

Johann Brehmer, Kyle Cranmer, Felix Kling, and Tilman Plehn.

Physical Review D 95 (2017).
[arXiv]
[journal]
[bibtex]

### Other flavours of deep learning

**Instance-adaptive video compression: Improving neural codecs by training on the test set**.

Ties van Rozendaal, Johann Brehmer, Yunfan Zhang, Reza Pourreza, and Taco Cohen.

Under review (2022).
[arXiv]
[bibtex]

### Particle physics theory

**Pushing Higgs Effective Theory to its Limits**.

Johann Brehmer, Ayres Freitas, David Lopez-Val, and Tilman Plehn.

Physical Review D 93 (2016).
[arXiv]
[journal]
[bibtex]