
T: Particle Physics Division (Fachverband Teilchenphysik)

T 7: Data, AI, Computing, Electronics I

T 7.4: Talk

Monday, 16 March 2026, 17:00–17:15, KH 00.024

Self-Supervised Pretraining of HPGe Waveforms for Pulse-Shape Discrimination for LEGEND — •Niko Lay¹, Christoph Vogl¹, Tommaso Comellato¹, Konstantin Gusev¹, Brennan Hackett², Baran Hashemi¹, Lukas Heinrich¹, Patrick Krause¹, Andreas Leonhardt¹, Béla Majorovits², Moritz Neuberger¹, Nadezda Rumyantseva¹, Mario Schwarz¹, Michael Willers¹, and Stefan Schönert¹ — ¹Technical University of Munich, Garching, Germany — ²Max Planck Institute for Physics, Garching, Germany

LEGEND searches for neutrinoless double-beta (0νββ) decay using HPGe detectors enriched in ⁷⁶Ge and operated in instrumented liquid argon (LAr). For LEGEND-1000, underground-sourced argon, depleted in ⁴²Ar, is the baseline choice. Should such argon be unavailable, a well-established and experimentally validated mitigation strategy must be in place; one is currently under preparation. Using HPGe detectors operated in ⁴²Ar-enriched LAr at the SCARF test facility at TUM, we compare three self-supervised pretraining objectives: a transformer-based autoencoder, an autoregressive objective, and masked contrastive modeling. We fine-tune the pretrained models to classify signal-proxy vs. background events and bulk vs. surface interactions. The resulting pretrained backbones provide a basis for future likelihood-amortization and simulation-based inference workflows, while this talk focuses on their impact on these two pulse-shape discrimination (PSD) tasks. We acknowledge support from the DFG under Germany's Excellence Strategy – EXC 2094 (ORIGINS), through the Collaborative Research Center SFB 1258, and from TUM MDSI Seed Funds.
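To make the pretraining and fine-tuning steps concrete, the following minimal sketch illustrates a masked-reconstruction objective in the spirit of the transformer-based autoencoder named above, followed by a binary PSD head on the pretrained encoder. It assumes a PyTorch-style setup; all class names, shapes, and hyperparameters are assumptions of this sketch, not the collaboration's actual implementation.

# Sketch only: masked modeling of 1D HPGe waveforms with a transformer
# encoder. Everything here (patch size, depth, pooling) is an illustrative
# assumption, not LEGEND code.
import torch
import torch.nn as nn

class MaskedWaveformAutoencoder(nn.Module):
    """Patchify a waveform, mask a fraction of patches, reconstruct them."""
    def __init__(self, patch_len=32, max_patches=64, d_model=128,
                 n_layers=4, n_heads=4):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)            # patch -> token
        self.pos = nn.Parameter(torch.zeros(1, max_patches, d_model))
        self.mask_token = nn.Parameter(torch.zeros(d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, patch_len)             # token -> patch

    def tokenize(self, wf):
        # wf: (batch, n_samples) with n_samples divisible by patch_len
        patches = wf.view(wf.shape[0], -1, self.patch_len)
        return patches, self.embed(patches) + self.pos[:, :patches.shape[1]]

    def forward(self, wf, mask_frac=0.5):
        patches, tokens = self.tokenize(wf)
        # Randomly hide a fraction of patch tokens (MAE-style objective).
        mask = torch.rand(tokens.shape[:2], device=wf.device) < mask_frac
        tokens = torch.where(mask.unsqueeze(-1), self.mask_token, tokens)
        recon = self.head(self.encoder(tokens))
        # Reconstruction loss is computed on the masked patches only.
        return ((recon - patches) ** 2)[mask].mean()

class PSDClassifier(nn.Module):
    """Fine-tuning sketch: pretrained backbone plus a binary PSD head."""
    def __init__(self, backbone, d_model=128):
        super().__init__()
        self.backbone = backbone
        self.cls = nn.Linear(d_model, 1)

    def forward(self, wf):
        _, tokens = self.backbone.tokenize(wf)
        feats = self.backbone.encoder(tokens).mean(dim=1)     # pooled features
        return self.cls(feats).squeeze(-1)                    # one logit/event

backbone = MaskedWaveformAutoencoder()
waveforms = torch.randn(8, 1024)     # stand-in for preprocessed HPGe traces
backbone(waveforms).backward()       # one pretraining step on the masked loss
logits = PSDClassifier(backbone)(waveforms)  # e.g. signal-proxy vs. background

The same encoder weights could, in this setup, serve both PSD tasks with separate heads, which is one way a pretrained backbone becomes reusable for downstream inference workflows.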

Keywords: Self-supervised Learning; Foundation Model; Finetuning; Likelihood-Ratio Estimation; Pretraining
