TY - JOUR
T1 - Parameterized neural networks for high-energy physics
AU - Baldi, Pierre
AU - Cranmer, Kyle
AU - Faucett, Taylor
AU - Sadowski, Peter
AU - Whiteson, Daniel
N1 - Funding Information:
We thank Tobias Golling, Daniel Guest, Kevin Lannon, Juan Rojo, Gilles Louppe, and Chase Shimmin for useful discussions. KC is supported by the US National Science Foundation Grants PHY-0955626, PHY-1205376, and ACI-1450310. KC is grateful to UC-Irvine for their hospitality while this research was initiated and the Moore and Sloan foundations for their generous support of the data science environment at NYU. We thank Yuzo Kanomata for computing support. We also wish to acknowledge a hardware grant from NVIDIA, NSF Grant IIS-1550705, and a Google Faculty Research award to PB.
Publisher Copyright:
© 2016, The Author(s).
PY - 2016/5/1
Y1 - 2016/5/1
AB - We investigate a new structure for machine learning classifiers built with neural networks and applied to problems in high-energy physics by expanding the inputs to include not only measured features but also physics parameters. The physics parameters represent a smoothly varying learning task, and the resulting parameterized classifier can smoothly interpolate between them and replace sets of classifiers trained at individual values. This simplifies the training process and gives improved performance at intermediate values, even for complex problems requiring deep learning. Applications include tools parameterized in terms of theoretical model parameters, such as the mass of a particle, which allow for a single network to provide improved discrimination across a range of masses. This concept is simple to implement and allows for optimized interpolatable results.
UR - http://www.scopus.com/inward/record.url?scp=84964849143&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84964849143&partnerID=8YFLogxK
U2 - 10.1140/epjc/s10052-016-4099-4
DO - 10.1140/epjc/s10052-016-4099-4
M3 - Article
AN - SCOPUS:84964849143
SN - 1434-6044
VL - 76
JO - European Physical Journal C
JF - European Physical Journal C
IS - 5
M1 - 235
ER -