TY - JOUR
T1 - A perspective on machine learning in turbulent flows
AU - Pandey, Sandeep
AU - Schumacher, Jörg
AU - Sreenivasan, Katepalli R.
N1 - Funding Information:
SP is supported by the Deutsche Forschungsgemeinschaft under grant SCHU 1410/30-1. We thank Enrico Fonda for his initial work on predicting extreme events, the outcome of which is reported here, Ambrish Pandey and Chris Hanson for several discussions, and Robert Kräuter for his help with analysing the performance of the U-net for different Prandtl numbers. The research was also supported by supercomputing resources provided through project grant HIL12 of the John von Neumann Institute for Computing (NIC) at the Jülich Supercomputing Centre and through the Large Scale Project pr62se of the Gauss Centre for Supercomputing (GCS) at the Leibniz Rechenzentrum Garching.
Publisher Copyright:
© 2020 Informa UK Limited, trading as Taylor & Francis Group.
PY - 2020/10/2
Y1 - 2020/10/2
N2 - The physical complexity and the large number of degrees of freedom that can be resolved today by direct numerical simulations of turbulent flows, and by the most sophisticated experimental techniques, require new strategies to reduce and analyse the data so generated, and to model the turbulent behaviour. We discuss a few concrete examples for which turbulence data have been analysed by machine learning tools. We also comment on work in neighbouring fields of physics, particularly astrophysical (and astronomical) work, where Big Data has been the paradigm for some time. We discuss unsupervised, semi-supervised and supervised machine learning methods applied to direct numerical simulation data of homogeneous isotropic turbulence, Rayleigh-Bénard convection, and the minimal flow unit of a turbulent channel flow; for the last case, we discuss in some detail the application of echo state networks, which are one implementation of reservoir computing. The paper also provides a brief perspective on machine learning applications more broadly.
AB - The physical complexity and the large number of degrees of freedom that can be resolved today by direct numerical simulations of turbulent flows, and by the most sophisticated experimental techniques, require new strategies to reduce and analyse the data so generated, and to model the turbulent behaviour. We discuss a few concrete examples for which turbulence data have been analysed by machine learning tools. We also comment on work in neighbouring fields of physics, particularly astrophysical (and astronomical) work, where Big Data has been the paradigm for some time. We discuss unsupervised, semi-supervised and supervised machine learning methods applied to direct numerical simulation data of homogeneous isotropic turbulence, Rayleigh-Bénard convection, and the minimal flow unit of a turbulent channel flow; for the last case, we discuss in some detail the application of echo state networks, which are one implementation of reservoir computing. The paper also provides a brief perspective on machine learning applications more broadly.
KW - Fully developed turbulence
KW - data-driven turbulence research
KW - machine learning
UR - http://www.scopus.com/inward/record.url?scp=85084997286&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85084997286&partnerID=8YFLogxK
U2 - 10.1080/14685248.2020.1757685
DO - 10.1080/14685248.2020.1757685
M3 - Article
AN - SCOPUS:85084997286
SN - 1468-5248
VL - 21
SP - 567
EP - 584
JO - Journal of Turbulence
JF - Journal of Turbulence
IS - 9-10
ER -