Noise, chaos, and (ε, τ)-entropy per unit time

Pierre Gaspard, Xiao Jing Wang

Research output: Contribution to journal › Review article › peer-review

Abstract

The degree of dynamical randomness of different time processes is characterized in terms of the (ε, τ)-entropy per unit time. The (ε, τ)-entropy is the amount of information generated per unit time, at different scales τ of time and ε of the observables. This quantity generalizes the Kolmogorov-Sinai entropy per unit time from deterministic chaotic processes to stochastic processes such as fluctuations in mesoscopic physico-chemical phenomena or strong turbulence in macroscopic spacetime dynamics. The random processes characterized include chaotic systems, Bernoulli and Markov chains, Poisson and birth-and-death processes, Ornstein-Uhlenbeck and Yaglom noises, fractional Brownian motions, different regimes of hydrodynamical turbulence, and the Lorentz-Boltzmann process of nonequilibrium statistical mechanics. We also extend the (ε, τ)-entropy to spacetime processes such as cellular automata, Conway's Game of Life, lattice gas automata, coupled maps, spacetime chaos in partial differential equations, as well as the ideal, Lorentz, and hard-sphere gases. Through these examples it is demonstrated that the (ε, τ)-entropy provides a unified quantitative measure of dynamical randomness for both chaos and noise, and a method to detect transitions between dynamical states of different degrees of randomness as a parameter of the system is varied.
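
As a concrete illustration of the quantity defined in the abstract, below is a minimal sketch (not taken from the paper) of a Cohen-Procaccia-style estimate of the (ε, τ)-entropy per unit time for a scalar time series sampled at interval τ: block entropies H(n, ε) are estimated from the fraction of length-n windows that stay within ε of each reference window (maximum norm), and h(ε, τ) is approximated by the slope [H(n+1, ε) - H(n, ε)]/τ. The function names, the choice of norm, and the parameter values are illustrative assumptions, not the authors' implementation.

# Minimal sketch (assumptions: uniformly sampled scalar series x with step tau;
# maximum norm to compare length-n windows; n small enough that recurrences
# within eps are still observed).
import numpy as np

def block_entropy(x, n, eps):
    """H(n, eps): mean of -ln(fraction of length-n windows within eps of each
    reference window, in the maximum norm)."""
    windows = np.lib.stride_tricks.sliding_window_view(x, n)  # shape (N, n)
    N = windows.shape[0]
    H = 0.0
    for i in range(N):
        # Max-norm distance of window i to every window of the series.
        d = np.max(np.abs(windows - windows[i]), axis=1)
        frac = np.count_nonzero(d < eps) / N  # >= 1/N: each window matches itself
        H -= np.log(frac)
    return H / N

def entropy_per_unit_time(x, tau, eps, n):
    """h(eps, tau) ~ [H(n+1, eps) - H(n, eps)] / tau."""
    return (block_entropy(x, n + 1, eps) - block_entropy(x, n, eps)) / tau

# Example: for white Gaussian noise h(eps, tau) keeps growing as eps shrinks,
# whereas a deterministic chaotic signal saturates at its KS entropy.
rng = np.random.default_rng(0)
noise = rng.standard_normal(2000)
print(entropy_per_unit_time(noise, tau=1.0, eps=0.5, n=2))

In practice one would average over a range of n and check convergence before reading off h(ε, τ); the point of the sketch is only that the estimate depends explicitly on both the resolution ε and the sampling time τ.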

Original language: English (US)
Pages (from-to): 291-343
Number of pages: 53
Journal: Physics Reports
Volume: 235
Issue number: 6
DOIs
State: Published - Dec 1993

ASJC Scopus subject areas

  • General Physics and Astronomy
