Machine learning’s “amazing” ability to predict the future of chaotic systems

Half a century ago, the founders of chaos theory discovered that the “butterfly effect” makes long-term prediction of a chaotic system impossible. Even a minimal disturbance to a complex system (the weather, the economy, and so on) can trigger a chain of events that renders the future unpredictable. Since we can never determine the current state of such systems with perfect accuracy, we cannot predict how they will evolve. But now machine learning is coming to the rescue.


In a series of studies published in the journals Physical Review Letters and Chaos, scientists have used machine learning (the same technique behind recent advances in artificial intelligence) to predict the evolution of chaotic systems out to astonishingly distant horizons. Outside experts call the approach groundbreaking and expect it to come into wide use soon.


The findings come from chaos theory veteran Edward Ott and four collaborators at the University of Maryland. They used a machine learning algorithm called reservoir computing to “learn” the dynamics of an archetypal chaotic system known as the Kuramoto-Sivashinsky equation. An evolving solution of this equation behaves like a flame front advancing through a combustible medium. The equation also describes drift waves in plasmas and other phenomena and, according to Jaideep Pathak, Ott’s graduate student and lead author of the research, serves as “a test bed for studying turbulence and spatiotemporal chaos.”


After training on past data, the researchers’ “reservoir computer” could accurately predict how the flame-like system would evolve out to eight “Lyapunov times,” in other words, eight times further than previous methods allowed. The Lyapunov time measures how long it takes two nearly identical states of a chaotic system to diverge exponentially; in effect, it is the time it takes for the system to descend into full chaos.


“This is really very good,” says Holger Kantz, a chaos theory researcher at the Max Planck Institute for the Physics of Complex Systems. Being able to predict a chaotic system for eight Lyapunov times means, figuratively speaking, that the machine learning scheme is nearly as good as knowing the truth.

The algorithm knows nothing about the Kuramoto-Sivashinsky equation itself; it sees only recorded data from an evolving solution of the equation. This is what makes machine learning so powerful here: in many cases the equations describing a chaotic system are unknown, which cripples attempts to model and predict it. The Ott group’s results suggest that the equations are not needed at all; only data is needed. As Kantz puts it: “It is conceivable that one day we will predict the weather with machine learning algorithms rather than with sophisticated models of the atmosphere.”
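The data-only idea can be made concrete with a minimal reservoir-computing sketch. This is not the authors’ code; the system (the fully chaotic logistic map), the reservoir size, and all parameter values are illustrative choices. A fixed random recurrent network is driven by the data, and only a linear readout is trained; the trained network is then run in a closed loop to forecast:

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: a long trajectory of the chaotic logistic map x -> 4x(1-x).
def logistic_series(x0, n):
    xs = [x0]
    for _ in range(n - 1):
        xs.append(4.0 * xs[-1] * (1.0 - xs[-1]))
    return np.array(xs)

data = logistic_series(0.2, 3000)

# The "reservoir": a fixed random recurrent network; it is never trained.
N = 300                                     # reservoir size (illustrative)
W_in = rng.uniform(-0.5, 0.5, size=N)       # fixed random input weights
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius to 0.9

def run_reservoir(inputs):
    """Drive the reservoir with a scalar input sequence, collecting states."""
    r = np.zeros(N)
    states = np.empty((len(inputs), N))
    for t, u in enumerate(inputs):
        r = np.tanh(W @ r + W_in * u)
        states[t] = r
    return states

# Training: ridge regression of the next value onto the reservoir state.
R = run_reservoir(data[:-1])
targets = data[1:]
ridge = 1e-6
W_out = np.linalg.solve(R.T @ R + ridge * np.eye(N), R.T @ targets)

# Autonomous forecasting: feed the reservoir its own output.
r, x = R[-1], data[-1]
predictions = []
for _ in range(5):
    r = np.tanh(W @ r + W_in * x)
    x = r @ W_out
    predictions.append(x)
```

The key point the article makes appears directly in the code: nowhere does the forecaster use the map’s equation; it only ever sees the `data` array.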
Beyond weather forecasting, experts say, the machine learning technique could help monitor cardiac arrhythmias for signs of impending heart attacks and monitor neuronal firing in the brain for signs of impending spikes. Speculatively, it might also help predict the rogue waves that endanger ships, and possibly even earthquakes.

Ott, Pathak, and their colleagues Brian Hunt, Michelle Girvan, and Zhixin Lu (who is now at the University of Pennsylvania) achieved their results by combining several existing tools.

Ott and his colleagues exploited the locality of interactions in spatially extended chaotic systems. Locality means that variables at one location depend on variables at nearby locations but not on variables far away. “Using that,” Pathak explains, “we can essentially split up the problem into chunks.” That is, one reservoir of neurons can learn about one patch of the system, another reservoir about the next patch, and so on, with slight overlaps between neighboring domains to account for their interactions.

This parallelization lets the reservoir computing approach handle chaotic systems of almost any size, as long as proportionate computing power is available for the problem.
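The domain-splitting scheme can be sketched as a simple partitioning step. This is only an illustration of the slicing logic, not the authors’ implementation; the grid size, patch count, and overlap width are made-up values:

```python
def overlapping_patches(n_sites, n_patches, halo):
    """Split sites 0..n_sites-1 into equal cores, each padded with
    `halo` neighboring sites on both ends (periodic boundaries)."""
    core = n_sites // n_patches
    patches = []
    for p in range(n_patches):
        start, end = p * core, (p + 1) * core
        core_idx = list(range(start, end))
        padded_idx = [i % n_sites for i in range(start - halo, end + halo)]
        patches.append((core_idx, padded_idx))
    return patches

# 64 spatial grid points, 8 reservoirs, a 3-site overlap on each side:
patches = overlapping_patches(64, 8, 3)
# Each reservoir would read its padded slice of the field but be trained
# to predict only its core slice; the cores tile the full domain exactly.
```

Because each reservoir only ever sees its padded slice, the reservoirs can be trained and run in parallel, which is what makes the approach scale to large systems.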

The researchers show that their predicted solution to the Kuramoto-Sivashinsky equation closely tracks the true solution for eight Lyapunov times before chaos finally wins and the actual and predicted states of the system diverge.

The usual approach to forecasting a chaotic system is to measure its conditions as accurately as possible at a single moment, use this data to calibrate a physical model, and then evolve the model forward. By a rough estimate, to extend the forecast horizon of a typical chaotic system eightfold, you would have to measure its initial conditions 100,000,000 times more accurately.
That is why machine learning is “a very useful and powerful approach,” says Ulrich Parlitz of the Max Planck Institute for Dynamics and Self-Organization. “I think it’s not only working in the example they have considered but is universal in some sense and can be applied to many processes and systems.”
Since the article in Physical Review Letters, Ott, Pathak, Girvan, Lu, and their collaborators have come closer to a practical implementation of their forecasting technique. In new studies accepted for publication in Chaos, they show that predictions of chaotic systems such as the Kuramoto-Sivashinsky equation improve markedly when the data-driven machine learning approach is hybridized with traditional model-based forecasting. Ott sees this as the more likely route to better weather forecasting and similar efforts, since we do not always have complete high-resolution data or perfect physical models. “What we should do is use the good knowledge that we have where we have it,” he says, “and if we have ignorance we should use the machine learning to fill in the gaps where the ignorance resides.” The reservoir’s predictions can essentially recalibrate the models; in the case of the Kuramoto-Sivashinsky equation, accurate forecasts can be extended to 12 Lyapunov times.
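A toy example conveys the hybrid idea, though it is not the authors’ scheme: take an imperfect “physics model” of a chaotic map and use learning only to fill in the model’s error. Every name and parameter value below is invented for illustration; here the model’s flaw is a slightly wrong parameter, and the learned correction is a simple polynomial regression rather than a reservoir:

```python
import numpy as np

TRUE_R, MODEL_R = 4.0, 3.9        # our "physics model" has a wrong parameter

def step(x, r):                   # logistic map x -> r x (1 - x)
    return r * x * (1.0 - x)

# Historical observations of the true system.
xs = [0.2]
for _ in range(2000):
    xs.append(step(xs[-1], TRUE_R))
xs = np.array(xs)

# Data-driven part: regress the imperfect model's one-step error onto
# simple polynomial features of the current state (an illustrative choice).
X = np.column_stack([np.ones(len(xs) - 1), xs[:-1], xs[:-1] ** 2])
model_error = xs[1:] - step(xs[:-1], MODEL_R)
coef, *_ = np.linalg.lstsq(X, model_error, rcond=None)

def hybrid_step(x):
    """Imperfect physical model plus the learned correction."""
    return step(x, MODEL_R) + np.array([1.0, x, x * x]) @ coef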

The Lyapunov time varies from system to system, ranging from milliseconds to millions of years (it is a few days for the weather). The shorter it is, the more unstable the system, that is, the more prone to the butterfly effect.
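The Lyapunov time can be estimated directly from a trajectory. A small sketch, using the fully chaotic logistic map rather than any system from the article (this map is chosen because its exponent is known exactly to be ln 2): the Lyapunov exponent is the average logarithmic stretching rate along a trajectory, and the Lyapunov time is its inverse.

```python
import math

def f(x):                     # fully chaotic logistic map, x -> 4x(1-x)
    return 4.0 * x * (1.0 - x)

# Average the log of the local stretching factor |f'(x)| = |4 - 8x|
# along a long trajectory.
x, total, n = 0.2, 0.0, 100_000
for _ in range(n):
    total += math.log(abs(4.0 - 8.0 * x))
    x = f(x)
lyapunov_exponent = total / n            # known value for this map: ln 2
lyapunov_time = 1.0 / lyapunov_exponent  # iterations per e-fold of error growth
```

Each additional Lyapunov time of forecast horizon lets an initial measurement error grow by another factor of e, which is why horizons measured in Lyapunov times are the natural yardstick.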

Wilkinson and Kantz define chaos in terms of stretching and folding, much like the repeated rolling and folding of dough in the making of puff pastry. A patch of dough stretches horizontally under the rolling pin, rapidly separating in two spatial directions. Then the dough is folded over and flattened, compressing nearby patches in the vertical direction. The weather, wildfires, the churning surface of the sun, and all other chaotic systems act just this way, Kantz says. “In order to have this exponential divergence of trajectories you need this stretching, and in order not to run away to infinity you need some folding,” which comes from the nonlinear relationships between the variables of the systems.
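The stretch-and-fold picture has a classic one-line embodiment, the tent map (a standard textbook example, not a system discussed in the article): it stretches the interval by a factor of two and folds the right half back onto the left, so nearby points separate exponentially yet never escape.

```python
def tent(x):
    """One stretch-and-fold step: stretch [0,1] by a factor of two,
    then fold the right half back onto the left."""
    return 2.0 * x if x < 0.5 else 2.0 * (1.0 - x)

# Two dough specks that start almost on top of each other:
a, b = 0.333, 0.333 + 1e-9
separations = []
for _ in range(25):
    a, b = tent(a), tent(b)
    separations.append(abs(a - b))
# The separation roughly doubles each step (the stretching), yet both
# points remain inside [0, 1] forever (the folding).
```

This is exactly Kantz’s point in miniature: the doubling supplies the exponential divergence of trajectories, and the fold keeps everything bounded.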

Precisely why reservoir computing is so good at learning the dynamics of chaotic systems is not yet well understood, beyond the idea that the computer tunes its own formulas in response to data until the formulas replicate the system’s dynamics. The technique works so well that Ott and some other University of Maryland researchers now intend to use chaos theory as a way to better understand the internal machinery of neural networks.
