Neural Networks as Geometric Chaotic Maps

11 Dec 2019 · Ziwei Li, Sai Ravela

The use of artificial neural networks as models of chaotic dynamics has been expanding rapidly. Still, a theoretical understanding of how neural networks learn chaos is lacking. Here, we employ a geometric perspective to show that neural networks can efficiently model chaotic dynamics by becoming structurally chaotic themselves. We first confirm the efficiency of neural networks in emulating chaos by showing that a parsimonious neural network trained on only a few data points can reconstruct strange attractors, extrapolate beyond the boundaries of the training data, and accurately predict local divergence rates. We then posit that the trained network's map comprises sequential geometric stretching, rotation, and compression operations. These geometric operations indicate topological mixing and chaos, explaining why neural networks are naturally suited to emulating chaotic dynamics.
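The setup described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Lorenz-63 system (with its classical parameters), the network width, and the training procedure below are all assumptions chosen to show a small network learning a one-step chaotic map from limited data.

```python
import numpy as np

def lorenz_rhs(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the classical Lorenz-63 system (illustrative choice)."""
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(s, dt=0.01):
    """One fourth-order Runge-Kutta step of the Lorenz flow."""
    k1 = lorenz_rhs(s)
    k2 = lorenz_rhs(s + 0.5 * dt * k1)
    k3 = lorenz_rhs(s + 0.5 * dt * k2)
    k4 = lorenz_rhs(s + dt * k3)
    return s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Build (state, next-state) training pairs from a single short trajectory.
traj = [np.array([1.0, 1.0, 1.0])]
for _ in range(2000):
    traj.append(rk4_step(traj[-1]))
traj = np.array(traj)
X, Y = traj[:-1], traj[1:]
mu, sd = X.mean(axis=0), X.std(axis=0)
Xn, Yn = (X - mu) / sd, (Y - mu) / sd  # normalize for stable training

# A parsimonious one-hidden-layer tanh network, trained by full-batch
# gradient descent to emulate the one-step map s_t -> s_{t+1}.
rng = np.random.default_rng(0)
H, lr = 32, 0.2  # hidden width and learning rate are illustrative
W1 = rng.normal(0.0, 0.5, (3, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.5, (H, 3)); b2 = np.zeros(3)

losses = []
for _ in range(300):
    h = np.tanh(Xn @ W1 + b1)             # hidden activations
    pred = h @ W2 + b2                    # predicted (normalized) next state
    err = pred - Yn
    losses.append(float((err ** 2).mean()))
    dpred = 2.0 * err / err.size          # d(MSE)/d(pred)
    dW2, db2 = h.T @ dpred, dpred.sum(axis=0)
    dz = (dpred @ W2.T) * (1.0 - h ** 2)  # backprop through tanh
    dW1, db1 = Xn.T @ dz, dz.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"training loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Iterating the trained map from a single initial condition then traces out an approximation of the attractor; the paper's geometric argument concerns how the layers of such a network realize the stretching, rotation, and compression that make this emulation possible.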

