Billiards, Chaos, and the 2014 Abel Prize

by Jonathan Kujawa


Yakov Sinai

On March 26th it was announced that Yakov Sinai, a mathematician at Princeton University and the Landau Institute for Theoretical Physics, had won the 2014 Abel Prize. The Abel Prize was established in 2001 by the government of Norway and was first given in 2003. Unlike the more famous Fields Medal, which (in)famously can only be granted to those under the age of forty, the Abel Prize recognizes an individual for the breadth and depth of their entire career. It has quickly become the highest award one can earn in mathematics. Indeed, the list of prizewinners over the past ten years reads like a who's who of influential mathematicians.

Dr. Sinai won the prize “for his fundamental contributions to dynamical systems, ergodic theory, and mathematical physics”. Fortunately, I'm completely unqualified to tell you about Dr. Sinai's work. I say fortunately because Jordan Ellenberg already does an excellent job explaining Dr. Sinai's work in layman's terms as part of the announcement of the winner. You can watch the video here. Dr. Ellenberg gives a very nice twenty-minute overview of Dr. Sinai's work starting at the nine-minute mark. Highly recommended!

I also say fortunately because it gives me the excuse to tell you about some cool math. A big part of Dr. Sinai's work is in the area of “Dynamical Systems.” This is a rare case where the name of a mathematical discipline actually tells you what the field is all about. Simply put, researchers in dynamical systems are interested in studying how a given system changes over time. The artist Tristan Perich explores the same territory by examining the unpredictable dynamics of using computer code to draw in an unsheltered environment.


Tristan Perich's drawing machine in action [0].

This is the sort of math you would be interested in if you want to model and predict the weather, the climate, the stock market, the reaction in the combustion chamber of an engine or in a nuclear explosion, etc. Of course these are all wildly difficult problems. Even with all our modern computing power it's hard to make progress. So here we'll instead think about much, much simpler examples which still exhibit some of the same interesting phenomena.

A boring first example is the second hand on a clock. As each second ticks by the second hand turns another one-sixtieth of the way around the clock face. This system is completely deterministic: if I know where it is now, then I know its entire past and future [1].
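To see just how deterministic this is, here is a minimal Python sketch (my own illustration) of the second hand as a dynamical system: one rule, applied over and over, whose entire past and future can be computed from a single observation.

```python
# The second hand as a dynamical system: each tick rotates it by
# 360/60 = 6 degrees. One observation determines the entire orbit.

def second_hand_angle(angle_now, ticks):
    """Angle in degrees after `ticks` seconds (negative ticks give the past)."""
    return (angle_now + 6 * ticks) % 360

print(second_hand_angle(90, 10))   # 150: where it will be 10 seconds from now
print(second_hand_angle(90, -10))  # 30: where it was 10 seconds ago
```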

This is the clockwork universe of Newtonian physics. If we knew the location, velocity, etc., of every particle in the universe at a given moment, then we could calculate the past and future with perfect accuracy. As it doesn't leave room for independent actions, the prospect of a clockwork universe is equally alarming to religious and free-will types. Indeed, Newton himself rejected such a simplistic view of the universe [2]:

The six primary Planets are revolv'd about the Sun, in circles concentric with the Sun, and with motions directed towards the same parts, and almost in the same plane…. But it is not to be conceived that mere mechanical causes could give birth to so many regular motions…. This most beautiful System of the Sun, Planets, and Comets, could only proceed from the counsel and dominion of an intelligent and powerful Being.

— Newton in Principia Mathematica

But the universe is much more interesting than a clock. A dynamical system is chaotic if slight variations at the start evolve into dramatic differences in the future. Many real world systems are highly chaotic. A bit of sun in Texas in August may result in more snow in Chicago in February. If you drop a small bead into a glass of carbonated water it will be jostled along a complicated path. That path would have been completely different if you had dropped the bead a moment later or into a slightly different place in the glass.


a) being jostled by water molecules leads to b) an unpredictable path [3]

One of Dr. Sinai's early accomplishments was his work with Andrey Kolmogorov in which they introduced what is now called Kolmogorov–Sinai entropy, a precise way to measure where a dynamical system lies on the continuum between deterministic and chaotic.

It turns out that chaos can happen in even seemingly simple dynamical systems. One place to find such systems is when you have a mathematical rule which uses numbers for both inputs and outputs. Like a snake eating its own tail, you can study what happens if you iterate the rule over and over. That is, you plug in an initial input and then whatever output you get is used as your next input, and the output you get from that is your next input, and so on. What happens if you do this over and over for hundreds or thousands of iterations?

A boring example would be the rule which squares each number. Starting with 2 you get 4, 16, 256, 65,536, …. Nothing too exciting happens. And if you start with a nearby number, say 2.1, then the sequence of numbers you get behaves in the same predictable way. But a slight variation of this, where instead you square and add a fixed constant, is a classic example of a chaotic dynamical system. When you work over the complex numbers and color each choice of the constant by how quickly the sequence blows up, you get the Mandelbrot Set. The infinite complexity you see as you zoom in along the edges is nothing but the fact that even very close points can have dramatically divergent behavior as you iterate.
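If you'd like to experiment, here is a short Python sketch of both rules. Counting escape times is the standard way pictures of the Mandelbrot Set are colored; the particular constants straddling c = 1/4 are my own illustrative choices.

```python
# Iterating the squaring rule: nothing surprising happens.
def iterate(rule, x, steps):
    seq = [x]
    for _ in range(steps):
        x = rule(x)
        seq.append(x)
    return seq

print(iterate(lambda x: x * x, 2, 4))  # [2, 4, 16, 256, 65536]

# The square-and-add-a-constant rule z -> z^2 + c, iterated from z = 0.
# We count how many steps it takes |z| to exceed 2, after which the
# sequence is guaranteed to blow up. Coloring each constant c by this
# count is how the Mandelbrot Set pictures are drawn.
def escape_time(c, max_steps=1000):
    z = 0
    for step in range(max_steps):
        if abs(z) > 2:
            return step
        z = z * z + c
    return max_steps  # never escaped: c appears to be in the set

# Two constants a hair's breadth apart, on either side of the boundary:
print(escape_time(0.2499))  # 1000: the sequence stays bounded forever
print(escape_time(0.2501))  # a few hundred steps, then it blows up
```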


The beauty of dynamical systems (aka the Mandelbrot Set). From Wikipedia.

Another natural question to ask of a dynamical system is: if you let it run long enough, does it ever repeat itself? This is another way to measure the system's predictability. Our clock's second hand repeats itself once a minute like, well, clockwork. For something like the weather it's unlikely to the point of impossibility that it will ever repeat itself. But again we can look for examples between these two extremes.

The famous Collatz conjecture is just such a dynamical system. The rule is quite simple. Start with a natural number (i.e., one of the counting numbers: 1, 2, 3, 4, …). If it's even, divide by two. If it's odd, multiply by three and add one. Repeat over and over again and see what happens. Let's say we start with 5. Since it's odd we triple and add one, getting 16. Which is even, so we divide by two, getting 8. Continuing in this way we get 5, 16, 8, 4, 2, 1, 4, 2, 1, 4, 2, 1, …. Notice that in this example we eventually obtained 1 and found ourselves trapped in the closed loop of 4, 2, 1. A closed loop like this in a dynamical system is also called a periodic orbit.
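The rule fits in a few lines of Python (a sketch of my own), and reproduces the orbit of 5 above. Notice that the loop halting for every starting number is exactly what's at stake in the conjecture below.

```python
# The Collatz rule: halve even numbers; triple and add one for odd numbers.
def collatz_step(n):
    return n // 2 if n % 2 == 0 else 3 * n + 1

def collatz_orbit(n):
    """Iterate the rule from n until we first reach 1.
    (That this loop halts for every n is precisely Collatz's conjecture!)"""
    seq = [n]
    while n != 1:
        n = collatz_step(n)
        seq.append(n)
    return seq

print(collatz_orbit(5))        # [5, 16, 8, 4, 2, 1] -- then 4, 2, 1 forever
print(len(collatz_orbit(27)))  # 112: even small numbers can wander a while
```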

Lothar Collatz conjectured in 1937 that if you start with any natural number, then iterating his rule often enough inevitably leads to 1 and, hence, to the closed loop. That is, every starting number leads to a periodic orbit. It's rather addictive to pick numbers and start iterating to see if and when you finally get to 1. You can do this by hand or use one of the online Collatz calculators. People have used computers to verify Collatz's conjecture for every number up to five quintillion! But, of course, that still leaves infinitely many numbers, and it remains an open question whether every starting number leads to 1. Paul Erdős is said to have remarked that “Mathematics is not yet ripe for such problems.”

Lastly, I have to mention dynamical billiards. A bead bobbing about in soda water is an extremely complicated system. Rather than thinking about all those molecules pushing to and fro, we can consider a simplified scenario which is somewhat similar [4]. We will instead think about the two-dimensional surface of a billiard table and a single billiard ball. What happens if we give the ball an initial trajectory and watch as it bounces from wall to wall indefinitely [1]? This is a very simplified model of molecules smashing about in a soda glass, or atoms in a nuclear reaction, or other similar systems.

Once again we find ourselves in a situation where things are simple enough that we can make progress in understanding them but complicated enough to see interesting phenomena. On a circular table everything is determined by the angle at which the ball first bounces off the wall, since that same angle is repeated at every bounce. If the angle is a rational multiple of pi radians, then the path of the ball is periodic: it travels around in a closed path which is not hard to calculate. But if the angle is an irrational multiple of pi, then the path never repeats itself, and the bounce points spread densely around the rim: given any point on the edge, if you wait long enough the ball will strike as close as you like to that point! On a square table the story is similar, with the slope of the initial trajectory playing the deciding role: a rational slope gives a closed, periodic path, while an irrational slope gives a path which never repeats and which eventually comes as close as you like to every point of the table.
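One way to convince yourself of this on the square table is the classic “unfolding” trick: instead of bouncing the ball off a wall, reflect the table, so the bouncing path becomes a straight line which we then fold back into the square. A minimal Python sketch, with my own choice of starting point and slopes:

```python
import math

def fold(u):
    """Fold the real line back into [0, 1]; bouncing between two parallel
    walls turns straight-line motion into this triangle wave."""
    u = u % 2.0
    return u if u <= 1.0 else 2.0 - u

def ball_position(x0, y0, slope, t):
    """Position at time t of a ball on the unit-square table, launched
    from (x0, y0) with the given slope (the unfolding trick)."""
    return fold(x0 + t), fold(y0 + slope * t)

# Rational slope: the path closes up. With slope 1/2 everything repeats
# after t = 4: a periodic orbit.
print(ball_position(0.3, 0.4, 0.5, 0))  # (0.3, 0.4)
print(ball_position(0.3, 0.4, 0.5, 4))  # (0.3, 0.4) again, up to rounding

# Irrational slope: the path never repeats, and it eventually comes as
# close as you like to every point of the table.
print(ball_position(0.3, 0.4, math.sqrt(2), 4))
```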

But even with a single ball on a billiard table there is much we don't know. For example, it's known that every billiard table which is an acute triangle (i.e., one in which all three angles are less than 90 degrees) has a starting trajectory which gives a periodic orbit. But this is unknown for other triangular billiard tables. The current state of the art is work by Richard Schwartz, who showed that every triangle with no angle larger than 100 degrees has a starting trajectory which leads to a periodic orbit. But what about a triangle whose largest angle is 103 degrees? So far nobody knows! Dr. Schwartz has a Java applet on his webpage called McBilliards which lets you play billiards on a triangular table.

Another famous example in this theory is the Bunimovich stadium. Leonid Bunimovich (a student of Sinai) showed that even on an uncomplicated billiard table shaped like a stadium you get chaotic paths. Balls which start nearby with similar trajectories can have widely divergent paths. Dr. Sinai also worked in this part of dynamical systems. In fact, the square billiard table with a circular obstruction in the center is now called the Sinai billiard table.
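To get a feel for the chaos, here is a crude time-stepped simulation of the Sinai table in Python. The geometry, step size, and starting directions are my own choices, and a careful simulation would compute exact collision times, but it is enough to watch two nearly identical launches end up in completely different places.

```python
import math

CX, CY, R = 0.5, 0.5, 0.2  # circular obstruction in the unit square
DT = 1e-3                  # time step (crude; real simulations find exact hits)

def step(x, y, vx, vy):
    nx, ny = x + vx * DT, y + vy * DT
    # Bounce off the square's walls (only if actually heading out).
    if (nx < 0 and vx < 0) or (nx > 1 and vx > 0):
        vx = -vx
    if (ny < 0 and vy < 0) or (ny > 1 and vy > 0):
        vy = -vy
    # Bounce off the circle: reflect the velocity about the normal.
    dx, dy = nx - CX, ny - CY
    if dx * dx + dy * dy < R * R:
        d = math.hypot(dx, dy)
        ndx, ndy = dx / d, dy / d
        dot = vx * ndx + vy * ndy
        if dot < 0:  # only reflect when moving toward the circle's center
            vx, vy = vx - 2 * dot * ndx, vy - 2 * dot * ndy
    return x + vx * DT, y + vy * DT, vx, vy

def run(x, y, angle, steps=20000):
    vx, vy = math.cos(angle), math.sin(angle)
    for _ in range(steps):
        x, y, vx, vy = step(x, y, vx, vy)
    return round(x, 3), round(y, 3)

# Same starting point, directions differing by a millionth of a radian...
print(run(0.1, 0.1, 0.700000))
print(run(0.1, 0.1, 0.700001))  # ...yet the endpoints are typically far apart
```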


A path on the Sinai billiard table. From Wikipedia.

Lest you think Dr. Sinai received the Abel Prize for such simple games, I should be sure to mention that of course he also worked with much more realistic and challenging dynamical systems. But even in these small examples we see that it's an amazingly rich and interesting field.

[0] Thanks to Duke University for the image.

[1] We are, of course, ignoring friction and all other unpleasantries of the real world.

[2] Thanks to Neil deGrasse Tyson for the quote.

[3] Thanks to Liquid Crystals and Photonics Group at Universiteit Gent for the image.

[4] One of the arts of mathematics is finding the sweet spot between problems which are so simple as to be boring and so hard we cannot make any progress on them.