Thursday, February 28, 2013

An Idea That Changed The World



... On Jan. 23, 1913, a century ago this week, mathematician Andrey A. Markov delivered a lecture to the Imperial Academy of Sciences in St. Petersburg on a computational technique now called the Markov chain.

Little noticed in its day, his idea for modeling probability now underpins much of present-day science, statistics, and scientific computing. Simulations of probable events based on vast amounts of data — the weather, a Google search, the behavior of liquids — rely on Markov’s idea.
His lecture engendered a family of concepts, now called Markov chains and Markov processes, that calculate likely outcomes in complex systems. His technique is still evolving and expanding. “This is a growth industry,” said Boston-area science writer Brian Hayes. “You really can’t turn around in the sciences without running into some kind of Markov process.”
Before Markov, the theory of probability involved observing a series of events that were independent of one another. The classic example is flipping a coin, an activity that makes probability easy to calculate.
Markov added the idea of interdependence to probability, the notion that what happens next is linked to what is happening now. That creates a “chain of linked events,” not just a series of independent acts. The world, Markov posited, is not just a series of random events. It is a complex thing, and mathematics can help reveal its hidden interconnectedness and its likely outcomes.
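The difference is easy to see in a few lines of code. Below is a minimal Python sketch, with an invented two-state “weather” chain standing in for any linked system: coin flips have no memory, while each step of the chain is drawn from probabilities attached to the current state.

```python
import random

# Independent events: every coin flip ignores what came before.
def coin_flips(n):
    return [random.choice("HT") for _ in range(n)]

# A two-state Markov chain: the next state depends on the current one.
# The states and probabilities here are invented for illustration.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def markov_walk(start, n):
    state, path = start, [start]
    for _ in range(n - 1):
        nxt = TRANSITIONS[state]
        state = random.choices(list(nxt), weights=list(nxt.values()))[0]
        path.append(state)
    return path

print(coin_flips(10))            # no memory: any sequence is equally likely
print(markov_walk("sunny", 10))  # memory: sunny days tend to follow sunny days
```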
In an article on Markov that will appear in the next issue of American Scientist, Hayes contrasts the probabilistic simplicity of coin flipping with the complexity of the board game “Monopoly.” Moves rely on a roll of the dice, but where the player ends up — Park Place? Jail? — also depends on where the player begins. Suddenly, probability (where you end up) is linked to a present state (where you start). Events are linked, not independent. That’s a Markov chain.
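A stripped-down simulation makes the point. The sketch below models only dice rolls on a 40-square loop, ignoring jail, cards, and the other Monopoly rules that skew the real game’s odds; even so, each move depends on the current square, and the long-run visit counts settle into a stable distribution.

```python
import random
from collections import Counter

BOARD_SIZE = 40  # one state per square, as on a Monopoly board

def roll():
    return random.randint(1, 6) + random.randint(1, 6)  # two dice

position = 0
visits = Counter()
for _ in range(100_000):
    # The next square depends only on the current square and the dice:
    # the Markov property in action.
    position = (position + roll()) % BOARD_SIZE
    visits[position] += 1

# Long-run frequencies approach a stable (here, nearly uniform) distribution;
# it is Monopoly's jail and card rules that make the real game's distribution uneven.
print(visits.most_common(5))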
A “Monopoly” board has 40 possible “states,” the same as the number of squares. But Markov chains now are vastly larger. Google’s PageRank search algorithm, for instance, has as many states as there are pages on the Web, perhaps 40 billion.
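PageRank itself can be sketched at toy scale. The four-page link graph below is invented, and the real algorithm involves many refinements, but the core computation is the same: treat a “random surfer” as a Markov chain over pages and iterate toward its stationary distribution, here with the standard 0.85 damping factor.

```python
# A toy version of the PageRank idea: each page is one state in the chain.
# This four-page link graph is invented; the real chain has one state
# per page on the Web.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["A", "C"],
}
pages = list(links)
damping = 0.85
rank = {p: 1 / len(pages) for p in pages}

for _ in range(50):  # power iteration toward the stationary distribution
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outgoing in links.items():
        share = damping * rank[page] / len(outgoing)
        for target in outgoing:
            new_rank[target] += share
    rank = new_rank

print({p: round(r, 3) for p, r in sorted(rank.items())})
```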
In his 1913 lecture, Markov introduced his technique by analyzing the frequency of vowels and consonants in a work of literature. (He used the first 20,000 letters of Alexander Pushkin’s 1833 verse novel “Eugene Onegin,” a work that almost every Russian knew.)
Such numerical analysis “is about the most primitive and superficial thing you can do to a poem,” said Hayes. But it proved what Markov wanted to show: that letters in language are interdependent, and that over time they converge into stable patterns. Such stable patterns of behavior in a complex system are at the heart of what modern scientists want — a simulation of reality, whether at the level of a cell or of the entire Web.
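Markov’s tally is easy to reproduce. The Python sketch below substitutes a short English sentence for the Russian original and treats “y” as a consonant; it estimates the transition probabilities of a two-state vowel/consonant chain, much as Markov did by hand.

```python
from collections import Counter

# Stand-in text; Markov tallied the first 20,000 letters of "Eugene Onegin".
SAMPLE_TEXT = (
    "Probability runs through language itself, with vowels and "
    "consonants following one another in surprisingly stable patterns."
)

VOWELS = set("aeiou")  # Markov worked in Cyrillic; English vowels stand in here

letters = ["V" if c in VOWELS else "C" for c in SAMPLE_TEXT.lower() if c.isalpha()]

pair_counts = Counter(zip(letters, letters[1:]))   # (current, next) pairs
state_counts = Counter(letters[:-1])               # how often each state occurs

# Conditional probabilities: how likely a vowel or consonant follows each state.
for (prev, nxt), n in sorted(pair_counts.items()):
    print(f"P({nxt} after {prev}) = {n / state_counts[prev]:.2f}")
```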
