
History of randomness

Ancient fresco of dice players in Pompeii

In ancient history, the concepts of chance and randomness were intertwined with that of fate. Many ancient peoples threw dice to determine fate, and this later evolved into games of chance. At the same time, most ancient cultures used various methods of divination to attempt to circumvent randomness and fate.[1][2] Beyond religion and games of chance, randomness has been attested for sortition since at least ancient Athenian democracy in the form of a kleroterion.[3]

Odds and chance were perhaps first formalized by the Chinese some 3,000 years ago. The Greek philosophers discussed randomness at length, but only in non-quantitative terms. It was only in the sixteenth century that Italian mathematicians began to formalize the odds associated with various games of chance. The invention of modern calculus gave further impetus to the formal study of randomness. In the 19th century the concept of entropy was introduced in physics.

The early part of the twentieth century saw a rapid growth in the formal analysis of randomness, and mathematical foundations for probability were introduced, leading to its axiomatization in 1933. At the same time, the advent of quantum mechanics changed the scientific perspective on determinacy. In the mid-to-late 20th century, ideas of algorithmic information theory introduced a new dimension to the field via the concept of algorithmic randomness.

Although randomness had often been viewed as an obstacle and a nuisance for many centuries, in the twentieth century computer scientists began to realize that the deliberate introduction of randomness into computations can be an effective tool for designing better algorithms. In some cases, such randomized algorithms are able to outperform the best deterministic methods.

Antiquity to the Middle Ages

Depiction of Roman goddess Fortuna who determined fate, by Hans Beham, 1541

Pre-Christian people along the Mediterranean threw dice to determine fate, and this later evolved into games of chance.[4] There is also evidence of games of chance played by ancient Egyptians, Hindus and Chinese, dating back to 2100 BC.[5] The Chinese used dice before the Europeans, and have a long history of playing games of chance.[6]

Over 3,000 years ago, problems concerning the tossing of several coins were considered in the I Ching, one of the oldest Chinese mathematical texts, which probably dates to 1150 BC. Its two principal elements, yin and yang, were combined in various forms to produce heads-and-tails permutations of the type HH, TH, HT, etc., and the Chinese seem to have been aware of Pascal's triangle long before the Europeans formalized it in the 17th century.[7] However, Western philosophy focused on the non-mathematical aspects of chance and randomness until the 16th century.
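The coin-toss combinatorics described above can be sketched in a few lines of Python (an illustrative example, not part of the historical record): enumerating the heads-and-tails permutations of n tosses, and the binomial counts that form a row of Pascal's triangle.

```python
from itertools import product
from math import comb

# All heads/tails permutations of n tosses, like the HH, TH, HT
# patterns built in the I Ching from combinations of yin and yang.
def toss_permutations(n):
    return ["".join(p) for p in product("HT", repeat=n)]

# Counting outcomes by the number of heads gives the binomial
# coefficients, i.e. row n of Pascal's triangle.
def pascal_row(n):
    return [comb(n, k) for k in range(n + 1)]

print(toss_permutations(2))  # ['HH', 'HT', 'TH', 'TT']
print(pascal_row(4))         # [1, 4, 6, 4, 1]
```

The row sums to 2^n, the total number of permutations, tying the two views together.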

The development of the concept of chance throughout history has been very gradual. Historians have wondered why progress in the field of randomness was so slow, given that humans have encountered chance since antiquity. Deborah J. Bennett suggests that ordinary people face an inherent difficulty in understanding randomness, although the concept is often taken as being obvious and self-evident. She cites studies by Kahneman and Tversky; these concluded that statistical principles are not learned from everyday experience because people do not attend to the detail necessary to gain such knowledge.[8]

The Greek philosophers were the earliest Western thinkers to address chance and randomness. Around 400 BC, Democritus presented a view of the world as governed by the unambiguous laws of order and considered randomness as a subjective concept that only originated from the inability of humans to understand the nature of events. He used the example of two men who would send their servants to bring water at the same time to cause them to meet. The servants, unaware of the plan, would view the meeting as random.[9]

Aristotle saw chance and necessity as opposite forces. He argued that nature had rich and constant patterns that could not be the result of chance alone, but that these patterns never displayed the machine-like uniformity of necessary determinism. He viewed randomness as a genuine and widespread part of the world, but as subordinate to necessity and order.[10] Aristotle classified events into three types: certain events that happen necessarily; probable events that happen in most cases; and unknowable events that happen by pure chance. He considered the outcome of games of chance as unknowable.[11]

Around 300 BC Epicurus proposed the concept that randomness exists by itself, independent of human knowledge. He believed that in the atomic world, atoms would swerve at random along their paths, bringing about randomness at higher levels.[12]

Hotei, the deity of fortune, observing a cockfight in a 16th-century Japanese print

For several centuries thereafter, the idea of chance continued to be intertwined with fate. Divination was practiced in many cultures, using diverse methods. The Chinese analyzed the cracks in turtle shells, while the Germans, who according to Tacitus had the highest regard for lots and omens, utilized strips of bark.[13] In the Roman Empire, chance was personified by the goddess Fortuna. The Romans would partake in games of chance to simulate what Fortuna would have decided. In 49 BC, Julius Caesar allegedly made his fateful decision to cross the Rubicon after throwing dice.[14][unreliable source?]

Aristotle's classification of events into three classes (certain, probable and unknowable) was adopted by Roman philosophers, but they had to reconcile it with deterministic Christian teachings in which even events unknowable to man were considered to be predetermined by God. Around 960, Bishop Wibold of Cambrai correctly enumerated the 56 different outcomes (without permutations) of playing with three dice. No reference to playing cards has been found in Europe before 1350. The Church preached against card playing, and card games spread much more slowly than games based on dice.[15] The Christian Church specifically forbade divination; and wherever Christianity went, divination lost most of its old-time power.[16][17]
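Wibold's count is easy to verify: when permutations are ignored, the outcomes of three six-sided dice are the multisets of size three drawn from six faces, of which there are C(6+3−1, 3) = 56. A quick check in Python (illustrative, not from the source):

```python
from itertools import combinations_with_replacement
from math import comb

# Outcomes of three dice when permutations are not distinguished:
# multisets of size 3 drawn from the faces 1..6.
outcomes = list(combinations_with_replacement(range(1, 7), 3))

print(len(outcomes))       # 56, the number Wibold enumerated
print(comb(6 + 3 - 1, 3))  # 56, via the multiset-coefficient formula
```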

Over the centuries, many Christian scholars wrestled with the conflict between the belief in free will and its implied randomness, and the idea that God knows everything that happens. Saints Augustine and Aquinas tried to reach an accommodation between foreknowledge and free will, but Martin Luther argued against randomness and took the position that God's omniscience renders human actions unavoidable and determined.[18] In the 13th century, Thomas Aquinas viewed randomness not as the result of a single cause, but of several causes coming together by chance. While he believed in the existence of randomness, he rejected it as an explanation of the end-directedness of nature, for he saw too many patterns in nature to have been obtained by chance.[19]

The Greeks and Romans had not noticed the magnitudes of the relative frequencies in games of chance. For centuries, chance was discussed in Europe without any mathematical foundation, and it was only in the 16th century that Italian mathematicians began to discuss the outcomes of games of chance as ratios.[20][21][22] In his 1565 Liber de Ludo Aleae (a gambler's manual published after his death), Gerolamo Cardano wrote one of the first formal tracts analyzing the odds of winning at various games.[23]

17th–19th centuries

Statue of Blaise Pascal, Louvre

Around 1620 Galileo wrote a paper called On a discovery concerning dice that used an early probabilistic model to address specific questions.[24] In 1654, prompted by Chevalier de Méré's interest in gambling, Blaise Pascal corresponded with Pierre de Fermat, and much of the groundwork for probability theory was laid. Pascal's Wager was noted for its early use of the concept of infinity, and the first formal use of decision theory. The work of Pascal and Fermat influenced Leibniz's work on the infinitesimal calculus, which in turn provided further momentum for the formal analysis of probability and randomness.

The first known suggestion for viewing randomness in terms of complexity was made by Leibniz in an obscure 17th-century document discovered after his death. Leibniz asked how one could know whether a set of points on a piece of paper had been selected at random (e.g. by splattering ink) or not. Given that for any finite set of points there is always a mathematical equation that can describe them (e.g. by Lagrangian interpolation), the question turns on how the points are expressed mathematically. Leibniz viewed the points as random if the function describing them had to be extremely complex. Three centuries later, the same concept was formalized as algorithmic randomness by A. N. Kolmogorov and Gregory Chaitin, who defined the randomness of a finite string in terms of the minimal length of a computer program needed to produce it.[25]

The Doctrine of Chances, the first textbook on probability theory, was published in 1718, and the field continued to grow thereafter.[26] The frequency theory approach to probability was first developed by Robert Ellis and John Venn late in the 19th century.

The Fortune Teller by Vouet, 1617

While the mathematical elite was making progress in understanding randomness from the 17th to the 19th century, the public at large continued to rely on practices such as fortune telling in the hope of taming chance. Fortunes were told in a multitude of ways both in the Orient (where fortune telling was later termed an addiction) and in Europe by gypsies and others.[27][28] English practices such as the reading of eggs dropped in a glass were exported to Puritan communities in North America.[29]

"I know of scarcely anything so apt to impress the imagination as the wonderful form of cosmic order expressed by the 'Law of Frequency of Error.' The law would have been personified by the Greeks and deified, if they had known of it. It reigns with serenity and in complete self-effacement amidst the wildest confusion. The huger the mob, and the greater the apparent anarchy, the more perfect is its sway. It is the supreme law of Unreason. Whenever a large sample of chaotic elements are taken in hand and marshalled in the order of their magnitude, an unsuspected and most beautiful form of regularity proves to have been latent all along. The tops of the marshalled row form a flowing curve of invariable proportions; and each element, as it is sorted into place, finds, as it were, a pre-ordained niche, accurately adapted to fit it."

Galton (1894)[30]

The term entropy, which is now a key element in the study of randomness, was coined by Rudolf Clausius in 1865 as he studied heat engines in the context of the second law of thermodynamics. Clausius was the first to state "entropy always increases".[31]

From the time of Newton until about 1890, it was generally believed that if one knew the initial state of a system with great accuracy, and if all the forces acting on the system could be formulated with equal accuracy, it would be possible, in principle, to predict the state of the universe for an infinitely long time. The limits of such predictions in physical systems became clear as early as 1893, when Henri Poincaré showed that in the three-body problem in astronomy, small changes to the initial state could result in large changes in trajectories during the numerical integration of the equations.[32]

During the 19th century, as probability theory was formalized and better understood, the attitude towards "randomness as nuisance" began to be questioned. Goethe wrote:

The tissue of the world is built from necessities and randomness; the intellect of men places itself between both and can control them; it considers the necessity and the reason of its existence; it knows how randomness can be managed, controlled, and used.

The words of Goethe proved prophetic when, in the 20th century, randomized algorithms were discovered to be powerful tools.[33] By the end of the 19th century, Newton's model of a mechanical universe was fading away as the statistical view of the collision of molecules in gases was studied by Maxwell and Boltzmann.[34] Boltzmann's equation S = k log W (inscribed on his tombstone) first related entropy with logarithms.

20th century

Antony Gormley's Quantum Cloud sculpture in London was designed by a computer using a random walk algorithm.

During the 20th century, the five main interpretations of probability theory (classical, logical, frequency, propensity and subjective) became better understood, and were discussed, compared and contrasted.[35] A significant number of application areas were developed in this century, from finance to physics. In 1900 Louis Bachelier applied Brownian motion to evaluate stock options, effectively launching the fields of financial mathematics and stochastic processes.

Émile Borel was one of the first mathematicians to formally address randomness in 1909, and introduced normal numbers.[36] In 1919 Richard von Mises gave the first definition of algorithmic randomness via the impossibility of a gambling system. He advanced the frequency theory of randomness in terms of what he called the collective, i.e. a random sequence.[37] Von Mises regarded the randomness of a collective as an empirical law, established by experience. He related the "disorder" or randomness of a collective to the lack of success of attempted gambling systems. This approach led him to suggest a definition of randomness that was later refined and made mathematically rigorous by Alonzo Church by using computable functions in 1940.[38] Von Mises likened the principle of the impossibility of a gambling system to the principle of the conservation of energy, a law that cannot be proven, but has held true in repeated experiments.[39]

Von Mises never totally formalized his rules for sub-sequence selection, but in his 1940 paper "On the concept of random sequence", Alonzo Church suggested that the functions used for place settings in the formalism of von Mises be computable functions rather than arbitrary functions of the initial segments of the sequence, appealing to the Church–Turing thesis on effectiveness.[40][41]

The advent of quantum mechanics in the early 20th century and the formulation of the Heisenberg uncertainty principle in 1927 brought an end to the Newtonian mindset among physicists regarding the determinacy of nature. In quantum mechanics, there is not even a way to consider all the observable elements of a system as random variables at once, since many observables do not commute.[42]

Café Central, one of the early meeting places of the Vienna circle

By the early 1940s, the frequency theory approach to probability was well accepted within the Vienna circle, but in the 1950s Karl Popper proposed the propensity theory.[43][44] Given that the frequency approach cannot deal with "a single toss" of a coin, and can only address large ensembles or collectives, the single-case probabilities were treated as propensities or chances. The concept of propensity was also driven by the desire to handle single-case probability settings in quantum mechanics, e.g. the probability of decay of a specific atom at a specific moment. In more general terms, the frequency approach cannot deal with the probability of the death of a specific person, since that death cannot be repeated multiple times for that person. Karl Popper echoed the same sentiment as Aristotle in viewing randomness as subordinate to order when he wrote that "the concept of chance is not opposed to the concept of law" in nature, provided one considers the laws of chance.[45]

Claude Shannon's development of information theory in 1948 gave rise to the entropy view of randomness. In this view, randomness is the opposite of determinism in a stochastic process: a stochastic system with zero entropy has no randomness, and any increase in entropy increases randomness. Shannon's formulation reduces to Boltzmann's 19th-century formulation of entropy when all probabilities are equal.[46][47] Entropy is now widely used in diverse fields of science, from thermodynamics to quantum chemistry.[48]
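Shannon's entropy H = −Σ pᵢ log₂ pᵢ makes these statements concrete: a certain outcome carries zero entropy, and when all n outcomes are equally likely the formula reduces to log₂ n, mirroring Boltzmann's equiprobable form. A minimal sketch in Python (illustrative, not from the source):

```python
from math import log2

# Shannon entropy (in bits) of a discrete probability distribution.
def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# A certain outcome has zero entropy: no randomness at all.
assert entropy([1.0]) == 0.0

# With n equally likely outcomes the formula reduces to log2(n),
# matching Boltzmann's 19th-century equiprobable formulation.
print(entropy([0.25] * 4))        # 2.0, i.e. log2(4)

# A biased coin carries less than one bit of entropy per toss.
print(entropy([0.9, 0.1]) < 1.0)  # True
```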

Martingales for the study of chance and betting strategies were introduced by Paul Lévy in the 1930s and were formalized by Joseph L. Doob in the 1950s.[49] The application of random walk hypothesis in financial theory was first proposed by Maurice Kendall in 1953.[50] It was later promoted by Eugene Fama and Burton Malkiel.

Random strings were first studied in the 1960s by A. N. Kolmogorov (who had provided the first axiomatic definition of probability theory in 1933),[51] Chaitin and Martin-Löf.[52] The algorithmic randomness of a string was defined as the minimum size of a program (e.g. in bits) executed on a universal computer that yields the string. Chaitin's Omega number later related randomness and the halting probability for programs.[53]
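Kolmogorov complexity itself is uncomputable, but a general-purpose compressor gives a crude, computable upper bound on the minimum program size: a patterned string compresses far below its length, while a random-looking one does not. The sketch below uses zlib as such a proxy (the choice of compressor is this example's assumption, not a claim from the source):

```python
import random
import zlib

# Compressed length as a rough, computable stand-in for the minimal
# description length of a string (an upper-bound proxy only).
def compressed_size(data: bytes) -> int:
    return len(zlib.compress(data, level=9))

patterned = b"01" * 500                      # 1000 bytes, obvious rule
random.seed(0)                               # reproducible "noise"
noisy = bytes(random.randrange(256) for _ in range(1000))

# The patterned string admits a short description; the noisy one
# resists compression, the hallmark of algorithmic randomness.
print(compressed_size(patterned) < compressed_size(noisy))  # True
```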

In 1964, Benoît Mandelbrot suggested that most statistical models approached only a first stage of dealing with indeterminism, and that they ignored many aspects of real-world turbulence.[54][55] In his 1997 book he defined seven states of randomness ranging from "mild" to "wild", with traditional randomness at the mild end of the scale.[56]

Despite mathematical advances, reliance on other methods of dealing with chance, such as fortune telling and astrology, continued in the 20th century. The government of Myanmar reportedly shaped 20th-century economic policy based on fortune telling and planned the move of the capital of the country based on the advice of astrologers.[57][58][59] White House Chief of Staff Donald Regan criticized the involvement of astrologer Joan Quigley in decisions made during Ronald Reagan's presidency in the 1980s.[60][61][62] Quigley claimed to have been the White House astrologer for seven years.[63]

During the 20th century, limits in dealing with randomness were better understood. The best-known example of both theoretical and operational limits on predictability is weather forecasting, simply because models have been used in the field since the 1950s. Predictions of weather and climate are necessarily uncertain. Observations of weather and climate are uncertain and incomplete, and the models into which the data are fed are uncertain.[64] In 1961, Edward Lorenz noticed that a very small change to the initial data submitted to a computer program for weather simulation could result in a completely different weather scenario. This later became known as the butterfly effect, often paraphrased as the question: "Does the flap of a butterfly’s wings in Brazil set off a tornado in Texas?".[65] A key example of serious practical limits on predictability is in geology, where the ability to predict earthquakes either on an individual or on a statistical basis remains a remote prospect.[66]
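Sensitive dependence on initial conditions is easy to demonstrate with a toy chaotic system; the sketch below uses the logistic map rather than Lorenz's weather model (a deliberate simplification, not the original experiment). Two trajectories starting 10⁻¹⁰ apart soon differ by many orders of magnitude:

```python
# Iterate the chaotic logistic map x -> 4x(1 - x), recording the
# whole trajectory so the divergence of nearby starts is visible.
def trajectory(x, steps=100):
    xs = [x]
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
        xs.append(x)
    return xs

t1 = trajectory(0.2)
t2 = trajectory(0.2 + 1e-10)   # a "butterfly-sized" perturbation

# The separation between the trajectories grows roughly
# exponentially, the essence of the butterfly effect.
gap = max(abs(a - b) for a, b in zip(t1, t2))
print(gap > 1e-4)              # True: the tiny perturbation exploded
```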

In the late 1970s and early 1980s, computer scientists began to realize that the deliberate introduction of randomness into computations can be an effective tool for designing better algorithms. In some cases, such randomized algorithms outperform the best deterministic methods.[33]
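A standard illustration of such a randomized algorithm (chosen here as an example; the source does not name a specific one) is the Miller–Rabin primality test: each random base either proves a number composite or leaves at most a 1/4 chance of being fooled, so k rounds drive the error probability below 4⁻ᵏ.

```python
import random

# Miller–Rabin probabilistic primality test.
def is_probably_prime(n, k=20):
    if n < 2:
        return False
    for p in (2, 3, 5, 7):          # handle small factors directly
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:               # write n - 1 as d * 2**s
        d //= 2
        s += 1
    for _ in range(k):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False            # a is a witness: n is composite
    return True                     # probably prime (error <= 4**-k)

print(is_probably_prime(2**61 - 1))  # True: a Mersenne prime
print(is_probably_prime(2**61 + 1))  # False: divisible by 3
```

No deterministic test of comparable simplicity matches this speed on numbers of this size, which is exactly the trade-off the paragraph above describes.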

Notes

  1. ^ Adkins 1998, p. 279.
  2. ^ Johnston 2004, p. 370.
  3. ^ Hansen 1991, p. 230.
  4. ^ Beltrami 1999, pp. 2–4.
  5. ^ Jenkins 2004, p. 194.
  6. ^ Mccormick 2007, p. 158.
  7. ^ Kramer 1983, p. 313.
  8. ^ Bennett 1998, pp. 8–9, 24.
  9. ^ Hromkovič 2005, p. 1.
  10. ^ Sachs & Aristotle 1995, p. 70.
  11. ^ Hald 2003, p. 30.
  12. ^ Rist 1972, p. 52.
  13. ^ Reith 2000, p. 15.
  14. ^ Beltrami 1999, pp. 3–4.
  15. ^ Hald 2003, pp. 29–36.
  16. ^ Priestley 1804, p. 11.
  17. ^ Graham 1909.
  18. ^ Vaughn & Dacey 2003, p. 81.
  19. ^ Aquinas 2006, p. 198.
  20. ^ Hald 2003, pp. 30–4.
  21. ^ McGrath & Traverspage 1999, p. 893.
  22. ^ Bennett 1998, p. 8.
  23. ^ Daintith & Gjertsen 1999, p. 88.
  24. ^ Hald 2003, p. 41.
  25. ^ Chaitin 2007, p. 242.
  26. ^ Schneider 2005.
  27. ^ Lach & Van Kley 1998, p. 1660.
  28. ^ Crowe 1996, p. 36.
  29. ^ Findling & Thackeray 2000, p. 168.
  30. ^ Galton, Francis (1894), Natural Inheritance, Macmillan, ISBN 978-1297895982, p. 66
  31. ^ Cropper 2004, p. 93.
  32. ^ Wiin-Nielsen 1998, p. 3.
  33. ^ a b Hromkovič 2005, p. 4.
  34. ^ Trefil 2001, p. cxxxiii.
  35. ^ Hájek 2019.
  36. ^ Borel 1909.
  37. ^ Bienvenu, Shafer & Shen 2009, pp. 3–4.
  38. ^ Grattan-Guinness 2003, p. 1412.
  39. ^ Keuth 2004, p. 171.
  40. ^ Church 1940.
  41. ^ Coffa 1974, p. 106.
  42. ^ Zambrini & Chung 2003.
  43. ^ Popper 1957.
  44. ^ Popper 1959b.
  45. ^ Popper 1959a, p. 170.
  46. ^ Weiss 1999, p. 83.
  47. ^ Kåhre 2002, p. 218.
  48. ^ Lipkowitz 2007, p. 279.
  49. ^ Borovskikh 1997, p. 287.
  50. ^ Kendall & Hill 1953.
  51. ^ Jaynes 2003, p. 49.
  52. ^ Calude 2002, p. 145.
  53. ^ Chaitin 2007, p. 185.
  54. ^ Mandelbrot 2001, p. 20.
  55. ^ Mirowski 2004, p. 255.
  56. ^ Mandelbrot 1997, p. 136–142.
  57. ^ Perry 2007, p. 10.
  58. ^ Ramachandran & Win 2009.
  59. ^ Mydans 2005.
  60. ^ Seaman 1988.
  61. ^ Levy 1996, p. 25.
  62. ^ Pemberton 1997, p. 123.
  63. ^ Quigley 1990.
  64. ^ Palmer & Hagedorn 2006, p. 1.
  65. ^ Mathis 2007, p. x.
  66. ^ Knopoff 1996, p. 3720.

References

  • Adkins, Lesley (July 1998). Handbook to Life in Ancient Rome. Oxford University Press. ISBN 0-19-512332-8.
  • Aquinas, Saint Thomas (2006). The Treatise on the Divine Nature Summa Theologiae I l-13. Translated by Shanley, Brian J. Hackett Publishing Company. ISBN 0-87220-805-2.
  • Beltrami, Edward J. (1999). What is Random?: Chance and Order in Mathematics and Life. Springer. ISBN 0-387-98737-1.
  • Borovskikh, Yu. V. (December 1997). Martingale approximation. Brill. ISBN 90-6764-271-1.
  • Calude, Cristian (2002). Information and Randomness: an Algorithmic Perspective. Springer. ISBN 3-540-43466-6.
  • Chaitin, Gregory J. (2007). THINKING ABOUT GÖDEL AND TURING: Essays on Complexity, 1970-2007. World Scientific Publishing Company. ISBN 978-981-270-896-0.
  • Cropper, William H. (2004). Great physicists: The Life and Times of Leading Physicists from Galileo to Hawking. Oxford University Press. ISBN 9780195173246.
  • Crowe, David M. (January 1996). A History of the Gypsies of Eastern Europe and Russia. Palgrave Macmillan. ISBN 0-312-12946-7.
  • Daintith, John; Gjertsen, Derek (1999). A Dictionary of Scientists. Oxford University Press. ISBN 0-19-280086-8.
  • David, H.A.; Edwards, A.W.F. (December 2001). Annotated Readings in the History of Statistics. Springer. ISBN 9780387988443.
  • Findling, John E.; Thackeray, Frank W. (2000). Events that Changed America through the Seventeenth Century. Greenwood. ISBN 0-313-29083-0.
  • Graham, E. (1909). Divination. The Catholic Encyclopedia. Retrieved 15 April 2022.
  • Grattan-Guinness, Ivor (August 2003). Companion Encyclopedia of the History and Philosophy. Vol. 2. Johns Hopkins University Press. ISBN 9780801873973.
  • Hald, Anders (September 2003). A History of Probability and Statistics and Their Applications before 1750. Wiley-Interscience. ISBN 0-471-47129-1.
  • Hromkovič, Juraj (December 2005). Design and Analysis of Randomized Algorithms: Introduction to Design Paradigms. Springer. ISBN 3-540-23949-9.
  • Jenkins, John Michael (2004). Encyclopedia of Leisure and Outdoor Recreation. Routledge. ISBN 0-415-25226-1.
  • Johnston, Sarah Iles (November 2004). Religions of the Ancient World: A Guide. Belknap Press: An Imprint of Harvard University Press. ISBN 0-674-01517-7.
  • Kåhre, Jan (2002). The Mathematical Theory of Information. Springer. ISBN 1-4020-7064-0.
  • Kendall, M. G.; Hill, A. B. (1953). "The Analysis of Economic Time-Series-Part I: Prices". Journal of the Royal Statistical Society. Series A (General). 116 (1): 11–34. doi:10.2307/2980947. JSTOR 2980947.
  • Keuth, Herbert (December 2004). The Philosophy of Karl Popper. Cambridge University Press. ISBN 9780521548304.
  • Kramer, Edna Ernestine (1983). The Nature and Growth of Modern Mathematics. Princeton University Press. ISBN 9780691023724.
  • Lach, Donald Frederick; Van Kley, Edwin J. (December 1998). Asia in the Making of Europe, Volume III: A Century of Advance. Book 4: East Asia. Vol. 3. University of Chicago Press. ISBN 0-226-46769-4.
  • Levy, Peter B. (1996). Encyclopedia of the Reagan-Bush Years. Greenwood. ISBN 0-313-29018-0.
  • Lipkowitz, Kenneth B. (December 2007). Reviews in Computational Chemistry (Volume 23 ed.). Wiley-VCH. ISBN 978-0-470-08201-0.
  • Mandelbrot, Benoit B. (September 1997). Fractals and Scaling in Finance: Discontinuity, Concentration, Risk. Springer. ISBN 0-387-98363-5.
  • Mandelbrot, Benoit B. (2001). Gaussian Self-Affinity and Fractals: Globality, The Earth, 1/f Noise, and R/S. Springer. ISBN 0-387-98993-5.
  • Mathis, Nancy (March 2007). Storm Warning: The Story of a Killer Tornado. Touchstone. ISBN 978-0-7432-8053-2.
  • Mccormick, Elise (December 2007). Audacious Angles Of China. Read Books. ISBN 978-1-4067-5332-5.
  • McGrath, Kimberley A.; Traverspage, Bridget (December 1999). World of Scientific Discovery. Gale / Cengage Learning. ISBN 0-7876-2760-7.
  • Mirowski, Philip (2004). The Effortless Economy of Science?. Duke University Press Books. ISBN 0-8223-3322-8.
  • Palmer, Tim; Hagedorn, Renate (2006). Predictability of Weather and Climate. Cambridge University Press. ISBN 0-521-84882-2.
  • Pemberton, William E. (December 1997). Exit with Honor: The Life and Presidency of Ronald Reagan (Right Wing in America). Routledge. ISBN 0-7656-0095-1.
  • Perry, Peter John (2007). Myanmar (Burma) since 1962: the failure of development. Routledge. ISBN 978-0-7546-4534-4.
  • Popper, Karl Raimund (1957). "The propensity interpretation of the calculus of probability and the quantum theory". In Stephan Körner (ed.). Observation and Interpretation. Butterworths. pp. 65–70.
  • Priestley, Joseph (1804). A General History of the Christian Church, Vol. 2 of 2: To the Fall of the Western Empire. Vol. 2.
  • Quigley, Joan Ceciel (1 March 1990). What Does Joan Say?: My Seven Years as White House Astrologer to Nancy and Ronald Reagan. Carol Publishing Group. New York, NY.
  • Reith, Gerda (2000). The Age of Chance: Gambling in Western Culture (Routledge Studies in Social and Political Thought). Routledge. ISBN 0-415-17997-1.
  • Sachs, Joe; Aristotle (March 1995). Aristotle's Physics: A Guided Study. Rutgers University Press. ISBN 0-8135-2192-0.
  • Schneider, Ivo (2005). "Chapter 7 - Abraham de Moivre, The doctrine of chances (1718, 1738, 1756)". In I. Grattan-Guinness (ed.). Landmark Writings in Western Mathematics 1640-1940. Amsterdam: Elsevier Science. pp. 105–120. doi:10.1016/B978-044450871-3/50088-7. ISBN 9780444508713.
  • Seaman, Barrett (16 May 1988). Good Heavens!. Time Magazine. Retrieved 17 April 2022.
  • Trefil, James S. (2001). The Encyclopedia of Science and Technology. Routledge. ISBN 0-415-93724-8.
  • Vaughn, Lewis; Dacey, Austin (December 2003). The Case For Humanism: An Introduction. Rowman & Littlefield Publishers. ISBN 0-7425-1393-9.
  • Weiss, Benjamin (1999). Single Orbit Dynamics (Cbms Regional Conference Series in Mathematics). American Mathematical Society. ISBN 0-8218-0414-6.
  • Zambrini, Jean-Claude; Chung, Kai Lai (2003). Introduction to Random Time and Quantum Randomness. World Scientific Publishing Comp. ISBN 981-238-415-4.

See also

Chaparro, Luis F. (April 2020). "A brief history of randomness".

Sheynin, O.B. (1991). "The notion of randomness from Aristotle to Poincaré" (PDF). Mathématiques et sciences humaines. 114: 41–55.