Reviewed by Daniel Courgeau

Gorroochurn Prakash, 2016, Classic Topics on the History of Modern Mathematical Statistics: From Laplace to More Recent Times, Hoboken, NJ, John Wiley & Sons, Inc., 754 p.

This history of modern mathematical statistics retraces its development from the "Laplacean revolution," as the author aptly calls it (though its beginnings are to be found in Bayes' 1763 essay(1)), through the mid-twentieth century and Fisher's major contribution. Up to the nineteenth century, the book covers the same ground as Stigler's history of statistics(2), though with notable differences (see below). It then turns to developments in the first half of the twentieth century: Fisher's synthesis, but also the revival of Bayesian methods, which marked a return to Laplace.

Part I offers an in-depth, chronological account of Laplace's approach to probability, with all the mathematical detail and deductions he drew from it. It begins with his first innovative articles and concludes with his philosophical synthesis showing that all fields of human knowledge are connected to the theory of probabilities.

Here Gorroochurn raises a problem that Stigler does not: induction (pp. 102-113), a notion that gives us a better understanding of probability according to Laplace. The term induction has two meanings, the first put forward by Bacon(3) in 1620, the second by Hume(4) in 1748; Gorroochurn discusses only the second. For Bacon, induction meant discovering the principles of a system by studying its properties through observation and experimentation. For Hume, induction was mere enumeration and could not lead to certainty. Laplace followed Bacon: "The surest method which can guide us in the search for truth, consists in rising by induction from phenomena to laws and from laws to forces"(5). To my knowledge, he never cited Hume, though Hume's work had been translated into French by 1758. For Laplace, probability was a new way of reasoning on the basis of partial knowledge of the phenomena under study. His "rising of the sun" example should of course be understood in connection with the hypothesis that the phenomenon has been observed for only five thousand years. But, as Laplace clearly indicates, knowledge of the regulating principle behind the phenomenon enables us to make a much more precise estimate. Moreover, the assumption here of a uniform a priori distribution is not a blind metaphysical assumption, as Gorroochurn seems to think, but a reasonable one, and Laplace uses non-uniform a priori distributions in other examples (cf. Stigler, 1986, pp. 135-136). Here, since there are only two possibilities (the sun will either rise tomorrow or it will not), the principle of indifference applies perfectly. None of the critics Gorroochurn cites seems to have grasped this point; all appear to have accepted Hume's understanding of induction.
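
For readers who want the arithmetic behind the sunrise example, here is a minimal sketch of Laplace's rule of succession, assuming (as in this example) a uniform a priori distribution on the unknown probability p of the event. After s successes in n trials,

$$
P(\text{success at trial } n+1 \mid s \text{ successes in } n \text{ trials})
  = \frac{\int_0^1 p^{\,s+1}(1-p)^{\,n-s}\,dp}{\int_0^1 p^{\,s}(1-p)^{\,n-s}\,dp}
  = \frac{s+1}{n+2}.
$$

With s = n = 1,826,213 days (the 5,000 years of recorded observation that Laplace assumes), the rule gives odds of 1,826,214 to 1 in favour of tomorrow's sunrise; Laplace's point, echoed above, is that knowledge of celestial mechanics warrants a far stronger estimate than this bare enumeration.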

Part II, "From Galton to Fisher", focuses on the adoption of a fundamentally frequentist approach, one opposed to Laplace's and based on Hume's principle of induction, though the author does not clearly say so. The researchers who developed it were seeking a statistical approach suited to the biological and social sciences. Though each of them worked in several of these fields, Quételet and Lexis can be associated with population science, Galton and Pearson with the study of heredity and biometry, Edgeworth and Yule with economics, Fisher with biology and genetics, and so forth.

Laplace's methods were applied either to astronomical or geodesic data, fields that had already been theorized, or to simple data whose probability law was already established, such as the sex ratio at birth (binomial law). In the life and social sciences, the difficulty lies in the mass of causes behind the phenomena under study and their nontrivial effects: given that the hypothesis of population homogeneity is untenable, how can observational complexity be taken into account? The entire effort of these statisticians was to devise tools (correlation, regression analysis, multivariate analysis, contingency tables, and others) to disentangle causal ties, as the sketch below illustrates. This analysis culminated in Fisher's theory of statistical estimation, which Gorroochurn describes in...
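
As a purely illustrative aside (the code and the heights are mine, not the book's), here is a minimal Python sketch of two of the tools just listed, Pearson correlation and least-squares regression, applied to Galton-style parent/child height data:

    import numpy as np

    # Invented mid-parent and child heights (inches), for illustration only
    parent = np.array([64.0, 66.0, 68.0, 70.0, 72.0])
    child = np.array([66.0, 66.5, 68.0, 68.5, 70.0])

    # Pearson correlation: covariance scaled by both standard deviations
    r = np.corrcoef(parent, child)[0, 1]

    # Least-squares fit of child height on mid-parent height
    slope, intercept = np.polyfit(parent, child, 1)

    print(f"r = {r:.3f}; child ~= {slope:.2f} * parent + {intercept:.1f}")

A fitted slope below 1, as here, is Galton's "regression toward the mean": exceptionally tall or short parents tend to have children closer to the average.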
