Preface

This book surveys basic modern techniques for the numerical solution of linear and nonlinear least squares problems and introduces the treatment of large and ill-conditioned problems. The theory is extensively illustrated with examples from engineering, the environmental sciences, geophysics, and other application areas. In addition to treating the numerical aspects of least squares problems, we introduce some important topics from the area of regression analysis in statistics, which can help to motivate, understand, and evaluate the computed least squares solutions. The inclusion of these topics is one aspect that distinguishes the present book from other books on the subject.

The presentation of the material is designed to give an overview, with the goal of helping the reader decide which method is appropriate for a given problem, pointing toward available algorithms and software, and, if necessary, helping to modify the available tools to adapt them to a given application. The emphasis is therefore on the properties of the different algorithms, and few proofs are presented; the reader is instead referred to the appropriate articles and books. Unfortunately, several important topics had to be left out, among them direct methods for sparse problems.

The content is geared toward scientists and engineers who must analyze and solve least squares problems in their fields. It can be used as course material for an advanced undergraduate or graduate course in the sciences and engineering, presupposing a working knowledge of linear algebra and basic statistics. It is written mostly in a terse style in order to provide a quick introduction to the subject, while treating some of the less well-known topics in more depth. This in fact presents the reader with an opportunity to verify their understanding of the material by completing or providing the proofs without checking the references.

The least squares problem is known under different names in different disciplines.
One of our aims is to help bridge the communication gap between the statistics and numerical analysis literatures on the subject, which is often due to the use of different terminology, such as l2-approximation, regularization, regression analysis, parameter estimation, filtering, process identification, etc.

Least squares methods have been with us for many years, since Gauss invented and used them in his surveying activities [83]. In 1965, the paper by G. H. Golub [92] on using the QR factorization, and later his development of a stable algorithm for the computation of the SVD, started a renewed interest in the subject in the by then changed work environment of computers. Thanks also to, among many others, Å. Björck, L. Eldén, C. C. Paige, M. A. Saunders, G. W. Stewart, S. van Huffel, and P.-Å. Wedin, the topic is now available in a robust, algorithmic, and well-founded form.

There are many books partially or completely dedicated to linear and nonlinear least squares. The first, and one of the fundamental references for linear problems, is Lawson and Hanson's monograph [150]. Besides summarizing the state of the art at the time of its publication, it highlighted the practical aspects of solving least squares problems. Bates and Watts [9] wrote an early comprehensive book focused on the nonlinear least squares problem with a strong statistical approach. Björck's book [20] contains a very careful and comprehensive survey of numerical methods for both linear and nonlinear problems, including the treatment of large, sparse problems. Golub and Van Loan's Matrix Computations [105] includes several chapters on different aspects of least squares solutions and on total least squares. The total least squares problem, known in statistics as latent root regression, is discussed in the book by S. van Huffel and J. Vandewalle [239]. Seber and Wild [223] consider exhaustively all aspects of nonlinear least squares estimation and modeling.
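As a concrete illustration of the QR-based approach to linear least squares mentioned above, here is a minimal NumPy sketch (the data are hypothetical, chosen so the fit is exact) that solves an overdetermined system by factoring A = QR and back-solving; this is only a sketch of the idea, not the book's algorithms:

```python
import numpy as np

# Hypothetical overdetermined system: fit y = c0 + c1*t to four samples
# generated exactly by y = 1 + 2t, so the residual is zero.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 3.0, 5.0, 7.0])

# QR approach: A = QR with Q having orthonormal columns, R upper triangular;
# the least squares solution of min ||Ax - b||_2 solves R x = Q^T b.
Q, R = np.linalg.qr(A)
x = np.linalg.solve(R, Q.T @ b)
print(x)  # → [1. 2.]
```

Unlike forming the normal equations A^T A x = A^T b, the QR route avoids squaring the condition number of A, which is why it became the standard dense method.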
Although it is a general treatise on optimization, Nocedal and Wright's book [170] includes a very clear chapter on nonlinear least squares. Additional material can be found in [21, 63, 128, 232, 242, 251].

We would like to acknowledge the help of Michael Saunders (iCME, Stanford University), who carefully read the whole manuscript and made a myriad of observations and corrections that have greatly improved the final product. Per Christian Hansen would like to thank several colleagues from DTU Informatics who assisted with the statistical aspects. Godela Scherer is grateful for all the support at the Department of Mathematics and Statistics, University of Reading, where she was a visiting research fellow while working on this book.