
Acknowledgments

It is a great pleasure to thank Achim Weiguny for his support throughout this work, his careful reading of the manuscript, and him and Joerg Uhlig for the enjoyable collaboration. Essential parts of this work are based on the following articles of the author and these two collaborators: [291-300, 303-306]. I am also very grateful to Tomaso Poggio for the opportunity to work in the most stimulating atmosphere of his group, and to him and the members of his group, in particular Federico Girosi, for many fruitful discussions. Many of the ideas worked out in this paper originate from that visit at the Massachusetts Institute of Technology, which was made possible by a Postdoctoral Fellowship (Le 1014/1-1) from the Deutsche Forschungsgemeinschaft and an NSF/CISE Postdoctoral Fellowship. Special thanks go to Gernot Münster, Manfred Stingl, Christian Wieczerkowski, Klaus Finn, and all the other members of the Institut für Theoretische Physik for the many interesting discussions about field theory in seminars and at lunch time. And finally I want to express my sincere thanks to my wife Anja, as well as to my son Marius and my daughter Julia, for their support and patience during the time this work had to be completed.

Abstract

Bayesian field theory stands for a nonparametric Bayesian approach to learning from observational data. Based on the principles of Bayesian statistics, a particular Bayesian field theory is defined by the combination of two models: 1) a likelihood model, providing a probabilistic description of the measurement process for the observational ('training') data and characterizing the area for which application of the theory is intended, and 2) a prior model, providing the information which is necessary to generalize from training to non-training data.
The particular likelihood models discussed in this book are those of general density estimation, Gaussian regression, clustering, classification, or pattern recognition, as well as specific models of inverse quantum theory. These models represent the most common problem types of empirical learning, studied in many disciplines like applied statistics, artificial intelligence, and physics. Prior models have to implement problem-typical hard constraints, like normalization and non-negativity for probabilities, and also all the vague and probabilistic a priori knowledge available for a specific task. Due to their flexibility, nonparametric approaches rely much more on an adequate implementation of a priori information than typical parametric methods. The book therefore intends to provide a tool box to deal with a priori information in nonparametric models. The nonparametric prior models treated in this book include Gaussian processes, mixtures of Gaussian processes, non-quadratic potentials, as well as so-called hyperparameters, hyperfields, and auxiliary fields, all of which can be seen as specific statistical field theories. For applications, a collection of practical methods is developed to adapt prior models. In particular, the adaption of mean functions and covariance operators of Gaussian process components is discussed in detail. Bayesian field theories are typically non-Gaussian and thus have to be solved numerically. Thanks to increasing computational resources, the class of non-Gaussian Bayesian field theories of practical interest which are numerically feasible is steadily growing. Where a direct numerical solution is computationally too demanding, a variety of approximation methods based on variational techniques is available.
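As a concrete illustration of the two-model combination described above — a Gaussian likelihood for the training data joined with a Gaussian process prior — the following is a minimal sketch, not taken from the book. The RBF kernel, the length scale, and the noise variance are illustrative assumptions; the posterior mean is the standard Gaussian process regression formula K*(K + σ²I)⁻¹y.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0):
    """Squared-exponential (RBF) prior covariance between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise_var=0.1):
    """Posterior mean of a GP with RBF prior and Gaussian observation noise.

    Combines the likelihood model (noise_var on the diagonal) with the
    prior model (the kernel) into the predictive mean K*(K + sigma^2 I)^-1 y.
    """
    K = rbf_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    K_star = rbf_kernel(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)

# Toy usage: regress noiseless samples of sin(x) and predict at new points.
x = np.array([0.0, 1.0, 2.0])
y = np.sin(x)
x_new = np.array([0.5, 1.5])
mean = gp_posterior_mean(x, y, x_new)
```

For the Gaussian-Gaussian pairing this posterior is available in closed form; the non-Gaussian likelihood models the abstract mentions (density estimation, classification, inverse quantum theory) lack such a solution, which is why numerical and variational methods become necessary.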
