Bayesian Field Theory
Publication Year: 2003
Published by: The Johns Hopkins University Press
Title Page, Copyright
Numerical Case Studies
Acknowledgments and Abstract
It is a great pleasure to thank Achim Weiguny for his support throughout this work and his careful reading of the manuscript, and to thank him and Joerg Uhlig for the enjoyable collaboration. Essential parts of this work are based on the following articles by the author and these two collaborators...
Chapter 1. Introduction
Due to increasing computational resources, the last decade has seen a rapidly growing interest in applied empirical learning problems. They appear, for example, as density estimation, regression or classification problems and include, just to name a few, image reconstruction, speech recognition, time series prediction, object recognition...
Chapter 2. Bayesian framework
Looking for a scientific explanation of some phenomenon means searching for a causal model that relates the relevant observations under study. To define a causal structure, observable (or visible) variables are separated into dependent variables ('measured effects', 'answers') and independent variables ('controlled causes', 'questions'). Dependent...
Chapter 3. Gaussian prior factors
In this chapter nonparametric density estimation problems will be studied using Gaussian prior factors. The aim is to show that Gaussian prior factors are not only convenient from a technical point of view but are also quite flexible and can be adapted in many ways to a specific learning task. Using this flexibility to implement...
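The role a Gaussian prior factor plays in nonparametric density estimation can be illustrated with a minimal sketch, not taken from the book: a density is discretized on a grid, and the MAP estimate minimizes the negative log-likelihood plus a quadratic (Gaussian) smoothness energy. Grid size, prior strength `lam`, and the synthetic data are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic 1-D data (illustrative only)
rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=50)

x = np.linspace(-4, 4, 80)                 # discretization grid
dx = x[1] - x[0]
D = np.diff(np.eye(len(x)), axis=0) / dx   # finite-difference operator
K = D.T @ D                                # inverse prior covariance (smoothness)
lam = 0.5                                  # prior strength (assumed)

idx = np.clip(np.searchsorted(x, data), 0, len(x) - 1)

def energy(phi):
    # Softmax parameterization keeps the density positive and normalized
    p = np.exp(phi - phi.max())
    p /= p.sum() * dx
    nll = -np.sum(np.log(p[idx]))          # likelihood term
    prior = 0.5 * lam * phi @ K @ phi      # Gaussian prior factor
    return nll + prior

res = minimize(energy, np.zeros(len(x)), method="L-BFGS-B")
p_map = np.exp(res.x - res.x.max())
p_map /= p_map.sum() * dx                  # MAP density estimate on the grid
```

Larger `lam` enforces smoother estimates; `lam -> 0` recovers an unregularized histogram-like fit.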
Chapter 4. Parameterizing likelihoods: Variational methods
In this sense a MAP with a parametric model can be interpreted as a variational approach for a MAP for a nonparametric Bayesian problem. Clearly, minimal values obtained by minimization within a trial space can only be larger than or equal to the true minimal value, and from two variational approximations the one with smaller error is the...
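The stated bound can be checked numerically with a toy quadratic energy (an illustrative construction, not from the book): minimizing over a restricted trial space, here the span of two fixed basis vectors standing in for a parametric model, can never undercut the unrestricted minimum.

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(5, 5))
A = M @ M.T + 5 * np.eye(5)          # symmetric positive definite
b = rng.normal(size=5)

# Unrestricted minimizer of E(v) = 1/2 v^T A v - b^T v
v_full = np.linalg.solve(A, b)
E_full = 0.5 * v_full @ A @ v_full - b @ v_full

# Trial space: span of two fixed basis vectors (a "parametric model")
B = rng.normal(size=(5, 2))
c = np.linalg.solve(B.T @ A @ B, B.T @ b)
v_trial = B @ c
E_trial = 0.5 * v_trial @ A @ v_trial - b @ v_trial
# E_trial >= E_full always holds: the trial space is a subset
```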
Chapter 5. Parameterizing priors: Hyperparameters
The quality of nonparametric Bayesian approaches depends mainly on the adequate implementation of problem specific a priori information. Especially complex tasks with relatively few training data available, for example, in speech recognition or image reconstruction, require task specific priors. Choosing a simple Gaussian smoothness...
Chapter 6. Mixtures of Gaussian prior factors
Non-Gaussian prior factors which correspond to multimodal energy surfaces can be constructed or approximated by using mixtures of simpler prior components. In particular, it is convenient to use Gaussian densities as components or 'building blocks', since then many useful results obtained for Gaussian processes survive the generalization...
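A minimal sketch of the idea, with illustrative numbers not taken from the book: the prior energy of a two-component Gaussian mixture, E = -ln(sum of Gaussian factors), is multimodal, with one local minimum near each component template.

```python
import numpy as np

centers = np.array([-2.0, 2.0])   # component "template" values (assumed)
widths = np.array([1.0, 1.0])
weights = np.array([0.5, 0.5])

def prior_energy(phi):
    # E(phi) = -ln sum_k w_k exp(-(phi - t_k)^2 / (2 s_k^2))
    comp = weights * np.exp(-((phi - centers) ** 2) / (2 * widths ** 2))
    return -np.log(comp.sum())

phis = np.linspace(-4, 4, 401)
E = np.array([prior_energy(p) for p in phis])

# Locate interior local minima of the energy surface
minima = phis[(E < np.roll(E, 1)) & (E < np.roll(E, -1))]
```

The single-component case reduces to the usual quadratic (Gaussian) energy; adding components introduces the additional minima that make the prior multimodal.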
Chapter 7. Bayesian inverse quantum theory (BIQT)
The problem addressed in this chapter is the reconstruction of the Hamiltonians or potentials of quantum systems from observational data. Finding such 'causes' or 'laws' from a finite number of observations constitutes an inverse problem and is typically ill-posed in the sense of Hadamard...
Chapter 8. Summary
In this book we wanted to develop a tool box for constructing prior models within a nonparametric Bayesian framework for empirical learning, and to exemplify its use for problems from different application areas. Nonparametric models, or field theories in the language of physics, typically allow a more explicit implementation of...
Appendix A: A priori information and a posteriori control
Appendix B: Probability, free energy, energy, information, entropy, and temperature
Appendix C: Iteration procedures: Learning
Page Count: 432
OCLC Number: 847594358