Cover

pp. 1-1

Title Page, Copyright

pp. 2-5

Contents

pp. v-ix

Figures

pp. xi-xiv

Tables

pp. xv-xvi

Numerical Case Studies

pp. xvii-xviii

Acknowledgments and Abstract

pp. xix-xx

It is a great pleasure to thank Achim Weiguny for his support throughout this work and his careful reading of the manuscript, and to thank him and Joerg Uhlig for the enjoyable collaboration. Essential parts of this work are based on the following articles by the author and these two collaborators...

Chapter 1. Introduction

pp. 3-8

Owing to increasing computational resources, the last decade has seen a rapidly growing interest in applied empirical learning problems. They appear, for example, as density estimation, regression, or classification problems and include, to name just a few, image reconstruction, speech recognition, time series prediction, object recognition...

Chapter 2. Bayesian framework

pp. 9-84

Looking for a scientific explanation of some phenomenon means searching for a causal model that relates the relevant observations under study. To define a causal structure, observable (or visible) variables are separated into dependent variables ('measured effects', 'answers') and independent variables ('controlled causes', 'questions'). Dependent...

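For orientation, the basic relation such a framework builds on can be written, in generic notation (the symbols here are illustrative and not necessarily the book's), as

\[
p(h \mid D) \;\propto\; \prod_i p(y_i \mid x_i, h)\, p_0(h),
\qquad
p(y \mid x, D) \;=\; \int \! dh \; p(y \mid x, h)\, p(h \mid D),
\]

where D = {(x_i, y_i)} are the observed question-answer pairs, h denotes the hidden variables of the model, p_0(h) is the prior, and the second expression is the predictive density used to answer new questions x.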

Chapter 3. Gaussian prior factors

pp. 85-165

In this chapter, nonparametric density estimation problems are studied using Gaussian prior factors. The aim is to show that Gaussian prior factors are not only convenient from a technical point of view, but are also quite flexible and can be adapted in many ways to a specific learning task. Using this flexibility to implement...

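As a concrete illustration of what a Gaussian prior factor does technically, the following sketch (an assumed setup, not code from the book; grid size, noise level, and prior weight are made-up values) computes the MAP estimate of a regression function under a smoothness prior factor exp(-(lambda/2) phi^T K phi), with K built from a first-difference operator:

    import numpy as np

    rng = np.random.default_rng(0)
    n_grid = 100
    x_grid = np.linspace(0.0, 1.0, n_grid)

    # training data: noisy samples of an unknown function (made-up example)
    x_data = rng.uniform(0.0, 1.0, 20)
    y_data = np.sin(2 * np.pi * x_data) + 0.1 * rng.normal(size=20)

    # selection matrix S maps each data point to its nearest grid node
    idx = np.searchsorted(x_grid, x_data).clip(0, n_grid - 1)
    S = np.zeros((len(x_data), n_grid))
    S[np.arange(len(x_data)), idx] = 1.0

    # first-difference operator D; K = D^T D penalizes rough functions
    D = np.diff(np.eye(n_grid), axis=0)
    K = D.T @ D

    sigma2, lam = 0.1 ** 2, 1e-2            # noise variance and prior weight (assumed)
    A = S.T @ S / sigma2 + lam * K          # Hessian of the likelihood-plus-prior energy
    b = S.T @ y_data / sigma2
    phi_map = np.linalg.solve(A, b)         # MAP estimate of the function on the grid

Because likelihood and prior factor are both Gaussian in phi, the MAP estimate follows from a single linear solve; this is the technical convenience, and the choice of K the flexibility, that the chapter elaborates on.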

Chapter 4. Parameterizing likelihoods: Variational methods

pp. 167-186

In this sense, a MAP approach with a parametric model can be interpreted as a variational approach to the MAP solution of a nonparametric Bayesian problem. Clearly, minimal values obtained by minimization within a trial space can only be larger than or equal to the true minimal value, and of two variational approximations the one with the smaller error is the...

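The inequality referred to here can be stated compactly (in generic notation): for a trial space contained in the full space,

\[
\min_{h \in \mathcal{H}_{\rm trial}} E(h) \;\ge\; \min_{h \in \mathcal{H}} E(h),
\qquad \mathcal{H}_{\rm trial} \subseteq \mathcal{H},
\]

since every trial function is also an element of the full space; restricting the minimization can therefore never lower, only raise, the attainable minimum.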

Chapter 5. Parameterizing priors: Hyperparameters

pp. 187-227

The quality of nonparametric Bayesian approaches depends mainly on the adequate implementation of problem-specific a priori information. Especially complex tasks with relatively few training data available, for example in speech recognition or image reconstruction, require task-specific priors. Choosing a simple Gaussian smoothness...

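Schematically, hyperparameters add one more level to the prior; in generic notation (theta might stand, for instance, for the weight or the length scale of a Gaussian smoothness term),

\[
p(\phi, \theta) \;=\; p(\phi \mid \theta)\, p(\theta),
\qquad
p(\phi \mid D) \;\propto\; \int \! d\theta \; p(D \mid \phi)\, p(\phi \mid \theta)\, p(\theta),
\]

so the data can shift weight between differently parameterized priors instead of committing to a single fixed one.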

Chapter 6. Mixtures of Gaussian prior factors

pp. 229-256

Non-Gaussian prior factors which correspond to multimodal energy surfaces can be constructed or approximated by using mixtures of simpler prior components. In particular, it is convenient to use Gaussian densities as components or 'building blocks', since then many useful results obtained for Gaussian processes survive the generalization...

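A small numerical sketch of this construction (an assumed setup with made-up templates, weights, and operator scale, not the book's code): the energy of a mixture of two Gaussian prior factors, built around template functions t1 and t2, is low near either template and high in between, i.e. multimodal:

    import numpy as np

    # Energy of a prior built as a mixture of two Gaussian factors with
    # template functions t1, t2 and a common operator K.
    # E0 = -log sum_k c_k exp(-E_k) is in general multimodal.

    n = 50
    x = np.linspace(0.0, 1.0, n)
    t1, t2 = np.sin(2 * np.pi * x), np.cos(2 * np.pi * x)   # two template functions

    D = np.diff(np.eye(n), axis=0)                # first-difference operator
    K = 100.0 * D.T @ D + 0.1 * np.eye(n)         # Gaussian prior operator (made-up scale)
    c = np.array([0.5, 0.5])                      # mixture weights (assumed equal)

    def component_energy(phi, t):
        d = phi - t
        return 0.5 * d @ K @ d                    # quadratic energy of one Gaussian factor

    def mixture_energy(phi):
        e = np.array([component_energy(phi, t) for t in (t1, t2)])
        return -np.log(np.sum(c * np.exp(-e)))    # -log of the mixture of Gaussian factors

    # Along the line phi(s) = (1-s)*t1 + s*t2 the energy is low near both
    # templates and high in between: the energy surface is multimodal.
    for s in (0.0, 0.5, 1.0):
        print(s, round(mixture_energy((1 - s) * t1 + s * t2), 3))

Evaluating the energy along the straight line between the two templates shows a low value at each endpoint and a barrier at the midpoint, which a single Gaussian prior factor could not produce.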

Chapter 7. Bayesian inverse quantum theory (BIQT)

pp. 257-308

The problem addressed in this chapter is the reconstruction of the Hamiltonians or potentials of quantum systems from observational data. Finding such 'causes' or 'laws' from a finite number of observations constitutes an inverse problem and is typically ill-posed in the sense of Hadamard...

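For orientation, a typical setting of this kind (sketched here in generic notation, under the assumption of ideal position measurements on a system prepared in the ground state of H = T + v) gives likelihood and posterior of the form

\[
p(x_i \mid v) \;=\; |\varphi_0(x_i; v)|^2,
\qquad
p(v \mid D) \;\propto\; \Big[\prod_i |\varphi_0(x_i; v)|^2\Big]\, p_0(v),
\]

where the prior p_0(v) supplies the additional a priori information that makes the reconstruction of the potential v from finitely many observations x_i well behaved.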

Chapter 8. Summary

pp. 309-312

In this book we set out to develop a toolbox for constructing prior models within a nonparametric Bayesian framework for empirical learning, and to exemplify its use on problems from different application areas. Nonparametric models, or field theories in the language of physics, typically allow a more explicit implementation of...

Appendix A: A priori information and a posteriori control

pp. 313-321

Appendix B: Probability, free energy, energy, information, entropy, and temperature

pp. 323-343

Appendix C: Iteration procedures: Learning

pp. 345-364

Bibliography

pp. 365-402

Index

pp. 403-411