
Bayesian Field Theory

Jörg C. Lemm

Publication Year: 2003

Ask a traditional mathematician the likely outcome of a coin-toss, and he will reply that no evidence exists on which to base such a prediction. Ask a Bayesian, and he will examine the coin, conclude that it was probably not tampered with, and predict five hundred heads in a thousand tosses; a subsequent experiment would then be used to refine this prediction. The Bayesian approach, in other words, permits the use of prior knowledge when testing a hypothesis. Long the province of mathematicians and statisticians, Bayesian methods are applied in this ground-breaking book to problems in cutting-edge physics. Jörg Lemm offers practical examples of Bayesian analysis for the physicist working in such areas as neural networks, artificial intelligence, and inverse problems in quantum theory. The book also covers nonparametric density estimation problems, including, as special cases, nonparametric regression and pattern recognition. Thought-provoking and sure to be controversial, Bayesian Field Theory will be of interest to physicists as well as to other specialists in the rapidly growing number of fields that make use of Bayesian methods.
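The coin-toss reasoning in the blurb can be made concrete with a small sketch (not from the book): in a Beta-Binomial model, the Bayesian's inspection of the coin corresponds to a strong symmetric Beta prior over the head probability, and observed tosses update it. The particular prior strength Beta(500, 500) is an illustrative assumption.

```python
# Minimal Beta-Binomial sketch of the blurb's coin-toss example.
# The prior Beta(a, b) encodes belief about the head probability;
# data update it conjugately to a Beta posterior.

def posterior_heads(a_prior, b_prior, heads, tosses):
    """Beta-Binomial update: return posterior (a, b) and the
    posterior-predictive probability of heads on the next toss."""
    a = a_prior + heads
    b = b_prior + (tosses - heads)
    return a, b, a / (a + b)

# An inspected, apparently fair coin: a strong symmetric prior.
a, b, p = posterior_heads(500, 500, heads=0, tosses=0)
print(round(1000 * p))  # predicted heads in 1000 tosses -> 500

# A subsequent experiment (520 heads in 1000 tosses) refines the prediction.
a, b, p = posterior_heads(500, 500, heads=520, tosses=1000)
print(round(1000 * p))  # -> 510
```

Note how the strong prior keeps the refined prediction (510) much closer to 500 than the raw frequency (520) would suggest; a weaker prior would track the data more closely.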

Published by: The Johns Hopkins University Press

Cover

p. 1

Title Page, Copyright

pp. 2-5

Contents

pp. v-ix

Figures

pp. xi-xiv

Tables

pp. xv-xvii

Numerical Case Studies

pp. xvii-xix


Acknowledgments and Abstract

pp. xix-xx

It is a great pleasure to thank Achim Weiguny for his support throughout this work, his careful reading of the manuscript, and him and Joerg Uhlig for the enjoyable collaboration. Essential parts of this work are based on the following articles of the author and these two collaborators...


Chapter 1. Introduction

pp. 3-8

Due to increasing computational resources, the last decade has seen a rapidly growing interest in applied empirical learning problems. They appear, for example, as density estimation, regression or classification problems and include, just to name a few, image reconstruction, speech recognition, time series prediction, object recognition...


Chapter 2. Bayesian framework

pp. 9-84

Looking for a scientific explanation of some phenomenon means searching for a causal model that relates the relevant observations under study. To define a causal structure, observable (or visible) variables are separated into dependent variables ('measured effects', 'answers') and independent variables ('controlled causes', 'questions'). Dependent...


Chapter 3. Gaussian prior factors

pp. 85-165

In this chapter nonparametric density estimation problems will be studied working with Gaussian prior factors. The aim is to show that Gaussian prior factors are not only convenient from a technical point of view, but are also quite flexible and can be adapted in many ways to a specific learning task. Using this flexibility to implement...
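As a rough illustration of the excerpt's point, the following sketch (assumed notation, not the book's code) shows why Gaussian prior factors are technically convenient: with a Gaussian smoothness prior over a discretized function, MAP estimation reduces to solving a single linear system. The grid size, observation operator, and penalty weight are all illustrative choices.

```python
import numpy as np

# Toy MAP regression under a Gaussian smoothness prior on a grid:
# the prior factor exp(-lam/2 |D f|^2) penalizes squared second
# differences of the unknown function f.
def map_regression(x_grid, x_data, y_data, lam=1.0):
    n = len(x_grid)
    # Observation operator: each datum constrains its nearest grid point.
    A = np.zeros((len(x_data), n))
    for i, x in enumerate(x_data):
        A[i, np.argmin(np.abs(x_grid - x))] = 1.0
    # Second-difference matrix D defines the Gaussian prior factor.
    D = np.zeros((n - 2, n))
    for j in range(n - 2):
        D[j, j:j + 3] = [1.0, -2.0, 1.0]
    # The MAP estimate solves (A^T A + lam D^T D) f = A^T y.
    return np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ np.asarray(y_data))

x_grid = np.linspace(0.0, 1.0, 21)
f = map_regression(x_grid, [0.0, 0.5, 1.0], [0.0, 1.0, 0.0], lam=0.1)
# f interpolates smoothly: highest near x = 0.5, lowest near the ends.
```

Because the posterior energy is quadratic in f, the minimizer is unique and obtained in one linear solve; this is the technical convenience the chapter exploits before generalizing the prior.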


Chapter 4. Parameterizing likelihoods: Variational methods

pp. 167-186

In this sense a MAP with a parametric model can be interpreted as a variational approach for a MAP for a nonparametric Bayesian problem. Clearly, minimal values obtained by minimization within a trial space can only be larger than or equal to the true minimal value, and from two variational approximations the one with smaller error is the...
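The variational bound quoted in the excerpt can be checked numerically with a small sketch (my example, not the book's): restricting minimization to a trial space can only raise, never lower, the minimal value.

```python
import numpy as np

# A toy convex "energy" with true minimum 0 at f = (1, 2).
def energy(f):
    return (f[0] - 1.0) ** 2 + (f[1] - 2.0) ** 2

# Full minimum over the whole space.
true_min = energy(np.array([1.0, 2.0]))  # 0.0

# Trial space: the one-parameter family f = (t, t).
ts = np.linspace(-5.0, 5.0, 1001)
trial_min = min(energy(np.array([t, t])) for t in ts)  # 0.5 at t = 1.5

# The restricted minimum bounds the true minimum from above.
assert trial_min >= true_min
```

The same logic underlies comparing two parametric (variational) approximations: the one with the smaller minimal energy is the better one, since both sit above the true nonparametric minimum.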


Chapter 5. Parameterizing priors: Hyperparameters

pp. 187-227

The quality of nonparametric Bayesian approaches depends mainly on the adequate implementation of problem specific a priori information. Especially complex tasks with relatively few training data available, for example, in speech recognition or image reconstruction, require task specific priors. Choosing a simple Gaussian smoothness...


Chapter 6. Mixtures of Gaussian prior factors

pp. 229-256

Non-Gaussian prior factors which correspond to multimodal energy surfaces can be constructed or approximated by using mixtures of simpler prior components. In particular, it is convenient to use Gaussian densities as components or 'building blocks', since then many useful results obtained for Gaussian processes survive the generalization...
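A one-dimensional sketch (illustrative, not the book's construction) shows the multimodality the excerpt describes: the energy of a mixture of Gaussian prior components has several local minima, while each component alone would give a single quadratic well.

```python
import math

# Energy -log p(f) of a 1-D prior built as a weighted mixture of
# Gaussian components: multimodal although each component is Gaussian.
def mixture_energy(f, centers, weights, var=1.0):
    p = sum(w * math.exp(-(f - c) ** 2 / (2.0 * var))
            for c, w in zip(centers, weights))
    return -math.log(p)

# Two equally weighted components at -2 and +2 produce two energy wells.
e_center = mixture_energy(-2.0, centers=[-2.0, 2.0], weights=[0.5, 0.5])
e_middle = mixture_energy(0.0, centers=[-2.0, 2.0], weights=[0.5, 0.5])
assert e_center < e_middle  # the component centers are lower in energy
```

Because each component is Gaussian, component-wise quantities (normalizations, conditional means) stay analytically tractable, which is why such mixtures preserve many Gaussian-process results.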


Chapter 7. Bayesian inverse quantum theory (BIQT)

pp. 257-308

The problem addressed in this chapter is the reconstruction of the Hamiltonians or potentials of quantum systems from observational data. Finding such 'causes' or 'laws' from a finite number of observations constitutes an inverse problem and is typically ill-posed in the sense of Hadamard...


Chapter 8. Summary

pp. 309-312

In this book we wanted to develop a toolbox for constructing prior models within a nonparametric Bayesian framework for empirical learning, and to exemplify its use for problems from different application areas. Nonparametric models, or field theories in the language of physics, typically allow a more explicit implementation of...

Appendix A: A priori information and a posteriori control

pp. 313-321

Appendix B: Probability, free energy, energy, information, entropy, and temperature

pp. 323-343

Appendix C: Iteration procedures: Learning

pp. 345-364

Bibliography

pp. 365-402

Index

pp. 403-411


E-ISBN-13: 9780801877971
E-ISBN-10: 0801877970
Print-ISBN-13: 9780801872204
Print-ISBN-10: 0801872200

Page Count: 432