Michael's Wiki

Belief Networks

Class Website Project specification


  • $I_{Pr}(X,Z,Y)$ means “X and Y are independent given Z”.
  • $<X,Z,Y>_G$ means “X, Y, and Z are in a graph such that Z separates X from Y”.
  • “Belief in $x_i$”: the posterior marginal on $x_i$ conditioned on all evidence.
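A brute-force check of the independence definition, as a minimal sketch: a hypothetical chain network X → Z → Y with hand-picked CPT numbers (assumed for illustration, not from the lecture). Since Z separates X from Y in the chain, $I_{Pr}(X,Z,Y)$ must hold in the induced distribution.

```python
import itertools

# Hypothetical chain X -> Z -> Y with assumed CPT values.
P_x = [0.6, 0.4]
P_z_x = {0: [0.7, 0.3], 1: [0.2, 0.8]}   # P(Z | X=x)
P_y_z = {0: [0.9, 0.1], 1: [0.4, 0.6]}   # P(Y | Z=z)

# Joint distribution from the chain factorization P(x) P(z|x) P(y|z).
joint = {(x, z, y): P_x[x] * P_z_x[x][z] * P_y_z[z][y]
         for x, z, y in itertools.product([0, 1], repeat=3)}

def marg(pred):
    """Probability of the event described by pred(x, z, y)."""
    return sum(p for (x, z, y), p in joint.items() if pred(x, z, y))

def indep_given_z(tol=1e-12):
    # X and Y independent given Z: P(x, y | z) == P(x | z) * P(y | z)
    # for every assignment of the three variables.
    for x0, z0, y0 in itertools.product([0, 1], repeat=3):
        pz = marg(lambda x, z, y: z == z0)
        lhs = marg(lambda x, z, y: (x, z, y) == (x0, z0, y0)) / pz
        rhs = (marg(lambda x, z, y: (x, z) == (x0, z0)) / pz) * \
              (marg(lambda x, z, y: (z, y) == (z0, y0)) / pz)
        if abs(lhs - rhs) > tol:
            return False
    return True

print(indep_given_z())  # True: the chain satisfies the independence
```

Any CPT numbers would do here; separation in the graph guarantees the independence for every parameterization of the chain.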
Lecture 1

Exhaustively enumerating all exceptions in a logical system is intractable, and logic systems have difficulty with non-monotonic reasoning, where a conclusion drawn earlier must be retracted as new information arrives. Bayesian reasoning handles these issues naturally; explaining away is one such non-monotonic pattern. When evidence is observed, we update our beliefs in the system by computing a posterior distribution.
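Explaining away can be made concrete with the classic burglary/earthquake/alarm network; the CPT numbers below are assumed, chosen only for illustration. Observing the alarm raises the belief in a burglary, but additionally learning of an earthquake lowers it again: the earthquake explains the alarm away.

```python
import itertools

# Burglary (B) and Earthquake (E) are independent causes of Alarm (A).
# All probabilities are hypothetical illustration values.
P_b = 0.01
P_e = 0.02
P_a = {(0, 0): 0.001, (0, 1): 0.3, (1, 0): 0.9, (1, 1): 0.95}  # P(A=1 | b, e)

def joint(b, e, a):
    pb = P_b if b else 1 - P_b
    pe = P_e if e else 1 - P_e
    pa = P_a[(b, e)] if a else 1 - P_a[(b, e)]
    return pb * pe * pa

def posterior_burglary(evidence):
    """P(B=1 | evidence), where evidence maps variable name -> value."""
    num = den = 0.0
    for b, e, a in itertools.product([0, 1], repeat=3):
        assign = {"B": b, "E": e, "A": a}
        if any(assign[k] != v for k, v in evidence.items()):
            continue
        p = joint(b, e, a)
        den += p
        if b == 1:
            num += p
    return num / den

p_alarm = posterior_burglary({"A": 1})
p_alarm_quake = posterior_burglary({"A": 1, "E": 1})
print(p_alarm, p_alarm_quake)  # learning of the quake lowers belief in burglary
```

Note the non-monotonicity: adding evidence (the earthquake) retracts most of the belief that the alarm had created.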

Lecture 2

Class outline:

Lecture 3
Lecture 4
Lecture 5
Lecture 6

Modeling with Bayesian Networks: [http://en.wikipedia.org/wiki/Convolution_code Convolutional Codes] (Darwiche 105)
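As background for that reading, here is a sketch (not Darwiche's code) of a rate-1/2 convolutional encoder with the standard textbook generators (7, 5) in octal. In the Bayesian-network view of decoding, each output bit is a deterministic node whose parents are the current input bit and the shift-register state, observed through a noisy channel.

```python
def conv_encode(bits, g1=0b111, g2=0b101):
    """Rate-1/2 convolutional encoder, generators (7, 5) octal."""
    state = 0  # two-bit shift register holding the previous inputs
    out = []
    for b in bits:
        reg = (b << 2) | state                     # newest bit in high position
        out.append(bin(reg & g1).count("1") % 2)   # parity under tap mask g1
        out.append(bin(reg & g2).count("1") % 2)   # parity under tap mask g2
        state = reg >> 1                           # shift: drop the oldest bit
    return out

print(conv_encode([1, 0, 1, 1]))  # [1, 1, 1, 0, 0, 0, 0, 1]
```

The unrolled trellis of this encoder is exactly the structure the corresponding Bayesian network captures: one state variable per time step, with the channel outputs as evidence.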

Lecture 7
Lecture 8
Lecture 9
Lecture 10

Discuss the class project

Lecture 11
Lecture 12

  • Full OR search trees
  • Context-minimal OR search graphs
  • AND/OR trees
  • Pseudo-trees
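The point of these search spaces is size. As a toy extreme (assumed for illustration), take n mutually independent binary variables, so the pseudo-tree root has all variables as independent children: the full OR tree is exponential in n, while the AND/OR tree is linear. A rough node-count sketch, under one simple counting convention (the exact convention varies by presentation):

```python
def or_tree_size(n):
    # Full OR search tree over n binary variables: 2^(n+1) - 1 nodes.
    if n == 0:
        return 1
    return 1 + 2 * or_tree_size(n - 1)

def and_or_tree_size(n):
    # AND/OR tree when all n variables are independent children of the
    # pseudo-tree root: root plus (one OR node + 2 value nodes) per variable.
    return 1 + 3 * n

for n in (2, 4, 8):
    print(n, or_tree_size(n), and_or_tree_size(n))
```

Caching identical subproblems by context shrinks the OR tree into the context-minimal OR search graph; decomposition via a pseudo-tree is what turns chains of independent subproblems into AND branches.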

Lecture 13
Lecture 14
Lecture 15
Lecture 16

  • Mini-clustering
  • Variational inference
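For the variational side, a minimal mean-field sketch with an assumed toy setup (not lecture code): approximate p(x, y) ∝ psi[x][y] over binary variables by a fully factored q(x)q(y), iterating the standard coordinate-ascent updates q_x(x) ∝ exp(Σ_y q_y(y) ln psi[x][y]), and symmetrically for q_y.

```python
import math

# Attractive pairwise potential (assumed numbers): p(x, y) ∝ psi[x][y].
psi = [[9.0, 1.0], [1.0, 9.0]]

def normalize(v):
    s = sum(v)
    return [u / s for u in v]

qx, qy = [0.5, 0.5], [0.6, 0.4]   # small initial bias breaks the symmetry
for _ in range(200):
    # Coordinate-ascent mean-field updates (each is exact given the other).
    qx = normalize([math.exp(sum(qy[y] * math.log(psi[x][y]) for y in (0, 1)))
                    for x in (0, 1)])
    qy = normalize([math.exp(sum(qx[x] * math.log(psi[x][y]) for x in (0, 1)))
                    for y in (0, 1)])
print(qx, qy)  # both settle near [0.75, 0.25] for this potential
```

Because the attractive potential is symmetric, the factored approximation commits to one mode (here driven by the initial bias toward value 0), which is the characteristic failure/feature of mean field.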

Lecture 17
Lecture 18