Dependency graphs include undirected, possibly cyclic Markov networks and DAG-structured Bayesian networks. Induced dependencies (e.g., two causes becoming dependent once their common effect is observed) cannot be represented by undirected graphs, but can be represented by directed graphs. Inference can be performed through Belief Propagation, tree search, or sampling methods.
Independency map (I-map): a graph in which all dependent propositions are connected, so any separation in the graph implies an independence in the model.
Dependency map (D-map): a graph in which all independent propositions are separated, so any connection in the graph implies a dependence in the model.
Relevance network: a graph that is both an I-map and a D-map.
So a full graph is a trivial I-map, and an empty graph is a trivial D-map.
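Stated formally, for all disjoint variable sets $X$, $Y$, $Z$ (the separation notation below is shorthand, not from the original notes):

$$G \text{ is an I-map of } P \iff \langle X \mid Z \mid Y \rangle_G \;\Rightarrow\; I(X, Z, Y)_P$$
$$G \text{ is a D-map of } P \iff I(X, Z, Y)_P \;\Rightarrow\; \langle X \mid Z \mid Y \rangle_G$$

where $\langle X \mid Z \mid Y \rangle_G$ means $Z$ separates $X$ from $Y$ in $G$, and $I(X, Z, Y)_P$ means $X \perp Y \mid Z$ holds in the model $P$.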
The previous four properties hold for any probabilistic model. A strictly positive probabilistic model also satisfies the intersection property.
For each edge in the graph, check that there is no dependence between the two endpoint variables when conditioned on their remaining neighbors.
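A minimal sketch of this check for a small discrete model whose joint distribution is available as an explicit table; the data structures and names (`joint`, `domains`, `neighbors`) are illustrative assumptions, not from the notes:

```python
import itertools

def cond_independent(joint, domains, u, v, cond, tol=1e-9):
    """True if u and v are independent given the variables in `cond`.
    `joint` maps full assignments (tuples ordered like `domains`) to probabilities."""
    names = list(domains)

    def marginal(fixed):
        return sum(p for assign, p in joint.items()
                   if all(dict(zip(names, assign))[k] == val
                          for k, val in fixed.items()))

    for z in itertools.product(*(domains[c] for c in cond)):
        fixed = dict(zip(cond, z))
        pz = marginal(fixed)
        if pz <= tol:
            continue                      # conditioning event has zero mass
        for a in domains[u]:
            for b in domains[v]:
                lhs = marginal({**fixed, u: a, v: b}) * pz
                rhs = marginal({**fixed, u: a}) * marginal({**fixed, v: b})
                if abs(lhs - rhs) > tol:  # P(u,v|z) != P(u|z)P(v|z)
                    return False
    return True

def check_edges(neighbors, joint, domains):
    """For each edge (u, v), report whether u and v remain dependent when
    conditioned on the union of their remaining neighbors."""
    report = {}
    for u in neighbors:
        for v in neighbors[u]:
            if u < v:
                cond = sorted((neighbors[u] | neighbors[v]) - {u, v})
                report[(u, v)] = not cond_independent(joint, domains, u, v, cond)
    return report
```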
Finding an ordering of minimum induced width is NP-complete. Trees have width and induced width of 1. Chordal graphs have width equal to induced width (no fill edges are added along a perfect elimination ordering).
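For a fixed ordering, width and induced width can be computed by simulating elimination; a small sketch, assuming the graph is given as an adjacency dictionary (names are illustrative):

```python
def order_widths(neighbors, ordering):
    """Return (width, induced width) of `ordering`.
    Width counts each node's earlier neighbors in the original graph; induced
    width does the same in the graph filled by eliminating nodes from last to
    first, connecting each eliminated node's earlier neighbors pairwise."""
    pos = {v: i for i, v in enumerate(ordering)}
    width = max(sum(1 for n in neighbors[v] if pos[n] < pos[v]) for v in ordering)
    adj = {v: set(ns) for v, ns in neighbors.items()}
    induced = 0
    for v in reversed(ordering):
        parents = {n for n in adj[v] if pos[n] < pos[v]}
        induced = max(induced, len(parents))
        for a in parents:                 # add fill edges among the parents
            for b in parents:
                if a != b:
                    adj[a].add(b)
        for n in adj[v]:
            adj[n].discard(v)             # eliminate v
    return width, induced
```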
Min-width ordering: given a graph G = (V, E), V = {v1, …, vn}, produce an ordering of the nodes by repeatedly selecting a node of minimum degree, placing it in the last remaining position, and removing it from the graph. This greedy algorithm is optimal for width, but not for induced width.
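A sketch of this greedy scheme, under the same adjacency-dictionary representation as above (an assumption, not from the notes):

```python
def min_width_ordering(neighbors):
    adj = {v: set(ns) for v, ns in neighbors.items()}
    order = []
    while adj:
        v = min(adj, key=lambda u: len(adj[u]))   # node of minimum degree
        order.append(v)
        for n in adj[v]:
            adj[n].discard(v)                     # remove v from the graph
        del adj[v]
    order.reverse()                               # first-chosen node goes last
    return order
```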
Also called min-fill.
Runs in time O(|V| + |E|), faster than the others.
In practice, it is superior to min-width and max-cardinality.
(Gogate and Dechter)
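For comparison, a sketch of the greedy min-fill ordering (eliminate the node whose elimination would add the fewest fill edges among its remaining neighbors); again the representation is an illustrative assumption:

```python
def fill_count(adj, v):
    """Number of fill edges eliminating v would add among its neighbors."""
    ns = list(adj[v])
    return sum(1 for i in range(len(ns)) for j in range(i + 1, len(ns))
               if ns[j] not in adj[ns[i]])

def min_fill_ordering(neighbors):
    adj = {v: set(ns) for v, ns in neighbors.items()}
    order = []
    while adj:
        v = min(adj, key=lambda u: fill_count(adj, u))
        order.append(v)
        for a in adj[v]:                  # connect v's neighbors pairwise
            for b in adj[v]:
                if a != b:
                    adj[a].add(b)
        for n in adj[v]:
            adj[n].discard(v)             # remove v from the graph
        del adj[v]
    order.reverse()                       # first-eliminated node goes last
    return order
```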
A Gibbs potential is used to construct a complete and consistent probabilistic model while preserving the dependency structure of an arbitrary graph G: a potential function is defined on each clique of G, and the distribution is the normalized product of these potentials.
A probability distribution formed in this way is a Markov Field.
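Concretely, the usual clique-potential form is sketched below; the symbols $Z$, $\phi_C$, and $\mathcal{C}(G)$ are standard notation rather than taken from these notes:

$$P(x_1, \dots, x_n) \;=\; \frac{1}{Z} \prod_{C \in \mathcal{C}(G)} \phi_C(x_C), \qquad Z \;=\; \sum_{x_1, \dots, x_n} \prod_{C \in \mathcal{C}(G)} \phi_C(x_C)$$

where $\mathcal{C}(G)$ is the set of cliques of $G$ and each potential $\phi_C \geq 0$ is a nonnegative function of the variables in clique $C$.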
Primal graph: an undirected graph that has the variables as vertices and an edge connecting any two variables that appear in the scope of the same function.
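A small sketch of building this graph from a list of function scopes; the input format (a list of sets of variable names) is an assumption:

```python
from itertools import combinations

def primal_graph(scopes):
    neighbors = {v: set() for scope in scopes for v in scope}
    for scope in scopes:
        for u, v in combinations(sorted(scope), 2):   # each scope becomes a clique
            neighbors[u].add(v)
            neighbors[v].add(u)
    return neighbors

# e.g. primal_graph([{"A", "B"}, {"B", "C"}, {"A", "C", "D"}])
```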
A hypergraph is a set of vertices and corresponding “hyperedges”. A hyperedge is a generalization of a traditional graph edge in that it represents any subset of vertices rather than just a pair.
For graphical models, a hyperedge is used to represent each function. It connects all variables in the scope of that function.
Dual graph: a graph that represents each function scope by a node and places a labeled edge between any two nodes whose scopes share variables; the edge is labeled by the shared variables. The dual graph of any Bayesian network has a minimal-arc dual join tree.
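A sketch of the dual-graph construction, using the same illustrative scope-list input as above:

```python
from itertools import combinations

def dual_graph(scopes):
    nodes = [frozenset(s) for s in scopes]      # one node per function scope
    edges = {}
    for i, j in combinations(range(len(nodes)), 2):
        shared = nodes[i] & nodes[j]
        if shared:
            edges[(i, j)] = shared               # edge label = shared variables
    return nodes, edges
```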
A join graph is an arc subgraph of the dual graph that satisfies the connectedness property: for each variable, the nodes whose scopes contain it form a connected subgraph. A join graph that is a tree is a join tree, a.k.a. “junction tree”. Join trees can be constructed through tree decomposition, as in the bucket-tree sketch below.
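One common tree-decomposition route is the bucket tree: each variable gets a cluster, every function goes to the cluster of its latest variable in the ordering, and eliminating a cluster sends its remaining scope to the cluster of the latest remaining variable, which becomes its parent. A sketch, assuming an elimination ordering is given (the input format is illustrative):

```python
def bucket_tree(scopes, ordering):
    pos = {v: i for i, v in enumerate(ordering)}
    clusters = {v: {v} for v in ordering}
    for scope in scopes:
        latest = max(scope, key=lambda v: pos[v])
        clusters[latest] |= set(scope)           # place function in its bucket
    parent = {}
    for v in reversed(ordering):                 # process clusters last to first
        rest = clusters[v] - {v}
        if rest:
            p = max(rest, key=lambda u: pos[u])
            parent[v] = p                        # tree edge; separator = rest
            clusters[p] |= rest                  # pass the eliminated scope down
        else:
            parent[v] = None
    return clusters, parent
```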
Pseudo-tree: a tree constructed from a graph such that all back-edges (edges of the graph not in the tree) connect a node to one of its ancestors in the tree. For search, a smaller-height pseudo-tree is better. DFS-trees are a subset of pseudo-trees. Chain graphs are trivial pseudo-trees of maximum height.
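Since every non-tree edge discovered by depth-first search is a back edge to an ancestor, a DFS tree of a connected graph is already a valid pseudo-tree; a minimal sketch under the adjacency-dictionary representation used above:

```python
def dfs_pseudo_tree(neighbors, root):
    parent = {root: None}          # parent[] encodes the rooted pseudo-tree

    def dfs(v):
        for n in sorted(neighbors[v]):
            if n not in parent:    # tree edge v -> n
                parent[n] = v
                dfs(n)

    dfs(root)                      # assumes a connected graph
    return parent
```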
Given a tree-decomposition whose tree-width is $w^{*}$, there exists a pseudo-tree T of G whose depth $m$ satisfies $m \leq w^{*} \log{n}$.
The width of a pseudo-tree is defined by a forward-order elimination.
A pseudo-tree can also be generated from a bucket-tree.
Any AND/OR tree based on a pseudo-tree is sound and complete.