**Template Based Inference in Symmetric Relational Markov Random
Fields**

## A. Jaimovich, O. Meshi, and **N. Friedman**

*The 23rd Conference on Uncertainty in Artificial
Intelligence (UAI'07)*, 2007.


**Abstract**

*Relational Markov Random Fields* are a general and flexible framework for reasoning about the joint distribution over attributes of a large number of interacting entities. The main computational difficulty in learning such models is inference. Even with complete data, where a large domain can be summarized by sufficient statistics, learning requires computing the expectation of those statistics under different parameter choices. The typical solution is to resort to approximate inference procedures, such as loopy belief propagation. Although these procedures are quite efficient, they still require computation on the order of the number of interactions (or features) in the model. When learning a large relational model over a complex domain, even such approximations become prohibitively expensive.

In this paper we show that for a particular class of relational MRFs, which have inherent symmetry, the inference needed for learning can be performed by a *template-level* belief propagation whose running time is proportional to the size of the relational model rather than the size of the domain. Moreover, we show that this procedure is equivalent to loopy belief propagation on the ground model, enabling a dramatic speedup in inference and learning time. We use this procedure to learn relational MRFs that capture the joint distribution of large protein-protein interaction networks.
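The equivalence claimed above can be illustrated with a toy sketch (my own assumption-laden example, not the paper's algorithm or model): on a complete graph where every node shares one potential and every edge shares another, all ground BP messages coincide by symmetry, so a single template message update replaces the per-edge updates.

```python
import numpy as np

# Toy symmetric pairwise MRF: complete graph on N binary entities,
# one shared node potential phi and one shared edge potential psi.
# (Illustrative values only, not taken from the paper.)
N = 6
phi = np.array([1.0, 2.0])        # shared node potential
psi = np.array([[1.5, 0.5],
                [0.5, 1.5]])      # shared symmetric edge potential
d = N - 1                         # degree of each node

# Ground loopy BP: one normalized message per directed edge (i -> j).
msgs = np.ones((N, N, 2))
for _ in range(100):
    new = np.ones_like(msgs)
    for i in range(N):
        for j in range(N):
            if i == j:
                continue
            # Product of node potential and incoming messages to i, except from j.
            prod = phi.copy()
            for k in range(N):
                if k != i and k != j:
                    prod *= msgs[k, i]
            m_ij = psi.T @ prod            # sum over x_i of psi(x_i, x_j) * prod(x_i)
            new[i, j] = m_ij / m_ij.sum()
    msgs = new

# Template-level BP: by symmetry every message equals a single shared
# message m, so one update with a count (d - 1) replaces all edge updates.
m = np.ones(2) / 2
for _ in range(100):
    m_new = psi.T @ (phi * m ** (d - 1))
    m = m_new / m_new.sum()

# Both fixed-point iterations perform the same normalized update, so the
# ground messages collapse onto the template message.
assert np.allclose(msgs[0, 1], m)
```

Note that the template update costs O(1) per iteration in the domain size, while the ground loop costs O(N^2) messages per iteration; this is the source of the "size of the model, not the domain" speedup, here shown only for a trivially symmetric graph.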
