Continuous-Time Belief Propagation

T. El-Hay, I. Cohn, N. Friedman, and R. Kupferman

Proceedings of the 27th International Conference on Machine Learning (ICML 2010), 2010.



Many temporal processes can be naturally modeled as stochastic systems that evolve continuously over time. The representation language of continuous-time Bayesian networks allows us to succinctly describe multi-component continuous-time stochastic processes. A crucial element in applications of such models is (approximate) inference. Here we introduce a variational approximation scheme that is a natural extension of Belief Propagation to continuous-time processes. In this scheme, we view messages as inhomogeneous Markov processes over individual components. This leads to a relatively simple procedure that makes it easy to incorporate adaptive ordinary differential equation (ODE) solvers to perform individual steps. We provide the theoretical foundations for the approximation and show how it performs on a range of networks. Our results demonstrate that our method is quite accurate on singly connected networks, and provides close approximations in more complex ones.
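To give a flavor of the kind of computation involved, the sketch below integrates the forward (master) equation of a toy two-state continuous-time Markov process with an adaptive ODE solver. This is not the paper's message-passing algorithm; it is only an illustration, under assumed rates, of how marginal distributions of a continuous-time process are obtained by solving ODEs, the elementary step that the proposed scheme delegates to an adaptive solver.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical rate matrix Q of a two-state continuous-time Markov process
# (rows sum to zero; off-diagonal entries are transition rates).
Q = np.array([[-1.0,  1.0],
              [ 2.0, -2.0]])

def master_equation(t, p):
    # Forward (master) equation for the marginal distribution: dp/dt = p Q.
    return p @ Q

p0 = np.array([1.0, 0.0])  # start deterministically in state 0
sol = solve_ivp(master_equation, (0.0, 10.0), p0,
                method="RK45", rtol=1e-8, atol=1e-10)

p_final = sol.y[:, -1]
print(p_final)  # approaches the stationary distribution [2/3, 1/3]
```

The adaptive step-size control of `solve_ivp` concentrates effort where the distribution changes rapidly, which is the property the paper exploits when solving for its (inhomogeneous) per-component messages.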