Variational methods are becoming increasingly popular for the approximate
solution of complex probabilistic models in machine learning, computer vision,
information retrieval and many other fields. Unfortunately, for every new
application it is necessary first to derive the specific forms of the variational
update equations for the particular probabilistic model being used, and then to
implement these equations in application-specific software. Each of these steps
is both time-consuming and error-prone. We have therefore recently developed a
general purpose inference engine called VIBES (`Variational Inference for
Bayesian Networks') which allows a wide variety of probabilistic models to be
implemented and solved variationally without recourse to coding. A new model is
specified as a directed acyclic graph using an interface analogous to a drawing
package, and VIBES then automatically generates and solves the variational
equations. The original version of VIBES assumed a fully factorized variational
posterior distribution. In this paper we present an extension of VIBES in which
the variational posterior distribution corresponds to a sub-graph of the full
probabilistic model. Such structured distributions can produce much closer
approximations to the true posterior distribution. We illustrate this approach
using an example based on Bayesian hidden Markov models.
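
For concreteness, the following standard formulation contrasts the two families of approximating distributions described above; the notation ($V$ for observed variables, $H = \{h_1, \ldots, h_M\}$ for hidden variables) is chosen here for illustration and may differ from the paper's own. Variational inference maximizes a lower bound on the log evidence, and the two versions of VIBES differ only in the form assumed for $q$:
\begin{align}
\ln p(V) \;\ge\; \mathcal{L}(q) &= \int q(H) \, \ln \frac{p(V,H)}{q(H)} \, dH ,\\
q_{\text{factorized}}(H) &= \prod_{i=1}^{M} q_i(h_i) ,\\
q_{\text{structured}}(H) &= \prod_{j} q_j(H_j) ,
\end{align}
with sums replacing integrals for discrete variables, and where the $H_j$ are disjoint groups of hidden variables whose internal dependencies (a sub-graph of the full model) are retained by the corresponding factor $q_j$. In the hidden Markov model example, a structured $q$ over the latent state sequence can retain the chain dependencies, in which case its update typically takes the form of forward-backward-style recursions rather than independent per-state updates.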