libDAI - A free/open source C++ library for Discrete Approximate Inference

-------------------------------------------------------------------------------

Date:     September 17, 2012 - or later
See also: http://www.libdai.org

-------------------------------------------------------------------------------
License

libDAI is free software; you can redistribute it and/or modify it under the
terms of the BSD 2-clause license (also known as the FreeBSD license), which
can be found in the accompanying LICENSE file.

[Note: up to and including version 0.2.7, libDAI was licensed under the GNU
General Public License (GPL) version 2 or higher.]
-------------------------------------------------------------------------------

Citing libDAI

If you write a scientific paper describing research that made substantive use
of this library, please cite the following paper describing libDAI:

    Joris M. Mooij;
    libDAI: A free & open source C++ library for Discrete Approximate Inference in graphical models;
    Journal of Machine Learning Research, 11(Aug):2169-2173, 2010.

In BibTeX format (for your convenience):

    @article{Mooij_libDAI_10,
      author  = {Joris M. Mooij},
      title   = {lib{DAI}: A Free and Open Source {C++} Library for Discrete Approximate Inference in Graphical Models},
      journal = {Journal of Machine Learning Research},
      year    = 2010,
      month   = Aug,
      volume  = 11,
      pages   = {2169--2173},
      url     = "http://www.jmlr.org/papers/volume11/mooij10a/mooij10a.pdf"
    }

Moreover, as a personal note, I would appreciate being informed about any
publications using libDAI at joris dot mooij at libdai dot org.
-------------------------------------------------------------------------------

What is libDAI?

libDAI is a free/open source C++ library that provides implementations of
various (approximate) inference methods for discrete graphical models. libDAI
supports arbitrary factor graphs with discrete variables; this includes
discrete Markov Random Fields and Bayesian Networks.

The library is targeted at researchers. To be able to use the library, a good
understanding of graphical models is needed.

The best way to use libDAI is by writing C++ code that invokes the library; in
addition, part of the functionality is accessible by using the:

* command line interface
* (limited) MatLab interface
* (experimental) python interface
* (experimental) octave interface.
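For C++ users, invoking the library can be sketched roughly as follows. This is modeled after the example program bundled with libDAI; the class and option names (FactorGraph, PropertySet, BP) follow the libDAI 0.3.x API, so check the documentation for the version you have installed:

```cpp
#include <iostream>
#include <dai/alldai.h>  // main libDAI header, pulls in all inference algorithms

int main() {
    // Read a factor graph from a file in libDAI's native .fg format
    dai::FactorGraph fg;
    fg.ReadFromFile("tests/alarm.fg");

    // Configure loopy belief propagation via a PropertySet
    dai::PropertySet opts;
    opts.set("maxiter", (size_t)10000);           // maximum number of iterations
    opts.set("tol", dai::Real(1e-9));             // convergence tolerance
    opts.set("updates", std::string("SEQRND"));   // random sequential updates
    opts.set("logdomain", false);

    // Construct, initialize and run the inference algorithm
    dai::BP bp(fg, opts);
    bp.init();
    bp.run();

    // Query results: log partition sum and a single-variable marginal
    std::cout << "logZ = " << bp.logZ() << std::endl;
    std::cout << "P(x0) = " << bp.belief(fg.var(0)) << std::endl;
    return 0;
}
```

To use a different algorithm, one typically only swaps the class (e.g. dai::JTree for exact junction-tree inference) and its options; the query interface (logZ, belief, etc.) is shared via the common InfAlg interface.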
libDAI can be used to implement novel (approximate) inference algorithms and to
easily compare their accuracy and performance with algorithms that have
already been implemented.

A solver using libDAI was amongst the three winners of the UAI 2010 Approximate
Inference Challenge (see http://www.cs.huji.ac.il/project/UAI10/ for more
information). The full source code is provided as part of the library.
Currently, libDAI supports the following (approximate) inference methods:

* Exact inference by brute force enumeration;
* Exact inference by junction-tree methods;
* Mean Field;
* Loopy Belief Propagation [KFL01];
* Fractional Belief Propagation [WiH03];
* Tree-Reweighted Belief Propagation [WJW03];
* Tree Expectation Propagation [MiQ04];
* Generalized Belief Propagation [YFW05];
* Double-loop GBP [HAK03];
* Various variants of Loop Corrected Belief Propagation [MoK07, MoR05];
* Gibbs sampler;
* Conditioned Belief Propagation [EaG09];
* Decimation algorithm.

These inference methods can be used to calculate partition sums, marginals over
subsets of variables, and MAP states (the joint state of variables that has
maximum probability).

In addition, libDAI supports parameter learning of conditional probability
tables by Expectation Maximization.
libDAI is not intended to be a complete package for approximate inference.
Instead, it should be considered as an "inference engine", providing various
inference methods. In particular, it contains no GUI, currently only supports
its own file format for input and output (although support for standard file
formats may be added later), and provides very limited visualization
functionality. The only learning method currently supported is Expectation
Maximization (or Maximum Likelihood if no data is missing) for learning factor
parameters.
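As an illustration of the native file format mentioned above: a .fg file is plain text, starting with the number of factors, followed by one block per factor giving the number of variables, their labels, their cardinalities, the number of nonzero factor entries, and then index-value pairs. The following hypothetical toy example encodes a single factor over one binary variable x0 with values [0.4, 0.6] (consult the libDAI documentation for the authoritative format specification):

```
1

1
0
2
2
0   0.4
1   0.6
```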
In my opinion, the lack of open source "reference" implementations hampers
progress in research on approximate inference. Methods differ widely in terms
of quality and performance characteristics, which also depend in different ways
on various properties of the graphical models. Finding the best approximate
inference method for a particular application therefore often requires
empirical comparisons. However, implementing and debugging these methods takes
a lot of time which could otherwise be spent on research. I hope that this code
will help researchers easily compare various (existing as well as new)
approximate inference methods, thereby accelerating research and stimulating
real-world applications of approximate inference.
Because libDAI is implemented in C++, it is very fast compared with
implementations in MatLab (a factor of 1000 faster is not uncommon). libDAI
does provide a (limited) MatLab interface for easy integration with MatLab. It
also provides a command line interface and experimental python and octave
interfaces (thanks to Patrick Pletscher).
The code has been developed under Debian GNU/Linux with the GCC compiler suite.
libDAI compiles successfully with g++ versions 3.4 up to 4.7 (both 32 and 64
bits).

libDAI has also been successfully compiled with MS Visual Studio 2008 under
Windows, MS Visual Studio 2010 under Windows 64, and with Cygwin under Windows.

Finally, libDAI has been compiled successfully on MacOS X (both 32 and 64
bits).
The libDAI sources and documentation can be downloaded from the libDAI website:
http://www.libdai.org.

The Google group "libDAI" (http://groups.google.com/group/libdai) can be used
for getting support and discussing development issues.
-------------------------------------------------------------------------------

Building libDAI under UNIX variants (Linux / Cygwin / Mac OS X)

You need:

* a recent version of gcc (at least version 3.4)
* GNU make
* recent boost C++ libraries (at least version 1.37; however, version 1.37
  shipped with Ubuntu 9.04 is known not to work)
* GMP library (or the Windows port called MPIR; for 64-bit builds, MPIR 2.5.0
  or higher is needed)
* doxygen (only for building the documentation)
* graphviz (only for using some of the libDAI command line utilities)
* CImg library (only for building the image segmentation example)
On Debian/Ubuntu, you can easily install the required packages with a single
command:

    apt-get install g++ make doxygen graphviz libboost-dev libboost-graph-dev libboost-program-options-dev libboost-test-dev libgmp-dev cimg-dev

(root permissions needed).
On Mac OS X (10.4 is known to work), these packages can be installed easily via
MacPorts. If MacPorts is not already installed, install it according to the
instructions at http://www.macports.org/. Then, a simple

    sudo port install gmake boost gmp doxygen graphviz

should be enough to install everything that is needed.
On Cygwin, the prebuilt Cygwin package boost-1.33.1-x is known not to work. You
can however obtain the latest boost version (you need at least 1.37.0) from
http://www.boost.org/ and build it as described in the next subsection.
Building boost under Cygwin

* Download the latest boost libraries from http://www.boost.org
* Build the required boost libraries using:

    ./bootstrap.sh --with-libraries=program_options,math,graph,test --prefix=/boost_root/
    ./bjam

* In order to use dynamic linking, the boost .dll's should be somewhere in
  the path. This can be achieved by a command like:

    export PATH=$PATH:/boost_root/stage/lib
To build the libDAI source, first copy a template Makefile.* to Makefile.conf
(for example, copy Makefile.LINUX to Makefile.conf if you use GNU/Linux). Then,
edit the Makefile.conf template to adapt it to your local setup. In case you
want to use Boost libraries which are installed in non-standard locations, you
have to tell the compiler and linker about their locations (using the -I and -L
flags for GCC; also, you may need to set the LD_LIBRARY_PATH environment
variable correctly before running libDAI binaries). Platform independent build
options can be set in Makefile.ALL. Finally, run

    make

The build includes a regression test, which may take a while to complete.

If the build is successful, you can test the example program:

    examples/example tests/alarm.fg

or the more extensive test program:

    tests/testdai --aliases tests/aliases.conf --filename tests/alarm.fg --methods JTREE_HUGIN BP_SEQMAX
-------------------------------------------------------------------------------

Building libDAI under Windows

You need:

* A recent version of Microsoft Visual Studio (2008 is known to work)
* recent boost C++ libraries (version 1.37 or higher)
* GMP or MPIR library (for 64-bit builds, MPIR 2.5.0 or higher is needed)
* GNU make (can be obtained from http://gnuwin32.sourceforge.net)
* CImg library (only for building the image segmentation example)

For the regression test, you need:

* GNU diff and GNU sed (can be obtained from http://gnuwin32.sourceforge.net)
Building boost under Windows

Because building boost under Windows is tricky, I provide some guidance here.

* Download the boost zip file from http://www.boost.org/users/download and
  unpack it somewhere.
* Download the bjam executable from http://www.boost.org/users/download and
  unpack it somewhere else.
* Download Boost.Build (v2) from http://www.boost.org/docs/tools/build/
  index.html and unpack it yet somewhere else.
* Edit the file boost-build.jam in the main boost directory to change the
  BOOST_BUILD directory to the place where you put Boost.Build (use UNIX /
  instead of Windows \ in pathnames).
* Copy the bjam.exe executable into the main boost directory. Now if you
  issue "bjam --version" you should get a version and no errors. Issuing
  "bjam --show-libraries" will show the libraries that will be built.
* The following command builds the boost libraries that are relevant for
  libDAI:

    bjam --with-graph --with-math --with-program_options --with-test link=static runtime-link=shared
Building GMP or MPIR under Windows

Information about how to build GMP or MPIR under Windows can be found on the
internet. The user has to update Makefile.WINDOWS in order to link with the
GMP/MPIR libraries. Note that for 64-bit builds, MPIR 2.5.0 or higher is
needed.
To build the source, copy Makefile.WINDOWS to Makefile.conf. Then, edit
Makefile.conf to adapt it to your local setup. Platform independent build
options can be set in Makefile.ALL. Finally, run (from the command line)

    make

The build includes a regression test, which may take a while to complete.

If the build is successful, you can test the example program:

    examples\example tests\alarm.fg

or the more extensive test program:

    tests\testdai --aliases tests\aliases.conf --filename tests\alarm.fg --methods JTREE_HUGIN BP_SEQMAX
-------------------------------------------------------------------------------

Building the libDAI MatLab interface

You need:

* MatLab
* The platform-dependent requirements described above

First, you need to build the libDAI source as described above for your
platform. By default, the MatLab interface is disabled, so before compiling the
source, you have to enable it in Makefile.ALL by setting

    WITH_MATLAB=true

Also, you have to configure the MatLab-specific parts of Makefile.conf to match
your system (in particular, the Makefile variables ME, MATLABDIR and MEX). The
MEX file extension depends on your platform; for a 64-bit linux x86_64 system
this would be "ME=.mexa64", for a 32-bit linux x86 system "ME=.mexglx". If you
are unsure about your MEX file extension: it needs to be the same as what the
MatLab command "mexext" returns. The required MEX files are built by issuing

    make
from the command line. The MatLab interface is much less powerful than using
libDAI from C++. There are two reasons for this: (i) it is boring to write MEX
files; (ii) a large performance penalty is paid when large data structures
(like factor graphs) have to be converted from their native C++ data structure
into something that MatLab understands.
A simple example of how to use the MatLab interface is the following (entered
at the MatLab prompt), which performs exact inference by the junction tree
algorithm and approximate inference by belief propagation on the ALARM network:

    cd path_to_libdai/matlab
    [psi] = dai_readfg ('../tests/alarm.fg');
    [logZ,q,md,qv,qf] = dai (psi, 'JTREE', '[updates=HUGIN,verbose=0]')
    [logZ,q,md,qv,qf] = dai (psi, 'BP', '[updates=SEQMAX,tol=1e-9,maxiter=10000,logdomain=0]')

where "path_to_libdai" has to be replaced with the directory in which libDAI
was installed. For other algorithms and some default parameters, see the file
-------------------------------------------------------------------------------

Building the documentation

Install doxygen, graphviz and a TeX distribution and use

    make doc

to build the documentation. If the documentation is not clear enough, feel free
to send me an email (or even better, improve the documentation and send a
patch!). The documentation can also be browsed online at http://www.libdai.org.