libDAI - A free/open source C++ library for Discrete Approximate Inference

-------------------------------------------------------------------------------

See also: http://www.libdai.org

-------------------------------------------------------------------------------
libDAI is free software; you can redistribute it and/or modify it under the
terms of the GNU General Public License as published by the Free Software
Foundation; either version 2 of the License, or (at your option) any later
version.

libDAI is distributed in the hope that it will be useful, but WITHOUT ANY
WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with
libDAI in the file COPYING. If not, see http://www.gnu.org/licenses/

-------------------------------------------------------------------------------
If you write a scientific paper describing research that made substantive use
of this library, please cite the following paper describing libDAI:

Joris M. Mooij;
libDAI: A free & open source C++ library for Discrete Approximate Inference in
Graphical Models;
Journal of Machine Learning Research, 11(Aug):2169-2173, 2010.

In BibTeX format (for your convenience):

@article{Mooij_libDAI_10,
  author  = {Joris M. Mooij},
  title   = {lib{DAI}: A Free and Open Source {C++} Library for Discrete Approximate Inference in Graphical Models},
  journal = {Journal of Machine Learning Research},
  year    = 2010,
  month   = aug,
  volume  = 11,
  pages   = {2169-2173},
  url     = "http://www.jmlr.org/papers/volume11/mooij10a/mooij10a.pdf"
}

Moreover, as a personal note, I would appreciate being informed about any
publications using libDAI at joris dot mooij at libdai dot org.

-------------------------------------------------------------------------------
libDAI is a free/open source C++ library (licensed under GPL 2+) that provides
implementations of various (approximate) inference methods for discrete
graphical models. libDAI supports arbitrary factor graphs with discrete
variables; this includes discrete Markov Random Fields and Bayesian Networks.

The library is targeted at researchers. To be able to use the library, a good
understanding of graphical models is needed.
The best way to use libDAI is by writing C++ code that invokes the library; in
addition, part of the functionality is accessible through the

* command line interface
* (limited) MatLab interface
* (experimental) python interface
* (experimental) octave interface.

libDAI can be used to implement novel (approximate) inference algorithms and to
easily compare their accuracy and performance with algorithms that have already
been implemented.
A solver using libDAI was amongst the three winners of the UAI 2010 Approximate
Inference Challenge (see http://www.cs.huji.ac.il/project/UAI10/ for more
information). The full source code is provided as part of the library.
Currently, libDAI supports the following (approximate) inference methods:

* Exact inference by brute force enumeration;
* Exact inference by junction-tree methods;
* Loopy Belief Propagation [KFL01];
* Fractional Belief Propagation [WiH03];
* Tree-Reweighted Belief Propagation [WJW03];
* Tree Expectation Propagation [MiQ04];
* Generalized Belief Propagation [YFW05];
* Double-loop GBP [HAK03];
* Various variants of Loop Corrected Belief Propagation [MoK07, MoR05];
* Conditioned Belief Propagation [EaG09];
* Decimation algorithm.
These inference methods can be used to calculate partition sums, marginals over
subsets of variables, and MAP states (the joint state of variables that has
maximum probability).
In addition, libDAI supports parameter learning of conditional probability
tables by Expectation Maximization.
libDAI is not intended to be a complete package for approximate inference.
Instead, it should be considered as an "inference engine", providing various
inference methods. In particular, it contains no GUI, currently only supports
its own file format for input and output (although support for standard file
formats may be added later), and provides very limited visualization
functionalities. The only learning method currently supported is Expectation
Maximization (or Maximum Likelihood if no data is missing) for learning factor
parameters.
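For orientation, libDAI's own file format (the .fg format used by the example programs below) lists the number of factors, followed by one block per factor giving the number of variables, their labels, their cardinalities, the number of nonzero table entries, and then index/value pairs. The hand-written sketch below, for a toy model with one unary and one pairwise factor over two binary variables, is reconstructed from memory; consult the libDAI documentation for the authoritative specification:

```
2

1
0
2
2
0     0.3
1     0.7

2
0 1
2 2
4
0     0.2
1     0.8
2     0.8
3     0.2
```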
In my opinion, the lack of open source "reference" implementations hampers
progress in research on approximate inference. Methods differ widely in terms
of quality and performance characteristics, which also depend in different ways
on various properties of the graphical models. Finding the best approximate
inference method for a particular application therefore often requires
empirical comparisons. However, implementing and debugging these methods takes
a lot of time which could otherwise be spent on research. I hope that this code
will help researchers to easily compare various (existing as well as new)
approximate inference methods, in this way accelerating research and
stimulating real-world applications of approximate inference.
Because libDAI is implemented in C++, it is very fast compared with
implementations in MatLab (a factor 1000 faster is not uncommon). libDAI does
provide a (limited) MatLab interface for easy integration with MatLab. It also
provides a command line interface and experimental python and octave interfaces
(thanks to Patrick Pletscher).
The code has been developed under Debian GNU/Linux with the GCC compiler suite.
libDAI compiles successfully with g++ versions 3.4 up to 4.4.

libDAI has also been successfully compiled with MS Visual Studio 2008 under
Windows (but not all build targets are supported yet) and with Cygwin under
Windows.

Finally, libDAI has been compiled successfully on MacOS X.
The libDAI sources and documentation can be downloaded from the libDAI website:
http://www.libdai.org.

The Google group "libDAI" (http://groups.google.com/group/libdai) can be used
for getting support and discussing development issues.

-------------------------------------------------------------------------------
Building libDAI under UNIX variants (Linux / Cygwin / Mac OS X)

You need:

* a recent version of gcc (at least version 3.4)
* GNU make
* recent boost C++ libraries (at least version 1.37; however, version 1.37
  shipped with Ubuntu 9.04 is known not to work)
* doxygen (only for building the documentation)
* graphviz (only for using some of the libDAI command line utilities)
* CImg library (only for building the image segmentation example)
On Debian/Ubuntu, you can easily install the required packages with a single
command:

apt-get install g++ make doxygen graphviz libboost-dev libboost-graph-dev libboost-program-options-dev libboost-test-dev cimg-dev

(root permissions needed).
On Mac OS X (10.4 is known to work), these packages can be installed easily via
MacPorts. If MacPorts is not already installed, install it according to the
instructions at http://www.macports.org/. Then, a simple

sudo port install gmake boost doxygen graphviz

should be enough to install everything that is needed.
On Cygwin, the prebuilt Cygwin package boost-1.33.1-x is known not to work. You
can however obtain the latest boost version (you need at least 1.37.0) from
http://www.boost.org/ and build it as described in the next subsection.
Building boost under Cygwin

* Download the latest boost libraries from http://www.boost.org
* Build the required boost libraries using:

  ./bootstrap.sh --with-libraries=program_options,math,graph,test --prefix=/boost_root/

* In order to use dynamic linking, the boost .dll's should be somewhere in
  the path. This can be achieved by a command like:

  export PATH=$PATH:/boost_root/stage/lib
To build the libDAI source, first copy a template Makefile.* to Makefile.conf
(for example, copy Makefile.LINUX to Makefile.conf if you use GNU/Linux). Then,
edit the Makefile.conf template to adapt it to your local setup. In case you
want to use Boost libraries which are installed in non-standard locations, you
have to tell the compiler and linker about their locations (using the -I and -L
flags for GCC; also, you may need to set the LD_LIBRARY_PATH environment
variable correctly before running libDAI binaries). Platform independent build
options can be set in Makefile.ALL. Finally, run

make

The build includes a regression test, which may take a while to complete.
If the build is successful, you can test the example program:

examples/example tests/alarm.fg

or the more extensive test program:

tests/testdai --aliases tests/aliases.conf --filename tests/alarm.fg --methods JTREE_HUGIN BP_SEQMAX

-------------------------------------------------------------------------------
Building libDAI under Windows

You need:

* a recent version of Microsoft Visual Studio (2008 is known to work)
* recent boost C++ libraries (version 1.37 or higher)
* GNU make (can be obtained from http://gnuwin32.sourceforge.net)
* CImg library (only for building the image segmentation example)

For the regression test, you need:

* GNU diff and GNU sed (can be obtained from http://gnuwin32.sourceforge.net)
Building boost under Windows

Because building boost under Windows is tricky, I provide some guidance here.

* Download the boost zip file from http://www.boost.org/users/download and
  unpack it somewhere.
* Download the bjam executable from http://www.boost.org/users/download and
  unpack it somewhere else.
* Download Boost.Build (v2) from http://www.boost.org/docs/tools/build/index.html
  and unpack it yet somewhere else.
* Edit the file boost-build.jam in the main boost directory to change the
  BOOST_BUILD directory to the place where you put Boost.Build (use UNIX /
  instead of Windows \ in pathnames).
* Copy the bjam.exe executable into the main boost directory. Now if you
  issue "bjam --version" you should get a version and no errors. Issuing
  "bjam --show-libraries" will show the libraries that will be built.
* The following command builds the boost libraries that are relevant for
  libDAI:

  bjam --with-graph --with-math --with-program_options --with-test link=static runtime-link=shared
To build the source, copy Makefile.WINDOWS to Makefile.conf. Then, edit
Makefile.conf to adapt it to your local setup. Platform independent build
options can be set in Makefile.ALL. Finally, run (from the command line)

make

The build includes a regression test, which may take a while to complete.

If the build is successful, you can test the example program:

examples\example tests\alarm.fg

or the more extensive test program:

tests\testdai --aliases tests\aliases.conf --filename tests\alarm.fg --methods JTREE_HUGIN BP_SEQMAX

-------------------------------------------------------------------------------
Building the libDAI MatLab interface

You need:

* MatLab
* the platform-dependent requirements described above

First, you need to build the libDAI source as described above for your
platform. By default, the MatLab interface is disabled; before compiling the
source, you have to enable it in Makefile.ALL by setting

WITH_MATLAB=true

Also, you have to configure the MatLab-specific parts of Makefile.conf to match
your system (in particular, the Makefile variables ME, MATLABDIR and MEX). The
MEX file extension depends on your platform; for a 64-bit Linux x86_64 system
this would be "ME=.mexa64", for a 32-bit Linux x86 system "ME=.mexglx". If you
are unsure about your MEX file extension: it needs to be the same as what the
MatLab command "mexext" returns. The required MEX files are built by issuing

make
from the command line. The MatLab interface is much less powerful than using
libDAI from C++. There are two reasons for this: (i) it is boring to write MEX
files; (ii) there is a large performance penalty when large data structures
(like factor graphs) have to be converted from their native C++ data structures
into something that MatLab understands.
A simple example of how to use the MatLab interface is the following (entered
at the MatLab prompt), which performs exact inference by the junction tree
algorithm and approximate inference by belief propagation on the ALARM network:

cd path_to_libdai/matlab
[psi] = dai_readfg ('../tests/alarm.fg');
[logZ,q,md,qv,qf] = dai (psi, 'JTREE', '[updates=HUGIN,verbose=0]')
[logZ,q,md,qv,qf] = dai (psi, 'BP', '[updates=SEQMAX,tol=1e-9,maxiter=10000,logdomain=0]')

where "path_to_libdai" has to be replaced with the directory in which libDAI
was installed. For other algorithms and some default parameters, see the file
tests/aliases.conf.

-------------------------------------------------------------------------------
Building the documentation

Install doxygen, graphviz and a TeX distribution and use

make doc

to build the documentation. If the documentation is not clear enough, feel free
to send me an email (or even better, to improve the documentation and send a
patch!). The documentation can also be browsed online at http://www.libdai.org.