libDAI - A free/open source C++ library for Discrete Approximate Inference

-------------------------------------------------------------------------------

Version: git HEAD
Date: July 7, 2011
See also: http://www.libdai.org

-------------------------------------------------------------------------------

License

libDAI is free software; you can redistribute it and/or modify it under the
terms of the BSD 2-clause license (also known as the FreeBSD license), which
can be found in the accompanying LICENSE file.

[Note: up to and including version 0.2.7, libDAI was licensed under the GNU
General Public License (GPL) version 2 or higher.]


-------------------------------------------------------------------------------

Citing libDAI

If you write a scientific paper describing research that made substantive use
of this library, please cite the following paper describing libDAI:

Joris M. Mooij;
libDAI: A free & open source C++ library for Discrete Approximate Inference in
graphical models;
Journal of Machine Learning Research, 11(Aug):2169-2173, 2010.

In BibTeX format (for your convenience):

@article{Mooij_libDAI_10,
  author  = {Joris M. Mooij},
  title   = {lib{DAI}: A Free and Open Source {C++} Library for Discrete Approximate Inference in Graphical Models},
  journal = {Journal of Machine Learning Research},
  year    = 2010,
  month   = Aug,
  volume  = 11,
  pages   = {2169-2173},
  url     = "http://www.jmlr.org/papers/volume11/mooij10a/mooij10a.pdf"
}

Moreover, as a personal note, I would appreciate being informed about any
publications using libDAI at joris dot mooij at libdai dot org.

-------------------------------------------------------------------------------

About libDAI

libDAI is a free/open source C++ library that provides implementations of
various (approximate) inference methods for discrete graphical models. libDAI
supports arbitrary factor graphs with discrete variables; this includes
discrete Markov Random Fields and Bayesian Networks.

The library is targeted at researchers. To be able to use the library, a good
understanding of graphical models is needed.

The best way to use libDAI is by writing C++ code that invokes the library (a
minimal sketch follows the list below); in addition, part of the functionality
is accessible by using the

* command line interface
* (limited) MatLab interface
* (experimental) python interface
* (experimental) octave interface.
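
For illustration, here is a minimal C++ sketch in the spirit of the included
examples/example.cpp: it reads a factor graph from a file in libDAI's own .fg
format and runs loopy belief propagation on it. The option values below are
just reasonable defaults, not prescriptions.

  #include <iostream>
  #include <dai/alldai.h>  // main libDAI header

  using namespace std;
  using namespace dai;

  int main( int argc, char *argv[] ) {
      if( argc != 2 ) {
          cout << "Usage: " << argv[0] << " <filename.fg>" << endl;
          return 1;
      }

      // Read a factor graph from file (libDAI's own file format)
      FactorGraph fg;
      fg.ReadFromFile( argv[1] );

      // Options for belief propagation
      PropertySet opts;
      opts.set( "maxiter", (size_t)10000 );     // maximum number of iterations
      opts.set( "tol", Real(1e-9) );            // convergence tolerance
      opts.set( "verbose", (size_t)0 );
      opts.set( "updates", string("SEQRND") );  // random sequential update schedule
      opts.set( "logdomain", false );

      // Construct, initialize and run the BP object
      BP bp( fg, opts );
      bp.init();
      bp.run();

      // Report the approximate log partition sum and single-variable marginals
      cout << "Approximate log partition sum: " << bp.logZ() << endl;
      for( size_t i = 0; i < fg.nrVars(); i++ )
          cout << bp.belief( fg.var(i) ) << endl;

      return 0;
  }

Such a program is built in the same way as the programs in the examples/
directory (see the build instructions below).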

libDAI can be used to implement novel (approximate) inference algorithms and to
easily compare their accuracy and performance with the algorithms that are
already implemented.

A solver using libDAI was amongst the three winners of the UAI 2010 Approximate
Inference Challenge (see http://www.cs.huji.ac.il/project/UAI10/ for more
information); its full source code is included in the library.

Features

Currently, libDAI supports the following (approximate) inference methods:

* Exact inference by brute force enumeration;
* Exact inference by junction-tree methods;
* Mean Field;
* Loopy Belief Propagation [KFL01];
* Fractional Belief Propagation [WiH03];
* Tree-Reweighted Belief Propagation [WJW03];
* Tree Expectation Propagation [MiQ04];
* Generalized Belief Propagation [YFW05];
* Double-loop GBP [HAK03];
* Various variants of Loop Corrected Belief Propagation [MoK07, MoR05];
* Gibbs sampler;
* Conditioned Belief Propagation [EaG09];
* Decimation algorithm.

These inference methods can be used to calculate partition sums, marginals over
subsets of variables, and MAP states (the joint state of variables that has
maximum probability).
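
As a sketch of how these quantities can be queried from C++ (following the
pattern of the included examples/example.cpp; here fg is a FactorGraph that
has already been read from file, as in the earlier sketch):

  // Exact inference with the junction tree algorithm
  PropertySet opts;
  opts.set( "verbose", (size_t)0 );

  JTree jt( fg, opts("updates",string("HUGIN")) );
  jt.init();
  jt.run();
  cout << "Exact log partition sum: " << jt.logZ() << endl;
  cout << "Exact marginal of first variable: " << jt.belief( fg.var(0) ) << endl;

  // A second junction tree, configured for MAP inference
  JTree jtmap( fg, opts("updates",string("HUGIN"))("inference",string("MAXPROD")) );
  jtmap.init();
  jtmap.run();

  // The joint state of all variables that has maximum probability
  vector<size_t> mapstate = jtmap.findMaximum();
  for( size_t i = 0; i < fg.nrVars(); i++ )
      cout << fg.var(i) << ": " << mapstate[i] << endl;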

In addition, libDAI supports parameter learning of conditional probability
tables by Expectation Maximization.
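
A sketch of the EM workflow, modeled on the sprinkler EM example included with
libDAI (examples/example_sprinkler_em.cpp); the file names evidence.tab and
parameters.em are placeholders for an evidence file in libDAI's tabular format
and for a file specifying the maximization steps, and fg is a FactorGraph as
before:

  #include <fstream>
  #include <dai/evidence.h>
  #include <dai/emalg.h>

  // ...

  // Observed data for the EM algorithm
  Evidence e;
  ifstream estream( "evidence.tab" );
  e.addEvidenceTabFile( estream, fg );

  // Inference algorithm used for the E-step
  PropertySet infprops;
  infprops.set( "verbose", (size_t)0 );
  infprops.set( "updates", string("HUGIN") );
  InfAlg *inf = newInfAlg( "JTREE", fg, infprops );
  inf->init();

  // Description of the maximization steps
  ifstream emstream( "parameters.em" );
  EMAlg em( e, *inf, emstream );

  // Iterate until the termination conditions are satisfied
  while( !em.hasSatisfiedTermConditions() ) {
      Real likelihood = em.iterate();
      cout << "Iteration " << em.Iterations()
           << " likelihood: " << likelihood << endl;
  }
  delete inf;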

Limitations

libDAI is not intended to be a complete package for approximate inference.
Instead, it should be considered as an "inference engine", providing various
inference methods. In particular, it contains no GUI, currently only supports
its own file format for input and output (although support for standard file
formats may be added later), and provides very limited visualization
functionality. The only learning method currently supported is Expectation
Maximization (or Maximum Likelihood if no data is missing) for learning factor
parameters.

Rationale

In my opinion, the lack of open source "reference" implementations hampers
progress in research on approximate inference. Methods differ widely in terms
of quality and performance characteristics, which also depend in different ways
on various properties of the graphical models. Finding the best approximate
inference method for a particular application therefore often requires
empirical comparisons. However, implementing and debugging these methods takes
a lot of time which could otherwise be spent on research. I hope that this code
will help researchers to easily compare various (existing as well as new)
approximate inference methods, thereby accelerating research and stimulating
real-world applications of approximate inference.

Language

Because libDAI is implemented in C++, it is very fast compared with
implementations in MatLab (a factor 1000 faster is not uncommon). libDAI does
provide a (limited) MatLab interface for easy integration with MatLab. It also
provides a command line interface and experimental python and octave interfaces
(thanks to Patrick Pletscher).

Compatibility

The code has been developed under Debian GNU/Linux with the GCC compiler suite.
libDAI compiles successfully with g++ versions 3.4 up to 4.4.

libDAI has also been successfully compiled with MS Visual Studio 2008 under
Windows (but not all build targets are supported yet) and with Cygwin under
Windows.

Finally, libDAI has been compiled successfully on Mac OS X.

Downloading libDAI

The libDAI sources and documentation can be downloaded from the libDAI website:
http://www.libdai.org.

Mailing list

The Google group "libDAI" (http://groups.google.com/group/libdai) can be used
for getting support and discussing development issues.

-------------------------------------------------------------------------------

Building libDAI under UNIX variants (Linux / Cygwin / Mac OS X)

Preparations

You need:

* a recent version of gcc (at least version 3.4)
* GNU make
* recent boost C++ libraries (at least version 1.37; however, version 1.37
  shipped with Ubuntu 9.04 is known not to work)
* doxygen (only for building the documentation)
* graphviz (only for using some of the libDAI command line utilities)
* CImg library (only for building the image segmentation example)

On Debian/Ubuntu, you can easily install the required packages with a single
command:

  apt-get install g++ make doxygen graphviz libboost-dev libboost-graph-dev libboost-program-options-dev libboost-test-dev cimg-dev

(root permissions needed).

On Mac OS X (10.4 is known to work), these packages can be installed easily via
MacPorts. If MacPorts is not already installed, install it according to the
instructions at http://www.macports.org/. Then, a simple

  sudo port install gmake boost doxygen graphviz

should be enough to install everything that is needed.

On Cygwin, the prebuilt Cygwin package boost-1.33.1-x is known not to work. You
can however obtain the latest boost version (you need at least 1.37.0) from
http://www.boost.org/ and build it as described in the next subsection.

Building boost under Cygwin

* Download the latest boost libraries from http://www.boost.org
* Build the required boost libraries using:

    ./bootstrap.sh --with-libraries=program_options,math,graph,test --prefix=/boost_root/
    ./bjam

* In order to use dynamic linking, the boost .dll's should be somewhere in
  the path. This can be achieved by a command like:

    export PATH=$PATH:/boost_root/stage/lib

Building libDAI

To build the libDAI source, first copy a template Makefile.* to Makefile.conf
(for example, copy Makefile.LINUX to Makefile.conf if you use GNU/Linux). Then,
edit Makefile.conf to adapt it to your local setup. In case you want to use
Boost libraries which are installed in non-standard locations, you have to tell
the compiler and linker about their locations (using the -I and -L flags for
GCC; also, you may need to set the LD_LIBRARY_PATH environment variable
correctly before running libDAI binaries). Platform independent build options
can be set in Makefile.ALL. Finally, run

  make

The build includes a regression test, which may take a while to complete.

If the build is successful, you can test the example program:

  examples/example tests/alarm.fg

or the more extensive test program:

  tests/testdai --aliases tests/aliases.conf --filename tests/alarm.fg --methods JTREE_HUGIN BP_SEQMAX

-------------------------------------------------------------------------------

Building libDAI under Windows

Preparations

You need:

* A recent version of Microsoft Visual Studio (2008 is known to work)
* recent boost C++ libraries (version 1.37 or higher)
* GNU make (can be obtained from http://gnuwin32.sourceforge.net)
* CImg library (only for building the image segmentation example)

For the regression test, you need:

* GNU diff, GNU sed (can be obtained from http://gnuwin32.sourceforge.net)

Building boost under Windows

Because building boost under Windows is tricky, I provide some guidance here.

* Download the boost zip file from http://www.boost.org/users/download and
  unpack it somewhere.
* Download the bjam executable from http://www.boost.org/users/download and
  unpack it somewhere else.
* Download Boost.Build (v2) from http://www.boost.org/docs/tools/build/index.html
  and unpack it in yet another location.
* Edit the file boost-build.jam in the main boost directory to change the
  BOOST_BUILD directory to the place where you put Boost.Build (use UNIX /
  instead of Windows \ in pathnames).
* Copy the bjam.exe executable into the main boost directory. Now if you
  issue "bjam --version" you should get a version and no errors. Issuing
  "bjam --show-libraries" will show the libraries that will be built.
* The following command builds the boost libraries that are relevant for
  libDAI:

    bjam --with-graph --with-math --with-program_options --with-test link=static runtime-link=shared

Building libDAI

To build the source, copy Makefile.WINDOWS to Makefile.conf. Then, edit
Makefile.conf to adapt it to your local setup. Platform independent build
options can be set in Makefile.ALL. Finally, run (from the command line)

  make

The build includes a regression test, which may take a while to complete.

If the build is successful, you can test the example program:

  examples\example tests\alarm.fg

or the more extensive test program:

  tests\testdai --aliases tests\aliases.conf --filename tests\alarm.fg --methods JTREE_HUGIN BP_SEQMAX

-------------------------------------------------------------------------------

Building the libDAI MatLab interface

You need:

* MatLab
* The platform-dependent requirements described above

First, you need to build the libDAI source as described above for your
platform. By default, the MatLab interface is disabled, so before compiling the
source, you have to enable it in Makefile.ALL by setting

  WITH_MATLAB=true

Also, you have to configure the MatLab-specific parts of Makefile.conf to match
your system (in particular, the Makefile variables ME, MATLABDIR and MEX). The
MEX file extension depends on your platform; for a 64-bit Linux x86_64 system
this would be "ME=.mexa64", for a 32-bit Linux x86 system "ME=.mexglx". If you
are unsure about your MEX file extension: it needs to be the same as what the
MatLab command "mexext" returns. The required MEX files are built by issuing

  make

from the command line. The MatLab interface is much less powerful than using
libDAI from C++. There are two reasons for this: (i) it is tedious to write MEX
files; (ii) a large performance penalty is paid when large data structures
(like factor graphs) have to be converted from their native C++ representation
into something that MatLab understands.

A simple example of how to use the MatLab interface is the following (entered
at the MatLab prompt), which performs exact inference by the junction tree
algorithm and approximate inference by belief propagation on the ALARM network:

  cd path_to_libdai/matlab
  [psi] = dai_readfg ('../tests/alarm.fg');
  [logZ,q,md,qv,qf] = dai (psi, 'JTREE', '[updates=HUGIN,verbose=0]')
  [logZ,q,md,qv,qf] = dai (psi, 'BP', '[updates=SEQMAX,tol=1e-9,maxiter=10000,logdomain=0]')

where "path_to_libdai" has to be replaced with the directory in which libDAI
was installed. For other algorithms and some default parameters, see the file
tests/aliases.conf.

-------------------------------------------------------------------------------

Building the documentation

Install doxygen, graphviz and a TeX distribution and use

  make doc

to build the documentation. If the documentation is not clear enough, feel free
to send me an email (or even better, to improve the documentation and send a
patch!). The documentation can also be browsed online at http://www.libdai.org.
336