libDAI - A free/open source C++ library for Discrete Approximate Inference

-------------------------------------------------------------------------------

Version:  git master
Date:     August 11, 2010 (or later)
See also: http://www.libdai.org

-------------------------------------------------------------------------------

License

libDAI is free software; you can redistribute it and/or modify it under the
terms of the GNU General Public License as published by the Free Software
Foundation; either version 2 of the License, or (at your option) any later
version.

libDAI is distributed in the hope that it will be useful, but WITHOUT ANY
WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with libDAI in the file COPYING. If not, see http://www.gnu.org/licenses/

-------------------------------------------------------------------------------

Citing libDAI

If you write a scientific paper describing research that made substantive use
of this program, please cite the software appropriately, mentioning how it was
used and including the version number.

An appropriate citation would be:

Joris M. Mooij et al. (2010) "libDAI 0.2.6: A free/open source C++ library for
Discrete Approximate Inference", http://www.libdai.org

or in BibTeX format:

@misc{mooij2010libdai,
  author = "Joris M. Mooij et al.",
  title = "lib{DAI} 0.2.6: A free/open source {C}++ library for {D}iscrete {A}pproximate {I}nference",
  howpublished = "http://www.libdai.org/",
  year = 2010
}

Moreover, as a personal note, I would appreciate being informed about any
publications using libDAI, at joris dot mooij at libdai dot org.

-------------------------------------------------------------------------------

About libDAI

libDAI is a free/open source C++ library (licensed under GPL 2+) that provides
implementations of various (approximate) inference methods for discrete
graphical models. libDAI supports arbitrary factor graphs with discrete
variables; this includes discrete Markov Random Fields and Bayesian Networks.

The library is targeted at researchers. To be able to use the library, a good
understanding of graphical models is needed.

The best way to use libDAI is by writing C++ code that invokes the library (a
minimal example is sketched below); in addition, part of the functionality is
accessible by using the

* command line interface
* (limited) MatLab interface
* (experimental) Python interface
* (experimental) Octave interface.

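As a rough illustration, the following minimal C++ sketch loads a factor graph
from a file (here the ALARM network shipped with libDAI in tests/alarm.fg) and
runs loopy Belief Propagation on it. The property values chosen here are merely
illustrative; the programs in the examples/ directory and the documentation are
the authoritative reference.

    #include <iostream>
    #include <string>
    #include <dai/alldai.h>

    int main() {
        // Load a factor graph from a file in libDAI's own .fg format
        dai::FactorGraph fg;
        fg.ReadFromFile( "tests/alarm.fg" );

        // Set some properties for Belief Propagation (illustrative values)
        dai::PropertySet opts;
        opts.set( "maxiter", (size_t)10000 );  // maximum number of iterations
        opts.set( "tol", dai::Real(1e-9) );    // convergence tolerance
        opts.set( "verbose", (size_t)0 );      // verbosity level

        // Construct and run loopy Belief Propagation
        dai::BP bp( fg, opts("updates", std::string("SEQRND"))("logdomain", false) );
        bp.init();
        bp.run();

        // Query the approximate log partition sum and a single-variable marginal
        std::cout << "Approximate log partition sum: " << bp.logZ() << std::endl;
        std::cout << "Belief of variable 0: " << bp.belief( fg.var(0) ) << std::endl;

        return 0;
    }
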
libDAI can be used to implement novel (approximate) inference algorithms and to
easily compare their accuracy and performance with existing algorithms that are
already implemented.

A solver using libDAI was amongst the three winners of the UAI 2010 Approximate
Inference Challenge (see http://www.cs.huji.ac.il/project/UAI10/ for more
information). The full source code is provided as part of the library.

Features

Currently, libDAI supports the following (approximate) inference methods:

* Exact inference by brute force enumeration;
* Exact inference by junction-tree methods;
* Mean Field;
* Loopy Belief Propagation [KFL01];
* Fractional Belief Propagation [WiH03];
* Tree-Reweighted Belief Propagation [WJW03];
* Tree Expectation Propagation [MiQ04];
* Generalized Belief Propagation [YFW05];
* Double-loop GBP [HAK03];
* Various variants of Loop Corrected Belief Propagation [MoK07, MoR05];
* Gibbs sampler;
* Conditioned Belief Propagation [EaG09];
* Decimation algorithm.

These inference methods can be used to calculate partition sums, marginals over
subsets of variables, and MAP states (the joint state of variables that has
maximum probability).

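For example, with the exact junction-tree algorithm these three kinds of
queries could look roughly as follows in C++. This is a sketch only: the
property values are illustrative, and tests/alarm.fg again refers to the ALARM
network shipped with libDAI.

    #include <iostream>
    #include <string>
    #include <vector>
    #include <dai/alldai.h>

    int main() {
        // Load a factor graph (here again the ALARM network)
        dai::FactorGraph fg;
        fg.ReadFromFile( "tests/alarm.fg" );
        dai::PropertySet opts;

        // Exact sum-product inference: partition sum and a marginal
        dai::JTree jt( fg, opts("updates", std::string("HUGIN")) );
        jt.init();
        jt.run();
        std::cout << "Log partition sum: " << jt.logZ() << std::endl;
        std::cout << "Marginal of variable 0: " << jt.belief( fg.var(0) ) << std::endl;

        // Exact max-product inference: the joint state with maximum probability
        dai::JTree jtmap( fg, opts("updates", std::string("HUGIN"))
                              ("inference", std::string("MAXPROD")) );
        jtmap.init();
        jtmap.run();
        std::vector<size_t> mapState = jtmap.findMaximum();
        // mapState[i] is the state of variable fg.var(i) in the MAP state
        std::cout << "MAP state of variable 0: " << mapState[0] << std::endl;

        return 0;
    }
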
In addition, libDAI supports parameter learning of conditional probability
tables by Expectation Maximization.

Limitations

libDAI is not intended to be a complete package for approximate inference.
Instead, it should be considered an "inference engine" that provides various
inference methods. In particular, it contains no GUI, currently only supports
its own file format for input and output (although support for standard file
formats may be added later), and provides very limited visualization
functionality. The only learning method currently supported is Expectation
Maximization (or Maximum Likelihood if no data is missing) for learning factor
parameters.

Rationale

In my opinion, the lack of open source "reference" implementations hampers
progress in research on approximate inference. Methods differ widely in terms
of quality and performance characteristics, which also depend in different ways
on various properties of the graphical models. Finding the best approximate
inference method for a particular application therefore often requires
empirical comparisons. However, implementing and debugging these methods takes
a lot of time which could otherwise be spent on research. I hope that this code
will help researchers to easily compare various (existing as well as new)
approximate inference methods, thereby accelerating research and stimulating
real-world applications of approximate inference.

Language

Because libDAI is implemented in C++, it is very fast compared with
implementations in MatLab (a factor 1000 faster is not uncommon). libDAI does
provide a (limited) MatLab interface for easy integration with MatLab. It also
provides a command line interface and experimental Python and Octave interfaces
(thanks to Patrick Pletscher).

Compatibility

The code has been developed under Debian GNU/Linux with the GCC compiler suite.
libDAI compiles successfully with g++ versions 3.4 up to 4.4.

libDAI has also been successfully compiled with MS Visual Studio 2008 under
Windows (but not all build targets are supported yet) and with Cygwin under
Windows.

Finally, libDAI has been compiled successfully on Mac OS X.

Downloading libDAI

The libDAI sources and documentation can be downloaded from the libDAI website:
http://www.libdai.org.

Mailing list

The Google group "libDAI" (http://groups.google.com/group/libdai) can be used
for getting support and discussing development issues.

-------------------------------------------------------------------------------

Building libDAI under UNIX variants (Linux / Cygwin / Mac OS X)

Preparations

You need:

* a recent version of gcc (at least version 3.4)
* GNU make
* recent boost C++ libraries (at least version 1.37; however, version 1.37
  shipped with Ubuntu 9.04 is known not to work)
* doxygen (only for building the documentation)
* graphviz (only for using some of the libDAI command line utilities)
* CImg library (only for building the image segmentation example)

On Debian/Ubuntu, you can easily install the required packages with a single
command:

apt-get install g++ make doxygen graphviz libboost-dev libboost-graph-dev libboost-program-options-dev libboost-test-dev cimg-dev

(root permissions needed).

On Mac OS X (10.4 is known to work), these packages can be installed easily via
MacPorts. If MacPorts is not already installed, install it according to the
instructions at http://www.macports.org/. Then, a simple

sudo port install gmake boost doxygen graphviz

should be enough to install everything that is needed.

On Cygwin, the prebuilt Cygwin package boost-1.33.1-x is known not to work. You
can however obtain the latest boost version (you need at least 1.37.0) from
http://www.boost.org/ and build it as described in the next subsection.

Building boost under Cygwin

* Download the latest boost libraries from http://www.boost.org
* Build the required boost libraries using:

./bootstrap.sh --with-libraries=program_options,math,graph,test --prefix=/boost_root/
./bjam

* In order to use dynamic linking, the boost .dll's should be somewhere in
  the path. This can be achieved by a command like:

export PATH=$PATH:/boost_root/stage/lib

Building libDAI

To build the libDAI source, first copy a template Makefile.* to Makefile.conf
(for example, copy Makefile.LINUX to Makefile.conf if you use GNU/Linux). Then,
edit Makefile.conf to adapt it to your local setup; in particular, directories
may differ from system to system. Platform-independent build options can be set
in Makefile.ALL. Finally, run

make

The build includes a regression test, which may take a while to complete.

If the build is successful, you can test the example program:

examples/example tests/alarm.fg

or the more extensive test program:

tests/testdai --aliases tests/aliases.conf --filename tests/alarm.fg --methods JTREE_HUGIN BP_SEQMAX

-------------------------------------------------------------------------------

Building libDAI under Windows

Preparations

You need:

* A recent version of Microsoft Visual Studio (2008 is known to work)
* recent boost C++ libraries (version 1.37 or higher)
* GNU make (can be obtained from http://gnuwin32.sourceforge.net)
* CImg library (only for building the image segmentation example)

For the regression test, you need:

* GNU diff, GNU sed (can be obtained from http://gnuwin32.sourceforge.net)

Building boost under Windows

Because building boost under Windows is tricky, I provide some guidance here.

* Download the boost zip file from http://www.boost.org/users/download and
  unpack it somewhere.
* Download the bjam executable from http://www.boost.org/users/download and
  unpack it somewhere else.
* Download Boost.Build (v2) from
  http://www.boost.org/docs/tools/build/index.html and unpack it in yet
  another location.
* Edit the file boost-build.jam in the main boost directory to change the
  BOOST_BUILD directory to the place where you put Boost.Build (use UNIX /
  instead of Windows \ in pathnames).
* Copy the bjam.exe executable into the main boost directory. Now if you
  issue "bjam --version" you should get a version and no errors. Issuing
  "bjam --show-libraries" will show the libraries that will be built.
* The following command builds the boost libraries that are relevant for
  libDAI:

bjam --with-graph --with-math --with-program_options --with-test link=static runtime-link=shared

Building libDAI

To build the source, copy Makefile.WINDOWS to Makefile.conf. Then, edit
Makefile.conf to adapt it to your local setup. Platform-independent build
options can be set in Makefile.ALL. Finally, run (from the command line)

make

The build includes a regression test, which may take a while to complete.

If the build is successful, you can test the example program:

examples\example tests\alarm.fg

or the more extensive test program:

tests\testdai --aliases tests\aliases.conf --filename tests\alarm.fg --methods JTREE_HUGIN BP_SEQMAX

-------------------------------------------------------------------------------

Building the libDAI MatLab interface

You need:

* MatLab
* The platform-dependent requirements described above

First, you need to build the libDAI source as described above for your
platform. By default, the MatLab interface is disabled, so before compiling the
source, you have to enable it in Makefile.ALL by setting

WITH_MATLAB=true

Also, you have to configure the MatLab-specific parts of Makefile.conf to match
your system (in particular, the Makefile variables ME, MATLABDIR and MEX). The
MEX file extension depends on your platform; for a 64-bit Linux x86_64 system
this would be "ME=.mexa64", for a 32-bit Linux x86 system "ME=.mexglx". If you
are unsure about your MEX file extension, it needs to be the same as what the
MatLab command "mexext" returns. The required MEX files are built by issuing

make

from the command line. The MatLab interface is much less powerful than using
libDAI from C++. There are two reasons for this: (i) it is boring to write MEX
files; (ii) a large performance penalty is paid when large data structures
(like factor graphs) have to be converted from their native C++ representation
into something that MatLab understands.

A simple example of how to use the MatLab interface is the following (entered
at the MatLab prompt), which performs exact inference by the junction tree
algorithm and approximate inference by belief propagation on the ALARM network:

cd path_to_libdai/matlab
[psi] = dai_readfg ('../tests/alarm.fg');
[logZ,q,md,qv,qf] = dai (psi, 'JTREE', '[updates=HUGIN,verbose=0]')
[logZ,q,md,qv,qf] = dai (psi, 'BP', '[updates=SEQMAX,tol=1e-9,maxiter=10000,logdomain=0]')

where "path_to_libdai" has to be replaced with the directory in which libDAI
was installed. For other algorithms and some default parameters, see the file
tests/aliases.conf.

-------------------------------------------------------------------------------

Building the documentation

Install doxygen, graphviz and a TeX distribution and use

make doc

to build the documentation. If the documentation is not clear enough, feel free
to send me an email (or even better, to improve the documentation and send a
patch!). The documentation can also be browsed online at http://www.libdai.org.