SWIPL-LBFGS is an interface to call libLBFGS from within SWI-Prolog. libLBFGS is a C library implementing the Limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) method, which solves the unconstrained minimization problem:

minimize F(X), X=(x1,x2,..., xN)

SWIPL-LBFGS is a port of YAP-LBFGS. YAP-LBFGS was developed by Bernd Gutmann. If you publish something using YAP-LBFGS, please give credit to Bernd and to libLBFGS. And if you find YAP-LBFGS useful, if you find a bug, or if you port it to another system, please send Bernd an email.

YAP-LBFGS was ported to SWI-Prolog by Fabrizio Riguzzi.

To install the pack, use pack_install(lbfgs).

YAP-LBFGS is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

YAP-LBFGS is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

The module lbfgs provides the following predicates after you have loaded it with :- use_module(lbfgs).

optimizer_initialize(+N,+Module,+Evaluate,+Progress) allocates space to optimize a function with N variables (N must be an integer). Module is the name of the module where the callback predicates can be found, Evaluate is the callback predicate (arity 3) that evaluates the function F, and Progress is the callback predicate (arity 8) invoked after every iteration.

Example: optimizer_initialize(1,user,evaluate,progress)

The Evaluate callback predicate must have the signature evaluate(-F,+N,+Step). It has to calculate the current function value F. N is the size of the parameter vector (the value which was used to initialize LBFGS) and Step is the current state of the line search. The callback predicate can read the current values of x[I] by calling optimizer_get_x(+I,-Xi). Finally, the callback predicate has to calculate the gradient of F and set its value by calling optimizer_set_g(+I,+Gi) for every 0 <= I <= N-1 (indices are zero-based, as in the example below).
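As an illustration of setting several gradient components, here is a minimal sketch of an Evaluate callback for a hypothetical two-variable function f(x0,x1) = (x0-2)^2 + (x1+3)^2 (this function and its name are assumptions for the example, not part of the pack):

```
% Hypothetical evaluate callback for f(x0,x1) = (x0-2)^2 + (x1+3)^2,
% initialized with optimizer_initialize(2,evaluate2,progress).
evaluate2(FX,_N,_Step) :-
    optimizer_get_x(0,X0),
    optimizer_get_x(1,X1),
    FX is (X0-2)**2 + (X1+3)**2,
    G0 is 2*(X0-2),          % partial derivative with respect to x0
    G1 is 2*(X1+3),          % partial derivative with respect to x1
    optimizer_set_g(0,G0),
    optimizer_set_g(1,G1).
```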

The Progress callback predicate must have the signature progress(+F,+X_Norm,+G_Norm,+Step,+N,+Iteration,+LS,-Continue). It is called after every iteration. The callback predicate can read the current values of X and of the gradient by calling optimizer_get_x(+I,-Xi) and optimizer_get_g(+I,-Gi) respectively. However, it must not call the setter predicates for X or G; if it tries to do so, the optimizer terminates with an error. If Continue is bound to 0 (int), the optimization process continues for one more iteration; any other value terminates the optimization process.
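The Continue argument can be used to implement a custom stopping criterion. A sketch of a progress callback that stops after a fixed iteration budget (the budget of 20 is an arbitrary choice for illustration):

```
% Hypothetical progress callback: report the objective value and stop
% the optimizer after 20 iterations by binding Continue accordingly.
progress(FX,_X_Norm,_G_Norm,_Step,_N,Iteration,_Ls,Continue) :-
    format('iteration ~d: f=~4f~n',[Iteration,FX]),
    (   Iteration < 20
    ->  Continue = 0      % 0 tells the optimizer to keep going
    ;   Continue = 1      % any other value terminates the run
    ).
```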

optimizer_initialize(+N,+Evaluate,+Progress) is the same as above, except that Module defaults to user.

Example: optimizer_initialize(1,evaluate,progress)

optimizer_run(-F,-Status) runs the optimization. F is the best (minimal) function value found, and Status (int) is the status code returned by libLBFGS. Anything other than 0 indicates an error; see the documentation of libLBFGS for the meaning.

optimizer_get_x(+I,-Xi) gets the current value of x[I]. Only possible when the optimizer is initialized or running.

optimizer_set_x(+I,+Xi) sets the current value of x[I]. Only possible when the optimizer is initialized but not running.

optimizer_get_g(+I,-Gi) gets the current value of g[I] (the partial derivative of F with respect to x[I]). Only possible when the optimizer is initialized or running.

optimizer_set_g(+I,+Gi) sets the current value of g[I] (the partial derivative of F with respect to x[I]). Can only be called from the Evaluate callback predicate.

optimizer_finalize cleans up the memory used by the optimizer.

optimizer_parameters prints a table with the current parameters. See the documentation of libLBFGS for the meaning of each parameter.

    ?- optimizer_parameters.
    ==========================================================================================
    Type    Name               Value      Description
    ==========================================================================================
    int     m                  6          The number of corrections to approximate the inverse hessian matrix.
    float   epsilon            1e-05      Epsilon for convergence test.
    int     past               0          Distance for delta-based convergence test.
    float   delta              1e-05      Delta for convergence test.
    int     max_iterations     0          The maximum number of iterations.
    int     linesearch         0          The line search algorithm.
    int     max_linesearch     40         The maximum number of trials for the line search.
    float   min_step           1e-20      The minimum step of the line search routine.
    float   max_step           1e+20      The maximum step of the line search.
    float   ftol               0.0001     A parameter to control the accuracy of the line search routine.
    float   gtol               0.9        A parameter to control the accuracy of the line search routine.
    float   xtol               1e-16      The machine precision for floating-point values.
    float   orthantwise_c      0.0        Coefficient for the L1 norm of variables.
    int     orthantwise_start  0          Start index for computing the L1 norm of the variables.
    int     orthantwise_end    -1         End index for computing the L1 norm of the variables.
    ==========================================================================================
    use optimizer_set_parameter(Name,Value) to change parameters
    use optimizer_get_parameter(Name,Value) to see current parameters
    use optimizer_parameters to print this overview

optimizer_set_parameter(+Name,+Value) sets the parameter Name to Value. Only possible while the optimizer is not running.

optimizer_get_parameter(+Name,-Value) gets the current Value for parameter Name.
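A short sketch of how these two predicates might be used to tune the optimizer before a run (the particular parameter values are arbitrary choices for illustration):

```
:- use_module(lbfgs).

% Hypothetical helper: reduce the number of stored corrections and
% tighten the convergence test, then read back one parameter.
tune :-
    optimizer_set_parameter(m, 3),
    optimizer_set_parameter(epsilon, 1.0e-7),
    optimizer_get_parameter(epsilon, E),
    format('epsilon is now ~w~n', [E]).
```

Parameters set this way stay in effect for subsequent calls to optimizer_initialize and optimizer_run.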

The following Prolog program (ex1.pl) searches for minima of the function f(x0)=sin(x0). In order to do so, it provides the callback predicate evaluate, which calculates f(x0) and the gradient d/dx0 f(x0)=cos(x0).

    :- use_module(lbfgs).

    % This is the callback predicate which evaluates F and the gradient of F
    evaluate(FX,_N,_Step) :-
        optimizer_get_x(0,X0),
        FX is sin(X0),
        G0 is cos(X0),
        optimizer_set_g(0,G0).

    % This is the callback predicate which is invoked to report the progress.
    % If the last argument is bound to anything other than 0, the optimizer
    % will stop right now.
    progress(FX,X_Norm,G_Norm,Step,_N,Iteration,Ls,0) :-
        optimizer_get_x(0,X0),
        format('~d. Iteration : x0=~4f f(X)=~4f |X|=~4f |X\'|=~4f Step=~4f Ls=~4f~n',
               [Iteration,X0,FX,X_Norm,G_Norm,Step,Ls]).

    demo :-
        format('Optimizing the function f(x0) = sin(x0)~n',[]),
        optimizer_initialize(1,evaluate,progress),
        StartX is random_float*10,   % random is not an SWI-Prolog evaluable; use random_float
        format('We start the search at the random position x0=~5f~2n',[StartX]),
        optimizer_set_x(0,StartX),
        optimizer_run(BestF,Status),
        optimizer_get_x(0,BestX0),
        optimizer_finalize,
        format('~2nOptimization done~nWe found a minimum at f(~f)=~f~2nLBFGS Status=~w~n',
               [BestX0,BestF,Status]).

The output of this program is something like:

    ?- demo.
    Optimizing the function f(x0) = sin(x0)
    We start the search at the random position x0=7.24639

    1. Iteration : x0=5.0167 f(X)=-0.9541 |X|=5.0167 |X'|=0.2996 Step=3.9057 Ls=3.0000
    2. Iteration : x0=4.7708 f(X)=-0.9983 |X|=4.7708 |X'|=0.0584 Step=0.0998 Ls=2.0000
    3. Iteration : x0=4.7113 f(X)=-1.0000 |X|=4.7113 |X'|=0.0011 Step=1.0000 Ls=1.0000
    4. Iteration : x0=4.7124 f(X)=-1.0000 |X|=4.7124 |X'|=0.0000 Step=1.0000 Ls=1.0000

    Optimization done
    We found a minimum at f(4.712390)=-1.000000

    LBFGS Status=0
    yes
    ?-