This is annlib version 0.99.1.

Please inform us of any problems you encounter with this code. This
code has been tested on Linux with gcc-2.95. If you successfully
compile it on other platforms, please inform us, and include any
changes you had to make in order to compile the code. We can be
reached at annealing@cs.huji.ac.il

There are some assumptions built into this algorithm. If you wish to
use it, your search algorithm must comply with the following
requirements, or Weight Annealing will not work (at least not
directly):

1: The goal of the algorithm is to maximize (or minimize) some score,
   S(W,E), that is a function of a set of examples and their matching
   weights.

2: The search procedure must be a "local search" in the sense that it
   accepts a set of examples with corresponding weights and a starting
   point, and modifies its "state" (current location in parameter
   space) in an attempt to improve the score. The procedure must
   either "remember" its last state and start from it, or it must
   output its end point, which is then fed back as the starting point
   for the next round.

If you wish to use the Adversarial WA, there is an additional
requirement:

3: You must supply a function, dS_dWi, that gives the partial
   derivative of the score with respect to each individual weight, at
   the current point and the current set of weights.

If your code complies with this, here is what you need to do. The
numbering below corresponds to the numbers in comments in the example
code; the relevant comments include a "##" for easy navigation.

Random Annealing:

## 1) Add the header files:

    #include "WeightUpdate.h"
    #include "RandomProb.h"

## 2) Add initialization for the random number generator (use a fixed
      long int instead of time(NULL) for repeatable runs):

    _RandomProbGenerator.Initialize(time(NULL));

## 3) Create a weights vector with the original weights (usually 1.0
      all along):

    vector<double> weights(SIN_NUM, 1.0);

## 4) Initialize the Weight Update with the original weights, the
      starting temperature, the end temperature, and the cooling
      factor:

    tWeightUpdateRandom WU(weights, 10, 0.01, 0.99);

## 5) Add a loop around the local search so that you can run some
      iterations (one, 100, until convergence, or any other
      appropriate number of iterations) with one set of weights,
      update the weights, and run another set of iterations. Loop
      until the WA is done:

    int loops = 0;
    while ( !WU.IsDone() ) {

## 6) Add inside the loop:

## 6a) Compute the new weights and cool down:

    WU.CoolDown();

## 6b) Extract the new weights into a vector:

    const vector<double>& newweights = WU.GetWeights();

## 6c) Use the new weights in the local search:

    double newx = sinc_func_max(newweights, x);

## 6d) OPTIONAL: add a stopping criterion (in this case: if we hit a
       steady state, we exit):

    if ( newx == x ) break;

And that's it.
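Putting the pieces together, here is a minimal, self-contained sketch
of the Random Annealing loop. It simply assembles the fragments
above; sinc_func_max() and SIN_NUM are the names used in the example
code, and feeding newx back into x each round is our reading of
requirement 2, so treat this as an illustration rather than a verbatim
copy of the library's example.

    // Minimal sketch of Random Weight Annealing around a local search.
    // sinc_func_max() and SIN_NUM come from the example code; replace
    // them with your own local search and sample count.
    #include <vector>
    #include <time.h>

    #include "WeightUpdate.h"                        // ## 1)
    #include "RandomProb.h"

    using namespace std;

    int main()
    {
      // ## 2) seed the generator; pass a fixed long int for
      //       repeatable runs
      _RandomProbGenerator.Initialize(time(NULL));

      // ## 3) original weights: 1.0 for every sample
      vector<double> weights(SIN_NUM, 1.0);

      // ## 4) original weights, start temp, end temp, cooling factor
      tWeightUpdateRandom WU(weights, 10, 0.01, 0.99);

      double x = 0.0;            // starting point of the local search

      // ## 5) loop until the WA is done
      while ( !WU.IsDone() ) {
        WU.CoolDown();                                      // ## 6a)
        const vector<double>& newweights = WU.GetWeights(); // ## 6b)
        double newx = sinc_func_max(newweights, x);         // ## 6c)
        if ( newx == x )                                    // ## 6d)
          break;               // steady state: stop early
        x = newx;              // feed the end point back (requirement 2)
      }
      return 0;
    }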
For the adversarial annealing the steps are very similar:

Adversarial Annealing:

## 1) Add the header files:

    #include "WeightUpdate.h"
    #include "RandomProb.h"

## 2) Add initialization of the random number generator (use a fixed
      long int instead of time(NULL) for repeatable runs):

    _RandomProbGenerator.Initialize(time(NULL));

## 3) Create a weights vector with the original weights (usually 1.0
      all along) and a gradients vector, both with #(samples)
      elements:

    vector<double> weights(SIN_NUM, 1.0);
    vector<double> gradients(SIN_NUM, 1.0);

## 4) Initialize the Weight Update with the original weights, the
      starting temperature, the end temperature, and the cooling
      factor:

    tWeightUpdateGradient WU(weights, 5.0, 0.01, 0.99);

## 5) Add a loop around the local search so that you can run some
      iterations (one, 100, until convergence, or any other
      appropriate number of iterations) with one set of weights,
      update the weights, and run another set of iterations. Loop
      until the WA is done:

    int loops = 0;
    while ( !WU.IsDone() ) {

## 6) Add inside the loop:

## 6a) Compute the gradients with respect to the weights:

    sinc_func_dw(gradients, x);

## 6b) Use the gradients to compute the new weights and cool down:

    WU.CoolDown(&gradients);

## 6c) Extract the new weights into a vector:

    const vector<double>& newweights = WU.GetWeights();

## 6d) Use the new weights in the local search:

    double newx = sinc_func_max(newweights, x);

## 6e) OPTIONAL: add a stopping criterion (in this case: if we hit a
       steady state, we exit):

    if ( newx == x ) break;

And that's it. A complete sketch of this variant, matching the one
above, appears at the end of this file.

Please cite using this reference:

    @incollection{Elidan+al:2002,
      author    = "Gal Elidan and Matan Ninio and Nir Friedman and Dale Schuurmans",
      title     = "Data Perturbation for Escaping Local Maxima in Learning",
      booktitle = "Proc. National Conference on Artificial Intelligence (AAAI-02)",
      pages     = "132--139",
      year      = "2002",
    }

You can contact the authors at annealing@cs.huji.ac.il
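For completeness, here is the matching sketch for the adversarial
variant, under the same assumptions as the random sketch above
(sinc_func_max(), sinc_func_dw(), and SIN_NUM come from the example
code; carrying x between rounds is our reading of requirement 2):

    // Minimal sketch of Adversarial Weight Annealing: the gradients of
    // the score with respect to the weights (requirement 3) are
    // computed each round and passed to CoolDown().
    #include <vector>
    #include <time.h>

    #include "WeightUpdate.h"                        // ## 1)
    #include "RandomProb.h"

    using namespace std;

    int main()
    {
      // ## 2) seed the generator; pass a fixed long int for
      //       repeatable runs
      _RandomProbGenerator.Initialize(time(NULL));

      // ## 3) one weight and one gradient slot per sample
      vector<double> weights(SIN_NUM, 1.0);
      vector<double> gradients(SIN_NUM, 1.0);

      // ## 4) original weights, start temp, end temp, cooling factor
      tWeightUpdateGradient WU(weights, 5.0, 0.01, 0.99);

      double x = 0.0;            // starting point of the local search

      // ## 5) loop until the WA is done
      while ( !WU.IsDone() ) {
        sinc_func_dw(gradients, x);                         // ## 6a)
        WU.CoolDown(&gradients);                            // ## 6b)
        const vector<double>& newweights = WU.GetWeights(); // ## 6c)
        double newx = sinc_func_max(newweights, x);         // ## 6d)
        if ( newx == x )                                    // ## 6e)
          break;               // steady state: stop early
        x = newx;              // feed the end point back (requirement 2)
      }
      return 0;
    }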