Optimization package for finding the extrema (minima and maxima) of functions.

There are a few components to the optimization package:

  • Optimizers - the algorithms used to optimize the functions.
  • Optimizables - the functions to be optimized (the interfaces your function classes implement).
  • Maximizing - how to maximize a function.
  • Derivatives - how to compute the first and second derivatives numerically.



Optimizable

  • Optimizable1DContinuous
  • Optimizable1DContinuousDifferentiable
  • OptimizableNDContinuous
  • OptimizableNDContinuousDifferentiable

    1. Optimizable1DContinuous - Your function needs to be piecewise continuous over the given domain. Your function class needs to implement the following methods (a sample implementation is sketched after this list):

    public double getValue(double x);
    Returns the value of your function at the input x.

    public double getDomainMin();
    public double getDomainMax();

    Returns the minimum and maximum of your function's domain. You should return Double.MIN_VALUE and Double.MAX_VALUE if your function is continuous everywhere.

    public double getRangeMin();
    public double getRangeMax();

    Returns the minimum and maximum of your function's range. Some optimizers may be faster if the range is known; however, no current optimizer uses it.

    public double getDomainTolerance();
    Returns the tolerance of the function. The optimizers will stop calculating once the difference between 2 iterations is less than the tolerance.

    2. Optimizable1DContinuousDifferentiable - Your function needs to be differentiable for the given domain. It is not necessary to implement Optimizable1DContinuousDifferentiable for a differentiable function. Doing so just gives you some more optimizers that can be used. Your function class needs to implement the following methods in addition to the ones in Optimizable1DContinuous:

    public double getDerivative(double x);
    public double get2ndDerivative(double x);

    Returns the first and second derivatives of the function at x, respectively.

    3. OptimizableNDContinuous - Your function needs to be piecewise continuous for the given domain. Your function class needs to implement the following methods:

    public int getNumberOfDimensions();
    Returns the number of dimensions your function has.

    public double getValue(double x[]);
    Returns the value of your function at the given point x, with one coordinate per dimension.

    public double[] getDomainMin();
    public double[] getDomainMax();
    public double[] getRangeMin();
    public double[] getRangeMax();

    These methods are like the ones for Optimizable1DContinuous except that an array of doubles is returned; element n of the array corresponds to dimension n. You should return Double.MIN_VALUE and Double.MAX_VALUE if the value is unknown or your function is continuous everywhere.

    public double getDomainTolerance();
    Returns the tolerance of the function. The optimizers will stop calculating once the difference between 2 iterations is less than the tolerance.

    4. OptimizableNDContinuousDifferentiable - Your function needs to be differentiable for the given domain. It is not necessary to implement OptimizableNDContinuousDifferentiable for a differentiable function. Doing so just gives you some more optimizers that can be used. Your function class needs to implement the following methods in addition to the ones in OptimizableNDContinuous:

    public double[] getDerivative(double []x);
    Returns an array of derivatives at the given location x. Element n of the array is the partial derivative with respect to the nth dimension.

    public double[][] get2ndDerivative(double []x);
    Returns a matrix of second derivatives (Hessian) at the given location x.

    public double get2ndDerivative(double[] x, int dim1, int dim2);
    Returns the value of the second derivative at a given location x with respect to dimensions dim1 and dim2.
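
    As referenced in item 1 above, here is a minimal sketch of what an Optimizable1DContinuousDifferentiable implementation might look like for the parabola f(x) = (x - 3)^2. The class name Parabola1D is made up for this example, and import statements are omitted because the package name is not specified in this document; the method names follow the descriptions above.

    // Hypothetical example: f(x) = (x - 3)^2, which has its minimum at x = 3.
    public class Parabola1D implements Optimizable1DContinuousDifferentiable {

        public double getValue(double x) {
            return (x - 3.0) * (x - 3.0);            // value of the function at x
        }

        public double getDerivative(double x) {
            return 2.0 * (x - 3.0);                  // first derivative at x
        }

        public double get2ndDerivative(double x) {
            return 2.0;                              // second derivative at x
        }

        public double getDomainMin() { return -100.0; }   // restrict the search to [-100, 100]
        public double getDomainMax() { return  100.0; }

        public double getRangeMin()  { return 0.0; }               // f(x) >= 0
        public double getRangeMax()  { return Double.MAX_VALUE; }  // unknown; no current optimizer uses it

        public double getDomainTolerance() { return 1e-6; }        // stop once iterates differ by less than this
    }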



    Optimizers

    There are four types of optimizers that you can use, one for each of the Optimizable interfaces described above.

    Your function must implement the Optimizable version of the optimizer's interface.

    1D Optimizers:

    GoldenSectionSearch1D:
    This uses the golden section search to find extrema. It implements Optimizer1DContinuous. It is essentially a binary search with optimized ratios for the distances traveled per iteration.

    NewtonMethod1D:
    This uses Newton's method to find extrema. It implements Optimizer1DContinuousDifferentiable. Your function must have 1st and 2nd derivatives. The 2nd derivative is used to find the roots of the 1st derivative, which in turn are where the extrema of the function are located.

    BrentMethod1D:
    This uses Brent's method to find extrema. It will use the faster parabolic interpolation when the function is well behaved, and it will revert to the more robust golden section search when it is not. It implements Optimizer1DContinuous.

    Using these 1D optimizers:

    There are 2 constructors for each of these optimizers:
    public Constructor();
    public Constructor(int maxIter);

    The default constructor will use the default setting of 1000 iterations. You can also choose to set your own maximum iterations.

    To initialize your function:
    public void initialize(Optimizable1DContinuous function);
    public void initialize(Optimizable1DContinuousDifferentiable function);

    Depending on whether the optimizer takes in an Optimizable1DContinuous or an Optimizable1DContinuousDifferentiable function, you will call one of these methods.

    To optimize:
    public boolean optimize(boolean findMinima);
    You then call this method with true if you're looking for the minimum or false if you're looking for the maximum. It will return whether it was able to successfully optimize the function.

    To get the extrema:
    public double getExtrema();
    This method returns the location of the extremum that was found.

    To get the status message:
    public String statusMessage();
    It will return a string containing the status message, telling you things such as whether the maximum number of iterations was reached and whether the optimizer converged to a single location. It may return null until a message has been stored, so it does not always return a meaningful String object.
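
    As a sketch, minimizing the hypothetical Parabola1D class from the earlier example with GoldenSectionSearch1D ties these calls together (imports are again omitted, and it is assumed, as the descriptions above suggest, that an Optimizable1DContinuousDifferentiable function also satisfies Optimizable1DContinuous):

    // Construct the optimizer with a custom iteration limit of 5000.
    GoldenSectionSearch1D optimizer = new GoldenSectionSearch1D(5000);
    optimizer.initialize(new Parabola1D());            // hand it the function to optimize

    if (optimizer.optimize(true)) {                    // true = look for the minimum
        double xMin = optimizer.getExtrema();          // location of the extremum, expected near x = 3
        System.out.println("Minimum found at x = " + xMin);
    } else {
        System.out.println(optimizer.statusMessage()); // e.g. maximum iterations reached
    }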

    ND Optimizers:

    LineSearchND:
    This uses 1D optimizers to optimize one dimension at a time. The constructors are as follows:
    public LineSearchND();
    public LineSearchND(Optimizer1DContinuous alg);
    public LineSearchND(int maxIter);
    public LineSearchND(Optimizer1DContinuous alg, int maxIter);

    The default maximum iterations is 1000 and the default algorithm is GoldenSectionSearch1D.
    The initialize method is:
    public void initialize(OptimizableNDContinuous function);
    It takes in an OptimizableNDContinuous function.

    LineSearchNDDifferentiable:
    This is the same as the LineSearchND except that the 1D optimizer is an Optimizer1DContinuousDifferentiable instead of an Optimizer1DContinuous.

    DownhillSimplexND:
    This uses Nelder and Mead's simplex method for finding the extrema. The constructors are as follows:
    public DownhillSimplexND();
    public DownhillSimplexND(int maxIter);

    The default maximum iterations is 1000.
    These are the choices for the initialize method:
    public void initialize(OptimizableNDContinuous function);
    public void initialize(OptimizableNDContinuous function, double center[], double rad);

    You can specify an initial guess for the location of the extremum, given by center; rad is the radius around center within which the extremum is expected to lie. Specifying these values may make the calculation faster. The default value for center is the midpoint of the domain. The default value for rad is the average, over all dimensions, of half the domain width in each dimension. For example, if your domain in each dimension is [-100, 100], then rad is 100.
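
    For example, assuming a hypothetical two-dimensional OptimizableNDContinuous implementation called MyFunction2D (a stand-in for your own class, say f(x, y) = (x - 1)^2 + (y - 2)^2) whose extremum is believed to lie near (1, 2) within a radius of about 5, the second initialize form could be used like this:

    DownhillSimplexND simplex = new DownhillSimplexND();
    // Initial guess near (1, 2); search within a radius of roughly 5 around it.
    simplex.initialize(new MyFunction2D(), new double[] {1.0, 2.0}, 5.0);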

    LevenbergMarquardt (LM):
    This uses the Levenberg-Marquardt method for finding extrema. The constructors are as follows:
    public LevenbergMarquardt();
    public LevenbergMarquardt(int maxIter);
    The default maximum iterations is 1000.
    The initialize methods are:
    public void initialize(OptimizableNDContinuousDifferentiable function);
    public void initialize(OptimizableNDContinuousDifferentiable function, double[] init);
    You can specify an initial guess for the location of the extremum, given by init. Specifying it may make the calculation faster. The default starting point is the midpoint of the domain.

    Using these ND Optimizers:

    In addition to the constructor and initialize method mentioned above, each optimizer also has:
    public boolean optimize(boolean findMinima);
    public double[] getExtrema();
    public String statusMessage();

    They are the same as for the 1D optimizers except that getExtrema() returns an array giving the location of the extremum, one entry per dimension.
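
    As a sketch, the calls mirror the 1D case. Here LineSearchND is used with its defaults on the hypothetical MyFunction2D class mentioned above:

    LineSearchND optimizer = new LineSearchND();       // defaults: GoldenSectionSearch1D, 1000 iterations
    optimizer.initialize(new MyFunction2D());

    if (optimizer.optimize(true)) {                    // true = minimize
        double[] xMin = optimizer.getExtrema();        // one coordinate per dimension
        System.out.println("Minimum at (" + xMin[0] + ", " + xMin[1] + ")");
    } else {
        System.out.println(optimizer.statusMessage());
    }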



    Maximizing Functions

    To maximize a function, we simply need to minimize the negative of the function. To do this, there are 4 MinToMax classes (one for each Optimizable interface).
    These MinToMax classes negate the value, derivative, and second derivative of the function. This is how the built-in optimizers maximize functions: when false is passed into optimize(boolean findMinima), the optimizer passes the function through one of the MinToMax classes.
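
    In practice this means maximizing needs no extra code on your part; you simply pass false, as in this sketch reusing the hypothetical Parabola1D example from earlier:

    GoldenSectionSearch1D optimizer = new GoldenSectionSearch1D();
    optimizer.initialize(new Parabola1D());
    optimizer.optimize(false);               // false = maximize; the function is wrapped in a MinToMax class internally
    double xMax = optimizer.getExtrema();    // for the parabola on [-100, 100], the maximum is at x = -100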



    Numerical Derivatives

    The derivatives used by the optimizers are calculated numerically. You can use the Forward Euler method, the Backward Euler method, or the 3-point, 5-point, 7-point, or 9-point formulas; 3-point is the default. You can also specify the step size; the default is 1e-5.

    To use the derivative class for 1D functions, you will need to call one of the following constructors:
    public FunctionNumeric1DDifferentiation(Optimizable1DContinuous function);
    public FunctionNumeric1DDifferentiation(Optimizable1DContinuous function, double step, int method);

    The second constructor allows you to specify the step size and the method to use. 0 is Forward Euler, 1 is Backward Euler, and 3, 5, 7, and 9 are the 3-, 5-, 7-, and 9-point derivatives respectively.

    To use the derivative class for ND functions, you will need to call one of the following constructors:
    public FunctionNumericNDDifferentiation(OptimizableNDContinuous function);
    public FunctionNumericNDDifferentiation(OptimizableNDContinuous function, double step, int method);

    The second constructor allows you to specify your step size and method to use.

    These differentiators implement the interfaces Optimizable1DContinuousDifferentiable and OptimizableNDContinuousDifferentiable, respectively, so they can be passed to the derivative-based optimizers.
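
    As a sketch, wrapping a continuous-only 1D function so that derivative-based optimizers can use it might look like this. SomeFunction1D is a hypothetical stand-in for your own Optimizable1DContinuous implementation, and the constructor arguments follow the conventions above (step size 1e-4, method 5 for the 5-point derivative):

    // Wrap a continuous-only function so it exposes numerical derivatives.
    SomeFunction1D raw = new SomeFunction1D();
    FunctionNumeric1DDifferentiation f =
            new FunctionNumeric1DDifferentiation(raw, 1e-4, 5);   // 5-point formula, step 1e-4

    double slope     = f.getDerivative(2.0);       // numerical first derivative at x = 2
    double curvature = f.get2ndDerivative(2.0);    // numerical second derivative at x = 2

    // Because the wrapper implements Optimizable1DContinuousDifferentiable,
    // it can be handed directly to a derivative-based optimizer such as NewtonMethod1D.
    NewtonMethod1D newton = new NewtonMethod1D();
    newton.initialize(f);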