| Interface | Description |
|---|---|
| DiffFloatFunction | An interface for once-differentiable float-valued functions over float arrays. |
| DiffFunction | An interface for once-differentiable double-valued functions over double arrays. |
| Evaluator | |
| FloatFunction | An interface for float-valued functions over float arrays. |
| Function | An interface for double-valued functions over double arrays. |
| HasEvaluators | Indicates that a minimizer supports periodic evaluation. |
| HasFeatureGrouping | Indicates that a minimizer supports grouping features for g-lasso or ae-lasso. |
| HasFloatInitial | Indicates that a function has a method for supplying an initial value. |
| HasInitial | Indicates that a function has a method for supplying an initial value. |
| HasRegularizerParamRange | Indicates that a minimizer supports a range of regularizer parameter values. |
| LineSearcher | The interface for one-variable function minimizers. |
| Minimizer<T extends Function> | The interface for unconstrained function minimizers. |
| StochasticMinimizer.PropertySetter<T1> | |
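To make the function interfaces above concrete, here is a minimal sketch of the DiffFunction contract (`domainDimension`, `valueAt`, `derivativeAt`) together with a `minimize(function, tolerance, initial)` call in the shape the Minimizer interface uses. The interfaces and the gradient-descent minimizer below are local stand-ins written for this example so it runs without the library on the classpath; the real QNMinimizer uses L-BFGS, not plain gradient descent.

```java
// Local stand-ins mirroring the shape of the Function/DiffFunction interfaces.
interface Function {
    int domainDimension();
    double valueAt(double[] x);
}

interface DiffFunction extends Function {
    double[] derivativeAt(double[] x);
}

public class MinimizeDemo {
    // Example function: f(x) = sum_i (x_i - 3)^2, minimized at x_i = 3.
    static final DiffFunction quadratic = new DiffFunction() {
        public int domainDimension() { return 2; }
        public double valueAt(double[] x) {
            double v = 0;
            for (double xi : x) v += (xi - 3) * (xi - 3);
            return v;
        }
        public double[] derivativeAt(double[] x) {
            double[] g = new double[x.length];
            for (int i = 0; i < x.length; i++) g[i] = 2 * (x[i] - 3);
            return g;
        }
    };

    // Stand-in with the Minimizer.minimize(function, tolerance, initial) shape:
    // plain gradient descent with a fixed step size, stopping on gradient norm.
    static double[] minimize(DiffFunction f, double tol, double[] initial) {
        double[] x = initial.clone();
        for (int iter = 0; iter < 10000; iter++) {
            double[] g = f.derivativeAt(x);
            double norm = 0;
            for (double gi : g) norm += gi * gi;
            if (Math.sqrt(norm) < tol) break;
            for (int i = 0; i < x.length; i++) x[i] -= 0.1 * g[i];
        }
        return x;
    }

    public static void main(String[] args) {
        double[] opt = minimize(quadratic, 1e-8, new double[]{0, 0});
        System.out.println(opt[0] + " " + opt[1]); // converges near (3, 3)
    }
}
```

With the real library, the same function object would instead be passed to one of the Minimizer implementations listed below, such as QNMinimizer.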
| Class | Description |
|---|---|
| AbstractCachingDiffFloatFunction | |
| AbstractCachingDiffFunction | A differentiable function that caches the last evaluation of its value and derivative. |
| AbstractStochasticCachingDiffFunction | |
| AbstractStochasticCachingDiffUpdateFunction | Function for stochastic calculations that updates in place (instead of maintaining and returning the derivative). |
| CGMinimizer | Conjugate-gradient implementation based on the code in Numerical Recipes in C. |
| CmdEvaluator | Runs a command line to evaluate a dataset (assumes the command takes input from stdin). |
| GoldenSectionLineSearch | A class to do golden section line search. |
| HybridMinimizer | A minimizer set up as a combination of two minimizers. |
| InefficientSGDMinimizer<T extends Function> | Stochastic Gradient Descent Minimizer. |
| MemoryEvaluator | Evaluates current memory usage. |
| QNMinimizer | An implementation of L-BFGS for Quasi-Newton unconstrained minimization. |
| QNMinimizer.SurpriseConvergence | |
| ResultStoringFloatMonitor | |
| ResultStoringMonitor | |
| ScaledSGDMinimizer<Q extends AbstractStochasticCachingDiffFunction> | Stochastic Gradient Descent to Quasi-Newton Minimizer. |
| ScaledSGDMinimizer.Weights | |
| SGDMinimizer<T extends Function> | In-place Stochastic Gradient Descent Minimizer. |
| SGDToQNMinimizer | Stochastic Gradient Descent to Quasi-Newton Minimizer: an experimental minimizer which takes a stochastic function (one implementing AbstractStochasticCachingDiffFunction) and executes SGD for the first few passes. |
| SGDWithAdaGradAndFOBOS<T extends DiffFunction> | Stochastic Gradient Descent with AdaGrad and FOBOS in batch mode. |
| SMDMinimizer<T extends Function> | Stochastic Meta Descent Minimizer. |
| SQNMinimizer<T extends Function> | Online Limited-Memory Quasi-Newton BFGS implementation. |
| StochasticDiffFunctionTester | |
| StochasticMinimizer<T extends Function> | Stochastic Gradient Descent Minimizer. |
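The caching behavior described for AbstractCachingDiffFunction (remember the last evaluated point and reuse its value and derivative) can be sketched as follows. This is a hypothetical stand-in written for this example, not the library's implementation: it memoizes both results for the last input, so a `valueAt` followed by a `derivativeAt` at the same point triggers only one computation.

```java
import java.util.Arrays;

// Hypothetical sketch of the caching pattern behind AbstractCachingDiffFunction:
// compute value and derivative together, and recompute only when x changes.
public class CachingQuadratic {
    private double[] lastX;       // last point actually evaluated
    private double cachedValue;
    private double[] cachedDeriv;
    int calculateCount = 0;       // how many times calculate() really ran

    // The "expensive" computation: f(x) = sum_i x_i^2, gradient = 2x.
    private void calculate(double[] x) {
        calculateCount++;
        cachedValue = 0;
        cachedDeriv = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            cachedValue += x[i] * x[i];
            cachedDeriv[i] = 2 * x[i];
        }
        lastX = x.clone();
    }

    private void ensureEvaluated(double[] x) {
        if (lastX == null || !Arrays.equals(lastX, x)) calculate(x);
    }

    public double valueAt(double[] x) {
        ensureEvaluated(x);
        return cachedValue;
    }

    public double[] derivativeAt(double[] x) {
        ensureEvaluated(x);
        return cachedDeriv.clone();
    }
}
```

This matters for line searches, which repeatedly query value and derivative at the same trial point: with the cache, each trial point costs one function evaluation instead of two.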
| Enum | Description |
|---|---|
| AbstractStochasticCachingDiffFunction.SamplingMethod | |
| QNMinimizer.eLineSearch | |
| QNMinimizer.eScaling | |
| QNMinimizer.eState | |
| SGDWithAdaGradAndFOBOS.Prior | |
| StochasticCalculateMethods | This enumeration was created to organize the selection of different methods for stochastic calculations. |