itk::FiniteDifferenceGradientDescentOptimizer Class Reference

#include <itkFiniteDifferenceGradientDescentOptimizer.h>

Detailed Description

An optimizer based on gradient descent ...

If $C(x)$ is a cost function that has to be minimised, the following iterative algorithm is used to find the optimal parameters $x$:

\[ x(k+1)_j = x(k)_j - a(k) \left[ C\left(x(k) + c(k) e_j\right) - C\left(x(k) - c(k) e_j\right) \right] / 2c(k), \]

for all parameters $j$, where $e_j$ is the unit vector along the $j$-th axis.

From this equation it is clear that this is a gradient descent optimizer, using a finite difference approximation of the gradient.

The gain $a(k)$ at each iteration $k$ is defined by:

\[ a(k) = a / (A + k + 1)^{\alpha}. \]

The perturbation size $c(k)$ at each iteration $k$ is defined by:

\[ c(k) = c / (k + 1)^{\gamma}. \]

Note the similarities to the SimultaneousPerturbation optimizer and the StandardGradientDescent optimizer.

See also
FiniteDifferenceGradientDescent

Definition at line 55 of file itkFiniteDifferenceGradientDescentOptimizer.h.

Inheritance diagram for itk::FiniteDifferenceGradientDescentOptimizer:

Public Types

typedef SmartPointer< const Self > ConstPointer
 
typedef SmartPointer< Self > Pointer
 
typedef FiniteDifferenceGradientDescentOptimizer Self
 
enum  StopConditionType { MaximumNumberOfIterations, MetricError }
 
typedef ScaledSingleValuedNonLinearOptimizer Superclass
 
- Public Types inherited from itk::ScaledSingleValuedNonLinearOptimizer
typedef SmartPointer< const Self > ConstPointer
 
typedef Superclass::CostFunctionType CostFunctionType
 
typedef Superclass::DerivativeType DerivativeType
 
typedef Superclass::MeasureType MeasureType
 
typedef Superclass::ParametersType ParametersType
 
typedef SmartPointer< Self > Pointer
 
typedef ScaledCostFunctionType::Pointer ScaledCostFunctionPointer
 
typedef ScaledSingleValuedCostFunction ScaledCostFunctionType
 
typedef NonLinearOptimizer::ScalesType ScalesType
 
typedef ScaledSingleValuedNonLinearOptimizer Self
 
typedef SingleValuedNonLinearOptimizer Superclass
 

Public Member Functions

virtual void AdvanceOneStep (void)
 
virtual void ComputeCurrentValueOff ()
 
virtual void ComputeCurrentValueOn ()
 
virtual const char * GetClassName () const
 
virtual bool GetComputeCurrentValue () const
 
virtual unsigned long GetCurrentIteration () const
 
virtual double GetGradientMagnitude () const
 
virtual double GetLearningRate () const
 
virtual unsigned long GetNumberOfIterations () const
 
virtual double GetParam_a ()
 
virtual double GetParam_A ()
 
virtual double GetParam_alpha ()
 
virtual double GetParam_c ()
 
virtual double GetParam_gamma ()
 
virtual StopConditionType GetStopCondition () const
 
virtual double GetValue () const
 
void ResumeOptimization (void)
 
virtual void SetComputeCurrentValue (bool _arg)
 
virtual void SetNumberOfIterations (unsigned long _arg)
 
virtual void SetParam_a (double _arg)
 
virtual void SetParam_A (double _arg)
 
virtual void SetParam_alpha (double _arg)
 
virtual void SetParam_c (double _arg)
 
virtual void SetParam_gamma (double _arg)
 
void StartOptimization (void)
 
void StopOptimization (void)
 
- Public Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer
virtual const ParametersType & GetCurrentPosition (void) const
 
virtual bool GetMaximize () const
 
virtual const ScaledCostFunctionType * GetScaledCostFunction ()
 
virtual const ParametersType & GetScaledCurrentPosition ()
 
bool GetUseScales (void) const
 
virtual void InitializeScales (void)
 
virtual void MaximizeOff ()
 
virtual void MaximizeOn ()
 
virtual void SetCostFunction (CostFunctionType *costFunction)
 
virtual void SetMaximize (bool _arg)
 
virtual void SetUseScales (bool arg)
 

Static Public Member Functions

static Pointer New ()
 
- Static Public Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer
static Pointer New ()
 

Protected Member Functions

virtual double Compute_a (unsigned long k) const
 
virtual double Compute_c (unsigned long k) const
 
 FiniteDifferenceGradientDescentOptimizer ()
 
void PrintSelf (std::ostream &os, Indent indent) const
 
virtual ~FiniteDifferenceGradientDescentOptimizer ()
 
- Protected Member Functions inherited from itk::ScaledSingleValuedNonLinearOptimizer
virtual void GetScaledDerivative (const ParametersType &parameters, DerivativeType &derivative) const
 
virtual MeasureType GetScaledValue (const ParametersType &parameters) const
 
virtual void GetScaledValueAndDerivative (const ParametersType &parameters, MeasureType &value, DerivativeType &derivative) const
 
void PrintSelf (std::ostream &os, Indent indent) const
 
 ScaledSingleValuedNonLinearOptimizer ()
 
virtual void SetCurrentPosition (const ParametersType &param)
 
virtual void SetScaledCurrentPosition (const ParametersType &parameters)
 
virtual ~ScaledSingleValuedNonLinearOptimizer ()
 

Protected Attributes

bool m_ComputeCurrentValue
 
DerivativeType m_Gradient
 
double m_GradientMagnitude
 
double m_LearningRate
 
- Protected Attributes inherited from itk::ScaledSingleValuedNonLinearOptimizer
ScaledCostFunctionPointer m_ScaledCostFunction
 
ParametersType m_ScaledCurrentPosition
 

Private Member Functions

 FiniteDifferenceGradientDescentOptimizer (const Self &)
 
void operator= (const Self &)
 

Private Attributes

unsigned long m_CurrentIteration
 
unsigned long m_NumberOfIterations
 
double m_Param_a
 
double m_Param_A
 
double m_Param_alpha
 
double m_Param_c
 
double m_Param_gamma
 
bool m_Stop
 
StopConditionType m_StopCondition
 
double m_Value
 

Member Typedef Documentation

Standard class typedefs.

Definition at line 61 of file itkFiniteDifferenceGradientDescentOptimizer.h.

Member Enumeration Documentation

Codes of stopping conditions

Enumerator
MaximumNumberOfIterations 
MetricError 

Definition at line 73 of file itkFiniteDifferenceGradientDescentOptimizer.h.

Constructor & Destructor Documentation

itk::FiniteDifferenceGradientDescentOptimizer::FiniteDifferenceGradientDescentOptimizer ( )
protected
virtual itk::FiniteDifferenceGradientDescentOptimizer::~FiniteDifferenceGradientDescentOptimizer ( )
inlineprotectedvirtual
itk::FiniteDifferenceGradientDescentOptimizer::FiniteDifferenceGradientDescentOptimizer ( const Self & )
private

Member Function Documentation

virtual void itk::FiniteDifferenceGradientDescentOptimizer::AdvanceOneStep ( void  )
virtual

Advance one step following the gradient direction.

virtual double itk::FiniteDifferenceGradientDescentOptimizer::Compute_a ( unsigned long  k) const
protectedvirtual
virtual double itk::FiniteDifferenceGradientDescentOptimizer::Compute_c ( unsigned long  k) const
protectedvirtual
virtual void itk::FiniteDifferenceGradientDescentOptimizer::ComputeCurrentValueOff ( )
virtual
virtual void itk::FiniteDifferenceGradientDescentOptimizer::ComputeCurrentValueOn ( )
virtual
virtual const char* itk::FiniteDifferenceGradientDescentOptimizer::GetClassName ( ) const
virtual

Run-time type information (and related methods).

Reimplemented from itk::ScaledSingleValuedNonLinearOptimizer.

Reimplemented in elastix::FiniteDifferenceGradientDescent< TElastix >.

virtual bool itk::FiniteDifferenceGradientDescentOptimizer::GetComputeCurrentValue ( ) const
virtual
virtual unsigned long itk::FiniteDifferenceGradientDescentOptimizer::GetCurrentIteration ( ) const
virtual

Get the current iteration number.

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetGradientMagnitude ( ) const
virtual

Get the GradientMagnitude and LearningRate ($a_k$).

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetLearningRate ( ) const
virtual
virtual unsigned long itk::FiniteDifferenceGradientDescentOptimizer::GetNumberOfIterations ( ) const
virtual

Get the number of iterations.

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetParam_a ( )
virtual
virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetParam_A ( )
virtual
virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetParam_alpha ( )
virtual
virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetParam_c ( )
virtual
virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetParam_gamma ( )
virtual
virtual StopConditionType itk::FiniteDifferenceGradientDescentOptimizer::GetStopCondition ( ) const
virtual

Get Stop condition.

virtual double itk::FiniteDifferenceGradientDescentOptimizer::GetValue ( ) const
virtual

Get the current value.

static Pointer itk::FiniteDifferenceGradientDescentOptimizer::New ( )
static

Method for creation through the object factory.

void itk::FiniteDifferenceGradientDescentOptimizer::operator= ( const Self & )
private
void itk::FiniteDifferenceGradientDescentOptimizer::PrintSelf ( std::ostream &  os,
Indent  indent 
) const
protected

PrintSelf method.

void itk::FiniteDifferenceGradientDescentOptimizer::ResumeOptimization ( void  )

Resume previously stopped optimization with current parameters

See also
StopOptimization.
virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetComputeCurrentValue ( bool  _arg)
virtual
virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetNumberOfIterations ( unsigned long  _arg)
virtual

Set the number of iterations.

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetParam_a ( double  _arg)
virtual

Set/Get a.

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetParam_A ( double  _arg)
virtual

Set/Get A.

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetParam_alpha ( double  _arg)
virtual

Set/Get alpha.

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetParam_c ( double  _arg)
virtual

Set/Get c.

virtual void itk::FiniteDifferenceGradientDescentOptimizer::SetParam_gamma ( double  _arg)
virtual

Set/Get gamma.

void itk::FiniteDifferenceGradientDescentOptimizer::StartOptimization ( void  )

Start optimization.

void itk::FiniteDifferenceGradientDescentOptimizer::StopOptimization ( void  )

Stop optimization.

See also
ResumeOptimization

Field Documentation

bool itk::FiniteDifferenceGradientDescentOptimizer::m_ComputeCurrentValue
protected

Boolean that says if the current value of the metric has to be computed. This is not necessary for optimisation; just nice for progress information.

Definition at line 153 of file itkFiniteDifferenceGradientDescentOptimizer.h.

unsigned long itk::FiniteDifferenceGradientDescentOptimizer::m_CurrentIteration
private
DerivativeType itk::FiniteDifferenceGradientDescentOptimizer::m_Gradient
protected
double itk::FiniteDifferenceGradientDescentOptimizer::m_GradientMagnitude
protected
double itk::FiniteDifferenceGradientDescentOptimizer::m_LearningRate
protected
unsigned long itk::FiniteDifferenceGradientDescentOptimizer::m_NumberOfIterations
private
double itk::FiniteDifferenceGradientDescentOptimizer::m_Param_a
private

Parameters, as described by Spall.

Definition at line 173 of file itkFiniteDifferenceGradientDescentOptimizer.h.

double itk::FiniteDifferenceGradientDescentOptimizer::m_Param_A
private
double itk::FiniteDifferenceGradientDescentOptimizer::m_Param_alpha
private
double itk::FiniteDifferenceGradientDescentOptimizer::m_Param_c
private
double itk::FiniteDifferenceGradientDescentOptimizer::m_Param_gamma
private
bool itk::FiniteDifferenceGradientDescentOptimizer::m_Stop
private

Private member variables.

Definition at line 166 of file itkFiniteDifferenceGradientDescentOptimizer.h.

StopConditionType itk::FiniteDifferenceGradientDescentOptimizer::m_StopCondition
private
double itk::FiniteDifferenceGradientDescentOptimizer::m_Value
private


Generated on 04-09-2015 for elastix by doxygen 1.8.9.1