itk::RSGDEachParameterApartOptimizer Class Reference

#include <itkRSGDEachParameterApartOptimizer.h>

Detailed Description

An optimizer based on gradient descent.

This class is almost a copy of the normal itk::RegularStepGradientDescentOptimizer. The difference is that each parameter has its own step length, whereas the normal RSGD has one step length that is used for all parameters.

A single shared step length can cause inaccuracies if, for example, parameters 1, 2, and 3 are already close to their optimum, but parameter 4 is not yet. The shared step size is then halved, so parameter 4 will not have time to reach its optimum (in a worst-case scenario).

The RSGDEachParameterApart optimizer stops only if ALL step lengths are smaller than the MinimumStepSize given in the parameter file!

Note that this is a quite experimental optimizer, currently only used for some specific tests.

Definition at line 53 of file itkRSGDEachParameterApartOptimizer.h.

Inheritance diagram for itk::RSGDEachParameterApartOptimizer:

Public Types

typedef SmartPointer< const Self > ConstPointer
typedef CostFunctionType::Pointer CostFunctionPointer
typedef Superclass::CostFunctionType CostFunctionType
typedef SmartPointer< Self > Pointer
typedef RSGDEachParameterApartOptimizer Self
typedef RSGDEachParameterApartBaseOptimizer Superclass
- Public Types inherited from itk::RSGDEachParameterApartBaseOptimizer
typedef SmartPointer< const Self > ConstPointer
typedef SmartPointer< Self > Pointer
typedef RSGDEachParameterApartBaseOptimizer Self
enum  StopConditionType {
  GradientMagnitudeTolerance = 1, StepTooSmall, ImageNotAvailable, SamplesNotAvailable,
  MaximumNumberOfIterations, MetricError
}
typedef SingleValuedNonLinearOptimizer Superclass

Public Member Functions

virtual const char * GetClassName () const
- Public Member Functions inherited from itk::RSGDEachParameterApartBaseOptimizer
virtual unsigned long GetCurrentIteration () const
virtual double GetCurrentStepLength () const
virtual const DerivativeType & GetCurrentStepLengths ()
virtual const DerivativeType & GetGradient ()
virtual double GetGradientMagnitude () const
virtual double GetGradientMagnitudeTolerance () const
virtual bool GetMaximize () const
virtual double GetMaximumStepLength () const
bool GetMinimize () const
virtual double GetMinimumStepLength () const
virtual unsigned long GetNumberOfIterations () const
virtual StopConditionType GetStopCondition () const
virtual MeasureType GetValue () const
virtual void MaximizeOff ()
virtual void MaximizeOn ()
void MinimizeOff (void)
void MinimizeOn (void)
void ResumeOptimization (void)
virtual void SetGradientMagnitudeTolerance (double _arg)
virtual void SetMaximize (bool _arg)
virtual void SetMaximumStepLength (double _arg)
void SetMinimize (bool v)
virtual void SetMinimumStepLength (double _arg)
virtual void SetNumberOfIterations (unsigned long _arg)
void StartOptimization (void)
void StopOptimization (void)

Static Public Member Functions

static Pointer New ()
- Static Public Member Functions inherited from itk::RSGDEachParameterApartBaseOptimizer
static Pointer New ()

Protected Member Functions

 RSGDEachParameterApartOptimizer ()
virtual void StepAlongGradient (const DerivativeType &factor, const DerivativeType &transformedGradient)
virtual ~RSGDEachParameterApartOptimizer ()
- Protected Member Functions inherited from itk::RSGDEachParameterApartBaseOptimizer
virtual void AdvanceOneStep (void)
void PrintSelf (std::ostream &os, Indent indent) const
 RSGDEachParameterApartBaseOptimizer ()
virtual ~RSGDEachParameterApartBaseOptimizer ()

Private Member Functions

void operator= (const Self &)
 RSGDEachParameterApartOptimizer (const Self &)

Additional Inherited Members

- Protected Attributes inherited from itk::RSGDEachParameterApartBaseOptimizer
unsigned long m_CurrentIteration
double m_CurrentStepLength
DerivativeType m_CurrentStepLengths
DerivativeType m_Gradient
double m_GradientMagnitude
double m_GradientMagnitudeTolerance
bool m_Maximize
double m_MaximumStepLength
double m_MinimumStepLength
unsigned long m_NumberOfIterations
DerivativeType m_PreviousGradient
bool m_Stop
StopConditionType m_StopCondition
MeasureType m_Value

Member Typedef Documentation

typedef SmartPointer< const Self > itk::RSGDEachParameterApartOptimizer::ConstPointer

Definition at line 62 of file itkRSGDEachParameterApartOptimizer.h.

typedef CostFunctionType::Pointer itk::RSGDEachParameterApartOptimizer::CostFunctionPointer

Definition at line 73 of file itkRSGDEachParameterApartOptimizer.h.

typedef Superclass::CostFunctionType itk::RSGDEachParameterApartOptimizer::CostFunctionType

Cost function typedefs.

Definition at line 69 of file itkRSGDEachParameterApartOptimizer.h.

typedef SmartPointer< Self > itk::RSGDEachParameterApartOptimizer::Pointer

Definition at line 61 of file itkRSGDEachParameterApartOptimizer.h.

typedef RSGDEachParameterApartOptimizer itk::RSGDEachParameterApartOptimizer::Self

Standard class typedefs.

Definition at line 59 of file itkRSGDEachParameterApartOptimizer.h.

typedef RSGDEachParameterApartBaseOptimizer itk::RSGDEachParameterApartOptimizer::Superclass

Definition at line 60 of file itkRSGDEachParameterApartOptimizer.h.

Constructor & Destructor Documentation

itk::RSGDEachParameterApartOptimizer::RSGDEachParameterApartOptimizer ( )

Definition at line 77 of file itkRSGDEachParameterApartOptimizer.h.

virtual itk::RSGDEachParameterApartOptimizer::~RSGDEachParameterApartOptimizer ( )

Definition at line 78 of file itkRSGDEachParameterApartOptimizer.h.

itk::RSGDEachParameterApartOptimizer::RSGDEachParameterApartOptimizer ( const Self & )

Member Function Documentation

virtual const char* itk::RSGDEachParameterApartOptimizer::GetClassName ( ) const

Run-time type information (and related methods).

Reimplemented from itk::RSGDEachParameterApartBaseOptimizer.

Reimplemented in elastix::RSGDEachParameterApart< TElastix >.

static Pointer itk::RSGDEachParameterApartOptimizer::New ( )

Method for creation through the object factory.

void itk::RSGDEachParameterApartOptimizer::operator= ( const Self & )
virtual void itk::RSGDEachParameterApartOptimizer::StepAlongGradient ( const DerivativeType & factor,
const DerivativeType & transformedGradient )
Advance one step along the corrected gradient, taking into account the step lengths represented by the factor array. This method is invoked by AdvanceOneStep. It is expected to be overridden by optimization methods in non-vector spaces.

Reimplemented from itk::RSGDEachParameterApartBaseOptimizer.

Generated on 04-09-2015 for elastix by doxygen