Storm
A Modern Probabilistic Model Checker
storm::derivative::GradientDescentInstantiationSearcher< FunctionType, ConstantType > Class Template Reference

#include <GradientDescentInstantiationSearcher.h>

Classes

struct  VisualizationPoint
 A point in the Gradient Descent walk, recorded if recordRun is set to true in the constructor (false by default).
 

Public Member Functions

 GradientDescentInstantiationSearcher (storm::models::sparse::Dtmc< FunctionType > const &model, GradientDescentMethod method=GradientDescentMethod::ADAM, ConstantType learningRate=0.1, ConstantType averageDecay=0.9, ConstantType squaredAverageDecay=0.999, uint_fast64_t miniBatchSize=32, ConstantType terminationEpsilon=1e-6, boost::optional< std::map< typename utility::parametric::VariableType< FunctionType >::type, typename utility::parametric::CoefficientType< FunctionType >::type > > startPoint=boost::none, GradientDescentConstraintMethod constraintMethod=GradientDescentConstraintMethod::PROJECT_WITH_GRADIENT, bool recordRun=false)
 The GradientDescentInstantiationSearcher can find extrema and feasible instantiations in pMCs, for either rewards or probabilities.
 
void setup (Environment const &env, std::shared_ptr< storm::pars::FeasibilitySynthesisTask const > const &task)
 This will set up the matrices used for computing the derivatives by constructing the SparseDerivativeInstantiationModelChecker.
 
std::pair< std::map< typename utility::parametric::VariableType< FunctionType >::type, typename utility::parametric::CoefficientType< FunctionType >::type >, ConstantType > gradientDescent ()
 Perform Gradient Descent.
 
void printRunAsJson ()
 Print the previously recorded run as JSON.
 
std::vector< VisualizationPoint > getVisualizationWalk ()
 Get the visualization walk that is recorded if recordRun is set to true in the constructor (false by default).
 

Detailed Description

template<typename FunctionType, typename ConstantType>
class storm::derivative::GradientDescentInstantiationSearcher< FunctionType, ConstantType >

Definition at line 23 of file GradientDescentInstantiationSearcher.h.

Constructor & Destructor Documentation

◆ GradientDescentInstantiationSearcher()

template<typename FunctionType , typename ConstantType >
inline storm::derivative::GradientDescentInstantiationSearcher< FunctionType, ConstantType >::GradientDescentInstantiationSearcher (
        storm::models::sparse::Dtmc< FunctionType > const & model,
        GradientDescentMethod method = GradientDescentMethod::ADAM,
        ConstantType learningRate = 0.1,
        ConstantType averageDecay = 0.9,
        ConstantType squaredAverageDecay = 0.999,
        uint_fast64_t miniBatchSize = 32,
        ConstantType terminationEpsilon = 1e-6,
        boost::optional< std::map< typename utility::parametric::VariableType< FunctionType >::type, typename utility::parametric::CoefficientType< FunctionType >::type > > startPoint = boost::none,
        GradientDescentConstraintMethod constraintMethod = GradientDescentConstraintMethod::PROJECT_WITH_GRADIENT,
        bool recordRun = false )

The GradientDescentInstantiationSearcher can find extrema and feasible instantiations in pMCs, for either rewards or probabilities.

Internally uses the SparseDerivativeInstantiationModelChecker to evaluate derivatives at instantiations.

Parameters
    env                  The environment. Always pass the same environment to the gradientDescent call!
    model                The Dtmc to optimize.
    method               The gradient descent method that is used.
    learningRate         The learning rate of the gradient descent.
    averageDecay         Decay of ADAM's decaying average.
    squaredAverageDecay  Decay of ADAM's squared decaying average.
    miniBatchSize        Size of a minibatch.
    terminationEpsilon   A value step smaller than this is considered a "tiny step"; after a number of these the algorithm terminates.
    startPoint           Start point of the search (default: all parameters set to 0.5).
    recordRun            Records the run into a global variable, which can be converted to JSON using the printRunAsJson function.

Definition at line 42 of file GradientDescentInstantiationSearcher.h.
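
A minimal construction sketch (illustrative, with assumed names rather than code from the Storm sources): it assumes FunctionType = storm::RationalFunction and ConstantType = double, that dtmc is a storm::models::sparse::Dtmc<storm::RationalFunction> built elsewhere, and that both enums live in the storm::derivative namespace next to this class. The arguments shown are the documented defaults, except recordRun:

    #include <GradientDescentInstantiationSearcher.h>

    storm::derivative::GradientDescentInstantiationSearcher<storm::RationalFunction, double> searcher(
        dtmc,                                                                       // model: the pMC to optimize
        storm::derivative::GradientDescentMethod::ADAM,                             // method
        0.1,                                                                        // learningRate
        0.9,                                                                        // averageDecay (ADAM)
        0.999,                                                                      // squaredAverageDecay (ADAM)
        32,                                                                         // miniBatchSize
        1e-6,                                                                       // terminationEpsilon
        boost::none,                                                                // startPoint: all parameters start at 0.5
        storm::derivative::GradientDescentConstraintMethod::PROJECT_WITH_GRADIENT,  // constraintMethod
        true);                                                                      // recordRun: keep the walk for printRunAsJson / getVisualizationWalk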

Member Function Documentation

◆ getVisualizationWalk()

template<typename FunctionType , typename ConstantType >
std::vector< typename GradientDescentInstantiationSearcher< FunctionType, ConstantType >::VisualizationPoint > storm::derivative::GradientDescentInstantiationSearcher< FunctionType, ConstantType >::getVisualizationWalk ( )

Get the visualization walk that is recorded if recordRun is set to true in the constructor (false by default).

Definition at line 532 of file GradientDescentInstantiationSearcher.cpp.
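
A short retrieval sketch, assuming a searcher named searcher was constructed with recordRun = true and gradientDescent() has already been run:

    auto walk = searcher.getVisualizationWalk();  // std::vector of VisualizationPoint
    std::cout << "recorded " << walk.size() << " gradient descent steps" << std::endl;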

◆ gradientDescent()

template<typename FunctionType , typename ConstantType >
std::pair< std::map< VariableType< FunctionType >, CoefficientType< FunctionType > >, ConstantType > storm::derivative::GradientDescentInstantiationSearcher< FunctionType, ConstantType >::gradientDescent ( )

Perform Gradient Descent.

Definition at line 382 of file GradientDescentInstantiationSearcher.cpp.
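
A sketch of running the search and reading the result, assuming searcher was constructed as above and setup() has already been called:

    auto result = searcher.gradientDescent();
    // result.first maps each parameter to the coefficient value that was found,
    // result.second is the reached (probability or reward) value.
    std::cout << "reached value " << result.second << " with "
              << result.first.size() << " instantiated parameters" << std::endl;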

◆ printRunAsJson()

template<typename FunctionType , typename ConstantType >
void storm::derivative::GradientDescentInstantiationSearcher< FunctionType, ConstantType >::printRunAsJson ( )

Print the previously recorded run as JSON.

This run can be retrieved using getVisualizationWalk.

Definition at line 511 of file GradientDescentInstantiationSearcher.cpp.
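
Usage is a single call, assuming a searcher named searcher that was constructed with recordRun = true and has already performed a search:

    searcher.printRunAsJson();  // prints the recorded gradient descent run as JSON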

◆ setup()

template<typename FunctionType , typename ConstantType >
inline void storm::derivative::GradientDescentInstantiationSearcher< FunctionType, ConstantType >::setup (
        Environment const & env,
        std::shared_ptr< storm::pars::FeasibilitySynthesisTask const > const & task )

This will set up the matrices used for computing the derivatives by constructing the SparseDerivativeInstantiationModelChecker.

Note that this consumes a substantial amount of memory if the model is big and there are many parameters.

Parameters
    env   The environment. This is needed to know what kind of equation solver we are dealing with.
    task  The FeasibilitySynthesisTask.

Definition at line 135 of file GradientDescentInstantiationSearcher.h.
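
A setup sketch, assuming searcher was constructed as above, env is a storm::Environment, and task is a std::shared_ptr<storm::pars::FeasibilitySynthesisTask const> obtained elsewhere (e.g. from a parsed feasibility property):

    storm::Environment env;     // reuse the same environment for the whole search
    searcher.setup(env, task);  // builds the derivative matrices via the SparseDerivativeInstantiationModelChecker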


The documentation for this class was generated from the following files:
GradientDescentInstantiationSearcher.h
GradientDescentInstantiationSearcher.cpp