This class can be used to model Diagonal Gaussian Mixture Models.
Inheritance:
Public Fields
-
int n_gaussians
- number of Gaussians in the mixture
-
real prior_weights
- prior weights of the Gaussians, used in EM to give a small prior on each Gaussian
-
EMTrainer* initial_kmeans_trainer
- optional initialization; if nothing is given, parameters are initialized randomly, at your own risk.
-
List* initial_kmeans_trainer_measurers
- as well as a measurer of this trainer
-
List* initial_params
- or one can give an initial parameter List
-
char* initial_file
- or one can give an initial file
-
real* log_weights
- pointer to the log-weight parameters
-
real* dlog_weights
- pointer to the derivatives of the log-weight parameters
-
real* var_threshold
- this contains the minimal allowed value of each variance
-
real** log_probabilities_g
- for each frame and each Gaussian, its log probability
-
real* sum_log_var_plus_n_obs_log_2_pi
- pre-computed sum_log_var + n_obs * log_2_pi, cached to speed up the likelihood computation
-
real** minus_half_over_var
- pre-computed -0.5 / var
-
real** means_acc
- accumulators for EM
Public Methods
-
DiagonalGMM(int n_observations_, int n_gaussians_, real* var_threshold_, real prior_weights_)
-
virtual real frameLogProbabilityOneGaussian(real* observations, real* inputs, int g)
- this method returns the log probability of the given frame under Gaussian g
Public Fields
-
int n_observations
-
int tot_n_frames
-
int max_n_frames
-
real log_probability
-
real* log_probabilities
Public Methods
-
virtual real logProbability(List* inputs)
-
virtual real viterbiLogProbability(List* inputs)
-
virtual real frameLogProbability(real* observations, real* inputs, int t)
-
virtual void frameExpectation(real* observations, real* inputs, int t)
-
virtual void eMIterInitialize()
-
virtual void iterInitialize()
-
virtual void eMSequenceInitialize(List* inputs)
-
virtual void sequenceInitialize(List* inputs)
-
virtual void eMAccPosteriors(List* inputs, real log_posterior)
-
virtual void frameEMAccPosteriors(real* observations, real log_posterior, real* inputs, int t)
-
virtual void viterbiAccPosteriors(List* inputs, real log_posterior)
-
virtual void frameViterbiAccPosteriors(real* observations, real log_posterior, real* inputs, int t)
-
virtual void eMUpdate()
-
virtual void decode(List* inputs)
-
virtual void eMForward(List* inputs)
-
virtual void viterbiForward(List* inputs)
-
virtual void frameBackward(real* observations, real* alpha, real* inputs, int t)
-
virtual void viterbiBackward(List* inputs, real* alpha)
Public Fields
-
bool is_free
-
List* params
-
List* der_params
-
int n_params
-
real* beta
Public Methods
-
virtual void init()
-
virtual int numberOfParams()
-
virtual void backward(List* inputs, real* alpha)
-
virtual void allocateMemory()
-
virtual void freeMemory()
-
virtual void loadFILE(FILE* file)
-
virtual void saveFILE(FILE* file)
Inherited from Machine:
Public Fields
-
int n_inputs
-
int n_outputs
-
List* outputs
Public Methods
-
virtual void forward(List* inputs)
-
virtual void reset()
Inherited from Object:
Public Methods
-
void addOption(const char* name, int size, void* ptr, const char* help="", bool is_allowed_after_init=false)
-
void addIOption(const char* name, int* ptr, int init_value, const char* help="", bool is_allowed_after_init=false)
-
void addROption(const char* name, real* ptr, real init_value, const char* help="", bool is_allowed_after_init=false)
-
void addBOption(const char* name, bool* ptr, bool init_value, const char* help="", bool is_allowed_after_init=false)
-
void setOption(const char* name, void* ptr)
-
void setIOption(const char* name, int option)
-
void setROption(const char* name, real option)
-
void setBOption(const char* name, bool option)
-
void load(const char* filename)
-
void save(const char* filename)
Documentation
This class can be used to model Diagonal Gaussian Mixture Models.
They can be trained using either EM (with EMTrainer) or gradient descent
(with GMTrainer).
int n_gaussians
- number of Gaussians in the mixture
real prior_weights
- prior weights of the Gaussians, used in EM to give
a small prior on each Gaussian
EMTrainer* initial_kmeans_trainer
- optional initialization: one can give an initial trainer containing a Kmeans;
if nothing is given, parameters are initialized randomly, at your own risk.
List* initial_kmeans_trainer_measurers
- as well as a measurer of this trainer
List* initial_params
- or one can give an initial parameter List
char* initial_file
- or one can give an initial file
real* log_weights
- pointer to the log-weight parameters
real* dlog_weights
- pointer to the derivatives of the log-weight parameters
real* var_threshold
- this contains the minimal allowed value of each variance
real** log_probabilities_g
- for each frame and each Gaussian, its log probability
real* sum_log_var_plus_n_obs_log_2_pi
- pre-computed sum_log_var + n_obs * log_2_pi, cached to speed up the
likelihood computation
real** minus_half_over_var
- pre-computed -0.5 / var
real** means_acc
- accumulators for EM
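The cached fields above come from splitting the diagonal Gaussian log-density into a data-independent constant and a per-dimension quadratic term; a sketch of the algebra, with sigma^2_{g,d} the d-th variance of Gaussian g:

```latex
\log \mathcal{N}\!\left(x;\,\mu_g,\,\mathrm{diag}(\sigma_g^2)\right)
  = -\tfrac{1}{2}\sum_{d=1}^{n_{\mathrm{obs}}}
      \left[\log\!\left(2\pi\sigma_{g,d}^2\right)
      + \frac{(x_d-\mu_{g,d})^2}{\sigma_{g,d}^2}\right]
  = -\tfrac{1}{2}\Big(\sum_{d}\log\sigma_{g,d}^2
      + n_{\mathrm{obs}}\log 2\pi\Big)
    + \sum_{d}\frac{-1}{2\sigma_{g,d}^2}\,(x_d-\mu_{g,d})^2 .
```

The first parenthesized sum is exactly what sum_log_var_plus_n_obs_log_2_pi caches, and the per-dimension factor -1/(2*sigma^2) is minus_half_over_var, so evaluating a frame costs only one multiply-add per dimension.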
DiagonalGMM(int n_observations_, int n_gaussians_, real* var_threshold_, real prior_weights_)
virtual real frameLogProbabilityOneGaussian(real* observations, real* inputs, int g)
- this method returns the log probability of the given frame under Gaussian g
- Direct child classes:
- Kmeans
- Author:
- Samy Bengio (bengio@idiap.ch)
This page was generated with the help of DOC++.