Class EM
Expectation Maximization model
public class EM : UnmanagedObject, IDisposable, IStatModel, IAlgorithm
- Inheritance
UnmanagedObject
EM
- Implements
IDisposable, IStatModel, IAlgorithm
Constructors
EM()
Create an Expectation Maximization model
public EM()
Properties
ClustersNumber
The number of mixture components
public int ClustersNumber { get; set; }
Property Value
- int
CovarianceMatrixType
The type of the mixture covariance matrices
public EM.CovarianMatrixType CovarianceMatrixType { get; set; }
Property Value
- EM.CovarianMatrixType
TermCriteria
Termination criteria of the procedure. The EM algorithm stops either after a certain number of iterations (term_crit.num_iter) or when the parameters change by no more than term_crit.epsilon between iterations
public MCvTermCriteria TermCriteria { get; set; }
Property Value
- MCvTermCriteria
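These properties are typically set before calling one of the training methods below. A minimal configuration sketch in C# (the Diagonal enum member name and the specific values are assumptions for illustration):

using Emgu.CV;
using Emgu.CV.ML;
using Emgu.CV.Structure;

// Sketch: create and configure an EM model before training.
// The CovarianMatrixType member name (Diagonal) and the values are illustrative.
using (EM em = new EM())
{
    em.ClustersNumber = 3;                            // number of mixture components
    em.CovarianceMatrixType = EM.CovarianMatrixType.Diagonal;
    em.TermCriteria = new MCvTermCriteria(100, 1e-6); // stop after 100 iterations or when change < 1e-6
}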
Methods
DisposeObject()
Release the memory associated with this EM model
protected override void DisposeObject()
Predict(IInputArray, IOutputArray)
Predict posterior probabilities for the provided samples
public MCvPoint2D64f Predict(IInputArray samples, IOutputArray probs = null)
Parameters
samples
IInputArray: The input samples
probs
IOutputArray: The prediction results; should have the same number of rows as the samples
Returns
- MCvPoint2D64f
The result: a likelihood logarithm value and the index of the most probable mixture component for the sample
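A usage sketch for Predict, assuming em is an already-trained EM instance fitted on 2-dimensional samples; the helper name MostProbableComponent is hypothetical:

using Emgu.CV;
using Emgu.CV.ML;
using Emgu.CV.Structure;

// Sketch: query a trained EM model for a single 2-dimensional sample and pick
// the mixture component with the highest posterior probability from probs.
static int MostProbableComponent(EM em, double x, double y)
{
    using (Matrix<double> sample = new Matrix<double>(1, 2))
    using (Matrix<double> probs = new Matrix<double>(1, em.ClustersNumber))
    {
        sample[0, 0] = x;
        sample[0, 1] = y;
        MCvPoint2D64f result = em.Predict(sample, probs); // likelihood logarithm + component index
        int best = 0;
        for (int k = 1; k < probs.Cols; k++)
            if (probs[0, k] > probs[0, best])
                best = k;
        return best;
    }
}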
TrainM(IInputArray, IInputArray, IOutputArray, IOutputArray, IOutputArray)
Estimate the Gaussian mixture parameters from a samples set. This variation starts with the Maximization step: you need to provide initial probabilities of sample membership in each mixture component (probs0). Unlike many of the ML models, EM is an unsupervised learning algorithm and it does not take responses (class labels or function values) as input. Instead, it computes the Maximum Likelihood Estimate of the Gaussian mixture parameters from an input sample set, stores all the parameters inside the structure, and optionally computes the output "class label" for each sample. The trained model can be used further for prediction, just like any other classifier.
public void TrainM(IInputArray samples, IInputArray probs0, IOutputArray logLikelihoods = null, IOutputArray labels = null, IOutputArray probs = null)
Parameters
samples
IInputArray: Samples from which the Gaussian mixture model will be estimated. It should be a one-channel matrix, each row of which is a sample. If the matrix does not have CV_64F type, it will be converted to an inner matrix of such type for further computation.
probs0
IInputArray: Initial probabilities of each sample belonging to each mixture component. It is a one-channel floating-point matrix of nsamples x nclusters size.
logLikelihoods
IOutputArray: The optional output matrix that contains a likelihood logarithm value for each sample. It has nsamples x 1 size and CV_64FC1 type.
labels
IOutputArray: The optional output "class labels" (indices of the most probable mixture component for each sample). It has nsamples x 1 size and CV_32SC1 type.
probs
IOutputArray: The optional output matrix that contains posterior probabilities of each Gaussian mixture component given each sample. It has nsamples x nclusters size and CV_64FC1 type.
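A training sketch for TrainM, under the assumption that uniform initial membership probabilities are an acceptable probs0; the sample count, dimensionality, and random data are illustrative:

using System;
using Emgu.CV;
using Emgu.CV.ML;

// Sketch: train with TrainM, supplying uniform initial membership probabilities.
int nSamples = 300, dims = 2, nClusters = 3;
Random rng = new Random(0);

using (EM em = new EM())
using (Matrix<double> samples = new Matrix<double>(nSamples, dims))
using (Matrix<double> probs0 = new Matrix<double>(nSamples, nClusters))
using (Matrix<int> labels = new Matrix<int>(nSamples, 1))
{
    em.ClustersNumber = nClusters;

    for (int i = 0; i < nSamples; i++)
    {
        for (int d = 0; d < dims; d++)
            samples[i, d] = rng.NextDouble();   // toy data; use real samples in practice
        for (int k = 0; k < nClusters; k++)
            probs0[i, k] = 1.0 / nClusters;     // uniform initial membership probabilities
    }

    em.TrainM(samples, probs0, null, labels, null);
    // labels now holds the most probable component index for each sample.
}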
trainE(IInputArray, IInputArray, IInputArray, IInputArray, IOutputArray, IOutputArray, IOutputArray)
Estimate the Gaussian mixture parameters from a samples set. This variation starts with the Expectation step. You need to provide initial means of the mixture components. Optionally you can pass initial weights and covariance matrices of the mixture components.
public void trainE(IInputArray samples, IInputArray means0, IInputArray covs0 = null, IInputArray weights0 = null, IOutputArray loglikelihoods = null, IOutputArray labels = null, IOutputArray probs = null)
Parameters
samples
IInputArray: Samples from which the Gaussian mixture model will be estimated. It should be a one-channel matrix, each row of which is a sample. If the matrix does not have CV_64F type, it will be converted to an inner matrix of such type for further computation.
means0
IInputArray: Initial means of mixture components. It is a one-channel matrix of nclusters x dims size. If the matrix does not have CV_64F type, it will be converted to an inner matrix of such type for further computation.
covs0
IInputArray: The vector of initial covariance matrices of mixture components. Each covariance matrix is a one-channel matrix of dims x dims size. If the matrices do not have CV_64F type, they will be converted to inner matrices of such type for further computation.
weights0
IInputArray: Initial weights of mixture components. It should be a one-channel floating-point matrix with 1 x nclusters or nclusters x 1 size.
loglikelihoods
IOutputArray: The optional output matrix that contains a likelihood logarithm value for each sample. It has nsamples x 1 size and CV_64FC1 type.
labels
IOutputArray: The optional output "class labels" (indices of the most probable mixture component for each sample). It has nsamples x 1 size and CV_32SC1 type.
probs
IOutputArray: The optional output matrix that contains posterior probabilities of each Gaussian mixture component given each sample. It has nsamples x nclusters size and CV_64FC1 type.
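A training sketch for trainE with hand-picked initial means; covs0 and weights0 are left null (they are optional per the signature above), and the toy data around two cluster centres is an assumption for illustration:

using System;
using Emgu.CV;
using Emgu.CV.ML;

// Sketch: train with trainE, providing initial component means only.
int nSamples = 200, dims = 2, nClusters = 2;
Random rng = new Random(1);

using (EM em = new EM())
using (Matrix<double> samples = new Matrix<double>(nSamples, dims))
using (Matrix<double> means0 = new Matrix<double>(nClusters, dims))
using (Matrix<int> labels = new Matrix<int>(nSamples, 1))
{
    em.ClustersNumber = nClusters;

    // Toy data: two clusters around (0,0) and (5,5).
    for (int i = 0; i < nSamples; i++)
    {
        double offset = (i < nSamples / 2) ? 0.0 : 5.0;
        samples[i, 0] = offset + rng.NextDouble();
        samples[i, 1] = offset + rng.NextDouble();
    }

    // Initial guesses for the component means (nclusters x dims).
    means0[0, 0] = 0.5; means0[0, 1] = 0.5;
    means0[1, 0] = 5.5; means0[1, 1] = 5.5;

    em.trainE(samples, means0, null, null, null, labels, null);
    // labels holds the index of the most probable component per sample.
}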