java.lang.Object
  de.jstacs.algorithms.optimization.DifferentiableFunction
    de.jstacs.classifiers.differentiableSequenceScoreBased.OptimizableFunction
      de.jstacs.classifiers.differentiableSequenceScoreBased.AbstractOptimizableFunction
        de.jstacs.classifiers.differentiableSequenceScoreBased.AbstractMultiThreadedOptimizableFunction
          de.jstacs.classifiers.differentiableSequenceScoreBased.DiffSSBasedOptimizableFunction
            de.jstacs.classifiers.differentiableSequenceScoreBased.gendismix.LogGenDisMixFunction
public class LogGenDisMixFunction

This class implements the objective function of the GenDisMix learning principle, i.e., a weighted sum of the conditional log-likelihood, the log-likelihood, and the log prior, where the weights of the three terms are given by beta. The function is evaluated using multiple threads; it is therefore important that the DifferentiableSequenceScore.clone() method works correctly, since each thread works on its own clones.
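The weighted structure of this objective can be illustrated with a minimal, self-contained sketch. The class and parameter names below (`GenDisMixSketch`, `objective`, the toy log-values) are hypothetical stand-ins, not part of the Jstacs API; the sketch only shows how the three terms are combined by the beta-weights.

```java
// Illustrative sketch of the GenDisMix objective: a beta-weighted sum of the
// conditional log-likelihood, the log-likelihood, and the log prior.
// All names and values are toy stand-ins, not the Jstacs implementation.
public class GenDisMixSketch {

    /** Returns beta[0]*cll + beta[1]*ll + beta[2]*logPrior. */
    public static double objective(double[] beta, double cll, double ll, double logPrior) {
        return beta[0] * cll + beta[1] * ll + beta[2] * logPrior;
    }

    public static void main(String[] args) {
        double[] beta = { 0.5, 0.3, 0.2 };
        // e.g., CLL = -10, LL = -20, log prior = -1
        System.out.println(objective(beta, -10.0, -20.0, -1.0));
    }
}
```

Setting one weight to 1 and the others to 0 recovers a pure learning principle (e.g., maximum conditional likelihood); mixed weights interpolate between the generative and discriminative extremes.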
Nested Class Summary

Nested classes/interfaces inherited from class de.jstacs.classifiers.differentiableSequenceScoreBased.OptimizableFunction:
OptimizableFunction.KindOfParameter
Field Summary

| Modifier and Type | Field | Description |
|---|---|---|
| protected double[] | beta | The mixture parameters of the GenDisMix |
| protected double[][] | cllGrad | Array for the gradient of the conditional log-likelihood |
| protected double[][] | helpArray | General temporary array |
| protected double[][] | llGrad | Array for the gradient of the log-likelihood |
| protected double[] | prGrad | Array for the gradient of the prior |
Fields inherited from class de.jstacs.classifiers.differentiableSequenceScoreBased.DiffSSBasedOptimizableFunction:
dList, iList, prior, score, shortcut

Fields inherited from class de.jstacs.classifiers.differentiableSequenceScoreBased.AbstractMultiThreadedOptimizableFunction:
params

Fields inherited from class de.jstacs.classifiers.differentiableSequenceScoreBased.AbstractOptimizableFunction:
cl, clazz, data, freeParams, logClazz, norm, sum, weights
Constructor Summary

| Constructor | Description |
|---|---|
| LogGenDisMixFunction(int threads, DifferentiableSequenceScore[] score, DataSet[] data, double[][] weights, LogPrior prior, double[] beta, boolean norm, boolean freeParams) | The constructor for creating an instance that can be used in an Optimizer. |
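Since beta holds the weights for the three terms of the learning principle, a caller typically normalizes raw weights so that they sum to 1 before passing them to the constructor. The helper below is a hypothetical sketch of that preprocessing step, not part of the Jstacs API; the assumption that the weights are normalized to sum to 1 comes from the GenDisMix learning principle, not from this page.

```java
// Hypothetical helper for building the beta argument of the constructor:
// the three raw weights (conditional likelihood, likelihood, prior) are
// normalized to sum to 1. Not part of the Jstacs API.
public class BetaWeights {

    public static double[] normalized(double cll, double ll, double prior) {
        double sum = cll + ll + prior;
        if (sum <= 0) {
            throw new IllegalArgumentException("weights must have a positive sum");
        }
        return new double[] { cll / sum, ll / sum, prior / sum };
    }
}
```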
Method Summary

| Modifier and Type | Method | Description |
|---|---|---|
| protected void | evaluateFunction(int index, int startClass, int startSeq, int endClass, int endSeq) | This method evaluates the function for a part of the data. |
| protected void | evaluateGradientOfFunction(int index, int startClass, int startSeq, int endClass, int endSeq) | This method evaluates the gradient of the function for a part of the data. |
| protected double | joinFunction() | This method joins the partial results that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateFunction(int, int, int, int, int). |
| protected double[] | joinGradients() | This method joins the gradients of each part that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateGradientOfFunction(int, int, int, int, int). |
| void | reset(DifferentiableSequenceScore[] funs) | This method allows to reset the internally used functions and the corresponding objects. |
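The evaluate/join pattern above can be illustrated with a self-contained sketch: each part evaluates the function over a slice of the (class, sequence) index space, and the partial sums are then joined. The data layout and the toy scoring function below are stand-ins, not the Jstacs implementation; the range semantics (end class inclusive, end sequence exclusive) follow the parameter documentation on this page.

```java
// Illustrative sketch of the multi-threaded evaluation pattern of
// AbstractMultiThreadedOptimizableFunction: evaluate parts, then join.
// The scoring function and data layout are toy stand-ins.
public class PartialEvaluationSketch {

    // toy "log score" for sequence seq of class clazz
    static double score(int clazz, int seq) {
        return -(clazz + 1) * 0.1 * (seq + 1);
    }

    /** Evaluates one part over [startClass..endClass], where startSeq applies to
        the first class and endSeq (exclusive) to the last class. */
    static double evaluatePart(int startClass, int startSeq,
                               int endClass, int endSeq, int[] numSeqs) {
        double sum = 0;
        for (int c = startClass; c <= endClass; c++) {
            int from = (c == startClass) ? startSeq : 0;
            int to = (c == endClass) ? endSeq : numSeqs[c];
            for (int s = from; s < to; s++) {
                sum += score(c, s);
            }
        }
        return sum;
    }

    /** Joins the partial results, as joinFunction() does for per-thread values. */
    static double join(double[] partial) {
        double sum = 0;
        for (double p : partial) {
            sum += p;
        }
        return sum;
    }
}
```

Splitting the data into parts and joining the partial sums yields the same value as a single pass, which is what makes the per-thread decomposition valid.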
Methods inherited from class de.jstacs.classifiers.differentiableSequenceScoreBased.DiffSSBasedOptimizableFunction:
addTermToClassParameter, getClassParams, getDimensionOfScope, getParameters, reset, setParams, setThreadIndependentParameters

Methods inherited from class de.jstacs.classifiers.differentiableSequenceScoreBased.AbstractMultiThreadedOptimizableFunction:
evaluateFunction, evaluateGradientOfFunction, getNumberOfAvailableProcessors, getNumberOfThreads, setDataAndWeights, setParams, stopThreads

Methods inherited from class de.jstacs.classifiers.differentiableSequenceScoreBased.AbstractOptimizableFunction:
getData, getParameters, getSequenceWeights

Methods inherited from class de.jstacs.algorithms.optimization.DifferentiableFunction:
findOneDimensionalMin

Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Field Detail

protected double[][] helpArray
General temporary array

protected double[][] llGrad
Array for the gradient of the log-likelihood

protected double[][] cllGrad
Array for the gradient of the conditional log-likelihood

protected double[] beta
The mixture parameters of the GenDisMix

protected double[] prGrad
Array for the gradient of the prior
Constructor Detail

public LogGenDisMixFunction(int threads, DifferentiableSequenceScore[] score, DataSet[] data, double[][] weights, LogPrior prior, double[] beta, boolean norm, boolean freeParams) throws IllegalArgumentException

The constructor for creating an instance that can be used in an Optimizer.

Parameters:
threads - the number of threads used for evaluating the function and determining the gradient of the function
score - an array containing the DifferentiableSequenceScores that are used for determining the sequence scores; if the weight beta[LearningPrinciple.LIKELIHOOD_INDEX] is positive, all elements of score have to be DifferentiableStatisticalModels
data - the array of DataSets containing the data that is needed to evaluate the function
weights - the weights for each Sequence in each DataSet of data
prior - the prior that is used for learning the parameters
beta - the beta-weights for the three terms of the learning principle
norm - the switch for using the normalization (division by the number of sequences)
freeParams - the switch for using only the free parameters

Throws:
IllegalArgumentException - if the number of threads is not positive, or if the number of classes or the dimension of the weights is not correct

Method Detail
protected double[] joinGradients() throws EvaluationException

This method joins the gradients of each part that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateGradientOfFunction(int, int, int, int, int).

joinGradients in class AbstractMultiThreadedOptimizableFunction

Throws:
EvaluationException - if the gradient could not be evaluated properly

protected void evaluateGradientOfFunction(int index, int startClass, int startSeq, int endClass, int endSeq)

This method evaluates the gradient of the function for a part of the data.

evaluateGradientOfFunction in class AbstractMultiThreadedOptimizableFunction

Parameters:
index - the index of the part
startClass - the index of the start class
startSeq - the index of the start sequence
endClass - the index of the end class (inclusive)
endSeq - the index of the end sequence (exclusive)

protected double joinFunction() throws DimensionException, EvaluationException

This method joins the partial results that have been computed using AbstractMultiThreadedOptimizableFunction.evaluateFunction(int, int, int, int, int).

joinFunction in class AbstractMultiThreadedOptimizableFunction

Throws:
DimensionException - if the parameters could not be set
EvaluationException - if the gradient could not be evaluated properly

protected void evaluateFunction(int index, int startClass, int startSeq, int endClass, int endSeq) throws EvaluationException

This method evaluates the function for a part of the data.

evaluateFunction in class AbstractMultiThreadedOptimizableFunction

Parameters:
index - the index of the part
startClass - the index of the start class
startSeq - the index of the start sequence
endClass - the index of the end class (inclusive)
endSeq - the index of the end sequence (exclusive)

Throws:
EvaluationException - if the gradient could not be evaluated properly

public void reset(DifferentiableSequenceScore[] funs) throws Exception

This method allows to reset the internally used functions and the corresponding objects.

reset in class DiffSSBasedOptimizableFunction

Parameters:
funs - the new instances

Throws:
Exception - if something went wrong
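The class description stresses that DifferentiableSequenceScore.clone() must work correctly because each thread works on its own clones. The reason can be shown with a self-contained sketch: if a scorer holds mutable working memory, sharing one instance across threads would race, while cloning it per thread keeps each worker isolated. The Scorer class below is a toy stand-in, not a Jstacs class.

```java
// Sketch of the clone-per-thread pattern: each worker thread receives its own
// clone of the scorer, so per-thread mutable state (a scratch buffer here) is
// never shared between threads. Scorer is a toy stand-in, not a Jstacs class.
public class ClonePerThreadSketch {

    static class Scorer implements Cloneable {
        double[] scratch = new double[4]; // per-thread working memory

        double score(int x) {
            scratch[0] = x * 0.5; // mutation: unsafe if the instance were shared
            return scratch[0];
        }

        @Override
        public Scorer clone() throws CloneNotSupportedException {
            Scorer c = (Scorer) super.clone();
            c.scratch = scratch.clone(); // deep-copy the mutable state
            return c;
        }
    }

    /** Scores each input on its own thread, giving every thread its own clone. */
    public static double[] scoreInParallel(Scorer template, int[] inputs) throws Exception {
        double[] results = new double[inputs.length];
        Thread[] threads = new Thread[inputs.length];
        for (int i = 0; i < inputs.length; i++) {
            final int idx = i;
            final Scorer own = template.clone(); // one clone per thread
            threads[i] = new Thread(() -> results[idx] = own.score(inputs[idx]));
            threads[i].start();
        }
        for (Thread t : threads) {
            t.join();
        }
        return results;
    }
}
```

A shallow clone that shared the scratch array between clones would break this isolation, which is why a correct (deep enough) clone() implementation is a prerequisite for the multi-threaded evaluation.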