org.apache.mahout.classifier.discriminative
Class PerceptronTrainer

java.lang.Object
  extended by org.apache.mahout.classifier.discriminative.LinearTrainer
      extended by org.apache.mahout.classifier.discriminative.PerceptronTrainer

public class PerceptronTrainer
extends LinearTrainer

Implements training according to the perceptron update rule.


Constructor Summary
PerceptronTrainer(int dimension, double threshold, double learningRate, double init, double initBias)
           
 
Method Summary
protected  void update(double label, Vector dataPoint, LinearModel model)
          Implement this method to match your training strategy.
 
Methods inherited from class org.apache.mahout.classifier.discriminative.LinearTrainer
getModel, train
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Constructor Detail

PerceptronTrainer

public PerceptronTrainer(int dimension,
                         double threshold,
                         double learningRate,
                         double init,
                         double initBias)

Method Detail

update

protected void update(double label,
                      Vector dataPoint,
                      LinearModel model)
Implement this method to match your training strategy. The perceptron update fires only when the predicted label does not match the real label, and changes the weight vector as follows: if the prediction was positive but should have been negative, the example vector (multiplied by the learning rate) is subtracted from the weight vector; if the prediction was negative but should have been positive, the example vector (multiplied by the learning rate) is added to the weight vector.

Specified by:
update in class LinearTrainer
Parameters:
label - the target label of the wrongly classified data point.
dataPoint - the data point that was classified incorrectly.
model - the model to update.
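The update rule described above can be sketched in plain Java, independent of Mahout's Vector and LinearModel types. The class, field, and method names below are illustrative only, not Mahout API; the constructor mirrors the parameters of PerceptronTrainer:

```java
import java.util.Arrays;

// Minimal sketch of the perceptron update rule, assuming a linear model
// of the form: predict positive iff (weights . x + bias) > threshold.
public class PerceptronSketch {
    double[] weights;
    double bias;
    final double learningRate;
    final double threshold;

    PerceptronSketch(int dimension, double threshold, double learningRate,
                     double init, double initBias) {
        this.weights = new double[dimension];
        Arrays.fill(this.weights, init);   // every weight starts at init
        this.bias = initBias;
        this.learningRate = learningRate;
        this.threshold = threshold;
    }

    boolean classify(double[] x) {
        double dot = bias;
        for (int i = 0; i < x.length; i++) {
            dot += weights[i] * x[i];
        }
        return dot > threshold;
    }

    // Called only for a misclassified example, as in update() above.
    // label > 0 means the true label is positive (prediction was negative),
    // so the scaled example is added; otherwise it is subtracted.
    void update(double label, double[] x) {
        double factor = label > 0 ? learningRate : -learningRate;
        for (int i = 0; i < x.length; i++) {
            weights[i] += factor * x[i];
        }
        bias += factor;
    }
}
```

Repeatedly calling update on misclassified points moves the decision boundary toward correctly separating them; the learning rate controls the step size of each correction.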


Copyright © 2008-2012 The Apache Software Foundation. All Rights Reserved.