Converts a real-valued classifier into a conditional probability
estimator. This is achieved by fitting a sigmoid with parameters A and B
to the values of the decision function:

    f(x) --> 1 / (1 + exp(A*f(x) + B))

The fitting procedure is a Levenberg-Marquardt optimization, derived by
Tobias Mann using Mathematica, of the objective function in:

John C. Platt. Probabilistic Outputs for Support Vector Machines and
Comparisons to Regularized Likelihood Methods. In: Advances in Large
Margin Classifiers, A. J. Smola, B. Schoelkopf, D. Schuurmans, eds.,
MIT Press (1999).
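The sigmoid fit described above can be sketched in a few lines. This is a simplified illustration, not the module's implementation: it uses plain gradient descent on the cross-entropy objective rather than the Levenberg-Marquardt scheme mentioned above, omits Platt's smoothed target values, and the names `fit_platt` and `platt_prob` are hypothetical.

```python
import math

def fit_platt(scores, labels, lr=0.01, iters=5000):
    # Fit A, B in p(y=1|f) = 1 / (1 + exp(A*f + B)) by gradient
    # descent on the negative log-likelihood; labels are in {0, 1}.
    # (Platt's paper additionally smooths the 0/1 targets; that
    # regularization is omitted here for brevity.)
    A, B = 0.0, 0.0
    n = len(scores)
    for _ in range(iters):
        gA = gB = 0.0
        for f, y in zip(scores, labels):
            p = 1.0 / (1.0 + math.exp(A * f + B))
            # d(NLL)/dA = (p - y) * (-f), d(NLL)/dB = (p - y) * (-1)
            gA += (p - y) * (-f)
            gB += (p - y) * (-1.0)
        A -= lr * gA / n
        B -= lr * gB / n
    return A, B

def platt_prob(f, A, B):
    # Map a decision value f to a calibrated probability.
    return 1.0 / (1.0 + math.exp(A * f + B))
```

On separable scores the fitted A is negative, so larger decision values map to probabilities closer to 1, as intended.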
test(classifier, data, **args)
    test a classifier on a given dataset
cv(classifier, data, numFolds=5, **args)
    perform k-fold cross-validation
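The fold assignment behind k-fold cross-validation can be sketched generically. `kfold_indices` is a hypothetical helper, not part of this module; it only illustrates how the n patterns are partitioned into numFolds disjoint test sets, each paired with the complementary training set.

```python
def kfold_indices(n, numFolds=5):
    # Partition indices 0..n-1 into numFolds disjoint test folds
    # (here by simple striding) and pair each test fold with the
    # remaining indices as its training set.
    folds = [list(range(i, n, numFolds)) for i in range(numFolds)]
    pairs = []
    for k in range(numFolds):
        test_idx = folds[k]
        held_out = set(test_idx)
        train_idx = [j for j in range(n) if j not in held_out]
        pairs.append((test_idx, train_idx))
    return pairs
```

Each pattern appears in exactly one test fold, so every pattern is tested exactly once over the full run.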
nCV(classifier, data, **args)
    run cross-validation n times, returning a 'ResultsList' object
project(self, data)
    project a test dataset to the training data features
stratifiedCV(classifier, data, numFolds=5, **args)
    perform k-fold stratified cross-validation; in each fold the number
    of patterns from each class is proportional to the relative fraction
    of that class in the dataset
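The stratification property described above (class proportions preserved in every fold) can be sketched by dealing each class's indices round-robin across the folds. `stratified_folds` is a hypothetical illustration, not this module's code.

```python
from collections import defaultdict

def stratified_folds(labels, numFolds=5):
    # Group pattern indices by class label, then deal each class's
    # indices round-robin across the folds, so each fold receives a
    # proportional share of every class.
    byClass = defaultdict(list)
    for i, y in enumerate(labels):
        byClass[y].append(i)
    folds = [[] for _ in range(numFolds)]
    for indices in byClass.values():
        for j, i in enumerate(indices):
            folds[j % numFolds].append(i)
    return folds
```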
trainTest(classifierTemplate, data, trainingPatterns, testingPatterns, **args)
    train a classifier on the list of training patterns, and test it on
    the test patterns
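The train/test split pattern behind trainTest can be sketched generically. Everything below is hypothetical scaffolding for illustration: `Nearest1D` is a toy classifier, and `train_test` only mirrors the idea of copying a template classifier, training it on one index list, and scoring it on another.

```python
import copy

class Nearest1D:
    # Toy 1-D nearest-neighbor classifier, used only to exercise
    # the sketch below.
    def train(self, xs, ys):
        self.xs, self.ys = list(xs), list(ys)

    def classify(self, x):
        i = min(range(len(self.xs)), key=lambda j: abs(self.xs[j] - x))
        return self.ys[i]

def train_test(classifier_template, data, labels, train_idx, test_idx):
    # Copy the template so the caller's object is left untrained,
    # train on the training patterns, then report test accuracy.
    clf = copy.deepcopy(classifier_template)
    clf.train([data[i] for i in train_idx], [labels[i] for i in train_idx])
    hits = sum(clf.classify(data[i]) == labels[i] for i in test_idx)
    return hits / len(test_idx)
```

Copying the template before training keeps each evaluation independent, which is what lets the same template be reused across all folds of a cross-validation run.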