How to do multi class classification using Support Vector Machines (SVM)
Asked Answered

8

55

Every book and example shows only binary classification (two classes), where a new vector can belong to either one of the two classes.

Here the problem is that I have 4 classes (c1, c2, c3, c4), and I have training data for all 4 classes.

For new vector the output should be like

c1 80% (the winner)

c2 10%

c3 6%

c4 4%

How can I do this? I'm planning to use libsvm (because it is the most popular). I don't know much about it. If any of you have used it before, please tell me the specific commands I'm supposed to use.

Anabal answered 24/12, 2009 at 13:1 Comment(0)
41

LibSVM uses the one-against-one approach for multi-class learning problems. From the FAQ:

Q: What method does libsvm use for multi-class SVM ? Why don't you use the "1-against-the rest" method ?

It is one-against-one. We chose it after doing the following comparison: C.-W. Hsu and C.-J. Lin. A comparison of methods for multi-class support vector machines, IEEE Transactions on Neural Networks, 13(2002), 415-425.

"1-against-the rest" is a good method whose performance is comparable to "1-against-1." We do the latter simply because its training time is shorter.
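To see libsvm's one-vs-one machinery produce the kind of per-class percentages the question asks for, here is a minimal sketch using scikit-learn's SVC, which wraps libsvm (scikit-learn is my assumption, not part of this answer; with the libsvm command-line tools the equivalent option is -b 1 on both svm-train and svm-predict):

```python
# Sketch: per-class probability estimates from libsvm's one-vs-one
# multi-class SVM, via scikit-learn's SVC (which wraps libsvm).
from sklearn.datasets import load_digits
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X, y = X[y < 4], y[y < 4]                  # keep 4 classes, like c1..c4

# probability=True enables libsvm's probability estimates (Platt scaling)
clf = SVC(probability=True, random_state=0).fit(X, y)

proba = clf.predict_proba(X[:1])[0]        # one row of class probabilities
for label, p in sorted(zip(clf.classes_, proba), key=lambda t: -t[1]):
    print(f"class {label}: {p:.1%}")       # winner first, like c1 80%, ...
```

The probabilities sum to 1 across the four classes, so the top entry plays the role of "the winner" in the question.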

Saprophyte answered 23/2, 2011 at 8:50 Comment(0)
22

Commonly used methods are One vs. Rest and One vs. One. In the first method you train n classifiers (one per class) and the class with the highest score wins. In the second method you train n(n-1)/2 pairwise classifiers and the resulting class is obtained by majority vote among them.

AFAIR, libsvm supports both strategies of multiclass classification.
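To illustrate the difference in classifier counts, here is a short sketch using scikit-learn's meta-estimators (my assumption; this answer names no library). With n = 4 classes, one-vs-rest trains 4 binary classifiers and one-vs-one trains 4*3/2 = 6:

```python
# Sketch: One vs. Rest trains n classifiers; One vs. One trains n(n-1)/2.
from sklearn.datasets import load_digits
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import LinearSVC

X, y = load_digits(return_X_y=True)
X, y = X[y < 4], y[y < 4]                  # keep 4 classes, like c1..c4

ovr = OneVsRestClassifier(LinearSVC()).fit(X, y)
ovo = OneVsOneClassifier(LinearSVC()).fit(X, y)

print(len(ovr.estimators_))   # 4 binary classifiers (one per class)
print(len(ovo.estimators_))   # 6 = 4*3/2 binary classifiers (one per pair)
```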

Kyles answered 24/12, 2009 at 13:20 Comment(1)
I thought libsvm only supports one vs. one. But it works really well, though.Moberg
7

You can always reduce a multi-class classification problem to binary problems by recursively choosing partitions of the set of classes (e.g. at random). This is not necessarily any less effective or efficient than learning all classes at once, since each sub-problem involves fewer classes and therefore requires fewer examples. (It may take at most a constant factor longer, e.g. twice as long.) It may also lead to more accurate learning.

I'm not necessarily recommending this, but it is one answer to your question, and is a general technique that can be applied to any binary learning algorithm.
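An illustrative sketch of this recursive-partition idea (my own code, not from this answer): split the class set in half, train a binary classifier to decide which half a sample falls in, and recurse until a single class remains. Any binary learner works; a linear SVM via scikit-learn is assumed here, and a fixed half-split is used instead of a random one for determinism:

```python
# Sketch: reduce multi-class to binary by recursively partitioning classes.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.svm import LinearSVC

def train_tree(X, y, classes):
    if len(classes) == 1:
        return classes[0]                            # leaf: one class left
    left, right = classes[: len(classes)//2], classes[len(classes)//2 :]
    keep = np.isin(y, classes)                       # only this subtree's data
    mask = np.isin(y, left)                          # binary target: left half?
    clf = LinearSVC().fit(X[keep], mask[keep])
    return (clf, train_tree(X, y, left), train_tree(X, y, right))

def predict_tree(node, x):
    if not isinstance(node, tuple):
        return node                                  # reached a leaf class
    clf, left, right = node
    branch = left if clf.predict(x.reshape(1, -1))[0] else right
    return predict_tree(branch, x)

X, y = load_iris(return_X_y=True)
tree = train_tree(X, y, classes=[0, 1, 2])
preds = np.array([predict_tree(tree, x) for x in X])
print((preds == y).mean())                           # training accuracy
```

Each internal node is just a binary learning problem, which is the reduction the answer describes.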

Deemphasize answered 24/12, 2009 at 13:9 Comment(0)
1

Use the SVM Multiclass library. You can find it on the SVM page by Thorsten Joachims.

Smock answered 2/12, 2011 at 20:7 Comment(0)
0

LibSVM does not have a specific switch (command) for multi-class prediction; it handles multi-class prediction automatically if your training dataset contains more than two classes.
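The same is true of scikit-learn's SVC, which wraps libsvm (my illustration, not from this answer): the identical training call works unchanged whether the labels contain two classes or more.

```python
# Sketch: no special switch needed; the number of classes is detected
# from the labels in the training data.
from sklearn.datasets import load_iris
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

binary = SVC().fit(X[y < 2], y[y < 2])   # 2 classes in y -> binary SVM
multi  = SVC().fit(X, y)                 # 3 classes in y -> one-vs-one

print(binary.classes_)   # [0 1]
print(multi.classes_)    # [0 1 2]
```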

Unpin answered 1/10, 2015 at 15:44 Comment(0)
0

There is nothing special compared with binary prediction. See the following example of 3-class prediction based on SVM, using the e1071 package in R (an interface to libsvm).

install.packages("e1071")
library("e1071")
data(iris)
attach(iris)
## classification mode
# default with factor response:
model <- svm(Species ~ ., data = iris)
# alternatively the traditional interface:
x <- subset(iris, select = -Species)
y <- Species
model <- svm(x, y) 
print(model)
summary(model)
# test with train data
pred <- predict(model, x)
# (same as:)
pred <- fitted(model)
# Check accuracy:
table(pred, y)
# compute decision values and probabilities:
pred <- predict(model, x, decision.values = TRUE)
attr(pred, "decision.values")[1:4,]
# visualize (classes by color, SV by crosses):
plot(cmdscale(dist(iris[,-5])),
     col = as.integer(iris[,5]),
     pch = c("o","+")[1:150 %in% model$index + 1])
Abbeyabbi answered 25/7, 2016 at 18:10 Comment(0)
0
data = load('E:\dataset\scene_categories\all_dataset.mat');
meas = data.all_dataset;
species = data.dataset_label;
[g gn] = grp2idx(species);                      %# nominal class to numeric

%# split training/testing sets
[trainIdx testIdx] = crossvalind('HoldOut', species, 1/10);
%# 1-vs-1 pairwise models
num_labels = length(gn);
clear gn;
num_classifiers = num_labels*(num_labels-1)/2;
pairwise = zeros(num_classifiers ,2);
row_end = 0;
for i=1:num_labels - 1
    row_start = row_end + 1;
    row_end = row_start + num_labels - i -1;
    pairwise(row_start : row_end, 1) = i;
    count = 0;
    for j = i+1 : num_labels        
        pairwise( row_start + count , 2) = j;
        count = count + 1;
    end    
end
clear row_start row_end count i j num_labels num_classifiers;
svmModel = cell(size(pairwise,1),1);            %# store binary-classifers
predTest = zeros(sum(testIdx),numel(svmModel)); %# store binary predictions

%# classify using one-against-one approach, SVM with RBF kernel
for k=1:numel(svmModel)
    %# get only training instances belonging to this pair
    idx = trainIdx & any( bsxfun(@eq, g, pairwise(k,:)) , 2 );

    %# train
    svmModel{k} = svmtrain(meas(idx,:), g(idx), ...
                 'Autoscale',true, 'Showplot',false, 'Method','QP', ...
                 'BoxConstraint',2e-1, 'Kernel_Function','rbf', 'RBF_Sigma',1);

    %# test
    predTest(:,k) = svmclassify(svmModel{k}, meas(testIdx,:));
end
pred = mode(predTest,2);   %# voting: classify as the class receiving most votes

%# performance
cmat = confusionmat(g(testIdx),pred);
acc = 100*sum(diag(cmat))./sum(cmat(:));
fprintf('SVM (1-against-1):\naccuracy = %.2f%%\n', acc);
fprintf('Confusion Matrix:\n'), disp(cmat)
Conversation answered 4/1, 2017 at 12:6 Comment(2)
Can you add a description to help users understand it? Your answer has less value if it is code alone. :/Alert
all_dataset.mat has 15 classes; I use multi-class SVM based on constructing one-vs-one binary SVM classifiers.Conversation
0

Multi-class classification with an SVM does NOT have to use one vs. one or one vs. rest.

Instead, you can learn a single two-class-style classifier over joint feature vectors Φ(x, y), where x is the data and y is a candidate label.

At training time, the margin is the difference between the score of the correct class and the score of the nearest other class.

At inference, choose the y that maximizes the score:

y = argmax_{y'} W · Φ(x, y')   [W is the weight vector and Φ(x, y') is the joint feature vector]
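A minimal numpy sketch of this inference rule (my illustration, assuming the common joint feature map where Φ(x, y) places x in the weight block belonging to class y, as in Crammer–Singer multiclass SVM; the weights here are toy values, not trained):

```python
# Sketch: y = argmax_{y'} W . Phi(x, y'). With one weight block per class,
# W has shape (num_classes, num_features), and W . Phi(x, y') reduces to
# the dot product of row y' of W with x.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))       # 4 classes, 3 features (toy weights)
x = rng.standard_normal(3)            # one input vector

scores = W @ x                        # one score per candidate label y'
y_hat = int(np.argmax(scores))        # the predicted class
print(y_hat)
```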

Please visit: https://nlp.stanford.edu/IR-book/html/htmledition/multiclass-svms-1.html

Peeve answered 17/6, 2020 at 12:41 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.