Matlab libsvm - how to find the w coefficients
How can I find the vector w, i.e. the vector perpendicular to the separating hyperplane?

Luckless asked 12/4, 2012 at 20:34 Comment(1)
And please give a solution for how to find the vector w in Java?Legging

This is how I did it. If I remember correctly, this is based on how the dual form of the SVM optimisation works out.

model = svmtrain(...);
w = (model.sv_coef' * full(model.SVs));
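To spell out the dual-form connection (as I understand it): the primal weight vector is

w = sum_i alpha_i * y_i * x_i   (summed over the support vectors)

and libsvm's model.sv_coef already stores the products alpha_i * y_i, so the matrix product above computes exactly this sum.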

And the bias is (and I don't really remember why it's negative):

bias = -model.rho;

Then, to do the classification (for a linear SVM) on an N-by-M dataset 'features' with N instances and M features:

predictions = sign(features * w' + bias);

If the kernel is not linear, then this won't give you the right answer.
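As a concrete sanity check, here is a minimal sketch (it assumes the libsvm MATLAB interface is on the path and that xTrain/yTrain/xTest/yTest are your own data; the variable names are just for illustration):

model = svmtrain(yTrain, xTrain, '-t 0');            % linear kernel
w     = model.sv_coef' * full(model.SVs);            % 1-by-M primal weight vector
bias  = -model.rho;

[pred, acc, dec] = svmpredict(yTest, xTest, model);  % dec holds libsvm's decision values
manual_dec = xTest * w' + bias;                      % manual decision values
disp(max(abs(manual_dec - dec)));                    % should be close to zero

predictions = sign(manual_dec);                      % +1/-1 relative to libsvm's internal label ordering (model.Label)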

For more information, see "How could I generate the primal variable w of linear SVM?" in the libsvm documentation.

Papain answered 12/4, 2012 at 22:1 Comment(16)
And how exactly do I use these to do manual classification? I am talking about the two-class case.Luckless
I've edited my answer to explain, though there isn't really any point in doing it... it will return exactly the same result as probs in [guess, acc, probs] = svmpredict(...);Papain
Nope, it didn't return the same result; it only returns -1s. I know there's no point in it, but I would like to make sure it is correct, and then I will simply use the weights in a separate application to do classification 'manually' without explicitly performing any SVM stuff. Thanks a lot, this is really a bottleneck in my project. Can you please double-check this and see where the bug is?Luckless
It worked for me. Is your model trained properly? Do w and bias look like sensible values before you do the classification with them? What options are you passing to svmtrain?Papain
Here is what I do: model = svmtrain(yTrain, xTrain, '-b 1'); [predicted_label, accuracy, z] = svmpredict(yTest, xTest, model, '-b 1'); w = (model.sv_coef' * full(model.SVs)); bias = -model.rho; manPredictions = sign(xTest * w' - bias); Thanks a lot for trying to help out, Richard, I really appreciate it. P.S. Somebody please format the code.Luckless
Maybe I should compare with norm(w) to determine the sign(class)?Luckless
Ah, you are using an RBF kernel (to use linear, pass '-t 0'). In order to compute the predictions, you need to put xTest through the kernel function. The RBF kernel looks like K(a, b) = exp(-gamma*||a - b||^2). For each example, you compute K(a, b) between xTest(i, :) and each of the model.SVs, then you multiply this by model.sv_coef. The gamma value is a parameter - I think it's somewhere in model.Parameters. If you actually want to do this for an RBF kernel, then I can give more details.Papain
No, I simply want to use the basic linear kernel, which I thought was the default. So, to switch to this, I simply need to add the '-t 0' parameter?Luckless
Yes. Then you should find that the code I gave produces the same result as the z output from svmpredict (unless '-b 1' changes this - if it seems wrong, try '-b 0').Papain
I do model = svmtrain(yTrain, xTrain, '-t 0'); [predicted_label, accuracy, z] = svmpredict(yTest, xTest, model, '-t 0'); and it says "Unknown option: -t".Luckless
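The likely cause of that error is that '-t' is a training option, so it should only be passed to svmtrain, not to svmpredict. A minimal sketch of the intended call pattern (same variable names as above):

model = svmtrain(yTrain, xTrain, '-t 0');                          % select the linear kernel at training time
[predicted_label, accuracy, z] = svmpredict(yTest, xTest, model);  % no '-t' option here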
By the way, can you please also give me more details about how to use it with the RBF kernel? Thank you very much!!!Luckless
Awesome, I actually got it to work for the linear kernel. And how would I do it with the RBF?Luckless
I'll need to take some time to look into the RBF kernel; I couldn't say how to find the prediction manually off the top of my head. I'm a bit busy but will try to get back to you in a few hours.Papain
Phew, got it. It's a bit complicated! arrayfun(@(i) a.svm.sv_coef' * exp(-a.svm.Parameters(4) .* sum((repmat(features(i, :), size(a.svm.SVs, 1), 1) - a.svm.SVs).^2, 2)) - a.svm.rho, 1:numel(labels))' The idea is that we run the kernel function on each example against each support vector, then multiply by the SV coefficients, and finally add the bias. Parameters(4) is the gamma parameter.Papain
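For readability, here is the same RBF computation written out as a sketch (field names follow the libsvm MATLAB model struct; 'features' is an N-by-M test matrix, and the trained model is assumed to use the RBF kernel):

gamma = model.Parameters(4);       % the RBF gamma used at training time
SV    = full(model.SVs);           % support vectors, one per row
dec   = zeros(size(features, 1), 1);
for i = 1:size(features, 1)
    % squared Euclidean distances from this example to every support vector
    d2 = sum((SV - repmat(features(i, :), size(SV, 1), 1)).^2, 2);
    % kernel values weighted by the dual coefficients, minus rho (i.e. plus the bias)
    dec(i) = model.sv_coef' * exp(-gamma .* d2) - model.rho;
end
predictions = sign(dec);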
And how do you find the vector w in Java?Legging
Should the ' after w be removed? As w should be a column vector and the features are row vectors. By the way, could you update your demo for MATLAB R2016a? Thank you.Derron
