I'm trying to solve the problem

d = 0.5 * ||X - \Sigma||_F + 0.01 * ||XX||_1,

where X is a symmetric positive definite matrix whose diagonal elements are all 1, and XX is the same as X except that its diagonal entries are set to 0. \Sigma is known; I want to find the X that minimizes d.
My code is as follows:
using Convex
m = 5
A = randn(m, m)
x = Semidefinite(m)
xx = x
xx[diagind(xx)] .= 0
obj = vecnorm(A - x, 2) + sumabs(xx) * 0.01
pro = minimize(obj, [x >= 0])
pro.constraints += [x[diagind(x)] .= 1]
solve!(pro)
MethodError: no method matching diagind(::Convex.Variable)
I'm just trying to solve this optimization problem by constraining the diagonal elements of the matrix, but it seems the diagind function does not work on a Convex.Variable. How can I solve the problem?
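One possible workaround (my sketch, not from the original post): Convex.jl variables do not support `diagind`-based indexing assignment, but you can build the zero-diagonal part as an *expression* using the `diag` and `diagm` atoms, and constrain the diagonal entries with equality constraints instead. The sketch below assumes Convex.jl with the SCS solver and uses `A' * A` as a stand-in for the known matrix \Sigma.

```julia
using Convex, SCS, LinearAlgebra

m = 5
A = randn(m, m)
Sigma = A' * A                  # example known matrix (stand-in for Σ)

X = Semidefinite(m)             # symmetric PSD variable
XX = X - diagm(diag(X))         # X with its diagonal zeroed, built as an expression

# entrywise 1-norm of the off-diagonal part, Frobenius norm via vec
obj = 0.5 * norm(vec(X - Sigma), 2) + 0.01 * sum(abs(XX))

# unit diagonal as equality constraints instead of index assignment
constraints = [X[i, i] == 1 for i in 1:m]

problem = minimize(obj, constraints)
solve!(problem, SCS.Optimizer)
```

After solving, `evaluate(X)` returns the numerical optimizer; its diagonal should be 1 up to solver tolerance.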
Comment: What is ||XX||_1 here, if you define it as sum(abs(XX))? As far as I remember, the operator 1-norm would be the maximum of the absolute column sums, and the Schatten 1-norm the sum of the singular values. Am I overlooking something? – Jaunitajaunt
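For reference, the three candidate meanings of ||M||_1 raised in the comment can be written side by side (my summary; the code above uses the entrywise version):

```latex
\|M\|_{1,\mathrm{entrywise}} = \sum_{i,j} |M_{ij}|, \qquad
\|M\|_{1,\mathrm{op}} = \max_j \sum_i |M_{ij}|, \qquad
\|M\|_{S_1} = \sum_k \sigma_k(M)
```

In Convex.jl terms, `sum(abs(XX))` is the entrywise 1-norm, which is the usual sparsity-inducing penalty for off-diagonal entries.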