*My training function:*

```python
def train(model, criterion, optimizer, epochs):
    train_loss = []
    validation_loss = []
    train_acc = []
    validation_acc = []
    states = ['Train', 'Valid']
    for epoch in range(epochs):
        print("epoch : {}/{}".format(epoch + 1, epochs))
        for phase in states:
            if phase == 'Train':
                model.train()              # training mode for the train phase
                dataload = train_data_loader
            else:
                model.eval()               # evaluation mode for the validation phase
                dataload = valid_data_loader
            run_loss, run_acc = 0, 0       # accumulators for loss and accuracy
            for data in dataload:
                inputs, labels = data
                inputs = inputs.to(device)
                labels = labels.to(device)
                labels = labels.byte()
                optimizer.zero_grad()      # reset gradients before the forward pass
                with torch.set_grad_enabled(phase == 'Train'):
                    outputs = model(inputs)
                    loss = criterion(outputs, labels.unsqueeze(1).float())
                    predict = outputs >= 0.5
                    if phase == 'Train':
                        loss.backward()    # backward propagation
                        optimizer.step()
                    acc = torch.sum(predict == labels.unsqueeze(1))
                    run_loss += loss.item()
                    run_acc += acc.item() / len(labels)
            if phase == 'Train':           # record training loss and accuracy
                epoch_loss = run_loss / len(train_data_loader)
                train_loss.append(epoch_loss)
                epoch_acc = run_acc / len(train_data_loader)
                train_acc.append(epoch_acc)
            else:                          # record validation loss and accuracy
                epoch_loss = run_loss / len(valid_data_loader)
                validation_loss.append(epoch_loss)
                epoch_acc = run_acc / len(valid_data_loader)
                validation_acc.append(epoch_acc)
            print("{}, loss: {}, accuracy: {}".format(phase, epoch_loss, epoch_acc))
    history = {'Train_loss': train_loss, 'Train_accuracy': train_acc,
               'Validation_loss': validation_loss, 'Validation_Accuracy': validation_acc}
    return model, history
```
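For reference, `device`, `train_data_loader`, and `valid_data_loader` are globals defined earlier in my notebook. A rough sketch of that setup is below; the data, model, criterion, and hyperparameters are placeholders, not the exact ones from my project (`nn.BCELoss` is what would match the `outputs >= 0.5` thresholding in the loop):

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# placeholder data: 100 samples with 10 features each, binary labels
features = torch.randn(100, 10)
targets = torch.randint(0, 2, (100,))
train_data_loader = DataLoader(TensorDataset(features[:80], targets[:80]), batch_size=16)
valid_data_loader = DataLoader(TensorDataset(features[80:], targets[80:]), batch_size=16)

# placeholder binary classifier with a single sigmoid output
model = nn.Sequential(nn.Linear(10, 1), nn.Sigmoid()).to(device)
criterion = nn.BCELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

model, history = train(model, criterion, optimizer, epochs=5)
```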
I am getting the error `0D or 1D target tensor expected, multi-target not supported`. Could you please help me rectify the code above? I have referred to the previous related articles but was unable to get the desired result. Which code snippets do I have to change so that my model runs successfully? Any suggestions are welcome. Thanks in advance.