In relation to my previous Stack Overflow post, Model() got multiple values for argument 'nr_class' - SpaCy multi-classification model (BERT integration), in which my problem was partially resolved, I wanted to share the issue that comes up after implementing the solution.
If I take out the nr_class argument, I get this error:
ValueError: operands could not be broadcast together with shapes (1,2) (1,5)
I actually thought this would happen because I didn't specify the nr_class argument. Is this correct?
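Just to illustrate the error message itself, here is a toy numpy sketch that reproduces the same failure; the arrays are stand-ins I made up, and the shapes simply mirror the ones in the error:

import numpy as np

a = np.zeros((1, 2))  # made-up array with the first shape from the error
b = np.zeros((1, 5))  # made-up array with the second shape
a + b  # ValueError: operands could not be broadcast together with shapes (1,2) (1,5)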
Once again, here is my code for the multi-class model:
import spacy

nlp = spacy.load('en_pytt_bertbaseuncased_lg')
textcat = nlp.create_pipe(
    'pytt_textcat',
    config={
        "nr_class": 5,
        "exclusive_classes": True,
    }
)
nlp.add_pipe(textcat, last=True)
textcat.add_label("class1")
textcat.add_label("class2")
textcat.add_label("class3")
textcat.add_label("class4")
textcat.add_label("class5")
The code for the training is as follows and is based on the example from here (https://pypi.org/project/spacy-pytorch-transformers/):
import random
from spacy.util import minibatch, compounding

def extract_cat(x):
    # return the single label that is marked True in the cats dict
    for key in x.keys():
        if x[key]:
            return key

# get names of other pipes to disable them during training
n_iter = 250  # number of epochs

train_data = list(zip(train_texts, [{"cats": cats} for cats in train_cats]))
dev_cats_single = [extract_cat(x) for x in dev_cats]
train_cats_single = [extract_cat(x) for x in train_cats]

cats = list(set(train_cats_single))
recall = {}
for c in cats:
    if c is not None:
        recall['dev_' + c] = []
        recall['train_' + c] = []

optimizer = nlp.resume_training()
batch_sizes = compounding(1.0, round(len(train_texts) / 2), 1.001)

for i in range(n_iter):
    random.shuffle(train_data)
    losses = {}
    batches = minibatch(train_data, size=batch_sizes)
    for batch in batches:
        texts, annotations = zip(*batch)
        nlp.update(texts, annotations, sgd=optimizer, drop=0.2, losses=losses)
    print(i, losses)
So the structure of my data looks like this:
[('TEXT TEXT TEXT',
  {'cats': {'class1': False,
            'class2': False,
            'class3': False,
            'class4': True,
            'class5': False}}), ... ]
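For completeness, extract_cat just pulls out the single label that is True in such a dict, so for the sample entry above (sample_cats is only a throwaway name for that dict):

sample_cats = {'class1': False, 'class2': False, 'class3': False,
               'class4': True, 'class5': False}
print(extract_cat(sample_cats))  # -> 'class4'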
The (1,2) refers to the 2 classes of the "resumed" optimizer and the (1,5) refers to the 5 classes of your problem. When I came across the same problem, I could only make the code work by reducing my problem to a 2-class problem. This is, of course, not a solution. – Rosati