I am trying to understand the epochs parameter of the Doc2Vec constructor and the epochs parameter of the train function.

In the following code snippet, I manually set up a loop of 4000 iterations. Is that required, or is passing 4000 as the epochs parameter to Doc2Vec enough (see the single-call sketch after the snippet)? And how is epochs in Doc2Vec different from epochs in train?
from gensim.models.doc2vec import Doc2Vec

documents = Documents(train_set)  # my own iterable of TaggedDocument objects
model = Doc2Vec(vector_size=100, dbow_words=1, dm=0, epochs=4000, window=5,
                seed=1337, min_count=5, workers=4, alpha=0.001, min_alpha=0.025)
model.build_vocab(documents)
for epoch in range(model.epochs):
    print("epoch " + str(epoch))
    # total_length is the number of documents in my corpus
    model.train(documents, total_examples=total_length, epochs=1)
    # save a checkpoint after every pass
    ckpnt = model_name + "_epoch_" + str(epoch)
    model.save(ckpnt)
    print("Saving {}".format(ckpnt))
Also, how and when are the weights updated?