word-embedding Questions

1

Solved

I am currently developing a text classification tool using Keras. It works fine (I got up to 98.7 validation accuracy), but I can't wrap my head around how exactly 1D-convolution ...
Laboured asked 16/9, 2018 at 8:52
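
A minimal sketch of how a 1D convolution runs over embedded text in Keras (the sizes and layer choices below are placeholders, not the asker's model): Conv1D slides a window of kernel_size word vectors along the sequence axis and produces one feature map per filter.

from keras.models import Sequential
from keras.layers import Embedding, Conv1D, GlobalMaxPooling1D, Dense

vocab_size, maxlen = 20000, 400                             # hypothetical sizes
model = Sequential([
    Embedding(vocab_size, 128, input_length=maxlen),        # (batch, 400, 128)
    Conv1D(filters=64, kernel_size=5, activation='relu'),   # window of 5 word vectors -> (batch, 396, 64)
    GlobalMaxPooling1D(),                                    # (batch, 64)
    Dense(1, activation='sigmoid'),                          # binary text classifier
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])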

2

I'm building an RNN LSTM network to classify texts based on the writers' age (binary classification: young / adult). It seems like the network does not learn and suddenly starts overfitting: Red: ...

1

I am trying to build a translation network using embedding and RNN. I have trained a Gensim Word2Vec model and it is learning word associations pretty well. However, I couldn’t get my head around h...
Emeldaemelen asked 24/7, 2018 at 7:23

2

In word embedding, what should be a good vector representation for the start_tokens _PAD, _UNKNOWN, _GO, _EOS?
Holton asked 26/1, 2017 at 19:40
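
One common convention, sketched below (an assumption rather than the single right answer): an all-zeros vector for _PAD so it can be masked out, and small random trainable vectors for the other special tokens so the model can refine them during training.

import numpy as np

emb_dim = 300                                            # hypothetical embedding size
special_tokens = {
    "_PAD":     np.zeros(emb_dim),                       # zero vector, typically masked out
    "_UNKNOWN": np.random.uniform(-0.05, 0.05, emb_dim),
    "_GO":      np.random.uniform(-0.05, 0.05, emb_dim),
    "_EOS":     np.random.uniform(-0.05, 0.05, emb_dim),
}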

1

Solved

I am confused about how to format my own pre-trained weights for the Keras Embedding layer if I'm also setting mask_zero=True. Here's a concrete toy example. Suppose I have a vocabulary of 4 words [1,...
Loudmouthed asked 17/7, 2018 at 13:29
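
A toy sketch of the usual mask_zero convention (assuming a 4-word vocabulary and made-up vectors): index 0 is reserved for padding, so the weight matrix needs one extra all-zero row at the top, and the real words live at indices 1..4.

import numpy as np
from keras.layers import Embedding

dim = 50
pretrained = np.random.rand(4, dim)                     # stand-in for your own 4 word vectors
weights = np.vstack([np.zeros((1, dim)), pretrained])   # shape (5, dim); row 0 = padding

emb = Embedding(input_dim=5, output_dim=dim,
                weights=[weights], mask_zero=True, trainable=False)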

1

I'm training up an RNN with a very reduced set of word features, around 10,000. I was planning on starting with an embedding layer before adding RNNs, but it is very unclear to me what dimensionali...
Rossie asked 13/7, 2018 at 15:33
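
There is no single correct answer, but one commonly quoted starting heuristic, shown below, is roughly the fourth root of the vocabulary size; in practice values anywhere from 50 to 300 are tried and tuned empirically.

vocab_size = 10_000
embedding_dim = round(vocab_size ** 0.25)   # ~10 by the fourth-root heuristic; 50-300 is also common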

1

Solved

I'm currently working with a Keras model which has an embedding layer as its first layer. In order to visualize the relationships and similarity of words between each other, I need a function that return...
Ayah asked 8/7, 2018 at 18:53
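
A minimal sketch of pulling the learned embedding matrix back out of a trained model and looking up a word's vector; model is the asker's trained model and word_index is the tokenizer's word-to-id mapping, both assumed to exist already.

import numpy as np

embedding_matrix = model.layers[0].get_weights()[0]    # shape (vocab_size, dim)

def word_vector(word):
    return embedding_matrix[word_index[word]]          # row of the matrix = word's embedding

def cosine(u, v):                                      # similarity between two word vectors
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))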

2

Solved

I read the paper and also googled to see whether there is any good example of the learning method (or, more precisely, the learning procedure). For word2vec, suppose there is the corpus sentence I go to school with lu...
Sharpeyed asked 13/4, 2018 at 7:22
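
For illustration, a worked toy example of the skip-gram training procedure using only the visible words of the example sentence: every word is paired with the words inside a +/-2 window, and those (center, context) pairs are the training examples fed to the network.

sentence = ["I", "go", "to", "school"]
window = 2
pairs = []
for i, center in enumerate(sentence):
    for j in range(max(0, i - window), min(len(sentence), i + window + 1)):
        if j != i:
            pairs.append((center, sentence[j]))
print(pairs)
# [('I', 'go'), ('I', 'to'), ('go', 'I'), ('go', 'to'), ('go', 'school'), ('to', 'I'), ...]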

1

Solved

I understand how to use the Keras Embedding layer in the case of a single text feature, like in IMDB review classification. However, I am confused about how to use the Embedding layers when I have a Cla...
Indre asked 2/4, 2018 at 5:28
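
A sketch with the Keras functional API (feature names and sizes are hypothetical): give each categorical input its own small Embedding, flatten, and concatenate everything before the classifier head.

from keras.layers import Input, Embedding, Flatten, Concatenate, Dense
from keras.models import Model

text_in = Input(shape=(100,), name="text")                 # padded word ids
cat_in  = Input(shape=(1,),   name="category")             # e.g. a 20-value categorical column

text_emb = Flatten()(Embedding(20000, 64)(text_in))
cat_emb  = Flatten()(Embedding(20, 4)(cat_in))

merged = Concatenate()([text_emb, cat_emb])
output = Dense(1, activation="sigmoid")(merged)
model  = Model(inputs=[text_in, cat_in], outputs=output)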

2

Solved

In my model, I use GloVe pre-trained embeddings. I wish to keep them non-trainable in order to decrease the number of model parameters and avoid overfitting. However, I have a special symbol whose embe...
Oleviaolfaction asked 27/2, 2018 at 13:2
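
One common trick for a partially trainable embedding, sketched in TensorFlow 1.x style (glove_matrix, emb_dim and token_ids are placeholders): keep the GloVe rows in a non-trainable variable, keep the special symbol's row in a small trainable variable, concatenate them, and do the lookup as usual.

import tensorflow as tf

glove_part  = tf.get_variable("glove", initializer=glove_matrix, trainable=False)  # (V, d), frozen
special_row = tf.get_variable("special", shape=[1, emb_dim], trainable=True)       # (1, d), learned
embeddings  = tf.concat([glove_part, special_row], axis=0)    # special symbol gets id V

vectors = tf.nn.embedding_lookup(embeddings, token_ids)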

1

I'm attempting to load some pre-trained vectors into a gensim Word2Vec model, so they can be retrained with new data. My understanding is I can do the retraining with gensim.Word2Vec.train(). Howev...
Untread asked 8/2, 2018 at 16:35
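
A hedged sketch of one way this is commonly done in gensim (exact method locations vary across gensim versions): build the vocabulary from the new corpus, merge in the pre-trained vectors, then continue training.

from gensim.models import Word2Vec

model = Word2Vec(size=300, min_count=1)
model.build_vocab(new_sentences)                       # new_sentences: your new training corpus
# copy matching vectors from a word2vec-format file; lockf=1.0 keeps them trainable
model.intersect_word2vec_format('pretrained_vectors.bin', binary=True, lockf=1.0)
model.train(new_sentences, total_examples=model.corpus_count, epochs=model.epochs)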

1

Solved

At 15:10 of this video about fastText, it mentions syntactic analogy and semantic analogy, but I am not sure what the difference is between them. Could anybody help explain the difference with exam...
Selfaddressed asked 20/1, 2018 at 12:58

4

I've been trying to understand the sample code from https://www.tensorflow.org/tutorials/recurrent which you can find at https://github.com/tensorflow/models/blob/master/tutorials/rnn/ptb/ptb_word_...
Spermatozoid asked 8/9, 2017 at 21:55

2

Solved

I am building a TensorFlow model for an NLP task, and I am using the pretrained GloVe 300d word-vector/embedding dataset. Obviously some tokens can't be resolved as embeddings, because they were not included in...
Unbounded asked 3/8, 2017 at 21:58
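
A sketch of one common fallback (an assumption, not the only option): tokens missing from GloVe get a small random vector, or a single shared <UNK> vector, so every token still maps to some row of the embedding matrix; vocab and glove are placeholder names here.

import numpy as np

emb_dim = 300
embedding_matrix = np.zeros((len(vocab), emb_dim))     # vocab: list of tokens in your data
for i, word in enumerate(vocab):
    vec = glove.get(word)                              # glove: dict mapping word -> np.array
    embedding_matrix[i] = vec if vec is not None else np.random.uniform(-0.25, 0.25, emb_dim)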

1

Solved

Suppose we're training a neural network model to learn the mapping from the following input to output, where the output is the Named Entity (NE). Input: EU rejects German call to boycott British lamb ....
Arceliaarceneaux asked 7/11, 2017 at 0:51

2

Solved

In the paper that I am trying to implement, it says: "In this work, tweets were modeled using three types of text representation. The first one is a bag-of-words model weighted by tf-idf (term ...
Biebel asked 9/12, 2017 at 9:16
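
A minimal illustration of the first representation, a tf-idf weighted bag-of-words, using scikit-learn (the example tweets are made up):

from sklearn.feature_extraction.text import TfidfVectorizer

tweets = ["great game tonight", "terrible service, never again"]
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(tweets)    # sparse matrix of shape (n_tweets, n_terms)
print(X.shape)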

1

Solved

When I create a word2vec model (skip-gram with negative sampling) I receive 3 files as output, as follows: word2vec (file), word2vec.syn1neg.npy (NPY file), word2vec.wv.syn0.npy (NPY file). I...
Astrid asked 8/11, 2017 at 7:7
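
For what it's worth, the extra .npy files are simply the large weight arrays that gensim stores alongside the main file; loading the main file picks them up automatically as long as all the files stay in the same directory.

from gensim.models import Word2Vec

model = Word2Vec.load('word2vec')        # also reads the *.npy sidecar files
print(model.wv['example'].shape)         # assumes 'example' is in the vocabulary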

4

Objective: identify the class label from a user-entered question (like a question-answering system). The data is extracted from a big PDF file, and I need to predict the page number based on user input. Mainly used...
Bungalow asked 8/5, 2017 at 8:56

3

If I have a text string to be vectorized, how should I handle numbers inside it? Or if I feed a Neural Network with numbers and words, how can I keep the numbers as numbers? I am planning on maki...
Toothed asked 1/7, 2017 at 22:16
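
One simple option, sketched below (an assumption about what fits the use case): collapse every digit sequence into a placeholder token before vectorizing, so strings that differ only in their numbers share the same representation.

import re

def normalize_numbers(text, token="<num>"):
    return re.sub(r"\d+(\.\d+)?", token, text)

print(normalize_numbers("won 3 of 10 games"))   # "won <num> of <num> games"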

1

Solved

I followed the tutorial here: https://blog.keras.io/using-pre-trained-word-embeddings-in-a-keras-model.html. However, I modified the code to be able to save the generated model through h5py. Thus...

1

Solved

I need to ask a few questions regarding word embeddings; they could be basic. When we convert a one-hot vector of a word, for instance king [0 0 0 1 0], into an embedded vector E = [0.2, 0.4, 0.2, 0.2]...
Languish asked 3/7, 2017 at 9:24
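
A worked illustration of that step: multiplying the one-hot vector by the embedding matrix W just selects one row of W, which is all an embedding lookup does (the numbers below are made up apart from the asker's [0 0 0 1 0] and E).

import numpy as np

one_hot = np.array([0, 0, 0, 1, 0])      # "king" is word index 3
W = np.random.rand(5, 4)                 # vocabulary of 5 words, 4-dimensional embeddings
W[3] = [0.2, 0.4, 0.2, 0.2]              # pretend this row was learned

print(one_hot @ W)    # [0.2 0.4 0.2 0.2]
print(W[3])           # identical: a lookup, no real matrix multiply needed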

2

from gensim.models import word2vec

sentences = word2vec.Text8Corpus('TextFile')
model = word2vec.Word2Vec(sentences, size=200, min_count=2, workers=4)
print model['king']

Is the output vecto...
Acroterion asked 9/9, 2016 at 7:28
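
For reference (hedged, since the question text is cut off): model['king'] returns the learned 200-dimensional input embedding for "king"; in newer gensim versions the same lookup is spelled model.wv['king'].

vec = model.wv['king']
print(vec.shape)                         # (200,)
print(model.wv.most_similar('king'))     # nearest words by cosine similarity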

1

I have a Word2Vec model which is trained in Gensim. How can I use it in Tensorflow for Word Embeddings. I don't want to train Embeddings from scratch in Tensorflow. Can someone tell me how to do it...
Intemperate asked 28/3, 2017 at 13:16
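
A sketch of the usual bridge (paths and names are placeholders, TensorFlow 1.x style): export the gensim vectors as a numpy matrix, then use it to initialize a TensorFlow embedding variable instead of training from scratch.

import numpy as np
import tensorflow as tf
from gensim.models import Word2Vec

w2v = Word2Vec.load('my_word2vec.model')                            # hypothetical path
vocab = list(w2v.wv.index2word)                                      # gensim 3.x attribute name
matrix = np.array([w2v.wv[w] for w in vocab]).astype(np.float32)     # (vocab_size, dim)

embeddings = tf.get_variable("embeddings", initializer=matrix, trainable=False)  # True to fine-tune
vectors = tf.nn.embedding_lookup(embeddings, token_ids)              # token_ids: your int32 word ids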

1

The question is simple: which of CBOW and skip-gram works better for a big dataset? (And what is the answer for a small dataset?) I am confused since, according to Mikolov himself, [Link] Skip-gram: ...
Reasonless asked 30/8, 2016 at 9:50
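
In gensim the choice is just the sg flag, as below; which one works better for a given corpus size is the empirical question above and is not settled by this snippet (sentences is a placeholder corpus).

from gensim.models import Word2Vec

cbow_model     = Word2Vec(sentences, size=200, sg=0)   # CBOW (the default)
skipgram_model = Word2Vec(sentences, size=200, sg=1)   # skip-gram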

2

Solved

I am trying to learn the word representation of the imdb dataset "from scratch" through the TensorFlow tf.nn.embedding_lookup() function. If I understand it correctly, I have to set up an embedding...
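
A bare-bones sketch of learning embeddings from scratch with tf.nn.embedding_lookup (TensorFlow 1.x style, hypothetical sizes): the embedding matrix is just a trainable variable, and the lookup selects the rows for each word id in the batch.

import tensorflow as tf

vocab_size, emb_dim = 10000, 128
embeddings = tf.Variable(tf.random_uniform([vocab_size, emb_dim], -1.0, 1.0))

word_ids = tf.placeholder(tf.int32, shape=[None])          # ids of the words in a batch
embedded = tf.nn.embedding_lookup(embeddings, word_ids)    # (batch, 128), trained by backprop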
