
Word2Vec - CBOW and Skip-Grams

Asked on Data Science, July 30, 2021

I’m wondering how Word2Vec is being constructed.

I’ve read tutorials simply stating that we can train a skip-gram neural network and use the trained weights as word vectors.

However, I’ve also seen this picture:

[diagram omitted]

If I’m reading this diagram correctly:

1) The CBOW and skip-gram models are both trained with some inputs

2) The output of CBOW is used as an input to the middle neural network

3) The output of skip-gram is used as the output of the middle neural network.

The output of CBOW is a prediction of the center word given its context, and the output of skip-gram is a prediction of the surrounding context words given the center word.

These outputs are then used to train another neural network.

Hence we first train the CBOW and skip-gram neural networks, then train the middle neural network afterwards? And the input to the middle neural network is one-hot encoded?

Is the above interpretation correct?

3 Answers

There is only one neural network to train in word2vec. CBOW (continuous bag of words) and skip-gram are two different methods of constructing the training data for that neural network. You have to pick one of those two training methods.
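For example, with a window size of 1 the two methods slice the same sentence into different (input, target) training pairs. A minimal sketch in plain Python (the sentence and window size are just illustrative):

```python
sentence = ["the", "quick", "brown", "fox", "jumps"]
window = 1  # context words taken on each side of the center word

cbow_pairs = []       # (context words -> center word)
skipgram_pairs = []   # (center word -> one context word)

for i, center in enumerate(sentence):
    context = [sentence[j]
               for j in range(max(0, i - window), min(len(sentence), i + window + 1))
               if j != i]
    # CBOW: predict the center word from its whole context.
    cbow_pairs.append((context, center))
    # Skip-gram: predict each context word from the center word.
    for ctx in context:
        skipgram_pairs.append((center, ctx))

print(cbow_pairs[1])       # (['the', 'brown'], 'quick')
print(skipgram_pairs[:2])  # [('the', 'quick'), ('quick', 'the')]
```

Whichever set of pairs you pick, it is fed to the same single network; there is no second network downstream.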

Correct answer by Brian Spiering on July 30, 2021

To understand exactly how the word2vec model is defined and trained, it may be helpful to look at code based on this tutorial:

https://github.com/tensorflow/tensorflow/blob/r1.3/tensorflow/examples/tutorials/word2vec/word2vec_basic.py
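For reference, the core of that script looks roughly like this (condensed from the TensorFlow 1.x tutorial; the constants are the tutorial's defaults, and the placeholder shapes are simplified here):

```python
import tensorflow as tf  # TensorFlow 1.x, as used in the linked tutorial

vocabulary_size = 50000
embedding_size = 128
num_sampled = 64  # negative samples drawn for the NCE loss

train_inputs = tf.placeholder(tf.int32, shape=[None])     # center word ids
train_labels = tf.placeholder(tf.int32, shape=[None, 1])  # context word ids

# Input embedding matrix: one trainable vector per vocabulary word.
embeddings = tf.Variable(
    tf.random_uniform([vocabulary_size, embedding_size], -1.0, 1.0))
embed = tf.nn.embedding_lookup(embeddings, train_inputs)

# Output weights and biases: the second matrix of trainable parameters.
nce_weights = tf.Variable(
    tf.truncated_normal([vocabulary_size, embedding_size],
                        stddev=1.0 / embedding_size ** 0.5))
nce_biases = tf.Variable(tf.zeros([vocabulary_size]))

# Noise-contrastive estimation: a sampled approximation to a full softmax.
loss = tf.reduce_mean(
    tf.nn.nce_loss(weights=nce_weights, biases=nce_biases,
                   labels=train_labels, inputs=embed,
                   num_sampled=num_sampled, num_classes=vocabulary_size))
optimizer = tf.train.GradientDescentOptimizer(1.0).minimize(loss)
```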

Answered by Vadim Smolyakov on July 30, 2021

CBOW and skip-gram are alternative model choices; the only difference is the loss function.

Call it a neural network if you like, but both models are really just a pretty simple loss function involving two matrices of trainable parameters (each with one vector per word). Once trained, one matrix is usually dropped and the other gives the "word embeddings".

There are certainly not multiple neural networks. The top part of the diagram is pre-processing of the text.
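To make the "two matrices" point concrete, here is a toy numpy sketch of the skip-gram loss with a full softmax (real word2vec replaces the full softmax with negative sampling or hierarchical softmax; the names `W_in`/`W_out` and all sizes here are made up for illustration):

```python
import numpy as np

vocab_size, dim = 10, 4
rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(vocab_size, dim))   # "input" word vectors
W_out = rng.normal(scale=0.1, size=(vocab_size, dim))  # "output" word vectors

def skipgram_loss(center_id, context_id):
    v = W_in[center_id]                 # look up the center word's vector
    scores = W_out @ v                  # one score per vocabulary word
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()                # softmax over the vocabulary
    return -np.log(probs[context_id])   # negative log-likelihood of context

print(skipgram_loss(3, 7))
# After training, W_out is typically discarded and the rows of W_in
# are used as the word embeddings.
```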

Answered by Carl on July 30, 2021
