CS5720 - Week 7
Slide 134 of 140
Word2Vec: Learning Word Representations
Word2Vec Models
CBOW (Continuous Bag of Words)
Predicts the target (center) word from its surrounding context words
Skip-gram
Predicts the surrounding context words from the target (center) word
💡 Key Innovation
Word2Vec transforms language modeling into a self-supervised task: training pairs are generated directly from raw text, so no labeled data is required
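The self-supervised setup can be sketched in a few lines: training pairs come from the text itself, with no human labels. The helper name `skipgram_pairs` and the window size are illustrative, not part of any library.

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) pairs from a token list (illustrative helper)."""
    pairs = []
    for i, center in enumerate(tokens):
        # Every word within `window` positions of the center is a context word
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the quick brown fox jumps over".split()
print(skipgram_pairs(sentence, window=1))
```

Each pair becomes one training example: the model sees "fox" and is asked to predict "brown" or "jumps".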
Training Process
1. Create Context Windows
2. Train Shallow Neural Network
3. Optimize with Negative Sampling
4. Extract Word Embeddings
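Steps 2–4 above can be sketched as one stochastic update of skip-gram with negative sampling (SGNS). This is a toy NumPy sketch, not the original implementation: the embedding dimension is reduced from the slide's 300 to 8, and the vocabulary indices, learning rate, and negative samples are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, dim, lr = 10, 8, 0.1          # toy sizes (slide uses 300-D)
W_in = rng.normal(scale=0.1, size=(vocab_size, dim))   # "input" embeddings
W_out = rng.normal(scale=0.1, size=(vocab_size, dim))  # "output" embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(center, context, negatives):
    """One SGD update: push the true context pair toward sigma=1,
    the randomly drawn negative pairs toward sigma=0."""
    v = W_in[center].copy()
    ids = np.array([context] + list(negatives))
    labels = np.array([1.0] + [0.0] * len(negatives))
    u = W_out[ids]                         # (k+1, dim), fancy indexing copies
    grad = (sigmoid(u @ v) - labels)[:, None]   # dLoss/d(logit) per pair
    W_in[center] -= lr * (grad * u).sum(axis=0)
    W_out[ids] -= lr * grad * v

# Repeatedly training on one (center=3, context=5) pair with negatives {1, 7}
for _ in range(200):
    sgns_step(center=3, context=5, negatives=[1, 7])
print(sigmoid(W_out[5] @ W_in[3]))  # approaches 1 as the pair is learned
```

After training, the rows of `W_in` (step 4) are the word embeddings; negative sampling avoids computing a full softmax over the vocabulary at every step.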
Skip-gram Architecture
[Diagram: Input ("fox") → Hidden Layer (300-D embedding) → Output (context words: "The", "quick", "brown", "jumps", "over", ...)]
Example: "The quick brown fox jumps over"
Skip-gram predicts the context words given the center word "fox".
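Once extracted, the embeddings are just vectors, and word similarity is typically measured with cosine similarity. A minimal sketch, assuming a trained 300-D embedding matrix (random values stand in for trained weights here, so the printed similarity is not meaningful):

```python
import numpy as np

rng = np.random.default_rng(1)
vocab = ["the", "quick", "brown", "fox", "jumps", "over"]
W_in = rng.normal(size=(len(vocab), 300))  # stand-in for trained 300-D embeddings

def vec(word):
    """Look up a word's embedding row (illustrative helper)."""
    return W_in[vocab.index(word)]

def cosine(a, b):
    """Cosine similarity: dot product of the normalized vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(vec("fox"), vec("quick")))
```

With real trained vectors, words appearing in similar contexts end up with high cosine similarity.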
Prepared by Dr. Gorkem Kar