CS5720 - Week 6
Slide 113 of 120
Many-to-Many RNN: Language Translation
Many-to-Many Translation Architecture
🇪🇸 Spanish (Source)
Hola
mundo
hermoso
→
Encoder-Decoder RNN
Encoder
E₁
E₂
E₃
Decoder
D₁
D₂
D₃
→
🇺🇸 English (Target)
Hello
beautiful
world
Multiple inputs → Processing → Multiple outputs with different sequence lengths
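The encoder-decoder flow in the diagram can be sketched with a plain tanh RNN in NumPy. The hidden size, vocabulary size, random weights, and token ids below are illustrative stand-ins for a trained model, not actual lecture code:

```python
import numpy as np

rng = np.random.default_rng(0)
H, V = 8, 5  # hidden size and toy vocabulary size (arbitrary choices)

# Randomly initialised weights stand in for trained parameters.
W_xh = rng.normal(scale=0.1, size=(H, V))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(H, H))   # hidden -> hidden
W_hy = rng.normal(scale=0.1, size=(V, H))   # hidden -> output logits

def one_hot(i, n=V):
    v = np.zeros(n)
    v[i] = 1.0
    return v

def encode(source_ids):
    """Run the encoder RNN; the final hidden state is the context vector."""
    h = np.zeros(H)
    for i in source_ids:
        h = np.tanh(W_xh @ one_hot(i) + W_hh @ h)
    return h  # fixed-size context vector, regardless of source length

def decode(context, max_len=4, bos=0):
    """Greedy decoding: feed back the previous prediction at each step."""
    h, y, out = context, bos, []
    for _ in range(max_len):
        h = np.tanh(W_xh @ one_hot(y) + W_hh @ h)
        y = int(np.argmax(W_hy @ h))
        out.append(y)
    return out

context = encode([1, 2, 3])    # e.g. "Hola mundo hermoso" as token ids
translation = decode(context)  # target-side token ids
print(context.shape, translation)
```

Because the decoder runs until it chooses to stop (here capped at max_len), the output length need not match the input length, which is what makes this a many-to-many architecture with different sequence lengths.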
Key Concepts
Encoder-Decoder Architecture
Two RNNs in sequence: an encoder reads the source, a decoder generates the target
Context Vector
Fixed-size vector summarizing the source sentence
Attention Mechanism
Focus on relevant source words during generation
Beam Search
Keeps the k most probable partial translations at each decoding step
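The attention mechanism in the list above can be illustrated with simple dot-product attention: score each encoder state against the current decoder state, softmax the scores into weights, and mix the encoder states accordingly. The encoder/decoder states and hidden size here are random toy values, not outputs of a real model:

```python
import numpy as np

def attention(decoder_state, encoder_states):
    """Dot-product attention: weight each source position, return the mix."""
    scores = encoder_states @ decoder_state   # one score per source word
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                  # softmax -> attention weights
    context = weights @ encoder_states        # weighted sum of encoder states
    return weights, context

rng = np.random.default_rng(1)
enc = rng.normal(size=(3, 4))  # 3 source words, hidden size 4 (toy numbers)
dec = rng.normal(size=4)       # current decoder hidden state
w, ctx = attention(dec, enc)
print(w.round(3), ctx.shape)   # weights sum to 1, one per source word
```

The weights are what an "attention weights" visualization displays: at each generation step they show which source words the decoder is focusing on.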
Translation Challenges
Word Order Differences
Languages have different syntactic structures
Variable Sequence Lengths
Source and target sentences have different lengths
Semantic Ambiguity
Words can have multiple meanings in context
Out-of-Vocabulary Words
Handling unknown or rare words in translation
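One common way to cope with the out-of-vocabulary challenge is to map any word outside the vocabulary to a shared unknown token, so the model always receives a valid id. The vocabulary and sentence below are hypothetical toy examples:

```python
# Minimal OOV-handling sketch: unseen source words fall back to <unk>.
vocab = {"<unk>": 0, "el": 1, "gato": 2, "está": 3, "durmiendo": 4}

def tokenize(sentence):
    """Lower-case, split on spaces, and map rare/unknown words to <unk>."""
    return [vocab.get(w, vocab["<unk>"]) for w in sentence.lower().split()]

print(tokenize("El gato está ronroneando"))  # → [1, 2, 3, 0]
```

Production systems usually go further and split rare words into subword units so that less information is lost than with a single <unk> id.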
Translation Examples
Spanish โ English
Easy
🇪🇸 "El gato está durmiendo en la cama"
🇺🇸 "The cat is sleeping on the bed"
French โ English
Medium
🇫🇷 "Je voudrais une table pour deux personnes, s'il vous plaît"
🇺🇸 "I would like a table for two people, please"
German โ English
Hard
🇩🇪 "Wegen des schlechten Wetters wurde das Spiel abgesagt"
🇺🇸 "Due to the bad weather, the game was cancelled"
Japanese โ English
Hard
🇯🇵 "私は毎朝コーヒーを飲みます"
🇺🇸 "I drink coffee every morning"
Interactive Translation Demo
Explore how many-to-many RNNs handle language translation tasks
Live Translation
Attention Weights
Compare Models
Beam Search Demo
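The beam-search demo can be sketched on a toy target language: instead of greedily taking the single most probable next word, keep the k highest-scoring partial translations at every step. The hand-set probability table below stands in for the decoder RNN's real output distribution:

```python
import math

VOCAB = ["<eos>", "hello", "beautiful", "world"]

def next_token_probs(prefix):
    # Hypothetical hand-set distributions keyed on the last generated token;
    # a real translator would get these from the decoder at each step.
    table = {
        None:        [0.05, 0.80, 0.10, 0.05],
        "hello":     [0.05, 0.05, 0.60, 0.30],
        "beautiful": [0.10, 0.05, 0.05, 0.80],
        "world":     [0.85, 0.05, 0.05, 0.05],
    }
    return table[prefix[-1] if prefix else None]

def beam_search(beam_width=2, max_len=5):
    # Each hypothesis is (log probability, tokens, finished?).
    beams = [(0.0, [], False)]
    for _ in range(max_len):
        candidates = []
        for lp, toks, done in beams:
            if done:                       # finished hypotheses carry over
                candidates.append((lp, toks, True))
                continue
            for i, p in enumerate(next_token_probs(toks)):
                new = toks + [VOCAB[i]]
                candidates.append((lp + math.log(p), new, VOCAB[i] == "<eos>"))
        # Keep only the top-k scoring hypotheses.
        beams = sorted(candidates, key=lambda c: c[0], reverse=True)[:beam_width]
        if all(done for _, _, done in beams):
            break
    return beams[0][1]

print(beam_search())  # → ['hello', 'beautiful', 'world', '<eos>']
```

With beam_width=1 this reduces to greedy decoding; wider beams trade compute for a better chance of finding the highest-probability translation overall.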
Prepared by Dr. Gorkem Kar