Goal:
train an efficient universal model that can translate between any pair of languages.
Progress:
This work sets a new milestone towards building a single model:
Massively Multilingual Neural Machine Translation in the Wild: Findings and Challenges
https://arxiv.org/pdf/1907.05019.pdf
@fbk_mt
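How one model can cover every direction: this line of work follows the standard multilingual NMT recipe of prepending a target-language token to the source sentence, so a single encoder-decoder learns all language pairs. A minimal sketch in Python; the tag format and the commented-out model call are illustrative assumptions, not the paper's code.

def prepend_target_tag(source_sentence: str, target_lang: str) -> str:
    # Prepend a target-language token so one shared model knows which
    # direction to translate, e.g. "<2fr> Hello world" -> French output.
    return f"<2{target_lang}> {source_sentence}"

# The same (hypothetical) model object would handle any of the >10k directions.
pairs = [("Hello world", "fr"), ("Hallo Welt", "en"), ("Bonjour", "de")]
for src, tgt in pairs:
    tagged = prepend_target_tag(src, tgt)
    print(tagged)                       # what the shared encoder actually sees
    # output = model.translate(tagged)  # hypothetical single multilingual model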
Languages in total: 103, which can result in more than 10,000 translation directions.
Training examples: 25 billion
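A quick check on where the >10k figure comes from: every ordered (source, target) pair of the 103 languages is a potential direction.

# 103 languages -> ordered (source, target) pairs, source != target.
num_languages = 103
directions = num_languages * (num_languages - 1)
print(directions)  # 10506, hence ">10k translation directions"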
Core points:
- transfer-learning ability across languages
- benefits low-resource languages
- keeps performance on high-resource languages (see the sampling sketch after this list)
- detailed analysis of model training
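A common lever for this low-/high-resource trade-off, and one analyzed in the paper, is temperature-based data sampling: language pairs are drawn with probability proportional to their data size raised to 1/T, which flattens the distribution toward low-resource pairs as T grows. A small sketch; the corpus sizes below are made-up examples, not figures from the paper.

# Temperature-based sampling: p_i proportional to (n_i / sum(n)) ** (1/T).
# T = 1 reproduces the raw data distribution; larger T upsamples low-resource pairs.
sizes = {"en-fr": 40_000_000, "en-de": 20_000_000, "en-yo": 50_000}  # made-up sizes

def sampling_probs(sizes, temperature):
    total = sum(sizes.values())
    weights = {k: (v / total) ** (1.0 / temperature) for k, v in sizes.items()}
    z = sum(weights.values())
    return {k: w / z for k, w in weights.items()}

print(sampling_probs(sizes, temperature=1.0))  # proportional to data size
print(sampling_probs(sizes, temperature=5.0))  # low-resource pair upsampled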
Open problem areas, as mentioned in the paper:
- Data & supervision
- Learning
- Model capacity (see the parameter-count sketch after this list)
- Arch & Vocab
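To make the model-capacity point concrete, here is a rough parameter-count estimate for a standard Transformer encoder-decoder. Back-of-the-envelope only: biases and layer norms are ignored, and the dimensions are the common "Transformer-Big" setting with a 64k shared vocabulary, used purely as an illustration rather than the paper's exact configuration.

# Rough Transformer parameter count (ignores biases, layer norms, positional encodings).
def transformer_params(d_model, d_ff, enc_layers, dec_layers, vocab_size):
    attn = 4 * d_model * d_model         # Q, K, V and output projections
    ffn = 2 * d_model * d_ff             # the two feed-forward projections
    enc = enc_layers * (attn + ffn)      # self-attention + FFN per encoder layer
    dec = dec_layers * (2 * attn + ffn)  # self-attn + cross-attn + FFN per decoder layer
    emb = vocab_size * d_model           # shared input/output embedding table
    return enc + dec + emb

print(f"{transformer_params(1024, 4096, 6, 6, 64_000) / 1e6:.0f}M parameters")  # ~242M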
Continues ...