Seq2Seq
Family of ML tasks that map an input sequence to an output sequence, possibly of a different length
E.g. translation between languages, a chatbot generating answers to questions
In the classical architecture:
- Word embeddings map each word of the input sentence to a dense vector
- A bi-directional RNN encoder propagates context in both directions and encodes the sentence into a context vector
- A decoder RNN reads that context and generates the output sentence token by token
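The classical pipeline above can be sketched end to end with plain NumPy. This is a minimal, untrained toy (no attention, no training loop); all sizes, parameter names, and the `BOS` start token are illustrative assumptions, not part of any real library.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (assumptions): vocab of 10 tokens, 8-dim embeddings, 16-dim hidden.
V, E, H = 10, 8, 16

# Word embeddings: one dense vector per vocabulary token.
emb = rng.normal(scale=0.1, size=(V, E))

def rnn_params():
    # Simple tanh RNN cell: input->hidden, hidden->hidden, bias.
    return (rng.normal(scale=0.1, size=(E, H)),
            rng.normal(scale=0.1, size=(H, H)),
            np.zeros(H))

Wf = rnn_params()   # forward encoder pass
Wb = rnn_params()   # backward encoder pass

def rnn_step(x, h, params):
    Wx, Wh, b = params
    return np.tanh(x @ Wx + h @ Wh + b)

def encode(token_ids):
    """Bi-directional encoder: run left-to-right and right-to-left,
    then concatenate the two final hidden states as the context."""
    xs = emb[token_ids]
    hf = np.zeros(H)
    for x in xs:
        hf = rnn_step(x, hf, Wf)
    hb = np.zeros(H)
    for x in xs[::-1]:
        hb = rnn_step(x, hb, Wb)
    return np.concatenate([hf, hb])          # context vector, shape (2H,)

# Decoder RNN, initialized from a projection of the encoder context.
Wc = rng.normal(scale=0.1, size=(2 * H, H))  # context -> initial decoder state
Wd = rnn_params()
Wout = rng.normal(scale=0.1, size=(H, V))    # hidden -> vocabulary logits
BOS = 0                                      # assumed start-of-sequence id

def decode(context, max_len=5):
    """Greedy decoding: feed the argmax token back in at each step."""
    h = np.tanh(context @ Wc)
    tok, out = BOS, []
    for _ in range(max_len):
        h = rnn_step(emb[tok], h, Wd)
        tok = int(np.argmax(h @ Wout))
        out.append(tok)
    return out

context = encode([3, 1, 4, 1, 5])
print(decode(context))   # a list of token ids from the untrained model
```

A real system would train these weights with teacher forcing on paired sequences and typically replace greedy decoding with beam search.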