Key Concepts

Review the core concepts you need to master this subject

Generating text with seq2seq

The seq2seq (sequence-to-sequence) model is an encoder-decoder deep learning architecture, commonly used in natural language processing, that employs recurrent neural networks such as LSTMs to generate output token by token or character by character. In machine translation, the encoder accepts source-language text as input and produces state vectors; the decoder accepts the encoder's final state and generates a possible translation.
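The encoder-decoder flow described above can be sketched in PyTorch. This is a minimal, untrained illustration of the architecture only: the vocabulary sizes, layer dimensions, and the `sos`/`eos` token ids are illustrative assumptions, and an untrained model will emit arbitrary tokens. The key structural points it shows are that the decoder is seeded with the encoder's final (hidden, cell) state and that output is generated one token at a time.

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """Minimal encoder-decoder sketch; all sizes are illustrative."""
    def __init__(self, src_vocab=12, tgt_vocab=12, emb=16, hidden=32):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.LSTM(emb, hidden, batch_first=True)
        self.decoder = nn.LSTM(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src, sos=1, eos=2, max_len=10):
        # Encoder: read the whole source, keep only the final (h, c) state.
        _, state = self.encoder(self.src_emb(src))
        # Decoder: start from the encoder's final state and a start token,
        # then generate greedily, token by token.
        token = torch.full((src.size(0), 1), sos, dtype=torch.long)
        outputs = []
        for _ in range(max_len):
            dec_out, state = self.decoder(self.tgt_emb(token), state)
            token = self.out(dec_out).argmax(-1)  # pick most likely token
            outputs.append(token)
            if (token == eos).all():  # stop once every sequence ends
                break
        return torch.cat(outputs, dim=1)

model = Seq2Seq()
src = torch.tensor([[3, 4, 5]])  # one toy "sentence" of token ids
generated = model(src)           # shape: (1, up to max_len)
```

In a real system the same loop is used at inference time, but during training the decoder is usually fed the ground-truth previous token instead of its own prediction (teacher forcing).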

Generating Text with Deep Learning