A sequence-to-sequence (seq2seq) Transformer model built for basic text-based interaction. The architecture loosely follows "Attention Is All You Need" (Vaswani et al., 2017).
Some code is adapted from the TensorFlow Transformer tutorial: https://www.tensorflow.org/text/tutorials/transformer
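The core operation of the Transformer architecture referenced above is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. As an illustration only (this is a NumPy sketch, not code from this project or from the TensorFlow tutorial):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Compute softmax(Q K^T / sqrt(d_k)) V over the last two axes."""
    d_k = q.shape[-1]
    # Similarity of each query to each key, scaled to keep logits moderate.
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)
    # Softmax over the key axis; subtract the max for numerical stability.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a convex combination of the value rows.
    return weights @ v, weights

# Tiny example: 2 query positions attending over 3 key/value positions.
q = np.array([[1.0, 0.0], [0.0, 1.0]])
k = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
v = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
out, w = scaled_dot_product_attention(q, k, v)
```

In a full Transformer this is wrapped in multi-head attention and stacked in encoder and decoder layers, but the weighted-average mechanism shown here is the essential idea.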