
NMT With A Transformer Language Model

A sequence-to-sequence (seq2seq) transformer model created for basic text-based interaction, roughly based on the concepts proposed in 'Attention Is All You Need' (Vaswani et al., 2017).

Snippets of code are adapted from the TensorFlow transformer tutorial: https://www.tensorflow.org/text/tutorials/transformer
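The central building block of such a model is scaled dot-product attention. The sketch below, loosely following the approach in the linked TensorFlow tutorial, shows what that computation looks like; the function name, argument shapes, and masking convention here are illustrative and not necessarily identical to the code in this repository.

```python
# A minimal sketch of scaled dot-product attention as described in
# 'Attention Is All You Need'; names and shapes are illustrative,
# not taken verbatim from this repository.
import tensorflow as tf


def scaled_dot_product_attention(q, k, v, mask=None):
    """Compute attention output and weights for query/key/value tensors.

    q, k, v: tensors of shape (..., seq_len, depth).
    mask: optional tensor broadcastable to (..., seq_len_q, seq_len_k),
          with 1s marking positions to block.
    """
    # Similarity scores between queries and keys.
    matmul_qk = tf.matmul(q, k, transpose_b=True)

    # Scale by sqrt(d_k) to keep the softmax well-conditioned.
    dk = tf.cast(tf.shape(k)[-1], tf.float32)
    scaled_logits = matmul_qk / tf.math.sqrt(dk)

    # Mask out disallowed positions (e.g. padding or future tokens).
    if mask is not None:
        scaled_logits += (mask * -1e9)

    # Softmax over the key dimension gives the attention weights.
    attention_weights = tf.nn.softmax(scaled_logits, axis=-1)
    output = tf.matmul(attention_weights, v)
    return output, attention_weights
```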

About

Dissertation Project - A seq2seq transformer model created for basic text-based interaction.
