In the age of decoder-only LLMs, I'd like to ask whether there are any competitive encoder-decoder architectures that are known to scale well for multilingual seq2seq tasks.
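To make concrete the kind of setup I mean, here's a minimal sketch of a multilingual seq2seq pipeline, using mBART-50 purely as an illustrative established baseline (the model choice and translation task are just an example, not one of the newer architectures I'm asking about):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

name = "facebook/mbart-large-50-many-to-many-mmt"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSeq2SeqLM.from_pretrained(name)

# Encoder-decoder multilingual seq2seq: German in, English out.
tokenizer.src_lang = "de_DE"
inputs = tokenizer("Guten Morgen, wie geht es dir?", return_tensors="pt")
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["en_XX"],  # target language
    max_new_tokens=32,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```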

There are some models that reported state-of-the-art NLI scores, but they were not known to be multilingual.

There are some ideas about building an encoder with Mamba (https://github.com/state-spaces/mamba/issues/78), but it looks like an open question; a rough sketch of that idea follows.
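For reference, this is roughly what the bidirectional Mamba encoder idea discussed in that issue might look like (a sketch assuming the `mamba_ssm` package and a CUDA device; the `BiMambaEncoderLayer` name and the sum-of-directions combination are my own illustration, not an established design):

```python
import torch
import torch.nn as nn
from mamba_ssm import Mamba  # pip install mamba-ssm

class BiMambaEncoderLayer(nn.Module):
    """Hypothetical encoder layer: run Mamba over the sequence in both
    directions (Mamba itself is causal) and merge the two passes."""
    def __init__(self, d_model: int):
        super().__init__()
        self.fwd = Mamba(d_model=d_model)   # left-to-right pass
        self.bwd = Mamba(d_model=d_model)   # right-to-left pass
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x):                   # x: (batch, seq_len, d_model)
        h_fwd = self.fwd(x)
        h_bwd = self.bwd(x.flip(dims=[1])).flip(dims=[1])
        return self.norm(x + h_fwd + h_bwd) # residual + bidirectional mix

# Mamba's fused kernels currently require a CUDA device.
x = torch.randn(2, 128, 512, device="cuda")
layer = BiMambaEncoderLayer(d_model=512).cuda()
print(layer(x).shape)  # torch.Size([2, 128, 512])
```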

Other than the above, are there any competitive encoder-decoder architectures that are known to scale well for multilingual seq2seq tasks?

alvas
