Google released T5X checkpoints for MADLAD-400 a couple of months ago, but nobody could figure out how to run them. Turns out the vocabulary was wrong, but they uploaded the correct one last week.

I’ve converted the models to the safetensors format, and I created this space if you want to try the smaller model.
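
If you'd rather run the weights directly instead of using the space, here is a minimal sketch with Hugging Face transformers (assuming the 3B conversion lives at jbochi/madlad400-3b-mt; adjust the repo id for other sizes):

    # Minimal sketch: translate with the converted MADLAD-400 weights.
    # The repo id is an assumption for the 3B conversion; swap it for another size.
    from transformers import T5ForConditionalGeneration, T5Tokenizer

    model_name = "jbochi/madlad400-3b-mt"
    tokenizer = T5Tokenizer.from_pretrained(model_name)
    model = T5ForConditionalGeneration.from_pretrained(model_name)

    # The target language is chosen with a "<2xx>" prefix token.
    text = "<2pt> I love pizza!"
    input_ids = tokenizer(text, return_tensors="pt").input_ids
    outputs = model.generate(input_ids=input_ids, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))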

I also published quantized GGUF weights you can use with candle. It decodes at ~15 tokens/s on an M2 Mac.

It seems that NLLB is the most popular machine translation model right now, but its license only allows non-commercial usage. MADLAD-400 is CC BY 4.0.

  • danigoncalves · 1 year ago

    What would be the equivalent models that are open source and free for commercial use?

  • remixer_dec · 1 year ago

    Thanks a lot for converting and quantizing these. I have a couple of questions.

    How does it compare to ALMA (13B)?

    Is it capable of translating more than 1 sentence at a time?

    Is there a way to specify source language or does it always detect it on its own?

    • jbochi (OP) · 1 year ago

      Thanks!

      - I’m not familiar with ALMA, but it seems to be similar to MADLAD-400. Both are smaller than NLLB-54B, but competitive with it. Because ALMA is an LLM and not a seq2seq model with cross-encoding, I’d guess it’s faster.
      - You can translate up to 128 tokens at a time (see the chunking sketch below for longer text).
      - You can only specify the target language, not the source language.
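
      If you need more than that, a rough workaround is to split the text into sentences and translate each chunk on its own. A sketch, assuming the transformers conversion (the repo id and the naive splitter are only illustrative):

          # Rough sketch: work around the ~128-token limit by translating sentence by sentence.
          from transformers import T5ForConditionalGeneration, T5Tokenizer

          model_name = "jbochi/madlad400-3b-mt"  # illustrative repo id
          tokenizer = T5Tokenizer.from_pretrained(model_name)
          model = T5ForConditionalGeneration.from_pretrained(model_name)

          def translate_long(text, target="<2fr>", max_tokens=128):
              # Naive sentence split; a real splitter (e.g. nltk) would be more robust.
              sentences = [s.strip() for s in text.split(".") if s.strip()]
              translations = []
              for sentence in sentences:
                  inputs = tokenizer(f"{target} {sentence}.", return_tensors="pt",
                                     truncation=True, max_length=max_tokens)
                  ids = model.generate(**inputs, max_new_tokens=max_tokens)
                  translations.append(tokenizer.decode(ids[0], skip_special_tokens=True))
              return " ".join(translations)

          print(translate_long("The sun rises in the East. It sets in the West."))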

    • jbochi (OP) · 1 year ago

      Sorry, but what is not working?

      • Puzzleheaded_Mall546 · 1 year ago

        I wrote incomplete text to see how it would be translated, and the result was a continuation of my text, not a translation.

        • jbochi (OP) · 1 year ago

          How are you running it? Did you prepend a “<2xx>” token for the target language? For example, “<2fr> hello” will translate “hello” to French. If you are using this space, you can select the target language in the dropdown.
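
          A tiny sketch of the difference the prefix makes, assuming the transformers conversion (the repo id is only illustrative):

              # Without a "<2xx>" prefix there is no target language, so the model tends to
              # continue the text; with "<2fr>" it translates to French.
              from transformers import T5ForConditionalGeneration, T5Tokenizer

              model_name = "jbochi/madlad400-3b-mt"  # illustrative repo id
              tokenizer = T5Tokenizer.from_pretrained(model_name)
              model = T5ForConditionalGeneration.from_pretrained(model_name)

              for prompt in ["hello", "<2fr> hello"]:
                  ids = tokenizer(prompt, return_tensors="pt").input_ids
                  out = model.generate(input_ids=ids, max_new_tokens=32)
                  print(repr(prompt), "->", tokenizer.decode(out[0], skip_special_tokens=True))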

  • phoneixAdi · 1 year ago

    Nice, thank you!! Tried it in the space; works well for me. Noob question: since it’s GGUF, can I run this with llama.cpp? Can I download it and run it locally?

  • vasileer · 1 year ago

    I tested the 3B model on Romanian, Russian, French, and German translations of “The sun rises in the East and sets in the West.” and it works 100%: it gets 10/10 from ChatGPT.