• Desm0ntB
    10 months ago

    Is it still only 4k context size?

    I hope one day someone somehow finds a way to extend Tiefighter’s context to at least 8k.
    Because it’s the perfect model for real-time RP and stories even on weak PCs. It’s smarter than all 7b and 13b models and smarter than many 30b models, but the modest 4k-token context gets eaten up faster than you can enjoy its potential…
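FWIW, most GGUF loaders can already stretch a 4k-trained model past its native context with RoPE scaling, at some quality cost. A sketch with KoboldCpp (the model filename here is just a placeholder for whatever Tiefighter quant you have):

```shell
# Placeholder filename; --contextsize is a real KoboldCpp flag, and it
# auto-adjusts RoPE scaling when you request more than the native context.
python koboldcpp.py --model tiefighter-13b.Q4_K_M.gguf --contextsize 8192
```

Coherence usually degrades a bit the further you push past the trained length, so 8k tends to hold up better than 12k+.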

    • Majestical-psycheOPB
      10 months ago

      Sorry, I was playing around with it for the last day… So far I prefer it over 34B Dolphin Yi (GGUF Q4_K_M)… As for context size, I only used 8k and it was pretty good at recalling things far back. It might be able to do 12k, idk… Haven’t tried it.