I’m struggling to get 7B models to do anything useful. I’m obviously doing something wrong, since plenty of people seem to get good results out of them.

I can’t get them to follow instructions: they keep repeating themselves, and occasionally they start conversing with themselves.

Does anyone have any pointers on what I’m doing wrong?
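
A minimal sketch of the kind of prompting loop I mean, using the Hugging Face transformers library; the model checkpoint and generation settings are just placeholders for illustration, not my exact setup or a known-good recipe:

```python
# Hedged sketch: prompting a 7B instruct model with its chat template and a
# repetition penalty. The checkpoint and settings below are placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mistralai/Mistral-7B-Instruct-v0.2"  # any 7B *instruct* checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

messages = [{"role": "user", "content": "Summarise RAG in two sentences."}]
# apply_chat_template wraps the prompt in the format the model was tuned on;
# skipping this is a common cause of rambling and self-conversation.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(
    inputs,
    max_new_tokens=200,
    do_sample=True,
    temperature=0.7,
    repetition_penalty=1.1,               # discourages looping output
    eos_token_id=tokenizer.eos_token_id,  # stop at end of turn
)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```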

  • MbandoB · 10 months ago

    A fine-tuned Falcon-7B is pretty powerful. Within its domain, and in a RAG stack, it outperforms GPT-3.5.
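
    A rough sketch of that kind of stack, assuming the public tiiuae/falcon-7b-instruct checkpoint and sentence-transformers for retrieval (the comment doesn’t name the specific fine-tune, so these are stand-ins):

    ```python
    # Hedged sketch of a minimal RAG stack around a Falcon-7B-class model.
    # The checkpoint, embedding model, and corpus are stand-ins.
    from sentence_transformers import SentenceTransformer, util
    from transformers import pipeline

    docs = [
        "Falcon-7B is a causal decoder-only model trained on RefinedWeb.",
        "Retrieval-augmented generation injects retrieved passages into the prompt.",
    ]

    # 1. Embed the corpus and the query, then retrieve the closest passage.
    embedder = SentenceTransformer("all-MiniLM-L6-v2")
    doc_emb = embedder.encode(docs, convert_to_tensor=True)

    query = "What is Falcon-7B?"
    query_emb = embedder.encode(query, convert_to_tensor=True)
    best = int(util.cos_sim(query_emb, doc_emb).argmax())

    # 2. Stuff the retrieved context into the prompt and generate.
    generator = pipeline(
        "text-generation", model="tiiuae/falcon-7b-instruct", device_map="auto"
    )
    prompt = f"Context: {docs[best]}\n\nQuestion: {query}\nAnswer:"
    out = generator(prompt, max_new_tokens=100, do_sample=False,
                    return_full_text=False)
    print(out[0]["generated_text"])
    ```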