I’m curious if there’s an ideal setup or pipeline where you can get an LLM to listen and “learn” from you if you just feed it info every day, like a personal diary? Would be interested to see how the model recalls or processes details of my life. Would you just use a web UI like oobabooga to feed info and adapt the model?

  • Severin_SuverenB
    11 months ago

    You will need to feed the model the conversation log every time you query it, so you’re limited by the model’s context length.

    With a 100k-token context model you’d be able to keep a chat log of roughly 70,000–100,000 words, which is about the length of a typical book.
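
    The approach described above can be sketched in a few lines: keep your diary entries in a list, and on every query prepend as many recent entries as fit in the context budget. This is just an illustrative sketch; the token estimate is a crude word-count heuristic (a real setup would use the model’s own tokenizer), and the names here (`trim_log`, `build_prompt`, `CONTEXT_LIMIT_TOKENS`) are made up for the example, not part of any library.

    ```python
    # Sketch of the "feed the whole log every query" approach.
    # Assumptions: ~1.3 tokens per English word, and a hypothetical
    # 100k-token context window; neither number comes from any
    # specific model or tool.

    CONTEXT_LIMIT_TOKENS = 100_000  # e.g. a 100k-context model
    TOKENS_PER_WORD = 1.3           # rough English average

    def estimate_tokens(text: str) -> int:
        """Crude token estimate from the word count."""
        return int(len(text.split()) * TOKENS_PER_WORD)

    def trim_log(entries: list[str], budget: int) -> list[str]:
        """Keep the most recent diary entries that fit in the token budget."""
        kept: list[str] = []
        used = 0
        for entry in reversed(entries):   # walk newest-first
            cost = estimate_tokens(entry)
            if used + cost > budget:
                break                     # oldest entries fall off first
            kept.append(entry)
            used += cost
        return list(reversed(kept))       # restore chronological order

    def build_prompt(entries: list[str], question: str) -> str:
        """Prepend the (trimmed) diary to the user's question."""
        history = "\n".join(trim_log(entries, CONTEXT_LIMIT_TOKENS))
        return f"Diary so far:\n{history}\n\nQuestion: {question}"
    ```

    The resulting prompt string is what you’d hand to whatever backend you run; once the diary outgrows the budget, the oldest entries silently drop out, which is exactly the limitation the comment points at.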