I’m curious if there’s an ideal setup or pipeline where you can get an LLM to listen and “learn” from you if you just feed it info every day, like a personal diary. I’d be interested to see how the model recalls or processes details of my life. Would you just use a web UI like oobabooga to feed it info and adapt the model?

  • MordyOfTheMooMooB · 11 months ago

    Do you have a specific use case or need in mind? If you want it to remember things, you wouldn’t necessarily ‘feed it into an LLM’; you’d more likely store the entries somewhere and pull the relevant ones into the prompt when you ask about them. If you want it to produce output more like how you’d speak, then fine-tuning would probably be appropriate.
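
    For the ‘remember things’ route, here’s a rough sketch of what that could look like (just an illustration using sentence-transformers for the embeddings; the diary entries and model name are placeholders, swap in whatever you actually run):

    ```python
    # Sketch: store diary entries, embed them, then pull the most relevant
    # ones into the prompt instead of retraining the model on them.
    from sentence_transformers import SentenceTransformer, util

    model = SentenceTransformer("all-MiniLM-L6-v2")  # small local embedding model

    diary = [
        "2024-01-03: Started a new job at the bakery, nervous but excited.",
        "2024-01-10: Adopted a cat named Miso.",
        "2024-02-02: Miso knocked my coffee onto the keyboard.",
    ]
    diary_embeddings = model.encode(diary, convert_to_tensor=True)

    question = "What happened with my cat?"
    query_embedding = model.encode(question, convert_to_tensor=True)

    # Grab the top 2 most similar entries and stuff them into the LLM prompt.
    hits = util.semantic_search(query_embedding, diary_embeddings, top_k=2)[0]
    context = "\n".join(diary[hit["corpus_id"]] for hit in hits)
    prompt = f"Here are some of my diary entries:\n{context}\n\nQuestion: {question}"
    print(prompt)  # this prompt would then go to whatever LLM / web UI you run
    ```

    Fine-tuning, by contrast, is more about picking up your style and voice than reliably recalling specific facts.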

    Depending on what you wanna do, it will have different design requirements.

    In general, I’d ask what’s the desired goal first.