Hey everyone, I’m looking for a way to use an open source local large language model (LLM) on Linux, particularly on low-spec hardware like a Raspberry Pi, to generate lengthy, coherent stories of 10k+ words from a single prompt. I recall reading about methods described in papers such as “Re3: Generating Longer Stories With Recursive Reprompting and Revision” (announced in a Twitter thread in October 2022) and “DOC: Improving Long Story Coherence With Detailed Outline Control” (announced in a Twitter thread in December 2022). Those papers used GPT-3, and since it’s been a while, I was hoping something similar now exists built only from open source tools. Does anyone have experience with this or know of any resources that could help me achieve long, coherent story generation with an open source LLM?
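
To make the ask concrete, here is the rough shape of the pipeline I’m imagining, as a minimal sketch. It assumes llama-cpp-python as the local backend, and the model path, prompts, and premise are placeholders I made up, so treat it as an approximation of the Re3/DOC recipe rather than anything from the papers’ actual code:

    from llama_cpp import Llama

    # Assumed backend: llama-cpp-python with any small GGUF model; on a
    # Raspberry Pi something in the 1-3B range is the realistic ceiling.
    llm = Llama(model_path="model.gguf", n_ctx=2048)

    def complete(prompt, max_tokens=400):
        out = llm(prompt, max_tokens=max_tokens, temperature=0.8)
        return out["choices"][0]["text"].strip()

    premise = "A lighthouse keeper discovers the light summons ships from other centuries."

    # Plan first (the DOC idea): get a detailed outline to steer the draft.
    outline = "1." + complete(
        "Write a numbered 10-point chapter outline for this story premise:\n"
        + premise + "\n\nOutline:\n1.")

    # Draft step by step (the Re3 idea): reprompt each step with a rolling
    # summary instead of the full story, so the context never overflows.
    summary = premise
    chapters = []
    for point in [line for line in outline.splitlines() if line.strip()]:
        passage = complete(
            "Story so far (summary): " + summary + "\n\n"
            "Outline for the next passage: " + point + "\n\n"
            "Continue the story:\n", max_tokens=600)
        chapters.append(passage)
        # Compress what just happened back into the rolling summary.
        summary = complete(
            "Summarize this story so far in under 150 words:\n"
            + summary + "\n" + passage + "\n\nSummary:")

    print("\n\n".join(chapters))

That skips Re3’s rewrite and revision passes entirely, and I haven’t found an open source project that packages the whole loop. Any advice or pointers would be greatly appreciated. Thank you!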

  • Shawdow194@kbin.social · 10 points · 9 months ago

    I think you’ll struggle with the “coherent” part.

    Most LLMs can do a few paragraphs and stay on topic, but after that they need better guidance, usually by changing the prompt as the story goes to keep them on track (rough sketch below). Staying coherent over 10k+ words from a single prompt is hard even for human authors, let alone for a GPT-3-class model.
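
    In practice the naive version of “changing the prompt” is just re-anchoring each step on the most recent text. A minimal sketch, assuming llama-cpp-python and a placeholder GGUF model (my assumptions, not anything OP named):

        from llama_cpp import Llama

        # Assumed setup: llama-cpp-python with any local GGUF model.
        llm = Llama(model_path="model.gguf", n_ctx=2048)

        story = "A heist goes wrong in a city of glass.\n\n"
        for _ in range(20):
            # Re-anchor every step on only the most recent text so the
            # prompt stays relevant; without this the model drifts once
            # the story outgrows its context window.
            recent = story[-1500:]
            out = llm("Continue this story:\n" + recent, max_tokens=300)
            story += out["choices"][0]["text"]

        print(story)

    Even this drifts badly over thousands of words, which is exactly the gap the papers you cite try to close with outlines, summaries, and revision passes.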

    • Shawdow194@kbin.social · 2 points · 9 months ago (edited)

      Reading the articles you linked, OP, this is exactly the technology they are still struggling with. I don’t think any open source consumer-level models will have quite what you’re looking for… yet!