Shawdow194@kbin.social to Linux@lemmy.ml • How can I use a local LLM on Linux to generate a long story?
I think you'll struggle with the coherence part.
Most LLMs can stay on topic for a few paragraphs, but after that they need better guidance, usually by updating the prompt so the next chunk stays relevant (see the sketch below). Staying coherent across 10k+ words from a single prompt is hard even for human authors, let alone a GPT-3-class model.
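To show what I mean by re-prompting in chunks, here's a minimal sketch using llama-cpp-python on a local GGUF model. The model path, chunk count, token limits, and the crude tail-truncation of the story context are all placeholders I made up, not anything from OP's articles:

```python
# Minimal sketch: generate a long story in chunks, feeding the tail of the
# story back into each new prompt so the model has some continuity.
# Assumes llama-cpp-python is installed; the model path is hypothetical.
from llama_cpp import Llama

llm = Llama(model_path="models/your-model.gguf", n_ctx=4096)  # placeholder path

premise = "A lighthouse keeper discovers the lamp is signalling something back."
story_so_far = ""

for _ in range(5):  # five installments instead of one giant prompt
    prompt = (
        "You are writing a long story in installments.\n"
        f"Premise: {premise}\n"
        f"Story so far (may be empty): {story_so_far[-2000:]}\n"  # crude tail-only context
        "Continue the story with the next few paragraphs, staying consistent "
        "with what has already happened:\n"
    )
    out = llm(prompt, max_tokens=512, temperature=0.8)
    story_so_far += out["choices"][0]["text"]

print(story_so_far)
```

Even then you'd probably want to summarize earlier chunks rather than just tail-truncate them, which is exactly where the coherence problem creeps back in.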
Reading the articles you attached, OP, this is exactly the part of the technology they're still struggling with. I don't think any open-source, consumer-level models will have quite what you're looking for… yet!