You can use the oobabooga API to do that. I haven’t tried it myself, so I can’t say much about it.
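For reference, text-generation-webui can expose an OpenAI-compatible HTTP API when started with its API flag enabled. Below is a minimal, hedged sketch of calling it; the port (5000), the `/v1/completions` path, and the response shape are assumptions based on recent versions, so check the docs for your install:

```python
import json
import urllib.request

def build_completion_request(prompt, host="http://127.0.0.1:5000"):
    """Build the URL and JSON payload for text-generation-webui's
    OpenAI-compatible /v1/completions endpoint. Endpoint path and
    default port are assumptions; verify against your version."""
    url = f"{host}/v1/completions"
    payload = {
        "prompt": prompt,
        "max_tokens": 200,
        "temperature": 0.7,
    }
    return url, payload

def complete(prompt):
    # Sends the request; this needs a running server, so it is not
    # exercised here.
    url, payload = build_completion_request(prompt)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["text"]
```

This only builds and posts a plain JSON completion request, so it works the same from a script, a notebook, or a cron job without any client library.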
You can start by reading oobabooga’s wiki; I think it’s one of the most beginner-friendly tools. https://github.com/oobabooga/text-generation-webui/wiki/05-‐-Training-Tab
tgredditfc to LocalLLaMA@poweruser.forum • When training an LLM how do you decide to use a 7b, 30b, 120b, etc model (assuming you can run them all)?
1 · 2 years ago
If I can run them all I will just pick the biggest one.
tgredditfc to LocalLLaMA@poweruser.forum • What prompts/questions do you use to test a model’s capabilities? Ideally ones that aren’t included in their training data.
1 · 2 years ago
“Write the snake game using pygame”
tgredditfc to LocalLLaMA@poweruser.forum • ExLlamaV2: The Fastest Library to Run LLMs
1 · 2 years ago
Thanks for sharing! I have been struggling with the llama.cpp loader and GGUF (using oobabooga and the same model): no matter how I set the parameters or how many layers I offload to the GPUs, llama.cpp is way slower than ExLlama (v1 and v2) — not just a bit slower, but an order of magnitude slower. I really don’t know why.
tgredditfc to LocalLLaMA@poweruser.forum • ExLlamaV2: The Fastest Library to Run LLMs
1 · 2 years ago
In my experience it’s the fastest, and llama.cpp is the slowest.
tgredditfc (OP) to LocalLLaMA@poweruser.forum • Is it possible to fine tune a 33B model with 48GB vRAM?
1 · 2 years ago
Thank you! It looks very deep to me; I will look into it.
tgredditfc (OP) to LocalLLaMA@poweruser.forum • Is it possible to fine tune a 33B model with 48GB vRAM?
1 · 2 years ago
Thanks! I have some problems loading GPTQ models with the Transformers loader.
tgredditfc (OP) to LocalLLaMA@poweruser.forum • Is it possible to fine tune a 33B model with 48GB vRAM?
1 · 2 years ago
Thanks for sharing!
tgredditfc to LocalLLaMA@poweruser.forum • 🐺🐦‍⬛ LLM Format Comparison/Benchmark: 70B GGUF vs. EXL2 (and AWQ)
1 · 2 years ago
I have 2 GPUs, and AWQ never works for me in oobabooga: no matter how I split the VRAM, it goes out of memory in most cases.
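For what it’s worth, text-generation-webui accepts per-GPU memory caps when splitting a model across cards. A hedged sketch of an explicit two-GPU split — the flag names are as I remember them, the model folder is a placeholder, and whether the cap is honored by the AWQ loader depends on your version, so check `python server.py --help`:

```shell
# Cap GPU 0 at 20 GiB and GPU 1 at 22 GiB when loading the model.
# Leaving a few GiB of headroom per card for the cache can help avoid OOM.
python server.py --model your-model-folder --gpu-memory 20 22
```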
Maybe. I haven’t done it yet, so I don’t know. You can search around.