Hi!
I’m quite new to LLMs and want to use one to generate training workouts. My idea would be to feed it scientific studies and a bunch of example workouts.
Is this what “training a model” is for? Are there any resources where I can start learning how to train one?
Can I use an already fine-tuned model like Mistral, or do I need to train a base model like Llama 2?
Can I train a quantized model, or do I need to use a vanilla one and quantize it after training?
I have 2x 3090s, a 5950X, and 64 GB of RAM, if that matters. If I can load a model for inference, can I also train it? Are the resource requirements the same?
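For that last question, here’s the rough back-of-the-envelope math I’ve pieced together so far (just a sketch; the bytes-per-parameter figures and the 7B example are my own assumptions, and it ignores activations, KV cache, and framework overhead), so please correct me if it’s off:

```python
# Rough VRAM rules of thumb I've seen quoted (assumptions, not authoritative):
#   fp16 inference:               ~2 bytes per parameter
#   full fine-tune (fp16 + Adam): ~16 bytes per parameter (weights + grads + optimizer states)
#   QLoRA base weights (4-bit):   ~0.5 bytes per parameter, plus a small adapter

def estimate_gib(params_billions: float, bytes_per_param: float) -> float:
    """Very rough VRAM estimate; ignores activations, KV cache, and overhead."""
    return params_billions * 1e9 * bytes_per_param / 1024**3

for label, bpp in [
    ("fp16 inference", 2.0),
    ("full fine-tune (fp16 + Adam)", 16.0),
    ("QLoRA 4-bit base weights", 0.5),
]:
    print(f"7B model, {label}: ~{estimate_gib(7, bpp):.0f} GiB")
```

If that math is roughly right, loading a 7B model for inference would fit easily, but a full fine-tune would need a lot more than inference does.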
Thanks!
Hi! This is the first time I’m seeing SPR, are there any resources where I can learn more about it? I’ve also seen privateGPT. I believe it’s a front end that lets you upload files, and I guess it builds a database from what you feed it using something like chromaDB, which then gets taken into consideration when giving answers. Is that right?
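Here’s roughly what I imagine it doing under the hood, just as a sketch of my understanding (using chromadb directly; the collection name, document text, and prompt format are made up by me, not taken from privateGPT):

```python
# Sketch of how I imagine privateGPT-style retrieval works (my guess, not its actual code).
import chromadb

client = chromadb.Client()  # in-memory; a real app would use a persistent client
collection = client.get_or_create_collection("workout_docs")  # hypothetical collection name

# "Upload" documents: Chroma embeds them with its default embedding model and stores the vectors.
collection.add(
    ids=["study-1", "workout-1"],
    documents=[
        "Study: weekly training volume per muscle group and hypertrophy...",  # placeholder text
        "Example workout: squat 5x5, bench 5x5, row 5x5...",                   # placeholder text
    ],
)

# At question time: retrieve the most similar chunks and stuff them into the prompt.
question = "Build me a 3-day strength program."
results = collection.query(query_texts=[question], n_results=2)
context = "\n".join(results["documents"][0])

prompt = f"Answer using this context:\n{context}\n\nQuestion: {question}"
print(prompt)  # this prompt would then be sent to the LLM
```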