derpgod123B to LocalLLaMA@poweruser.forum · English · 1 year ago
Is using WSL good enough for running LLM models locally?
AnomalyNexus · 1 year ago
It’s fine… you take a bit of a hit on VRAM simply because Windows itself is running, though. It’s only really an issue if you’re on 8 GB.
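If you want to see what that Windows overhead actually looks like on your card, here’s a minimal sketch (assuming PyTorch with a working CUDA setup inside your WSL install) that prints how much VRAM is already claimed before you load a model:

```python
# Minimal sketch: report free vs. total VRAM as seen from inside WSL.
# Assumes PyTorch is installed and the WSL CUDA driver passthrough works.
import torch

if torch.cuda.is_available():
    free, total = torch.cuda.mem_get_info()  # both values in bytes
    print(f"Total VRAM: {total / 1024**3:.1f} GiB")
    print(f"Free VRAM:  {free / 1024**3:.1f} GiB")
    # Whatever is missing here is taken by Windows (desktop compositing,
    # browsers, etc.) plus any other GPU processes already running.
    print(f"Already in use: {(total - free) / 1024**3:.1f} GiB")
else:
    print("CUDA not visible from WSL - check your driver setup.")
```

On an 8 GB card, even a gigabyte or so claimed by the Windows desktop can be the difference between a model fitting in VRAM or not.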