TheHumanFixer to LocalLLaMA@poweruser.forum · 1 year ago
Is there really no way to run 70B models without having a very fast GPU or a lot of RAM?
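(Not from the thread, but useful context for the question: a back-of-the-envelope estimate of the RAM needed just to hold a 70B model's weights at common precisions. The 4.5 bits-per-weight figure is an assumption roughly matching common 4-bit quantization schemes; KV cache and runtime overhead are ignored.)

```python
# Rough RAM needed to hold 70B parameters' weights alone, ignoring
# KV cache, activations, and runtime overhead.
PARAMS = 70e9

def weight_gib(bits_per_param: float) -> float:
    """GiB required for the weights at a given precision."""
    return PARAMS * bits_per_param / 8 / 2**30

for name, bits in [("fp16", 16), ("8-bit", 8), ("~4-bit", 4.5)]:
    print(f"{name}: ~{weight_gib(bits):.0f} GiB")
# fp16 is on the order of 130 GiB; even aggressive 4-bit
# quantization still needs tens of GiB.
```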
TheHumanFixer to LocalLLaMA@poweruser.forum · 1 year ago
Will a local LLM simply not run on a computer with low RAM, or will it run but be incredibly slow?
TheHumanFixer to LocalLLaMA@poweruser.forum · 1 year ago
Is it possible to run Llama with 4 GB of RAM?
TheHumanFixer (OP), in reply · 1 year ago
Nope, regular RAM.
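(An illustrative check, not from the thread: applying the same weights-only arithmetic, a small model at ~4-bit quantization is borderline in 4 GiB, while a 70B model is far out of reach. The 4.5 bits-per-weight figure is an assumption.)

```python
# Can a model's ~4-bit weights fit in a given amount of RAM?
# Weights only; the OS, KV cache, and runtime overhead are ignored,
# so a "True" here is still optimistic in practice.
def fits_in_ram(params_billion: float, ram_gib: float = 4.0,
                bits_per_param: float = 4.5) -> bool:
    weights_gib = params_billion * 1e9 * bits_per_param / 8 / 2**30
    return weights_gib < ram_gib

print(fits_in_ram(7))    # 7B at ~4.5 bits is roughly 3.7 GiB: borderline
print(fits_in_ram(70))   # 70B does not come close to fitting
```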