ForsookComparisonB to LocalLLaMA@poweruser.forum · English · 10 months ago

Cheapest GPU/Way to run 30b or 34b "Code" Models with GPT4ALL?