Cheapest GPU/Way to run 30b or 34b "Code" Models with GPT4ALL?
ForsookComparisonB to LocalLLaMA@poweruser.forum · English · 1 year ago · 1 comment
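For reference, a minimal sketch of what running such a model looks like with the gpt4all Python bindings, assuming a quantized GGUF build of a 34B code model already on disk (the filename and directory below are placeholders, not specific recommendations); actual VRAM needs depend on the quantization level and backend.

```python
# Minimal sketch: load a quantized 34B code model with the gpt4all Python
# bindings and request GPU (Vulkan) offload. The model filename is an
# illustrative placeholder -- substitute whatever GGUF build you have locally.
from gpt4all import GPT4All

model = GPT4All(
    "codellama-34b-instruct.Q4_K_M.gguf",  # assumed local GGUF file (example name)
    model_path="./models",                 # directory containing the file
    device="gpu",                          # ask for GPU offload instead of CPU
    allow_download=False,                  # use the local file only, no fetching
)

prompt = "Write a Python function that reverses a singly linked list."
print(model.generate(prompt, max_tokens=256))
```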