MonkeyMaster64 to LocalLLaMA@poweruser.forum • ExLlamaV2: The Fastest Library to Run LLMs • English · 1 year ago
Is this able to use CPU (similar to llama.cpp)?
MonkeyMaster64 to LocalLLaMA@poweruser.forum • English · 1 year ago
Large-scale LLM deployment with GBNF support (1 comment)