alchemist1e9 to LocalLLaMA@poweruser.forum · English · 1 year ago
ExLlamaV2: The Fastest Library to Run LLMs
towardsdatascience.com
22 comments
tgredditfc · 1 year ago
In my experience it's the fastest and llama.cpp is the slowest.