send a PR with your patch!
huge fan of server.cpp too! I actually embed a universal binary (created with lipo) in my macOS app (FreeChat) and use it as an LLM backend running on localhost. Seeing how quickly it improves makes me very happy about this architecture choice.
I just saw the improvements issue today. Pretty excited about the possibility of getting chat template functionality since currently all of that complexity has to live in my client.
Also, TIL about the batching stuff. I’m going to try getting multiple responses using that.
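(For anyone curious about that architecture, here is a minimal sketch of the embedded-backend pattern: launch a bundled llama.cpp server binary and talk to it over localhost. The resource name `server`, the port, and the error handling are my assumptions, not FreeChat's actual code; the `-m`/`--host`/`--port` flags are standard llama.cpp server flags.)

```swift
import Foundation

// Sketch: launch a llama.cpp server binary shipped inside the app bundle
// and expose it on localhost. Resource name, flags, and port are
// illustrative assumptions, not FreeChat's real implementation.
func launchEmbeddedServer(modelPath: String, port: Int = 8690) throws -> Process {
    // The universal (arm64 + x86_64) binary produced with `lipo`,
    // copied into the app bundle's resources at build time.
    guard let serverURL = Bundle.main.url(forResource: "server", withExtension: nil) else {
        throw CocoaError(.fileNoSuchFile)
    }
    let process = Process()
    process.executableURL = serverURL
    process.arguments = [
        "-m", modelPath,        // GGUF model to serve
        "--host", "127.0.0.1",  // bind to localhost only
        "--port", String(port)
    ]
    try process.run()           // HTTP API now available at http://127.0.0.1:<port>
    return process
}
```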
sleeper-2 (OP) to LocalLLaMA@poweruser.forum • llama.cpp for normies: FreeChat is now live on the mac app store • 2 years ago
hell yeah, glad you can use it!
sleeper-2 to LocalLLaMA@poweruser.forum • Skywork-13B: a new foundation model trained on 3.2 trillion tokens • 2 years ago
and is taiwan part of china?
sleeper-2 (OP) to LocalLLaMA@poweruser.forum • llama.cpp for normies: FreeChat is now live on the mac app store • 2 years ago
Looks like it’s been available since macOS 12: https://developer.apple.com/documentation/swiftui/shapestyle/ultrathinmaterial?changes=_5
sleeper-2 (OP) to LocalLLaMA@poweruser.forum • llama.cpp for normies: FreeChat is now live on the mac app store • 2 years ago
I think I needed a newer API for something, so the minimum is 13.5 (the last minor of the previous major version). What version are you on?
sweet, ty 😎!