I am using kobold.cpp and it can’t produce anything beyond a hello world program. Am I doing something wrong?
It is only 1.3B :-) I have noticed that smaller models work a lot better with longer, more detailed prompts (at least 440 characters, better with twice that many).
Two ideas:
- use deepseek-coder-1.3b-instruct not the base model
- check that you use the correct prompting template for the model (see the sketch below for one way to apply it)
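Here is a rough sketch of what I mean, assuming you run koboldcpp with its default KoboldAI-style API on localhost:5001 and wrap the question in an Alpaca-style instruct template. The template text is from memory, so double-check it against the deepseek-coder-1.3b-instruct model card before relying on it:

```python
import json
import urllib.request

# Instruct-style template (check the exact wording on the model card).
TEMPLATE = (
    "You are an AI programming assistant.\n"
    "### Instruction:\n{question}\n"
    "### Response:\n"
)

def ask(question: str, max_length: int = 512) -> str:
    # Send the templated prompt to a locally running koboldcpp instance.
    payload = {
        "prompt": TEMPLATE.format(question=question),
        "max_length": max_length,
        "temperature": 0.2,                      # low temperature keeps code output focused
        "stop_sequence": ["### Instruction:"],   # stop before the model starts a new turn
    }
    req = urllib.request.Request(
        "http://localhost:5001/api/v1/generate",  # default koboldcpp port; adjust if needed
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["results"][0]["text"]

print(ask("Write a Python function that reverses a string."))
```

If the template or stop sequence is off even slightly, a 1.3B model tends to ramble instead of writing code, so getting this part right matters more than any sampler setting.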
It is the instruct model. You can see underneath the prompt box that it’s the deepseek-coder-1.3b-instruct_Q5_K_s model. I used the prompting template that comes with the model, and it slightly improved the answers.
But if I ask it to write some code, it almost never does; it mostly outputs gibberish instead.
Does GPU/CPU quality affect the AI’s output? My device is a potato.