Hello after a long time :)

I am TokenBender.
Some of you may remember my previous model, codeCherryPop.
It was very kindly received, so I'm hoping I won't be killed this time either.

Releasing EvolvedSeeker-1.3B v0.0.1
A 1.3B model with 68.29% on HumanEval.
The base model is quite cracked; I just did with it what I usually try to do with every coding model.

Here is the model - https://huggingface.co/TokenBender/evolvedSeeker_1_3
I'll post this in TheBloke's server for GGUF conversion, but I find that DeepSeek Coder's GGUF sucks for some reason, so let's see.

EvolvedSeeker v0.0.1 (First phase)

This model is a fine-tuned version of deepseek-ai/deepseek-coder-1.3b-base on 50k instructions for 3 epochs.

I have mostly curated instructions from evolInstruct datasets and some portions of glaive coder.

Around 3k answers were modified via self-instruct.

Recommended format is ChatML; Alpaca will work too, but take care with the EOT token.
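For reference, here is a minimal sketch of the standard ChatML template as I understand it. The exact special tokens this model expects are an assumption on my part; check the tokenizer config on the model card before relying on it.

```python
# Minimal ChatML prompt builder (sketch). The <|im_start|>/<|im_end|>
# markers are the usual ChatML convention, assumed here, not confirmed
# against this specific model's tokenizer.
def build_chatml_prompt(system: str, user: str) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are a helpful coding assistant.",
    "Write a function that reverses a string.",
)
print(prompt)
```

The prompt ends mid-turn after the assistant header, so generation continues as the assistant's reply.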

This is a very early version of a 1.3B-sized model in my major project, PIC (Partner-in-Crime).
Next I'm going to teach this model JSON/Markdown adherence.

https://preview.redd.it/jhvz3xoj7y1c1.png?width=1500&format=png&auto=webp&s=3c0ec081768293885a9953766950758e9bf6db7d

I will just focus on simple things that I can do for now, but anything you guys say will be taken into consideration for fixes.

  • naptastic · 1 year ago

    Ok, it finally downloaded and I’ve spent a few minutes with it. It keeps getting into endless pathways of jargon (e.g., “fair play make world communal environment tolerant embraces diversity embrace equity promote unity instill resilience proactive leadership”, and it just goes on like that, no punctuation, no connecting words, until it reaches the token limit). What loader and settings work best with this model?

    • ahm_rimer (OP) · 1 year ago

      Try the chat inference code mentioned in the model card if you’re running it on GPU. The size is good enough to test on free Colab as well.

      • naptastic · 1 year ago

        That definitely works better. I wouldn’t trust it too far though. It just told me I can remove the first part of a file with one seek() and one truncate() call…
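naptastic is right to be suspicious here: in Python, `truncate()` cuts the file off at the current size or offset, so one `seek()` plus one `truncate()` removes the *end* of a file, never the beginning. A small stdlib-only demonstration (nothing model-specific, just showing why the model's advice fails and what actually works):

```python
# truncate() keeps the head of the file and drops the tail, so the
# model's suggested seek()+truncate() cannot remove the first part.
import os
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)

with open(path, "w") as f:
    f.write("HEAD-TAIL")

# Attempt the model's suggestion: seek past the unwanted prefix, truncate.
with open(path, "r+") as f:
    f.seek(5)      # position just after "HEAD-"
    f.truncate()   # cuts everything AFTER offset 5, i.e. drops "TAIL"

with open(path) as f:
    data = f.read()
# data == "HEAD-": the beginning survived; the end was removed instead.

# Actually dropping the first part means rewriting the remainder forward:
with open(path, "w") as f:
    f.write("HEAD-TAIL")
with open(path, "r+") as f:
    f.seek(5)
    rest = f.read()   # "TAIL"
    f.seek(0)
    f.write(rest)     # shift the tail to the front
    f.truncate()      # cut off the leftover bytes

with open(path) as f:
    result = f.read()  # "TAIL"
os.remove(path)
```

So the one-call answer is wrong on any platform; the rewrite-and-truncate dance (or writing a new file) is the real fix.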