Has anyone been able to get ANY open source LLM to use LangChain tools? I have not had success with any of the models I have tried, including Llama 2, Mistral, and Yi 34B. I usually get "Cannot parse LLM output" type errors. In other cases the model invokes the tool correctly and I can see the answer in the observation, but it then fails to return that answer as the final response.
In my application the answer from the tool has a specific format, so it should be easy to extract from the observations with a regex (assuming I can access the observations).
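For concreteness, this is roughly what I mean (untested sketch; `tools` and `llm` are assumed to already exist, and `ANSWER: <value>` is a made-up stand-in for my tool's output format):

```python
import re

from langchain.agents import AgentType, initialize_agent

# Sketch only: assumes `tools` and `llm` are defined elsewhere.
agent = initialize_agent(
    tools=tools,
    llm=llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    return_intermediate_steps=True,   # expose (action, observation) pairs in the result
    handle_parsing_errors=True,       # don't crash outright on "Cannot parse LLM output"
    verbose=True,
)

result = agent({"input": "my question"})

# Fall back to a regex over the observations when the final answer is wrong or empty.
answer = None
for action, observation in result["intermediate_steps"]:
    match = re.search(r"ANSWER:\s*(.+)", str(observation))  # hypothetical tool output format
    if match:
        answer = match.group(1)
        break

print(answer or result["output"])
```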
But I'm wondering whether anyone has had success with ANY open source LLM and LangChain tools, where the model both calls the tool correctly and returns the final answer without erroring?
I tried following this guide with Llama 2 13B:
https://www.pinecone.io/learn/llama-2/
I get "ValueError: unknown format from LLM: "
Got it working with Llama 2 70B following the tutorial from James Briggs. Note that it did not work with Llama 2 13B, which returned an empty output at the end. https://stackoverflow.com/questions/77491941/llama-2-with-langchain-tools
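For anyone else hitting the "Cannot parse LLM output" errors: one workaround I've been experimenting with is wrapping the agent's output parser so that a parse failure is returned as the final answer instead of raising. Untested sketch, and `LenientParser` is just my own name for it:

```python
from typing import Union

from langchain.agents import AgentOutputParser
from langchain.schema import AgentAction, AgentFinish, OutputParserException


class LenientParser(AgentOutputParser):
    """Wrap the agent's normal output parser; if it can't parse the LLM text,
    return the raw text as the final answer instead of raising."""

    base_parser: AgentOutputParser

    def parse(self, text: str) -> Union[AgentAction, AgentFinish]:
        try:
            return self.base_parser.parse(text)
        except (OutputParserException, ValueError):
            # Treat unparseable output as the final answer so the run still finishes.
            return AgentFinish(return_values={"output": text.strip()}, log=text)

    def get_format_instructions(self) -> str:
        return self.base_parser.get_format_instructions()


# Usage sketch with the chat-conversational-react-description agent:
# from langchain.agents.conversational_chat.output_parser import ConvoOutputParser
# agent = initialize_agent(
#     tools, llm, agent="chat-conversational-react-description",
#     agent_kwargs={"output_parser": LenientParser(base_parser=ConvoOutputParser())},
# )
```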