Hello everyone,

I'm encountering an issue while trying to launch the Multimodal/LLaVA model in the text-generation-webui framework. Despite following the standard setup procedures, I'm faced with a `KeyError: 'llava'` during the model loading phase.

Here's what I've tried so far:

- Changing the `model_type` in the `config.json` of `llava` to `"llama"`, as suggested in similar cases.

- Experimenting with different models and versions, all using the flags found here: https://github.com/oobabooga/text-generation-webui/tree/main/extensions/multimodal

- Using the following flags: `--listen --api --chat --trust-remote-code --model liuhaotian_llava-v1.5-7b --multimodal-pipeline llava-v1.5-7b --load-in-4bit`
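
For reference, the `config.json` edit from the first bullet above, in sketch form (only the changed key is shown; the real file contains many more fields, which I left untouched):

```json
{
  "model_type": "llama"
}
```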

Unfortunately, these steps haven't resolved the issue. The error occurs specifically when the `AutoConfig.from_pretrained` method is called in the `transformers` library and the `llava` key isn't found in the configuration mapping (full traceback below).
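
To illustrate what I believe is happening (this is a toy sketch, not the actual `transformers` source): `AutoConfig.from_pretrained` resolves the `model_type` string from `config.json` through a mapping of known model types, and a `transformers` version that predates LLaVA support fails that lookup:

```python
# Illustrative subset of the model_type -> config class mapping (assumption:
# this mirrors how AutoConfig resolves "model_type"; the real mapping lives
# inside transformers and contains many more entries).
ILLUSTRATIVE_MAPPING = {"llama": "LlamaConfig", "gpt2": "GPT2Config"}

def resolve_config(model_type: str) -> str:
    # A model_type the installed version doesn't know raises KeyError,
    # which matches the KeyError: 'llava' I'm seeing.
    return ILLUSTRATIVE_MAPPING[model_type]

try:
    resolve_config("llava")
except KeyError as exc:
    print(f"KeyError: {exc}")  # -> KeyError: 'llava'
```

If that reading is right, the `model_type: "llama"` workaround sidesteps the lookup, which is why I tried it first.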

I'd appreciate any insights or suggestions from the community. Has anyone else encountered this issue, or does anyone have advice on how to troubleshoot it further?

```
2023-11-25 14:16:42 INFO:Loading the extension "multimodal"...
Traceback (most recent call last):
  File "D:\text-generation-webui-main\server.py", line 244, in <module>
    create_interface()
  File "D:\text-generation-webui-main\server.py", line 142, in create_interface
    extensions_module.create_extensions_block()  # Extensions block
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\text-generation-webui-main\modules\extensions.py", line 192, in create_extensions_block
    extension.ui()
  File "D:\text-generation-webui-main\extensions\multimodal\script.py", line 99, in ui
    multimodal_embedder = MultimodalEmbedder(params)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\text-generation-webui-main\extensions\multimodal\multimodal_embedder.py", line 27, in __init__
    pipeline, source = load_pipeline(params)
                       ^^^^^^^^^^^^^^^^^^^^^
  File "D:\text-generation-webui-main\extensions\multimodal\pipeline_loader.py", line 34, in load_pipeline
    model_name = shared.args.model.lower()
                 ^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'lower'
Press any key to continue . . .
```