Error while loading AutoProcessor
I got an error while loading:
processor = AutoProcessor.from_pretrained(model_name, trust_remote_code=True)
File "/home/ls/miniforge3/envs/phi4/lib/python3.10/site-packages/transformers/models/auto/processing_auto.py", line 303, in from_pretrained
config = AutoConfig.from_pretrained(
File "/home/ls/miniforge3/envs/phi4/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1091, in from_pretrained
raise ValueError(
ValueError: Unrecognized model in microsoft/Phi-4-multimodal-instruct. Should have a `model_type` key in its config.json, or contain one of the following strings in its name:
@pudashi Maybe you can look at the requirements https://huggingface.co/microsoft/Phi-4-multimodal-instruct#requirements and check whether your code works with those settings.
Can you double check the transformers version? Is it 4.48.2?
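A quick stdlib-only sketch for checking this programmatically; the required version (4.48.2) is taken from this thread and the model card, so double-check it against the current requirements:

```python
# Verify the installed transformers version matches what the model card
# requires (4.48.2, per this thread). Uses only the standard library.
from importlib.metadata import version, PackageNotFoundError

REQUIRED = "4.48.2"

def check_transformers(required: str = REQUIRED) -> bool:
    """Return True if transformers is installed at exactly `required`."""
    try:
        installed = version("transformers")
    except PackageNotFoundError:
        print(f"transformers is not installed; run: pip install transformers=={required}")
        return False
    if installed != required:
        print(f"transformers {installed} installed, but {required} is required")
        return False
    return True
```

If the check fails, `pip install transformers==4.48.2` should pin the version the model expects.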
Got it working here
https://github.com/anastasiosyal/phi4-multimodal-instruct-server/
You will find a dockerfile with the right dependencies
Also an OpenAI compatible chat completion endpoint for self hosting the model
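For anyone self-hosting it, here is a minimal sketch of a client for an OpenAI-compatible chat completion endpoint like the one in that repo; the route (`/v1/chat/completions`) and model name are assumptions, so check them against the server's configuration:

```python
# Minimal client sketch for an OpenAI-compatible chat completion endpoint.
# The route and model name below are assumptions; adjust to your deployment.
import json
import urllib.request

def build_payload(prompt: str, model: str = "phi-4-multimodal-instruct") -> dict:
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(base_url: str, prompt: str) -> str:
    """POST the prompt to the server and return the assistant's reply text."""
    req = urllib.request.Request(
        base_url.rstrip("/") + "/v1/chat/completions",
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Usage would be something like `chat("http://localhost:8000", "Describe this image")`, assuming the server is listening on that port.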
> Can you double check the transformers version? Is it 4.48.2?

yep, I ran pip list | grep transformers and it shows 4.48.2
> Got it working here
> https://github.com/anastasiosyal/phi4-multimodal-instruct-server/
> You will find a dockerfile with the right dependencies
> Also an OpenAI compatible chat completion endpoint for self hosting the model
thx I will try it out
@anastasiosyal Thank you for making the dockerfile.
You're welcome! I think it would help if more models were released with Dockerfiles to standardise the dependencies; dependencies are moving targets and change pretty quickly.