
demo doesn't work: Error while finding module specification for 'llava.serve.cli' (ModuleNotFoundError: No module named 'llava') #9

@chuangzhidan

Description


python -m llava.serve.cli \
    --model-path ./checkpoints/llava-hr-7b-sft-1024 \
    --image-file "./assets/example.jpg"

I downloaded the model from Hugging Face and didn't find any model card info, sadly. So I ran the command line above, changing the path to point at my own model file on the server, and it doesn't work. How do I fix this?
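(For anyone triaging: a quick, hedged way to see which package name is actually importable before running `python -m ...` — going by the paths in the log below, the repo's package directory appears to be `llava_hr` rather than `llava`:)

```python
import importlib.util

# find_spec returns None when a top-level package cannot be found, so this
# shows which of the two candidate names would resolve for `python -m`.
for name in ("llava", "llava_hr"):
    spec = importlib.util.find_spec(name)
    print(name, "importable" if spec else "not importable")
```

If `llava` prints "not importable", the module path in the command needs to match the actual package name (or the repo needs to be installed / on `PYTHONPATH`).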

bug:
def convnextv2_small(pretrained=False, **kwargs) -> ConvNeXt:
/root/autodl-tmp/hongan/NewFolder/llava-hr/LLaVA-HR-main/llava_hr/model/multimodal_encoder/convnext.py:1072: UserWarning: Overwriting convnextv2_base in registry with llava_hr.model.multimodal_encoder.convnext.convnextv2_base. This is because the name being registered conflicts with an existing name. Please check if this is not expected.
def convnextv2_base(pretrained=False, **kwargs) -> ConvNeXt:
/root/autodl-tmp/hongan/NewFolder/llava-hr/LLaVA-HR-main/llava_hr/model/multimodal_encoder/convnext.py:1079: UserWarning: Overwriting convnextv2_large in registry with llava_hr.model.multimodal_encoder.convnext.convnextv2_large. This is because the name being registered conflicts with an existing name. Please check if this is not expected.
def convnextv2_large(pretrained=False, **kwargs) -> ConvNeXt:
/root/autodl-tmp/hongan/NewFolder/llava-hr/LLaVA-HR-main/llava_hr/model/multimodal_encoder/convnext.py:1086: UserWarning: Overwriting convnextv2_huge in registry with llava_hr.model.multimodal_encoder.convnext.convnextv2_huge. This is because the name being registered conflicts with an existing name. Please check if this is not expected.
def convnextv2_huge(pretrained=False, **kwargs) -> ConvNeXt:
Traceback (most recent call last):
File "/root/miniconda3/lib/python3.8/runpy.py", line 185, in _run_module_as_main
mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
File "/root/miniconda3/lib/python3.8/runpy.py", line 111, in _get_module_details
__import__(pkg_name)
File "/root/autodl-tmp/hongan/NewFolder/llava-hr/LLaVA-HR-main/llava_hr/__init__.py", line 1, in <module>
from .model import LlavaLlamaForCausalLM
File "/root/autodl-tmp/hongan/NewFolder/llava-hr/LLaVA-HR-main/llava_hr/model/__init__.py", line 1, in <module>
from .language_model.llava_llama import LlavaLlamaForCausalLM, LlavaConfig
File "/root/autodl-tmp/hongan/NewFolder/llava-hr/LLaVA-HR-main/llava_hr/model/language_model/llava_llama.py", line 139, in <module>
AutoConfig.register("llava_hr", LlavaConfig)
File "/root/miniconda3/lib/python3.8/site-packages/transformers/models/auto/configuration_auto.py", line 1123, in register
raise ValueError(
ValueError: The config you are passing has a model_type attribute that is not consistent with the model type you passed (config has llava and you passed llava_hr. Fix one of those so they match!)
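(Side note on the ValueError: it usually means the checkpoint's config.json declares a different model_type than the one the code registers. A minimal sketch for checking this — `check_model_type` is a hypothetical helper for illustration, not part of the repo:)

```python
import json
import os
import tempfile

def check_model_type(checkpoint_dir, expected="llava_hr"):
    # Read the checkpoint's config.json and compare its model_type
    # against the type the code registers ("llava_hr" in LLaVA-HR).
    with open(os.path.join(checkpoint_dir, "config.json")) as f:
        config = json.load(f)
    return config.get("model_type"), config.get("model_type") == expected

# Demo with a fake checkpoint dir mimicking the mismatch in the traceback:
# the config declares "llava" while the code expects "llava_hr".
d = tempfile.mkdtemp()
with open(os.path.join(d, "config.json"), "w") as f:
    json.dump({"model_type": "llava"}, f)

print(check_model_type(d))  # ('llava', False)
```

If the tuple's second element is False for your downloaded checkpoint, the config and the code disagree, which is exactly what the ValueError reports.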

Can you add model card info to your Hugging Face repo?
