Add support for phi-4-mini and phi-4-multimodal · Issue #3071 · huggingface/text-generation-inference · GitHub
@farzanehnakhaee70

Description

Feature request

Currently, neither the phi-4-mini model nor the phi-4-multimodal model can be loaded with TGI version 3.1.1. phi-4-mini needs transformers version 4.49.0, and phi-4-multimodal fails with the following error:

Unsupported model type phi4mm

Is there any possibility of supporting these two models?
Also, is it possible to use the TensorRT-LLM backend?
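For anyone hitting the phi-4-mini side of this, a minimal sketch of the version gate involved (the 4.49.0 requirement is from this report; the helper below is hypothetical, shown only to illustrate comparing the installed transformers version against it before attempting a load):

```python
# Sketch: check whether an installed transformers version meets
# phi-4-mini's reported requirement (>= 4.49.0). The helper name
# and the comparison logic are illustrative, not part of TGI.
def meets_requirement(installed: str, required: str = "4.49.0") -> bool:
    # Compare dotted version strings numerically, field by field.
    to_tuple = lambda v: tuple(int(x) for x in v.split("."))
    return to_tuple(installed) >= to_tuple(required)

print(meets_requirement("4.48.3"))  # False: too old for phi-4-mini
print(meets_requirement("4.49.0"))  # True
```

In practice one would use `packaging.version.parse` rather than a hand-rolled tuple comparison, but the point is the same: TGI 3.1.1 pins an older transformers, so the check fails before the model config is even reached.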

Motivation

Resolve these issues so that phi-4-mini and phi-4-multimodal can be deployed with TGI.

Your contribution

Upgrade transformers to version 4.49.0 and add support for the multimodal model.
