
paligemma2-3B-mix chat does not reply correctly, but the official transformers code works #7180

@hanggun

Description

Reminder

  • I have read the above rules and searched the existing issues.

System Info

  • llamafactory version: 0.9.2.dev0
  • Platform: Linux-5.15.0-86-generic-x86_64-with-glibc2.35
  • Python version: 3.12.3
  • PyTorch version: 2.6.0+cu124 (GPU)
  • Transformers version: 4.49.0
  • Datasets version: 3.2.0
  • Accelerate version: 1.2.1
  • PEFT version: 0.12.0
  • TRL version: 0.9.6
  • GPU type: NVIDIA A100-PCIE-40GB
  • GPU number: 8
  • GPU memory: 39.39GB

Reproduction

[screenshot: run configuration]

I ran with the configuration above, and the model replied that the VLM could not recognize the image.
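The actual configuration is only visible in the screenshot, so for context, a hypothetical LLaMA-Factory chat config for this model might look like the fragment below (the checkpoint id and template name are assumptions, not taken from the report):

```yaml
# Hypothetical llamafactory-cli chat config; the original config is
# only shown in the screenshot above, so every value here is assumed.
model_name_or_path: google/paligemma2-3b-mix-448
template: paligemma
infer_backend: huggingface
```

It would be run with something like `llamafactory-cli chat config.yaml`.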

[screenshot: chat output]

Although it is not the same image, the official transformers code does produce a description.
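For reference, the plain-transformers usage being compared against is roughly the sketch below. The checkpoint id, task prompt, and generation settings are assumptions; the "mix" checkpoints take task-prefix prompts such as `describe en` rather than chat-style messages:

```python
# Sketch of PaliGemma2 mix inference with plain transformers.
# Assumed: checkpoint id and the "describe en" task prompt.
MODEL_ID = "google/paligemma2-3b-mix-448"  # assumed checkpoint


def mix_prompt(task: str, lang: str = "en") -> str:
    """Mix checkpoints expect task-prefix prompts, e.g. 'describe en'."""
    return f"{task} {lang}"


def describe(image) -> str:
    """Generate a description for a PIL image with the mix checkpoint."""
    # Imports are kept inside the function so the prompt helper above
    # can be used without torch/transformers installed.
    import torch
    from transformers import AutoProcessor, PaliGemmaForConditionalGeneration

    model = PaliGemmaForConditionalGeneration.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    ).eval()
    processor = AutoProcessor.from_pretrained(MODEL_ID)
    inputs = processor(
        text=mix_prompt("describe"), images=image, return_tensors="pt"
    ).to(model.device)
    with torch.inference_mode():
        out = model.generate(**inputs, max_new_tokens=100, do_sample=False)
    # Strip the prompt tokens before decoding the generated text.
    return processor.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

The contrast with the LLaMA-Factory chat failure suggests the issue is in how the frontend formats the prompt or attaches the image, not in the checkpoint itself.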

Others

No response

Metadata

    Labels

    solved (this problem has already been solved)
