fix chatting with VLM model via CLI #3862

Merged
lvhan028 merged 1 commit into InternLM:main from lvhan028:fix-chat
Aug 21, 2025

Conversation

@lvhan028
Collaborator

Modification

- Use positional args instead of kwargs to resolve the TypeError:

  TypeError: VLAsyncEngine.chat() missing 1 required positional argument: 'prompts'

- Check sess.history before accessing it.
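The two fixes above can be illustrated with a minimal sketch. The class, signature, and session object below are hypothetical stand-ins modeled on the error message, not the actual lmdeploy code: if the method's signature accepts `**kwargs`, a mis-named keyword argument is silently absorbed and the required `prompts` parameter is never bound, whereas a positional call binds correctly regardless of the parameter's name.

```python
class VLAsyncEngine:
    # Hypothetical signature modeled on the error message; **kwargs
    # silently absorbs a mis-named keyword, so 'prompts' ends up missing.
    def chat(self, prompts, session=None, **kwargs):
        return f"vlm response to: {prompts}"


class Session:
    """Hypothetical session object; only the 'history' attribute matters here."""

    def __init__(self):
        self.history = []


engine = VLAsyncEngine()

# Keyword call with the wrong name: 'messages' is swallowed by **kwargs
# and 'prompts' is never bound, reproducing the TypeError from the PR.
try:
    engine.chat(messages="describe the image")
except TypeError as e:
    print(e)  # ... missing 1 required positional argument: 'prompts'

# The fix: pass the prompt positionally, which binds no matter how the
# parameter is named in the override.
print(engine.chat("describe the image"))

# Second fix: guard before indexing. On a fresh session, history is empty
# and sess.history[-1] would raise IndexError.
sess = Session()
last_turn = sess.history[-1] if sess.history else None
print(last_turn)
```

The positional call also stays correct if a subclass renames the first parameter of `chat()`, which a keyword call would not survive.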

@lvhan028 lvhan028 requested a review from irexyc August 21, 2025 04:47
@lvhan028 lvhan028 requested a review from zhulinJulia24 August 21, 2025 04:47
@lvhan028 lvhan028 merged commit 630f5b1 into InternLM:main Aug 21, 2025
5 checks passed
littlegy pushed a commit to littlegy/lmdeploy that referenced this pull request Sep 11, 2025

2 participants