Fix an issue when piping attn_logits_soft_cap through in vllm.#8600

Merged
lsy323 merged 14 commits into pytorch:master from fenghuizhang:master
Jan 22, 2025

Conversation

@fenghuizhang (Contributor) commented Jan 22, 2025

PR on vllm here: vllm-project/vllm#12294.

Found an issue when piping the soft cap through in vllm: https://buildkite.com/vllm/fastcheck/builds/12156#01948bc4-58cd-4863-9eca-e2ea098879f9

It looks like torch.compile wasn't able to trace the kernel because the argument was of float type, and we couldn't pass None into the function.
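The usual workaround for this class of tracing failure is to normalize the `Optional[float]` to a plain float before the traced call, using a sentinel value to mean "no cap". The sketch below is illustrative only: the function names, the `0.0` sentinel, and the `cap * tanh(logit / cap)` capping formula are assumptions about how `attn_logits_soft_cap` is applied, not the actual vLLM/PyTorch-XLA kernel API.

```python
import math

# Hypothetical sketch: keep the traced kernel's soft-cap argument a plain
# float so the compiler never sees None. 0.0 is an assumed sentinel
# meaning "soft cap disabled"; names are illustrative, not the real API.
_NO_CAP = 0.0

def normalize_soft_cap(attn_logits_soft_cap):
    """Map None to a float sentinel so the kernel argument is always a float."""
    if attn_logits_soft_cap is None:
        return _NO_CAP
    return float(attn_logits_soft_cap)

def soft_cap_logit(logit, attn_logits_soft_cap):
    """Apply cap * tanh(logit / cap) when a cap is set; identity otherwise.

    (The tanh-based capping formula is an assumption, matching how
    attention-logit soft capping is commonly described.)
    """
    cap = normalize_soft_cap(attn_logits_soft_cap)
    if cap == _NO_CAP:
        return logit
    return cap * math.tanh(logit / cap)
```

With this shape, `None` is resolved on the Python side before tracing, and the traced code path only ever branches on a concrete float.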

@lsy323 merged commit 5b877be into pytorch:master Jan 22, 2025
