This repository was archived by the owner on Jun 13, 2025. It is now read-only.

Build rocm vLLM image #10

@edamamez

Description


Build attempts:

- Lamini v0.6.5 → ran into a wheel issue
- Lamini v0.6.5 with cherry-picked commit from vllm-project/vllm#12172
- vLLM v0.6.5 with the same cherry-picked commit
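For reference, the third configuration can be reproduced roughly as follows. This is a sketch, not the exact commands used; the concrete commit hash from vllm-project/vllm#12172 is not stated in this issue and is left as a placeholder.

```shell
# Hypothetical reproduction of the "vLLM v0.6.5 + cherry-picked commit" build.
# <commit-from-PR-12172> is a placeholder -- look up the actual commit on
# vllm-project/vllm#12172; it is not given in this issue.
git clone https://github.com/vllm-project/vllm.git
cd vllm
git checkout v0.6.5
git cherry-pick <commit-from-PR-12172>

# Build the ROCm image (Dockerfile.rocm sits at the repo root in v0.6.5).
docker build -f Dockerfile.rocm -t vllm-rocm .
```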

8.190 /opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/utils/_pytree.py:185: FutureWarning: optree is installed but the version is too old to support PyTorch Dynamo in C++ pytree. C++ pytree support is disabled. Please consider upgrading optree using `python3 -m pip install --upgrade 'optree>=0.13.0'`.
8.190   warnings.warn(
9.359 No ROCm runtime is found, using ROCM_HOME='/opt/rocm'
9.424 /vllm-workspace/libs/flash-attention/setup.py:95: UserWarning: flash_attn was requested, but nvcc was not found.  Are you sure your environment has nvcc available?  If you're installing within a container from https://hub.docker.com/r/pytorch/pytorch, only images whose names contain 'devel' will provide nvcc.
9.424   warnings.warn(
9.425 torch.__version__  = 2.6.0+cu124
9.425 Traceback (most recent call last):
9.425   File "/vllm-workspace/libs/flash-attention/setup.py", line 179, in <module>
9.426     CUDAExtension(
9.426   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/utils/cpp_extension.py", line 1130, in CUDAExtension
9.426     library_dirs += library_paths(device_type="cuda")
9.426   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/utils/cpp_extension.py", line 1264, in library_paths
9.426     if (not os.path.exists(_join_cuda_home(lib_dir)) and
9.426   File "/opt/conda/envs/py_3.9/lib/python3.9/site-packages/torch/utils/cpp_extension.py", line 2525, in _join_cuda_home
9.426     raise OSError('CUDA_HOME environment variable is not set. '
9.426 OSError: CUDA_HOME environment variable is not set. Please set it to your CUDA install root.
------
Dockerfile.rocm:82
--------------------
  81 |     # Build ROCm flash-attention wheel if `BUILD_FA = 1`
  82 | >>> RUN --mount=type=cache,target=${CCACHE_DIR} \
  83 | >>>     if [ "$BUILD_FA" = "1" ]; then \
  84 | >>>         mkdir -p libs \
  85 | >>>         && cd libs \
  86 | >>>         && git clone https://github.com/ROCm/flash-attention.git \
  87 | >>>         && cd flash-attention \
  88 | >>>         && git checkout "${FA_BRANCH}" \
  89 | >>>         && git submodule update --init \
  90 | >>>         && GPU_ARCHS="${FA_GFX_ARCHS}" python3 setup.py bdist_wheel --dist-dir=/install; \
  91 | >>>     # Create an empty directory otherwise as later build stages expect one
  92 | >>>     else mkdir -p /install; \
  93 | >>>     fi
  94 |
--------------------
ERROR: failed to solve: process "/bin/sh -c if [ \"$BUILD_FA\" = \"1\" ]; then         mkdir -p libs         && cd libs         && git clone https://github.com/ROCm/flash-attention.git         && cd flash-attention         && git checkout \"${FA_BRANCH}\"         && git submodule update --init         && GPU_ARCHS=\"${FA_GFX_ARCHS}\" python3 setup.py bdist_wheel --dist-dir=/install;     else mkdir -p /install;     fi" did not complete successfully: exit code: 1
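The `torch.__version__ = 2.6.0+cu124` line in the log suggests the build stage ended up with a CUDA build of PyTorch rather than a ROCm one, which is why flash-attention's setup.py took the `CUDAExtension` path and then failed on the unset `CUDA_HOME`. As a minimal sketch (assuming only a POSIX shell; the function name is hypothetical), the wheel flavor can be read off the version string's local tag:

```shell
# Hypothetical helper: classify a torch wheel by its local version tag.
# "+rocm" tags mark ROCm builds, "+cu" tags mark CUDA builds; anything
# else is a CPU-only wheel.
classify_torch_wheel() {
    case "$1" in
        *+rocm*) echo rocm ;;
        *+cu*)   echo cuda ;;
        *)       echo cpu ;;
    esac
}

classify_torch_wheel "2.6.0+cu124"     # the version printed in the log: prints "cuda"
classify_torch_wheel "2.6.0+rocm6.2"   # what a ROCm wheel would look like: prints "rocm"
```

If a CUDA wheel is indeed the cause, possible directions are to make the base stage install the ROCm torch wheel before the flash-attention stage runs, or to skip the stage entirely with `--build-arg BUILD_FA=0` (Dockerfile.rocm already creates the empty `/install` directory that later stages expect). Both are untested suggestions, not verified against this build.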
