Commit 6c05755

nono-SangJohnsonms authored and committed
[diffusion] fix: fix the bug of redundant memory usage on GPU-0 (sgl-project#18221)
1 parent 68fa2ea · commit 6c05755

File tree

1 file changed: +3 −0 lines changed
  • python/sglang/multimodal_gen/runtime/platforms

python/sglang/multimodal_gen/runtime/platforms/cuda.py

Lines changed: 3 additions & 0 deletions

@@ -124,6 +124,9 @@ def get_available_gpu_memory(
     if empty_cache:
         torch.cuda.empty_cache()

+    if torch.distributed.is_initialized():
+        device_id = torch.distributed.get_rank()
+
     device_props = torch.cuda.get_device_properties(device_id)
     if device_props.is_integrated:
         free_gpu_memory = psutil.virtual_memory().available
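To make the fix's effect concrete: before the patch, every distributed worker queried the default device (GPU 0) when measuring available memory; the added guard makes each initialized rank query its own GPU instead. Below is a minimal, torch-free sketch of that selection logic — `select_device_id` is a hypothetical helper written for illustration, not part of sglang:

```python
def select_device_id(default_device_id: int,
                     distributed_initialized: bool,
                     rank: int) -> int:
    """Pick which GPU index to query for device properties.

    Mirrors the patch: when torch.distributed is initialized, each
    worker uses its own rank as the device index instead of all workers
    touching the default device, which the commit message says caused
    redundant memory usage on GPU-0.
    """
    if distributed_initialized:
        return rank
    return default_device_id


if __name__ == "__main__":
    # Four distributed workers each resolve a distinct device index:
    print([select_device_id(0, True, r) for r in range(4)])   # [0, 1, 2, 3]
    # Without distributed initialization, everyone stays on the default:
    print([select_device_id(0, False, r) for r in range(4)])  # [0, 0, 0, 0]
```

Note the patch uses the global rank as the device index, which matches the common single-node setup where rank equals the local GPU index; the sketch makes the same assumption.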
