
Commit

Update sampling_metadata.py
temporary change based on #246 for testing only, will remove later
michalkuligowski authored Oct 24, 2024
1 parent 3c423a4 commit 7fd552a
Showing 1 changed file with 4 additions and 2 deletions.
6 changes: 4 additions & 2 deletions vllm/model_executor/sampling_metadata.py
```diff
@@ -266,8 +266,10 @@ def _prepare_seq_groups(

         if seq_group_metadata.is_prompt:
             if sampling_params.seed is not None:
-                generator = torch.Generator(device=device).manual_seed(
-                    sampling_params.seed)
+                import habana_frameworks.torch.hpu.random as htrandom
+                generator = \
+                    htrandom.default_generators[
+                        0].manual_seed(sampling_params.seed)
                 if generators is not None:
                     generators[seq_group_metadata.request_id] = generator
```
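The hunk swaps the device-agnostic `torch.Generator` for Gaudi's HPU default generator (`htrandom.default_generators[0]`), seeded the same way, so that seeded sampling runs on Habana hardware. The surrounding bookkeeping — one deterministically seeded generator per `request_id`, skipped when no seed is set — can be sketched with stdlib `random.Random` standing in for the device generator. This is a minimal illustration of the pattern; `prepare_generators` and the tuple input are hypothetical helpers, not vLLM API:

```python
import random

def prepare_generators(seq_groups, generator_factory=random.Random):
    """Map each request_id to its own seeded generator.

    `seq_groups` is a list of (request_id, seed) pairs; requests
    without a seed get no dedicated generator. `generator_factory`
    stands in for torch.Generator(...).manual_seed(seed).
    """
    generators = {}
    for request_id, seed in seq_groups:
        if seed is not None:
            # One generator per request, seeded deterministically,
            # so the same seed reproduces the same sample stream.
            generators[request_id] = generator_factory(seed)
    return generators

gens = prepare_generators([("req-1", 42), ("req-2", None), ("req-3", 42)])
assert gens["req-1"].random() == gens["req-3"].random()  # same seed, same stream
assert "req-2" not in gens  # unseeded request: no dedicated generator
```

Keying generators by `request_id` is what lets per-request seeds coexist in one batch: each request draws from its own stream regardless of how requests are scheduled together.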

