Commit
enable custom_ops
ProExpertProg committed Oct 11, 2024
1 parent 8c887b6 commit 272326b
Showing 1 changed file with 3 additions and 3 deletions: vllm/model_executor/custom_op.py
@@ -55,9 +55,9 @@ def forward_gaudi(self, *args, **kwargs):
     def dispatch_forward(self):
         # NOTE(woosuk): Here we assume that vLLM was built for only one
         # specific backend. Currently, we do not support dynamic dispatching.
-
-        if envs.VLLM_TORCH_COMPILE_LEVEL >= CompilationLevel.INDUCTOR:
-            return self.forward_native
+        #
+        # if envs.VLLM_TORCH_COMPILE_LEVEL >= CompilationLevel.INDUCTOR:
+        #     return self.forward_native
 
         if is_hip():
             return self.forward_hip
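The change comments out the early return to `forward_native` when Inductor-level compilation is enabled, so dispatch always falls through to a backend-specific implementation (hence "enable custom_ops"). A minimal, self-contained sketch of this dispatch pattern follows; everything beyond the names visible in the diff (`dispatch_forward`, `forward_native`, `forward_hip`, `is_hip`) — including `forward_cuda`, the `is_hip` stub, and the `DoubleOp` example class — is an assumption for illustration, not vLLM's actual code.

```python
def is_hip() -> bool:
    # Hypothetical stub: the real helper detects a ROCm/HIP build of torch.
    return False


class CustomOp:
    """Selects a backend-specific forward implementation once, at init."""

    def __init__(self):
        # Dispatch is resolved a single time; forward() then calls the
        # chosen bound method directly with no per-call branching.
        self._forward_method = self.dispatch_forward()

    def forward(self, *args, **kwargs):
        return self._forward_method(*args, **kwargs)

    def dispatch_forward(self):
        # With the compile-level early return commented out (as in the
        # diff above), we always reach the backend checks below.
        if is_hip():
            return self.forward_hip
        return self.forward_cuda

    def forward_native(self, *args, **kwargs):
        raise NotImplementedError

    def forward_cuda(self, *args, **kwargs):
        raise NotImplementedError

    def forward_hip(self, *args, **kwargs):
        # Assumed default: HIP falls back to the CUDA implementation.
        return self.forward_cuda(*args, **kwargs)


class DoubleOp(CustomOp):
    # Hypothetical op used only to demonstrate dispatch.
    def forward_cuda(self, x):
        return x * 2


op = DoubleOp()
print(op.forward(3))  # dispatched to forward_cuda since is_hip() is False
```

Because the compile-level branch is disabled, every op instance binds its hardware-specific kernel rather than the pure-PyTorch `forward_native` path.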
