Support FP16 for user buffer #690

Merged · 6 commits · Mar 7, 2024
Changes from 1 commit
3 changes: 2 additions & 1 deletion transformer_engine/pytorch/module/base.py
@@ -111,6 +111,7 @@ def initialize_ub(
     shape: list,
     tp_size: int,
     use_fp8: bool = False,
+    dtype: torch.dtype = torch.bfloat16,
     ub_cfgs: Optional[dict] = None
 ) -> None:
     """Initialize communicators for TP comm overlap using userbuffers."""
@@ -151,7 +152,7 @@ def add_ub(
         num_splits: int = 4,
         aggregate: int = 0,
     ) -> None:
-        dtype = torch.uint8 if (use_fp8 and name in fp8_buf) else torch.bfloat16
+        dtype = torch.uint8 if (use_fp8 and name in fp8_buf) else dtype
         sample_buffer = torch.empty(shape, dtype=dtype, device='cuda')
         if method == 'ring_exchange':
             ub_obj = tex.UbufP2PCommOverlap(
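The one-line change in `add_ub` keeps `torch.uint8` storage for FP8-capable buffers (FP8 data is stored as raw bytes) and otherwise falls back to the user-supplied `dtype` rather than a hard-coded `torch.bfloat16`. A standalone sketch of that selection logic; the helper name and the buffer names in the example are hypothetical:

```python
import torch

def select_ub_dtype(name, use_fp8, fp8_buf, dtype):
    # FP8 userbuffers are allocated as raw uint8 bytes; all other buffers
    # use the dtype passed to initialize_ub (formerly always torch.bfloat16).
    return torch.uint8 if (use_fp8 and name in fp8_buf) else dtype

fp8_buf = {"qkv_fprop", "fc1_fprop"}  # hypothetical FP8-capable buffer names
assert select_ub_dtype("qkv_fprop", True, fp8_buf, torch.float16) is torch.uint8
assert select_ub_dtype("proj_dgrad", True, fp8_buf, torch.float16) is torch.float16
assert select_ub_dtype("qkv_fprop", False, fp8_buf, torch.float16) is torch.float16
```

Taking `dtype` as an explicit parameter, as in this sketch, also avoids rebinding a name captured from an enclosing scope, which is a concern when the selection logic lives in a nested function such as `add_ub`.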