RPE on top of causal attention #453

Draft · wants to merge 21 commits into main
Conversation

nicolasvasilache
Contributor

No description provided.

ftynse and others added 19 commits February 3, 2025 23:25
Signed-off-by: Alex Zinenko <[email protected]>
Signed-off-by: Nicolas Vasilache <[email protected]>
Signed-off-by: Nicolas Vasilache <[email protected]>
This revision uses `tkw.apply_expr` to circumvent type mismatches such as:

```
ValueError: Expected an fx.Node but got <class 'int'>
ValueError: Expected an fx.Node but got <class 'Symbol'>
```

This further requires supporting index_cast in `tkw.cast`;
playground/vanilla_attention.py now produces valid IR.

Signed-off-by: Nicolas Vasilache <[email protected]>
Refactor and add a default torch implementation against which we check with `allclose`.
Set sizes to known-good values that pass the checks; it is easy to fall off the cliff with other size combinations.
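A default torch reference for causal attention with a relative position bias might look like the following minimal sketch. This is a hypothetical illustration, not the PR's actual reference implementation; the function name, the per-distance bias layout, and the chosen sizes are all assumptions.

```python
import torch

def causal_attention_with_rpe(q, k, v, rpe_bias):
    # Hypothetical reference (not the PR's code).
    # q, k, v: (batch, seq, dim); rpe_bias: (2*seq - 1,) one bias per query-key distance.
    b, s, d = q.shape
    scores = q @ k.transpose(-1, -2) / d**0.5          # (b, s, s)
    # Gather the per-distance bias onto the (query, key) score grid.
    idx = torch.arange(s)
    dist = idx[:, None] - idx[None, :] + s - 1          # distances shifted into [0, 2s-2]
    scores = scores + rpe_bias[dist]
    # Causal mask: a query may only attend to keys at or before its own position.
    mask = torch.triu(torch.ones(s, s, dtype=torch.bool), diagonal=1)
    scores = scores.masked_fill(mask, float("-inf"))
    return torch.softmax(scores, dim=-1) @ v

torch.manual_seed(0)
q, k, v = (torch.randn(2, 8, 16) for _ in range(3))
rpe_bias = torch.randn(2 * 8 - 1)
out = causal_attention_with_rpe(q, k, v, rpe_bias)
```

A kernel under test would then be compared against `out` with `torch.allclose` at known-good sizes.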

Additionally, with the following install, one can remove the in-place hack:
```
pip install -r pytorch-rocm-requirements.txt  -e .
```

Signed-off-by: Nicolas Vasilache <[email protected]>
Signed-off-by: Nicolas Vasilache <[email protected]>
Signed-off-by: Nicolas Vasilache <[email protected]>
Signed-off-by: Stanley Winata <[email protected]>
@nicolasvasilache force-pushed the rpe_on_top_of_stans branch 2 times, most recently from 06180db to 42a7fb4 on February 4, 2025 17:35
Signed-off-by: Nicolas Vasilache <[email protected]>
Labels: none yet · Projects: none yet

3 participants