
Add softmaxcrossentropy op #64

Merged

Conversation

@TedThemistokleous TedThemistokleous commented Sep 7, 2024

Description

Adds support for the softmaxcrossentropyloss operator by removing the check that marked the op as unsupported.

Updates the MIGraphX EP's supported-op check to leverage the same API call used by `migraphx-driver onnx --list`, which returns the list of ONNX operator parsers built into MIGraphX. This stops us from having to continually update the EP when a new operator is added; the supported-op list now updates automatically.
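The change described above replaces per-operator checks in the EP with a membership test against a list queried from the backend. A minimal sketch of that pattern, in Python rather than the EP's actual C++ and with a hypothetical `query_supported_ops` standing in for the MIGraphX API call:

```python
# Illustrative sketch only, not the actual MIGraphX EP code.
# The idea: query the backend once for its supported ONNX operators
# (as `migraphx-driver onnx --list` does), then answer support checks
# by set membership instead of hard-coded per-operator logic.

def query_supported_ops() -> set[str]:
    # Hypothetical stand-in for the MIGraphX API call; a fixed sample
    # is returned here so the sketch is self-contained.
    return {"Add", "MatMul", "Softmax", "SoftmaxCrossEntropyLoss"}

SUPPORTED_OPS = frozenset(query_supported_ops())

def is_op_supported(op_type: str) -> bool:
    # Newly added backend parsers are picked up automatically,
    # with no edits to this check.
    return op_type in SUPPORTED_OPS
```

With this shape, adding an operator to the backend makes it visible to the EP without any EP-side change.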

Motivation and Context

Used for runs with LlamaV2 and newer models; allows us to run these fully on the GPU.

Won't merge until ROCm/AMDMIGraphX#3008 is merged into MIGraphX.

Should be fine to merge with the latest change, assuming it works. Will test end to end once the other PR is merged.

@TedThemistokleous TedThemistokleous self-assigned this Sep 7, 2024
@TedThemistokleous TedThemistokleous force-pushed the add_softmaxcross_entropy_op branch from 4048831 to 36cd087 Compare September 17, 2024 15:46
@TedThemistokleous TedThemistokleous merged commit 8db3c3d into rocm6.3_internal_testing Sep 17, 2024
16 of 26 checks passed