
Full state-of-the-art blas/lapack support #12

Open
cnuernber opened this issue Jan 19, 2019 · 0 comments

Comments

@cnuernber
Collaborator

cnuernber commented Jan 19, 2019

An integration with MAGMA would provide state-of-the-art LAPACK support for the CUDA and ROCm platforms.

This, along with JNA bindings to LAPACK in tech.compute, would, I think, be enough to enable a good portion of scientific computing across multiple backends, assuming the basic tensor ops were implemented for each backend (currently only assignment is done).
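To give a sense of what the proposed bindings would expose: a JNA binding would surface raw LAPACK routines such as `dgesv` (solve A x = b via LU factorization) to the JVM. As a minimal sketch of what calling such a routine looks like, here is the same `dgesv` call made through SciPy's existing LAPACK wrappers — SciPy and the example matrix are my own illustration, not part of this issue or of tech.compute:

```python
import numpy as np
from scipy.linalg import lapack

# Solve A x = b with LAPACK's dgesv (LU factorization with partial
# pivoting) -- the same routine a JNA binding would expose to the JVM.
a = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

lu, piv, x, info = lapack.dgesv(a, b)  # info == 0 means success
print(x)  # solution of the 2x2 system: [2. 3.]
```

A JNA interface would declare the Fortran symbol (`dgesv_`) directly against the shared library, with the same argument layout (`n`, `nrhs`, `a`, `lda`, `ipiv`, `b`, `ldb`, `info`) passed by reference.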

A very interesting potential side project: integrating LAPACK and MAGMA into TVM would, I think, be well received by the TVM and MAGMA communities. That work could happen independently of tvm-clj, and the results would then be easy to use from the tvm-clj system.

Labels: None yet
Projects: None yet
Development: No branches or pull requests
1 participant