This repository has been archived by the owner on Oct 9, 2023. It is now read-only.

How to use Lightning-Flash with TPU on Kaggle? #632

Answered by akihironitta
prikmm asked this question in Q&A
@prikmm Hi, thanks for opening a discussion! The error may be caused by a mismatched pair of PyTorch and torch_xla versions. Flash is currently tested only with torch_xla==1.8, so please use that version and install the torch version that matches it. You can check the installed versions with `pip list | grep torch`.

Related issue: Lightning-AI/pytorch-lightning#8315
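The compatibility check described above (torch and torch_xla must share the same release, e.g. both 1.8.x) can be sketched in Python. The `versions_match` helper below is hypothetical, not part of Flash or torch_xla; it only illustrates the rule of comparing the major.minor prefix of each package's version string:

```python
def versions_match(torch_version: str, xla_version: str) -> bool:
    """Return True if torch and torch_xla share the same major.minor release.

    Hypothetical helper for illustration only. Local build metadata after
    '+' (e.g. '1.8.1+cu102') is ignored when comparing.
    """
    def major_minor(version: str) -> tuple:
        return tuple(version.split("+")[0].split(".")[:2])

    return major_minor(torch_version) == major_minor(xla_version)


# torch 1.8.x pairs with torch_xla 1.8.x ...
print(versions_match("1.8.1+cu102", "1.8.1"))  # True
# ... but torch 1.9.x with torch_xla 1.8.x is the mismatch that breaks TPU runs.
print(versions_match("1.9.0", "1.8.1"))        # False
```

On Kaggle, you would run `pip list | grep torch` first and, if the versions disagree, reinstall torch to match the torch_xla 1.8 wheel.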

Replies: 1 comment 3 replies
Answer selected by akihironitta