
Possible GPU Memory Leak? #190 (Closed)

ProfFan opened this issue on Sep 30, 2020 · 1 comment
Labels: bug (Something isn't working)

ProfFan commented Sep 30, 2020

The notebook cannot be run with the GPU build of S4TF because of constant OOMs when calculating the loss. It appears that GPU memory is already filled after:

let data = BeeVideo(videoName: "bee_video_1")!
print(data.frames.count)
print(data.tracks.count)
> nvidia-smi
Wed Sep 30 11:00:26 2020
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 455.23.04    Driver Version: 455.23.04    CUDA Version: 11.1     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  GeForce GTX 1080    Off  | 00000000:01:00.0  On |                  N/A |
| 28%   54C    P2    53W / 180W |   7540MiB /  8119MiB |      1%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A      1121      G   /usr/lib/Xorg                     102MiB |
|    0   N/A  N/A      2609      G   /usr/lib/Xorg                     545MiB |
|    0   N/A  N/A      3261      G   ...akonadi_archivemail_agent        2MiB |
|    0   N/A  N/A      3264      G   .../bin/akonadi_ews_resource       14MiB |
|    0   N/A  N/A      3268      G   ...bin/akonadi_imap_resource        2MiB |
|    0   N/A  N/A      3279      G   .../akonadi_mailfilter_agent        2MiB |
|    0   N/A  N/A      3288      G   ...n/akonadi_sendlater_agent        2MiB |
|    0   N/A  N/A      3291      G   ...nadi_unifiedmailbox_agent        2MiB |
|    0   N/A  N/A      3879      G   ...AAAAAAAAA= --shared-files      309MiB |
|    0   N/A  N/A      4179      G   /usr/bin/krunner                   11MiB |
|    0   N/A  N/A      4285      G   /usr/bin/alacritty                 11MiB |
|    0   N/A  N/A    349438      G   /usr/bin/plasmashell              160MiB |
|    0   N/A  N/A   3385747      G   ...AAAAAAAAA= --shared-files       39MiB |
|    0   N/A  N/A   4011142      C   ...lchain/usr/bin/repl_swift     6313MiB |
+-----------------------------------------------------------------------------+
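
As a rough sanity check, the following back-of-the-envelope estimate compares the size of the decoded frames with the ~6.3 GiB held by repl_swift above. This is only a sketch and assumes data.frames is an array of Tensor<Float>; the actual element type of BeeVideo is not verified here.

// Rough footprint of the decoded frames, in MiB of raw scalars.
let bytesPerFrame = data.frames[0].scalarCount * MemoryLayout<Float>.size
let totalMiB = data.frames.count * bytesPerFrame / (1024 * 1024)
print("\(data.frames.count) frames ≈ \(totalMiB) MiB of raw scalars")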

However, when I look at the device of the Tensors, they are all on the CPU:

print(beeBatch.device)
print(data.frames[0].device)
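
For reference, here is a minimal sketch of how placement can be inspected and pinned explicitly with the Device API from swift-apis, assuming an X10-enabled toolchain; data.frames is reused from the snippet above.

import TensorFlow

// Where would newly created tensors land by default?
print("Device.default:        \(Device.default)")        // X10 default device
print("Device.defaultTFEager: \(Device.defaultTFEager)") // TF eager default device

// Explicitly copy one frame to a chosen device and confirm the placement sticks.
let frame = data.frames[0]
let copied = Tensor(copying: frame, to: Device.default)
print("original:", frame.device, "copy:", copied.device)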

For now I'll just use the CPU-only build of S4TF as a workaround.

ProfFan added the bug (Something isn't working) label on Sep 30, 2020

ProfFan commented Oct 2, 2020

Closing, as this is an upstream issue:

tensorflow/swift-apis#1089
tensorflow/swift-apis#1094

ProfFan closed this as completed on Oct 2, 2020