Actions: Dobiasd/frugally-deep

Showing runs from all workflows
183 workflow runs

Auto-format C++ code (#411)
ci #569: Commit afdaea7 pushed by Dobiasd
January 1, 2024 13:20, duration 9m 6s, branch master

Auto-format C++ code
ci #568: Pull request #411 synchronize by Dobiasd
January 1, 2024 13:09, duration 9m 24s, branch auto-format-code

format
ci #567: Commit 2897ddc pushed by Dobiasd
January 1, 2024 13:09, duration 9m 21s, branch auto-format-code

Auto-format C++ code
ci #566: Pull request #411 opened by Dobiasd
January 1, 2024 13:01, duration 10m 10s, branch auto-format-code

Auto-format C++ code
ci #565: Commit 419954b pushed by Dobiasd
January 1, 2024 13:00, duration 9m 11s, branch auto-format-code

Re-order imports
ci #564: Commit eb40909 pushed by Dobiasd
January 1, 2024 12:54, duration 8m 57s, branch master

Revert debug changes in application_performance.cpp
ci #563: Commit 2479061 pushed by Dobiasd
January 1, 2024 09:25, duration 9m 23s, branch master

Remove redundant internal function subtract_tensor
ci #562: Commit 646c71d pushed by Dobiasd
January 1, 2024 09:15, duration 9m 23s, branch master

Merge branch 'master' of https://github.com/Dobiasd/frugally-deep
ci #561: Commit f82b6fd pushed by Dobiasd
January 1, 2024 09:14, duration 9m 19s, branch master

Update TensorFlow to version 2.15.0
ci #560: Commit 5b44839 pushed by Dobiasd
January 1, 2024 09:09, duration 8m 48s, branch master

Update TensorFlow to version 2.15.0
ci #559: Pull request #410 opened by Dobiasd
January 1, 2024 08:44, duration 8m 50s, branch tensorflow-2-15-0

Update TensorFlow to version 2.15.0
ci #558: Commit c0c060e pushed by Dobiasd
January 1, 2024 08:44, duration 10m 4s, branch tensorflow-2-15-0

Bump version to 0.15.30
ci #557: Commit 6e673fb pushed by Dobiasd
December 31, 2023 18:11, duration 9m 4s, branch v0.15.30

Bump version to 0.15.30
ci #556: Commit 6e673fb pushed by Dobiasd
December 31, 2023 18:10, duration 9m 30s, branch master

Add MultiHeadAttention layer (#392)
ci #555: Commit 7104dd0 pushed by Dobiasd
December 31, 2023 18:09, duration 9m 43s, branch master

Add MultiHeadAttention layer
ci #554: Pull request #392 synchronize by Dobiasd
December 31, 2023 17:58, duration 9m 19s, branch multi-head-attention

Add MultiHeadAttention layer
ci #552: Pull request #392 synchronize by Dobiasd
December 31, 2023 17:54, duration 9m 19s, branch multi-head-attention

Revert debug output
ci #551: Commit b35deb9 pushed by Dobiasd
December 31, 2023 17:54, duration 7m 31s, branch multi-head-attention

Add MultiHeadAttention layer
ci #550: Pull request #392 synchronize by Dobiasd
December 31, 2023 17:52, duration 9m 37s, branch multi-head-attention

double-check weights shapes
ci #549: Commit 41ac53a pushed by Dobiasd
December 31, 2023 17:52, duration 8m 57s, branch multi-head-attention

Add MultiHeadAttention layer
ci #548: Pull request #392 synchronize by Dobiasd
December 31, 2023 17:43, duration 9m 19s, branch multi-head-attention

Do not pass unused attention_axes
ci #547: Commit a95abf4 pushed by Dobiasd
December 31, 2023 17:43, duration 9m 15s, branch multi-head-attention

Add MultiHeadAttention layer
ci #546: Pull request #392 synchronize by Dobiasd
December 31, 2023 12:58, duration 9m 23s, branch multi-head-attention

Check for attention_axes=None in conversion
ci #545: Commit fd6e7c4 pushed by Dobiasd
December 31, 2023 12:58, duration 9m 10s, branch multi-head-attention