Actions: Dobiasd/frugally-deep

ci

183 workflow runs

Add MultiHeadAttention layer
ci #544: Pull request #392 synchronize by Dobiasd
December 31, 2023 12:55 9m 6s multi-head-attention
remove todo comment
ci #543: Commit 0d2be86 pushed by Dobiasd
December 31, 2023 12:55 8m 59s multi-head-attention
Add MultiHeadAttention layer
ci #542: Pull request #392 synchronize by Dobiasd
December 31, 2023 12:50 9m 56s multi-head-attention
remove debug tests
ci #541: Commit 0c6cc0a pushed by Dobiasd
December 31, 2023 12:50 8m 59s multi-head-attention
Add MultiHeadAttention layer
ci #540: Pull request #392 synchronize by Dobiasd
December 31, 2023 12:50 9m 1s multi-head-attention
Add MultiHeadAttention layer
ci #538: Pull request #392 synchronize by Dobiasd
December 31, 2023 11:31 7m 42s multi-head-attention
shorten
ci #537: Commit 702cb60 pushed by Dobiasd
December 31, 2023 11:31 7m 29s multi-head-attention
Add MultiHeadAttention layer
ci #536: Pull request #392 synchronize by Dobiasd
December 31, 2023 11:18 7m 34s multi-head-attention
add more tests
ci #535: Commit db62540 pushed by Dobiasd
December 31, 2023 11:18 7m 43s multi-head-attention
Add MultiHeadAttention layer
ci #534: Pull request #392 synchronize by Dobiasd
December 31, 2023 11:13 7m 21s multi-head-attention
create dense output layer separately
ci #533: Commit 3176f89 pushed by Dobiasd
December 31, 2023 11:13 7m 37s multi-head-attention
Add MultiHeadAttention layer
ci #532: Pull request #392 synchronize by Dobiasd
December 30, 2023 17:49 7m 31s multi-head-attention
fix shapes, add tests
ci #531: Commit 5b3fbd2 pushed by Dobiasd
December 30, 2023 17:49 7m 18s multi-head-attention
Add MultiHeadAttention layer
ci #530: Pull request #392 synchronize by Dobiasd
December 30, 2023 17:10 7m 27s multi-head-attention
Add MultiHeadAttention layer
ci #528: Pull request #392 synchronize by Dobiasd
December 29, 2023 15:20 7m 38s multi-head-attention
adjust tests
ci #527: Commit 4184515 pushed by Dobiasd
December 29, 2023 15:20 7m 52s multi-head-attention
Add MultiHeadAttention layer
ci #526: Pull request #392 synchronize by Dobiasd
December 29, 2023 15:20 7m 25s multi-head-attention
apply dense layers to query, value and key
ci #525: Commit 118f663 pushed by Dobiasd
December 29, 2023 15:20 7m 28s multi-head-attention
Add MultiHeadAttention layer
ci #524: Pull request #392 synchronize by Dobiasd
December 26, 2023 20:01 7m 16s multi-head-attention
Separate weights and biases
ci #523: Commit 212d609 pushed by Dobiasd
December 26, 2023 20:01 7m 35s multi-head-attention
Add MultiHeadAttention layer
ci #522: Pull request #392 synchronize by Dobiasd
December 26, 2023 19:47 7m 56s multi-head-attention
Convert weights to tensors for ctor
ci #521: Commit 09d5868 pushed by Dobiasd
December 26, 2023 19:47 7m 2s multi-head-attention
Bump FunctionalPlus requirement to version 0.2.22
ci #520: Commit 71b6f7c pushed by Dobiasd
December 26, 2023 16:23 9m 18s v0.15.29