Check for attention_axes=None in conversion
Dobiasd committed Dec 31, 2023
1 parent: 0d2be86 · commit: fd6e7c4
Showing 1 changed file with 1 addition and 0 deletions.
keras_export/convert_model.py (1 addition, 0 deletions)

@@ -566,6 +566,7 @@ def show_multi_head_attention_layer(layer):
     assert len(layer.input_shape) == 3
     assert layer.input_shape[0] is None
     assert layer._output_shape is None
+    assert layer._attention_axes == (1,), "MultiHeadAttention supported only with attention_axes=None"
     return {
         'weight_shapes': list(map(lambda w: list(w.shape), layer.weights)),
         'weights': list(map(lambda w: encode_floats(w.numpy()), layer.weights)),
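
For context, a quick sketch of why the check compares against (1,) rather than None: when a MultiHeadAttention layer is built on a rank-3 input with the default attention_axes=None, Keras resolves its private _attention_axes attribute to the single sequence axis, (1,). The snippet below assumes a recent TensorFlow/Keras install; _attention_axes is an internal attribute, so this behavior is not guaranteed across versions.

import numpy as np
import tensorflow as tf

# Default attention_axes=None on a (batch, sequence, features) input:
# once the layer is built, _attention_axes resolves to (1,), so the
# converter's new assertion passes.
x3 = np.zeros((2, 5, 8), dtype=np.float32)
mha_default = tf.keras.layers.MultiHeadAttention(num_heads=2, key_dim=4)
mha_default(x3, x3)
print(mha_default._attention_axes)  # (1,)

# Explicit attention_axes (here: 2D attention over the spatial axes of a
# rank-4 input) yields a different tuple, so conversion would now fail
# with "MultiHeadAttention supported only with attention_axes=None".
x4 = np.zeros((2, 4, 4, 8), dtype=np.float32)
mha_custom = tf.keras.layers.MultiHeadAttention(
    num_heads=2, key_dim=4, attention_axes=(1, 2))
mha_custom(x4, x4)
print(mha_custom._attention_axes)  # (1, 2)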
