This repository has been archived by the owner on Jul 1, 2023. It is now read-only.

[summary] Add annotations to Recurrent #1074

Open
texasmichelle opened this issue Sep 9, 2020 · 1 comment
@texasmichelle (Member)

The design of Recurrent layer types is unique, but it should still be possible to add annotations. Tackling this separately from #1067, but the principle should be similar.

@texasmichelle texasmichelle self-assigned this Sep 9, 2020
@texasmichelle (Member, Author)

I expected this to be possible by conforming the associated types for RecurrentLayerCell to DifferentiableTensorProtocol:

```swift
public protocol RecurrentLayerCell: Layer
where
  Input == RNNCellInput<TimeStepInput, State>,
  Output == RNNCellOutput<TimeStepOutput, State>
{
  associatedtype TimeStepInput: DifferentiableTensorProtocol
  associatedtype TimeStepOutput: DifferentiableTensorProtocol
  associatedtype State: DifferentiableTensorProtocol
}
```

This is problematic because LSTMCell defines TimeStepOutput as a custom struct containing both a visible and hidden Tensor<Scalar>.
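To sketch the shape of the conflict (field and type names below are illustrative assumptions, not copied from the library), the state type is a product of two tensors, so it cannot satisfy a constraint that expects a single annotated tensor:

```swift
// Hypothetical sketch of an LSTM state type; the field names
// (`cell`, `hidden`) are assumptions for illustration only.
public struct LSTMState<Scalar: TensorFlowFloatingPoint>: Differentiable {
  public var cell: Tensor<Scalar>    // internal cell state
  public var hidden: Tensor<Scalar>  // visible/hidden output
}

// Under the proposed constraint `State: DifferentiableTensorProtocol`,
// a struct like this would not qualify: it bundles two tensors rather
// than being one tensor, so the conformance cannot be synthesized or
// written without flattening or restructuring the type.
```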

Given the complexity of the current architecture, it is unclear to me how to add annotations without splitting the layer definitions apart.
