Distilled / faster coreference resolution #13218
znadrich-qf started this conversation in New Features & Project Ideas
Replies: 1 comment
Any thoughts on this?
In comparison to fastcoref, the en_coreference_web_trf coreference resolution model available in spacy-experimental is significantly slower. There was a previous discussion about potentially increasing the model's inference speed by using a distilled transformer. Before undertaking this effort myself, I would need to get hold of the OntoNotes dataset, which needs to be licensed. There is another discussion stating that spaCy has a special licensing agreement that allows models trained on this dataset to be released under the MIT license. Given all of this, it seems rather difficult to train a distilled coreference resolution model without the dataset or the specialized licensing agreement. Would spaCy ever officially release a distilled coreference resolution model?
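
For reference, a minimal sketch of the kind of side-by-side timing comparison described above. The example text, run count, and device choice are placeholders, and it assumes both fastcoref and the experimental en_coreference_web_trf pipeline from spacy-experimental are installed:

```python
# Rough timing sketch, not a rigorous benchmark: single process, one toy text,
# no separate warm-up or batching tuning. Assumes `pip install fastcoref` and
# the experimental en_coreference_web_trf pipeline (spacy-experimental).
import time

import spacy
from fastcoref import FCoref

TEXT = (
    "Philip plays the bass because he loves it. "
    "Sarah told him that she would join the band next week."
)
N_RUNS = 20  # placeholder; increase for a more stable estimate

# spacy-experimental coreference pipeline
nlp = spacy.load("en_coreference_web_trf")
start = time.perf_counter()
for _ in range(N_RUNS):
    doc = nlp(TEXT)
spacy_secs = time.perf_counter() - start
print(f"en_coreference_web_trf: {spacy_secs / N_RUNS:.3f} s/doc")
print("clusters:", {k: [s.text for s in v] for k, v in doc.spans.items()
                    if k.startswith("coref_clusters")})

# fastcoref (F-Coref model)
model = FCoref(device="cpu")  # or "cuda:0" if a GPU is available
start = time.perf_counter()
for _ in range(N_RUNS):
    preds = model.predict(texts=[TEXT])
fast_secs = time.perf_counter() - start
print(f"fastcoref: {fast_secs / N_RUNS:.3f} s/doc")
print("clusters:", preds[0].get_clusters())
```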