Hello,

I ran across a mistake in the docs for Part-Of-Speech tagging. The last line contains `pos_model.encode_as_tensor(&input)`, which does not exist. Marking the code block as `no_run` instead of `ignore` then reveals this when running `cargo test --doc`:
```
error[E0599]: no method named `encode_as_tensor` found for struct `POSModel` in the current scope
   --> src/pipelines/mod.rs:416:24
    |
8   |     let output = pos_model.encode_as_tensor(&input);
    |                            ^^^^^^^^^^^^^^^^ method not found in `POSModel`
```
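For reference, here is a rough sketch of what the corrected, `no_run`-annotated doctest could look like; I am assuming `POSModel::predict` is the intended call and that `anyhow` is acceptable for error handling in the example, rather than copying the exact block from the crate:

```rust
/// Part-of-Speech tagging example. Annotated as `no_run` so that
/// `cargo test --doc` compiles it (and would catch a non-existent method)
/// without downloading model weights or executing it.
///
/// ```no_run
/// use rust_bert::pipelines::pos_tagging::POSModel;
///
/// fn main() -> anyhow::Result<()> {
///     let pos_model = POSModel::new(Default::default())?;
///     let input = ["My name is Bob"];
///     // `predict` instead of the non-existent `encode_as_tensor`
///     let output = pos_model.predict(&input);
///     println!("{:?}", output);
///     Ok(())
/// }
/// ```
pub mod pos_tagging_doc_sketch {}
```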
Running that command shows that another doctest also fails, but this time it's because `cargo test --doc` does not enable the `hf-tokenizers` feature. This is remedied by running `cargo test -Fhf-tokenizers --doc` instead. I did try modifying the `[package.metadata.docs.rs]` section in `Cargo.toml` to include it, but that did not change anything. Another idea would be to move the [tokenizer docs](https://github.com/guillaume-be/rust-bert/blob/411c224cfc427f1b7b7fb6c1e449505427ba3d92/src/pipelines/mod.rs#L477) into the `hf-tokenizers` module.
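As a rough illustration of that last idea (the module name and contents below are placeholders, not the crate's actual layout), doctests inside a feature-gated module are only collected when the feature is enabled, so plain `cargo test --doc` would skip them while `cargo test -Fhf-tokenizers --doc` would compile them:

```rust
// Sketch only: a module gated behind the `hf-tokenizers` feature whose
// doctests are collected solely when that feature is enabled.
#[cfg(feature = "hf-tokenizers")]
pub mod hf_tokenizers {
    //! Tokenizer integration docs would live here.
    //!
    //! ```no_run
    //! // Example code that needs the `hf-tokenizers` feature goes here;
    //! // without the feature this doctest is never even compiled.
    //! ```
}
```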
A cursory search reveals there are other Rust code blocks in the docs which are annotated with `ignore` instead of `no_run`. `cargo test --doc` is also not part of the CI pipeline, so even the correctly annotated code examples are never checked.
I believe this is worth changing to ensure the docs are correct and stay in sync with the API.

If you let me know your opinion on this and which way you prefer this fixed, I would be happy to contribute and open a PR :)