Could you please release TF checkpoints for DistilBERTurk?
I have my own Turkish ALBERT, but its performance is lower than I'd like, since I trained it only on the Wikipedia dump and a few PDFs.
I'd like to use your DistilBERT in my TensorFlow intent & slot prediction pipeline to compare the accuracy of these models.
The DistilBERT model was trained with the official Transformers library, i.e., in PyTorch. I then converted the model into a TensorFlow-compatible format so that it can be used with the TF* classes in Transformers. That TensorFlow model is here:
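For reference, a PyTorch checkpoint can be loaded into the TF* classes (and re-saved as a native TensorFlow checkpoint) via the `from_pt=True` flag in Transformers. This is only a sketch: the model identifier `dbmdz/distilbert-base-turkish-cased` is an assumption here, so substitute the actual DistilBERTurk identifier or a local checkpoint path.

```python
# Sketch: converting a PyTorch Transformers checkpoint for use with TF* classes.
# NOTE: the model id below is assumed; replace it with the real DistilBERTurk
# identifier or a local directory containing the PyTorch weights.
from transformers import DistilBertTokenizer, TFDistilBertModel

MODEL_ID = "dbmdz/distilbert-base-turkish-cased"  # assumed identifier

# from_pt=True converts the PyTorch weights to TensorFlow on the fly
model = TFDistilBertModel.from_pretrained(MODEL_ID, from_pt=True)
tokenizer = DistilBertTokenizer.from_pretrained(MODEL_ID)

# Persist the converted weights as a native TF checkpoint (tf_model.h5),
# so future loads no longer need the from_pt flag
model.save_pretrained("./distilberturk-tf")
```

Once saved this way, the directory can be loaded directly with `TFDistilBertModel.from_pretrained("./distilberturk-tf")` in a pure TensorFlow pipeline.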