Integrate Lightly with Hugging Face #1672
Comments
Hi, thanks for opening the issue! Do I understand correctly that the model code and checkpoint would be hosted on HF? We have checkpoints, but the corresponding model code cannot be imported from the package, as the models are benchmark implementations and are not distributed with the package.
Hi, the model code would still live in your GitHub repository. OpenCLIP is a good example: https://github.com/mlfoundations/open_clip. The code lives on GitHub, but they integrate with the Hub for hosting the weights. We also have a feature where code can live on the Hub, but that's mainly for people who want to make their models compatible with the Transformers library (there's a guide for that).
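To illustrate the pattern, here is a minimal sketch of the consumption side under that kind of integration: the model class comes from the library's own code (a torchvision ResNet is used as a stand-in here), and only the weights are fetched from the Hub. The repo id and filename below are hypothetical placeholders, not existing Lightly artifacts.

```python
# Sketch of loading Hub-hosted weights into locally defined model code.
# The repo_id and filename are hypothetical placeholders.
import torch
from huggingface_hub import hf_hub_download
from torchvision.models import resnet50

checkpoint_path = hf_hub_download(
    repo_id="lightly-ai/barlowtwins-resnet50-imagenet",  # hypothetical repo
    filename="model.ckpt",                               # hypothetical filename
)

model = resnet50()
state_dict = torch.load(checkpoint_path, map_location="cpu")
# strict=False because SSL checkpoints often carry projection heads
# that are not part of the plain backbone.
model.load_state_dict(state_dict, strict=False)
```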
Thanks for the info! We'll check internally if this is an option for us :)
Hey @NielsRogge, I saw this issue and it looks like a great exercise. I went through some of the resources you shared and got a basic overview. I'd like to give this a try, so could you please help me through the process?
Hey @Devparihar5, we've started an internal chat discussing various options for approaching this. We will update you on the issue once we've reached an agreement.
Hi,
Niels here from the community science team at Hugging Face 🤗. I noticed your work as it was trending on paperswithcode, and saw that you currently host all checkpoints on your own servers, e.g. https://lightly-ssl-checkpoints.s3.amazonaws.com/imagenet_resnet50_barlowtwins_2023-08-18_00-11-03/pretrain/version_0/checkpoints/epoch%3D99-step%3D500400.ckpt.
Did you know that hosting on HF is free, and would allow for better discoverability of your work?
If you're interested, here is a guide on how Lightly could be integrated with the Hub: https://huggingface.co/docs/hub/en/models-adding-libraries. Basically, it comes down to automated tagging of models that get pushed to the Hub: if a user trains a model with Lightly, we could integrate a `push_to_hub` method which then automatically tags the corresponding model card with `library_name: lightly`. We've done a similar thing in the past with libraries like Sentence Transformers, Timm, OpenCLIP, spaCy, and so on.
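For concreteness, here is a minimal sketch of what such a `push_to_hub` helper could look like, built directly on the `huggingface_hub` client library. The function name, repo id, and checkpoint filename are illustrative assumptions, not existing Lightly API.

```python
# A minimal sketch of a possible `push_to_hub` helper for Lightly models.
# The helper name, repo id, and filenames are assumptions for illustration.
import torch
from huggingface_hub import HfApi, ModelCard, ModelCardData


def push_to_hub(model: torch.nn.Module, repo_id: str, token=None):
    """Upload a trained model's weights and a tagged model card to the Hub."""
    api = HfApi(token=token)
    api.create_repo(repo_id=repo_id, exist_ok=True)

    # Upload the checkpoint itself.
    torch.save(model.state_dict(), "model.ckpt")
    api.upload_file(
        path_or_fileobj="model.ckpt",
        path_in_repo="model.ckpt",
        repo_id=repo_id,
    )

    # Write a model card whose metadata carries `library_name: lightly`,
    # which is what makes the model discoverable under the library filter.
    card_data = ModelCardData(library_name="lightly", tags=["self-supervised-learning"])
    card = ModelCard(f"---\n{card_data.to_yaml()}\n---\n\n# Model trained with Lightly\n")
    card.push_to_hub(repo_id, token=token)
```

A user could then call something like `push_to_hub(model, "my-org/my-barlowtwins-resnet50")` after training, and the `library_name: lightly` tag would surface the checkpoint under the Lightly filter on the Hub.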
Let us know if you're interested; we're happy to assist with the HF integration.
Kind regards,
Niels