Currently this project fetches a copy of carpentries/glosario's glossary.yml every day via a GitHub Action and commits it to this repo. One issue with this approach is that after glosario-py is installed, the glossary.yml file is never updated again.
Instead, we could add code that, on start-up/import, calculates a hash of the carpentries/glosario glossary.yml and of the local copy. If they differ, we download the updated copy. If write permissions prevent saving the update, we can cache the updated YAML file in memory and have the library use that instead of what's on disk.
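A minimal sketch of that start-up check, with the in-memory fallback for read-only installs. The raw URL, function names, and the choice of SHA-256 are assumptions for illustration, not existing glosario-py API:

```python
import hashlib
import urllib.request
from pathlib import Path

# Assumed raw URL of the upstream glossary; not verified against the repo layout.
GLOSSARY_URL = (
    "https://raw.githubusercontent.com/carpentries/glosario/main/glossary.yml"
)

def file_sha256(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def refresh_glossary(local_path: Path) -> bytes:
    """Download the upstream glossary and, if it differs from the local copy,
    try to overwrite the local file, falling back to an in-memory copy when
    the install location is not writable."""
    with urllib.request.urlopen(GLOSSARY_URL) as resp:
        remote = resp.read()
    if hashlib.sha256(remote).hexdigest() == file_sha256(local_path):
        return local_path.read_bytes()  # already up to date
    try:
        local_path.write_bytes(remote)  # update on disk if permissions allow
    except OSError:
        pass  # read-only install: callers get the in-memory copy instead
    return remote
```

The library's loader would then parse whatever bytes `refresh_glossary` returns rather than reading the file directly, so both the on-disk and in-memory paths go through the same code.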
There definitely needs to be a way to update the glossary.yml; when installed via pip, the version downloaded is not even the current version from this repository.
A manually triggered update would be a good start, but ideally one that does not require the user to navigate their install paths. E.g., I'd like to use glosario while teaching with participatory live coding. It would be fine to have learners run a command that re-downloads and updates the file, but not to make them navigate all of their (very different) installs.
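One way to give learners such a command is a small console entry point. Everything here is a sketch: the `glosario-update` name, the `build_parser`/`main` functions, and the raw URL are all assumptions, since no such command exists in the package yet:

```python
import argparse
import urllib.request
from pathlib import Path

# Assumed raw URL of the upstream glossary (hypothetical, for illustration).
GLOSSARY_URL = (
    "https://raw.githubusercontent.com/carpentries/glosario/main/glossary.yml"
)

def build_parser() -> argparse.ArgumentParser:
    """CLI parser for a hypothetical `glosario-update` command."""
    parser = argparse.ArgumentParser(
        prog="glosario-update",
        description="Re-download glossary.yml into the installed package.",
    )
    parser.add_argument(
        "--dest",
        type=Path,
        # Default to the directory this module lives in, so users never
        # have to locate their site-packages by hand.
        default=Path(__file__).resolve().parent / "glossary.yml",
        help="where to write the updated glossary (defaults to the package dir)",
    )
    return parser

def main() -> None:
    args = build_parser().parse_args()
    with urllib.request.urlopen(GLOSSARY_URL) as resp:
        args.dest.write_bytes(resp.read())
    print(f"updated {args.dest}")
```

Wired up via something like `console_scripts` (`glosario-update = glosario.update:main`) in the packaging metadata, learners would only type `glosario-update`, with no need to know where pip put the package.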
Sure, I'd love to help. What do you think of my proposal? The YAML file would be updated on each user's machine every time glosario is imported. I was under the impression that we could ask git for the hash of any file in a repository (that way it wouldn't need to be downloaded each time), but now I'm not so sure; I'll have to dig a bit deeper. Optionally, the carpentries/glosario repository (note: not carpentries/glosario-py) could also store the hash of the YAML file. But even if a hash cannot be obtained to verify whether an update is required, the YAML file is small enough to be downloaded anew on every import: at 214 KB, that's realistically like importing a minified jQuery twice.
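For what it's worth, git's blob hash is reproducible locally: it is SHA-1 over the header `blob <size>\0` followed by the raw content, and GitHub's contents API reports that same blob SHA in a small JSON response, so no large download is needed just to compare. A sketch under those assumptions (function names are made up; the API endpoint is real, but rate limits and error handling are ignored):

```python
import hashlib
import json
import urllib.request
from pathlib import Path

def git_blob_sha1(content: bytes) -> str:
    """Compute the SHA-1 git assigns to a blob: hash of the header
    b"blob <size>\\0" followed by the raw file content."""
    header = b"blob %d\0" % len(content)
    return hashlib.sha1(header + content).hexdigest()

def remote_blob_sha(owner: str, repo: str, path: str) -> str:
    """Ask the GitHub contents API for a file's blob SHA; only a small
    JSON response is transferred, not the file itself."""
    url = f"https://api.github.com/repos/{owner}/{repo}/contents/{path}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["sha"]

def update_needed(local_path: Path) -> bool:
    """True when the local glossary.yml differs from the upstream blob."""
    local = git_blob_sha1(local_path.read_bytes())
    return local != remote_blob_sha("carpentries", "glosario", "glossary.yml")
```

That would make the hash comparison free on the local side, though given the file size, unconditionally re-downloading remains a perfectly reasonable fallback.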