In our project we use both document-level and field-level internationalisation, though mostly field-level. Currently we support two languages, but we might support several in the future. We use Algolia for search, and because all content is translated, the best approach for us is multiple indices, one per language: https://www.algolia.com/doc/guides/managing-results/optimize-search-results/handling-natural-languages-nlp/how-to/multilingual-search/
This means that one document in Sanity should be indexed as several records in Algolia, one for each language. Different projections of the same document should be sufficient to build the individual records.
Example:
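Something along these lines, where the field names (`title`, `body`) and the locales (`en`, `no`) are just for illustration:

```ts
// A single field-level translated document in Sanity (hypothetical shape)
const post = {
  _id: 'post-abc123',
  _type: 'post',
  title: { en: 'Hello world', no: 'Hei verden' },
  body: { en: 'English body text', no: 'Norsk brødtekst' },
}

// ...should become one Algolia record per language, in separate indices
// (e.g. posts_en and posts_no):
const recordForPostsEn = { objectID: 'post-abc123', title: 'Hello world', body: 'English body text' }
const recordForPostsNo = { objectID: 'post-abc123', title: 'Hei verden', body: 'Norsk brødtekst' }
```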
I tried to solve this by creating two instances of `indexer` (one for each language) and triggering `webhookSync` successively for each of them (roughly as in the sketch below). This did not work.

I guess an option is to create two webhooks and have one for each language? Are there any other workarounds?
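For reference, a stripped-down version of what I tried. The index names, the GROQ projections and the serializer are simplified/made up, but the structure (two indexers, synced one after the other from the same webhook payload) is what I have:

```ts
import algoliasearch from 'algoliasearch'
import sanityClient from '@sanity/client'
import indexer from 'sanity-algolia'

const algolia = algoliasearch('my-app-id', 'my-api-key')
const sanity = sanityClient({
  projectId: 'my-project-id',
  dataset: 'production',
  apiVersion: '2021-03-25',
  useCdn: false,
})

// One indexer per language, each writing to its own index and projecting
// only that language's fields out of the shared document.
const enIndexer = indexer(
  {
    post: {
      index: algolia.initIndex('posts_en'),
      projection: `{ "title": title.en, "body": pt::text(body.en) }`,
    },
  },
  (document) => document // serializer, simplified
)

const noIndexer = indexer(
  {
    post: {
      index: algolia.initIndex('posts_no'),
      projection: `{ "title": title.no, "body": pt::text(body.no) }`,
    },
  },
  (document) => document
)

// Webhook handler: run both syncs for the same webhook payload.
export default async (req: any, res: any) => {
  await enIndexer.webhookSync(sanity, req.body)
  await noIndexer.webhookSync(sanity, req.body)
  return res.status(200).send('ok')
}
```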
Will this be resolved by PR #14? If so, it might be a good idea to use i18n as a use case in the documentation. I suspect I'm not the only one struggling with this.