Description
Use Case
Dependent on / closely coupled with this ticket
We calculate `dedup_hash` to determine whether a recipe is completely unique or has already been run (in which case we already have the results). However, we should do some exploration/testing to confirm that recipes we consider identical actually return the same `dedup_hash`; if they do not, we should add a normalization step to the dedup-hash generation. Some areas to test:
- float values that are rounded differently or read slightly differently
- timestamps or unique IDs that accidentally end up in the recipe data
- list elements that appear in different orders
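To illustrate the problem, here is a minimal sketch (the `naive_hash` function is a hypothetical stand-in for the current hash generation, assuming it hashes the serialized recipe as-is): two recipes that are logically identical can produce different hashes when a float was computed slightly differently or a list arrives in a different order.

```python
import hashlib
import json

def naive_hash(recipe: dict) -> str:
    # Hypothetical stand-in: hash the raw recipe JSON without normalization.
    payload = json.dumps(recipe, sort_keys=True)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

# Two logically identical recipes: 0.1 + 0.2 serializes as
# 0.30000000000000004, and the column list is in a different order.
a = {"threshold": 0.1 + 0.2, "columns": ["x", "y"]}
b = {"threshold": 0.3, "columns": ["y", "x"]}

print(naive_hash(a) == naive_hash(b))  # False: dedup misses the match
```

`sort_keys=True` only canonicalizes dict key order; it does nothing for float representation or list element order, which is exactly the gap the testing should probe.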
Based on the findings from testing/exploration, update `DataDoc.generate_hash()` to normalize the recipe data, and/or normalize the recipe data on the client side before sending the POST request.
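One possible shape for that normalization is sketched below. The key names to strip (`timestamp`, `id`, `uuid`), the rounding precision, and the decision to sort all lists are assumptions to be validated by the testing above (some recipe lists may be order-significant, in which case sorting would incorrectly merge distinct recipes).

```python
import hashlib
import json
from typing import Any

# Assumed set of keys that never affect recipe semantics; adjust per findings.
STRIP_KEYS = {"timestamp", "id", "uuid"}
FLOAT_PRECISION = 9  # assumed rounding precision for float comparison

def normalize(value: Any) -> Any:
    """Recursively canonicalize recipe data before hashing."""
    if isinstance(value, float):
        return round(value, FLOAT_PRECISION)
    if isinstance(value, list):
        # Sort elements by their canonical JSON form so order is irrelevant.
        return sorted(
            (normalize(v) for v in value),
            key=lambda v: json.dumps(v, sort_keys=True),
        )
    if isinstance(value, dict):
        return {k: normalize(v) for k, v in value.items() if k not in STRIP_KEYS}
    return value

def generate_hash(recipe: dict) -> str:
    # Hypothetical normalized replacement for DataDoc.generate_hash().
    payload = json.dumps(normalize(recipe), sort_keys=True)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

a = {"threshold": 0.1 + 0.2, "columns": ["x", "y"], "id": "abc-123"}
b = {"threshold": 0.3, "columns": ["y", "x"], "id": "def-456"}
print(generate_hash(a) == generate_hash(b))  # True: same dedup_hash
```

If the normalization instead lands on the client side, the same `normalize()` pass could run on the recipe payload before the POST request, leaving the server-side hash untouched.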