Is the maximum intensity of a dataset (with the minimum assumed to be zero) sufficient for normalizing it? We could loop over all tiles to determine the overall max value and ultimately store this as a property in the tile catalog, to be used in later runs for all datasets. This would imply all datasets being normalized with the same bounds, regardless of whether a mask for data balancing is applied or not - do you think this might be an issue?
Not sure how different bands have been dealt with so far - should they be normalized independently or as a whole? Given the heterogeneity between visible and NIR bands (and potentially radar data to be added as an additional band in future?), perhaps one should normalize each band independently? If this is the case we could store band-specific max values in the catalog.
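A minimal sketch of what computing per-band maxima over all tiles could look like (assuming tiles come as `(H, W, bands)` arrays; the function name and tile format are illustrative, not from the actual codebase):

```python
import numpy as np

def per_band_max(tiles):
    """Running per-band maximum over an iterable of (H, W, bands) tiles,
    suitable for storing as band-specific properties in the tile catalog."""
    band_max = None
    for tile in tiles:
        tile_max = tile.max(axis=(0, 1))  # max per band within this tile
        band_max = tile_max if band_max is None else np.maximum(band_max, tile_max)
    return band_max

# Example with two fake 2-band tiles
tiles = [
    np.array([[[0.1, 5.0], [0.3, 2.0]]]),
    np.array([[[0.2, 7.0], [0.05, 1.0]]]),
]
print(per_band_max(tiles))  # → [0.3 7.0]
```

Keeping a running maximum means tiles can be streamed one at a time rather than loaded all at once.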
As we are using TF dataset objects to stream data, we need to reconsider how to implement normalization of the input to the VAE.
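One option, sketched below under the assumption that per-band maxima are read from the catalog (the `CATALOG_BAND_MAX` name and values are placeholders): the normalization is a cheap element-wise divide, so it can be applied lazily inside the streaming pipeline via `tf.data.Dataset.map` rather than materializing normalized tiles on disk.

```python
import numpy as np

# Hypothetical per-band maxima read from the tile catalog (illustrative values).
CATALOG_BAND_MAX = np.array([0.3, 7.0], dtype=np.float32)

def normalise(tile, band_max=CATALOG_BAND_MAX):
    """Scale each band to [0, 1] using catalog maxima (minimum assumed zero)."""
    return tile / band_max

# In a tf.data pipeline this would be applied lazily, e.g.:
#   ds = ds.map(lambda t: t / tf.constant(CATALOG_BAND_MAX))
tile = np.array([[[0.15, 3.5]]], dtype=np.float32)
print(normalise(tile))  # → [[[0.5 0.5]]]
```

Because the maxima come from the catalog rather than the current batch, every dataset is normalized with the same bounds, which is exactly the behaviour questioned above.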