TL;DR
Points from different sources have uneven density, which renders the heatmap useless and may cause problems with the UI.
Example use case
I have GPS Logger constantly tracking location and discarding points when the accuracy or the distance from the last point is below a certain threshold. This results in relatively sparse but accurate data with few false positives logged at home. However, it may miss something during active movement.
I also have a fitness tracker device/app which logs my walks/runs/cycling on demand along with heart rate. It logs like crazy, every second, and it's not configurable.
I also have a lot of historical data logged by other devices with different settings, somewhere between these two.
Problems
As a result, the density of points heavily depends on the device, the application and its settings. This shows up on the heatmap as a disproportionately higher frequency for certain routes, even when they have in fact been travelled equally often.
An unnecessarily large number of points also leads to very slow loading and high memory consumption on the frontend.
Suggested solutions
I can imagine writing a script for path normalisation and running all my GPX files, new and historical, through it before uploading. However, this isn't practical, especially when I'm uploading from mobile.
It would be much better to integrate this functionality into DaWarIch, ideally letting me specify some normalisation rules (e.g. if two consecutive points are less than X seconds and less than Y metres apart, drop one of them).
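For illustration, here is a minimal Python sketch of such a rule, roughly what the pre-upload script mentioned above could look like (it assumes the gpxpy library; MIN_SECONDS and MIN_METRES are hypothetical stand-ins for X and Y):

```python
# Hypothetical pre-upload normalisation sketch, not DaWarIch code.
# Rule: drop a point if it is within MIN_SECONDS AND MIN_METRES of the
# last point that was kept.
import sys
import gpxpy

MIN_SECONDS = 5   # hypothetical "X seconds" threshold
MIN_METRES = 10   # hypothetical "Y metres" threshold

def thin_points(points):
    """Keep a point only if it is far enough in time or distance from the last kept point."""
    if not points:
        return []
    kept = [points[0]]
    for point in points[1:]:
        last = kept[-1]
        seconds_apart = (point.time - last.time).total_seconds() if point.time and last.time else None
        metres_apart = point.distance_2d(last)
        too_close = (
            seconds_apart is not None and seconds_apart < MIN_SECONDS
            and metres_apart is not None and metres_apart < MIN_METRES
        )
        if not too_close:
            kept.append(point)
    return kept

def normalise_gpx(path_in, path_out):
    with open(path_in) as f:
        gpx = gpxpy.parse(f)
    for track in gpx.tracks:
        for segment in track.segments:
            segment.points = thin_points(segment.points)
    with open(path_out, "w") as f:
        f.write(gpx.to_xml())

if __name__ == "__main__":
    normalise_gpx(sys.argv[1], sys.argv[2])
```

Usage would be something like `python normalise.py in.gpx out.gpx` before uploading; the same thinning logic should translate directly into an import hook or background job.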
This could work either on import (per GPX file) or in the background (potentially choosing, among adjacent points in multi-device data, the one with the highest accuracy).
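To illustrate the multi-device case, a hedged sketch in plain Python (the `timestamp`/`accuracy` point fields and the window size are assumptions, since I don't know DaWarIch's internal schema): bucket points into fixed time windows and keep only the most accurate point in each window.

```python
from collections import defaultdict

WINDOW_SECONDS = 10  # hypothetical merge window

def keep_most_accurate(points):
    """For points falling into the same time window (e.g. logged by several
    devices at once), keep only the one with the best (lowest) accuracy value."""
    buckets = defaultdict(list)
    for p in points:
        buckets[p["timestamp"] // WINDOW_SECONDS].append(p)  # timestamp as epoch seconds
    best = [min(bucket, key=lambda p: p["accuracy"]) for bucket in buckets.values()]
    return sorted(best, key=lambda p: p["timestamp"])
```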
It could also work as a preprocessor plugin, possibly through a separate container/URL to reduce the burden on @Freika.