Simplify routes, normalise dot density (incl. multi-device data) #946

Open
neithere opened this issue Mar 13, 2025 · 1 comment

TL;DR

Points from different sources have uneven density, which renders the heatmap useless and can also cause problems in the UI.

Example use case

I have GPS Logger constantly tracking my location and discarding points when the accuracy or the distance from the last point is below a certain threshold. This results in relatively sparse but accurate data with few false positives logged at home. However, it may miss something during active movement.

I also have a fitness tracker device/app which logs my walks/runs/cycling on demand along with heart rate. It logs like crazy, every second, and it's not configurable.

I also have a lot of historical data logged by other devices with different settings, somewhere between these two.

Problems

As a result, the density of points depends heavily on the device, the application and its settings. On the heatmap this shows up as a disproportionately higher intensity for certain routes, even if in reality they are travelled just as often as others.

An unnecessarily large number of points also leads to very slow loading and high memory consumption on the frontend.

Suggested solutions

I can imagine writing a script for path normalisation and running all my GPX files, new and historical, through that script before uploading. However, that is not practical, especially when I'm uploading from mobile.

It would be much better to integrate this functionality into DaWarIch, ideally letting me specify some normalisation rules (e.g. when two consecutive points are less than X seconds and less than Y metres apart, drop one of them).
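
For illustration, here is a minimal sketch of such a rule as a standalone preprocessing step, assuming Python with the gpxpy library; the thresholds, file names and helper names are placeholders, not actual DaWarIch settings:

```python
# Sketch of the proposed normalisation rule: drop a point when it is both
# closer than MIN_SECONDS and closer than MIN_METRES to the last kept point.
# Assumes Python + gpxpy; thresholds and file names are placeholders.
import math
import gpxpy

MIN_SECONDS = 10   # "X seconds" from the rule above
MIN_METRES = 15    # "Y metres" from the rule above

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    r = 6_371_000
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def thin(points):
    """Keep a point only if it is far enough in time or space from the last kept one."""
    kept = []
    for p in points:
        if kept:
            prev = kept[-1]
            dt = (p.time - prev.time).total_seconds() if p.time and prev.time else None
            dist = haversine_m(prev.latitude, prev.longitude, p.latitude, p.longitude)
            if dt is not None and dt < MIN_SECONDS and dist < MIN_METRES:
                continue  # too close to the previous point in both time and space
        kept.append(p)
    return kept

with open("walk.gpx") as f:                     # placeholder file name
    gpx = gpxpy.parse(f)

for track in gpx.tracks:
    for segment in track.segments:
        segment.points = thin(segment.points)

with open("walk.normalised.gpx", "w") as f:
    f.write(gpx.to_xml())
```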

This could work either on import (per GPX file) or in the background (potentially keeping the adjacent point with the highest accuracy when data comes from multiple devices).
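
A rough sketch of that background variant, assuming each stored point carries a timestamp and a reported accuracy in metres (lower is better); the Point fields and the 5-second window are assumptions for illustration, not DaWarIch's actual schema:

```python
# Background-style deduplication sketch: among points recorded within a short
# time window (e.g. by different devices), keep only the one with the best
# reported accuracy. Field names and window size are assumptions.
from dataclasses import dataclass
from datetime import datetime

WINDOW_SECONDS = 5  # treat points this close in time as duplicates

@dataclass
class Point:
    timestamp: datetime
    lat: float
    lon: float
    accuracy_m: float  # reported accuracy in metres; lower is better

def dedupe_by_accuracy(points):
    """Collapse bursts of near-simultaneous points into the most accurate one."""
    points = sorted(points, key=lambda p: p.timestamp)
    result = []
    for p in points:
        if result and (p.timestamp - result[-1].timestamp).total_seconds() < WINDOW_SECONDS:
            # Same burst: keep whichever point reports better accuracy.
            if p.accuracy_m < result[-1].accuracy_m:
                result[-1] = p
        else:
            result.append(p)
    return result
```

In DaWarIch itself this would presumably be a background job over the points table rather than a Python script; the sketch only illustrates the selection rule.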

It could also work as a preprocessor plugin, possibly through a separate container/URL to reduce the burden on @Freika.
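
And a very rough idea of the preprocessor-plugin variant, assuming a tiny Flask service running in its own container; the /normalise endpoint and the normalise_gpx module are hypothetical, not an existing DaWarIch API:

```python
# Hypothetical standalone preprocessor: accepts a GPX upload and returns a
# thinned GPX. Endpoint, port and module names are made up for illustration.
import gpxpy
from flask import Flask, Response, request

from normalise_gpx import thin  # the thin() function from the first sketch,
                                # assumed to be saved as normalise_gpx.py

app = Flask(__name__)

@app.route("/normalise", methods=["POST"])
def normalise():
    gpx = gpxpy.parse(request.data.decode("utf-8"))
    for track in gpx.tracks:
        for segment in track.segments:
            segment.points = thin(segment.points)
    return Response(gpx.to_xml(), mimetype="application/gpx+xml")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

DaWarIch could then call such a URL during import, or the user could point the mobile upload at it first; either way the main app stays untouched.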

@neithere (Author)

Somewhat related to #736.
