Metrics: Soft dynamic-time-warping divergence for FDataGrid #412

Open — wants to merge 27 commits into base: develop

Changes from 22 commits (27 commits in total)

Commits
0522905
add soft-DTW divergence to metrics
Clej Jan 12, 2022
44863e3
add soft-dtw divergence paper references
Clej Jan 12, 2022
12054fb
Merge branch 'metric_sdtw' of github.com:Clej/scikit-fda into metric_…
Clej Jan 12, 2022
f6c6254
code style correction
Clej Jan 13, 2022
37b5b16
conversion of sdtw in cython to python@numba
Clej Feb 7, 2022
10de3d6
sdtw cython version replaced by numba version
Clej Feb 7, 2022
0079ce4
np.einsum in inner_product_matrix replaced by np.dot
Clej Feb 15, 2022
7a0c5c9
replaced np.sum(X**2, axis=1) by inner_product
Clej Feb 15, 2022
1668b20
format, type hints
Clej Feb 15, 2022
81af9e3
corrections asked by mantainers
Clej May 31, 2022
1881d20
tabs replaced by spaces
Clej Oct 19, 2022
bca8384
shape_only parameter removed
Clej Oct 19, 2022
2cad642
np.all corrected by np.any in indiscernability check
Clej Oct 19, 2022
c4e1af3
isinstance in if's and new variable for evaluated cost
Clej Oct 19, 2022
24fddea
isinstance in if's and new variable for evaluated cost
Clej Oct 19, 2022
7d42906
set hse cost as default and constrain callable cost to be computed fd…
Clej Oct 19, 2022
1ae216e
Merge branch 'develop' into metric_sdtw
vnmabus Oct 20, 2022
e01ae84
Merge branch 'develop' into metric_sdtw
Clej Oct 20, 2022
6c9ed49
remove fastmath
Clej Oct 20, 2022
4128364
Merge remote-tracking branch 'refs/remotes/origin/metric_sdtw' into m…
Clej Oct 20, 2022
07bc966
make hse cost function more concise
Clej Oct 20, 2022
a54d0cf
Remove extra quotation mark
vnmabus Nov 3, 2022
98e7dbb
add comment
Clej Jan 19, 2023
d4de35f
Merge remote-tracking branch 'origin' into metric_sdtw
Clej Feb 6, 2023
9ca71d1
add type annotations
Clej Feb 6, 2023
e5928fa
Merge branch 'develop' into metric_sdtw
Clej Feb 6, 2023
f43c73b
pep8 corrections
Clej Feb 6, 2023
22 changes: 22 additions & 0 deletions docs/refs.bib
@@ -63,6 +63,17 @@ @article{berrendero+cuevas+torrecilla_2018_hilbert
URL = {https://doi.org/10.1080/01621459.2017.1320287}
}

@inproceedings{Blondel_2021_sdtw_div,
author = {Blondel, Mathieu and Mensch, Arthur and Vert, Jean-Philippe},
title = {Differentiable Divergences Between Time Series},
booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages = {3853--3861},
year = {2021},
volume = {130},
series = {Proceedings of Machine Learning Research},
publisher = {PMLR},
}

@inproceedings{breunig++_2000_outliers,
author = {Breunig, Markus and Kriegel, Hans-Peter and Ng, Raymond and Sander, Joerg},
year = {2000},
@@ -111,6 +122,17 @@ @article{cuevas++_2004_anova
doi = {10.1016/j.csda.2003.10.021}
}

@inproceedings{Cuturi_2017_sdtw,
title = {Soft-{DTW}: a Differentiable Loss Function for Time-Series},
author = {Marco Cuturi and Mathieu Blondel},
booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages = {894--903},
year = {2017},
volume = {70},
series = {Proceedings of Machine Learning Research},
publisher = {PMLR},
}

@article{dai+genton_2018_visualization,
author = {Wenlin Dai and Marc G. Genton},
title = {Multivariate Functional Data Visualization and Outlier Detection},
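For context, the two entries added above are the papers behind this PR. As stated in Blondel et al. (2021), the soft-DTW value of Cuturi & Blondel (2017) is debiased into a divergence that vanishes on identical inputs:

```latex
% Soft-DTW (Cuturi & Blondel, 2017): alignment cost with a smoothed minimum
\operatorname{sdtw}_\gamma(x, y)
  = {\min_{A \in \mathcal{A}(n, m)}}^{\gamma} \langle A, C(x, y) \rangle,
\qquad
{\min_i}^{\gamma}\, a_i = -\gamma \log \sum_i e^{-a_i / \gamma}.

% Soft-DTW divergence (Blondel et al., 2021): debiased so that D_\gamma(x, x) = 0
D_\gamma(x, y) = \operatorname{sdtw}_\gamma(x, y)
  - \tfrac{1}{2}\operatorname{sdtw}_\gamma(x, x)
  - \tfrac{1}{2}\operatorname{sdtw}_\gamma(y, y).
```

Here $\mathcal{A}(n, m)$ is the set of monotone alignment matrices and $C(x, y)$ the pairwise cost matrix; $\gamma > 0$ controls the smoothing.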
4 changes: 3 additions & 1 deletion skfda/misc/_math.py
@@ -327,7 +327,8 @@ def inner_product(
)
elif isinstance(arg1, np.ndarray) and isinstance(arg2, np.ndarray):
return ( # type: ignore[no-any-return]
np.einsum('n...,m...->nm...', arg1, arg2).sum(axis=-1)
np.dot(arg1, arg2.T)
# np.einsum('n...,m...->nm...', arg1, arg2).sum(axis=-1)
if _matrix else (arg1 * arg2).sum(axis=-1)
)

@@ -675,3 +676,4 @@ def cosine_similarity_matrix(
return _clip_cosine(
inner_matrix / norm1[:, np.newaxis] / norm2[np.newaxis, :],
)

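The hunk above swaps `np.einsum` for `np.dot` in the matrix branch of `inner_product`. For 2-D inputs the two expressions are equivalent, and `np.dot` dispatches to BLAS; a quick sketch (array shapes are illustrative, not from the PR):

```python
import numpy as np

rng = np.random.default_rng(0)
arg1 = rng.standard_normal((4, 7))  # 4 samples, 7 evaluation points
arg2 = rng.standard_normal((5, 7))  # 5 samples, 7 evaluation points

# Old expression: pairwise products, then sum over the trailing axis.
via_einsum = np.einsum('n...,m...->nm...', arg1, arg2).sum(axis=-1)

# New expression: a plain matrix product over the same trailing axis.
via_dot = np.dot(arg1, arg2.T)

assert via_dot.shape == (4, 5)
assert np.allclose(via_einsum, via_dot)
```

Note that the equivalence holds for 2-D arrays only: with higher-dimensional inputs the `einsum` form keeps the extra axes before the final sum, while `np.dot(arg1, arg2.T)` would contract differently.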
2 changes: 2 additions & 0 deletions skfda/misc/metrics/__init__.py
@@ -30,6 +30,7 @@
],
"_mahalanobis": ["MahalanobisDistance"],
"_parse": ["PRECOMPUTED"],
"_sdtw_distances": ["sdtwDivergence"],
"_utils": [
"NormInducedMetric",
"PairwiseMetric",
@@ -63,6 +64,7 @@
)
from ._mahalanobis import MahalanobisDistance as MahalanobisDistance
from ._parse import PRECOMPUTED as PRECOMPUTED
from ._sdtw_distances import sdtwDivergence as sdtwDivergence
from ._utils import (
NormInducedMetric as NormInducedMetric,
PairwiseMetric as PairwiseMetric,
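The hunk above exports the new `sdtwDivergence` metric, but its constructor signature is not shown in this diff. Rather than guess at the API, the recursion it is built on can be sketched in plain NumPy. This is illustrative only: the PR uses a numba-compiled version (see the commit history), `soft_dtw` and `sdtw_divergence` are hypothetical names, and the half-squared-Euclidean cost mirrors the "hse" default mentioned in the commit log:

```python
import numpy as np


def soft_dtw(x, y, gamma=1.0):
    """Soft-DTW value between series x of shape (n, d) and y of shape (m, d)."""
    # Half squared Euclidean cost matrix between all pairs of points.
    cost = 0.5 * ((x[:, None, :] - y[None, :, :]) ** 2).sum(axis=-1)
    n, m = cost.shape
    r = np.full((n + 1, m + 1), np.inf)
    r[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            vals = np.array([r[i - 1, j], r[i, j - 1], r[i - 1, j - 1]])
            # Smoothed minimum: -gamma * logsumexp(-vals / gamma).
            zmax = (-vals / gamma).max()
            softmin = -gamma * (
                zmax + np.log(np.exp(-vals / gamma - zmax).sum())
            )
            r[i, j] = cost[i - 1, j - 1] + softmin
    return r[n, m]


def sdtw_divergence(x, y, gamma=1.0):
    """Debiased soft-DTW divergence of Blondel et al. (2021)."""
    return (
        soft_dtw(x, y, gamma)
        - 0.5 * soft_dtw(x, x, gamma)
        - 0.5 * soft_dtw(y, y, gamma)
    )


x = np.linspace(0.0, 1.0, 20)[:, None]
assert abs(sdtw_divergence(x, x)) < 1e-10  # vanishes on identical inputs
assert sdtw_divergence(x, x + 1.0) > 0.0
```

The debiasing is the point of the divergence: plain soft-DTW can be negative and is nonzero even at `x == y`, which breaks the indiscernibility check that the commit log shows being corrected (`np.all` → `np.any`).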