Egocentric coordinate transformation #239
Thanks @talmo for such a thorough description. This is indeed a very useful feature and we've been planning to add it since the project's conception. Your sketched-out implementation makes sense to me conceptually, and it could fit well into the codebase with a few modifications. The functions would have to accept and return […]. The best place for the main function would probably be a new […].

I'm happy for you or someone from your team to have a shot at opening a draft pull request to get this started. If the need arises, I'd also be willing to hop on a call to explain our thoughts on the architecture. If our ongoing API changes get in the way of merging, we can always step in and help with that.
I needed to do egocentric alignment for my data and tried to do this with cart2pol/pol2cart. I am still not sure if this is the correct approach (find the heading angle and subtract it from each keypoint's polar angle). I tried it out with some mouse data that I have and things looked okay. Does that make sense?
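(The snippet itself didn't survive extraction; below is a minimal sketch of the approach described, assuming snout/tail_base keypoints and using movement's `cart2pol`/`pol2cart` utilities with the sample dataset from the reply further down. It is an illustration, not the original code.)

```python
# Minimal sketch (assumed, not the original snippet): compute the heading
# angle from tail_base to snout, then subtract it from the polar angle of
# each centroid-centred keypoint.
from movement import sample_data
from movement.io import load_poses
from movement.utils.vector import cart2pol, pol2cart

ds_path = sample_data.fetch_dataset_paths("SLEAP_single-mouse_EPM.analysis.h5")["poses"]
position = load_poses.from_sleap_file(ds_path).position

# Heading vector (tail_base -> snout) and its polar angle per frame
heading = position.sel(keypoints="snout") - position.sel(keypoints="tail_base")
heading_angle = cart2pol(heading).sel(space_pol="phi")

# Centre each keypoint on the centroid, then rotate by subtracting the heading angle
centred_pol = cart2pol(position - position.mean(dim="keypoints"))
centred_pol.loc[{"space_pol": "phi"}] = centred_pol.sel(space_pol="phi") - heading_angle
position_egocentric = pol2cart(centred_pol)
```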
Thanks for sharing your snippet @hummuscience! Great use of our available functionality :) I reproduced your approach with one of our sample datasets and it looks correct. I formalised the problem a bit; I share my thinking below in case it is useful. IIUC, you want to express each individual's keypoints in an egocentric coordinate system, which is defined for each individual as follows:

- its origin is at the individual's centroid (the mean of its keypoints), and
- its positive x-axis points along the vector from the posterior centroid to the anterior centroid (roughly, the animal's heading).
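In symbols (my notation, inferred from the snippet below): if $\mathbf{x}_{k,t}$ is keypoint $k$ at time $t$ in image coordinates, $\mathbf{c}_t$ the centroid, and $\theta_t$ the polar angle of the posterior-to-anterior vector, the egocentric coordinates are

$$
\mathbf{y}_{k,t} = R(-\theta_t)\,(\mathbf{x}_{k,t} - \mathbf{c}_t),
\qquad
R(\phi) = \begin{pmatrix} \cos\phi & -\sin\phi \\ \sin\phi & \cos\phi \end{pmatrix}.
$$

Subtracting $\theta_t$ from each centred point's polar angle, as the snippet does, is equivalent to applying the rotation $R(-\theta_t)$.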
In the egocentric coordinate system of the one individual in my dataset, the keypoints' trajectories look like this: [trajectory plot omitted]. I paste my (longish) snippet below in case you want to have a look. Note that my sample dataset only has one individual, but the same code should work for multiple individuals.

Plot keypoints in egocentric coordinate system:

```python
# %%
import matplotlib.pyplot as plt
import numpy as np
import xarray as xr
from movement import sample_data
from movement.io import load_poses
from movement.utils.vector import cart2pol, convert_to_unit, pol2cart
# For interactive plots; requires `pip install ipympl`
# %matplotlib widget
# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
# Import sample data
# one individual, 6 keypoints
ds_path = sample_data.fetch_dataset_paths(
"SLEAP_single-mouse_EPM.analysis.h5"
)["poses"]
ds = load_poses.from_sleap_file(ds_path, fps=None) # force time_unit = frames
print(ds)
# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
# Define anterior and posterior keypoints
anterior_keypoints = ["snout", "left_ear", "right_ear"]
posterior_keypoints = ["centre", "tail_base"]
# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
# Compute centroids
# get position data array
position = ds.position
# Compute centroid per individual
centroid = position.mean(dim="keypoints") # v
# Compute centroid for anterior and posterior keypoints
centroid_anterior = position.sel(keypoints=anterior_keypoints).mean(
dim="keypoints", skipna=True
)
centroid_posterior = position.sel(keypoints=posterior_keypoints).mean(
dim="keypoints", skipna=True
)
# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
# Compute posterior2anterior vector per individual
# Compute vector from posterior to anterior centroid
posterior2anterior = centroid_anterior - centroid_posterior
# Compute polar angle of posterior2anterior vector
# the angle theta is positive going from the positive x-axis to the positive y-axis
posterior2anterior_pol = cart2pol(posterior2anterior)
# theta = posterior2anterior_pol.sel(space_pol="phi") # h
# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
# Compute coordinates in egocentric coordinate system
# Compute position in image coord system translated
position_translated = position - centroid # Y_centered
position_translated_pol = cart2pol(position_translated) # Y_centered_pol
# Compute position in egocentric coordinate system
position_egocentric_pol = position_translated_pol.copy()
# rho is the same as in the translated image coordinate system
# phi angle is measured relative to the phi angle of the `posterior2anterior` vector
position_egocentric_pol.loc[{"space_pol": "phi"}] = (
position_translated_pol.sel(space_pol="phi")
- posterior2anterior_pol.sel(
space_pol="phi"
) # angle_relative_to_posterior2anterior
)
# Convert rotated position coordinates to cartesian
position_egocentric = pol2cart(position_egocentric_pol)
# Create a dataset with the `position` data array holding the egocentric coordinates
# of the keypoints
ds_egocentric = ds.copy()
ds_egocentric["position"] = position_egocentric
# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
# Check by plotting the keypoints' trajectories in the egocentric coordinate system
fig, ax = plt.subplots(1, 1)
for kpt in ds_egocentric.coords["keypoints"].data:
ax.scatter(
x=ds_egocentric.position.sel(keypoints=kpt, space="x"),
y=ds_egocentric.position.sel(keypoints=kpt, space="y"),
label=kpt,
alpha=0.5,
)
# add axes of egocentric coordinate system
ax.quiver(
[0], # x
[0], # y
[100], # u
[0], # v
color="r",
angles="xy",
scale=1,
scale_units="xy",
) # x-axis in red
ax.quiver(
[0], # x
[0], # y
[0], # u
[100], # v
color="g",
angles="xy",
scale=1,
scale_units="xy",
) # y-axis in green
ax.legend()
ax.invert_yaxis()
ax.axis("equal")
ax.set_xlim(-200, 200)
ax.set_xlabel("x (pixels)")
ax.set_ylabel("y (pixels)")
# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
# Check that the posterior2anterior vector in the egocentric coordinate system
# is parallel to the x-axis
# Compute centroid for anterior and posterior keypoints
# in egocentric coordinate system
centroid_anterior_rotated = position_egocentric.sel(
keypoints=anterior_keypoints
).mean(dim="keypoints", skipna=True)
centroid_posterior_rotated = position_egocentric.sel(
keypoints=posterior_keypoints
).mean(dim="keypoints", skipna=True)
# Compute posterior2anterior vector in egocentric coordinate system
posterior2anterior_rotated = (
centroid_anterior_rotated - centroid_posterior_rotated
)
# Check that the y-component of the vector is close to zero
print(np.nanmax(posterior2anterior_rotated.sel(space="y").data))
print(np.nanmin(posterior2anterior_rotated.sel(space="y").data))
# Check that the unit posterior2anterior vector is parallel to the x-axis
posterior2anterior_rotated_unit = convert_to_unit(posterior2anterior_rotated)
posterior2anterior_rotated_unit.plot.line(x="time", row="individuals")
```

Just for further context: for this issue we were thinking of a slightly more general approach. Maybe the user provides the desired origin of the (2D) egocentric coordinate system (in your case, the centroid), and a vector that will become the x-axis (in your case, the `posterior2anterior` vector).
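(To make that concrete, here is a hypothetical sketch of such a general helper; the name `to_egocentric` and its signature are my own, not part of movement's API. It just factors out the translate-then-rotate steps from the snippet above.)

```python
# Hypothetical sketch of the more general API floated above: express
# `position` in a coordinate system with a user-chosen per-frame origin
# and x-axis vector.
import xarray as xr

from movement.utils.vector import cart2pol, pol2cart


def to_egocentric(
    position: xr.DataArray,  # dims: time, individuals, keypoints, space
    origin: xr.DataArray,  # dims: time, individuals, space
    x_axis: xr.DataArray,  # dims: time, individuals, space
) -> xr.DataArray:
    """Translate to `origin`, then rotate so `x_axis` maps to (1, 0)."""
    translated_pol = cart2pol(position - origin)
    translated_pol.loc[{"space_pol": "phi"}] = (
        translated_pol.sel(space_pol="phi") - cart2pol(x_axis).sel(space_pol="phi")
    )
    return pol2cart(translated_pol)


# The snippet above would then reduce to:
# ds_egocentric["position"] = to_egocentric(
#     position, origin=centroid, x_axis=posterior2anterior
# )
```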
@DrSRMiller @katiekly fyi
Is your feature request related to a problem? Please describe.
For many analyses, it's super nice to have a way to align the animal such that the centroid is at the origin and the head is aligned to the x-axis. This is sometimes referred to as "egocentrization" since it places all the points in the animal-centric coordinate frame.
This is useful for: […]
Describe the solution you'd like
Here's the nicely vectorized implementation we've used for a while:
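(The code block that followed didn't survive extraction. As a placeholder, here is a hypothetical NumPy sketch of this kind of vectorized alignment; the function name and the `ctr_idx`/`fwd_idx` parameters are illustrative, not the actual implementation.)

```python
# Hypothetical sketch (not the original code): align poses so the centroid
# node sits at the origin and the forward node lies on the positive x-axis.
import numpy as np


def egocentrize(points: np.ndarray, ctr_idx: int, fwd_idx: int) -> np.ndarray:
    """Egocentrically align an array of poses.

    points: (frames, nodes, 2) array of x, y coordinates.
    ctr_idx, fwd_idx: node indices defining the centre and the heading node.
    """
    # Translate so the centre node is at the origin
    centred = points - points[:, ctr_idx : ctr_idx + 1, :]  # (frames, nodes, 2)
    # Heading angle of the forward node, per frame
    fwd = centred[:, fwd_idx, :]  # (frames, 2)
    theta = np.arctan2(fwd[:, 1], fwd[:, 0])  # (frames,)
    # Per-frame rotation matrices R(-theta), shape (frames, 2, 2)
    cos, sin = np.cos(-theta), np.sin(-theta)
    rot = np.stack([np.stack([cos, -sin], -1), np.stack([sin, cos], -1)], -2)
    # Rotate all nodes in all frames in one einsum
    return np.einsum("fij,fnj->fni", rot, centred)
```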
Describe alternatives you've considered
We've done it ourselves :)
Additional context
I know there's been some discussion on APIs and etc., so I don't want to step on your style guide. Given some guidance, happy to shoot a PR or have a student work on it :)