Egocentric coordinate transformation #239

Open
talmo opened this issue Jul 16, 2024 · 4 comments
Labels
enhancement New optional feature

talmo commented Jul 16, 2024

Is your feature request related to a problem? Please describe.
For many analyses, it's super nice to have a way to align the animal such that the centroid is at the origin and the head is aligned to the x-axis. This is sometimes referred to as "egocentrization" since it places all the points in the animal-centric coordinate frame.

This is useful for many downstream analyses.

Describe the solution you'd like

Here's the nicely vectorized implementation we've used for a while:

import numpy as np


def signed_angle(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Finds the signed angle between two 2D vectors a and b.

    Args:
        a: Array of shape (n, 2).
        b: Array of shape (n, 2).

    Returns:
        The signed angles in degrees, as a vector of shape (n,).

        This angle is positive if a is rotated clockwise to align to b and negative if
        this rotation is counter-clockwise.
    """
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    theta = np.arccos(np.around(np.sum(a * b, axis=1), decimals=4))
    cross = np.cross(a, b, axis=1)
    sign = np.zeros(cross.shape)
    sign[cross >= 0] = -1
    sign[cross < 0] = 1
    return np.rad2deg(theta) * sign


def normalize_to_egocentric(
    x: np.ndarray,
    rel_to: np.ndarray | None = None,
    ctr_ind: int = 0,
    fwd_ind: int = 1,
    return_angles: bool = False,
):
    """Normalize pose estimates to egocentric coordinates.

    Args:
        x: Pose of shape (joints, 2) or (time, joints, 2)
        rel_to: Pose to align x with of shape (joints, 2) or (time, joints, 2). Defaults
            to x if not specified. Useful for aligning other individuals relative to the
            centered animal.
        ctr_ind: Index of centroid joint. Defaults to 0.
        fwd_ind: Index of "forward" joint (e.g., head). Defaults to 1.
        return_angles: If True, return angles with the aligned coordinates.

    Returns:
        Egocentrically aligned poses of the same shape as the input.

        If return_angles is True, also returns a vector of angles.
    """

    if rel_to is None:
        rel_to = x

    is_singleton = (x.ndim == 2) and (rel_to.ndim == 2)

    if x.ndim == 2:
        x = np.expand_dims(x, axis=0)
    if rel_to.ndim == 2:
        rel_to = np.expand_dims(rel_to, axis=0)

    # Find egocentric forward coordinates.
    ctr = rel_to[..., ctr_ind, :]  # (t, 2)
    fwd = rel_to[..., fwd_ind, :]  # (t, 2)
    ego_fwd = fwd - ctr

    # Compute angle.
    ang = np.arctan2(
        ego_fwd[..., 1], ego_fwd[..., 0]
    )  # arctan2(y, x) -> radians in [-pi, pi]
    ca = np.cos(ang)  # (t,)
    sa = np.sin(ang)  # (t,)

    # Build rotation matrix.
    rot = np.zeros([len(ca), 3, 3], dtype=ca.dtype)
    rot[..., 0, 0] = ca
    rot[..., 0, 1] = -sa
    rot[..., 1, 0] = sa
    rot[..., 1, 1] = ca
    rot[..., 2, 2] = 1

    # Center.
    x = x - np.expand_dims(ctr, axis=1)

    # Pad, rotate and crop.
    x = np.pad(x, ((0, 0), (0, 0), (0, 1)), "constant", constant_values=1) @ rot
    x = x[..., :2]

    if is_singleton:
        x = x[0]

    if return_angles:
        return x, ang
    else:
        return x
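
For reference, here is a minimal standalone sanity check of the centre-then-rotate logic (plain numpy, not calling the function above, with a made-up two-keypoint pose): a pose whose "forward" keypoint sits 45 degrees from the centroid should end up with the centroid at the origin and the forward keypoint on the positive x-axis.

```python
import numpy as np

# Toy pose: centroid at (2, 3), "forward" keypoint at (3, 4), i.e. a 45 deg heading.
pose = np.array([[2.0, 3.0], [3.0, 4.0]])

ctr, fwd = pose[0], pose[1]
d = fwd - ctr
ang = np.arctan2(d[1], d[0])  # heading angle, arctan2(y, x)

# Rotation by -ang, applied to row vectors as v @ R.T
c, s = np.cos(ang), np.sin(ang)
rot = np.array([[c, s], [-s, c]])
aligned = (pose - ctr) @ rot.T

print(np.round(aligned, 6))  # centroid -> (0, 0), forward keypoint -> (sqrt(2), 0)
```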

Describe alternatives you've considered
We've done it ourselves :)

Additional context
I know there's been some discussion on APIs and so on, so I don't want to step on your style guide. Given some guidance, happy to shoot a PR or have a student work on it :)

@talmo talmo added the enhancement New optional feature label Jul 16, 2024
@niksirbi
Member

Thanks @talmo for such a thorough description. This is indeed a very useful feature and we've been planning to add it from the project's conception.

Your sketched-out implementation makes sense to me conceptually, and it could fit well into the codebase with a few modifications. The functions would have to accept and return xarray.DataArray objects (i.e. data variables) instead of numpy arrays, similar to the existing functions in our kinematics and filtering modules.

The best place for the main function would probably be a new transforms.py module, as we are likely to add more coordinate transforms in future. I envision something along the lines of from movement.transforms import to_egocentric. Utilities like signed_angle can go into utils/vector.py.

I'm happy for you or someone from your team to have a shot at opening a draft pull request to get this started. If the need arises, I'd also be willing to hop on a call to explain our thoughts on the architecture. If our ongoing API changes get in the way of merging, we can always step in and help with that.


hummuscience commented Oct 21, 2024

I needed to do egocentric alignment for my data and tried to do this with cart2pol/pol2cart. I am still not sure if this is the correct approach (find the heading angle and subtract it from each keypoint's polar angle):

import xarray as xr

from movement.utils.vector import cart2pol, pol2cart


def align_egocentric_xarray(Y, anterior_keypoints=None, posterior_keypoints=None):

    if anterior_keypoints is None:
        anterior_keypoints = ['head_midpoint','right_ear','left_ear']
    if posterior_keypoints is None:
        posterior_keypoints = ['tail_base','mid_backend3']

    if not isinstance(Y, xr.Dataset):
        raise TypeError("Y must be an xarray.Dataset containing a 'position' DataArray.")

    Y_pos = Y['position']
    
    posterior_loc = Y_pos.sel(keypoints=posterior_keypoints).mean(dim='keypoints', skipna=True)
    anterior_loc = Y_pos.sel(keypoints=anterior_keypoints).mean(dim='keypoints', skipna=True)

    direction = anterior_loc - posterior_loc

    direction_pol = cart2pol(direction)
    h = direction_pol.sel(space_pol='phi')

    v = Y_pos.mean(dim='keypoints')

    Y_centered = Y_pos - v

    Y_centered_pol = cart2pol(Y_centered)

    Y_aligned_pol = Y_centered_pol.copy()
    Y_aligned_pol.loc[{'space_pol': 'phi'}] = Y_centered_pol.sel(space_pol='phi') - h

    Y_aligned = pol2cart(Y_aligned_pol)

    Y_aligned_ds = Y.copy()

    Y_aligned_ds['aligned_pos'] = Y_aligned
    Y_aligned_ds['centroid'] = v
    Y_aligned_ds['heading_angle'] = h

    return Y_aligned_ds

I tried it out with some mouse data that I have and things looked okay. Does that make sense?


sfmig commented Oct 24, 2024

Thanks for sharing your snippet @hummuscience!

Great use of our available functionality :) I reproduced your approach with one of our sample datasets and it looks correct. I formalised the problem a bit; I'm sharing my thinking below in case it's useful.

IIUC, you want to express each individual's keypoints in an egocentric coordinate system, which is defined for each individual as follows:

  • the origin of the coordinate system is at the centroid of the individual,
  • the positive x-axis of the coordinate system is parallel to the posterior2anterior vector of the individual.

The posterior2anterior vector is the vector that goes from the mean of the keypoints labelled as 'posterior', to the mean of the keypoints labelled as 'anterior'. Note that as a result each egocentric coordinate system changes its position and orientation with time.
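
This construction (subtract the posterior2anterior angle from each keypoint's polar angle) can be sketched in plain numpy, standing in for movement's cart2pol/pol2cart helpers; the points and heading below are made up for illustration:

```python
import numpy as np

# Centered keypoints and a 90-degree heading (animal facing +y).
pts = np.array([[0.0, 1.0], [-1.0, 0.0]])
heading = np.pi / 2

# cart -> polar (rho, phi), as cart2pol would do
rho = np.hypot(pts[:, 0], pts[:, 1])
phi = np.arctan2(pts[:, 1], pts[:, 0])

# Subtract the heading from phi, then polar -> cart, as pol2cart would do
phi_ego = phi - heading
aligned = np.stack([rho * np.cos(phi_ego), rho * np.sin(phi_ego)], axis=1)

print(np.round(aligned, 6))  # (0, 1) -> (1, 0) and (-1, 0) -> (0, 1)
```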

In the egocentric coordinate system of the one individual in my dataset, the keypoints' trajectories look like this:
[Figure: keypoints' trajectories plotted in the egocentric coordinate system]

I paste my (longish) snippet below in case you want to have a look. Note that in my sample dataset I only have one individual but with xarray the approach should scale to more without modifications.

Plot keypoints in egocentric coordinate system
# %%
import matplotlib.pyplot as plt
import numpy as np
import xarray as xr

from movement import sample_data
from movement.io import load_poses
from movement.utils.vector import cart2pol, convert_to_unit, pol2cart

# For interactive plots; requires `pip install ipympl`
# %matplotlib widget

# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
# Import sample data
# one individual, 6 keypoints

ds_path = sample_data.fetch_dataset_paths(
    "SLEAP_single-mouse_EPM.analysis.h5"
)["poses"]
ds = load_poses.from_sleap_file(ds_path, fps=None)  # force time_unit = frames


print(ds)
# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
# Define anterior and posterior keypoints

anterior_keypoints = ["snout", "left_ear", "right_ear"]
posterior_keypoints = ["centre", "tail_base"]

# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
# Compute centroids

# get position data array
position = ds.position

# Compute centroid per individual
centroid = position.mean(dim="keypoints")  # v

# Compute centroid for anterior and posterior keypoints
centroid_anterior = position.sel(keypoints=anterior_keypoints).mean(
    dim="keypoints", skipna=True
)
centroid_posterior = position.sel(keypoints=posterior_keypoints).mean(
    dim="keypoints", skipna=True
)


# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
# Compute posterior2anterior vector per individual

# Compute vector from posterior to anterior centroid
posterior2anterior = centroid_anterior - centroid_posterior

# Compute polar angle of posterior2anterior vector
# the angle theta is positive going from the positive x-axis to the positive y-axis
posterior2anterior_pol = cart2pol(posterior2anterior)
# theta = posterior2anterior_pol.sel(space_pol="phi")  # h

# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
# Compute coordinates in egocentric coordinate system

# Compute position in image coord system translated
position_translated = position - centroid  # Y_centered
position_translated_pol = cart2pol(position_translated)  # Y_centered_pol

# Compute position in egocentric coordinate system
position_egocentric_pol = position_translated_pol.copy()

# rho is the same as in the translated image coordinate system
# phi angle is measured relative to the phi angle of the `posterior2anterior` vector
position_egocentric_pol.loc[{"space_pol": "phi"}] = (
    position_translated_pol.sel(space_pol="phi")
    - posterior2anterior_pol.sel(
        space_pol="phi"
    )  # angle_relative_to_posterior2anterior
)

# Convert rotated position coordinates to cartesian
position_egocentric = pol2cart(position_egocentric_pol)

# Create a dataset with the `position` data array holding the egocentric coordinates
# of the keypoints
ds_egocentric = ds.copy()
ds_egocentric["position"] = position_egocentric  

# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
# Check by plotting the keypoints' trajectories in the egocentric coordinate system

fig, ax = plt.subplots(1, 1)
for kpt in ds_egocentric.coords["keypoints"].data:
    ax.scatter(
        x=ds_egocentric.position.sel(keypoints=kpt, space="x"),
        y=ds_egocentric.position.sel(keypoints=kpt, space="y"),
        label=kpt,
        alpha=0.5,
    )
    # add axes of egocentric coordinate system
    ax.quiver(
        [0],  # x
        [0],  # y
        [100],  # u
        [0],  # v
        color="r",
        angles="xy",
        scale=1,
        scale_units="xy",
    )  # x-axis in red
    ax.quiver(
        [0],  # x
        [0],  # y
        [0],  # u
        [100],  # v
        color="g",
        angles="xy",
        scale=1,
        scale_units="xy",
    )  # y-axis in green

ax.legend()
ax.invert_yaxis()
ax.axis("equal")
ax.set_xlim(-200, 200)
ax.set_xlabel("x (pixels)")
ax.set_ylabel("y (pixels)")

# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
# Check that the posterior2anterior vector in the egocentric coordinate system
# is parallel to the x-axis

# Compute centroid for anterior and posterior keypoints
# in egocentric coordinate system
centroid_anterior_rotated = position_egocentric.sel(
    keypoints=anterior_keypoints
).mean(dim="keypoints", skipna=True)
centroid_posterior_rotated = position_egocentric.sel(
    keypoints=posterior_keypoints
).mean(dim="keypoints", skipna=True)

# Compute posterior2anterior vector in egocentric coordinate system
posterior2anterior_rotated = (
    centroid_anterior_rotated - centroid_posterior_rotated
)

# Check that the y-component of the vector is close to zero
print(np.nanmax(posterior2anterior_rotated.sel(space="y").data))
print(np.nanmin(posterior2anterior_rotated.sel(space="y").data))

# Check that the unit posterior2anterior vector is parallel to the x-axis
posterior2anterior_rotated_unit = convert_to_unit(posterior2anterior_rotated)
posterior2anterior_rotated_unit.plot.line(x="time", row="individuals")

Just for further context, for this issue we were thinking of a slightly more general approach. Maybe the user provides the desired origin of the (2D) egocentric coordinate system (in your case, the centroid), and a vector that will become the x-axis (in your case, the posterior2anterior vector). In the body of the function we can compute the transformation (translation & rotation) that we need to apply to the keypoints to express them in the egocentric coordinate system (maybe using scipy's rotation objects and methods) and return the position data in that coordinate system.
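
A rough sketch of what that more general API might look like (the name to_egocentric and its signature are hypothetical; scipy's Rotation stands in for the transform machinery mentioned above):

```python
import numpy as np
from scipy.spatial.transform import Rotation


def to_egocentric(
    points: np.ndarray, origin: np.ndarray, x_axis: np.ndarray
) -> np.ndarray:
    """Express 2D points in a frame with `origin` at (0, 0) and `x_axis` along +x.

    points: (n, 2); origin: (2,); x_axis: (2,). Hypothetical API sketch.
    """
    ang = np.arctan2(x_axis[1], x_axis[0])
    rot = Rotation.from_euler("z", -ang)  # undo the heading rotation
    centered = np.pad(points - origin, ((0, 0), (0, 1)))  # lift to 3D with z = 0
    return rot.apply(centered)[:, :2]


# Toy example: origin at the first point, x-axis along the first-to-second vector
pts = np.array([[2.0, 3.0], [3.0, 4.0]])
print(np.round(to_egocentric(pts, origin=pts[0], x_axis=pts[1] - pts[0]), 6))
```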

@luiztauffer

@DrSRMiller @katiekly fyi
