Need to Detect if person is walking or standing using yolo V8 or V11 #922

Open
MShahrukhkhan13 opened this issue Nov 10, 2024 · 2 comments
Labels
detect (Object Detection issues, PRs) · pose (Pose/keypoint estimation models) · question (Further information is requested)

Comments

@MShahrukhkhan13


Question

I am using the YOLOv8 and YOLOv11 pose models. With the pose model I can easily determine whether a person is standing or sitting, but is there any way to detect whether a person is walking?

Additional

For reference, I am using the code below. It works fine for standing and sitting but cannot detect walking.

```python
import math

import torch

# Thresholds for posture classification (example pixel values; tune for your camera setup)
threshold_standing = 70   # hip-to-ankle distance above which the person may be standing
threshold_sitting = 30    # hip-to-knee distance below which the person may be sitting
walking_threshold = 100   # horizontal offset between the hips used as a walking cue


def distance(pointA, pointB):
    """Euclidean distance between two 2D points."""
    return math.sqrt((pointA[0] - pointB[0]) ** 2 + (pointA[1] - pointB[1]) ** 2)


def angle(pointA, pointB, pointC):
    """Angle at pointB, in degrees, between the segments B->A and B->C."""
    vectorAB = [pointA[0] - pointB[0], pointA[1] - pointB[1]]
    vectorCB = [pointC[0] - pointB[0], pointC[1] - pointB[1]]

    dot_product = vectorAB[0] * vectorCB[0] + vectorAB[1] * vectorCB[1]
    magnitudeAB = math.sqrt(vectorAB[0] ** 2 + vectorAB[1] ** 2)
    magnitudeCB = math.sqrt(vectorCB[0] ** 2 + vectorCB[1] ** 2)

    cos_value = dot_product / (magnitudeAB * magnitudeCB + 1e-6)
    cos_value = max(-1.0, min(1.0, cos_value))  # clamp into acos's valid domain
    angle_rad = math.acos(cos_value)
    return math.degrees(angle_rad)


def classify_posture(keypoints):
    """Classify posture from the 17 COCO keypoints of one person."""
    # Check if keypoints is None or a tensor with no elements
    if keypoints is None or (isinstance(keypoints, torch.Tensor) and keypoints.numel() == 0):
        return None

    # Extract points for easier access (COCO keypoint order)
    nose = keypoints[0]
    left_eye, right_eye = keypoints[1], keypoints[2]
    left_ear, right_ear = keypoints[3], keypoints[4]
    left_shoulder, right_shoulder = keypoints[5], keypoints[6]
    left_elbow, right_elbow = keypoints[7], keypoints[8]
    left_wrist, right_wrist = keypoints[9], keypoints[10]
    left_hip, right_hip = keypoints[11], keypoints[12]
    left_knee, right_knee = keypoints[13], keypoints[14]
    left_ankle, right_ankle = keypoints[15], keypoints[16]

    # Calculate distances and angles for posture analysis
    hip_to_ankle_distance = distance(left_hip, left_ankle)
    knee_angle = angle(left_hip, left_knee, left_ankle)
    shoulder_alignment = abs(left_shoulder[1] - right_shoulder[1])
    hip_knee_distance = distance(left_hip, left_knee)

    # Check for standing posture
    if hip_to_ankle_distance > threshold_standing and knee_angle > 150:
        return "Standing"

    # Check for sitting posture
    if hip_knee_distance < threshold_sitting and knee_angle < 120:
        return "Sitting"

    # Check for walking posture (a single-frame heuristic, hence unreliable)
    if knee_angle > 150 and (left_hip[0] - right_hip[0]) > walking_threshold:
        return "Walking"

    return "Sitting"  # fallback when no rule matches
```
@MShahrukhkhan13 MShahrukhkhan13 added the question Further information is requested label Nov 10, 2024
@UltralyticsAssistant UltralyticsAssistant added detect Object Detection issues, PR's pose Pose/keypoint estimation models labels Nov 10, 2024
@UltralyticsAssistant
Member

👋 Hello @MShahrukhkhan13, thank you for raising an issue about Ultralytics HUB 🚀! An Ultralytics engineer will review your query and provide assistance soon. In the meantime, you might find our HUB Docs helpful for understanding more about the platform and capabilities:

Since you are dealing with posture and motion detection, it might be beneficial to check out sections related to model training for insights.

If this is a 🐛 Bug Report, please provide any additional information or code snippets that can help reproduce the issue. A minimum reproducible example (MRE) would be particularly helpful. You can refer to our guide on creating a minimum reproducible example.

If you have any ❓ questions, please include specifics such as the dataset, any pretrained models used, and details about your environment setup to help us provide the most accurate advice.

Thanks for your patience as we work to assist you! 😊

@pderrenger
Member

@MShahrukhkhan13 hello! 😊

Detecting whether a person is walking using pose estimation can be a bit more complex than detecting standing or sitting, as walking involves dynamic movement. Here are a few suggestions to improve your walking detection:

  1. Keypoint Movement Over Time: Walking involves a sequence of movements. You might want to track the movement of keypoints over several frames to detect walking. For instance, you can track the movement of the hips and knees to see if they are moving in a periodic manner.

  2. Velocity and Trajectory Analysis: Calculate the velocity of keypoints like the ankles or knees over consecutive frames. Walking typically involves a consistent forward movement of these keypoints.

  3. Angle Dynamics: Instead of just checking static angles, observe how angles like the knee angle change over time. Walking usually involves alternating flexion and extension of the knees.

  4. Machine Learning Approach: Consider training a simple classifier using features extracted from the pose keypoints over time. This could be a more robust solution if you have access to labeled data of walking vs. non-walking.
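As a rough sketch of point 2 (velocity analysis): the function names, COCO keypoint indices (15/16 for the ankles), frame rate, and speed threshold below are illustrative assumptions, not an Ultralytics API.

```python
import math

def keypoint_speed(keypoints_sequence, kp_index, fps=30.0):
    """Average speed (pixels/second) of one keypoint across a frame sequence.

    Assumes keypoints_sequence is a list of per-frame keypoint arrays in
    COCO order, where each keypoint is an (x, y) pair.
    """
    speeds = []
    for prev, curr in zip(keypoints_sequence, keypoints_sequence[1:]):
        dx = curr[kp_index][0] - prev[kp_index][0]
        dy = curr[kp_index][1] - prev[kp_index][1]
        speeds.append(math.hypot(dx, dy) * fps)
    return sum(speeds) / len(speeds) if speeds else 0.0

def looks_like_walking(keypoints_sequence, speed_threshold=50.0, fps=30.0):
    """Flag walking if either ankle moves faster than a tuned threshold."""
    left = keypoint_speed(keypoints_sequence, 15, fps)   # left ankle
    right = keypoint_speed(keypoints_sequence, 16, fps)  # right ankle
    return max(left, right) > speed_threshold
```

In practice you would buffer the keypoints from the last N frames of your YOLO results and feed that buffer to `looks_like_walking`, tuning the threshold per camera.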

Here's a simple example of how you might start tracking keypoint movement over time:

```python
# Example: Track movement of left ankle over time
def track_movement(keypoints_sequence):
    movements = []
    for i in range(1, len(keypoints_sequence)):
        prev_keypoints = keypoints_sequence[i - 1]
        curr_keypoints = keypoints_sequence[i]
        movement = distance(prev_keypoints[15], curr_keypoints[15])  # Left ankle
        movements.append(movement)
    return movements

# Use this function to analyze the movement pattern
movements = track_movement(keypoints_sequence)
```

This approach can help you identify patterns characteristic of walking. Remember, the key is to analyze the dynamics over time rather than a single frame.
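Point 3 (angle dynamics) could be sketched in a similar spirit: count flexion/extension alternations in the knee-angle time series. The threshold values here are illustrative and would need tuning against real gait data.

```python
def count_flex_extend_cycles(knee_angles, flex_deg=140.0, extend_deg=165.0):
    """Count flexion -> extension transitions in a sequence of knee angles.

    Walking tends to alternate bending (angle below flex_deg) and
    straightening (angle above extend_deg) of the knee; a static pose
    produces few or no such cycles.
    """
    cycles = 0
    flexed = False
    for a in knee_angles:
        if a < flex_deg:
            flexed = True
        elif a > extend_deg and flexed:
            cycles += 1
            flexed = False
    return cycles
```

Several cycles within a short window (say, one second of frames) would be a stronger walking signal than any single-frame rule.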

I hope this helps you enhance your walking detection! If you have further questions or need more assistance, feel free to ask. 😊
