Presentation: https://docs.google.com/presentation/d/1Se3ak2f9EZfd2m_n0qNktOhXFpqmqA9pYDe2H4IXAq8/edit?usp=sharing
A hands-free, camera-free wearable auditory aid that gives real-time feedback on the user's 360° surroundings (both moving and static objects), enhancing mobility for visually impaired people navigating urban environments. It uses a LiDAR sensor and our novel low-memory algorithm to label and track point-cloud 'segments' across frames in real time. It is the first device of its kind to be housed in a single, centralized component and to perform all processing locally (no cloud).
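
The project's actual segmentation and tracking algorithm is not detailed here, so the sketch below is only an illustration of the general idea under assumed parameters (cluster radius, match distance, etc.): label a frame's point cloud into segments, keep only a per-segment centroid (so memory stays proportional to the number of segments rather than the number of points), and match segments across consecutive frames by nearest centroid. It is not a reimplementation of the device's method.

```python
# Illustrative sketch only; the real algorithm on the device may differ substantially.
import numpy as np
from scipy.spatial import cKDTree


def segment_frame(points, radius=0.3, min_points=5):
    """Label points into segments with simple region growing on a k-d tree.

    Returns an integer label per point; -2 marks noise (clusters smaller
    than min_points), -1 never survives past this function.
    """
    tree = cKDTree(points)
    labels = -np.ones(len(points), dtype=int)  # -1 = unvisited
    next_label = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        labels[seed] = next_label
        stack, members = [seed], []
        while stack:
            idx = stack.pop()
            members.append(idx)
            for nb in tree.query_ball_point(points[idx], radius):
                if labels[nb] == -1:
                    labels[nb] = next_label
                    stack.append(nb)
        if len(members) < min_points:
            labels[members] = -2  # too small: mark as noise
        else:
            next_label += 1
    return labels


def summarize(points, labels):
    """Keep only a centroid per segment: O(#segments) memory, not O(#points)."""
    return {l: points[labels == l].mean(axis=0) for l in set(labels) if l >= 0}


def track(prev_centroids, curr_centroids, max_dist=0.5):
    """Greedily match current segments to previous ones by nearest centroid."""
    matches, used = {}, set()
    for cid, c in curr_centroids.items():
        best, best_d = None, max_dist
        for pid, p in prev_centroids.items():
            d = np.linalg.norm(c - p)
            if d < best_d and pid not in used:
                best, best_d = pid, d
        if best is not None:
            matches[cid] = best
            used.add(best)
    return matches  # current segment id -> previous segment id


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two synthetic objects; the second frame shifts both slightly.
    frame1 = np.vstack([rng.normal([0, 0, 0], 0.05, (50, 3)),
                        rng.normal([2, 0, 0], 0.05, (50, 3))])
    frame2 = frame1 + np.array([0.1, 0.0, 0.0])
    c1 = summarize(frame1, segment_frame(frame1))
    c2 = summarize(frame2, segment_frame(frame2))
    print(track(c1, c2))  # segments keep their identities across frames
```

Carrying only segment summaries (centroids, optionally bounding boxes and velocities) between frames is one way to keep memory and compute low enough for fully local, on-device processing.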