Implement iOS VoiceOver support #18016
base: master
You can test this PR using the following package version.
What does the pull request do?
This is a twin PR to #17704 that implements support for VoiceOver on iOS devices. Importantly, the `UIAccessibility` and `UIAccessibilityContainer` informal protocols are leveraged to bridge Avalonia's automation system with the iOS accessibility ecosystem.
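For context, the practical effect of that bridge is that the cross-platform automation metadata app authors already set through `AutomationProperties` should be what VoiceOver ends up announcing. A minimal, hypothetical app-side sketch (the button and its labels are made up, and the exact mapping from `AutomationProperties` to VoiceOver attributes is an assumption, not something stated in this PR):

```csharp
using Avalonia.Automation;
using Avalonia.Controls;

// Hypothetical app-side snippet: the same cross-platform automation metadata
// used by the Windows/macOS backends should surface through this PR's bridge.
var submit = new Button { Content = "Submit" };

// AutomationPeer.GetName() typically resolves to AutomationProperties.Name,
// which a VoiceOver bridge would be expected to expose as the element's label.
AutomationProperties.SetName(submit, "Submit order");
AutomationProperties.SetHelpText(submit, "Places the order and charges your card");
```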
What is the current behavior?
Currently, Avalonia apps targeting the iOS platform do not expose user controls to the operating system's accessibility pipeline. This makes it impossible for blind and visually impaired users to use those apps with a screen reader or braille display.
What is the updated/expected behavior with this PR?
This PR enables users who rely on VoiceOver to navigate Avalonia's view hierarchy using the appropriate, well-known shortcuts and gestures.
How was the solution implemented (if it's not obvious)?
The PR implements support for VoiceOver via Avalonia's `AutomationPeer` API by wrapping peer instances in a subclass of `UIAccessibilityElement` that implements the necessary formal and informal protocols to create a fully realized experience for blind and low-vision users.
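A rough sketch of what such a wrapper could look like against the .NET UIKit bindings; the class name, the coordinate handling, and the choice to project only the peer's name and bounds are illustrative assumptions, not the PR's actual code:

```csharp
using Avalonia.Automation.Peers;
using CoreGraphics;
using Foundation;
using UIKit;

// Hypothetical wrapper: projects one Avalonia AutomationPeer onto the
// UIAccessibility informal protocol so VoiceOver can focus and announce it.
public class AutomationPeerElement : UIAccessibilityElement
{
    private readonly AutomationPeer _peer;

    public AutomationPeerElement(NSObject container, AutomationPeer peer)
        : base(container)
    {
        _peer = peer;
        IsAccessibilityElement = true;
    }

    // VoiceOver reads the label when the element gains accessibility focus.
    public override string AccessibilityLabel
    {
        get => _peer.GetName();
        set { /* the peer is the source of truth */ }
    }

    // VoiceOver uses the frame for its focus rectangle and touch exploration;
    // a real implementation would convert from Avalonia's top-level
    // coordinates to UIKit screen coordinates rather than copying directly.
    public override CGRect AccessibilityFrame
    {
        get
        {
            var r = _peer.GetBoundingRectangle();
            return new CGRect(r.X, r.Y, r.Width, r.Height);
        }
        set { /* computed from the peer */ }
    }
}
```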
Checklist