Screen Reader Support
Feature Detail
Description
Full compatibility with platform screen readers (VoiceOver on iOS, TalkBack on Android) across all app screens and interactive elements. Every button, form field, list item, and navigation element must carry semantic labels, roles, and state announcements so blind and low-vision users can operate the app without visual reference. This is a hard requirement for Norges Blindeforbund whose peer mentors and contacts include blind users, and it underpins the WCAG 2.2 AA conformance target. Implementation must include custom semantics for complex widgets (activity wizard steps, bottom navigation, modal sheets), live-region announcements for async state changes, and a warning mechanism when sensitive personal data fields are read aloud in public settings.
Analysis
Blindeforbundet cannot deploy the app without screen reader support — it is a non-negotiable prerequisite for onboarding their organisation. It also satisfies legal obligations under Norwegian universell utforming regulations and broadens the addressable user base across all four partner organisations.
Use Flutter's Semantics widget and SemanticsService throughout. Audit with both VoiceOver and TalkBack on real devices. Add a configurable 'sensitive field' flag to form fields that triggers an OS-level alert before the screen reader vocalises the content. Test every navigation transition and bottom-sheet open/close for correct focus placement.
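A minimal sketch of the labelling approach above: a custom control wrapped in a Semantics widget so VoiceOver/TalkBack announce its label, role, and selected state (the widget name FilterChipButton and the label format are illustrative assumptions, not actual app components):

```dart
import 'package:flutter/material.dart';

// Illustrative sketch: a filter chip whose label, button role, and
// active state are exposed to screen readers via Semantics.
class FilterChipButton extends StatelessWidget {
  const FilterChipButton({
    super.key,
    required this.label,
    required this.active,
    required this.onTap,
  });

  final String label;
  final bool active;
  final VoidCallback onTap;

  @override
  Widget build(BuildContext context) {
    return Semantics(
      button: true,
      selected: active,
      label: 'Filter: $label',
      value: active ? 'active' : 'inactive',
      onTap: onTap, // invoked by VoiceOver/TalkBack double-tap
      excludeSemantics: true, // replace the Chip's default semantics
      child: GestureDetector(
        onTap: onTap,
        child: Chip(label: Text(label)),
      ),
    );
  }
}
```

Setting excludeSemantics prevents the inner Chip from emitting its own, competing semantics node, so the screen reader hears exactly one announcement per control.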
Components (204)
Shared Components
These components are reused across multiple features
User Interface (59)
Service Layer (52)
Data Layer (33)
Infrastructure (54)
User Stories (14)
As a Peer Mentor (Likeperson)
I want to read, filter, and manage my notifications, and to configure notification preferences, using only a screen reader
So that I stay informed about assignment updates, certification reminders, and coordinator messages without being dependent on a sighted person to interpret notification content
- Given the peer mentor opens the notification centre with VoiceOver, when they navigate the list, then each item is announced as '[type]: [title], [body], [time ago], [read/unread]'
- Given the notification tab badge shows 3 unread notifications, when the screen reader is on the bottom nav, then it announces 'Notifications, 3 unread, tab, 5 of 5'
- Given the notification filter bar has 'Assignments' selected, when the screen reader focuses on it, then it announces 'Filter: Assignments, active'
- +3 more
As a Peer Mentor (Likeperson)
I want to hear spoken announcements when dynamic content changes on screen — such as new notifications, loading states, confirmation messages, or error alerts — without having to manually navigate to find the updated content
So that I am always aware of what is happening in the app without needing to visually scan the screen, enabling efficient and confident use of the app
- Given an activity is submitted, when the success confirmation appears, then a polite live region announces 'Activity registered successfully' within 500ms of the UI update
- Given a network error occurs during submission, when the error banner appears, then an assertive live region immediately announces the error message text
- Given the peer mentor is on the search screen and results load, when results appear, then a polite announcement reads 'X results found' where X is the count
- +3 more
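The polite/assertive announcements above map onto SemanticsService.announce; a sketch, assuming these helpers are called from the submission success and error handlers (function names are illustrative):

```dart
import 'package:flutter/material.dart';
import 'package:flutter/semantics.dart';

// Polite: queued after the current utterance finishes speaking.
Future<void> announceSubmissionSuccess() {
  return SemanticsService.announce(
    'Activity registered successfully',
    TextDirection.ltr,
  );
}

// Assertive: interrupts current speech, for errors that need
// immediate attention.
Future<void> announceSubmissionError(String message) {
  return SemanticsService.announce(
    message,
    TextDirection.ltr,
    assertiveness: Assertiveness.assertive,
  );
}
```

For content that stays on screen (e.g. an error banner), wrapping the widget in Semantics(liveRegion: true) is the complementary approach: the screen reader then announces the widget whenever its label changes, without an explicit announce call.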
As a Peer Mentor (Likeperson)
I want to fill in the post-session report form — including dynamic fields, the way-forward section, and any organisation-specific fields — entirely using a screen reader
So that I can document session outcomes, follow-up actions, and referrals for my contacts independently, fulfilling my formal reporting obligations without sighted assistance
- Given a peer mentor opens the post-session report with VoiceOver active, when the form loads, then all fields are announced with their label and input type (e.g., 'Health status, text field, required')
- Given a dynamic field is injected by the org-field-config-loader, when it appears, then a live region announces 'New field added: [field name]' and focus moves to it
- Given the way-forward section is expanded, when the peer mentor navigates into it, then task input fields and the date picker are all reachable and labelled
- +3 more
As a Peer Mentor (Likeperson)
I want the app to meet WCAG 2.2 AA standards consistently across all screens
So that I can trust the app to work correctly with my screen reader every time I use it, without encountering unexpected unlabelled elements, broken focus order, or missing announcements
- Given the accessibility-audit-service runs on a screen, when the audit completes, then it reports any elements missing labels, incorrect roles, or inadequate contrast as structured warnings
- Given the CI pipeline runs on a pull request, when the flutter-accessibility-lint-config rules are applied, then violations are reported as build failures for WCAG 2.2 AA criteria
- Given the accessibility-test-harness is used in an integration test, when simulating VoiceOver navigation through the activity registration wizard, then the test asserts that all 5 steps have correct step labels and focus management
- +3 more
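Part of the audit above can be automated with flutter_test's built-in accessibility guideline matchers; a sketch, where HomeScreen is a stand-in for any real app screen:

```dart
import 'package:flutter/material.dart';
import 'package:flutter_test/flutter_test.dart';

// Stand-in for a real screen under test (assumption, not an app widget).
class HomeScreen extends StatelessWidget {
  const HomeScreen({super.key});

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('Home')),
      body: const Center(child: Text('Welcome')),
    );
  }
}

void main() {
  testWidgets('screen passes baseline accessibility guidelines',
      (tester) async {
    final handle = tester.ensureSemantics();
    await tester.pumpWidget(const MaterialApp(home: HomeScreen()));

    // Tap targets must carry labels and meet minimum size; text must
    // meet contrast requirements.
    await expectLater(tester, meetsGuideline(labeledTapTargetGuideline));
    await expectLater(tester, meetsGuideline(androidTapTargetGuideline));
    await expectLater(tester, meetsGuideline(textContrastGuideline));

    handle.dispose();
  });
}
```

These matchers cover labelling, tap-target size, and contrast; focus order and announcement content still need the accessibility-test-harness and manual VoiceOver/TalkBack passes on real devices.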
As a Coordinator
I want to read, filter, and manage my notifications, and to configure notification preferences, using only a screen reader
So that I stay informed about assignment updates, certification reminders, and coordinator messages without being dependent on a sighted person to interpret notification content
- Given the coordinator opens the notification centre with VoiceOver, when they navigate the list, then each item is announced as '[type]: [title], [body], [time ago], [read/unread]'
- Given the notification tab badge shows 3 unread notifications, when the screen reader is on the bottom nav, then it announces 'Notifications, 3 unread, tab, 5 of 5'
- Given the notification filter bar has 'Assignments' selected, when the screen reader focuses on it, then it announces 'Filter: Assignments, active'
- +3 more
As a Coordinator
I want to hear spoken announcements when dynamic content changes on screen — such as new notifications, loading states, confirmation messages, or error alerts — without having to manually navigate to find the updated content
So that I am always aware of what is happening in the app without needing to visually scan the screen, enabling efficient and confident use of the app
- Given an activity is submitted, when the success confirmation appears, then a polite live region announces 'Activity registered successfully' within 500ms of the UI update
- Given a network error occurs during submission, when the error banner appears, then an assertive live region immediately announces the error message text
- Given the coordinator is on the search screen and results load, when results appear, then a polite announcement reads 'X results found' where X is the count
- +3 more
As a Coordinator
I want to fill in the post-session report form — including dynamic fields, the way-forward section, and any organisation-specific fields — entirely using a screen reader
So that I can document session outcomes, follow-up actions, and referrals for my contacts independently, fulfilling my formal reporting obligations without sighted assistance
- Given a coordinator opens the post-session report with VoiceOver active, when the form loads, then all fields are announced with their label and input type (e.g., 'Health status, text field, required')
- Given a dynamic field is injected by the org-field-config-loader, when it appears, then a live region announces 'New field added: [field name]' and focus moves to it
- Given the way-forward section is expanded, when the coordinator navigates into it, then task input fields and the date picker are all reachable and labelled
- +3 more
As a Coordinator
I want the app to meet WCAG 2.2 AA standards consistently across all screens
So that I can trust the app to work correctly with my screen reader every time I use it, without encountering unexpected unlabelled elements, broken focus order, or missing announcements
- Given the accessibility-audit-service runs on a screen, when the audit completes, then it reports any elements missing labels, incorrect roles, or inadequate contrast as structured warnings
- Given the CI pipeline runs on a pull request, when the flutter-accessibility-lint-config rules are applied, then violations are reported as build failures for WCAG 2.2 AA criteria
- Given the accessibility-test-harness is used in an integration test, when simulating VoiceOver navigation through the activity registration wizard, then the test asserts that all 5 steps have correct step labels and focus management
- +3 more
As a Peer Mentor (Likeperson)
I want to hear a spoken warning before the screen reader reads out sensitive personal data such as names, addresses, or medical details
So that I can prevent unintended disclosure of confidential contact information in public environments, maintaining privacy and trust with my contacts
- Given a contact detail screen with a phone number field, when the screen reader moves focus to it, then a live region announces 'Sensitive information: confirm to read' before the number is read
- Given the peer mentor confirms the sensitive field warning (e.g., double-tap), when the field content is read, then the full value is announced without the warning repeating
- Given the sensitive-field-configuration has 'email' marked as sensitive, when a screen reader focuses on an email field, then the warning is triggered for that field
- +3 more
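The sensitive-field warning above could be sketched as a widget that substitutes a warning for the real value until the user confirms with the standard double-tap gesture (the SensitiveField name and reveal flow are assumptions for illustration):

```dart
import 'package:flutter/material.dart';

// Sketch: hides a sensitive value from the screen reader until the
// user confirms; sighted users still see the value on screen.
class SensitiveField extends StatefulWidget {
  const SensitiveField({super.key, required this.label, required this.value});

  final String label;
  final String value;

  @override
  State<SensitiveField> createState() => _SensitiveFieldState();
}

class _SensitiveFieldState extends State<SensitiveField> {
  bool _revealed = false;

  @override
  Widget build(BuildContext context) {
    return Semantics(
      label: widget.label,
      // Until confirmed, the screen reader hears the warning instead
      // of the actual value.
      value: _revealed
          ? widget.value
          : 'Sensitive information: confirm to read',
      // VoiceOver/TalkBack double-tap on the focused field lands here.
      onTap: () => setState(() => _revealed = true),
      excludeSemantics: true, // suppress the Text child's own semantics
      child: Text(widget.value),
    );
  }
}
```

Which field types trigger the warning (phone, email, medical details) would be driven by the sensitive-field-configuration rather than hard-coded per widget.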
As a Peer Mentor (Likeperson)
I want to navigate all screens in the app using VoiceOver or TalkBack without relying on visual cues
So that I can use the app independently despite being visually impaired, fulfilling my peer support responsibilities without assistance
- Given a peer mentor with VoiceOver enabled, when they open the app, then the focus is placed on the first meaningful element on the screen with a correct label read aloud
- Given any screen in the app, when navigating with swipe-right (next element), then every interactive control has a descriptive accessibility label and a role (button, heading, link, etc.)
- Given a bottom navigation bar, when a peer mentor selects a tab with VoiceOver, then the selected state and tab name are announced (e.g., 'Home, tab, selected, 1 of 5')
- +3 more
As a Peer Mentor (Likeperson)
I want to complete the full activity registration wizard from start to finish using only VoiceOver or TalkBack without needing sighted assistance
So that I can independently log my peer support sessions, ensuring my work is recorded accurately and counts toward organisational reporting
- Given a peer mentor opens the activity registration bottom sheet with VoiceOver active, when the first step loads, then VoiceOver announces 'Activity Registration, Step 1 of 5: Select activity type'
- Given the peer mentor is on the date step, when they interact with the date picker, then the current selected date and available navigation controls are announced with their values
- Given the peer mentor advances to the next step, when the new step loads, then focus automatically moves to the step heading and the step number is announced
- +3 more
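The step-transition behaviour above combines focus movement with an announcement; a sketch of a handler called after each wizard step loads (parameter names are illustrative assumptions):

```dart
import 'package:flutter/material.dart';
import 'package:flutter/semantics.dart';

/// Sketch: run after the wizard advances to a new step. Assumes the
/// step heading widget is wrapped in a Focus node marked as a header
/// via Semantics(header: true).
void onWizardStepShown({
  required FocusNode headingNode,
  required int step,
  required int total,
  required String title,
}) {
  // Move focus to the step heading so swipe navigation resumes from
  // the top of the new step...
  headingNode.requestFocus();

  // ...and announce the transition immediately, so the user hears
  // where they are without hunting for the heading.
  SemanticsService.announce(
    'Step $step of $total: $title',
    TextDirection.ltr,
  );
}
```

Pairing the focus move with an explicit announcement avoids the common failure mode where the bottom sheet re-renders and the screen reader cursor silently lands on an arbitrary element.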
As a Coordinator
I want to hear a spoken warning before the screen reader reads out sensitive personal data such as names, addresses, or medical details
So that I can prevent unintended disclosure of confidential contact information in public environments, maintaining privacy and trust with my contacts
- Given a contact detail screen with a phone number field, when the screen reader moves focus to it, then a live region announces 'Sensitive information: confirm to read' before the number is read
- Given the coordinator confirms the sensitive field warning (e.g., double-tap), when the field content is read, then the full value is announced without the warning repeating
- Given the sensitive-field-configuration has 'email' marked as sensitive, when a screen reader focuses on an email field, then the warning is triggered for that field
- +3 more
As a Coordinator
I want to navigate all screens in the app using VoiceOver or TalkBack without relying on visual cues
So that I can use the app independently despite being visually impaired, fulfilling my coordination responsibilities without assistance
- Given a coordinator with VoiceOver enabled, when they open the app, then the focus is placed on the first meaningful element on the screen with a correct label read aloud
- Given any screen in the app, when navigating with swipe-right (next element), then every interactive control has a descriptive accessibility label and a role (button, heading, link, etc.)
- Given a bottom navigation bar, when a coordinator selects a tab with VoiceOver, then the selected state and tab name are announced (e.g., 'Home, tab, selected, 1 of 5')
- +3 more
As a Coordinator
I want to complete the full activity registration wizard from start to finish using only VoiceOver or TalkBack without needing sighted assistance
So that I can independently log peer support sessions, ensuring the work is recorded accurately and counts toward organisational reporting
- Given a coordinator opens the activity registration bottom sheet with VoiceOver active, when the first step loads, then VoiceOver announces 'Activity Registration, Step 1 of 5: Select activity type'
- Given the coordinator is on the date step, when they interact with the date picker, then the current selected date and available navigation controls are announced with their values
- Given the peer mentor advances to the next step, when the new step loads, then focus automatically moves to the step heading and the step number is announced
- +3 more