Add WCAG 2.2 AA semantics to all steps
epic-quick-activity-registration-wizard-ui-task-011 — Audit and annotate every interactive element across all five step widgets with Flutter Semantics: labels, hints, roles (button, radio, textField), and live regions. Verify with TalkBack (Android) and VoiceOver (iOS) that focus order is logical, all chips are individually reachable, and form submission is clearly announced.
Acceptance Criteria
Technical Requirements
Execution Context
Tier 3 - 413 tasks
Can start after Tier 2 completes
Implementation Notes
Start with a semantics audit using flutter run --profile and the Flutter DevTools Semantics view before writing any code — identify all gaps first, then fix. Flutter's Material buttons already carry button semantics, so avoid wrapping them in redundant Semantics widgets; reserve explicit Semantics(label: '...', button: true, onTap: ...) for custom tappable widgets (e.g. GestureDetector-based controls), and for standard buttons only customize labels and add state annotations. For chips: wrap each chip in Semantics(label: '${type.name}, ${isSelected ? 'selected' : 'not selected'}', selected: isSelected).
For the step progress indicator: use Semantics(liveRegion: true, label: 'Step $currentStep of 5'). For decorative icons and dividers: use ExcludeSemantics to remove them from the semantics tree to reduce noise. For the notes TextFormField: the hint text serves as the accessible label only if no separate label widget exists — ensure a visible label or set Semantics(label: 'Notes') explicitly. Test early on actual devices — Flutter's semantics emulation in tests does not fully replicate screen reader behavior.
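The chip and progress-indicator annotations described above might look like the following sketch. Widget names (ActivityTypeChip, StepProgressLabel) and the onSelected callback are illustrative, not names from the codebase:

```dart
import 'package:flutter/material.dart';

/// Illustrative chip wrapper that announces selection state to screen readers.
class ActivityTypeChip extends StatelessWidget {
  const ActivityTypeChip({
    super.key,
    required this.typeName,
    required this.isSelected,
    required this.onSelected,
  });

  final String typeName;
  final bool isSelected;
  final VoidCallback onSelected;

  @override
  Widget build(BuildContext context) {
    return Semantics(
      label: '$typeName, ${isSelected ? 'selected' : 'not selected'}',
      selected: isSelected,
      button: true,
      onTap: onSelected, // keeps double-tap activation working for screen readers
      child: ExcludeSemantics(
        // Exclude the chip's own semantics so the label is not announced twice.
        child: ChoiceChip(
          label: Text(typeName),
          selected: isSelected,
          onSelected: (_) => onSelected(),
        ),
      ),
    );
  }
}

/// Step progress indicator announced as a live region on every change.
class StepProgressLabel extends StatelessWidget {
  const StepProgressLabel({super.key, required this.currentStep});
  final int currentStep;

  @override
  Widget build(BuildContext context) {
    return Semantics(
      liveRegion: true,
      label: 'Step $currentStep of 5',
      child: ExcludeSemantics(child: Text('$currentStep / 5')),
    );
  }
}
```

The ExcludeSemantics children keep the visual widgets intact while the outer Semantics node becomes the single source of truth for assistive technology.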
Per the workshop findings, Blindeforbundet users with VoiceOver/JAWS are a primary target audience — this task is non-negotiable for launch.
Testing Requirements
Automated semantics tests using flutter_test: call tester.ensureSemantics() to enable the semantics tree, then query nodes with tester.getSemantics(finder). Test: every interactive element in each step has a non-empty semantics label. Test: chips expose the correct 'selected'/'not selected' state in the semantics tree. Test: a live region is present on the step progress indicator.
Test: focus order using SemanticsNode traversal matches expected top-to-bottom order. Manual device testing: TalkBack on Android, VoiceOver on iOS — document test results in a checklist. Use the Flutter Inspector's Semantics view during development to verify the tree continuously. Regression: add a semantics smoke test to CI that runs on every PR.
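A minimal sketch of such a semantics test, using a stand-in widget rather than the real chip (the label 'Walking, selected' is illustrative):

```dart
import 'package:flutter/material.dart';
import 'package:flutter_test/flutter_test.dart';

void main() {
  testWidgets('chip announces selection state', (tester) async {
    // Enable the semantics tree for this test; dispose the handle afterwards.
    final handle = tester.ensureSemantics();

    // Stand-in for a wizard chip; the real test pumps the production widget.
    await tester.pumpWidget(MaterialApp(
      home: Semantics(
        label: 'Walking, selected',
        selected: true,
        button: true,
        child: const ExcludeSemantics(child: Text('Walking')),
      ),
    ));

    // Assert the semantics node carries the label, selected, and button flags.
    expect(
      tester.getSemantics(find.bySemanticsLabel('Walking, selected')),
      matchesSemantics(
        label: 'Walking, selected',
        isSelected: true,
        isButton: true,
      ),
    );
    handle.dispose();
  });
}
```

The same pattern extends to asserting liveRegion on the progress indicator and iterating over every interactive element per step.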
As wizard steps accumulate additional features (duplicate warning, retroactive date chips, custom duration entry), the two-tap happy path may inadvertently require extra interactions. A step that previously auto-advanced may start requiring a confirmation tap, breaking the core promise of the feature and increasing friction for high-frequency users like HLF's 380-registration peer mentor.
Mitigation & Contingency
Mitigation: Define and automate a regression test that performs the complete two-tap happy path (open bottom sheet → confirm → confirm) and asserts the confirmation view is reached in exactly two tap events. Run this test in CI on every PR touching the wizard. Treat any failure as a blocking defect.
Contingency: If a new feature unavoidably adds a tap to the happy path, provide a 'quick mode' toggle in user settings that collapses the wizard to a single-confirmation screen for users who never change defaults.
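The two-tap regression guard described in the mitigation could be sketched as a widget test. The TwoTapWizard widget below is a self-contained stand-in; the real CI test would pump the production wizard and use its actual keys:

```dart
import 'package:flutter/material.dart';
import 'package:flutter_test/flutter_test.dart';

/// Minimal stand-in: two confirm taps lead to a confirmation view.
class TwoTapWizard extends StatefulWidget {
  const TwoTapWizard({super.key});
  @override
  State<TwoTapWizard> createState() => _TwoTapWizardState();
}

class _TwoTapWizardState extends State<TwoTapWizard> {
  int taps = 0;

  @override
  Widget build(BuildContext context) {
    if (taps >= 2) {
      return const Text('Registered', key: Key('confirmation'));
    }
    return TextButton(
      key: const Key('confirm'),
      onPressed: () => setState(() => taps++),
      child: const Text('Confirm'),
    );
  }
}

void main() {
  testWidgets('happy path completes in exactly two taps', (tester) async {
    await tester.pumpWidget(const MaterialApp(home: TwoTapWizard()));

    // Tap 1: confirm the pre-filled step.
    await tester.tap(find.byKey(const Key('confirm')));
    await tester.pump();

    // Tap 2: confirm registration — confirmation view must now be visible.
    await tester.tap(find.byKey(const Key('confirm')));
    await tester.pump();
    expect(find.byKey(const Key('confirmation')), findsOneWidget);
  });
}
```

Treating any third required tap as a test failure makes the two-tap promise an enforced invariant rather than a design intention.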
Flutter modal bottom sheets are dismissed by default on back-button press or a tap on the barrier (the dimmed background). If the wizard state is not preserved, a peer mentor who accidentally dismisses mid-flow loses all entered data and must start over — a significant frustration for users with cognitive disabilities or motor impairments who take longer to fill in each step.
Mitigation & Contingency
Mitigation: Implement the wizard state as a persistent Cubit that outlives the bottom sheet widget's lifecycle, scoped to the registration feature route. On re-open, the Cubit restores the previous step and field values. Add a 'discard changes?' confirmation dialog when the user explicitly dismisses a partially filled wizard.
Contingency: If persistent state proves difficult to implement with the chosen routing strategy, implement draft auto-save to a local draft repository every time a field value changes, and restore from draft on the next open.
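A sketch of the persistent-state approach, assuming the flutter_bloc package. The field names are illustrative; the real draft state would cover all five steps:

```dart
import 'package:flutter_bloc/flutter_bloc.dart';

/// Wizard draft state kept outside the bottom sheet's widget lifecycle.
class WizardDraft {
  const WizardDraft({this.step = 0, this.activityType, this.notes = ''});

  final int step;
  final String? activityType;
  final String notes;

  bool get isDirty => step > 0 || activityType != null || notes.isNotEmpty;

  WizardDraft copyWith({int? step, String? activityType, String? notes}) =>
      WizardDraft(
        step: step ?? this.step,
        activityType: activityType ?? this.activityType,
        notes: notes ?? this.notes,
      );
}

class WizardCubit extends Cubit<WizardDraft> {
  WizardCubit() : super(const WizardDraft());

  void advanceTo(int step) => emit(state.copyWith(step: step));
  void setActivityType(String type) => emit(state.copyWith(activityType: type));
  void setNotes(String notes) => emit(state.copyWith(notes: notes));
  void discard() => emit(const WizardDraft());
}
```

The key design point is where the BlocProvider sits: providing WizardCubit above the bottom sheet, scoped to the registration route, means dismissing the sheet does not dispose the cubit, so re-opening restores the previous step and values. The isDirty getter supports the 'discard changes?' dialog.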
Multi-step wizard bottom sheets are among the most complex accessibility scenarios in Flutter. Screen readers (TalkBack, VoiceOver) may not announce step transitions, focus may land on the wrong element after advancing, and animated transitions can interfere with the accessibility tree update cycle — making the feature unusable for Blindeforbundet users who rely on screen readers.
Mitigation & Contingency
Mitigation: Assign each wizard step a unique Semantics container with a live region announcement on mount. Use ExcludeSemantics on inactive steps during transition animations. Test each step transition manually with TalkBack and VoiceOver as part of the definition of done for each step component.
Contingency: If animated transitions cause accessibility tree corruption, disable step transition animations entirely in accessibility mode (detected via MediaQuery.accessibleNavigation) and use instant step replacement instead.
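The contingency's accessibility-mode switch might be sketched as follows; the buildStepTransition helper is a hypothetical name, but MediaQuery's accessibleNavigation flag is the standard Flutter signal that an assistive service like TalkBack or VoiceOver is active:

```dart
import 'package:flutter/material.dart';

/// Uses instant step replacement when a screen reader is active, avoiding
/// animated transitions that can race the semantics tree update cycle.
Widget buildStepTransition(BuildContext context, Widget activeStep) {
  final screenReaderActive = MediaQuery.of(context).accessibleNavigation;
  if (screenReaderActive) {
    // Instant replacement: the new step's live region announces immediately.
    return activeStep;
  }
  return AnimatedSwitcher(
    duration: const Duration(milliseconds: 200),
    // Keying by the step widget's type makes AnimatedSwitcher treat each
    // step as a distinct child and animate between them.
    child: KeyedSubtree(
      key: ValueKey(activeStep.runtimeType),
      child: activeStep,
    ),
  );
}
```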
The NotesStep relies on the OS keyboard's built-in dictation button for speech-to-text input. The button's availability, position, and behaviour are consistent on iOS (a reliable, visible dictation key) but vary widely on Android depending on the keyboard app, OEM skin, and language settings. HLF and Blindeforbundet specifically requested this capability; if it is unreliable on Android, it fails a SHOULD HAVE requirement for a significant portion of users.
Mitigation & Contingency
Mitigation: Document that the notes dictation feature depends on the device's native keyboard dictation and requires no in-app microphone permission. Add explicit placeholder copy informing users they can use their keyboard's dictation button. Test on a minimum of three Android keyboards (Gboard, Samsung Keyboard, SwiftKey) and two iOS versions.
Contingency: If native keyboard dictation is too unreliable on Android, implement a fallback in-app microphone button in the NotesStep that triggers the platform speech recognition API (Android SpeechRecognizer, iOS SFSpeechRecognizer) directly via a method channel, scoped only to the notes field with no session recording capability.
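The Dart side of that fallback could be a thin method-channel wrapper. The channel and method names here are assumptions; the native side (SpeechRecognizer on Android, SFSpeechRecognizer on iOS) would need to implement them:

```dart
import 'package:flutter/services.dart';

/// Hypothetical one-shot dictation bridge for the NotesStep fallback.
class NotesDictation {
  // Channel name is illustrative and must match the platform implementation.
  static const _channel = MethodChannel('app/notes_dictation');

  /// Starts a single recognition session and returns the transcript,
  /// or null if recognition is unavailable, denied, or cancelled.
  static Future<String?> dictateOnce() async {
    try {
      return await _channel.invokeMethod<String>('dictateOnce');
    } on PlatformException {
      return null; // Caller falls back to keyboard input.
    }
  }
}
```

Returning null on any platform failure keeps the notes field usable via the keyboard, matching the requirement that dictation stays scoped to this field with no session recording.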