Implement FeatureFlagProvider with per-org toggle fetching and caching
epic-organization-selection-and-onboarding-foundation-task-008 — Build the FeatureFlagProvider Riverpod provider that fetches per-organization feature flags from Supabase (table: org_feature_flags), caches them with a configurable TTL in memory and SecureStorageAdapter, and exposes a synchronous isEnabled(flagKey) API to the widget tree. Implement a rollout evaluator that supports percentage-based rollouts and org-level overrides.
Acceptance Criteria
Technical Requirements
Execution Context
Tier 3 - 413 tasks
Can start after Tier 2 completes
Implementation Notes
Represent the provider as a Riverpod `Notifier`. In rollout evaluation, an explicit `orgOverride` takes precedence over the percentage-based rollout. TTL management: store a `DateTime loadedAt` in `FeatureFlagState`; in `isEnabled`, check whether `DateTime.now().difference(loadedAt) > ttl` and, if so, schedule a background `_refresh()` via `Future.microtask`. Define a `FeatureFlagRepository` abstraction that wraps Supabase access; this keeps the unit tests in task-009 straightforward. The `SecureStorageAdapter` used here should be the same interface as defined for `OrgBrandingCache`, for consistency.
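A minimal sketch of the state shape and the synchronous stale-while-revalidate check described above. The task names (`FeatureFlagState`, `isEnabled`, `loadedAt`, `_refresh`) come from this spec; everything else (field names, the hard-coded TTL) is an illustrative assumption:

```dart
import 'dart:async';
import 'package:riverpod/riverpod.dart';

class FeatureFlagState {
  const FeatureFlagState({required this.flags, required this.loadedAt});
  final Map<String, bool> flags; // flagKey -> enabled, post-evaluation
  final DateTime loadedAt;       // when this set was fetched
}

class FeatureFlagNotifier extends Notifier<FeatureFlagState> {
  static const _ttl = Duration(minutes: 15); // configurable in practice

  @override
  FeatureFlagState build() =>
      FeatureFlagState(flags: const {}, loadedAt: DateTime.now());

  /// Synchronous read; a stale cache schedules a background refresh
  /// instead of blocking the caller.
  bool isEnabled(String flagKey) {
    if (DateTime.now().difference(state.loadedAt) > _ttl) {
      Future.microtask(_refresh);
    }
    return state.flags[flagKey] ?? false;
  }

  Future<void> _refresh() async {
    // Fetch via FeatureFlagRepository, re-evaluate rollouts, then:
    // state = FeatureFlagState(flags: fresh, loadedAt: DateTime.now());
  }
}
```

Keeping `isEnabled` free of any `await` is what lets the widget tree read flags in the first build cycle, as the testing requirements below assume.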
Testing Requirements
Unit tests required (see task-009). For implementation verification: verify that `isEnabled` returns synchronously after `loadForOrg` completes by wrapping a widget test with `ProviderScope` and asserting the flag value is available in the first build cycle without `FutureBuilder`. Verify TTL behavior with `fakeAsync` advancing time past the TTL boundary. Verify percentage-rollout determinism by calling `isEnabled` 100 times with the same user ID and asserting the result never changes.
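Determinism in the percentage rollout comes from stable bucketing rather than random sampling. A hedged sketch; the hashing scheme and function names are assumptions, not specified by this task:

```dart
/// Deterministically buckets a (userId, flagKey) pair into 0..99.
/// The same inputs always yield the same bucket, so isEnabled can
/// never flip between calls for a given user.
int rolloutBucket(String userId, String flagKey) {
  var hash = 0;
  for (final c in '$flagKey:$userId'.codeUnits) {
    hash = (hash * 31 + c) & 0x7fffffff; // simple stable hash
  }
  return hash % 100;
}

bool inRollout(String userId, String flagKey, int percentage) =>
    rolloutBucket(userId, flagKey) < percentage;
```

The 100-call determinism test then reduces to asserting that `inRollout` returns the same value on every iteration for a fixed user ID and flag key.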
iOS Keychain and Android Keystore have meaningfully different failure modes and permission models. The secure storage plugin may throw platform-specific exceptions (e.g., biometric enrollment required, Keystore wipe after device re-enrollment) that crash higher-level flows if not caught at the adapter boundary.
Mitigation & Contingency
Mitigation: Wrap all storage plugin calls in try/catch at the adapter layer and expose a typed StorageResult<T> instead of throwing. Write integration tests on real device simulators for both platforms in CI using Fastlane. Document the exception matrix during spike.
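A sketch of the `StorageResult<T>` boundary described in this mitigation, using a Dart 3 sealed class; the subtype names and adapter methods are illustrative assumptions:

```dart
/// Typed result returned by the adapter instead of throwing, so
/// platform-specific Keychain/Keystore exceptions stop here.
sealed class StorageResult<T> {
  const StorageResult();
}

class StorageSuccess<T> extends StorageResult<T> {
  const StorageSuccess(this.value);
  final T value;
}

class StorageFailure<T> extends StorageResult<T> {
  const StorageFailure(this.error);
  final Object error; // the caught platform exception, logged upstream
}

abstract class SecureStorageAdapter {
  Future<StorageResult<String?>> read(String key);
  Future<StorageResult<void>> write(String key, String value);
}

// Inside a concrete adapter, each plugin call is wrapped once:
//   try {
//     return StorageSuccess(await _plugin.read(key: key));
//   } on Exception catch (e) {
//     return StorageFailure(e);
//   }
```

With a sealed hierarchy, callers are forced by exhaustive `switch` to handle the failure branch, which is what keeps higher-level flows from crashing.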
Contingency: If a platform-specific failure cannot be handled gracefully, fall back to in-memory-only storage for the current session and surface a non-blocking warning to the user; log the event for investigation.
Setting a session-level Postgres variable (app.current_org_id) via a Supabase RPC requires that RLS policies on every table reference this variable. If the Supabase project schema has not yet defined these policies, the configurator will set the variable but queries will return unfiltered data, giving a false sense of security.
Mitigation & Contingency
Mitigation: Include a smoke-test RPC in the SupabaseRLSTenantConfigurator that verifies the variable is readable from a policy-scoped query before marking setup as complete. Coordinate with the database migration task to ensure RLS policies reference app.current_org_id before the configurator is shipped.
Contingency: If RLS policies are not in place at integration time, gate all data-fetching components behind a runtime check in SupabaseRLSTenantConfigurator.isRlsScopeVerified(); block data access and surface a developer warning until policies are confirmed.
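One way the verify-before-complete step could look. The RPC name `set_current_org` and the exact supabase-dart result handling are assumptions for illustration, not a confirmed schema:

```dart
import 'package:supabase/supabase.dart';

class SupabaseRLSTenantConfigurator {
  SupabaseRLSTenantConfigurator(this._client);
  final SupabaseClient _client;
  bool _verified = false;

  bool get isRlsScopeVerified => _verified;

  Future<void> configure(String orgId) async {
    // Hypothetical RPC that sets app.current_org_id for this session.
    await _client.rpc('set_current_org', params: {'org_id': orgId});

    // Smoke test: a policy-scoped query must return only this org's
    // rows. If policies are missing, other orgs' rows leak through
    // and setup must not be marked complete.
    final rows = List<Map<String, dynamic>>.from(
      await _client.from('org_feature_flags').select('org_id'),
    );
    _verified = rows.every((r) => r['org_id'] == orgId);
    if (!_verified) {
      throw StateError('RLS scope not verified for org $orgId');
    }
  }
}
```

Gating data-fetching components on `isRlsScopeVerified` then fails closed: nothing reads tenant data until the smoke test has passed.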
Fetching feature flags from Supabase on every cold start adds network latency before the first branded screen renders. On slow connections this may cause a perceptible blank-screen gap or cause the app to render with default (unflagged) state before flags arrive.
Mitigation & Contingency
Mitigation: Persist the last-known flag set to disk in the FeatureFlagProvider and serve stale-while-revalidate on startup. Gate flag refresh behind a configurable TTL (default 15 minutes) so network calls are not made on every launch.
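A sketch of the startup path this mitigation implies, assuming the `StorageResult<T>`-style adapter from the secure-storage mitigation above; `_decode`, `_encode`, and `fetchFlags` are hypothetical helpers:

```dart
Future<void> loadForOrg(String orgId) async {
  // 1. Serve the last-known flag set immediately so the first frame
  //    renders flagged, even offline or on a slow connection.
  final cached = await _storage.read('flags:$orgId');
  if (cached is StorageSuccess<String?> && cached.value != null) {
    state = _decode(cached.value!); // restores flags and loadedAt
  }

  // 2. Revalidate only when the cache is older than the TTL
  //    (default 15 minutes), so most cold starts skip the network.
  if (DateTime.now().difference(state.loadedAt) <= _ttl) return;

  final fresh = await _repository.fetchFlags(orgId);
  state = FeatureFlagState(flags: fresh, loadedAt: DateTime.now());
  await _storage.write('flags:$orgId', _encode(state));
}
```

Step 1 is what closes the blank-screen gap; step 2 bounds staleness, after which the contingency's reconciliation pass handles any flag that changed between cache and live fetch.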
Contingency: If stale flags cause a feature to appear that should be hidden, add a post-load re-evaluation pass that reconciles the live flag set with the rendered widget tree and triggers a targeted rebuild where needed.