Priority: critical · Complexity: medium · Domain: database · Status: pending · Assignee: database specialist · Tier 1

Acceptance Criteria

StatsRepository implements IStatsRepository and is provided via a Riverpod Provider that receives the Supabase client as a dependency
getCoordinatorStats(ChapterScope, StatsFilter) calls the correct Supabase RPC function with chapter_id, date_from, date_to, and activity_type parameters
getCoordinatorStats(OrganisationScope, StatsFilter) calls the correct RPC function with org_id parameter instead of chapter_id
getPersonalStats(peerId, StatsFilter) calls the peer-mentor-scoped RPC function with peer_mentor_id parameter
RPC call parameters are typed — no raw dynamic maps passed to Supabase without a typed wrapper
Network errors are caught and rethrown as StatsNetworkException with the original error message preserved
Supabase RLS violations (error code 42501) are caught and rethrown as StatsPermissionException
Empty result sets return empty ViewModel objects (zero counts), not null or exceptions
Repository does not perform any caching — that responsibility belongs to StatsCacheManager
All RPC function names are defined as string constants in a StatsRpcConstants class, not inlined as string literals
Integration tests against a local Supabase instance pass for coordinator-scoped and org-scoped queries
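
The criteria above can be sketched as a repository contract plus its Riverpod wiring. This is a hypothetical outline, not the final API: the scope, filter, and ViewModel types are taken from the data-models list below, and `supabaseClientProvider` is an assumed provider name.

```dart
// Hypothetical sketch of the repository contract and its Riverpod wiring.
// StatsScope, StatsFilter, and the ViewModel types are assumed to exist
// elsewhere in the codebase (see the data models list).
import 'package:riverpod/riverpod.dart';
import 'package:supabase/supabase.dart';

abstract class IStatsRepository {
  Future<CoordinatorStatsViewModel> getCoordinatorStats(
      StatsScope scope, StatsFilter filter);
  Future<PersonalStatsViewModel> getPersonalStats(
      String peerId, StatsFilter filter);
}

/// Assumed provider exposing the Supabase client, so tests can override it
/// instead of relying on a global singleton.
final supabaseClientProvider =
    Provider<SupabaseClient>((ref) => throw UnimplementedError());

/// autoDispose unless a long-lived singleton is intentionally required.
final statsRepositoryProvider = Provider.autoDispose<IStatsRepository>(
  (ref) => StatsRepository(ref.read(supabaseClientProvider)),
);
```

Overriding `supabaseClientProvider` in a `ProviderContainer` is what lets the integration tests point the repository at a local Supabase instance.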

Technical Requirements

Frameworks
Flutter
Dart
Riverpod
Supabase client Dart SDK
APIs
Supabase RPC (rpc())
Supabase PostgREST
Data Models
proxy_activities
chapters
organisations
peer_mentors
StatsFilter
StatsScope
CoordinatorStatsViewModel
PersonalStatsViewModel
Performance Requirements
Each RPC call must return within 2 seconds under normal Supabase load
Repository must not issue more than 3 RPC calls per getCoordinatorStats invocation — aggregate server-side
Security Requirements
All RPC functions must be defined with SECURITY DEFINER in Supabase and enforce row-level scoping server-side
The repository must never pass org_id or chapter_id derived from client state without confirming it matches the authenticated user's session scope
No raw SQL may be constructed client-side; use only named RPC calls

Execution Context

Execution Tier
Tier 1

Tier 1 - 540 tasks

Can start after Tier 0 completes

Implementation Notes

Define Supabase RPC functions as PostgreSQL functions with explicit parameter types — avoid using `filter()` chaining on views for stats aggregations since it bypasses the query planner's ability to use indexes on the aggregate. Example RPC signature: `get_coordinator_stats(p_chapter_id uuid, p_date_from timestamptz, p_date_to timestamptz, p_activity_type text) RETURNS json`. Map the raw JSON response rows to intermediate DTOs before constructing ViewModels — this isolates the repository from schema changes. Use Riverpod's `ref.read(supabaseClientProvider)` to obtain the client, not a global Supabase singleton, so the provider can be overridden in tests.
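
A Postgres definition matching that signature might look as follows. This is a sketch only: the `proxy_activities` column names (`chapter_id`, `created_at`, `activity_type`) and the JSON shape are assumptions to be reconciled with the real schema.

```sql
-- Hypothetical definition for the signature above. SECURITY DEFINER per the
-- security requirements; row scoping must still be enforced inside the body.
create or replace function get_coordinator_stats(
  p_chapter_id uuid,
  p_date_from timestamptz,
  p_date_to timestamptz,
  p_activity_type text
) returns json
language sql
security definer
set search_path = public
as $$
  select json_build_object('total_activities', count(*))
  from proxy_activities a
  where a.chapter_id = p_chapter_id
    and a.created_at >= p_date_from
    and a.created_at <  p_date_to
    and (p_activity_type is null or a.activity_type = p_activity_type);
$$;
```

Aggregating inside the function keeps the repository within the three-RPC-calls-per-invocation budget and lets the planner use indexes on `proxy_activities`.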

For error handling, wrap the entire RPC call in a try/catch on `PostgrestException` and inspect `code` for '42501' to distinguish permission errors from generic network failures. Ensure the Riverpod provider for StatsRepository is `autoDispose` unless a long-lived singleton is intentionally required.
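
The error-mapping described above might be centralised in a single helper, roughly as follows. The exception classes are the ones named in the acceptance criteria; the helper name and return shape are assumptions.

```dart
// Sketch of the error mapping: PostgrestException code '42501' becomes a
// permission error, everything else a network error with the original
// message preserved.
import 'package:supabase/supabase.dart';

class StatsNetworkException implements Exception {
  final String message;
  StatsNetworkException(this.message);
}

class StatsPermissionException implements Exception {
  final String message;
  StatsPermissionException(this.message);
}

Future<Map<String, dynamic>> callStatsRpc(
    SupabaseClient client, String fn, Map<String, dynamic> params) async {
  try {
    final result = await client.rpc(fn, params: params);
    // Empty result sets map to an empty object, never null (zero counts
    // are constructed downstream by the DTO-to-ViewModel mapping).
    return (result as Map<String, dynamic>?) ?? const {};
  } on PostgrestException catch (e) {
    if (e.code == '42501') {
      throw StatsPermissionException(e.message); // RLS violation
    }
    throw StatsNetworkException(e.message); // preserve original message
  }
}
```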

Testing Requirements

Write integration tests that connect to a local Supabase instance (via Docker or the Supabase CLI) seeded with known test data. Verify: (1) Coordinator-scoped query returns only records from the correct chapter. (2) Org-admin-scoped query returns aggregated records across all chapters in the org. (3) Personal stats query returns only the specified peer mentor's activities. (4) Applying a StatsFilter with a date range excludes out-of-range records. (5) Network failure (simulated by pointing the client at a non-existent host) throws StatsNetworkException. (6) An RLS violation scenario throws StatsPermissionException. Use flutter_test with a real Supabase test client; do not mock the Supabase client for integration tests.

Unit tests with a mocked Supabase client are acceptable for error-path coverage only.
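
Two of the scenarios above could be sketched like this. The local URL, anon key, and seeded IDs are placeholders for whatever the test harness provisions, and `StatsRepository`, `ChapterScope`, and `StatsFilter` are assumed from the acceptance criteria.

```dart
// Sketch of integration tests against a local Supabase instance.
// All IDs and keys below are placeholder test-harness values.
import 'package:flutter_test/flutter_test.dart';
import 'package:supabase/supabase.dart';

void main() {
  late SupabaseClient client;

  setUp(() {
    // Supabase CLI default local URL; the anon key comes from `supabase start`.
    client = SupabaseClient('http://localhost:54321', 'local-anon-key');
  });

  test('coordinator-scoped query returns only the seeded chapter', () async {
    final repo = StatsRepository(client);
    final stats = await repo.getCoordinatorStats(
        ChapterScope('seeded-chapter-id'), StatsFilter.all());
    expect(stats.totalActivities, greaterThan(0));
  });

  test('network failure throws StatsNetworkException', () async {
    final bad = SupabaseClient('http://nonexistent.invalid', 'key');
    final repo = StatsRepository(bad);
    await expectLater(repo.getPersonalStats('peer-1', StatsFilter.all()),
        throwsA(isA<StatsNetworkException>()));
  });
}
```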

Component
Coordinator Statistics Service
Type: service · Priority: high
Epic Risks (3)
Impact: medium · Probability: medium · Category: technical

fl_chart's default colour palette may not meet WCAG 2.2 AA contrast requirements when rendered on the app's dark or light backgrounds. If segment colours are insufficient, the donut chart will fail accessibility audits, which is a compliance blocker for all three organisations.

Mitigation & Contingency

Mitigation: Define all chart colours in the design token system with pre-validated contrast ratios. Run the contrast-ratio-validator against every chart colour during the adapter's unit tests. Use the contrast-safe-color-palette as the source palette.

Contingency: If a colour fails validation, replace with the nearest compliant token. If activity types exceed the available token set, implement a deterministic hashing algorithm that maps activity type IDs to compliant colours.
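
The deterministic hashing fallback could look roughly like this. The palette values and function name are illustrative; real colours would come from the contrast-safe token set after validation.

```dart
// Sketch of the contingency: hash an activity type ID onto a fixed list of
// pre-validated, contrast-safe tokens so the same ID always maps to the
// same compliant colour, even as new activity types appear.
import 'package:flutter/material.dart';

const List<Color> kContrastSafePalette = [
  Color(0xFF1B5E20), // placeholder values; real tokens come from the
  Color(0xFF0D47A1), // design token system after contrast validation
  Color(0xFF4A148C),
  Color(0xFFB71C1C),
];

Color colourForActivityType(String activityTypeId) {
  // A hand-rolled fold instead of String.hashCode, so the mapping is
  // stable across releases and platforms.
  final index = activityTypeId.codeUnits
          .fold<int>(0, (acc, c) => (acc * 31 + c) & 0x7fffffff) %
      kContrastSafePalette.length;
  return kContrastSafePalette[index];
}
```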

Impact: medium · Probability: medium · Category: technical

StatsBloc subscribing to the activity registration stream creates a long-lived subscription. If the subscription is not disposed correctly when the dashboard is closed, it will cause a stream leak and potentially trigger re-fetches on a disposed BLoC, resulting in uncaught errors in production.

Mitigation & Contingency

Mitigation: Implement subscription disposal in the BLoC's close() override. Write a widget test that navigates away from the dashboard and asserts no BLoC events are emitted after disposal.

Contingency: If leaks are detected in QA, add a mounted check guard before emitting states from async callbacks, and audit all other BLoC stream subscriptions in the codebase for the same pattern.
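
The disposal pattern from the mitigation, combined with the contingency's guard, might be sketched as follows. The event, state, and stream names are assumptions; the real StatsBloc API may differ.

```dart
// Sketch: cancel the activity-stream subscription in close() and guard
// against adding events to an already-disposed BLoC.
import 'dart:async';
import 'package:bloc/bloc.dart';

abstract class StatsEvent { const StatsEvent(); }
class StatsRefreshRequested extends StatsEvent { const StatsRefreshRequested(); }
abstract class StatsState { const StatsState(); }
class StatsInitial extends StatsState { const StatsInitial(); }

class StatsBloc extends Bloc<StatsEvent, StatsState> {
  StreamSubscription<void>? _activitySub;

  StatsBloc(Stream<void> activityRegistrations) : super(const StatsInitial()) {
    _activitySub = activityRegistrations.listen((_) {
      // Guard: never trigger a re-fetch on a disposed BLoC.
      if (!isClosed) add(const StatsRefreshRequested());
    });
    on<StatsRefreshRequested>((event, emit) async {
      // re-fetch stats via the repository here
    });
  }

  @override
  Future<void> close() async {
    await _activitySub?.cancel(); // prevents the stream leak
    return super.close();
  }
}
```

A widget test that pops the dashboard route and then pushes an event onto the source stream can assert that no new states are emitted after `close()`.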

Impact: low · Probability: low · Category: scope

PersonalStatsService's Phase 4 gamification data structure is designed against an assumed future schema. If the Phase 4 Spotify Wrapped feature defines a different data contract when it is developed, the structure built now will require a breaking change and migration.

Mitigation & Contingency

Mitigation: Document the contribution data structure with explicit field semantics and versioning comments. Keep the Phase 4 fields as optional/nullable so they do not break existing consumers if the schema evolves.

Contingency: If the Phase 4 schema diverges significantly, the personal stats data can be re-mapped in a thin adapter layer without changing PersonalStatsService's core implementation.
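
The optional/nullable field strategy from the mitigation might look like this. Every field name here is an illustrative assumption; the point is the versioning comment plus nullable Phase 4 fields.

```dart
// Sketch of the versioned contribution structure: Phase 3 fields are
// required, Phase 4 (gamification) fields are nullable so existing
// consumers keep working if the eventual Spotify-Wrapped contract differs.
class PersonalContribution {
  /// Contract version; bump on breaking changes to ease future migration.
  static const int schemaVersion = 1;

  final int totalActivities; // stable Phase 3 field

  // Phase 4 placeholders (hypothetical names, pending the real contract).
  final int? streakDays;
  final String? topActivityType;

  const PersonalContribution({
    required this.totalActivities,
    this.streakDays,
    this.topActivityType,
  });
}
```

If Phase 4 diverges, the thin adapter layer named in the contingency would re-map this structure without touching PersonalStatsService itself.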