High priority · Medium complexity · Database · Pending · Database specialist · Tier 2

Acceptance Criteria

generated_reports table is created via a versioned Supabase migration with all required columns and correct types
id column is UUID with default gen_random_uuid()
status column uses a CHECK constraint accepting only 'completed' and 'failed'
scope_level column uses a CHECK constraint accepting only 'chapter', 'region', 'national'
format column uses a CHECK constraint accepting only 'csv' and 'pdf'
validation_warnings column is JSONB with default '[]'::jsonb
created_at column defaults to now() and is NOT NULL
RLS is enabled on the table; coordinators can SELECT only rows where org_id matches their JWT org claim
RLS allows service role (edge function) full INSERT/SELECT access
No UPDATE or DELETE RLS policy exists — records are immutable once written
Composite index exists on (org_id, created_at DESC) for efficient listByOrg queries
id is the primary key, so getById queries are served by the primary-key index (no separate index required)
listByOrg(orgId, limit) returns records ordered by created_at DESC, limited to the specified count, with all columns
getById(id) returns a single record or null if not found; does not throw on missing ID
createRecord() accepts a typed CreateGeneratedReportInput object, inserts the record, and returns the full persisted row including generated id and created_at
Repository methods are implemented as a TypeScript class usable inside the Deno edge function environment
All repository methods use parameterised queries — no string interpolation
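The criteria above can be sketched as a single migration. This is illustrative, not the final DDL: the org_id and file_path columns are taken from the requirements sections below, and the policy name, index name, and JWT claim path are assumptions.

```sql
-- Illustrative migration sketch; policy/index names and the 'org_id' claim path are assumptions.
create table generated_reports (
  id uuid primary key default gen_random_uuid(),
  org_id uuid not null,
  status text not null check (status in ('completed', 'failed')),
  scope_level text not null check (scope_level in ('chapter', 'region', 'national')),
  format text not null check (format in ('csv', 'pdf')),
  file_path text not null,
  validation_warnings jsonb not null default '[]'::jsonb,
  created_at timestamptz not null default now()
);

alter table generated_reports enable row level security;

-- Coordinators may read only their own org's rows.
create policy coordinator_select on generated_reports
  for select
  using (org_id = (auth.jwt() ->> 'org_id')::uuid);

-- No UPDATE/DELETE policies: rows are immutable under RLS.
-- Note: the service role bypasses RLS in Supabase, so it needs no explicit
-- INSERT/SELECT policy; add service-role-scoped policies only if explicit
-- grants are preferred over the bypass.

create index idx_generated_reports_org_created
  on generated_reports (org_id, created_at desc);
```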

Technical Requirements

Frameworks
Supabase PostgreSQL 15
Supabase Migrations (SQL)
@supabase/supabase-js (Deno-compatible version)
APIs
Supabase PostgreSQL (direct SQL + RLS)
Supabase Admin API (service role for inserts)
Data models
bufdir_export_audit_log
Performance requirements
listByOrg query must use the (org_id, created_at DESC) index — verified with EXPLAIN ANALYZE
createRecord must complete within 200ms under normal DB load
getById must complete within 100ms
Security requirements
RLS prevents cross-org record visibility — coordinators cannot enumerate other orgs' export history
file_path column stores internal storage path — never returned to Flutter client directly (client receives signed URL only)
service role key used only inside edge function — never in Flutter client
No PII stored in validation_warnings — only field names and constraint violations, not actual data values
GDPR: export records constitute audit data; retention policy should be documented (suggest 3 years)
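The index-usage requirement on listByOrg can be checked in psql with a query shaped like the repository call. The UUID literal and the index name in the comment are illustrative:

```sql
explain analyze
select *
from generated_reports
where org_id = '00000000-0000-0000-0000-000000000000'
order by created_at desc
limit 20;
-- Expect an Index Scan (or Index Only Scan) on the (org_id, created_at DESC)
-- composite index, not a Seq Scan followed by a Sort.
```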

Execution Context

Execution Tier
Tier 2

Tier 2 - 518 tasks

Can start after Tier 1 completes

Implementation Notes

Create the migration as `supabase/migrations/{timestamp}_create_generated_reports.sql`. Define the table, constraints, indexes, and RLS policies in a single migration file. The repository class should accept a `SupabaseClient` instance in its constructor for dependency injection, making it trivially testable with a mock client. Define a `GeneratedReport` TypeScript interface with all column types mapped correctly (use `string` for UUIDs, ISO 8601 `string` for timestamps, since supabase-js returns them as JSON strings rather than `Date` objects, and `object[]` for validation_warnings).

The `CreateGeneratedReportInput` type should omit auto-generated fields (id, created_at). Use `supabase.from('generated_reports').insert(data).select().single()` for createRecord to get the persisted row back in one round trip. For listByOrg, use `.order('created_at', { ascending: false }).limit(limit)`. Do not implement update or delete methods — they should not exist in this repository.
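The notes above can be sketched as follows. `SupabaseClient` is typed structurally here so the file is self-contained; in the edge function, import the real client from @supabase/supabase-js instead. Column types for org_id and file_path are assumptions drawn from the requirements sections:

```typescript
// Minimal structural stand-in for @supabase/supabase-js's SupabaseClient,
// so this sketch compiles standalone; the edge function uses the real client.
export interface SupabaseClient {
  from(table: string): any;
}

// Row shape; timestamps arrive from supabase-js as ISO 8601 strings.
export interface GeneratedReport {
  id: string;
  org_id: string;
  status: "completed" | "failed";
  scope_level: "chapter" | "region" | "national";
  format: "csv" | "pdf";
  file_path: string;
  validation_warnings: object[];
  created_at: string;
}

// Auto-generated fields are omitted from the insert payload.
export type CreateGeneratedReportInput = Omit<GeneratedReport, "id" | "created_at">;

export class GeneratedReportRepository {
  constructor(private readonly client: SupabaseClient) {}

  // Insert and return the persisted row (including id/created_at) in one round trip.
  async createRecord(input: CreateGeneratedReportInput): Promise<GeneratedReport> {
    const { data, error } = await this.client
      .from("generated_reports")
      .insert(input)
      .select()
      .single();
    if (error) throw error;
    return data as GeneratedReport;
  }

  // Most recent exports first; relies on the (org_id, created_at DESC) index.
  async listByOrg(orgId: string, limit: number): Promise<GeneratedReport[]> {
    const { data, error } = await this.client
      .from("generated_reports")
      .select("*")
      .eq("org_id", orgId)
      .order("created_at", { ascending: false })
      .limit(limit);
    if (error) throw error;
    return (data ?? []) as GeneratedReport[];
  }

  // Returns null (does not throw) when the id is unknown.
  async getById(id: string): Promise<GeneratedReport | null> {
    const { data, error } = await this.client
      .from("generated_reports")
      .select("*")
      .eq("id", id)
      .maybeSingle();
    if (error) throw error;
    return (data as GeneratedReport) ?? null;
  }
}
```

`.maybeSingle()` is used instead of `.single()` for getById so that a missing row yields `data: null` rather than an error, matching the "does not throw on missing ID" criterion.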

Testing Requirements

Write SQL unit tests for the migration: verify table schema matches spec, verify CHECK constraints reject invalid status/format/scope_level values, verify RLS blocks a coordinator from selecting another org's records. Write TypeScript unit tests for the repository class using a Supabase test project: test createRecord inserts and returns the full row; test listByOrg returns records in DESC order limited correctly; test getById returns null for unknown UUID. Test that createRecord fails gracefully if a required field is null. Verify the (org_id, created_at DESC) index is used via EXPLAIN output in a test script.

All DB tests should run against a test schema that is reset between test runs.

Component
Bufdir Export Edge Function
Infrastructure · High
Epic Risks (3)
High impact · Medium probability · Technical

Supabase Edge Functions have a default execution timeout. For large national-scope exports aggregating tens of thousands of activities across 1,400 chapters, the edge function may time out before completing, leaving coordinators with a failed export and no partial output.

Mitigation & Contingency

Mitigation: Optimise the aggregation SQL using pre-materialised aggregation views or RPC functions that run inside the database rather than iterating records in Deno. Profile query execution time against realistic production data volumes early. Request an elevated timeout limit from Supabase if needed. Implement progress checkpointing so the export can be resumed from the last completed aggregation batch.

Contingency: For organisations exceeding a configurable threshold (e.g. >5,000 activities), switch to an asynchronous export pattern: the edge function writes a 'pending' audit record and enqueues the job; the client polls for completion and is notified via Supabase Realtime when the file is ready. Note that this requires widening the status CHECK constraint (currently 'completed'/'failed' only) to admit 'pending'.
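The threshold dispatch in this contingency could look roughly like the sketch below. All names here (`ASYNC_THRESHOLD`, `countActivities`, `runExportNow`, `enqueueExportJob`) are hypothetical, not part of the existing codebase:

```typescript
// Hypothetical dispatch between synchronous and asynchronous export paths
// inside the edge function. The collaborators are injected so the decision
// logic stays unit-testable.
const ASYNC_THRESHOLD = 5_000;

type ExportRequest = { orgId: string; scopeLevel: "chapter" | "region" | "national" };

type ExportDispatch =
  | { mode: "sync"; filePath: string }
  | { mode: "async"; jobId: string };

async function dispatchExport(
  req: ExportRequest,
  countActivities: (orgId: string) => Promise<number>,
  runExportNow: (req: ExportRequest) => Promise<string>,
  enqueueExportJob: (req: ExportRequest) => Promise<string>,
): Promise<ExportDispatch> {
  const activityCount = await countActivities(req.orgId);
  if (activityCount > ASYNC_THRESHOLD) {
    // Large export: write a 'pending' audit record, enqueue the job, and let
    // the client poll / subscribe via Supabase Realtime for completion.
    const jobId = await enqueueExportJob(req);
    return { mode: "async", jobId };
  }
  // Small export: completes comfortably within the edge function timeout.
  const filePath = await runExportNow(req);
  return { mode: "sync", filePath };
}
```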

Medium impact · Medium probability · Technical

Server-side PDF generation in a Deno Edge Function environment restricts library choices. Many popular PDF libraries require Node.js APIs not available in Deno, or produce large bundle sizes that exceed edge function limits. Choosing the wrong library could block the entire PDF generation path.

Mitigation & Contingency

Mitigation: Spike PDF library selection as the first task of this epic, evaluating at least two Deno-compatible options (e.g. pdf-lib, jsPDF with Deno compatibility shim). Test bundle size and basic rendering before committing to an implementation. Document the chosen library's constraints.

Contingency: If no suitable Deno-native PDF library is found, generate a well-structured HTML report from the edge function and use a headless Chromium service (e.g. Browserless, Gotenberg) for HTML-to-PDF conversion, or temporarily ship CSV-only export while the PDF path is resolved.

High impact · High probability · Technical

Peer mentors affiliated with multiple chapters (a documented NHF scenario) must not be double-counted in participant totals. Incorrect deduplication logic would overreport participation figures to Bufdir, which could be discovered during audit and damage organisational credibility.

Mitigation & Contingency

Mitigation: Define and document the deduplication contract explicitly before coding: deduplication is per-person per-period, not per-activity. Build dedicated unit tests with fixtures containing the exact multi-chapter membership patterns described in NHF's documentation. Have an NHF representative validate test fixture outputs against known-good manual counts.

Contingency: If deduplication logic produces results that cannot be verified against manual counts before launch, surface a deduplication warning in the export preview listing the affected peer mentor IDs, and require explicit coordinator acknowledgement before finalising the export.