Priority: medium · Complexity: low · Area: api · Status: pending · Assignee: api specialist · Tier: 7

Acceptance Criteria

A GET request to the edge function with query parameters action=redownload and export_id={uuid} triggers the re-download path
The export record is fetched from generated_reports using getById(export_id) — this query runs under the user's JWT so RLS enforces org-scope access
If the export record does not exist, HTTP 404 is returned with body: { error: 'Export not found', code: 'NOT_FOUND' }
If the export record belongs to a different org than the user's JWT org_id, RLS prevents the fetch: the query returns null and HTTP 404 is returned, deliberately indistinguishable from a missing record to avoid information leakage (no separate 403 path is needed)
If the export record status is 'failed', HTTP 422 is returned with body: { error: 'Export failed and has no downloadable file', code: 'EXPORT_FAILED' }
If the file no longer exists in storage (createSignedUrl returns an error for the stored file_path), HTTP 410 Gone is returned
On success, a fresh signed URL is generated for the file_path stored in the export record, valid for 24 hours
Response body is: { export_id: string, download_url: string, expires_at: ISO8601, original_generated_at: ISO8601 }
Re-download action is logged to bufdir_export_audit_log with action type 'redownload'
Signed URL generation must not require reading the file contents — only storage.createSignedUrl() is called
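The success payload above can be captured as a small TypeScript shape. The interface and the `buildResponse` helper are illustrative, not part of the spec; field names come directly from the criteria:

```typescript
// Illustrative shape of the success response described in the acceptance
// criteria. Timestamps are ISO 8601 strings.
interface RedownloadResponse {
  export_id: string;
  download_url: string;          // opaque signed URL; file_path is never exposed
  expires_at: string;            // 24 hours after issuance
  original_generated_at: string; // when the export was first generated
}

// Hypothetical helper showing how the payload is assembled.
function buildResponse(
  exportId: string,
  signedUrl: string,
  generatedAt: string,
): RedownloadResponse {
  // 86_400_000 ms = 24 hours, matching the signed URL expiry.
  const expiresAt = new Date(Date.now() + 86_400_000).toISOString();
  return {
    export_id: exportId,
    download_url: signedUrl,
    expires_at: expiresAt,
    original_generated_at: generatedAt,
  };
}
```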

Technical Requirements

Frameworks
Supabase Edge Functions (Deno)
APIs
Supabase Storage API (createSignedUrl)
Supabase PostgreSQL (generated_reports read via RLS)
Data Models
bufdir_export_audit_log
Performance Requirements
Total re-download endpoint response time must be under 500ms
Two DB/storage calls maximum: getById + createSignedUrl
Security Requirements
file_path is never returned to the client — only the opaque signed URL
RLS on generated_reports ensures users can only fetch records for their org — no additional application-level check needed beyond null-check on query result
A fresh signed URL with a 24-hour expiry is issued on every re-download call — previously issued URLs remain valid until their own expiry
GDPR: re-download action logged for audit trail — includes user_id, export_id, timestamp

Execution Context

Execution Tier
Tier 7

Tier 7 - 84 tasks

Can start after Tier 6 completes

Implementation Notes

Implement this as a route branch within the existing edge function: check for the `action=redownload` query param at the top of the request handler and dispatch to a separate `handleRedownload(request, context)` function. This avoids creating a second edge function deployment. The `handleRedownload` function should:

1. Extract export_id from the query params and validate that it is a UUID.
2. Call `generatedReportsRepo.getById(exportId)` using the user's JWT (not service role) so RLS applies naturally.
3. Null result → 404.
4. status=failed → 422.
5. Call `supabase.storage.from('exports').createSignedUrl(record.file_path, 86400)` using service role (required to access the private bucket).
6. Return the response.

Keep the re-download logic under 50 lines — this is intentionally a thin endpoint.
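A sketch of that flow is below. The `ReportsRepo` and `SignedUrlStorage` interfaces are hypothetical stand-ins for the project's Supabase wrappers (the real repo runs under the caller's JWT and the real storage client uses the service role, as the notes describe); dependency injection keeps the sketch self-contained and testable:

```typescript
// Hypothetical record shape; field names assumed from the acceptance criteria.
type ExportRecord = {
  id: string;
  org_id: string;
  status: "completed" | "failed" | string;
  file_path: string;
  generated_at: string; // ISO 8601
};

interface ReportsRepo {
  // Runs under the caller's JWT, so RLS scopes results to their org.
  getById(id: string): Promise<ExportRecord | null>;
}

interface SignedUrlStorage {
  // Service-role client; required to sign URLs for the private bucket.
  createSignedUrl(
    path: string,
    expiresInSeconds: number,
  ): Promise<{ signedUrl: string } | { error: string }>;
}

const UUID_RE =
  /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i;

async function handleRedownload(
  exportId: string | null,
  repo: ReportsRepo,
  storage: SignedUrlStorage,
): Promise<{ status: number; body: unknown }> {
  // (1) Validate the export_id query param.
  if (!exportId || !UUID_RE.test(exportId)) {
    return { status: 400, body: { error: "Invalid export_id", code: "BAD_REQUEST" } };
  }
  // (2)-(3) Fetch under the user's JWT; RLS makes cross-org rows look missing.
  const record = await repo.getById(exportId);
  if (record === null) {
    return { status: 404, body: { error: "Export not found", code: "NOT_FOUND" } };
  }
  // (4) Failed exports have no file to re-sign.
  if (record.status === "failed") {
    return {
      status: 422,
      body: { error: "Export failed and has no downloadable file", code: "EXPORT_FAILED" },
    };
  }
  // (5) Sign a fresh 24h URL; a signing error implies the object is gone.
  const signed = await storage.createSignedUrl(record.file_path, 86_400);
  if ("error" in signed) {
    return { status: 410, body: { error: "Export file no longer available", code: "GONE" } };
  }
  // (6) Success payload; file_path itself is never returned to the client.
  return {
    status: 200,
    body: {
      export_id: record.id,
      download_url: signed.signedUrl,
      expires_at: new Date(Date.now() + 86_400_000).toISOString(),
      original_generated_at: record.generated_at,
    },
  };
}
```

Injecting the repo and storage clients also makes the unit tests in the testing section straightforward: each case is a stub returning null, a failed record, or a valid record.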

Testing Requirements

Unit test: mock getById returning null and assert HTTP 404. Mock getById returning a failed-status record and assert HTTP 422. Mock getById returning a valid record and mock createSignedUrl returning a URL; assert HTTP 200 with correct response shape. Integration test: create a real export record in the test DB, call the re-download endpoint with the coordinator's JWT and the export_id, assert the signed URL is returned and is fetchable.

Test with a peer mentor JWT from another org and assert the record is not accessible (HTTP 404, per the RLS null-result mapping). Test with a non-existent export_id and assert HTTP 404. Verify an audit log entry is created on success.

Component
Bufdir Export Edge Function
Category: infrastructure · Risk: high
Epic Risks (3)
Risk 1: high impact, medium probability, technical

Supabase Edge Functions have a default execution timeout. For large national-scope exports aggregating tens of thousands of activities across 1,400 chapters, the edge function may time out before completing, leaving coordinators with a failed export and no partial output.

Mitigation & Contingency

Mitigation: Optimise the aggregation SQL using pre-materialised aggregation views or RPC functions that run inside the database rather than iterating records in Deno. Profile query execution time against realistic production data volumes early. Request an elevated timeout limit from Supabase if needed. Implement progress checkpointing so the export can be resumed from the last completed aggregation batch.

Contingency: For organisations exceeding a configurable threshold (e.g. >5,000 activities), switch to an asynchronous export pattern: the edge function writes a 'pending' audit record and enqueues the job; the client polls for completion and is notified via Supabase Realtime when the file is ready.

Risk 2: medium impact, medium probability, technical

Server-side PDF generation in a Deno Edge Function environment restricts library choices. Many popular PDF libraries require Node.js APIs not available in Deno, or produce large bundle sizes that exceed edge function limits. Choosing the wrong library could block the entire PDF generation path.

Mitigation & Contingency

Mitigation: Spike PDF library selection as the first task of this epic, evaluating at least two Deno-compatible options (e.g. pdf-lib, jsPDF with Deno compatibility shim). Test bundle size and basic rendering before committing to an implementation. Document the chosen library's constraints.

Contingency: If no suitable Deno-native PDF library is found, generate a well-structured HTML report from the edge function and use a headless Chromium service (e.g. Browserless, Gotenberg) for HTML-to-PDF conversion, or temporarily ship CSV-only export while the PDF path is resolved.

Risk 3: high impact, high probability, technical

Peer mentors affiliated with multiple chapters (a documented NHF scenario) must not be double-counted in participant totals. Incorrect deduplication logic would overreport participation figures to Bufdir, which could be discovered during audit and damage organisational credibility.

Mitigation & Contingency

Mitigation: Define and document the deduplication contract explicitly before coding: deduplication is per-person per-period, not per-activity. Build dedicated unit tests with fixtures containing the exact multi-chapter membership patterns described in NHF's documentation. Have a NHF representative validate test fixture outputs against known-good manual counts.

Contingency: If deduplication logic produces results that cannot be verified against manual counts before launch, surface a deduplication warning in the export preview listing the affected peer mentor IDs, and require explicit coordinator acknowledgement before finalising the export.
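The per-person per-period contract above can be made concrete with a minimal sketch: a mentor active in several chapters during the same period counts once for that period, while activity in a different period counts again. Type and function names here are illustrative, not the project's actual schema:

```typescript
// One row per (person, chapter, period) participation; a mentor in two
// chapters in the same period produces two rows but must count once.
type Participation = {
  personId: string;
  chapterId: string;
  period: string; // e.g. "2024-Q1"
};

// Deduplicate per-person per-period: collect distinct person IDs into a
// Set for each period, then report the set sizes.
function uniqueParticipantsPerPeriod(rows: Participation[]): Map<string, number> {
  const byPeriod = new Map<string, Set<string>>();
  for (const row of rows) {
    if (!byPeriod.has(row.period)) byPeriod.set(row.period, new Set());
    byPeriod.get(row.period)!.add(row.personId);
  }
  return new Map([...byPeriod].map(([period, people]) => [period, people.size]));
}
```

In production this aggregation would more likely live in SQL (e.g. `COUNT(DISTINCT person_id) GROUP BY period`), but the unit-test fixtures described in the mitigation can assert against exactly this contract.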