Priority: critical · Complexity: high · Area: backend · Status: pending · Assignee: backend specialist · Tier: 5

Acceptance Criteria

When format=preview, the edge function returns HTTP 200 with a JSON body containing the aggregated Bufdir payload and a validation_results array
Aggregation service is called with the validated request context (org_id, scope_id, scope_level, date_from, date_to) and returns structured activity data
Bufdir serializer converts aggregated data to the canonical Bufdir JSON schema matching the bufdir_column_schema configuration for the org
Validation results array lists each row or field that triggered a warning (e.g., missing required Bufdir column, value out of expected range)
Preview response includes: { payload: BufdirPayload, validation_results: ValidationResult[], row_count: number, period: { from, to }, scope: { id, level } }
If aggregation returns zero rows, response includes an empty payload with a validation warning: 'No activities found for the specified period and scope'
Aggregation errors (DB timeout, unexpected schema) return HTTP 500 with a structured error body and are logged with the full error context
Serialization errors (schema mismatch, missing column mapping) return HTTP 422 with details of which fields could not be mapped
Preview mode must never write to the database or storage — it is a pure read operation
Response Content-Type is application/json for preview mode
Total processing time for preview with up to 500 activity rows must be under 5 seconds
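The response shape in the criteria above can be captured as TypeScript types. This is a sketch: the field types and the `BufdirPayload` internals beyond what the criteria list are assumptions.

```typescript
// Hypothetical types mirroring the acceptance criteria; field types beyond
// those stated in the criteria are assumptions.
interface ValidationResult {
  row: number | null; // null for payload-level warnings
  field: string | null;
  severity: "warning" | "error";
  message: string;
}

interface BufdirPayload {
  org_number: string;
  rows: Record<string, unknown>[]; // columns driven by bufdir_column_schema
}

interface PreviewResponse {
  payload: BufdirPayload;
  validation_results: ValidationResult[];
  row_count: number;
  period: { from: string; to: string };
  scope: { id: string; level: "chapter" | "region" | "national" };
}

// The zero-rows case from the criteria: empty payload plus a warning.
const emptyPreview: PreviewResponse = {
  payload: { org_number: "000000000", rows: [] },
  validation_results: [
    {
      row: null,
      field: null,
      severity: "warning",
      message: "No activities found for the specified period and scope",
    },
  ],
  row_count: 0,
  period: { from: "2025-01-01", to: "2025-03-31" },
  scope: { id: "scope-1", level: "chapter" },
};
```

Pinning these types down early keeps the edge function, the serializer, and the client-side preview renderer agreeing on one contract.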

Technical Requirements

Frameworks
Supabase Edge Functions (Deno)
Deno std/http
APIs
Supabase PostgreSQL (aggregation queries)
Supabase Auth (JWT context)
Data Models
activity
activity_type
bufdir_column_schema
bufdir_export_audit_log
assignment
Performance Requirements
Aggregation query must use scope-level indexes (org_id + date range + scope_id)
Total preview response time under 5s for 500 rows
Serialization must be synchronous and CPU-bound, not I/O-bound
Avoid N+1 queries — fetch all required joins in a single aggregation query
Security Requirements
Aggregation queries must include org_id filter to enforce tenant isolation
RLS enforced on all database queries — service role key scoped to required tables only
Preview payload must not include PII beyond what Bufdir schema requires
GDPR: preview mode is transient — no data persisted; confirm this in function audit log if applicable

Execution Context

Execution Tier
Tier 5 (253 tasks); can start after Tier 4 completes

Implementation Notes

Structure the preview pipeline as two pure functions: `aggregateActivities(context) → RawData`, then `serializeToBufdirFormat(rawData, columnSchema) → { payload, validationResults }`. Keep these two concerns strictly separated so each can be tested in isolation. Fetch the bufdir_column_schema for the org at the start of serialization — it drives which columns are included and in what order. If the org has no column schema configured, fall back to the default schema version.
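A minimal sketch of the serialization stage under these notes, including the default-schema fallback. The schema shape and function signature are illustrative assumptions, not the real module API.

```typescript
// Illustrative column-schema shape; the real bufdir_column_schema rows
// presumably carry more metadata.
type ColumnSchema = { columns: { source: string; bufdirName: string }[] };

// Fallback used when the org has no bufdir_column_schema configured.
const DEFAULT_SCHEMA: ColumnSchema = {
  columns: [{ source: "activity_type", bufdirName: "aktivitetstype" }],
};

// Stage 2 (pure): raw aggregation rows → Bufdir-shaped payload plus warnings.
function serializeToBufdirFormat(
  rawRows: Record<string, unknown>[],
  columnSchema: ColumnSchema | null,
): { payload: Record<string, unknown>[]; validationResults: string[] } {
  const schema = columnSchema ?? DEFAULT_SCHEMA;
  const validationResults: string[] = [];
  const payload = rawRows.map((row, i) =>
    Object.fromEntries(
      schema.columns.map(({ source, bufdirName }): [string, unknown] => {
        if (!(source in row)) {
          validationResults.push(`row ${i}: missing column '${source}'`);
        }
        return [bufdirName, row[source] ?? null];
      }),
    ),
  );
  return { payload, validationResults };
}
```

Because the function takes plain data in and returns plain data out, it can be unit tested with fixtures and no Supabase client at all.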

The validation pass should run after serialization and check each row against Bufdir's known constraints (e.g., activity_type must map to a recognised Bufdir activity code, hours must be > 0). Use a collect-all-errors approach (not fail-fast) so coordinators see all issues in one preview. Log aggregation duration and row count for observability but not the row data itself.
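The collect-all-errors approach might look like the sketch below. The code set and the two rules are placeholders taken from the examples in this section, not the real Bufdir submission rules.

```typescript
// Placeholder set of recognised Bufdir activity codes (assumption).
const KNOWN_BUFDIR_CODES = new Set(["A1", "A2", "B1"]);

interface RowIssue {
  row: number;
  field: string;
  message: string;
}

function validateRows(
  rows: { activity_code: string | null; hours: number }[],
): RowIssue[] {
  const issues: RowIssue[] = [];
  rows.forEach((r, i) => {
    // Collect every violation instead of returning on the first one.
    if (r.activity_code === null || !KNOWN_BUFDIR_CODES.has(r.activity_code)) {
      issues.push({
        row: i,
        field: "activity_code",
        message: `unrecognised Bufdir activity code: ${r.activity_code}`,
      });
    }
    if (r.hours <= 0) {
      issues.push({ row: i, field: "hours", message: "hours must be > 0" });
    }
  });
  // All issues are returned, so a coordinator sees every problem in one preview.
  return issues;
}
```

A row with two violations yields two entries, which maps directly onto the field-level `validation_results` array in the preview response.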

Testing Requirements

Unit test the aggregation service with a mock Supabase client returning known fixture data; assert the serialized Bufdir payload matches the expected structure field by field.
Unit test the serializer with edge cases: activity with null activity_type, activity spanning multiple scope levels, activity with missing optional fields.
Integration test: call the deployed edge function with format=preview and a coordinator JWT; assert the response shape and that no records were written to any table.
Test that a scope with no activities returns the expected empty payload with warning.

Test error handling: mock a DB timeout and assert HTTP 500 with a structured error body. Target 90% branch coverage on the aggregation and serializer modules.

Component
Bufdir Export Edge Function
Type: infrastructure · Priority: high
Dependencies (3)
epic-bufdir-report-export-core-backend-task-007: Add a validation pass to the Bufdir format serializer that checks the produced payload against Bufdir submission rules: required fields present, numeric totals non-negative, date range valid, org number format correct, and total participant count plausible. Return a structured validation result with field-level errors so the edge function can surface actionable feedback in the preview response without producing a malformed export.

epic-bufdir-report-export-core-backend-task-006: Build the main transformation function in the Bufdir format serializer that converts the raw aggregation output into the canonical Bufdir JSON payload. The serializer must: group activities by Bufdir category, compute totals per category per peer mentor, apply org-level metadata (org number, reporting period), embed duplicate warnings as annotations, and produce the single intermediate representation that the CSV and PDF services consume. Output must be deterministic and idempotent.

epic-bufdir-report-export-core-backend-task-011: Create the Supabase Edge Function entry point for Bufdir export. Implement request parsing and validation for the input parameters: org_id, scope_id, scope_level (chapter/region/national), date_from, date_to, format (csv/pdf/preview), and the requesting user's JWT. Validate that the requesting user has the coordinator or admin role for the specified scope. Return typed error responses for missing or invalid parameters before any expensive processing begins.
Epic Risks (3)
Technical · Impact: high · Probability: medium

Supabase Edge Functions have a default execution timeout. For large national-scope exports aggregating tens of thousands of activities across 1,400 chapters, the edge function may time out before completing, leaving coordinators with a failed export and no partial output.

Mitigation & Contingency

Mitigation: Optimise the aggregation SQL using pre-materialised aggregation views or RPC functions that run inside the database rather than iterating records in Deno. Profile query execution time against realistic production data volumes early. Request an elevated timeout limit from Supabase if needed. Implement progress checkpointing so the export can be resumed from the last completed aggregation batch.

Contingency: For organisations exceeding a configurable threshold (e.g. >5,000 activities), switch to an asynchronous export pattern: the edge function writes a 'pending' audit record and enqueues the job; the client polls for completion and is notified via Supabase Realtime when the file is ready.
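The sync/async routing decision from this contingency reduces to a small pure function. The 5,000-row default comes from the example above; the name `chooseExportMode` is illustrative.

```typescript
// Configurable threshold; the 5,000 default is the example value from the
// contingency, not a tuned production number.
const ASYNC_THRESHOLD = 5000;

type ExportMode = "synchronous" | "asynchronous";

// Above the threshold, the edge function writes a 'pending' audit record and
// enqueues the job instead of running the export inline.
function chooseExportMode(
  activityCount: number,
  threshold: number = ASYNC_THRESHOLD,
): ExportMode {
  return activityCount > threshold ? "asynchronous" : "synchronous";
}
```

Keeping the threshold a parameter makes it cheap to retune once real national-scope volumes are profiled.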

Technical · Impact: medium · Probability: medium

Server-side PDF generation in a Deno Edge Function environment restricts library choices. Many popular PDF libraries require Node.js APIs not available in Deno, or produce large bundle sizes that exceed edge function limits. Choosing the wrong library could block the entire PDF generation path.

Mitigation & Contingency

Mitigation: Spike PDF library selection as the first task of this epic, evaluating at least two Deno-compatible options (e.g. pdf-lib, jsPDF with Deno compatibility shim). Test bundle size and basic rendering before committing to an implementation. Document the chosen library's constraints.

Contingency: If no suitable Deno-native PDF library is found, generate a well-structured HTML report from the edge function and use a headless Chromium service (e.g. Browserless, Gotenberg) for HTML-to-PDF conversion, or temporarily ship CSV-only export while the PDF path is resolved.

Technical · Impact: high · Probability: high

Peer mentors affiliated with multiple chapters (a documented NHF scenario) must not be double-counted in participant totals. Incorrect deduplication logic would overreport participation figures to Bufdir, which could be discovered during audit and damage organisational credibility.

Mitigation & Contingency

Mitigation: Define and document the deduplication contract explicitly before coding: deduplication is per-person per-period, not per-activity. Build dedicated unit tests with fixtures containing the exact multi-chapter membership patterns described in NHF's documentation. Have an NHF representative validate test fixture outputs against known-good manual counts.

Contingency: If deduplication logic produces results that cannot be verified against manual counts before launch, surface a deduplication warning in the export preview listing the affected peer mentor IDs, and require explicit coordinator acknowledgement before finalising the export.
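The per-person-per-period contract from the mitigation above can be sketched as follows; the field names are assumptions about the participation rows the aggregation produces.

```typescript
// A peer mentor affiliated with multiple chapters must count once per
// reporting period, so chapter_id is deliberately excluded from the key.
interface Participation {
  person_id: string;
  chapter_id: string;
  period: string; // e.g. a reporting-period label such as "2025-Q1"
}

function uniqueParticipantCount(rows: Participation[]): number {
  // Deduplicate on (person_id, period) only.
  const seen = new Set(rows.map((r) => `${r.person_id}|${r.period}`));
  return seen.size;
}
```

With the key spelled out in one place, the multi-chapter fixtures from NHF's documentation become straightforward assertions against this single function.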