Document Bufdir Export API Contract and Serializer Spec
epic-bufdir-report-export-core-backend-task-018 — Write technical documentation for the Bufdir export edge function API: request/response schemas, supported scope levels, format options, error codes, and rate limits. Document the canonical Bufdir JSON payload structure as the single source of truth for future consumers including the Phase 3 API client. Document the serializer's category mapping table and validation rules so future developers can extend it without risk of regression. Store docs alongside the edge function source.
Acceptance Criteria
Technical Requirements
Execution Context
Tier 10 - 11 tasks
Can start after Tier 9 completes
Implementation Notes
Store all documentation in `supabase/functions/bufdir-export/` directory alongside the function source: `README.md` for the API contract and `SERIALIZER_SPEC.md` for the mapping table and validation rules. Keep the canonical payload JSON example as a separate `bufdir-payload-example.json` file so it can be imported as a test fixture by the integration tests from task-017. Use a simple markdown table for the category mapping (internal_type | bufdir_field | notes) rather than prose to make it scannable. Include a 'Breaking changes' section explaining what kinds of changes to the serializer would require a version bump and coordination with downstream consumers (Phase 3 API client).
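To keep the example file from drifting out of sync with the serializer, the task-017 integration tests can load it directly. A minimal sketch of that wiring, assuming a recent Deno with `jsr:` specifiers and that the fixture sits next to the test file; no Bufdir field names are assumed here:

```ts
// Sketch: load bufdir-payload-example.json as a fixture (task-017 style).
// The test only checks that the canonical example parses as a JSON object;
// schema-level assertions belong to the real integration tests.
import { assert } from "jsr:@std/assert";

Deno.test("bufdir-payload-example.json is valid JSON", async () => {
  const fixtureUrl = new URL("./bufdir-payload-example.json", import.meta.url);
  const fixture = JSON.parse(await Deno.readTextFile(fixtureUrl));
  assert(typeof fixture === "object" && fixture !== null);
});
```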
Since this documentation will serve the Phase 3 Bufdir API client team, ensure the language is precise enough to generate an API client without additional clarification.
Testing Requirements
No automated tests for documentation. Validation is manual: conduct a peer review where a developer who did not write the edge function reads only the documentation and attempts to write a correct API call and extend the serializer with a new category. Document any gaps found during review and resolve them before marking this task complete.
Supabase Edge Functions have a default execution timeout. For large national-scope exports aggregating tens of thousands of activities across 1,400 chapters, the edge function may time out before completing, leaving coordinators with a failed export and no partial output.
Mitigation & Contingency
Mitigation: Optimise the aggregation SQL using pre-materialised aggregation views or RPC functions that run inside the database rather than iterating records in Deno. Profile query execution time against realistic production data volumes early. Request an elevated timeout limit from Supabase if needed. Implement progress checkpointing so the export can be resumed from the last completed aggregation batch.
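As a rough illustration of the database-side approach, the edge function would call an RPC instead of paging through activity rows in Deno. The function name and arguments below are placeholders, not a committed interface:

```ts
// Sketch: push the aggregation into Postgres via a (hypothetical) RPC so the
// edge function only receives pre-aggregated rows instead of iterating tens
// of thousands of activity records itself.
import { createClient } from "https://esm.sh/@supabase/supabase-js@2";

const supabase = createClient(
  Deno.env.get("SUPABASE_URL")!,
  Deno.env.get("SUPABASE_SERVICE_ROLE_KEY")!,
);

// `aggregate_bufdir_export` and its parameters are placeholders for whatever
// SQL function or materialised view the implementation settles on.
const { data, error } = await supabase.rpc("aggregate_bufdir_export", {
  org_id: "placeholder-org-id",
  period_start: "2024-01-01",
  period_end: "2024-12-31",
});
if (error) throw error;
// `data` now holds one row per chapter/category, ready for serialisation.
```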
Contingency: For organisations exceeding a configurable threshold (e.g. >5,000 activities), switch to an asynchronous export pattern: the edge function writes a 'pending' audit record and enqueues the job; the client polls for completion and is notified via Supabase Realtime when the file is ready.
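A sketch of how that threshold check and 'pending' handoff might look inside the handler; the `export_jobs` table, environment variable, and response shape are illustrative only:

```ts
// Sketch of the asynchronous fallback path. All names here are assumptions,
// not the final schema.
import { type SupabaseClient } from "https://esm.sh/@supabase/supabase-js@2";

const ASYNC_THRESHOLD = Number(Deno.env.get("EXPORT_ASYNC_THRESHOLD") ?? "5000");

async function maybeEnqueueAsyncExport(
  supabase: SupabaseClient,
  orgId: string,
  userId: string,
  activityCount: number,
): Promise<Response | null> {
  if (activityCount <= ASYNC_THRESHOLD) return null; // small export: run inline

  // Record a pending job; a worker (or a follow-up invocation) picks it up.
  const { data: job, error } = await supabase
    .from("export_jobs")
    .insert({ org_id: orgId, status: "pending", requested_by: userId })
    .select("id")
    .single();
  if (error || !job) throw error ?? new Error("failed to enqueue export job");

  // The client polls for this job id, or subscribes to the row via Supabase
  // Realtime, and downloads the file once status becomes "ready".
  return new Response(JSON.stringify({ job_id: job.id, status: "pending" }), {
    status: 202,
    headers: { "Content-Type": "application/json" },
  });
}
```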
Server-side PDF generation in a Deno Edge Function environment restricts library choices. Many popular PDF libraries require Node.js APIs not available in Deno, or produce large bundle sizes that exceed edge function limits. Choosing the wrong library could block the entire PDF generation path.
Mitigation & Contingency
Mitigation: Spike PDF library selection as the first task of this epic, evaluating at least two Deno-compatible options (e.g. pdf-lib, jsPDF with a Deno compatibility shim). Test bundle size and basic rendering before committing to an implementation. Document the chosen library's constraints.
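A smoke-test sketch for the pdf-lib candidate, assuming the esm.sh distribution runs under the edge runtime (which is exactly what the spike must confirm):

```ts
// Minimal spike: render one line of text and return the bytes, to confirm the
// library loads and runs under Deno. Import path and version are assumptions
// for the spike only.
import { PDFDocument, StandardFonts } from "https://esm.sh/pdf-lib@1.17.1";

export async function renderSmokeTestPdf(): Promise<Uint8Array> {
  const doc = await PDFDocument.create();
  const font = await doc.embedFont(StandardFonts.Helvetica);
  const page = doc.addPage();
  page.drawText("Bufdir export PDF smoke test", {
    x: 50,
    y: page.getHeight() - 50,
    size: 14,
    font,
  });
  return doc.save(); // Uint8Array, ready to use as the HTTP response body
}
```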
Contingency: If no suitable Deno-native PDF library is found, generate a well-structured HTML report from the edge function and use a headless Chromium service (e.g. Browserless, Gotenberg) for HTML-to-PDF conversion, or temporarily ship CSV-only export while the PDF path is resolved.
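If the headless-Chromium fallback is chosen, the edge function would hand the rendered HTML to the conversion service over HTTP. The sketch below targets Gotenberg's Chromium route; the environment variable, endpoint, and form layout should be verified against the deployed Gotenberg version before relying on this:

```ts
// Sketch of the contingency path: post the HTML report to a Gotenberg instance
// and return the resulting PDF bytes. GOTENBERG_URL is an assumed env var.
export async function htmlToPdf(html: string): Promise<Uint8Array> {
  const form = new FormData();
  // Gotenberg's Chromium route expects the entry file to be named index.html.
  form.append("files", new Blob([html], { type: "text/html" }), "index.html");

  const res = await fetch(
    `${Deno.env.get("GOTENBERG_URL")}/forms/chromium/convert/html`,
    { method: "POST", body: form },
  );
  if (!res.ok) throw new Error(`HTML-to-PDF conversion failed: ${res.status}`);
  return new Uint8Array(await res.arrayBuffer());
}
```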
Peer mentors affiliated with multiple chapters (a documented NHF scenario) must not be double-counted in participant totals. Incorrect deduplication logic would overreport participation figures to Bufdir, which could be discovered during audit and damage organisational credibility.
Mitigation & Contingency
Mitigation: Define and document the deduplication contract explicitly before coding: deduplication is per-person per-period, not per-activity. Build dedicated unit tests with fixtures containing the exact multi-chapter membership patterns described in NHF's documentation. Have an NHF representative validate test fixture outputs against known-good manual counts.
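A minimal sketch of that contract in code form, useful as the reference the unit-test fixtures are written against; the `Participation` shape is illustrative, not the real schema:

```ts
// Sketch of the deduplication contract: a person counts once per reporting
// period, no matter how many chapters or activities they appear in.
interface Participation {
  personId: string;
  chapterId: string;
  activityId: string;
  period: string; // e.g. "2024"
}

export function countUniqueParticipants(rows: Participation[]): Map<string, number> {
  const perPeriod = new Map<string, Set<string>>();
  for (const row of rows) {
    const seen = perPeriod.get(row.period) ?? new Set<string>();
    seen.add(row.personId); // chapter and activity are deliberately ignored
    perPeriod.set(row.period, seen);
  }
  const counts = new Map<string, number>();
  for (const [period, ids] of perPeriod) counts.set(period, ids.size);
  return counts;
}
```

Under this contract, a fixture containing one peer mentor attached to two chapters in the same period must yield a count of 1; that is exactly the regression the multi-chapter fixtures guard.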
Contingency: If deduplication logic produces results that cannot be verified against manual counts before launch, surface a deduplication warning in the export preview listing the affected peer mentor IDs, and require explicit coordinator acknowledgement before finalising the export.
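One possible shape for that warning in the export preview, with field names as assumptions only:

```ts
// Illustrative shape for the deduplication warning surfaced in the export
// preview; finalisation is blocked until acknowledgement fields are set.
interface DeduplicationWarning {
  affectedPeerMentorIds: string[]; // IDs not yet verified against manual counts
  acknowledgedBy: string | null;   // coordinator user id, set on explicit acknowledgement
  acknowledgedAt: string | null;   // ISO 8601 timestamp of the acknowledgement
}
```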