Priority: critical · Complexity: medium · Type: infrastructure · Status: pending · Assignee: infrastructure specialist · Tier: 0

Acceptance Criteria

AccountingRecord Dart model (or a Map<String, dynamic> contract) is defined with all fields needed for export: runId, orgId, claimId, claimantName, amount, expenseType, accountCode, activityId, submittedAt, approvedAt
generateCsv(List<AccountingRecord> records, CsvExportConfig config) returns Uint8List of UTF-8 encoded CSV bytes
CsvExportConfig contains: columnOrder (List<String>), delimiter (String, default ','), includeHeader (bool, default true), dateFormat (String, default 'yyyy-MM-dd')
generateJson(List<AccountingRecord> records, JsonExportConfig config) returns Uint8List of UTF-8 encoded JSON bytes matching the target system schema
JsonExportConfig contains: targetSystem ('xledger'|'dynamics'), schemaVersion (String) — JSON structure differs per target system
uploadToStorage(Uint8List bytes, String orgId, String runId, String extension) uploads file to exports/{orgId}/{runId}.{extension} bucket path and returns a signed URL
Signed URLs expire after 86400 seconds (24 hours)
generateAndUploadBoth(records, orgId, runId, csvConfig, jsonConfig) uploads both files and returns ExportFileUrls(csvUrl, jsonUrl). The operation is all-or-nothing: if either upload fails, it returns an error result rather than a partial success
Empty records list produces a valid empty CSV (header only) and valid empty JSON array — does not throw
Special characters in claimantName and other text fields are properly escaped in CSV (quoted with double-quote escaping)
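The contracts above can be sketched as plain Dart value classes. This is a minimal sketch: the field names come from the acceptance criteria, but the field types (e.g. double for amount, DateTime for timestamps) are assumptions to be confirmed against the real schema.

```dart
// Sketch of the export contracts from the acceptance criteria.
// Field types are assumptions; adjust to the actual data model.
class AccountingRecord {
  final String runId;
  final String orgId;
  final String claimId;
  final String claimantName;
  final double amount;
  final String expenseType;
  final String accountCode;
  final String activityId;
  final DateTime submittedAt;
  final DateTime approvedAt;

  const AccountingRecord({
    required this.runId,
    required this.orgId,
    required this.claimId,
    required this.claimantName,
    required this.amount,
    required this.expenseType,
    required this.accountCode,
    required this.activityId,
    required this.submittedAt,
    required this.approvedAt,
  });
}

class CsvExportConfig {
  final List<String> columnOrder;
  final String delimiter;
  final bool includeHeader;
  final String dateFormat;

  const CsvExportConfig({
    required this.columnOrder,
    this.delimiter = ',',
    this.includeHeader = true,
    this.dateFormat = 'yyyy-MM-dd',
  });
}
```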

Technical Requirements

Frameworks
Flutter
supabase_flutter
csv (pub.dev package) or manual CSV serialization
APIs
Supabase Storage API (uploadBinary, createSignedUrl)
Data Models
AccountingRecord
CsvExportConfig
JsonExportConfig
ExportFileUrls
Performance Requirements
File generation must be done in an Isolate for lists > 200 records to avoid blocking the UI thread
Upload must use streaming/chunked upload for files > 1MB — use Supabase Storage uploadBinary which handles this
Security Requirements
Storage bucket exports/ must have RLS restricting access to coordinator role and authenticated org
Signed URLs must use expiry of 86400s max — never generate public permanent URLs for financial exports
File path must include orgId to prevent cross-org path enumeration: exports/{orgId}/{runId}.csv
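The upload and signed-URL requirements map directly onto the Supabase Storage client API (uploadBinary, createSignedUrl). A hedged sketch, assuming the AccountingRecord export bytes are already generated and the 'exports' bucket exists with the RLS policies described above; the content-type mapping is an illustrative assumption:

```dart
import 'dart:typed_data';

import 'package:supabase_flutter/supabase_flutter.dart';

/// Uploads export bytes to exports/{orgId}/{runId}.{extension} and
/// returns a 24-hour signed URL, per the security requirements.
Future<String> uploadToStorage(
  Uint8List bytes,
  String orgId,
  String runId,
  String extension,
) async {
  final storage = Supabase.instance.client.storage.from('exports');
  // Path is prefixed with orgId so RLS can scope access per organisation.
  final path = '$orgId/$runId.$extension';
  await storage.uploadBinary(
    path,
    bytes,
    fileOptions: FileOptions(
      // Assumed mapping; only 'csv' and 'json' extensions are expected.
      contentType: extension == 'csv' ? 'text/csv' : 'application/json',
      upsert: true, // re-running an export overwrites the previous file
    ),
  );
  // 86400 s = 24 h, the maximum expiry allowed for financial exports.
  return storage.createSignedUrl(path, 86400);
}
```

Never fall back to getPublicUrl here; the bucket must stay private and all reads go through the signed URL.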

Execution Context

Execution Tier
Tier 0 (440 tasks)

Implementation Notes

Place in lib/features/accounting/infrastructure/csv_json_file_generator.dart. For CSV generation, consider the csv pub.dev package (ListToCsvConverter) rather than manual string concatenation to correctly handle RFC 4180 escaping. For JSON schemas: obtain the exact intake format spec from Xledger and Dynamics API documentation before implementing — do not guess schema field names. If specs are unavailable, use a configurable Map field-name mapping in JsonExportConfig.
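Using the csv package, generateCsv reduces to building a row matrix and letting ListToCsvConverter handle RFC 4180 quoting. A minimal sketch, assuming the AccountingRecord and CsvExportConfig contracts from the acceptance criteria plus an assumed toMap() helper keyed by column name:

```dart
import 'dart:convert';
import 'dart:typed_data';

import 'package:csv/csv.dart';
import 'package:intl/intl.dart';

/// Generates UTF-8 CSV bytes. ListToCsvConverter performs RFC 4180
/// escaping (quoting fields containing delimiters or double-quotes).
Uint8List generateCsv(List<AccountingRecord> records, CsvExportConfig config) {
  final dateFmt = DateFormat(config.dateFormat);
  final rows = <List<dynamic>>[
    if (config.includeHeader) config.columnOrder,
    for (final record in records)
      [
        for (final column in config.columnOrder)
          // toMap() is an assumed helper exposing fields by column name.
          _formatCell(record.toMap()[column], dateFmt),
      ],
  ];
  final csv = ListToCsvConverter(fieldDelimiter: config.delimiter).convert(rows);
  return Uint8List.fromList(utf8.encode(csv));
}

Object? _formatCell(Object? value, DateFormat dateFmt) =>
    value is DateTime ? dateFmt.format(value) : value;
```

Note that an empty records list with includeHeader true naturally yields the required header-only CSV, satisfying the empty-list acceptance criterion without special-casing.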

Run CPU-intensive serialization in a Flutter Isolate using compute() to keep the export UI responsive. The Supabase Storage bucket name ('exports') must be created in the Supabase dashboard with appropriate policies before upload calls will succeed.
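The isolate dispatch can be a thin wrapper around compute(). A sketch, assuming the generateCsv function and model classes above; the 200-record threshold comes from the performance requirements:

```dart
import 'dart:typed_data';

import 'package:flutter/foundation.dart' show compute;

/// Performance requirement: serialize off the UI isolate for lists
/// larger than 200 records.
const _isolateThreshold = 200;

Future<Uint8List> generateCsvAsync(
  List<AccountingRecord> records,
  CsvExportConfig config,
) {
  if (records.length <= _isolateThreshold) {
    // Small lists: run inline, no isolate spawn overhead.
    return Future.value(generateCsv(records, config));
  }
  // compute() takes a single argument, so bundle records and config
  // into a Dart 3 record.
  return compute(_generateCsvEntry, (records, config));
}

Uint8List _generateCsvEntry((List<AccountingRecord>, CsvExportConfig) args) =>
    generateCsv(args.$1, args.$2);
```

compute() requires the entry point to be a top-level or static function, and all arguments must be sendable across isolates, which plain value classes like the sketched AccountingRecord are.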

Testing Requirements

Unit tests: Test generateCsv with 3 records — verify header row, data rows, correct delimiter, correct date format. Test CSV special character escaping: claimantName with commas and double-quotes. Test generateJson with Xledger config — verify JSON structure matches expected schema. Test generateJson with Dynamics config — verify different structure.

Test the empty list — verify valid empty CSV and JSON output. Integration tests (mocked Storage): test that uploadToStorage returns a string URL, and that generateAndUploadBoth calls upload twice and returns both URLs. Test that an upload failure on the second file rolls back gracefully (or document that it does not and that cleanup is manual).
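The empty-list criterion is the cheapest test to write first. A sketch using package:test, assuming the generateCsv and CsvExportConfig sketches from this task are in scope:

```dart
import 'dart:convert';

import 'package:test/test.dart';

void main() {
  test('empty record list yields a header-only CSV', () {
    final bytes = generateCsv(
      const [],
      const CsvExportConfig(columnOrder: ['claimId', 'amount']),
    );
    final csv = utf8.decode(bytes);
    // Header row only, no data rows, no exception thrown.
    expect(csv.trim(), 'claimId,amount');
  });
}
```

The special-character test follows the same shape: build a record whose claimantName contains commas and double-quotes, then assert the emitted field is quoted with doubled inner quotes per RFC 4180.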

Component
CSV / JSON File Generator (infrastructure, medium complexity)
Epic Risks (3)
Risk 1: technical (high impact, medium probability)

Adding exported_at and export_run_id columns to expense_claims requires a live migration on a table shared with the approval workflow. A poorly timed migration could lock the table and block claim submissions or approvals.

Mitigation & Contingency

Mitigation: Use non-blocking ADD COLUMN with a DEFAULT of NULL (no backfill needed) executed during a low-traffic window. Test migration rollback on a staging replica before production deployment.

Contingency: If migration causes table lock contention, roll back and reschedule for a maintenance window. Use a feature flag to gate the export UI until the migration completes successfully.

Risk 2: scope (medium impact, high probability)

Chart of accounts mapping configurations for Xledger and Dynamics may not be fully specified by stakeholders at development time, leaving the mapper with incomplete data and causing validation failures for unmapped expense categories.

Mitigation & Contingency

Mitigation: Implement the mapper to return a structured validation error (not a crash) for any unmapped field, and surface these errors clearly in the export confirmation dialog. Request full mapping tables from Blindeforbundet and HLF stakeholders as a pre-condition for this epic.

Contingency: If mappings arrive incomplete, ship the mapper with the available subset and mark unmapped categories as excluded (skipped with reason). Coordinators see which categories are skipped and can manually submit those records.

Risk 3: dependency (medium impact, medium probability)

Supabase Vault configuration for storing per-org accounting credentials may require infra permissions or environment secrets not yet provisioned in staging or production, blocking development and testing of credential retrieval.

Mitigation & Contingency

Mitigation: Provision Vault secrets and environment configuration in staging as the first task of this epic. Document the exact secret naming convention and rotation procedure before implementation begins.

Contingency: If Vault is unavailable, use environment variables scoped to the Edge Function as a temporary fallback for development. Block production deployment until Vault-based storage is confirmed operational.