Priority: Critical · Complexity: High · Type: Integration · Status: Pending · Owner: Integration Specialist · Tier 2

Acceptance Criteria

export(orgId, List<ApprovedClaim>) returns Future<ExportResult> containing fileUrl (signed Supabase Storage URL), recordCount (int), and exportRunId (uuid)
Each ApprovedClaim produces exactly two accounting rows (debit + credit) in the Xledger CSV/JSON — the double-entry bookkeeping invariant holds (sum of debits == sum of credits)
voucher_date is formatted as 'YYYY-MM-DD' per the Xledger intake spec
account_code is resolved via ChartOfAccountsMapper.resolveForClaim(claim) — if no mapping exists, the claim is moved to an unresolvable list and excluded from the file (not silently dropped)
employee_id is the peer mentor's Xledger employee reference from the org's credential configuration — never the internal Supabase UUID
cost_center maps to the peer mentor's chapter/region code from the ChartOfAccountsMapper
description is a string truncated to a maximum of 50 characters: '{activity_type_name} – {date}' per the Xledger field length limit
Pre-flight validation rejects the export and returns a typed ValidationFailureResult if: any required field is blank, any amount is negative, or total debit != total credit
Generated file is uploaded to the org's private Supabase Storage bucket (path: exports/{orgId}/xledger/{exportRunId}.csv) and a signed URL (1-hour expiry) is returned
An export_runs record is created in the database with status 'completed' and file_url on success, or status 'failed' with error_detail on any unhandled exception
Unit tests cover: successful mapping of 3 representative claim types, missing account code handling, field truncation, double-entry balance assertion
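The double-entry invariant and the pre-flight balance check above can be expressed as a small pure function. A minimal, dependency-free sketch, assuming a row model with integer amounts in øre (XledgerRow, debitOre, and creditOre are illustrative names, not the final schema):

```dart
// Illustrative row model: amounts as int øre, one of the two sides is
// always zero on any given row (names are assumptions, not final).
class XledgerRow {
  final int debitOre; // debit amount in øre; 0 on credit rows
  final int creditOre; // credit amount in øre; 0 on debit rows
  const XledgerRow({required this.debitOre, required this.creditOre});
}

/// Returns true when the batch satisfies the double-entry invariant:
/// the sum of all debits equals the sum of all credits.
bool isBalanced(List<XledgerRow> rows) {
  final debits = rows.fold<int>(0, (sum, r) => sum + r.debitOre);
  final credits = rows.fold<int>(0, (sum, r) => sum + r.creditOre);
  return debits == credits;
}

void main() {
  final rows = [
    const XledgerRow(debitOre: 12500, creditOre: 0),
    const XledgerRow(debitOre: 0, creditOre: 12500),
  ];
  print(isBalanced(rows)); // true
}
```

Keeping the check over integer øre avoids the floating-point rounding issues the implementation notes call out.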

Technical Requirements

Frameworks
Flutter
Riverpod
flutter_test
APIs
Supabase Storage — file upload and signed URL generation
Xledger REST API — intake format spec (CSV/JSON column schema)
Data Models
activity
assignment
Performance Requirements
Transformation of up to 500 claims completes in under 2 seconds on device
File upload to Supabase Storage completes with progress tracking for files over 1MB
Security Requirements
Xledger API key and org credentials are stored exclusively in Supabase Edge Function environment variables — never in the Flutter app bundle
File upload uses the service role key server-side via an Edge Function — the mobile client never holds the service role key
Generated CSV is stored in a private Supabase Storage bucket with per-org RLS policies
Signed URL expires after 1 hour — not a permanent public link
The employee_id field must not expose internal UUIDs — only the Xledger-specific reference ID

Execution Context

Execution Tier
Tier 2 (518 tasks)

Can start after Tier 1 completes

Implementation Notes

Structure the service as XledgerExporter with injected dependencies: ChartOfAccountsMapper, CsvJsonFileGenerator, ExportRunRepository, SupabaseStorageClient. The transformation pipeline should be: (1) map each ApprovedClaim to an XledgerJournalEntry using XledgerRecordMapper (pure, synchronous, easily testable); (2) validate the full batch (sum debits == sum credits, no blank required fields); (3) serialize to CSV via CsvJsonFileGenerator; (4) upload to Supabase Storage; (5) create the export_runs record; (6) return ExportResult. Define XledgerJournalEntry as an immutable Dart class with all 7 required fields typed (e.g., amount as int in øre to avoid floating-point rounding). Use Dart's intl DateFormat for voucher_date formatting.
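The immutable entry class described above might look like the following. This is a sketch under the stated assumptions: field names mirror the acceptance criteria but are not a final schema, and a dependency-free date formatter stands in for intl's DateFormat('yyyy-MM-dd') so the example runs on its own.

```dart
// Sketch of the immutable journal-entry model: the 7 required fields
// from the acceptance criteria, amounts as int øre.
class XledgerJournalEntry {
  final DateTime voucherDate;
  final String accountCode;
  final String employeeId; // Xledger reference, never the Supabase UUID
  final String costCenter;
  final String description; // already truncated to 50 chars
  final int debitOre; // int øre avoids floating-point rounding
  final int creditOre;

  const XledgerJournalEntry({
    required this.voucherDate,
    required this.accountCode,
    required this.employeeId,
    required this.costCenter,
    required this.description,
    required this.debitOre,
    required this.creditOre,
  });

  /// 'YYYY-MM-DD' per the Xledger intake spec. In the app this would use
  /// intl's DateFormat('yyyy-MM-dd'); a padLeft equivalent is shown here.
  String get formattedVoucherDate {
    String two(int n) => n.toString().padLeft(2, '0');
    return '${voucherDate.year}-${two(voucherDate.month)}-${two(voucherDate.day)}';
  }
}

void main() {
  final e = XledgerJournalEntry(
    voucherDate: DateTime(2025, 3, 7),
    accountCode: '7100',
    employeeId: 'X-42',
    costCenter: 'OSLO',
    description: 'Mileage – 2025-03-07',
    debitOre: 45000,
    creditOre: 0,
  );
  print(e.formattedVoucherDate); // 2025-03-07
}
```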

For the double-entry row pair: the debit row has debit_amount set and credit_amount = 0; the credit row is the inverse. Wrap the entire export in a try/catch that writes a 'failed' export_run record before rethrowing — so the failure is always recorded. Edge Function deployment is out of scope for this task; the Flutter service should call the Edge Function endpoint rather than calling Supabase Storage directly with a service key.
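The row-pair mapping above, together with the 50-character description truncation from the acceptance criteria, can be sketched as follows. Row, toRowPair, and the account parameters are illustrative assumptions; the offset account in particular would come from the org's chart-of-accounts configuration.

```dart
// Illustrative row shape for the debit/credit pair (not a final schema).
class Row {
  final String accountCode;
  final int debitOre;
  final int creditOre;
  final String description;
  const Row(this.accountCode, this.debitOre, this.creditOre, this.description);
}

/// Truncate to the 50-character Xledger field limit.
String truncate50(String s) => s.length <= 50 ? s : s.substring(0, 50);

/// One approved claim becomes exactly two rows: the debit row carries
/// the amount with credit 0, and the credit row is the inverse.
List<Row> toRowPair({
  required String expenseAccount,
  required String offsetAccount,
  required int amountOre,
  required String description,
}) {
  final desc = truncate50(description);
  return [
    Row(expenseAccount, amountOre, 0, desc),
    Row(offsetAccount, 0, amountOre, desc),
  ];
}

void main() {
  final rows = toRowPair(
    expenseAccount: '7140',
    offsetAccount: '2990',
    amountOre: 32500,
    description: 'Public transit – 2025-03-07',
  );
  print(rows.length); // 2
  print(rows[0].debitOre == rows[1].creditOre); // true
}
```

Because each claim always emits the pair together, a batch built only through toRowPair balances by construction; the pre-flight sum check then guards against bugs elsewhere in the pipeline.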

Testing Requirements

Unit tests (flutter_test with mocked dependencies): test XledgerRecordMapper.toXledgerRows(claim) for each claim type (mileage, toll, parking, public transit); assert debit/credit balance; assert field truncation at 50 chars; assert missing account code moves claim to unresolvable list. Integration test: invoke full export() with a list of 5 mock ApprovedClaims against a test Supabase Storage bucket; verify file created, signed URL returned, export_runs record created with status 'completed'. Test file: test/services/xledger_exporter_test.dart. Mock ChartOfAccountsMapper and CsvJsonFileGenerator in unit tests.

Cover validation failure path: pass a claim with negative amount, verify ValidationFailureResult returned.
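The validation-failure path can be made easy to assert on by modeling the outcome as a type rather than a thrown exception. A dependency-free sketch (ExportOutcome, ExportOk, and the validate helper are illustrative names; the real suite would express this as a flutter_test case with mocked dependencies):

```dart
// Illustrative typed outcomes for pre-flight validation (assumed names).
abstract class ExportOutcome {}

class ValidationFailureResult extends ExportOutcome {
  final String reason;
  ValidationFailureResult(this.reason);
}

class ExportOk extends ExportOutcome {}

/// Rejects the batch if any amount (in øre) is negative.
ExportOutcome validate(List<int> amountsOre) {
  if (amountsOre.any((a) => a < 0)) {
    return ValidationFailureResult('negative amount');
  }
  return ExportOk();
}

void main() {
  final outcome = validate([12500, -300]);
  print(outcome is ValidationFailureResult); // true
}
```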

Component
Chart of Accounts Mapper
Data: medium

Epic Risks (3)

Risk 1: high impact, medium probability, technical

Adding exported_at and export_run_id columns to expense_claims requires a live migration on a table shared with the approval workflow. A poorly timed migration could lock the table and block claim submissions or approvals.

Mitigation & Contingency

Mitigation: Use a non-blocking ADD COLUMN (nullable, no DEFAULT, so no backfill or table rewrite is needed) executed during a low-traffic window. Test migration rollback on a staging replica before production deployment.

Contingency: If migration causes table lock contention, roll back and reschedule for a maintenance window. Use a feature flag to gate the export UI until the migration completes successfully.

Risk 2: medium impact, high probability, scope

Chart of accounts mapping configurations for Xledger and Dynamics may not be fully specified by stakeholders at development time, leaving the mapper with incomplete data and causing validation failures for unmapped expense categories.

Mitigation & Contingency

Mitigation: Implement the mapper to return a structured validation error (not a crash) for any unmapped field, and surface these errors clearly in the export confirmation dialog. Request full mapping tables from Blindeforbundet and HLF stakeholders as a pre-condition for this epic.

Contingency: If mappings arrive incomplete, ship the mapper with the available subset and mark unmapped categories as excluded (skipped with reason). Coordinators see which categories are skipped and can manually submit those records.
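The "excluded with reason" contingency above mirrors the unresolvable-list behavior in the acceptance criteria: claims whose category has no chart-of-accounts mapping are collected separately rather than silently dropped. A minimal sketch, with illustrative names (Partition, partitionByMapping) and a plain map standing in for the ChartOfAccountsMapper:

```dart
// Illustrative split of claim categories into mapped vs. unresolvable.
class Partition {
  final List<String> mapped = [];
  final List<String> unresolvable = [];
}

Partition partitionByMapping(
  List<String> claimCategories,
  Map<String, String> chartOfAccounts, // category -> account code
) {
  final p = Partition();
  for (final category in claimCategories) {
    // Unmapped categories are recorded, never silently dropped.
    (chartOfAccounts.containsKey(category) ? p.mapped : p.unresolvable)
        .add(category);
  }
  return p;
}

void main() {
  final p = partitionByMapping(
    ['mileage', 'toll', 'hotel'],
    {'mileage': '7140', 'toll': '7145'},
  );
  print(p.mapped); // [mileage, toll]
  print(p.unresolvable); // [hotel]
}
```

Surfacing p.unresolvable in the export confirmation dialog gives coordinators the visibility the contingency calls for.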

Risk 3: medium impact, medium probability, dependency

Supabase Vault configuration for storing per-org accounting credentials may require infra permissions or environment secrets not yet provisioned in staging or production, blocking development and testing of credential retrieval.

Mitigation & Contingency

Mitigation: Provision Vault secrets and environment configuration in staging as the first task of this epic. Document the exact secret naming convention and rotation procedure before implementation begins.

Contingency: If Vault is unavailable, use environment variables scoped to the Edge Function as a temporary fallback for development. Block production deployment until Vault-based storage is confirmed operational.