Implement Export Run Repository data layer
epic-accounting-system-export-foundation-task-002 — Build the ExportRunRepository class in Dart with full CRUD access to the export_runs table via Supabase. Implement methods: createRun(), updateRunStatus(), getRunById(), getRunsByOrg(), markClaimsExported(), and getLastExportDate(). Use Riverpod for dependency injection and ensure all queries respect RLS tenant isolation. Map database rows to typed ExportRun model objects.
Acceptance Criteria
Technical Requirements
Execution Context
Tier 1 - 540 tasks
Can start after Tier 0 completes
Implementation Notes
Place the repository in lib/features/accounting/data/repositories/export_run_repository.dart, following the existing feature-slice folder structure. Define the ExportRun model in lib/features/accounting/domain/models/export_run.dart. Use Riverpod's @riverpod code generation if the project already uses it; otherwise use a plain Provider. For markClaimsExported, use supabase.from('expense_claims').update({'exported_at': DateTime.now().toIso8601String()}).in_('id', claimIds) — confirm the column name matches the migration, and note that supabase_flutter 2.x renames .in_ to .inFilter.
Treat a PostgrestException with code '42501' (insufficient_privilege) as a permissions error and surface a user-facing message. Do not store repository state in a StateNotifier — this is a pure data-access layer with no local state.
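A minimal sketch of the repository shape, assuming the supabase_flutter 2.x client API; the column names (org_id, target_system, status, exported_at) and the 'pending' initial status are assumptions to be checked against the actual migration, and the ExportRun model is defined elsewhere in this epic:

```dart
// Sketch only — not the final implementation. Assumes supabase_flutter ^2.x
// and an ExportRun model with a fromJson factory. RLS handles tenant
// isolation server-side, so reads need no explicit org filter here.
import 'package:supabase_flutter/supabase_flutter.dart';

class ExportRunNotFoundException implements Exception {
  final String runId;
  ExportRunNotFoundException(this.runId);
}

class ExportPermissionException implements Exception {
  final String message = 'You do not have permission to run exports.';
}

class ExportRunRepository {
  final SupabaseClient _client;
  ExportRunRepository(this._client);

  Future<ExportRun> createRun({
    required String orgId,
    required String targetSystem,
  }) async {
    try {
      final row = await _client
          .from('export_runs')
          .insert({
            'org_id': orgId,          // assumed column name
            'target_system': targetSystem,
            'status': 'pending',      // assumed initial status
          })
          .select()
          .single();
      return ExportRun.fromJson(row);
    } on PostgrestException catch (e) {
      // 42501 = insufficient_privilege → user-facing permissions error.
      if (e.code == '42501') throw ExportPermissionException();
      rethrow;
    }
  }

  /// Stamps all claims in a single round trip (not N updates).
  Future<void> markClaimsExported(List<String> claimIds) async {
    await _client
        .from('expense_claims')
        .update({'exported_at': DateTime.now().toIso8601String()})
        .inFilter('id', claimIds); // `.in_` on supabase_flutter 1.x
  }
}
```

Because the class holds only the injected client and no mutable state, a plain Riverpod Provider (or generated @riverpod provider) is sufficient for injection.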
Testing Requirements
Write unit tests using flutter_test with a mocked SupabaseClient (mockito or a hand-written mock). Test createRun: verify the correct payload shape is sent to the Supabase insert. Test updateRunStatus: verify the correct fields are sent on update, and that ExportRunNotFoundException is thrown on an empty response. Test markClaimsExported: verify a single .update().in_() call is made (not N calls).
Test getLastExportDate: verify correct filter on target_system and ordering. Test ExportRun.fromJson: cover null fields (record_count, file_url, completed_at), valid timestamps, and unknown status values. Aim for 90%+ line coverage on the repository class.
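The fromJson cases above imply a tolerant parser. A sketch of the model under those assumptions — field and column names are taken from the test list, not from a confirmed schema:

```dart
// Sketch: nullable fields stay null rather than throwing, and unknown
// status strings degrade to ExportStatus.unknown so parsing never crashes
// on values added by a future migration.
enum ExportStatus { pending, running, completed, failed, unknown }

class ExportRun {
  final String id;
  final ExportStatus status;
  final int? recordCount;      // null until the run produces output
  final String? fileUrl;       // null until a file is uploaded
  final DateTime? completedAt; // null while the run is in progress

  ExportRun({
    required this.id,
    required this.status,
    this.recordCount,
    this.fileUrl,
    this.completedAt,
  });

  factory ExportRun.fromJson(Map<String, dynamic> json) => ExportRun(
        id: json['id'] as String,
        // Unmatched status strings map to `unknown` instead of throwing.
        status: ExportStatus.values.asNameMap()[json['status']] ??
            ExportStatus.unknown,
        recordCount: json['record_count'] as int?,
        fileUrl: json['file_url'] as String?,
        completedAt: json['completed_at'] == null
            ? null
            : DateTime.parse(json['completed_at'] as String),
      );
}
```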
Adding exported_at and export_run_id columns to expense_claims requires a live migration on a table shared with the approval workflow. A poorly timed migration could lock the table and block claim submissions or approvals.
Mitigation & Contingency
Mitigation: Use a non-blocking ADD COLUMN with a NULL default (metadata-only in PostgreSQL 11+, so no backfill or table rewrite), executed during a low-traffic window. Test the migration rollback on a staging replica before production deployment.
Contingency: If migration causes table lock contention, roll back and reschedule for a maintenance window. Use a feature flag to gate the export UI until the migration completes successfully.
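The non-blocking migration described above could be sketched as follows; it assumes PostgreSQL semantics (Supabase's backend) and the column names named in this risk, and should be rehearsed on a staging replica:

```sql
-- Sketch: in PostgreSQL 11+, ADD COLUMN with no default (or DEFAULT NULL)
-- is a metadata-only change and does not rewrite expense_claims.
-- A short lock_timeout makes the brief ACCESS EXCLUSIVE lock fail fast
-- instead of queueing behind long-running approval transactions.
SET lock_timeout = '2s';

ALTER TABLE expense_claims
  ADD COLUMN exported_at timestamptz NULL,
  ADD COLUMN export_run_id uuid NULL REFERENCES export_runs (id);

-- Index the FK separately so the ALTER stays fast; CONCURRENTLY avoids
-- blocking writes (must run outside a transaction block).
CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_expense_claims_export_run_id
  ON expense_claims (export_run_id);
```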
Chart of accounts mapping configurations for Xledger and Dynamics may not be fully specified by stakeholders at development time, leaving the mapper with incomplete data and causing validation failures for unmapped expense categories.
Mitigation & Contingency
Mitigation: Implement the mapper to return a structured validation error (not a crash) for any unmapped field, and surface these errors clearly in the export confirmation dialog. Request full mapping tables from Blindeforbundet and HLF stakeholders as a pre-condition for this epic.
Contingency: If mappings arrive incomplete, ship the mapper with the available subset and mark unmapped categories as excluded (skipped with reason). Coordinators see which categories are skipped and can manually submit those records.
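One way to make "structured error, not a crash" concrete is a per-category mapping result the export dialog can render; the names below are illustrative, not from the ticket:

```dart
// Illustrative sketch: a lookup that never throws for unmapped categories,
// so the confirmation dialog can list skipped records with a reason.
sealed class MappingResult {}

class Mapped extends MappingResult {
  final String accountCode;
  Mapped(this.accountCode);
}

class Unmapped extends MappingResult {
  final String category;
  final String reason;
  Unmapped(this.category)
      : reason = 'No chart-of-accounts mapping for "$category"';
}

MappingResult mapCategory(
    Map<String, String> chartOfAccounts, String category) {
  final code = chartOfAccounts[category];
  return code != null ? Mapped(code) : Unmapped(category);
}
```

Records yielding Unmapped are excluded from the export payload and reported as skipped, matching the contingency above.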
Supabase Vault configuration for storing per-org accounting credentials may require infra permissions or environment secrets not yet provisioned in staging or production, blocking development and testing of credential retrieval.
Mitigation & Contingency
Mitigation: Provision Vault secrets and environment configuration in staging as the first task of this epic. Document the exact secret naming convention and rotation procedure before implementation begins.
Contingency: If Vault is unavailable, use environment variables scoped to the Edge Function as a temporary fallback for development. Block production deployment until Vault-based storage is confirmed operational.