Priority: critical | Complexity: high | Area: backend | Status: pending | Assignee: backend specialist | Tier 3

Acceptance Criteria

triggerExport(orgId, dateRange, targetSystem) is the sole public entry point; no other methods are accessible externally
DoubleExportGuard is invoked before any ExportRun record is created; if a duplicate is detected, no database writes occur and the method returns an ExportResult with DUPLICATE status
A pending ExportRun record is created immediately after the guard passes, before claims are fetched, ensuring partial failures are traceable
ApprovedClaimsQueryService returns only claims with status='approved' and exported_at IS NULL within the requested date range for the given orgId
Routing to XledgerExporter or DynamicsExporter is determined solely by the organisation's accounting system configuration, not by the targetSystem parameter alone
All claims included in a successful export have their exported_at timestamp set atomically within the same database transaction as the ExportRun completion update
ExportRun is updated to status='completed' with fileUrl and recordCount upon success
ExportRun is updated to status='failed' with a non-null errorDetail string on any exception; the exception is not re-thrown to the caller
If zero approved claims match the date range, the pipeline returns an ExportResult with status='empty' and does not create a completed ExportRun
The service is injectable and all dependencies (DoubleExportGuard, ExportRunRepository, ApprovedClaimsQueryService, XledgerExporter, DynamicsExporter) are injected via constructor, not instantiated internally
No PII or credential data is logged at any log level
Unit tests achieve 100% branch coverage on the routing and failure-path logic
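Read together, the criteria above imply an orchestration skeleton along the following lines. This is a sketch only: the type names come from this document, but every method name on the injected dependencies (hasOverlappingRun, createRun, fetchApproved, completeRun, and so on) is a hypothetical placeholder, and the config-driven routing is collapsed into a resolver function to keep the stub short; the real service injects XledgerExporter and DynamicsExporter directly per the criteria.

```dart
// Sketch of the orchestration implied by the acceptance criteria.
// Minimal stand-in types so the sketch is self-contained; the real
// interfaces are built in the dependency tasks listed below.
enum ExportStatus { pending, completed, failed, duplicate, empty }

enum TargetSystem { xledger, dynamics }

class DateRange {
  const DateRange(this.start, this.end);
  final DateTime start;
  final DateTime end;
}

class ExportResult {
  const ExportResult(this.status,
      {this.fileUrl, this.recordCount, this.errorDetail});
  final ExportStatus status;
  final String? fileUrl;
  final int? recordCount;
  final String? errorDetail;
}

class ApprovedClaim {
  const ApprovedClaim(this.id);
  final String id;
}

class ExportRun {
  const ExportRun(this.id);
  final String id;
}

class ExportFile {
  const ExportFile(this.url);
  final String url;
}

abstract class DoubleExportGuard {
  Future<bool> hasOverlappingRun(
      String orgId, DateRange range, TargetSystem target);
}

abstract class ExportRunRepository {
  Future<ExportRun> createRun(
      String orgId, DateRange range, TargetSystem target);
  Future<void> updateRunStatus(String runId, ExportStatus status);
  Future<void> completeRun(String runId, String fileUrl, int recordCount,
      {required List<String> claimIds});
  Future<void> failRun(String runId, {required String errorDetail});
}

abstract class ApprovedClaimsQueryService {
  Future<List<ApprovedClaim>> fetchApproved(String orgId, DateRange range);
}

abstract class AccountingExporter {
  Future<ExportFile> export(List<ApprovedClaim> claims);
}

class AccountingExporterService {
  AccountingExporterService(
      this._guard, this._runs, this._claims, this._resolveExporter);

  final DoubleExportGuard _guard;
  final ExportRunRepository _runs;
  final ApprovedClaimsQueryService _claims;
  final Future<AccountingExporter> Function(String orgId, TargetSystem)
      _resolveExporter;

  /// Sole public entry point.
  Future<ExportResult> triggerExport(
      String orgId, DateRange dateRange, TargetSystem targetSystem) async {
    // Guard runs before any write; a duplicate causes no DB activity.
    if (await _guard.hasOverlappingRun(orgId, dateRange, targetSystem)) {
      return const ExportResult(ExportStatus.duplicate);
    }
    // Pending run is created before the claim fetch so partial
    // failures stay traceable.
    final run = await _runs.createRun(orgId, dateRange, targetSystem);
    try {
      final claims = await _claims.fetchApproved(orgId, dateRange);
      if (claims.isEmpty) {
        await _runs.updateRunStatus(run.id, ExportStatus.empty);
        return const ExportResult(ExportStatus.empty);
      }
      // Routing is driven by the org's accounting config, not the
      // targetSystem parameter alone.
      final exporter = await _resolveExporter(orgId, targetSystem);
      final file = await exporter.export(claims);
      // Claim stamping and run completion are atomic server-side
      // (see Implementation Notes).
      await _runs.completeRun(run.id, file.url, claims.length,
          claimIds: [for (final c in claims) c.id]);
      return ExportResult(ExportStatus.completed,
          fileUrl: file.url, recordCount: claims.length);
    } catch (e) {
      // Failures are recorded, never re-thrown to the caller.
      await _runs.failRun(run.id, errorDetail: e.toString());
      return ExportResult(ExportStatus.failed, errorDetail: e.toString());
    }
  }
}
```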

Technical Requirements

Frameworks
Dart
Supabase Dart client
APIs
Supabase PostgREST (claims read + update)
Supabase Storage (file URL retrieval)
Xledger REST API (via XledgerExporter)
Microsoft Dynamics REST API (via DynamicsExporter)
Data models
ExportRun
ApprovedClaim
OrgAccountingConfig
ExportResult
Performance requirements
Batch the exported_at update for all affected claims in a single Supabase upsert call rather than N individual updates
The full pipeline must complete within 30 seconds for exports of up to 500 claims; if that budget is exceeded, surface a timeout ExportResult rather than hanging
ExportRun creation and claim fetch must each complete within 3 seconds under normal Supabase latency
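One way to keep the exported_at stamping to a single round trip, while also satisfying the server-side now() security requirement below, is a Postgres function invoked via the Supabase Dart client's rpc(). The function name and parameter names here are hypothetical; the criteria equally allow a single batched update/upsert call.

```dart
import 'package:supabase_flutter/supabase_flutter.dart';

/// Stamps every exported claim in one call. The (hypothetical)
/// mark_claims_exported function would run, server-side:
///   UPDATE expense_claims
///      SET exported_at = now(), export_run_id = p_run_id
///    WHERE id = ANY(p_claim_ids);
Future<void> markClaimsExported(
  SupabaseClient client,
  String runId,
  List<String> claimIds,
) async {
  await client.rpc('mark_claims_exported', params: {
    'p_run_id': runId,
    'p_claim_ids': claimIds,
  });
}
```

Keeping the UPDATE inside a Postgres function also makes it trivial to fold the ExportRun completion update into the same transaction, which is what the Implementation Notes recommend when the client cannot express a transaction directly.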
Security requirements
Service must only be invoked from the generate-export Supabase Edge Function — never directly from the Flutter client
orgId must be validated against the authenticated user's organisation before any database query
exported_at update must use a server-side timestamp (now()) to prevent client clock manipulation
All Supabase queries must use Row Level Security policies; no service-role key bypass in application logic

Execution Context

Execution Tier
Tier 3 (413 tasks)

Can start after Tier 2 completes

Implementation Notes

Use a sealed class or enum for ExportStatus (PENDING, COMPLETED, FAILED, DUPLICATE, EMPTY) to make exhaustive switch handling compile-time safe in Dart. Wrap the entire pipeline from 'create ExportRun' through 'update ExportRun to completed' in a try/catch/finally where the finally block guarantees ExportRun is never left in PENDING state. The claims update and ExportRun completion update should be issued as two sequential Supabase calls inside a Postgres transaction if the Supabase client supports it; otherwise use a Postgres function called via rpc() to keep atomicity server-side. Inject a Clock abstraction rather than calling DateTime.now() directly to keep tests deterministic.
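A minimal sketch of two of the pieces named above, the status enum and the injectable Clock, plus the shape of the "never left PENDING" guarantee. The helper names (guardPending, writeStatus) are illustrative placeholders, not a prescribed API.

```dart
enum ExportStatus { pending, completed, failed, duplicate, empty }

/// Injectable clock keeps tests deterministic (no direct DateTime.now()).
abstract class Clock {
  DateTime now();
}

class SystemClock implements Clock {
  @override
  DateTime now() => DateTime.now().toUtc();
}

class FixedClock implements Clock {
  FixedClock(this._fixed);
  final DateTime _fixed;
  @override
  DateTime now() => _fixed;
}

/// Shape of the PENDING guarantee: the finally block runs on every exit
/// path, so a terminal status is always written for the run.
Future<ExportStatus> guardPending(
  Future<ExportStatus> Function() pipeline,
  Future<void> Function(ExportStatus) writeStatus,
) async {
  var status = ExportStatus.pending;
  try {
    status = await pipeline();
    return status;
  } catch (_) {
    status = ExportStatus.failed;
    rethrow; // the real service converts this to an ExportResult instead
  } finally {
    if (status == ExportStatus.pending) status = ExportStatus.failed;
    await writeStatus(status); // run is never left in PENDING state
  }
}
```

Because ExportStatus is an enum, a `switch` over it is exhaustiveness-checked by the Dart analyzer, which is exactly the compile-time safety the note above asks for.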

The routing logic should delegate to an AccountingExporterFactory that reads OrgAccountingConfig — keep AccountingExporterService free of any Xledger/Dynamics-specific imports. Use Dart's async/await with structured error handling (no unawaited futures). Log export start, completion, and failure at INFO level with orgId and dateRange (no claim content).
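The factory indirection described above could look roughly like this. The OrgAccountingConfig shape and the config loader are assumptions; the point is that the orchestrator only ever sees the shared AccountingExporter interface.

```dart
// Stand-in types; the real OrgAccountingConfig and exporters are built
// in their own tasks.
enum AccountingSystem { xledger, dynamics }

class OrgAccountingConfig {
  const OrgAccountingConfig(this.system);
  final AccountingSystem system;
}

class ApprovedClaim {}

class ExportResult {}

/// Shared interface: AccountingExporterService depends only on this,
/// never on Xledger- or Dynamics-specific imports.
abstract class AccountingExporter {
  Future<ExportResult> export(List<ApprovedClaim> claims);
}

class AccountingExporterFactory {
  AccountingExporterFactory(this._loadConfig, this._xledger, this._dynamics);

  // Hypothetical loader; in practice this reads OrgAccountingConfig
  // from Supabase.
  final Future<OrgAccountingConfig> Function(String orgId) _loadConfig;
  final AccountingExporter _xledger;
  final AccountingExporter _dynamics;

  Future<AccountingExporter> forOrg(String orgId) async {
    final config = await _loadConfig(orgId);
    // Dart 3 switch expression is exhaustiveness-checked: adding a new
    // AccountingSystem value fails to compile until handled here.
    return switch (config.system) {
      AccountingSystem.xledger => _xledger,
      AccountingSystem.dynamics => _dynamics,
    };
  }
}
```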

Testing Requirements

Unit tests (flutter_test / dart test): mock all five injected dependencies and verify each orchestration branch independently: happy path Xledger, happy path Dynamics, DUPLICATE guard abort, empty claims result, exporter throws an exception (verify ExportRun is set to failed), and the claims update called with the correct IDs.

Integration test (Supabase local emulator): seed approved claims for an org, call triggerExport, and assert that the ExportRun row in the database has status='completed' and all seeded claims have exported_at set.

Regression test: re-run triggerExport for the same org/dateRange, and assert a DUPLICATE result and that no new ExportRun row is created.

Coverage gate: 90% line coverage on the AccountingExporterService class.
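The DUPLICATE-branch unit test can be expressed with a hand-rolled fake rather than a mocking library. The Service and guard below are deliberately simplified stand-ins so the sketch is self-contained and runnable with package:test; the real test targets AccountingExporterService and its actual dependency interfaces.

```dart
import 'package:test/test.dart';

// Simplified stand-ins for the production interfaces.
abstract class DoubleExportGuard {
  Future<bool> hasOverlappingRun(String orgId);
}

class Service {
  Service(this.guard, this.onRunCreated);
  final DoubleExportGuard guard;
  final void Function() onRunCreated;

  Future<String> triggerExport(String orgId) async {
    if (await guard.hasOverlappingRun(orgId)) return 'duplicate';
    onRunCreated(); // would create the pending ExportRun
    return 'completed';
  }
}

/// Fake guard that always reports an overlapping completed run.
class AlwaysDuplicateGuard implements DoubleExportGuard {
  @override
  Future<bool> hasOverlappingRun(String orgId) async => true;
}

void main() {
  test('duplicate guard aborts before any ExportRun write', () async {
    var wroteRun = false;
    final service = Service(AlwaysDuplicateGuard(), () => wroteRun = true);
    expect(await service.triggerExport('org-1'), 'duplicate');
    expect(wroteRun, isFalse); // no database write on the DUPLICATE path
  });
}
```

The same fake-plus-probe pattern covers the other branches: a guard that returns false combined with a throwing exporter exercises the failed path, and an empty claims fake exercises the EMPTY path.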

Component
Approved Claims Query Service
Data: medium
Dependencies (5)
epic-accounting-system-export-foundation-task-002: Build the ExportRunRepository class in Dart with full CRUD access to the export_runs table via Supabase. Implement methods: createRun(), updateRunStatus(), getRunById(), getRunsByOrg(), markClaimsExported(), and getLastExportDate(). Use Riverpod for dependency injection and ensure all queries respect RLS tenant isolation. Map database rows to typed ExportRun model objects.

epic-accounting-system-export-foundation-task-003: Build the ApprovedClaimsQueryService that fetches approved expense claims filtered by org_id, date range (submitted_at between start and end), and exclusion of already-exported claims (exported_at IS NULL). Support an optional filter by claim type. Return a typed list of ApprovedClaim objects including all fields required for accounting output: claimant, amount, expense_type, receipt_url, activity_id, and approval metadata. Implement pagination for large result sets.

epic-accounting-system-export-foundation-task-009: Build the DoubleExportGuard service that enforces the idempotency invariant: no approved claim may be included in more than one successful export run. Before any export proceeds, query export_runs for completed runs within an overlapping date range for the same org and target system. If an overlap is detected, surface a descriptive error with details of the conflicting run. Implement as a pre-flight check callable from the Accounting Exporter Service. Write unit tests covering overlap, no-overlap, and partial-overlap scenarios.

epic-accounting-system-export-foundation-task-010: Build the XledgerExporter service that transforms a list of ApprovedClaim objects into Xledger-compatible accounting records, using ChartOfAccountsMapper for account code resolution and CsvJsonFileGenerator for file output. Map fields to Xledger's required columns: voucher_date, account_code, debit_amount, credit_amount, description, employee_id, and cost_center. Validate output against Xledger intake rules before generating the file. Return a typed ExportResult including file URL and record count.

epic-accounting-system-export-foundation-task-011: Build the DynamicsExporter service that transforms ApprovedClaim objects into Microsoft Dynamics 365-compatible accounting records using ChartOfAccountsMapper and CsvJsonFileGenerator. Map fields to the Dynamics journal line format: transactionDate, ledgerAccount, amount, currency, description, workerDimension, and projectDimension. Support the HLF Dynamics portal intake schema. Validate all required fields before file generation. Return a typed ExportResult with file URL and record count.
Epic Risks (3)
Impact: high | Probability: medium | Type: technical

Adding exported_at and export_run_id columns to expense_claims requires a live migration on a table shared with the approval workflow. A poorly timed migration could lock the table and block claim submissions or approvals.

Mitigation & Contingency

Mitigation: Use non-blocking ADD COLUMN with a DEFAULT of NULL (no backfill needed) executed during a low-traffic window. Test migration rollback on a staging replica before production deployment.

Contingency: If migration causes table lock contention, roll back and reschedule for a maintenance window. Use a feature flag to gate the export UI until the migration completes successfully.

Impact: medium | Probability: high | Type: scope

Chart of accounts mapping configurations for Xledger and Dynamics may not be fully specified by stakeholders at development time, leaving the mapper with incomplete data and causing validation failures for unmapped expense categories.

Mitigation & Contingency

Mitigation: Implement the mapper to return a structured validation error (not a crash) for any unmapped field, and surface these errors clearly in the export confirmation dialog. Request full mapping tables from Blindeforbundet and HLF stakeholders as a pre-condition for this epic.

Contingency: If mappings arrive incomplete, ship the mapper with the available subset and mark unmapped categories as excluded (skipped with reason). Coordinators see which categories are skipped and can manually submit those records.

Impact: medium | Probability: medium | Type: dependency

Supabase Vault configuration for storing per-org accounting credentials may require infra permissions or environment secrets not yet provisioned in staging or production, blocking development and testing of credential retrieval.

Mitigation & Contingency

Mitigation: Provision Vault secrets and environment configuration in staging as the first task of this epic. Document the exact secret naming convention and rotation procedure before implementation begins.

Contingency: If Vault is unavailable, use environment variables scoped to the Edge Function as a temporary fallback for development. Block production deployment until Vault-based storage is confirmed operational.