Implement Double-Export Guard Service
epic-accounting-system-export-engine-task-002 — Implement the Double-Export Guard service that atomically filters out already-exported claims before a new export run begins. The service must query the export_runs table to find claims already associated with a completed export run, exclude those claim IDs from the current batch, and after a successful export atomically mark the newly included claims with the export run reference. Use a database transaction to ensure atomicity of the mark operation.
Acceptance Criteria
Technical Requirements
Execution Context
Tier 1 - 540 tasks
Can start after Tier 0 completes
Implementation Notes
The atomicity requirement is critical. Supabase's PostgREST does not support multi-statement transactions via the REST API. Use a Supabase PostgreSQL RPC function (a `plpgsql` function) that performs the filter-and-mark logic in a single call, invoked via `supabase.rpc()`. A `plpgsql` function body always executes inside a single transaction, so the whole call is atomic by default; note that an explicit `COMMIT` is neither needed nor allowed inside a function (only inside a procedure). This is the only reliable way to get true atomicity without a server-side middleware layer.
The RPC should accept `export_run_id UUID` and `candidate_claim_ids UUID[]`, and return the filtered (unexported) claim IDs so the caller knows which claims to pass to the exporter. This collapses filter + mark into a single round trip, eliminating the TOCTOU (time-of-check/time-of-use) race condition that would exist if filter and mark were separate REST calls. Inject `SupabaseClient` via a Riverpod provider. Name the RPC `filter_and_reserve_claims_for_export`.
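A possible shape for the RPC is sketched below. Table and column names (`claims.id`, a junction table `export_run_claims`, `export_runs.status = 'completed'`) are assumptions and must be adjusted to the actual schema; the `ON CONFLICT (claim_id)` clause assumes the unique constraint on `claim_id` that the concurrency test scenario also relies on.

```sql
-- Sketch only; schema names are assumptions, adjust to the real tables.
create or replace function filter_and_reserve_claims_for_export(
  p_export_run_id uuid,
  p_candidate_claim_ids uuid[]
) returns setof uuid
language plpgsql
as $$
begin
  return query
  with unexported as (
    -- Candidates not yet tied to a completed export run.
    -- Row locks serialize against a concurrent call touching the same claims.
    select c.id
    from claims c
    where c.id = any(p_candidate_claim_ids)
      and not exists (
        select 1
        from export_run_claims erc
        join export_runs er on er.id = erc.export_run_id
        where erc.claim_id = c.id
          and er.status = 'completed'
      )
    for update of c
  ),
  reserved as (
    insert into export_run_claims (export_run_id, claim_id)
    select p_export_run_id, u.id
    from unexported u
    on conflict (claim_id) do nothing  -- unique constraint makes the guard race-safe
    returning claim_id
  )
  select claim_id from reserved;
end;
$$;
```

Because the function body runs in one transaction, filter and mark cannot be interleaved with another export run, and a failure anywhere rolls back all reservations.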
Testing Requirements
Integration tests using `flutter_test` with a Supabase test project or a local Supabase instance. Required test scenarios:

1. Happy path — 10 candidate claims, 3 already exported; filter returns 7 and the mark succeeds.
2. All candidates already exported — filter returns an empty list and no mark call is made.
3. Mark transaction failure (simulated via RPC error mock) — verify no partial rows exist after failure.
4. Idempotent mark — call markClaimsAsExported twice with the same args; verify no duplicate rows.
5. Concurrent filter + mark race simulation — two parallel calls with overlapping claim IDs; verify exactly one call proceeds to mark (requires a DB unique constraint, not just app logic).

Unit tests for input validation (null IDs, empty list, malformed UUIDs).
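Scenario (4) might look roughly like the sketch below. The Supabase URL, key, seeded claim IDs, and RPC parameter names are placeholders for whatever the test setup provides; this is an outline, not the final test.

```dart
// Sketch of the idempotent-mark test; assumes a test Supabase project with
// the RPC installed and candidate claims seeded by the test setup.
import 'package:flutter_test/flutter_test.dart';
import 'package:supabase/supabase.dart';

void main() {
  final client = SupabaseClient('https://test.supabase.co', 'anon-key');
  const runId = '00000000-0000-0000-0000-000000000001';
  const candidates = <String>[/* claim UUIDs seeded by the test setup */];

  test('second identical call reserves nothing and adds no duplicate rows',
      () async {
    final first = await client.rpc('filter_and_reserve_claims_for_export',
        params: {
          'p_export_run_id': runId,
          'p_candidate_claim_ids': candidates,
        });
    final second = await client.rpc('filter_and_reserve_claims_for_export',
        params: {
          'p_export_run_id': runId,
          'p_candidate_claim_ids': candidates,
        });
    expect(first, isNotEmpty); // first call reserves the unexported claims
    expect(second, isEmpty);   // everything was already reserved; no new rows
  });
}
```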
The Xledger CSV/JSON import specification may not be available in full detail at implementation time. If the field format, column ordering, encoding requirements, or required fields differ from assumptions, the generated file will be rejected by Xledger on first production use.
Mitigation & Contingency
Mitigation: Obtain the official Xledger import specification document from Blindeforbundet before starting XledgerExporter implementation. Build a dedicated acceptance test that validates a sample export file against all documented constraints.
Contingency: If the spec arrives late, implement a configurable column-mapping layer so that field order and names can be adjusted via configuration without code changes. Ship a file-based export that coordinators can manually verify before connecting to Xledger import.
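The column-mapping layer could be as simple as a list of header/extractor pairs loaded from configuration, so a late spec change reorders or renames columns without touching exporter code. All names below (`XledgerColumn`, the claim field keys, the `;` separator) are illustrative assumptions, not the Xledger spec.

```dart
// Sketch of a configurable column-mapping layer for the Xledger export.
class XledgerColumn {
  const XledgerColumn(this.header, this.extract);
  final String header; // column name in the export file
  final String Function(Map<String, Object?> claim) extract;
}

// Default mapping; replace or reorder from configuration once the
// official Xledger import specification arrives.
final defaultMapping = <XledgerColumn>[
  XledgerColumn('InvoiceNo', (c) => '${c['claim_number']}'),
  XledgerColumn('Amount', (c) => '${c['amount']}'),
  XledgerColumn('Date', (c) => '${c['service_date']}'),
];

String toCsv(
    List<XledgerColumn> mapping, List<Map<String, Object?>> claims) {
  final header = mapping.map((m) => m.header).join(';');
  final rows =
      claims.map((claim) => mapping.map((m) => m.extract(claim)).join(';'));
  return [header, ...rows].join('\n');
}
```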
The atomic claim-marking transaction in Double-Export Guard could fail under high concurrency if two coordinators trigger an export for overlapping date ranges simultaneously, potentially allowing duplicate exports to proceed past the guard.
Mitigation & Contingency
Mitigation: Use a database-level advisory lock or a SELECT FOR UPDATE on the relevant claim rows within the export transaction to serialize concurrent exports per organization. Add an integration test that simulates concurrent export triggers.
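The per-organization serialization can be a one-line addition at the top of the RPC body. `p_org_id` is an assumed extra parameter; a transaction-scoped advisory lock releases itself automatically at commit or rollback.

```sql
-- Concurrent exports for the same organization queue up here, so the
-- second caller sees the first caller's committed reservations.
perform pg_advisory_xact_lock(
  hashtextextended('claim_export:' || p_org_id::text, 0));
```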
Contingency: If locking proves problematic at the database level, implement an application-level distributed lock using a Supabase row in a dedicated export_locks table with an expiry timestamp and automatic cleanup on failure.
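A possible shape for that fallback table and its acquisition query is sketched below; the column names and the 10-minute expiry are assumptions. The `ON CONFLICT ... WHERE` clause means a live lock is never stolen, only an expired one.

```sql
-- Sketch of the fallback application-level lock table.
create table export_locks (
  org_id     uuid primary key,
  locked_by  uuid not null,          -- coordinator/session that holds the lock
  expires_at timestamptz not null
);

-- Acquire: returns a row only if no live lock exists for the org.
insert into export_locks (org_id, locked_by, expires_at)
values ($1, $2, now() + interval '10 minutes')
on conflict (org_id) do update
  set locked_by  = excluded.locked_by,
      expires_at = excluded.expires_at
  where export_locks.expires_at < now()  -- steal only expired locks
returning org_id;
```

An empty result means another export holds the lock; the caller retries or aborts. Cleanup on failure is automatic because stale rows expire.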
HLF's Dynamics portal API endpoint may not be available or documented in time for Phase 1, leaving DynamicsExporter unable to be validated against a real system and potentially shipping with an incorrect field schema.
Mitigation & Contingency
Mitigation: Design DynamicsExporter for file-based export first (CSV/JSON download), with the API push implemented behind a feature flag. Request a Dynamics test environment or sandbox from HLF as early as possible.
Contingency: Ship DynamicsExporter as a file export only for Phase 1. Phase the API push integration into a follow-on task once the Dynamics sandbox is available, using the same AccountingExporter interface with no breaking changes.
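The "same interface, no breaking changes" contingency might look like the sketch below. All type and member names are assumptions based on this task description, not the real codebase.

```dart
// Sketch of the shared exporter contract with the API push behind a flag.
import 'dart:typed_data';

class Claim {/* claim fields elided */}

class ExportFile {
  const ExportFile(this.filename, this.bytes);
  final String filename;
  final Uint8List bytes;
}

abstract class AccountingExporter {
  Future<ExportFile> exportClaims(List<Claim> claims);
}

// Phase 1 ships file export only; the API push lands in a follow-on task
// without changing the AccountingExporter interface.
class DynamicsExporter implements AccountingExporter {
  DynamicsExporter({this.apiPushEnabled = false});
  final bool apiPushEnabled; // stays false until the HLF sandbox is available

  @override
  Future<ExportFile> exportClaims(List<Claim> claims) async {
    final file = ExportFile('dynamics_export.csv', _encodeCsv(claims));
    if (apiPushEnabled) {
      // Follow-on task: push `file` to the Dynamics portal API here.
    }
    return file;
  }

  Uint8List _encodeCsv(List<Claim> claims) =>
      Uint8List.fromList(const [/* CSV encoding elided */]);
}
```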