Priority: critical · Complexity: medium · Type: integration · Status: pending · Assignee: backend specialist · Tier: 3

Acceptance Criteria

Before any exporter executes, the Double-Export Guard is invoked and returns only unexported approved claims for the given organisation and period
Claims that already belong to a completed export run are excluded from the batch returned to the exporter — zero overlap with previous runs
After the exporter successfully generates a payload, the guard's mark operation is called with the export run reference, updating all included claim records atomically
If the mark operation fails after a successful payload generation, the pipeline surfaces a recoverable error that preserves the generated export artifact (file/payload not discarded)
If the initial guard filter returns an empty batch, the pipeline terminates gracefully with a user-visible informational message rather than creating an empty export run
The guard invocation is organisation-scoped — one organisation's export run cannot affect another organisation's claim state
All guard calls are wrapped in structured error types that distinguish between filter failure, mark failure, and payload generation failure
The pipeline integration is exporter-agnostic: the same guard wiring works for both XledgerExporter and DynamicsExporter without duplication
Export run reference is persisted in the export run repository before claims are marked, ensuring the reference exists before any mark attempts
Integration is covered by the BLoC/Riverpod state machine: the UI reflects distinct states for filtering, generating, marking, and error recovery
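The distinct UI states and structured error outcomes in the criteria above can be modeled exhaustively. A hedged TypeScript sketch follows; the production code would be Dart sealed classes in the BLoC layer, and every name here is illustrative:

```typescript
// Illustrative pipeline states mirroring the acceptance criteria:
// filtering -> generating -> marking -> done, plus error recovery.
type PipelineState =
  | { phase: "filtering" }
  | { phase: "empty" }                                   // guard returned no claims
  | { phase: "generating"; claimCount: number }
  | { phase: "marking"; runRef: string }
  | { phase: "done"; runRef: string }
  | { phase: "filterFailed"; error: string }
  | { phase: "generateFailed"; error: string }
  | { phase: "markFailed"; error: string; artifactUrl: string }; // artifact preserved

// Exhaustive handler: the compiler forces every outcome to be handled,
// matching the "structured error types" criterion (filter vs mark vs
// payload generation failure).
function describe(state: PipelineState): string {
  switch (state.phase) {
    case "filtering": return "Filtering approved claims...";
    case "empty": return "No unexported approved claims for this period.";
    case "generating": return `Generating payload for ${state.claimCount} claims...`;
    case "marking": return `Marking claims for run ${state.runRef}...`;
    case "done": return `Export ${state.runRef} complete.`;
    case "filterFailed": return `Guard filter failed: ${state.error}`;
    case "generateFailed": return `Payload generation failed: ${state.error}`;
    case "markFailed":
      return `Mark failed (${state.error}); payload kept at ${state.artifactUrl}.`;
  }
}
```

The discriminated union plays the role a sealed class plays in Dart: a caller that forgets one of the failure states fails to compile.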

Technical Requirements

Frameworks
Flutter
BLoC
Riverpod
flutter_test
APIs
Supabase PostgreSQL 15 (database writes for mark operation)
Supabase Edge Functions (Deno) for server-side atomic mark
Data models
activity
assignment
annual_summary
Performance requirements
Guard filter query must complete within 2 seconds for up to 500 approved claims
Mark operation must be atomic — single database transaction, no partial writes
Pipeline must not block the UI thread — all guard and exporter calls run in isolates or async Dart futures
Security requirements
Row-Level Security enforced: guard queries scoped to JWT organisation claim — cross-org data never accessible
Service role key never used client-side — mark operation routed through Supabase Edge Function
Export run reference stored server-side before returning to client to prevent client-side tampering
All database writes wrapped in Supabase transactions to prevent partial state
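The server-side mark routing described above could look roughly like the following Edge Function handler, with the HTTP wiring separated from the logic so the logic is unit-testable. The mark_claims_exported RPC name, request shape, and status codes are assumptions, not an existing API:

```typescript
// Minimal shape of the supabase-js client this handler relies on.
interface RpcClient {
  rpc(
    fn: string,
    args: Record<string, unknown>,
  ): Promise<{ error: { message: string } | null }>;
}

// Pure handler, separated from the Deno.serve wiring for testability.
async function handleMark(
  supabase: RpcClient,
  body: { claimIds: string[]; runRef: string },
): Promise<{ status: number; ok: boolean; error?: string }> {
  // One RPC call into a single Postgres function = one transaction:
  // every claim gets the run reference or none do (no partial writes).
  const { error } = await supabase.rpc("mark_claims_exported", {
    claim_ids: body.claimIds,
    run_ref: body.runRef,
  });
  if (error) return { status: 409, ok: false, error: error.message };
  return { status: 200, ok: true };
}

// Deno entry point (sketch only). The caller's JWT is forwarded so
// Row-Level Security scopes the write to the caller's organisation;
// the service role key never leaves the server.
// Deno.serve(async (req) => {
//   const supabase = createClient(SUPABASE_URL, ANON_KEY, {
//     global: { headers: { Authorization: req.headers.get("Authorization")! } },
//   });
//   const result = await handleMark(supabase, await req.json());
//   return new Response(JSON.stringify(result), { status: result.status });
// });
```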

Execution Context

Execution Tier
Tier 3 (413 tasks)

Can start after Tier 2 completes

Implementation Notes

Implement the pipeline as a use-case class (e.g., RunExportPipelineUseCase) that accepts an AccountingExporter and DoubleExportGuard as constructor-injected dependencies — this keeps the pipeline testable without real exporters. The sequence is strictly:

(1) guard.filterBatch()
(2) if the batch is empty, return ExportResult.empty
(3) exporter.generatePayload(batch)
(4) on payload success: exportRunRepo.create(runRef)
(5) guard.markExported(claimIds, runRef)
(6) on mark failure: return ExportResult.markFailed(artifact: payload)

Use a sealed class for ExportResult to force callers to handle all outcomes. Never swallow the payload on mark failure — store it in a recoverable artifact location (e.g., Supabase Storage temp bucket) so an admin can retry the mark without re-generating.
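The sequence in the note above can be sketched in code. The real implementation would be a Dart use-case class with a sealed result class; this TypeScript sketch shows the same control flow, with a discriminated union standing in for the sealed class and all types beyond the names in the note being illustrative:

```typescript
interface Claim { id: string }

interface DoubleExportGuard {
  filterBatch(org: string, period: string): Promise<Claim[]>;
  markExported(claimIds: string[], runRef: string): Promise<void>;
}
interface AccountingExporter {
  generatePayload(batch: Claim[]): Promise<Uint8Array>;
}
interface ExportRunRepository {
  create(runRef: string): Promise<void>;
}

// Discriminated union standing in for the Dart sealed class ExportResult.
type ExportResult =
  | { kind: "empty" }
  | { kind: "success"; runRef: string }
  | { kind: "markFailed"; runRef: string; artifact: Uint8Array }; // payload preserved

async function runExportPipeline(
  guard: DoubleExportGuard,
  exporter: AccountingExporter,
  runs: ExportRunRepository,
  org: string,
  period: string,
  runRef: string,
): Promise<ExportResult> {
  // (1) Guard filter: only unexported approved claims for this org/period.
  const batch = await guard.filterBatch(org, period);
  // (2) Empty batch: terminate gracefully, no export run created.
  if (batch.length === 0) return { kind: "empty" };
  // (3) Generate the payload before any state is mutated.
  const payload = await exporter.generatePayload(batch);
  // (4) Persist the run reference before marking, so every mark attempt
  //     points at an existing run.
  await runs.create(runRef);
  // (5)/(6) Mark; on failure, surface a recoverable result that keeps
  //     the artifact instead of discarding it.
  try {
    await guard.markExported(batch.map((c) => c.id), runRef);
  } catch {
    return { kind: "markFailed", runRef, artifact: payload };
  }
  return { kind: "success", runRef };
}
```

Because the function only sees the two interfaces, it works unchanged for both XledgerExporter and DynamicsExporter, matching the exporter-agnostic criterion.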

Avoid placing the mark logic in the exporter itself; keep exporters pure payload generators.

Testing Requirements

Unit tests using flutter_test with mocked DoubleExportGuard and ExportRunRepository. Integration tests against a local Supabase instance covering: (1) guard returns filtered batch correctly, (2) mark succeeds after payload generation, (3) mark failure preserves artifact and surfaces error, (4) empty batch terminates gracefully. Use fake_async for time-sensitive concurrency checks. Minimum 90% branch coverage on the pipeline integration code.
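Scenario (1), the guard returning a correctly filtered batch with zero overlap against previous runs, boils down to an assertion shaped like this sketch (plain TypeScript standing in for the flutter_test harness; the claim fields are hypothetical):

```typescript
// Hypothetical claim shape: a claim is eligible only when approved and
// not yet tied to any export run.
interface Claim { id: string; status: string; exportRunRef: string | null }

// In-memory stand-in for the guard's filter query.
function filterBatch(claims: Claim[]): Claim[] {
  return claims.filter((c) => c.status === "approved" && c.exportRunRef === null);
}

const claims: Claim[] = [
  { id: "a", status: "approved", exportRunRef: null },    // eligible
  { id: "b", status: "approved", exportRunRef: "run-7" }, // already exported
  { id: "c", status: "draft", exportRunRef: null },       // not approved
];
// Zero overlap with previous runs: only the unexported approved claim remains.
const batch = filterBatch(claims);
```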

Component
Double-Export Guard (service, medium)

Epic Risks (3)

Dependency risk (high impact, medium probability)

The Xledger CSV/JSON import specification may not be available in full detail at implementation time. If the field format, column ordering, encoding requirements, or required fields differ from assumptions, the generated file will be rejected by Xledger on first production use.

Mitigation & Contingency

Mitigation: Obtain the official Xledger import specification document from Blindeforbundet before starting XledgerExporter implementation. Build a dedicated acceptance test that validates a sample export file against all documented constraints.

Contingency: If the spec arrives late, implement a configurable column-mapping layer so that field order and names can be adjusted via configuration without code changes. Ship a file-based export that coordinators can manually verify before connecting to Xledger import.
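The configurable column-mapping contingency can be sketched as data-driven CSV assembly, where reordering or renaming columns is a configuration edit rather than a code change. TypeScript sketch; all column names are placeholders, not the Xledger spec:

```typescript
// Each output column declares its header text and which source field it
// reads, so the export format follows configuration, not code.
interface ColumnSpec { header: string; source: string }

function toCsv(rows: Record<string, string>[], columns: ColumnSpec[]): string {
  const header = columns.map((c) => c.header).join(";");
  const lines = rows.map((row) =>
    columns.map((c) => row[c.source] ?? "").join(";"),
  );
  return [header, ...lines].join("\n");
}

// When the official spec arrives, only this mapping changes:
const mapping: ColumnSpec[] = [
  { header: "Amount", source: "amount" },
  { header: "AccountNo", source: "account" },
];
```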

Technical risk (high impact, low probability)

The atomic claim-marking transaction in Double-Export Guard could fail under high concurrency if two coordinators trigger an export for overlapping date ranges simultaneously, potentially allowing duplicate exports to proceed past the guard.

Mitigation & Contingency

Mitigation: Use a database-level advisory lock or a SELECT FOR UPDATE on the relevant claim rows within the export transaction to serialize concurrent exports per organization. Add an integration test that simulates concurrent export triggers.

Contingency: If locking proves problematic at the database level, implement an application-level distributed lock using a Supabase row in a dedicated export_locks table with an expiry timestamp and automatic cleanup on failure.
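The application-level lock contingency can be sketched as acquire-by-insert with expiry cleanup. The LockStore shape and the export_locks semantics are assumptions; in the real system the store would be a Supabase table with a unique constraint per organisation, so the insert itself serializes concurrent exports. TypeScript sketch:

```typescript
// Minimal store interface over the hypothetical export_locks table.
interface LockStore {
  tryInsert(org: string, expiresAt: number): Promise<boolean>; // false if held
  delete(org: string): Promise<void>;
  expiry(org: string): Promise<number | null>;
}

async function withExportLock<T>(
  store: LockStore,
  org: string,
  ttlMs: number,
  body: () => Promise<T>,
  now: () => number = Date.now,
): Promise<T> {
  // Automatic cleanup: remove a stale lock left behind by a crashed export.
  const expiresAt = await store.expiry(org);
  if (expiresAt !== null && expiresAt < now()) await store.delete(org);

  // Acquire by insert: a second concurrent export for the same
  // organisation fails here instead of proceeding past the guard.
  if (!(await store.tryInsert(org, now() + ttlMs))) {
    throw new Error(`Export already running for ${org}`);
  }
  try {
    return await body();
  } finally {
    await store.delete(org); // always release, even when the body throws
  }
}
```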

Integration risk (medium impact, high probability)

HLF's Dynamics portal API endpoint may not be available or documented in time for Phase 1, leaving DynamicsExporter unable to be validated against a real system and potentially shipping with an incorrect field schema.

Mitigation & Contingency

Mitigation: Design DynamicsExporter for file-based export first (CSV/JSON download), with the API push implemented behind a feature flag. Request a Dynamics test environment or sandbox from HLF as early as possible.

Contingency: Ship DynamicsExporter as a file export only for Phase 1. Phase the API push integration into a follow-on task once the Dynamics sandbox is available, using the same AccountingExporter interface with no breaking changes.