Priority: critical | Complexity: medium | Type: backend | Status: pending | Assignee: backend specialist | Execution tier: Tier 2

Acceptance Criteria

The `export()` method of `DynamicsExporter` returns a populated `ExportPayloadWrapper` with `system: AccountingSystemType.dynamics`
The JSON payload is a JSON object (not array) with a top-level structure matching the Dynamics portal batch import schema: `{ "batchReference": "string", "exportedAt": "ISO8601", "records": [...] }`
Each record in the `records` array has field names matching the Dynamics portal import schema exactly (names confirmed with HLF)
Decimal amounts are serialized as JSON numbers with exactly 2 decimal places — not strings, not integers
Dates are serialized as ISO 8601 strings (`YYYY-MM-DDTHH:mm:ssZ`) in UTC
The `batchReference` in the payload header is the same as `ExportPayloadWrapper.exportRunId`
`ExportPayloadWrapper.payload` contains a `DynamicsExportPayload` value object with a `jsonString` field (Dynamics does not require CSV)
An empty mapped records list causes the method to return `AccountingExportError.ValidationError`, not an empty payload
`ExportPayloadWrapper.exportedClaimIds` exactly matches the claim IDs that were mapped and serialized
The JSON output is valid per `dart:convert` `jsonDecode` — no trailing commas, no unescaped control characters
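To make the envelope concrete, a payload satisfying the criteria above could look like the following. It is pretty-printed here for readability only (the actual output must be compact), and the record field names (`claimId`, `amount`, `postingDate`) are placeholders pending the HLF-confirmed portal schema:

```json
{
  "batchReference": "3f2b6c9e-8d41-4c6a-9a27-1b5e0f7d2c88",
  "exportedAt": "2025-06-01T12:00:00Z",
  "records": [
    {
      "claimId": "CLM-001",
      "amount": 150.00,
      "postingDate": "2025-05-28T00:00:00Z"
    }
  ]
}
```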

Technical Requirements

Frameworks
Flutter, Dart

APIs
Microsoft Dynamics portal batch import JSON schema

Data models
`DynamicsMappedRecord`, `DynamicsExportPayload`, `ExportPayloadWrapper`

Performance requirements
JSON serialization of 200 records must complete in under 100 ms.
Output JSON must be compact (no pretty-printing) to minimize upload payload size.

Security requirements
The `batchReference` (`exportRunId`) must be a UUID — validate the format before embedding it in the JSON.
JSON string fields must be escaped per the JSON spec — use `dart:convert`'s `jsonEncode` rather than manual string building.
The JSON payload must not be written to device storage unencrypted — pass it to the caller as an in-memory string.
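The UUID check on `batchReference` can be a simple format validation before serialization. A minimal sketch (the helper name `isValidUuid` is illustrative, not an existing API):

```dart
// Sketch: validate that batchReference is a canonical UUID before
// embedding it in the payload.
final _uuidPattern = RegExp(
  r'^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$',
  caseSensitive: false,
);

bool isValidUuid(String value) => _uuidPattern.hasMatch(value);

void main() {
  // A canonical UUID passes; anything else is rejected before export.
  print(isValidUuid('3f2b6c9e-8d41-4c6a-9a27-1b5e0f7d2c88')); // true
  print(isValidUuid('not-a-uuid')); // false
}
```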

Execution Context

Execution Tier
Tier 2 (518 tasks)

Can start after Tier 1 completes.

Implementation Notes

Use `dart:convert`'s `jsonEncode` (via `json.encode`) for all serialization — never build JSON via `StringBuffer` concatenation. Define a `toJson()` method on `DynamicsMappedRecord` that returns a `Map<String, dynamic>` with the exact portal field names. The top-level batch envelope should be built in `DynamicsExporter._buildBatchEnvelope()` as a private method, keeping `export()` readable. Unlike `XledgerExporter`, which produces both CSV and JSON, `DynamicsExporter` produces JSON only — do not add CSV output unless HLF explicitly requests it.
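A minimal sketch of that shape, assuming placeholder record fields and constructor parameters until HLF confirms the portal schema:

```dart
import 'dart:convert';

class DynamicsMappedRecord {
  DynamicsMappedRecord({
    required this.claimId,
    required this.amount,
    required this.postingDate,
  });

  final String claimId;
  final double amount;
  final DateTime postingDate;

  // Field names below are placeholders; use the exact portal names.
  Map<String, dynamic> toJson() => {
        'claimId': claimId,
        'amount': amount,
        'postingDate': postingDate.toUtc().toIso8601String(),
      };
}

class DynamicsExporter {
  // Private envelope builder keeps export() readable.
  Map<String, dynamic> _buildBatchEnvelope(
      String batchReference, List<DynamicsMappedRecord> records) {
    return {
      'batchReference': batchReference,
      'exportedAt': DateTime.now().toUtc().toIso8601String(),
      'records': [for (final r in records) r.toJson()],
    };
  }

  // jsonEncode output is compact (no pretty-printing) by default.
  String serialize(String batchReference, List<DynamicsMappedRecord> records) =>
      jsonEncode(_buildBatchEnvelope(batchReference, records));
}
```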

The `exportedAt` timestamp should be UTC (`DateTime.now().toUtc().toIso8601String()`). The `CsvJsonFileGenerator` component can still be used for the JSON serialization path if it supports JSON array/object output — use it for consistency, but do not force a CSV code path. Confirm with HLF whether the Dynamics portal import accepts a raw JSON file upload or requires a multipart form POST — the serialization layer (this task) is the same either way, but the upload mechanism will differ and should be noted in the API integration task.
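Two `dart:convert`/`DateTime` caveats are worth flagging against the acceptance criteria, sketched below: `toIso8601String()` on a UTC `DateTime` always includes millisecond precision (`...T12:00:00.000Z`), so if the portal requires second precision the fraction must be trimmed; and `jsonEncode` does not preserve trailing zeros on doubles (`150.00` encodes as `150.0`), so if the portal literally requires two decimal places as a JSON number, custom handling is needed — confirm with HLF. The snippet also shows a rough `Stopwatch` check against the 100 ms budget:

```dart
import 'dart:convert';

void main() {
  // (1) toIso8601String keeps millisecond precision; trim it if the
  // portal requires second precision (YYYY-MM-DDTHH:mm:ssZ).
  print(DateTime.utc(2025, 6, 1, 12).toIso8601String());
  // 2025-06-01T12:00:00.000Z

  // (2) jsonEncode does not preserve trailing zeros on doubles.
  print(jsonEncode({'amount': 150.00})); // {"amount":150.0}

  // Rough check against the 100 ms budget for 200 records.
  final records = List.generate(
      200, (i) => {'claimId': 'CLM-$i', 'amount': i * 1.25});
  final sw = Stopwatch()..start();
  jsonEncode({'records': records});
  sw.stop();
  print('encode took ${sw.elapsedMilliseconds} ms');
}
```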

Testing Requirements

Unit tests using `flutter_test`. Required scenarios: (1) 3-record batch produces valid JSON parseable by `jsonDecode`; (2) top-level structure has `batchReference`, `exportedAt`, `records` keys; (3) amount `150.00` serialized as JSON number `150.00`, not `"150.00"` string; (4) date serialized as ISO 8601 UTC string; (5) `batchReference` in payload matches `exportRunId` in wrapper; (6) empty mapped records returns `ValidationError`; (7) `exportedClaimIds` in wrapper matches input; (8) JSON with a string field containing special characters (`&`, `<`, `>`, `"`) is correctly escaped. Golden file snapshot test: generate JSON from fixed 2-record input and assert exact output matches stored fixture.
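A sketch of scenarios (1), (2), and (8) in `flutter_test` form. The exporter call is stubbed out with a hypothetical `buildSampleJson()` helper so the test file is self-contained; the real tests would call `DynamicsExporter.export()`:

```dart
import 'dart:convert';

import 'package:flutter_test/flutter_test.dart';

// Hypothetical stand-in for the real export call.
String buildSampleJson() => jsonEncode({
      'batchReference': '3f2b6c9e-8d41-4c6a-9a27-1b5e0f7d2c88',
      'exportedAt': '2025-06-01T12:00:00Z',
      'records': <Object>[],
    });

void main() {
  test('batch produces valid JSON with the expected envelope keys', () {
    final decoded =
        jsonDecode(buildSampleJson()) as Map<String, dynamic>;
    expect(decoded.keys,
        containsAll(['batchReference', 'exportedAt', 'records']));
    expect(decoded['records'], isA<List>());
  });

  test('special characters in string fields survive a round trip', () {
    final encoded = jsonEncode({'note': 'a "quoted" & <tagged> value'});
    // jsonDecode must recover the original string unchanged.
    expect(jsonDecode(encoded), {'note': 'a "quoted" & <tagged> value'});
  });
}
```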

Component
Dynamics Exporter (service, high)
Epic Risks (3)
Risk 1 (dependency): high impact, medium probability

The Xledger CSV/JSON import specification may not be available in full detail at implementation time. If the field format, column ordering, encoding requirements, or required fields differ from assumptions, the generated file will be rejected by Xledger on first production use.

Mitigation & Contingency

Mitigation: Obtain the official Xledger import specification document from Blindeforbundet before starting XledgerExporter implementation. Build a dedicated acceptance test that validates a sample export file against all documented constraints.

Contingency: If the spec arrives late, implement a configurable column-mapping layer so that field order and names can be adjusted via configuration without code changes. Ship a file-based export that coordinators can manually verify before connecting to Xledger import.
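The configurable column-mapping contingency could be as simple as an ordered name-to-field map read from configuration, so column order and headers change without code changes. A sketch with illustrative names (CSV quoting/escaping omitted):

```dart
// Sketch of a configuration-driven column mapping: each entry maps an
// output column name to a source field key, in output order.
String renderCsvRow(
    Map<String, Object?> record, List<MapEntry<String, String>> columns) {
  return columns.map((c) => '${record[c.value] ?? ''}').join(';');
}

void main() {
  // This list would come from configuration, not code.
  final config = [
    const MapEntry('Account', 'accountNo'),
    const MapEntry('Amount', 'amount'),
  ];
  final row = renderCsvRow({'accountNo': '4010', 'amount': 150.0}, config);
  print(row); // 4010;150.0
}
```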

Risk 2 (technical): high impact, low probability

The atomic claim-marking transaction in Double-Export Guard could fail under high concurrency if two coordinators trigger an export for overlapping date ranges simultaneously, potentially allowing duplicate exports to proceed past the guard.

Mitigation & Contingency

Mitigation: Use a database-level advisory lock or a SELECT FOR UPDATE on the relevant claim rows within the export transaction to serialize concurrent exports per organization. Add an integration test that simulates concurrent export triggers.

Contingency: If locking proves problematic at the database level, implement an application-level distributed lock using a Supabase row in a dedicated export_locks table with an expiry timestamp and automatic cleanup on failure.
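The expiry-based application lock could follow the shape below. `LockStore` is a hypothetical abstraction over the `export_locks` table (the real implementation would sit on the Supabase client); the acquire-run-release flow and stale-lock cleanup are the point of the sketch:

```dart
// Sketch of the expiry-based application lock described above.
abstract class LockStore {
  /// Atomically inserts a lock row; returns false if one already exists.
  Future<bool> tryInsert(String orgId, DateTime expiresAt);
  Future<void> release(String orgId);
  Future<void> deleteExpired(DateTime now);
}

/// Runs [body] under a per-organization export lock.
/// Returns null if another export is already in progress.
Future<T?> withExportLock<T>(
    LockStore store, String orgId, Future<T> Function() body) async {
  final now = DateTime.now().toUtc();
  await store.deleteExpired(now); // automatic cleanup of stale locks
  final acquired =
      await store.tryInsert(orgId, now.add(const Duration(minutes: 5)));
  if (!acquired) return null; // guard: concurrent export detected
  try {
    return await body();
  } finally {
    await store.release(orgId); // release even on failure
  }
}
```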

Risk 3 (integration): medium impact, high probability

HLF's Dynamics portal API endpoint may not be available or documented in time for Phase 1, leaving DynamicsExporter unable to be validated against a real system and potentially shipping with an incorrect field schema.

Mitigation & Contingency

Mitigation: Design DynamicsExporter for file-based export first (CSV/JSON download), with the API push implemented behind a feature flag. Request a Dynamics test environment or sandbox from HLF as early as possible.

Contingency: Ship DynamicsExporter as a file export only for Phase 1. Phase the API push integration into a follow-on task once the Dynamics sandbox is available, using the same AccountingExporter interface with no breaking changes.