Priority: critical · Complexity: high · Type: integration · Status: pending · Role: integration specialist · Tier: 2

Acceptance Criteria

export(String orgId, List<ApprovedClaim> claims) returns a Future<ExportResult> containing fileUrl, recordCount, and exportRunId
Each ApprovedClaim maps to one Dynamics journal line per the HLF Dynamics portal intake schema
transactionDate is formatted as 'YYYY-MM-DD'
ledgerAccount is resolved via ChartOfAccountsMapper.resolveForClaim(claim) — claims with no mapping are placed in an unresolvable list and excluded (not silently dropped)
amount is a positive decimal; currency is hardcoded to 'NOK' for all HLF records
workerDimension maps to the peer mentor's Dynamics worker/employee dimension code from org credential config — never the internal UUID
projectDimension maps to the chapter/project dimension code from ChartOfAccountsMapper
description is max 60 characters (Dynamics field limit): '{activity_type_name} – {peer_mentor_initials} – {date}'
Pre-flight validation rejects the export if: any required dimension is blank, amount is zero or negative, or currency is not 'NOK'
Generated file uploaded to private Supabase Storage at exports/{orgId}/dynamics/{exportRunId}.csv with 1-hour signed URL returned
export_runs record created with status 'completed' on success, 'failed' with error_detail on exception
Unit tests cover: successful mapping for mileage and toll claim types, missing ledger account handling, description truncation at 60 chars, zero-amount rejection
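The description format and pre-flight rules above can be sketched compactly. Because the export executes server-side in a Supabase Edge Function (see the security requirements), this sketch is TypeScript; the Dart mapper would mirror the same shape. Helper names such as buildDescription and preflightErrors are illustrative, not part of the spec.

```typescript
// Illustrative sketch of the description formatting and pre-flight rules.
// Field names mirror the acceptance criteria; helper names are hypothetical.

const MAX_DESCRIPTION_LENGTH = 60; // Dynamics description field limit

interface DynamicsJournalLine {
  transactionDate: string;   // 'YYYY-MM-DD'
  ledgerAccount: string;     // resolved via ChartOfAccountsMapper
  amount: number;            // positive decimal
  currency: string;          // always 'NOK' for HLF records
  workerDimension: string;   // Dynamics worker code, never the internal UUID
  projectDimension: string;  // chapter/project dimension code
  description: string;       // max 60 chars
}

// '{activity_type_name} – {peer_mentor_initials} – {date}', truncated to 60.
function buildDescription(activityTypeName: string, initials: string, date: string): string {
  const full = `${activityTypeName} – ${initials} – ${date}`;
  return full.length > MAX_DESCRIPTION_LENGTH ? full.slice(0, MAX_DESCRIPTION_LENGTH) : full;
}

// Pre-flight validation: returns a list of reasons; empty means the line passes.
function preflightErrors(line: DynamicsJournalLine): string[] {
  const errors: string[] = [];
  if (!line.ledgerAccount) errors.push('blank ledgerAccount');
  if (!line.workerDimension) errors.push('blank workerDimension');
  if (!line.projectDimension) errors.push('blank projectDimension');
  if (line.amount <= 0) errors.push('amount is zero or negative');
  if (line.currency !== 'NOK') errors.push("currency is not 'NOK'");
  return errors;
}
```

A line failing any check rejects the whole export, matching the pre-flight criterion above.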

Technical Requirements

Frameworks
Flutter
Riverpod
flutter_test

APIs
Microsoft Dynamics 365 REST API — HLF portal intake schema (journal line format)
Supabase Storage — file upload and signed URL generation

Data models
activity
assignment

Performance requirements
Transformation of up to 500 claims completes in under 2 seconds
File upload progress tracked for files over 1MB

Security requirements
Azure AD credentials and Dynamics API key stored exclusively in Supabase Edge Function environment variables — never in Flutter app binary
Export execution performed server-side via Edge Function; mobile client receives only the signed file URL
Generated file stored in private Supabase Storage bucket with org-scoped RLS
Signed URL expires after 1 hour
workerDimension must use Dynamics-specific worker codes — never expose internal Supabase UUIDs
OAuth token rotation enforced via Azure AD token lifetime policies; Edge Function must handle token refresh
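The token-refresh requirement above can be handled with a cached token and a refresh buffer. This is a minimal sketch assuming the standard Azure AD v2.0 client-credentials flow; the credential-passing shape and the scope value are assumptions (in the Edge Function, creds would come from environment variables via Deno.env.get, never the app binary).

```typescript
// Sketch of Azure AD client-credentials token caching for the Edge Function.
// The token endpoint and grant type are standard Azure AD v2.0; everything
// else (names, scope placeholder, credential plumbing) is an assumption.

interface CachedToken { accessToken: string; expiresAt: number } // expiresAt in ms epoch

// Pure helper so the refresh rule is testable without network access:
// refresh when nothing is cached or the token expires within a 60s safety buffer.
function needsRefresh(token: CachedToken | null, nowMs: number): boolean {
  return token === null || token.expiresAt - 60_000 <= nowMs;
}

let cached: CachedToken | null = null;

async function getDynamicsToken(creds: {
  tenantId: string; clientId: string; clientSecret: string; scope: string;
}): Promise<string> {
  const now = Date.now();
  if (!needsRefresh(cached, now)) return cached!.accessToken;

  const body = new URLSearchParams({
    grant_type: 'client_credentials',
    client_id: creds.clientId,
    client_secret: creds.clientSecret,
    scope: creds.scope, // e.g. '<dynamics-resource>/.default' (placeholder)
  });
  const res = await fetch(
    `https://login.microsoftonline.com/${creds.tenantId}/oauth2/v2.0/token`,
    { method: 'POST', body },
  );
  if (!res.ok) throw new Error(`token request failed: ${res.status}`);
  const json = await res.json();
  cached = { accessToken: json.access_token, expiresAt: now + json.expires_in * 1000 };
  return cached.accessToken;
}
```

The 60-second buffer avoids sending a token that expires mid-request; Azure AD token lifetime policies then bound how long any one token stays cached.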

Execution Context

Execution Tier
Tier 2 (518 tasks); can start after Tier 1 completes.

Implementation Notes

Mirror the XledgerExporter architecture but with a DynamicsRecordMapper producing DynamicsJournalLine objects. The key difference from Xledger: Dynamics uses a single-row-per-claim format (no double-entry pair) with amount as a signed decimal (positive for expenses). Define DynamicsJournalLine as an immutable Dart class with the 7 required fields. Use the same pipeline pattern: map → validate → serialize → upload → record → return.
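The pipeline pattern above can be skeletonized as follows. Only the stage order (map → validate → serialize → upload → record → return) and the storage path come from the spec; the collaborator shapes are assumptions, and the sketch is TypeScript for the server-side context rather than the Dart class the spec defines.

```typescript
// Pipeline skeleton: map -> validate -> serialize -> upload -> record -> return.
// Dependency shapes are stand-ins for the injected collaborators named in the
// spec (ChartOfAccountsMapper, CsvJsonFileGenerator, Supabase Storage, export_runs).

interface ApprovedClaim { id: string }
interface JournalLine { ledgerAccount: string; amount: number; currency: string }
interface ExportResult { fileUrl: string; recordCount: number; exportRunId: string }

type Deps = {
  map: (c: ApprovedClaim) => JournalLine | null;             // null = no ledger mapping
  validate: (l: JournalLine) => string[];                    // empty = passes pre-flight
  serialize: (ls: JournalLine[]) => string;                  // CSV body
  upload: (path: string, body: string) => Promise<string>;   // returns signed URL
  record: (status: 'completed' | 'failed', detail?: string) => Promise<void>;
};

async function runExport(
  orgId: string, claims: ApprovedClaim[], deps: Deps, exportRunId: string,
): Promise<ExportResult> {
  try {
    // 1. map: unmappable claims go to an unresolvable list, not silently dropped
    const lines: JournalLine[] = [];
    const unresolvable: ApprovedClaim[] = [];
    for (const c of claims) {
      const l = deps.map(c);
      if (l) lines.push(l); else unresolvable.push(c);
    }
    // 2. validate: any pre-flight failure rejects the whole export
    const errors = lines.flatMap((l) => deps.validate(l));
    if (errors.length > 0) throw new Error(`pre-flight failed: ${errors.join('; ')}`);
    // 3-5. serialize, upload to the spec'd path, record the run
    const csv = deps.serialize(lines);
    const fileUrl = await deps.upload(`exports/${orgId}/dynamics/${exportRunId}.csv`, csv);
    await deps.record('completed');
    return { fileUrl, recordCount: lines.length, exportRunId };
  } catch (e) {
    await deps.record('failed', String(e));
    throw e;
  }
}
```

Note the single-row-per-claim shape: recordCount equals the number of resolved claims, unlike Xledger's double-entry pairs.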

The ChartOfAccountsMapper should be capable of resolving both Xledger account codes and Dynamics ledger account / dimension codes from the same org configuration — confirm this is in scope when ChartOfAccountsMapper is implemented (task-004). Reuse CsvJsonFileGenerator from task-005 with a DynamicsColumnSchema configuration object. Keep the two exporters as sibling classes implementing a shared abstract AccountingExporter interface — this allows the Accounting Exporter Service to call them polymorphically. The interface: Future<ExportResult> export(String orgId, List<ApprovedClaim> claims).
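A minimal sketch of the shared interface and polymorphic dispatch, in TypeScript for the server-side context (the spec's Dart version would use an abstract class). The class bodies are stubs, and pickExporter is a hypothetical name for whatever selection the Accounting Exporter Service performs.

```typescript
// Sketch of the shared exporter interface and polymorphic dispatch.
// Only the interface shape comes from the spec; the stub bodies illustrate the
// structural difference (Dynamics: one row per claim; Xledger: double-entry pair).

interface ApprovedClaim { id: string }
interface ExportResult { fileUrl: string; recordCount: number; exportRunId: string }

interface AccountingExporter {
  export(orgId: string, claims: ApprovedClaim[]): Promise<ExportResult>;
}

class XledgerExporter implements AccountingExporter {
  async export(orgId: string, claims: ApprovedClaim[]): Promise<ExportResult> {
    // double-entry pair per claim (stubbed)
    return { fileUrl: '', recordCount: claims.length * 2, exportRunId: 'xledger-stub' };
  }
}

class DynamicsExporter implements AccountingExporter {
  async export(orgId: string, claims: ApprovedClaim[]): Promise<ExportResult> {
    // single row per claim (stubbed)
    return { fileUrl: '', recordCount: claims.length, exportRunId: 'dynamics-stub' };
  }
}

// The Accounting Exporter Service can then select an exporter from org config
// and call it without knowing which accounting system is behind it:
function pickExporter(system: 'xledger' | 'dynamics'): AccountingExporter {
  return system === 'xledger' ? new XledgerExporter() : new DynamicsExporter();
}
```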

Store Dynamics-specific field length constants (60 char description limit) in a DynamicsExportConstants class rather than magic numbers.
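A sketch of that constants holder. Only the 60-character description limit is mandated above; the other entries restate acceptance-criteria values ('NOK', 'YYYY-MM-DD') for convenience and are a design assumption.

```typescript
// Constants holder replacing magic numbers in the Dynamics export path.
class DynamicsExportConstants {
  static readonly maxDescriptionLength = 60; // Dynamics description field limit
  static readonly currency = 'NOK';          // hardcoded for all HLF records
  static readonly dateFormat = 'YYYY-MM-DD'; // transactionDate format
}
```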

Testing Requirements

Unit tests (flutter_test with mocked dependencies): test DynamicsRecordMapper.toDynamicsJournalLine(claim) for mileage and toll claim types; assert currency is 'NOK'; assert description truncation at 60 chars; assert missing ledger account moves claim to unresolvable list; assert zero-amount claim triggers validation failure. Integration test: invoke full export() with 5 mock ApprovedClaims against test Supabase Storage; verify file created and export_runs record has status 'completed'. Test file: test/services/dynamics_exporter_test.dart. Mock ChartOfAccountsMapper and CsvJsonFileGenerator.

Confirm DynamicsExporter and XledgerExporter share no logic beyond the injected ChartOfAccountsMapper and CsvJsonFileGenerator — they must remain independently testable.

Component
Chart of Accounts Mapper
Data: medium

Epic Risks (3)

Risk 1: technical (high impact, medium probability)

Adding exported_at and export_run_id columns to expense_claims requires a live migration on a table shared with the approval workflow. A poorly timed migration could lock the table and block claim submissions or approvals.

Mitigation & Contingency

Mitigation: Use a non-blocking ADD COLUMN for the new nullable columns (no DEFAULT, no backfill needed; in Postgres this is a metadata-only change that does not rewrite the table), executed during a low-traffic window. Test migration rollback on a staging replica before production deployment.

Contingency: If migration causes table lock contention, roll back and reschedule for a maintenance window. Use a feature flag to gate the export UI until the migration completes successfully.

Risk 2: scope (medium impact, high probability)

Chart of accounts mapping configurations for Xledger and Dynamics may not be fully specified by stakeholders at development time, leaving the mapper with incomplete data and causing validation failures for unmapped expense categories.

Mitigation & Contingency

Mitigation: Implement the mapper to return a structured validation error (not a crash) for any unmapped field, and surface these errors clearly in the export confirmation dialog. Request full mapping tables from Blindeforbundet and HLF stakeholders as a pre-condition for this epic.

Contingency: If mappings arrive incomplete, ship the mapper with the available subset and mark unmapped categories as excluded (skipped with reason). Coordinators see which categories are skipped and can manually submit those records.

Risk 3: dependency (medium impact, medium probability)

Supabase Vault configuration for storing per-org accounting credentials may require infra permissions or environment secrets not yet provisioned in staging or production, blocking development and testing of credential retrieval.

Mitigation & Contingency

Mitigation: Provision Vault secrets and environment configuration in staging as the first task of this epic. Document the exact secret naming convention and rotation procedure before implementation begins.

Contingency: If Vault is unavailable, use environment variables scoped to the Edge Function as a temporary fallback for development. Block production deployment until Vault-based storage is confirmed operational.