Priority: critical · Complexity: high · Type: integration · Status: pending · Assignee: integration specialist · Tier: 2

Acceptance Criteria

DynamicsAdapter implements the uniform adapter interface with all required methods matching the contract from task-005
Azure AD OAuth2 authorization code flow with PKCE completes successfully and the access token is retrieved server-side inside an Edge Function
Azure AD token refresh using refresh token is handled automatically before token expiry
Dynamics OData endpoint construction builds correct URLs from the configured Dynamics instance base URL (stored in the integration config)
Batch operations use Dynamics OData $batch endpoint with correct multipart/mixed content type and changeset boundaries
Field mapping correctly translates internal peer mentor certification data to Dynamics entity schema fields (cert_type → certification_status, issued_at → effectiveStartDate, etc.)
Sync metadata writes (last_synced_at, sync_id) succeed after each batch operation
Adapter capabilities declare: ['certification_sync', 'membership_read'] matching HLF's use case
Adapter does NOT overlap with HLF's existing 'min side' Dynamics portal project — only reads certification data and writes sync metadata as agreed with HLF
All HTTP errors from Dynamics OData (400 Bad Request, 401 Unauthorized, 412 Precondition Failed for ETags, 429 Too Many Requests) are normalized to AdapterError types
Azure AD credentials stored server-side only — not in Flutter app binary or any mobile storage
OAuth token rotation enforced via Azure AD token lifetime policies — adapter does not cache tokens beyond their lifetime
Minimal required Dynamics permissions are used: read certification, write sync metadata only
Data sync is scoped exclusively to HLF organisation via credential isolation in the vault
Integration smoke test against Dynamics sandbox reads at least one certification record successfully
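A minimal sketch of how the adapter could honour the token-lifetime criteria above (refresh before expiry, never cache beyond expires_in). The 60-second skew and helper names are illustrative assumptions, not part of the adapter contract:

```typescript
// Cached Azure AD access token with an absolute expiry (epoch milliseconds).
interface CachedToken {
  accessToken: string;
  expiresAt: number;
}

// Convert the token endpoint's expires_in (seconds) into an absolute expiry.
function cacheToken(accessToken: string, expiresInSec: number, nowMs: number): CachedToken {
  return { accessToken, expiresAt: nowMs + expiresInSec * 1000 };
}

// Refresh slightly before expiry (skewMs) so an in-flight request never
// carries a token that lapses mid-call; a missing token always triggers acquisition.
function needsRefresh(token: CachedToken | null, nowMs: number, skewMs = 60_000): boolean {
  return token === null || nowMs >= token.expiresAt - skewMs;
}
```

A caller would check needsRefresh() before every Dynamics request and run the refresh-token grant only when it returns true.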

Technical Requirements

Frameworks
Dart
Flutter
Supabase Edge Functions (Deno)
APIs
Microsoft Dynamics 365 REST API (OData v4)
Azure AD OAuth2 / OIDC
Supabase Edge Functions REST API
Supabase PostgreSQL 15
Data models
certification
assignment
contact
Performance requirements
Single certification read operation completes within 4 seconds including Azure AD token acquisition
OData batch operations process up to 50 records per changeset to stay within Dynamics API limits
Token refresh must complete within 3 seconds to avoid blocking sync operations
Adapter healthCheck() must respond within 5 seconds
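To stay within the 50-records-per-changeset limit above, operations can be split before batching; a generic chunking sketch (the helper name and default are illustrative):

```typescript
// Split a list of sync operations into changeset-sized groups.
// The default of 50 matches the Dynamics changeset limit stated above.
function chunk<T>(items: T[], size = 50): T[][] {
  const groups: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    groups.push(items.slice(i, i + size));
  }
  return groups;
}
```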
Security requirements
Azure AD credentials (client_id, client_secret, tenant_id) stored server-side only in credential vault — never in mobile app
All adapter execution inside Supabase Edge Functions — zero direct mobile-to-Dynamics communication
Minimal Dynamics API permissions: read certification data, write sync metadata only — no broader access
Data sync scoped to HLF organisation only via JWT claim validation before credential lookup
OAuth token rotation enforced: do not cache tokens beyond their expires_in value
ETag-based optimistic concurrency used for Dynamics writes to prevent data corruption
Audit log entry for every sync operation (success and failure)
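The JWT-scoping requirement (validate the claim before any credential lookup) could be enforced with a guard like the sketch below. The org_id claim name, the 'hlf' value, and the vault API shape are assumptions for illustration, not the actual contract:

```typescript
// Minimal vault shape assumed for this sketch.
interface VaultClient {
  getCredentials(orgId: string, integration: string): unknown;
}

// Reject the request before touching the vault unless the JWT carries
// the expected organisation claim; this keeps sync scoped to HLF only.
function resolveCredentials(claims: Record<string, unknown>, vault: VaultClient): unknown {
  const orgId = claims["org_id"];
  if (typeof orgId !== "string" || orgId !== "hlf") {
    throw new Error("sync is scoped to the HLF organisation");
  }
  return vault.getCredentials(orgId, "dynamics");
}
```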

Execution Context

Execution Tier
Tier 2 (518 tasks); can start after Tier 1 completes.

Integration Task

Handles integration between different epics or system components. Requires coordination across multiple development streams.

Implementation Notes

Implement inside a Supabase Edge Function (Deno). The most complex aspect is the Azure AD PKCE flow — implement as a two-phase process: (1) authorization URL generation returned to the UI wizard for user redirect, (2) token exchange in a callback Edge Function endpoint. Store the resulting refresh token in the credential vault. For OData, use Deno fetch with explicit Accept: application/json and OData-Version: 4.0 headers.
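Phase 1 (authorization URL generation) might look like the sketch below. The scope string, parameter set, and helper name are assumptions; the code_challenge is assumed to be computed elsewhere as base64url(SHA-256(code_verifier)) via crypto.subtle:

```typescript
// Build the Azure AD v2.0 authorize URL for the PKCE flow (phase 1).
// All option values are placeholders supplied by the integration config.
function buildAuthorizeUrl(opts: {
  tenantId: string;
  clientId: string;
  redirectUri: string;
  codeChallenge: string; // base64url(SHA-256(code_verifier)), computed elsewhere
  state: string;
}): string {
  const params = new URLSearchParams({
    client_id: opts.clientId,
    response_type: "code",
    redirect_uri: opts.redirectUri,
    response_mode: "query",
    // offline_access is needed so the token exchange returns a refresh token;
    // the Dynamics resource host here is a placeholder.
    scope: "https://example.crm4.dynamics.com/.default offline_access",
    state: opts.state,
    code_challenge: opts.codeChallenge,
    code_challenge_method: "S256",
  });
  return `https://login.microsoftonline.com/${opts.tenantId}/oauth2/v2.0/authorize?${params}`;
}
```

Phase 2 would then POST the returned code plus the original code_verifier to the /oauth2/v2.0/token endpoint inside the callback Edge Function, and store the resulting refresh token in the credential vault.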

Dynamics OData batch requests require careful multipart/mixed body construction — implement a DynamicsBatchBuilder helper class that accepts an array of operations and produces the correct boundary-delimited body. Key coordination risk: HLF has an existing 'min side' Dynamics portal project. Before finalising field mappings, confirm with HLF which Dynamics entities and fields are owned by which system. Enforce this in the adapter via a strict field allowlist — the adapter should throw if asked to write to a non-allowed field.
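A simplified sketch of the boundary-delimited body DynamicsBatchBuilder would emit; boundary values and entity URLs are placeholders, and production code must follow the OData v4 $batch format exactly (CRLF line endings, matching boundary in the Content-Type header):

```typescript
// One operation inside a changeset (writes only; reads go outside changesets).
interface BatchOp {
  method: "POST" | "PATCH";
  url: string; // relative entity URL, e.g. "contacts(guid)"
  body: unknown;
}

// Produce a multipart/mixed $batch body containing a single changeset.
function buildChangesetBody(ops: BatchOp[], batchId: string, changesetId: string): string {
  const lines: string[] = [];
  lines.push(`--batch_${batchId}`);
  lines.push(`Content-Type: multipart/mixed; boundary=changeset_${changesetId}`, "");
  ops.forEach((op, i) => {
    lines.push(`--changeset_${changesetId}`);
    lines.push("Content-Type: application/http");
    lines.push("Content-Transfer-Encoding: binary");
    // Content-ID lets later operations in the changeset reference this one.
    lines.push(`Content-ID: ${i + 1}`, "");
    lines.push(`${op.method} ${op.url} HTTP/1.1`);
    lines.push("Content-Type: application/json", "");
    lines.push(JSON.stringify(op.body), "");
  });
  lines.push(`--changeset_${changesetId}--`);
  lines.push(`--batch_${batchId}--`);
  return lines.join("\r\n");
}
```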

This prevents accidental overlap. For error handling, Dynamics OData errors are returned in a specific JSON envelope (error.code, error.message, error.innererror) — parse this structure in the error normalizer. 412 Precondition Failed means an ETag conflict; surface this as a retryable=false conflict error requiring manual resolution.
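A sketch of the error normalizer for the envelope described above; the AdapterError shape is assumed from the task-005 contract and simplified here:

```typescript
// Normalized error shape (simplified assumption of the task-005 contract).
interface AdapterError {
  code: string;
  message: string;
  retryable: boolean;
}

// Dynamics OData errors arrive as { error: { code, message, innererror } }.
function normalizeODataError(status: number, body: unknown): AdapterError {
  const env = (body as { error?: { code?: string; message?: string } } | null)?.error ?? {};
  if (status === 412) {
    // ETag conflict: not retryable, requires manual resolution.
    return {
      code: env.code ?? "etag_conflict",
      message: env.message ?? "ETag precondition failed",
      retryable: false,
    };
  }
  return {
    code: env.code ?? `http_${status}`,
    message: env.message ?? "Dynamics request failed",
    // In this sketch only rate limiting (429) is retryable after backoff.
    retryable: status === 429,
  };
}
```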

Testing Requirements

Unit tests (Deno test): Azure AD token acquisition and refresh flow (mocked), OData URL construction for all supported entity types, batch changeset serialization to multipart/mixed format, field mapping from the internal certification model to the Dynamics schema, error normalization for OData error response shapes.

Integration tests: full Azure AD PKCE flow against a test tenant, reading certification records from the Dynamics sandbox, writing sync metadata, ETag concurrency conflict handling.

Coordination test: verify the adapter does not write to entities owned by HLF's 'min side' portal (use the field allowlist assertion).

Test coverage target: 90% on adapter business logic.

Record HTTP fixtures for unit tests to avoid live Azure dependencies in CI.
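The allowlist assertion the coordination test relies on could be as small as this sketch; the allowed field names come from the sync-metadata acceptance criteria, and the helper name is illustrative:

```typescript
// Fields the adapter may write, per the agreement with HLF; everything else
// is assumed to belong to the 'min side' portal and must be rejected.
const DYNAMICS_WRITE_ALLOWLIST = new Set(["last_synced_at", "sync_id"]);

// Throw before any write reaches Dynamics if the payload strays off-list.
function assertWritableFields(payload: Record<string, unknown>): void {
  for (const field of Object.keys(payload)) {
    if (!DYNAMICS_WRITE_ALLOWLIST.has(field)) {
      throw new Error(`field '${field}' is not in the Dynamics write allowlist`);
    }
  }
}
```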

Component
REST API Adapter Registry
Type: service · Priority: high
Epic Risks (3)
Impact: medium · Probability: high · Category: technical

Each of the five external systems (Xledger, Dynamics, Cornerstone, Consio, Bufdir) has a different authentication flow, field schema, and error format. Forcing them into a uniform adapter interface may require compromises that result in leaky abstractions or make the adapter contract too complex to maintain.

Mitigation & Contingency

Mitigation: Design the IntegrationAdapter interface with a loose invoke() payload rather than a typed one, allowing each adapter to declare its own input/output schema. Use integration type metadata in the registry to document per-adapter quirks. Build Xledger first as the most documented API, then adapt the interface based on learnings.
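The loose invoke() contract described in the mitigation could be sketched as follows; only the invoke()/healthCheck()/capabilities names appear in this document, and the rest of the shape is an assumption:

```typescript
// Loose adapter contract: payloads stay untyped so each adapter can declare
// its own input/output schema in registry metadata instead of the interface.
interface IntegrationAdapter {
  readonly capabilities: string[];
  invoke(operation: string, payload: unknown): Promise<unknown>;
  healthCheck(): Promise<boolean>;
}

// Registry-side helper to route requests only to adapters that declare support.
function supportsCapability(adapter: IntegrationAdapter, capability: string): boolean {
  return adapter.capabilities.includes(capability);
}

// Illustrative mock matching the Dynamics capabilities from this task.
const mockDynamicsAdapter: IntegrationAdapter = {
  capabilities: ["certification_sync", "membership_read"],
  invoke: async (_operation, payload) => payload,
  healthCheck: async () => true,
};
```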

Contingency: If the uniform interface cannot accommodate all five systems, split into two interface tiers: a simple polling/export adapter and a richer bidirectional adapter, with the registry declaring which tier each system implements.

Impact: medium · Probability: high · Category: dependency

Development and testing of the Cornerstone and Consio adapters depends on NHF providing sandbox API access. If credentials or documentation are delayed, these adapters cannot be validated, blocking the epic's acceptance criteria.

Mitigation & Contingency

Mitigation: Implement Xledger and Dynamics adapters first (better-documented, sandbox available). Create a mock adapter for Cornerstone/Consio using recorded API responses for CI testing. Proactively request sandbox access from NHF at project kickoff.

Contingency: Ship the epic with Cornerstone/Consio adapters in a 'stub' state (connectivity test returns a simulated success, invoke() is not production-wired) and gate the NHF integration behind a feature flag until real API access is obtained.

Impact: medium · Probability: medium · Category: scope

Real-world field mappings may include nested transformations, conditional logic, and data type coercions (e.g., Norwegian date formats, currency rounding rules) that the Field Mapping Resolver's initial design does not accommodate, requiring scope expansion mid-epic.

Mitigation & Contingency

Mitigation: Gather actual field mapping examples from Blindeforbundet (Xledger) and HLF (Dynamics) before designing the resolver. Identify the most complex transformation required and ensure the resolver design handles it. Limit Phase 1 to direct field renaming and format conversion only.

Contingency: If complex transformations are required, implement a simple expression evaluator (e.g., JSONata or a custom mini-DSL) as an extension point in the resolver, delivering basic mappings first and complex ones in a follow-up task.