Priority: critical · Complexity: low · Area: backend · Status: pending · Assignee: backend specialist · Tier 0

Acceptance Criteria

Abstract class DeduplicationQueueService is defined in lib/domain/services/deduplication_queue_service.dart with no concrete implementation
UnresolvedPair model includes: pairId (String), activityIdA (String), activityIdB (String), detectedAt (DateTime), overlapScore (double), chapterId (String), resolvedAt (DateTime?), resolvedBy (String?)
QueueResolutionAction is defined as an enum with values: keepA, keepB, keepBoth, dismiss — representing all coordinator resolution choices
QueueScopeFilter model includes: chapterIds (List<String>), resolvedFilter (ResolvedFilter enum: unresolved, resolved, all), dateFrom (DateTime?), dateTo (DateTime?)
ResolvedFilter is defined as an enum with values: unresolved, resolved, all
getUnresolvedPairs() method signature accepts a QueueScopeFilter and returns Future<List<UnresolvedPair>>
getUnresolvedCount() method signature accepts a QueueScopeFilter and returns Future<int> — used for badge count display
forceResolve() method signature accepts pairId (String) and action (QueueResolutionAction) and returns Future<void>
All models implement Equatable (or manual == and hashCode) and copyWith()
Interface and all models are exported from lib/domain/services/deduplication_queue_exports.dart barrel file
The QueueScopeFilter enforces chapter-scoped access by requiring at least one chapterId — document this constraint with a code comment
No Supabase, Flutter, or repository imports exist in any interface or model file
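Taken together, the criteria above can be sketched as a single pure-Dart file. This is a sketch only: the doc comments, default values, and the decision to keep everything in one file are illustrative, and the abbreviated copyWith() would need to cover every field in the real implementation.

```dart
// lib/domain/services/deduplication_queue_service.dart (sketch).
// Pure domain layer: no Supabase, Flutter, or repository imports.
import 'package:equatable/equatable.dart';

/// All coordinator resolution choices.
enum QueueResolutionAction { keepA, keepB, keepBoth, dismiss }

enum ResolvedFilter { unresolved, resolved, all }

/// Immutable value object. Carries IDs only, never raw activity content;
/// the UI layer fetches display details separately.
class UnresolvedPair extends Equatable {
  const UnresolvedPair({
    required this.pairId,
    required this.activityIdA,
    required this.activityIdB,
    required this.detectedAt,
    required this.overlapScore,
    required this.chapterId,
    this.resolvedAt,
    this.resolvedBy,
  });

  final String pairId;
  final String activityIdA;
  final String activityIdB;
  final DateTime detectedAt;
  final double overlapScore;
  final String chapterId;
  final DateTime? resolvedAt; // null while unresolved
  final String? resolvedBy;   // null while unresolved

  /// Abbreviated for the sketch: the full version covers every field.
  UnresolvedPair copyWith({DateTime? resolvedAt, String? resolvedBy}) =>
      UnresolvedPair(
        pairId: pairId,
        activityIdA: activityIdA,
        activityIdB: activityIdB,
        detectedAt: detectedAt,
        overlapScore: overlapScore,
        chapterId: chapterId,
        resolvedAt: resolvedAt ?? this.resolvedAt,
        resolvedBy: resolvedBy ?? this.resolvedBy,
      );

  @override
  List<Object?> get props => [pairId, activityIdA, activityIdB, detectedAt,
      overlapScore, chapterId, resolvedAt, resolvedBy];
}

class QueueScopeFilter extends Equatable {
  const QueueScopeFilter({
    required this.chapterIds,
    this.resolvedFilter = ResolvedFilter.unresolved,
    this.dateFrom,
    this.dateTo,
  });

  /// Contract: must contain at least one chapterId. An empty list is
  /// invalid — this enforces chapter-scoped access and prevents
  /// accidental cross-org data exposure.
  final List<String> chapterIds;
  final ResolvedFilter resolvedFilter;
  final DateTime? dateFrom;
  final DateTime? dateTo;

  @override
  List<Object?> get props => [chapterIds, resolvedFilter, dateFrom, dateTo];
}

abstract class DeduplicationQueueService {
  Future<List<UnresolvedPair>> getUnresolvedPairs(QueueScopeFilter filter);

  /// Declared separately (not derived from getUnresolvedPairs) so
  /// implementations can issue an efficient COUNT query for badge display.
  Future<int> getUnresolvedCount(QueueScopeFilter filter);

  Future<void> forceResolve(String pairId, QueueResolutionAction action);
}
```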

Technical Requirements

Frameworks
Dart (pure domain layer)
equatable package
Data Models
UnresolvedPair (new)
QueueResolutionAction (new enum)
QueueScopeFilter (new)
ResolvedFilter (new enum)
Performance Requirements
All models must be immutable value objects — all fields final
getUnresolvedCount() is declared as a separate method (not derived from getUnresolvedPairs()) to allow efficient COUNT query implementation
Security Requirements
QueueScopeFilter must require chapterIds to be non-empty — the interface contract must document that an empty chapterIds list is invalid to prevent accidental cross-org data exposure
UnresolvedPair must not include raw activity content — only IDs, allowing the UI layer to fetch display details separately

Execution Context

Execution Tier
Tier 0 (440 tasks in this tier)

Implementation Notes

Keep the DeduplicationQueueService interface separate from the DuplicateDetectionService (task-001); they are distinct services with different responsibilities. DuplicateDetectionService is invoked pre-submit as a real-time check, while DeduplicationQueueService is a coordinator admin tool for reviewing persisted unresolved pairs.

The chapterId scoping on UnresolvedPair is critical for NHF's multi-chapter coordinator structure: a coordinator managing three chapters should only see pairs within those chapters.

forceResolve() with QueueResolutionAction.dismiss means 'mark as reviewed, not a duplicate'. Document this distinction from keepA/keepB, which imply deletion of one activity.

Place the barrel export file at lib/domain/services/deduplication_queue_exports.dart and ensure it exports both the service interface and all four model types.
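If the models end up in separate files, the barrel could look like the following. Only the service file path is fixed by the acceptance criteria; the model file names and locations here are illustrative.

```dart
// lib/domain/services/deduplication_queue_exports.dart (sketch).
// Re-exports the service interface plus all four model/enum types.
export 'deduplication_queue_service.dart';
export 'models/unresolved_pair.dart';         // illustrative path
export 'models/queue_resolution_action.dart'; // illustrative path
export 'models/queue_scope_filter.dart';      // illustrative path; also
                                              // exports ResolvedFilter
```

If everything lives in deduplication_queue_service.dart, the single first export line is sufficient.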

Testing Requirements

Write unit tests in test/domain/services/deduplication_queue_service_test.dart covering:
(1) UnresolvedPair equality: two instances with the same pairId and fields are equal.
(2) An UnresolvedPair with null resolvedAt and resolvedBy is valid (the unresolved state).
(3) copyWith() on UnresolvedPair correctly produces a partial update with resolvedAt set.
(4) QueueResolutionAction has exactly 4 values.
(5) QueueScopeFilter with empty chapterIds documents the invalid state (test the model structure, not the validation logic, which belongs in the implementation).
No mocking required: these are pure model tests. Target 100% line coverage for all model files.
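A sketch of the first four cases using package:test. It assumes the types from this task are in scope via the barrel import (the package name comes from pubspec.yaml and is not filled in here); the field values are illustrative.

```dart
// test/domain/services/deduplication_queue_service_test.dart (sketch).
import 'package:test/test.dart';
// Barrel import — package name depends on pubspec.yaml, e.g.:
// import 'package:<app>/domain/services/deduplication_queue_exports.dart';

void main() {
  // Factory so each test gets a fresh, identical instance.
  UnresolvedPair pair() => UnresolvedPair(
        pairId: 'pair-1',
        activityIdA: 'act-a',
        activityIdB: 'act-b',
        detectedAt: DateTime.utc(2025, 1, 1),
        overlapScore: 0.87,
        chapterId: 'chapter-1',
      );

  test('equal fields produce equal instances', () {
    expect(pair(), equals(pair()));
  });

  test('null resolvedAt/resolvedBy is a valid unresolved state', () {
    expect(pair().resolvedAt, isNull);
    expect(pair().resolvedBy, isNull);
  });

  test('copyWith sets resolvedAt without touching other fields', () {
    final resolved = pair().copyWith(resolvedAt: DateTime.utc(2025, 1, 2));
    expect(resolved.resolvedAt, isNotNull);
    expect(resolved.pairId, 'pair-1');
  });

  test('QueueResolutionAction has exactly 4 values', () {
    expect(QueueResolutionAction.values, hasLength(4));
  });
}
```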

Component
Deduplication Queue Service
Type: service · Size: medium
Epic Risks (2)
Risk 1: technical (medium impact, medium probability)

If the duplicate check RPC fails due to a network error or Supabase outage, the service must decide whether to block submission entirely (safe but disruptive) or allow submission to proceed silently (functional but risks data duplication). An incorrect choice leads to either user frustration or data quality issues.

Mitigation & Contingency

Mitigation: Define an explicit error policy in the service: RPC failures result in a DuplicateCheckResult with status: 'check_failed' and no candidates. The caller treats this as 'allow submission, flag for async review'. Document this as the intended graceful degradation behaviour in the service interface contract.
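The policy could be wrapped at the service boundary like this. DuplicateCheckResult and the RPC call belong to task-001; the names checkWithFallback, status, and candidates are assumptions for illustration, not confirmed API.

```dart
// Graceful-degradation sketch. Type and field names are assumptions;
// only the 'check_failed' policy comes from the mitigation text.
class DuplicateCheckResult {
  const DuplicateCheckResult({required this.status, this.candidates = const []});
  final String status;           // e.g. 'ok', 'duplicates_found', 'check_failed'
  final List<String> candidates; // candidate activity IDs, never raw content
}

Future<DuplicateCheckResult> checkWithFallback(
  Future<DuplicateCheckResult> Function() rpcCheck,
) async {
  try {
    return await rpcCheck();
  } catch (_) {
    // RPC/network failure: do not block submission. Return 'check_failed'
    // with no candidates so the caller allows submission and flags the
    // activity for async review.
    return const DuplicateCheckResult(status: 'check_failed');
  }
}
```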

Contingency: If stakeholders require blocking on RPC failure, expose a configurable `failMode` parameter in the service that can be toggled per organisation via the feature flag system without a code deployment.

Risk 2: scope (medium impact, medium probability)

The DuplicateComparisonPanel must handle varying activity schemas across organisations (NHF, HLF, Blindeforbundet each have different activity fields). A rigid layout may not accommodate all field variations, causing truncation or missing data in the comparison view.

Mitigation & Contingency

Mitigation: Design the panel to render a dynamic list of key-value pairs rather than a fixed-column layout. Define a `ComparisonField` model that each service populates with only the fields relevant to the activity type and organisation, allowing the panel to adapt without schema knowledge.
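A minimal shape for that model could be the following sketch; the field names and the differs helper are illustrative, not a settled design.

```dart
// ComparisonField sketch: one row of the dynamic key-value comparison.
// Each service populates only the fields relevant to the activity type
// and organisation, so the panel needs no schema knowledge.
class ComparisonField {
  const ComparisonField({required this.label, this.valueA, this.valueB});

  final String label;   // e.g. 'Peer mentor', 'Activity type'
  final String? valueA; // pre-formatted display value from activity A
  final String? valueB; // pre-formatted display value from activity B

  /// Lets the panel highlight rows where the two activities differ.
  bool get differs => valueA != valueB;
}
```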

Contingency: If dynamic rendering proves too complex within the timeline, ship a simplified panel showing only the five most critical fields (peer mentor, activity type, date, chapter, submitter) and log a follow-up ticket for full field rendering in a later sprint.