Build ReportReexportCoordinator period parameter loader
epic-bufdir-report-history-services-task-005 — Implement the initial step of the ReportReexportCoordinator that loads the stored period parameters (date range, reporting period, aggregation scope) from the history record identified by the provided history entry ID. Validate that the parameters are complete and return a typed domain object for use by downstream pipeline steps.
Acceptance Criteria
Technical Requirements
Execution Context
Tier 2 - 518 tasks
Can start after Tier 1 completes
Implementation Notes
Define `ReportPeriodParameters` as a plain Dart class with all fields final and a `const` constructor. Include a static `fromJson(Map<String, dynamic> json)` factory that validates completeness and throws `ReportParametersIncompleteException` naming the first missing required field.
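A minimal sketch of the domain class under these requirements. The field and column names (`start_date`, `end_date`, `reporting_period_id`, `aggregation_scope`) are assumptions based on the parameters described above; align them with the actual schema.

```dart
/// Typed period parameters loaded from a report history record.
/// Field names here are illustrative assumptions, not the real schema.
class ReportPeriodParameters {
  final DateTime startDate;
  final DateTime endDate;
  final String reportingPeriodId;
  final String aggregationScope;

  const ReportPeriodParameters({
    required this.startDate,
    required this.endDate,
    required this.reportingPeriodId,
    required this.aggregationScope,
  });

  /// Throws [ReportParametersIncompleteException] naming the first
  /// missing field, matching the unit-test expectations for this task.
  static ReportPeriodParameters fromJson(Map<String, dynamic> json) {
    String requireField(String key) {
      final value = json[key];
      if (value == null) {
        throw ReportParametersIncompleteException(key);
      }
      return value as String;
    }

    return ReportPeriodParameters(
      startDate: DateTime.parse(requireField('start_date')),
      endDate: DateTime.parse(requireField('end_date')),
      reportingPeriodId: requireField('reporting_period_id'),
      aggregationScope: requireField('aggregation_scope'),
    );
  }
}
```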
Use Supabase's `.single()` method so a missing record throws a `PostgrestException` with code `PGRST116` (no rows returned) — catch this specifically and rethrow as `RecordNotFoundException`. Define the custom exception classes in a shared exceptions file for the bufdir feature, not inline.
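A hedged sketch of the loader and the shared exception classes, assuming the `supabase_flutter` client API and an illustrative table name `report_history` (the table and column names are assumptions):

```dart
import 'package:supabase_flutter/supabase_flutter.dart';

/// Shared exceptions for the bufdir feature — defined in a shared
/// exceptions file, not inline with the coordinator.
class RecordNotFoundException implements Exception {
  final String historyEntryId;
  RecordNotFoundException(this.historyEntryId);
}

class ReportParametersIncompleteException implements Exception {
  final String missingField;
  ReportParametersIncompleteException(this.missingField);
}

Future<Map<String, dynamic>> loadPeriodParametersRow(
  SupabaseClient client,
  String historyEntryId,
) async {
  try {
    // .single() throws a PostgrestException with code PGRST116
    // when no row matches the filter.
    return await client
        .from('report_history') // assumed table name
        .select()
        .eq('id', historyEntryId)
        .single();
  } on PostgrestException catch (e) {
    if (e.code == 'PGRST116') {
      throw RecordNotFoundException(historyEntryId);
    }
    rethrow;
  }
}
```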
Testing Requirements
Unit tests (flutter_test) with a mock repository:
(1) happy path — mock returns a valid record; assert domain object fields match.
(2) empty historyEntryId — throws ArgumentError.
(3) non-UUID historyEntryId — throws ArgumentError.
(4) repository returns null — throws RecordNotFoundException.
(5) record has null startDate — throws ReportParametersIncompleteException naming 'startDate'.
(6) record has null endDate — same.
(7) record has null reportingPeriodId — same.
(8) record has all fields — domain object constructed correctly.
Integration test on staging: seed a record, call the method, and assert all fields round-trip correctly through the domain object.
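Two of the unit tests above could be sketched with mocktail as follows. The repository interface `ReportHistoryRepository` and the coordinator entry point `loadReportPeriodParameters` are assumed names, not the real API:

```dart
import 'package:flutter_test/flutter_test.dart';
import 'package:mocktail/mocktail.dart';

// Assumed repository interface; adjust to the real one.
abstract class ReportHistoryRepository {
  Future<Map<String, dynamic>?> fetchHistoryRecord(String historyEntryId);
}

class MockReportHistoryRepository extends Mock
    implements ReportHistoryRepository {}

void main() {
  const validId = '11111111-1111-1111-1111-111111111111';
  late MockReportHistoryRepository repository;

  setUp(() => repository = MockReportHistoryRepository());

  test('empty historyEntryId throws ArgumentError', () {
    expect(
      () => loadReportPeriodParameters(repository, ''),
      throwsA(isA<ArgumentError>()),
    );
  });

  test('null startDate throws ReportParametersIncompleteException', () {
    when(() => repository.fetchHistoryRecord(validId)).thenAnswer(
      (_) async => {
        'start_date': null,
        'end_date': '2024-12-31',
        'reporting_period_id': 'q4-2024',
        'aggregation_scope': 'unit',
      },
    );
    expect(
      () => loadReportPeriodParameters(repository, validId),
      throwsA(isA<ReportParametersIncompleteException>()),
    );
  });
}
```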
The ReportReexportCoordinator must invoke the Bufdir export pipeline defined in the bufdir-report-export feature. If that feature's internal API changes (renamed services, altered parameters), the re-export coordinator will break silently at runtime.
Mitigation & Contingency
Mitigation: Define a stable, versioned interface (abstract class or Dart interface) for the export pipeline entry point. The re-export coordinator depends only on this interface, not on concrete export service internals. Document the contract in both features.
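One way to pin this contract is a small abstract class owned by the export feature and implemented by its concrete service. All names below are illustrative, not the existing API:

```dart
/// Stable, versioned entry point for the Bufdir export pipeline.
/// The re-export coordinator depends only on this abstraction,
/// never on concrete export service internals.
abstract class BufdirExportPipeline {
  /// Contract version; bump on any breaking change so mismatches
  /// fail loudly instead of silently at runtime.
  static const int contractVersion = 1;

  Future<ExportResult> runExport(ReportPeriodParameters parameters);
}

/// Minimal result type; extend as the pipeline requires.
class ExportResult {
  final String exportId;
  final DateTime completedAt;
  const ExportResult({required this.exportId, required this.completedAt});
}
```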
Contingency: If a pipeline change breaks the re-export coordinator, surface a clear 'regeneration unavailable' message to the coordinator, with instructions to use the primary export screen for the same period as a workaround while the interface mismatch is fixed.
The audit trail must be immutable — coordinators must not be able to edit or delete past events. If the RLS policies allow UPDATE or DELETE on audit event rows, a coordinator could suppress evidence of a re-export or failed submission.
Mitigation & Contingency
Mitigation: Apply INSERT-only RLS policies to the audit events table (no UPDATE, no DELETE for any non-service-role user). Use a separate service-role key for writing audit events, never the user's JWT. Validate this in integration tests by asserting that UPDATE and DELETE calls from coordinator-role sessions are rejected with RLS errors.
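The RLS assertion in the mitigation could be exercised in an integration test along these lines. The table name `audit_events` and the seeded event ID are assumptions; with no UPDATE policy, PostgREST filters the row out of the update, so appending `.select()` returns an empty result rather than the mutated record:

```dart
import 'package:flutter_test/flutter_test.dart';
import 'package:supabase_flutter/supabase_flutter.dart';

void main() {
  test('coordinator session cannot UPDATE audit events', () async {
    // Client authenticated as a coordinator-role user (not service role).
    final client = Supabase.instance.client;
    const seededEventId = '22222222-2222-2222-2222-222222222222';

    // Attempt to tamper with a previously seeded audit event.
    final updated = await client
        .from('audit_events') // assumed table name
        .update({'event_type': 'tampered'})
        .eq('id', seededEventId)
        .select();

    // INSERT-only RLS hides the row from UPDATE: nothing is returned
    // and the stored record is unchanged.
    expect(updated, isEmpty);
  });
}
```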
Contingency: If immutability is compromised before detection, run a database audit comparing the audit log against the main history table timestamps to identify tampered records, restore from backup if needed, and issue a patch RLS migration immediately.
The user stories require filter state (year, period type, status) to persist within a session so coordinators do not lose context when navigating away. Implementing this with Riverpod state management could cause stale filter state if the provider is not properly scoped to the session lifecycle.
Mitigation & Contingency
Mitigation: Scope the filter state provider to the router's history route scope, not globally. Use autoDispose with a keepAlive flag tied to the session so filters reset on logout but persist on tab switches within the same session.
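A sketch of this scoping with Riverpod's `autoDispose` and a `keepAlive` link. The `sessionActiveProvider` stub stands in for the real auth/session provider, and the filter fields mirror the user stories (year, period type, status):

```dart
import 'package:flutter_riverpod/flutter_riverpod.dart';

/// Stub for the real auth/session provider — an assumption here.
final sessionActiveProvider = StateProvider<bool>((ref) => true);

/// Filter state for the report history screen.
class HistoryFilters {
  final int? year;
  final String? periodType;
  final String? status;
  const HistoryFilters({this.year, this.periodType, this.status});
}

/// autoDispose makes the state eligible for disposal; the keepAlive
/// link holds it across tab switches within a session, and is closed
/// when the session ends so filters reset on logout.
final historyFiltersProvider =
    StateProvider.autoDispose<HistoryFilters>((ref) {
  final link = ref.keepAlive();
  ref.listen<bool>(sessionActiveProvider, (_, active) {
    if (!active) link.close();
  });
  return const HistoryFilters();
});
```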
Contingency: If filter state becomes stale or leaks between sessions, add an explicit reset in the logout handler that disposes all scoped providers. This is a UX degradation (coordinator must re-apply filters) rather than a data integrity issue.