Invoke Bufdir export pipeline for re-export
epic-bufdir-report-history-services-task-006 — Wire the ReportReexportCoordinator to call the existing Bufdir export pipeline from the bufdir-report-export feature, passing the reconstructed period parameters. Ensure the invocation reuses all existing aggregation, formatting, and file-generation logic without duplication so that re-exported reports are byte-for-byte equivalent to original submissions given the same period inputs.
Acceptance Criteria
Technical Requirements
Execution Context
Tier 3 - 413 tasks
Can start after Tier 2 completes
Implementation Notes
The key design constraint is zero duplication: the coordinator is a thin orchestrator that calls the pipeline, not a reimplementation of it. Map `ReportPeriodParameters` to the pipeline input DTO in a dedicated mapper method or extension — keep the mapping logic out of the coordinator's main method to make it independently testable. If the existing export pipeline is implemented as a Supabase Edge Function, the coordinator calls it via Supabase's `functions.invoke()` method. If it is a Dart service class, inject it via Riverpod and call its public method.
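The dedicated-mapper pattern described above can be sketched as follows. This is a minimal illustration only: `ReportPeriodParameters`, `BufdirExportInput`, and their fields are hypothetical stand-ins for the real feature's types.

```dart
// Hypothetical parameter and DTO types — field names are assumptions,
// not the real feature's API.
class ReportPeriodParameters {
  final int year;
  final String periodType; // e.g. 'TERTIAL' or 'ANNUAL'
  const ReportPeriodParameters({required this.year, required this.periodType});
}

class BufdirExportInput {
  final int year;
  final String periodType;
  const BufdirExportInput({required this.year, required this.periodType});
}

// Mapping lives in an extension, outside the coordinator's main method,
// so it can be unit-tested in isolation.
extension ReportPeriodParametersMapper on ReportPeriodParameters {
  BufdirExportInput toExportInput() =>
      BufdirExportInput(year: year, periodType: periodType);
}

void main() {
  final input = const ReportPeriodParameters(year: 2024, periodType: 'TERTIAL')
      .toExportInput();
  print('${input.year} ${input.periodType}'); // 2024 TERTIAL
}
```

Because the mapper is a pure function of its input, the contract test described under Testing Requirements can exercise it without constructing the coordinator at all.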
Wrap the pipeline invocation in a `try-catch` and throw `ReexportPipelineException(cause: e)` from the catch block (note that Dart's `rethrow` keyword re-throws the original exception unchanged, which is not what is wanted here — the goal is to wrap it in a re-export-specific type). Byte-for-byte equivalence requires that the pipeline does not embed timestamps or UUIDs generated at invocation time in the CSV body — verify this with the export pipeline owner before implementation.
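A minimal sketch of the thin coordinator with this error-wrapping behavior, assuming a hypothetical `BufdirExportPipeline` interface (the real entry point's name and signature may differ):

```dart
// Hypothetical pipeline interface; the real entry point may differ.
abstract class BufdirExportPipeline {
  Future<String> export(int year, String periodType); // returns CSV content
}

class ReexportPipelineException implements Exception {
  final Object cause;
  ReexportPipelineException({required this.cause});
  @override
  String toString() => 'ReexportPipelineException(cause: $cause)';
}

class ReportReexportCoordinator {
  final BufdirExportPipeline _pipeline;
  ReportReexportCoordinator(this._pipeline);

  Future<String> reexport(int year, String periodType) async {
    try {
      // Thin orchestration: delegate all aggregation/formatting/file
      // generation to the existing pipeline — no reimplementation here.
      return await _pipeline.export(year, periodType);
    } catch (e) {
      // Wrap so callers see a re-export-specific failure type.
      throw ReexportPipelineException(cause: e);
    }
  }
}

class _FailingPipeline implements BufdirExportPipeline {
  @override
  Future<String> export(int year, String periodType) async =>
      throw StateError('pipeline down');
}

Future<void> main() async {
  try {
    await ReportReexportCoordinator(_FailingPipeline()).reexport(2024, 'T1');
  } on ReexportPipelineException catch (e) {
    print('wrapped: ${e.cause}'); // wrapped: Bad state: pipeline down
  }
}
```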
Testing Requirements
Unit tests (flutter_test) with mock export pipeline: (1) verify `invoke` is called once with correctly mapped parameters, (2) verify `ReexportPipelineException` is thrown when the pipeline throws, (3) verify the coordinator returns the pipeline's result object unmodified. Integration test: for a known seeded dataset with a fixed period range, run the original export, then run re-export with the same parameters and assert the resulting CSV content matches line-by-line. Contract test: assert the `ReportPeriodParameters` to export pipeline input mapping handles all field types without loss (dates, UUIDs, enums).
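The first unit test can be sketched with a hand-rolled fake standing in for a mocktail/mockito mock (in the real suite this would live in a `flutter_test` test body; all names here are hypothetical):

```dart
// Hypothetical interface matching the sketch above.
abstract class BufdirExportPipeline {
  Future<String> export(int year, String periodType);
}

// Hand-rolled fake recording how it was called — a stand-in for a
// mocktail `Mock` with `verify(...)`.
class FakePipeline implements BufdirExportPipeline {
  int calls = 0;
  int? lastYear;
  String? lastPeriodType;
  @override
  Future<String> export(int year, String periodType) async {
    calls++;
    lastYear = year;
    lastPeriodType = periodType;
    return 'csv-body';
  }
}

class ReportReexportCoordinator {
  final BufdirExportPipeline pipeline;
  ReportReexportCoordinator(this.pipeline);
  Future<String> reexport(int year, String periodType) =>
      pipeline.export(year, periodType);
}

Future<void> main() async {
  final fake = FakePipeline();
  final result = await ReportReexportCoordinator(fake).reexport(2024, 'T2');
  // (1) invoked exactly once with the mapped parameters,
  // (3) the pipeline's result returned unmodified.
  print('${fake.calls} ${fake.lastYear} ${fake.lastPeriodType} $result');
  // prints: 1 2024 T2 csv-body
}
```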
The ReportReexportCoordinator must invoke the Bufdir export pipeline defined in the bufdir-report-export feature. If that feature's internal API changes (renamed services, altered parameters), the re-export coordinator can break without warning — at compile time if the pipeline is a Dart service, or silently at runtime if it is invoked as an Edge Function.
Mitigation & Contingency
Mitigation: Define a stable, versioned interface (abstract class or Dart interface) for the export pipeline entry point. The re-export coordinator depends only on this interface, not on concrete export service internals. Document the contract in both features.
Contingency: If a pipeline change breaks the re-export coordinator, surface a clear 'regeneration unavailable' message to the coordinator, with instructions to use the primary export screen for the same period as a workaround while the interface mismatch is fixed.
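A sketch of the versioned interface this mitigation calls for — the name, version suffix, and method signature are assumptions to be agreed with the export pipeline owner:

```dart
// Stable, versioned entry point. Both features depend on this contract,
// never on concrete export-service internals. Names are hypothetical.
abstract class BufdirExportPipelineV1 {
  /// Contract: identical period inputs must yield byte-identical CSV output.
  Future<String> exportCsv({required int year, required String periodType});
}

// Trivial conforming implementation, used only to illustrate the contract.
class InMemoryExportPipeline implements BufdirExportPipelineV1 {
  @override
  Future<String> exportCsv(
          {required int year, required String periodType}) async =>
      'header\n$year;$periodType\n';
}

Future<void> main() async {
  final a =
      await InMemoryExportPipeline().exportCsv(year: 2024, periodType: 'T1');
  final b =
      await InMemoryExportPipeline().exportCsv(year: 2024, periodType: 'T1');
  print(a == b); // true — deterministic for identical inputs
}
```

A breaking change then means introducing `BufdirExportPipelineV2` alongside V1 rather than silently altering the contract both features depend on.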
The audit trail must be immutable — coordinators must not be able to edit or delete past events. If the RLS policies allow UPDATE or DELETE on audit event rows, a coordinator could suppress evidence of a re-export or failed submission.
Mitigation & Contingency
Mitigation: Apply INSERT-only RLS policies to the audit events table (no UPDATE, no DELETE for any non-service-role user). Use a separate service-role key for writing audit events, never the user's JWT. Validate this in integration tests by asserting that UPDATE and DELETE calls from coordinator-role sessions are rejected with RLS errors.
Contingency: If immutability is compromised before detection, run a database audit comparing the audit log against the main history table timestamps to identify tampered records, restore from backup if needed, and issue a patch RLS migration immediately.
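The cross-check described in the contingency can be sketched as a pure comparison function; table shapes and field names are hypothetical:

```dart
// Hypothetical audit event shape.
class AuditEvent {
  final String reportId;
  final DateTime occurredAt;
  AuditEvent(this.reportId, this.occurredAt);
}

/// Returns report ids present in the main history table but lacking any
/// matching audit event — candidates for suppressed or tampered audit rows.
Set<String> findSuspectReports(
  Map<String, DateTime> historyTimestamps, // reportId -> submittedAt
  List<AuditEvent> auditEvents,
) {
  final audited = {for (final e in auditEvents) e.reportId};
  return historyTimestamps.keys.where((id) => !audited.contains(id)).toSet();
}

void main() {
  final history = {
    'r1': DateTime.utc(2024, 1, 10),
    'r2': DateTime.utc(2024, 5, 2),
  };
  // Only r1 has an audit event, so r2 is flagged.
  final audit = [AuditEvent('r1', DateTime.utc(2024, 1, 10))];
  print(findSuspectReports(history, audit)); // {r2}
}
```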
The user stories require filter state (year, period type, status) to persist within a session so coordinators do not lose context when navigating away. Implementing this with Riverpod state management could cause stale filter state if the provider is not properly scoped to the session lifecycle.
Mitigation & Contingency
Mitigation: Scope the filter state provider to the router's history route scope, not globally. Use autoDispose with a keepAlive flag tied to the session so filters reset on logout but persist on tab switches within the same session.
Contingency: If filter state becomes stale or leaks between sessions, add an explicit reset in the logout handler that disposes all scoped providers. This is a UX degradation (coordinator must re-apply filters) rather than a data integrity issue.
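The scoping above can be sketched with Riverpod 2.x's `autoDispose` plus a `keepAlive` link that is released on logout; the filter model and session provider are assumptions, not existing code:

```dart
import 'package:flutter_riverpod/flutter_riverpod.dart';

// Hypothetical filter model for the history screen.
class HistoryFilters {
  final int? year;
  final String? periodType;
  final String? status;
  const HistoryFilters({this.year, this.periodType, this.status});
}

// Hypothetical session provider: null means logged out.
final sessionProvider = StateProvider<String?>((ref) => null);

// autoDispose lets the state be reclaimed when unused; the keepAlive link
// preserves it across tab switches within a session, and closing the link
// on logout allows disposal so filters reset for the next session.
final historyFiltersProvider =
    StateProvider.autoDispose<HistoryFilters>((ref) {
  final link = ref.keepAlive();
  ref.listen<String?>(sessionProvider, (previous, next) {
    if (next == null) link.close(); // session ended -> allow reset
  });
  return const HistoryFilters();
});
```

The contingency's explicit logout reset is then a belt-and-braces `ref.invalidate(historyFiltersProvider)` in the logout handler, in case the keepAlive link is not released for some reason.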