Implement Hierarchy Node Editor save and validation flow
epic-organizational-hierarchy-management-assignment-aggregation-task-011 — Connect the Hierarchy Node Editor form to backend services: pre-save validation via Hierarchy Structure Validator (cycle detection, depth limits), optimistic UI updates with rollback on failure, success/error state management using BLoC, and real-time feedback for the searchable parent dropdown with debounce. Ensure that the editor correctly handles both create (new node) and edit (update existing node) modes with distinct validation paths.
Acceptance Criteria
Technical Requirements
Execution Context
Tier 5 - 253 tasks
Can start after Tier 4 completes
Implementation Notes
Separate the BLoC into two event handlers: _onCreateNode and _onUpdateNode. Both call HierarchyStructureValidator.validate() first with the full in-memory tree snapshot (loaded when the form opens). For the optimistic update pattern: (1) store the current tree snapshot, then emit an intermediate state with the tentative node added to the local tree; (2) attempt the Supabase write; (3) on success, emit SaveSuccess; (4) on failure, emit SaveFailure and re-emit the pre-optimistic snapshot stored in step 1. Do not reach for Supabase's upsert: that is only appropriate when the create/edit distinction does not matter. Here the two modes are distinct, so use insert and update separately to keep error handling clean.
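The optimistic save flow above can be sketched as a BLoC handler. This is a minimal sketch, not the final implementation: state names beyond SaveSuccess/SaveFailure/ValidationError (e.g. NodeSaving) and the tree helper `withNode` are illustrative assumptions.

```dart
// Sketch of _onCreateNode implementing steps (1)-(4) from the notes above.
Future<void> _onCreateNode(
    CreateNodeRequested event, Emitter<HierarchyEditorState> emit) async {
  // Snapshot the tree BEFORE the optimistic update so we can roll back.
  final previousTree = state.tree;

  // Pre-save validation against the full in-memory tree snapshot.
  final validation = HierarchyStructureValidator.validate(previousTree, event.node);
  if (!validation.isValid) {
    emit(ValidationError(validation.message));
    return; // never attempt a write for an invalid node
  }

  // (1) Optimistically show the tentative node in the local tree.
  emit(NodeSaving(tree: previousTree.withNode(event.node)));
  try {
    // (2) Attempt the write: insert, not upsert, since create/edit are distinct.
    await supabase.from('hierarchy_nodes').insert(event.node.toJson());
    // (3) Commit the optimistic state.
    emit(SaveSuccess(tree: state.tree));
  } catch (e) {
    // (4) Roll back to the pre-optimistic snapshot stored in step 1.
    emit(SaveFailure(tree: previousTree, error: e.toString()));
  }
}
```

_onUpdateNode follows the same shape with `update(...).eq('id', ...)` in step 2.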
The parent dropdown debounce should be implemented as a BLoC event, ParentSearchQueryChanged(query); the BLoC handles debouncing internally using debounceTime from rxdart (likely already in the project). Avoid placing debounce logic in the widget layer. For the in-memory cycle check, walk the ancestor chain upward starting from the proposed parent; if the current node's ID appears among those ancestors, the re-parenting would introduce a cycle and must be rejected.
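Both pieces are short. A sketch, assuming rxdart and flutter_bloc are available; the event transformer wiring and the `parentOf` map shape are assumptions, and since every node has a single parent the traversal reduces to a simple upward walk:

```dart
// In the BLoC constructor: debounce handled inside the BLoC, not the widget.
on<ParentSearchQueryChanged>(
  _onParentSearchQueryChanged,
  transformer: (events, mapper) => events
      .debounceTime(const Duration(milliseconds: 300)) // rxdart extension
      .switchMap(mapper), // drop in-flight searches when a new query arrives
);

// Cycle check: walk ancestors upward from the proposed parent.
// parentOf maps each node ID to its parent ID (null at the root).
bool wouldCreateCycle(
    Map<String, String?> parentOf, String nodeId, String proposedParentId) {
  String? current = proposedParentId;
  while (current != null) {
    if (current == nodeId) return true; // node is its own ancestor -> cycle
    current = parentOf[current];
  }
  return false;
}
```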
Testing Requirements
Unit tests (flutter_test): test the HierarchyNodeEditorBloc save flow for create mode with a mocked Supabase client, asserting SaveSuccess is emitted on mock success and SaveFailure on mock error; test rollback by asserting the tree state reverts after a mock write failure; test cycle detection by constructing an in-memory tree with a known cycle and asserting ValidationError is emitted before any write occurs. Unit tests for HierarchyStructureValidator: cover all cycle cases (direct parent = self, grandparent loop, multi-hop cycle) and all depth-limit violations. Integration tests (against a Supabase test instance): create a node and verify it appears in the database; edit a node and verify the patch is applied; attempt a cross-org write and verify RLS blocks it. Target 90%+ branch coverage on the BLoC and validator.
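The rollback case is the one most worth pinning down early. A sketch using bloc_test and mocktail (both package choices and the mock setup are assumptions; adapt to the project's actual mocking strategy):

```dart
// Asserts the optimistic state is emitted, then rolled back on write failure.
blocTest<HierarchyNodeEditorBloc, HierarchyEditorState>(
  'reverts tree state when the Supabase write fails',
  build: () {
    // Force the write path to fail.
    when(() => mockClient.from('hierarchy_nodes'))
        .thenThrow(PostgrestException(message: 'write failed'));
    return HierarchyNodeEditorBloc(client: mockClient, tree: initialTree);
  },
  act: (bloc) => bloc.add(CreateNodeRequested(node: newNode)),
  expect: () => [
    isA<NodeSaving>(), // optimistic intermediate state
    isA<SaveFailure>()
        .having((s) => s.tree, 'tree', equals(initialTree)), // rolled back
  ],
);
```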
Recursive aggregation queries across four hierarchy levels (national → region → local, down to leaf chapters) with 1,400 leaf nodes may be too slow for real-time dashboard requests, exceeding the 200 ms target and causing spinner timeouts.
Mitigation & Contingency
Mitigation: Implement aggregation as a Supabase RPC using a single recursive CTE rather than multiple round-trip queries. Pre-compute aggregations nightly via a scheduled Edge Function and cache results. For real-time needs, aggregate only the immediate subtree on demand.
Contingency: Surface a 'Refreshing...' indicator and serve stale cached aggregations immediately. Queue an async recalculation and push updated data via Supabase Realtime when ready, avoiding blocking the admin dashboard.
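The contingency's stale-while-revalidate pattern can be sketched in plain Dart. All class and member names here are illustrative assumptions; the fetch function stands in for the Supabase RPC call:

```dart
import 'dart:async';

/// Serves the (possibly stale) cached aggregation immediately and pushes
/// fresh values through a stream once the async recalculation completes.
class AggregationCache {
  AggregationCache(this._fetch);

  final Future<num> Function(String nodeId) _fetch; // e.g. wraps supabase.rpc
  final _cache = <String, num>{};
  final _updates = StreamController<MapEntry<String, num>>.broadcast();

  /// Listeners (e.g. the dashboard BLoC) re-render when fresh data arrives.
  Stream<MapEntry<String, num>> get updates => _updates.stream;

  /// Returns the cached value immediately (null if never computed) and
  /// kicks off a background refresh; the UI shows 'Refreshing...' meanwhile.
  num? read(String nodeId) {
    unawaited(_refresh(nodeId));
    return _cache[nodeId];
  }

  Future<void> _refresh(String nodeId) async {
    final fresh = await _fetch(nodeId);
    _cache[nodeId] = fresh;
    _updates.add(MapEntry(nodeId, fresh)); // push update without blocking
  }
}
```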
The 5-chapter limit and primary-assignment constraint are NHF-specific. Applying these rules globally may break HLF and Blindeforbundet configurations where different limits apply, requiring per-organization configuration that was not initially scoped.
Mitigation & Contingency
Mitigation: Make the maximum assignment count a configurable value stored in the organization's feature-flag or settings table rather than a hardcoded constant. Design the assignment service to read this limit at runtime per organization.
Contingency: Default the limit to a high value (e.g., 100) for organizations other than NHF, effectively making it non-restrictive, while keeping the enforcement logic intact for when per-org configuration is fully implemented.
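Reading the limit at runtime with the permissive fallback might look like the following sketch; the table name, column name, and constant are assumptions, not an existing schema:

```dart
// Non-restrictive default for orgs without a configured limit (per the
// contingency above); NHF would store 5 in its settings row.
const int kUnrestrictedAssignmentLimit = 100;

Future<int> maxAssignmentsFor(String orgId, SupabaseClient supabase) async {
  // Hypothetical settings table keyed by organization.
  final row = await supabase
      .from('organization_settings')
      .select('max_assignments')
      .eq('org_id', orgId)
      .maybeSingle();
  return (row?['max_assignments'] as int?) ?? kUnrestrictedAssignmentLimit;
}
```

Keeping enforcement behind this single lookup means the assignment service never hardcodes the NHF rule.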
The searchable parent dropdown in HierarchyNodeEditor must search across up to 1,400 units efficiently. Client-side filtering of the full hierarchy may be slow; server-side search adds complexity and latency.
Mitigation & Contingency
Mitigation: Use the in-memory hierarchy cache as the search corpus — since the cache already holds the flat unit list, client-side filtering with a debounced input is sufficient and avoids extra Supabase calls. Pre-build a search index on cache load.
Contingency: Cap the dropdown to showing the 50 most recently accessed units by default, with a 'search all' option that triggers a server-side full-text query. This keeps the common case fast while supporting edge cases.
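The pre-built index from the mitigation can be a one-time lowercasing pass over the cached flat list, so each debounced keystroke only does a cheap substring scan over ~1,400 entries. A sketch in plain Dart; the `Unit` shape is an assumption:

```dart
class Unit {
  const Unit(this.id, this.name);
  final String id;
  final String name;
}

/// Built once when the hierarchy cache loads; queried on every search event.
class UnitSearchIndex {
  UnitSearchIndex(List<Unit> units)
      : _entries = [for (final u in units) (u.name.toLowerCase(), u)];

  final List<(String, Unit)> _entries; // pre-lowercased search corpus

  List<Unit> search(String query, {int limit = 50}) {
    final q = query.toLowerCase().trim();
    if (q.isEmpty) return const [];
    return [
      for (final (name, unit) in _entries)
        if (name.contains(q)) unit
    ].take(limit).toList(); // cap results, matching the dropdown's page size
  }
}
```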