Priority: high · Complexity: medium · Area: testing · Status: pending · Assignee: testing specialist · Tier: 1

Acceptance Criteria

Test file at test/query_builders/export_data_query_builder_test.dart runs with flutter test
Single-chapter scope test: query returns only activities belonging to the specified chapter org_id — no activities from sibling chapters
Multi-level hierarchy scope test: query with a region-level scope returns all activities under all child chapters of that region
Date range boundary test (inclusive start): activity on exactly the start date is included in the result
Date range boundary test (inclusive end): activity on exactly the end date is included in the result
Date range boundary test (exclusive): activities one day before the start date and one day after the end date are excluded
Activities with no matching activity_type_configuration row are included in the ExportDataSet with a null or sentinel value for the configuration fields — not silently dropped
Empty result set: when no activities match the scope/date range, ExportDataSet is returned with an empty activities list and no exception is thrown
Duplicate row test: fixture with two activities sharing the same ID (simulating a bad join) results in exactly one entry per unique activity_id in ExportDataSet
50-activity fixture test: the full ExportDataSet from the fixture matches a pre-computed expected output object field-by-field (golden assertion)
All field mappings in ExportDataSet are verified — no required Bufdir field is missing from joined results
Test suite runs in under 5 seconds with no real Supabase calls
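The criteria above map naturally onto one grouped test file. A minimal skeleton (a sketch only; group names and the test file layout are suggestions, not the final API):

```dart
import 'package:flutter_test/flutter_test.dart';

void main() {
  group('ExportDataQueryBuilder', () {
    group('scope', () {
      test('single chapter excludes sibling chapters', () { /* ... */ });
      test('region scope includes all child chapters', () { /* ... */ });
    });
    group('date boundaries', () {
      test('activity on the start date is included', () { /* ... */ });
      test('activity on the end date is included', () { /* ... */ });
      test('activities outside the range are excluded', () { /* ... */ });
    });
    group('robustness', () {
      test('missing activity_type_configuration yields null fields', () { /* ... */ });
      test('empty result returns an empty ExportDataSet', () { /* ... */ });
      test('duplicate activity_id rows are deduplicated', () { /* ... */ });
    });
    group('golden', () {
      test('50-activity fixture matches the expected ExportDataSet', () { /* ... */ });
    });
  });
}
```

Grouping this way keeps the flutter_test output aligned with the acceptance criteria, so a failing criterion is identifiable from the test name alone.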

Technical Requirements

Frameworks
Flutter
flutter_test
mockito or mocktail
APIs
Supabase PostgREST client (mocked)
Data Models
ExportDataSet
ActivityRecord
ActivityTypeConfiguration
OrgScope
Performance Requirements
Query builder must handle the 50-activity fixture and produce results in under 100 ms in the test environment
Security Requirements
Fixture data must use synthetic IDs — no real member or org data
Verify the query builder does not leak data across org scopes in the multi-tenant fixture
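A multi-tenant fixture satisfying both security requirements could look like this (a sketch; the IDs and field names are synthetic placeholders, not real schema guarantees):

```dart
// Synthetic IDs only — no real member or org data may appear in fixtures.
const orgA = 'org-aaaa-0001';
const orgB = 'org-bbbb-0001';

const multiTenantRows = [
  {'activity_id': 'act-0001', 'org_id': orgA, 'date': '2024-03-01'},
  {'activity_id': 'act-0002', 'org_id': orgB, 'date': '2024-03-01'},
];

// The isolation test runs an orgA-scoped query against this fixture and
// asserts that no row with org_id == orgB appears in the ExportDataSet.
```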

Execution Context

Execution Tier
Tier 1

Tier 1 - 540 tasks

Can start after Tier 0 completes

Implementation Notes

Structure ExportDataQueryBuilder to accept an abstract DataSource interface (not SupabaseClient directly) so tests can inject a fake that returns fixture data without mocking the full Supabase chain. The join-correctness tests are essentially contract tests: define the expected SQL/PostgREST query shape and assert the builder produces it, then separately assert the domain mapping from raw rows to ExportDataSet.

For the duplicate-row test, inject a fake that returns two rows with identical activity_id values and verify the builder deduplicates before returning. For missing activity_type_configuration entries, decide and document the policy (null fields vs. default values) before writing the test — the test is the specification. Store the 50-activity fixture as a static const list in a fixture helper class to keep the test file readable.
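The seam described above might be sketched as follows. ExportQuery, fetchRows, and the constructor shapes are assumptions; ExportDataSet, ActivityRecord, and OrgScope are the data models listed under Technical Requirements:

```dart
// Assumed value object carrying the query parameters to the data source.
class ExportQuery {
  ExportQuery({required this.scope, required this.start, required this.end});
  final OrgScope scope;
  final DateTime start;
  final DateTime end;
}

// The abstraction tests fake out, instead of mocking the Supabase chain.
abstract class DataSource {
  Future<List<Map<String, dynamic>>> fetchRows(ExportQuery query);
}

class ExportDataQueryBuilder {
  ExportDataQueryBuilder(this._source);
  final DataSource _source;

  Future<ExportDataSet> build(OrgScope scope, DateTime start, DateTime end) async {
    final rows =
        await _source.fetchRows(ExportQuery(scope: scope, start: start, end: end));
    // Deduplicate on activity_id before mapping, per the duplicate-row policy:
    // the first row for each id wins; later duplicates are dropped.
    final byId = <String, Map<String, dynamic>>{};
    for (final row in rows) {
      byId.putIfAbsent(row['activity_id'] as String, () => row);
    }
    return ExportDataSet(
      activities: [for (final row in byId.values) ActivityRecord.fromRow(row)],
    );
  }
}
```

With this shape, the contract tests assert on the ExportQuery the builder hands to the DataSource, and the mapping tests feed raw rows through a fake and assert on the resulting ExportDataSet — two independent failure surfaces.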

Testing Requirements

Unit tests with flutter_test and mocked Supabase. Define fixture files in test/fixtures/export_data/ as Dart const maps or JSON files loaded at test startup. Group tests by scenario: scope tests, date boundary tests, missing-config tests, deduplication, golden output. For the 50-activity golden test, commit the expected ExportDataSet as a fixture and assert equality using a deep-equals matcher.

Use verify() to assert the mock was called with the correct query parameters (table, filters, select fields). Measure test runtime and fail CI if it exceeds 5 seconds.
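A verify() assertion with mocktail could look like this (a sketch assuming the DataSource/ExportQuery seam from the Implementation Notes; chapterScope is a placeholder fixture):

```dart
class _MockDataSource extends Mock implements DataSource {}

void main() {
  // mocktail needs a fallback instance before any()/captureAny() can match
  // a custom type like ExportQuery.
  setUpAll(() => registerFallbackValue(ExportQuery(
      scope: chapterScope, start: DateTime(2024), end: DateTime(2024))));

  test('builder issues the expected query', () async {
    final source = _MockDataSource();
    when(() => source.fetchRows(any())).thenAnswer((_) async => []);

    await ExportDataQueryBuilder(source)
        .build(chapterScope, DateTime(2024, 1, 1), DateTime(2024, 12, 31));

    // Capture the query the builder actually produced and assert its shape.
    final query = verify(() => source.fetchRows(captureAny()))
        .captured.single as ExportQuery;
    expect(query.scope, chapterScope);
    expect(query.start, DateTime(2024, 1, 1));
    expect(query.end, DateTime(2024, 12, 31));
  });
}
```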

Component
Export Data Query Builder
Domain: data · Priority: high
Epic Risks (3)
Impact: high · Probability: medium · Category: technical

NHF's three-level hierarchy (national / region / chapter) with 1,400 chapters may have edge cases such as chapters belonging to multiple regions, orphaned nodes, or missing parent links in the database. Incorrect scope expansion would silently under- or over-report activities, which could invalidate a Bufdir submission.

Mitigation & Contingency

Mitigation: Obtain a full hierarchy fixture export from NHF before implementation begins. Write exhaustive unit tests covering boundary cases: single chapter, full national roll-up, chapters with no activities, and chapters assigned to multiple regions. Validate resolver output against a known-good manual count.

Contingency: If hierarchy data quality is too poor for automated resolution at launch, implement a manual scope override in the coordinator UI that allows the coordinator to explicitly select org units from a tree picker, bypassing the resolver.
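The edge cases named above (shared parents, orphaned nodes, missing links) all reduce to how scope expansion walks the hierarchy. A set-based traversal handles them defensively (a sketch over an assumed parent-to-children map, standing in for the real hierarchy table):

```dart
/// Expands an org scope to the set of all descendant org IDs.
/// Set semantics mean a chapter reachable via two regions is counted once,
/// and the visited check also terminates on accidental cycles.
Set<String> expandScope(String rootOrgId, Map<String, List<String>> children) {
  final result = <String>{rootOrgId};
  final queue = [rootOrgId];
  while (queue.isNotEmpty) {
    final next = queue.removeLast();
    for (final child in children[next] ?? const <String>[]) {
      if (result.add(child)) queue.add(child);
    }
  }
  return result; // orphaned nodes simply never appear under any root
}
```

The boundary-case unit tests then become direct assertions: expanding a leaf chapter returns only itself, expanding the national root returns every reachable org, and a chapter listed under two regions appears exactly once in either region's expansion.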

Impact: medium · Probability: high · Category: dependency

The activity_type_configuration table may not cover all activity types currently in use, leaving a subset unmapped at launch. Bufdir submissions with unmapped categories will be incomplete and may be rejected by Bufdir.

Mitigation & Contingency

Mitigation: Run a query against production activity data before implementation to enumerate all distinct activity type IDs. Cross-reference with Bufdir's published category schema (request from Norse Digital Products). Flag every gap as a known issue and build the warning surface into the preview panel.

Contingency: Implement a fallback 'Other' category bucket for unmapped types and surface a prominent warning in the export preview requiring coordinator acknowledgement before proceeding. Log unmapped types for post-launch cleanup.
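The fallback policy could be as small as a single resolver function (a sketch; the function name, the configured-mapping shape, and the onUnmapped hook are assumptions):

```dart
/// Resolves an activity type to its Bufdir category, falling back to 'Other'
/// for unmapped types and reporting each gap so it can be logged and
/// surfaced in the export preview warning.
String resolveBufdirCategory(
  String activityTypeId,
  Map<String, String> configuredMappings, {
  required void Function(String unmappedTypeId) onUnmapped,
}) {
  final mapped = configuredMappings[activityTypeId];
  if (mapped != null) return mapped;
  onUnmapped(activityTypeId); // feeds the preview warning and cleanup log
  return 'Other';
}
```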

Impact: high · Probability: low · Category: security

Supabase RLS policies on generated_reports and the storage bucket must enforce strict org isolation. A misconfigured policy could allow a coordinator from one organisation to read another organisation's export files, creating a serious data breach with GDPR implications.

Mitigation & Contingency

Mitigation: Write RLS integration tests that attempt cross-org reads with explicitly different JWT tokens and assert that all attempts return empty sets or 403 errors. Include RLS policy review in the pull request checklist. Use Supabase's built-in policy tester during development.
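One such cross-org RLS test might look like this (a sketch; clientWithJwt, the token fixtures, and the org ID constants are assumed test helpers, and the Supabase Dart query chain shown should be checked against the client version in use):

```dart
test('coordinator from org B cannot read org A generated_reports', () async {
  // Authenticate as a coordinator belonging to a *different* organisation.
  final orgBClient = clientWithJwt(orgBCoordinatorToken);

  // Deliberately query org A's rows; RLS must filter them out server-side.
  final rows = await orgBClient
      .from('generated_reports')
      .select()
      .eq('org_id', orgAId);

  // An empty set is the only acceptable outcome — anything else is a leak.
  expect(rows, isEmpty);
});
```

The same pattern repeats for the storage bucket: attempt a signed-URL fetch of org A's export file with org B's token and assert the request is rejected.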

Contingency: If a policy gap is discovered post-deployment, immediately revoke all signed URLs for affected exports, audit the access log for unauthorised reads, and issue a coordinated disclosure to affected organisations per GDPR breach notification requirements.