
perf(queriesObserver): fix O(n²) performance issue in batch updates #9467


Status: Open. Wants to merge 1 commit into base: main.

Conversation

@joseph0926 (Contributor) commented Jul 20, 2025

This is a follow-up attempt at resolving an issue I have tackled before.
Previous issue
First Resolution Attempt PR

Note: I'm not a native English speaker, so I've used AI to help organize and articulate my thoughts clearly in this PR.

Background

This PR addresses a long-standing performance issue first reported in #8295, where using useQueries with batch data fetching patterns (like DataLoader) causes severe performance degradation due to O(n²) complexity.

The Problem

When multiple queries resolve simultaneously (common with DataLoader pattern), the performance degrades quadratically:

// Example: DataLoader batches multiple requests
const userLoader = new DataLoader(async (ids) => {
  const users = await fetch(`/api/users?ids=${ids.join(',')}`);
  return users; // Returns [user1, user2, user3, ...] all at once
});

// Used with useQueries
const queries = useQueries({
  queries: userIds.map(id => ({
    queryKey: ["user", id],
    queryFn: () => userLoader.load(id),
  })),
});

The issue occurs because:

  1. DataLoader fetches all data in one batch
  2. Each individual promise resolves separately
  3. Each resolution triggers an update that searches through ALL observers
  4. Result: n queries × n searches = O(n²) complexity
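The quadratic cost described above can be modeled in isolation. The sketch below is a simplified stand-in for the real QueriesObserver (the names and structure are illustrative, not the library's actual code): each of n resolutions triggers a linear scan over all n observers, just as `indexOf` does.

```typescript
// Simplified model of the batch-update cost, not the real QueriesObserver.
// Each of n promise resolutions triggers a linear scan over all n observers,
// mirroring what Array.prototype.indexOf does per update.
function countComparisons(n: number): number {
  const observers = Array.from({ length: n }, (_, i) => ({ id: i }))
  let comparisons = 0
  for (const target of observers) {
    // a linear scan walks the array until it finds `target`
    for (const candidate of observers) {
      comparisons++
      if (candidate === target) break
    }
  }
  return comparisons
}
```

For n observers this performs roughly n·(n+1)/2 comparisons, which grows quadratically: doubling the query count quadruples the work.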

Extra notes

I found two remaining pain points:

  1. Full‑array scans even when the target observer is already known
   // Previous implementation — O(N²)
   function difference<T>(a: T[], b: T[]): T[] {
     return a.filter((x) => !b.includes(x)) // includes ⇒ O(N) per element
   }

If 100 observers change only 1 member, we still perform
100 × 100 = 10 000 comparisons.

  2. Missing early‑return / direct access

    // onUpdate still does a linear search every time
    const index = this.#observers.indexOf(observer) // O(N) per update
    // trackResult recomputes matches on every call
    const matches = this.#findMatchingObservers(queries)

Concrete mental model

Imagine a courier delivering 100 packages to 100 buildings (numbered 101–200):
– For each package, he checks every single building in order.
– He repeats that for all 100 packages.
Total look‑ups: 100 packages × 100 buildings = 10,000

Real-world impact

  • 100 queries = 10,000 operations (noticeable lag)
  • 1,000 queries = 1,000,000 operations (browser freeze)

Previous Improvements

Over the past months, several optimizations have been made:

1. Caching findMatchingObservers results (#8304)

I previously added observerMatches to cache results instead of recalculating:

// Before: Called findMatchingObservers repeatedly
// After: Cache and reuse
this.#observerMatches = newObserverMatches

2. Optimizing difference function (O(n²) → O(n))

// Before: O(n²) with includes
function difference<T>(array1: Array<T>, array2: Array<T>): Array<T> {
  return array1.filter((x) => !array2.includes(x))
}

// After: O(n) with Set
function difference<T>(array1: Array<T>, array2: Array<T>): Array<T> {
  const excludeSet = new Set(array2)
  return array1.filter((x) => !excludeSet.has(x))
}
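The Set-based version from the PR can be exercised directly; a quick usage example:

```typescript
// O(n) set-difference: keep elements of array1 that are absent from array2.
// Building the Set is O(n), and each .has() lookup is O(1) on average,
// versus O(n) per element with Array.prototype.includes.
function difference<T>(array1: Array<T>, array2: Array<T>): Array<T> {
  const excludeSet = new Set(array2)
  return array1.filter((x) => !excludeSet.has(x))
}

console.log(difference([1, 2, 3, 4], [2, 4])) // → [1, 3]
```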

3. Optimizing findMatchingObservers with Map

// Now uses Map for O(1) lookups
const prevObserversMap = new Map(
  this.#observers.map((observer) => [observer.options.queryHash, observer]),
)
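A minimal sketch of the idea, with a hypothetical `findMatching` helper (not the library's actual function) standing in for the matching step: the Map is built once in O(n), and every subsequent lookup by query hash is O(1).

```typescript
// Hypothetical sketch of hash-keyed matching; the real findMatchingObservers
// in query-core has more responsibilities.
type Observer = { queryHash: string }

function findMatching(
  prevObservers: Array<Observer>,
  queryHashes: Array<string>,
): Array<Observer | undefined> {
  // Build once: O(n)
  const prevObserversMap = new Map(
    prevObservers.map((observer) => [observer.queryHash, observer]),
  )
  // Each lookup: O(1) instead of scanning the array per query
  return queryHashes.map((hash) => prevObserversMap.get(hash))
}
```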

The Remaining Issue

Despite these improvements, one critical O(n) operation remains in #onUpdate:

#onUpdate(observer: QueryObserver, result: QueryObserverResult): void {
  const index = this.#observers.indexOf(observer)  // O(n) search!
  if (index !== -1) {
    this.#result = replaceAt(this.#result, index, result)
    this.#notify()
  }
}

With batch updates, this creates:

  • 100 updates × 100 searches = 10,000 operations
  • 1,000 updates × 1,000 searches = 1,000,000 operations

This PR's Solution

This PR introduces a WeakMap to track observer indices, eliminating the O(n) search:

export class QueriesObserver {
  #indexMap: WeakMap<QueryObserver, number> = new WeakMap()

  setQueries(...) {
    // Update index map whenever observers change
    this.#indexMap = new WeakMap()
    newObservers.forEach((observer, index) => {
      this.#indexMap.set(observer, index)
    })
  }

  #onUpdate(observer: QueryObserver, result: QueryObserverResult): void {
    // O(1) lookup instead of O(n) search
    const index = this.#indexMap.get(observer)
    if (index !== undefined) {
      this.#result = replaceAt(this.#result, index, result)
      this.#notify()
    }
  }
}
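The index-map idea can be demonstrated standalone. The class below is a stripped-down sketch (names are illustrative; the real QueriesObserver carries much more state): the array remains the source of truth, while the WeakMap provides O(1) index lookup.

```typescript
// Stripped-down sketch of the WeakMap index-map pattern.
type Observer = { id: number }

class IndexedObservers {
  #observers: Array<Observer> = []
  #indexMap: WeakMap<Observer, number> = new WeakMap()

  setObservers(observers: Array<Observer>): void {
    // The array stays the single source of truth; the WeakMap is rebuilt
    // alongside it so each observer maps to its current index.
    this.#observers = observers
    this.#indexMap = new WeakMap()
    observers.forEach((observer, index) => {
      this.#indexMap.set(observer, index)
    })
  }

  // O(1) lookup replacing this.#observers.indexOf(observer)
  indexOf(observer: Observer): number | undefined {
    return this.#indexMap.get(observer)
  }
}
```

Because WeakMap keys are held weakly, dropping an observer from the array (and all other references to it) lets its index entry be garbage collected without manual bookkeeping.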

Why WeakMap?

I chose WeakMap over regular Map to address the concern raised in #8686 about storing observers in both a Map and an Array:

"I'm not a fan of storing the observers in both a Map and an Array, and the whole thing becomes more complex."

WeakMap solves this elegantly:

  • No dual storage: the WeakMap only stores indices, not observers themselves. The Array remains the single source of truth for observers
  • Automatic cleanup: When observers are removed from the array, WeakMap entries are automatically garbage collected - no manual synchronization needed
  • Minimal overhead: Acts purely as a lookup table without adding complexity to the data model
  • Memory efficient: No risk of memory leaks from orphaned references

This approach maintains the simplicity of the original design while achieving O(1) performance.

Performance Results

With this change, batch updates now scale linearly:

  • Before: O(n²) - 100 queries took ~10,000 operations
  • After: O(n) - 100 queries take ~100 operations
  • Improvement: ~100x faster for typical DataLoader use cases

Summary

This completes the optimization journey for QueriesObserver:

  1. findMatchingObservers caching (my previous PR)
  2. difference function: O(n²) → O(n)
  3. findMatchingObservers: O(n) → O(1) with Map
  4. #onUpdate: O(n) → O(1) with WeakMap (this PR)

The combination of these improvements transforms what was once a quadratic bottleneck into efficient linear scaling, making useQueries viable for large-scale batch operations.


nx-cloud bot commented Jul 20, 2025

View your CI Pipeline Execution ↗ for commit 5f51b4a

Command | Status | Duration
nx affected --targets=test:sherif,test:knip,tes... | ✅ Succeeded | 3m 48s
nx run-many --target=build --exclude=examples/*... | ✅ Succeeded | 1m 19s

☁️ Nx Cloud last updated this comment at 2025-07-20 02:58:15 UTC


pkg-pr-new bot commented Jul 20, 2025


@tanstack/angular-query-devtools-experimental

npm i https://pkg.pr.new/@tanstack/angular-query-devtools-experimental@9467

@tanstack/angular-query-experimental

npm i https://pkg.pr.new/@tanstack/angular-query-experimental@9467

@tanstack/eslint-plugin-query

npm i https://pkg.pr.new/@tanstack/eslint-plugin-query@9467

@tanstack/query-async-storage-persister

npm i https://pkg.pr.new/@tanstack/query-async-storage-persister@9467

@tanstack/query-broadcast-client-experimental

npm i https://pkg.pr.new/@tanstack/query-broadcast-client-experimental@9467

@tanstack/query-core

npm i https://pkg.pr.new/@tanstack/query-core@9467

@tanstack/query-devtools

npm i https://pkg.pr.new/@tanstack/query-devtools@9467

@tanstack/query-persist-client-core

npm i https://pkg.pr.new/@tanstack/query-persist-client-core@9467

@tanstack/query-sync-storage-persister

npm i https://pkg.pr.new/@tanstack/query-sync-storage-persister@9467

@tanstack/react-query

npm i https://pkg.pr.new/@tanstack/react-query@9467

@tanstack/react-query-devtools

npm i https://pkg.pr.new/@tanstack/react-query-devtools@9467

@tanstack/react-query-next-experimental

npm i https://pkg.pr.new/@tanstack/react-query-next-experimental@9467

@tanstack/react-query-persist-client

npm i https://pkg.pr.new/@tanstack/react-query-persist-client@9467

@tanstack/solid-query

npm i https://pkg.pr.new/@tanstack/solid-query@9467

@tanstack/solid-query-devtools

npm i https://pkg.pr.new/@tanstack/solid-query-devtools@9467

@tanstack/solid-query-persist-client

npm i https://pkg.pr.new/@tanstack/solid-query-persist-client@9467

@tanstack/svelte-query

npm i https://pkg.pr.new/@tanstack/svelte-query@9467

@tanstack/svelte-query-devtools

npm i https://pkg.pr.new/@tanstack/svelte-query-devtools@9467

@tanstack/svelte-query-persist-client

npm i https://pkg.pr.new/@tanstack/svelte-query-persist-client@9467

@tanstack/vue-query

npm i https://pkg.pr.new/@tanstack/vue-query@9467

@tanstack/vue-query-devtools

npm i https://pkg.pr.new/@tanstack/vue-query-devtools@9467

commit: 5f51b4a


codecov bot commented Jul 20, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 59.60%. Comparing base (05c62a0) to head (5f51b4a).


@@             Coverage Diff             @@
##             main    #9467       +/-   ##
===========================================
+ Coverage   45.30%   59.60%   +14.29%     
===========================================
  Files         208      137       -71     
  Lines        8283     5525     -2758     
  Branches     1869     1477      -392     
===========================================
- Hits         3753     3293      -460     
+ Misses       4085     1930     -2155     
+ Partials      445      302      -143     
Components Coverage Δ
@tanstack/angular-query-devtools-experimental ∅ <ø> (∅)
@tanstack/angular-query-experimental 85.00% <ø> (ø)
@tanstack/eslint-plugin-query ∅ <ø> (∅)
@tanstack/query-async-storage-persister 43.85% <ø> (ø)
@tanstack/query-broadcast-client-experimental 24.39% <ø> (ø)
@tanstack/query-codemods ∅ <ø> (∅)
@tanstack/query-core 97.69% <100.00%> (+<0.01%) ⬆️
@tanstack/query-devtools 3.55% <ø> (ø)
@tanstack/query-persist-client-core 79.47% <ø> (ø)
@tanstack/query-sync-storage-persister 84.61% <ø> (ø)
@tanstack/query-test-utils ∅ <ø> (∅)
@tanstack/react-query 95.95% <ø> (ø)
@tanstack/react-query-devtools 10.00% <ø> (ø)
@tanstack/react-query-next-experimental ∅ <ø> (∅)
@tanstack/react-query-persist-client 100.00% <ø> (ø)
@tanstack/solid-query 78.13% <ø> (ø)
@tanstack/solid-query-devtools ∅ <ø> (∅)
@tanstack/solid-query-persist-client 100.00% <ø> (ø)
@tanstack/svelte-query 87.58% <ø> (ø)
@tanstack/svelte-query-devtools ∅ <ø> (∅)
@tanstack/svelte-query-persist-client 100.00% <ø> (ø)
@tanstack/vue-query 71.10% <ø> (ø)
@tanstack/vue-query-devtools ∅ <ø> (∅)
