
fix(pointcloud): near-term batch — correctness + code-org #614

Open
louistrue wants to merge 23 commits into main from
claude/pointcloud-near-term-LBxtJ

Conversation


@louistrue louistrue commented May 1, 2026

Four items from issue #611's near-term list:

  1. computeBBox empty + non-finite guards (e57 + ifcx-points). Empty
    or NaN-poisoned chunks no longer produce ±Infinity bounds.

  2. Magic-byte-first format detection. detectPointCloudFormat now
    probes the buffer before falling back to extension, so a LAS
    file named *.ply still loads correctly.

  3. E57 packet-bounds + per-stream guards. Validate that the
    DataPacket header, length table, and each bytestream fit inside
    payloadEnd before reading.

  4. Code-org splits:

    • e57.ts (631 lines) → e57-page.ts + e57-xml.ts + e57-decode.ts
      + e57.ts orchestrator (all <400 lines)
    • point-cloud-renderer.ts (420 lines) → extract uniform writer
      to point-cloud-uniforms.ts (renderer drops to 364 lines)

Verified: 62 pointcloud unit tests, full repo typecheck (24/24).
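The magic-byte-first detection in item 2 can be sketched as follows. This is a minimal illustration, not the actual detectPointCloudFormat implementation — the helper names and the exact set of probed signatures are assumptions; the signatures shown ("LASF" for LAS/LAZ, "ply" for PLY, "ASTM-E57" for E57) are the published magic bytes for those formats.

```typescript
// Sketch: probe magic bytes before trusting the file extension.
// Hypothetical helpers; the real detectPointCloudFormat may differ.
type PointCloudFormat = "las" | "ply" | "e57" | "pts" | "xyz" | null;

function sniffMagic(buf: ArrayBuffer): PointCloudFormat {
  const head = new Uint8Array(buf.slice(0, 8));
  const ascii = String.fromCharCode(...head);
  if (ascii.startsWith("LASF")) return "las"; // LAS/LAZ public header signature
  if (ascii.startsWith("ply")) return "ply"; // PLY header keyword
  if (ascii.startsWith("ASTM-E57")) return "e57"; // E57 file signature
  return null;
}

function detectFormat(buf: ArrayBuffer, filename: string): PointCloudFormat {
  // Magic bytes win; the extension is only a fallback.
  const byMagic = sniffMagic(buf);
  if (byMagic) return byMagic;
  const ext = filename.split(".").pop()?.toLowerCase();
  if (ext === "pts" || ext === "xyz") return ext;
  return null;
}
```

With this ordering, a LAS buffer handed in as `scan.ply` still resolves to "las", which is the behavior the PR description claims.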

Summary by CodeRabbit

  • New Features

    • Per-class visibility controls with a class checklist and visible-count badge
    • BIM↔scan deviation heatmap with compute button, progress and range slider
    • ASCII PTS/XYZ point-cloud support and file-picker acceptance
    • Rectangle (marquee) selection for meshes and point clouds
    • Point-cloud legends, fixed-color picker, preview density while dragging section plane
    • Stream cancel button and hover tooltips showing world coordinates
  • Bug Fixes

    • Better format detection and robust handling of malformed/empty point data

claude added 2 commits May 1, 2026 10:46
Four items from issue #611's near-term list:

1. computeBBox empty + non-finite guards (e57 + ifcx-points). Empty
   or NaN-poisoned chunks no longer produce ±Infinity bounds.

2. Magic-byte-first format detection. detectPointCloudFormat now
   probes the buffer before falling back to extension, so a LAS
   file named *.ply still loads correctly.

3. E57 packet-bounds + per-stream guards. Validate that the
   DataPacket header, length table, and each bytestream fit inside
   payloadEnd before reading.

4. Code-org splits:
   - e57.ts (631 lines) → e57-page.ts + e57-xml.ts + e57-decode.ts
     + e57.ts orchestrator (all <400 lines)
   - point-cloud-renderer.ts (420 lines) → extract uniform writer
     to point-cloud-uniforms.ts (renderer drops to 364 lines)

Verified: 62 pointcloud unit tests, full repo typecheck (24/24).
Four UX items from issue #611:

1. Hover XYZ readback. Picker now copies the depth texel alongside
   the color texel, unprojects through inverse(viewProj), and adds
   PickResult.worldXYZ. HoverTooltip shows x/y/z under the entity
   id. Reverse-Z aware (depth=1 near, 0 far/miss).

2. Solid-color picker. Native <input type="color"> in the point
   cloud panel when colour mode is "fixed". Hex round-trips through
   the [r,g,b,a] store tuple.

3. Colour-mode legend. New PointCloudLegend component renders
   inline beneath the mode buttons:
   - Classification → ASPRS LAS 1.4 class id/color/label list
   - Intensity → black-to-white gradient bar
   - Height → cool-warm gradient bar
   Palettes mirror point-shader.wgsl.ts exactly.

4. Cancel button. New activeStreamCanceller on the loading slice;
   both ingest sites register/clear streamHandle.cancel(). StatusBar
   shows a Cancel button while non-null. AbortError → "Cancelled"
   instead of an error toast.

Verified: full repo typecheck (24/24), 655/655 tests pass, viewer
Vite build green.
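The reverse-Z unprojection in item 1 can be sketched like this. It is an illustration only, assuming a column-major (gl-matrix-style) inverse view-projection matrix; the parameter shapes are assumptions, not the real picker signature.

```typescript
// Sketch: unproject a depth texel to world space under reverse-Z
// (depth = 1 at the near plane, 0 at far / miss).
// invViewProj is assumed column-major; all names are hypothetical.
function unprojectPickSample(
  ndcX: number,
  ndcY: number,
  depth: number, // raw value read back from the depth texture
  invViewProj: Float32Array,
): [number, number, number] | null {
  if (depth <= 0) return null; // reverse-Z: 0 means far plane / no hit
  const clip = [ndcX, ndcY, depth, 1];
  const out = [0, 0, 0, 0];
  for (let row = 0; row < 4; row++) {
    out[row] =
      invViewProj[row] * clip[0] +
      invViewProj[4 + row] * clip[1] +
      invViewProj[8 + row] * clip[2] +
      invViewProj[12 + row] * clip[3];
  }
  const w = out[3];
  if (!Number.isFinite(w) || Math.abs(w) < 1e-12) return null;
  return [out[0] / w, out[1] / w, out[2] / w]; // perspective divide
}
```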

vercel Bot commented May 1, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project Deployment Actions Updated (UTC)
ifc-lite Ready Ready Preview, Comment May 3, 2026 7:46am
ifc-lite-viewer-embed Ready Ready Preview, Comment May 3, 2026 7:46am



coderabbitai Bot commented May 1, 2026

Warning

Rate limit exceeded

@louistrue has exceeded the limit for the number of commits that can be reviewed per hour. Please wait 46 minutes and 10 seconds before requesting another review.


ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: a39497ba-79e2-47d1-bf0e-c563652abcf1

📥 Commits

Reviewing files that changed from the base of the PR and between 6013ac6 and 5d0d26d.

📒 Files selected for processing (7)
  • .changeset/pointcloud-pr614-review-r2.md
  • apps/viewer/src/hooks/ingest/pointCloudIngest.ts
  • apps/viewer/src/hooks/useIfcFederation.ts
  • apps/viewer/src/hooks/useIfcLoader.ts
  • packages/pointcloud/src/formats/e57-decode.ts
  • packages/pointcloud/src/formats/e57.test.ts
  • packages/pointcloud/src/streaming/e57-source.ts

Walkthrough

Adds broad point-cloud functionality: ASCII PTS/XYZ decoding and streaming, E57 multi-scan pose and ScaledInteger decoding, per-class LAS visibility mask, preview-density throttling, GPU BIM↔scan deviation compute with BVH, rectangle (marquee) GPU picking, hover world-space readback, streaming cancellation, renderer uniform/shader plumbing, and viewer UI/state to control and surface these features.

Changes

Release Notes

Layer / File(s) Summary
Changesets
.changeset/*.md
Multiple changeset markdown files documenting features and bumps for pointcloud/renderer/viewer packages.

ASCII PTS/XYZ Format & Streaming

Layer / File(s) Summary
Data Shape / Types
packages/pointcloud/src/formats/ascii-points.ts, packages/pointcloud/src/formats/ascii-points.test.ts
Adds ASCII PTS/XYZ parser: layout probing, column inference (pos/intensity/RGB), comment/blank-line skipping, normalization heuristics, unit tests.
Streaming Source
packages/pointcloud/src/streaming/ascii-points-source.ts, packages/pointcloud/src/streaming/decode-worker.ts
Whole-file ASCII streaming source that decodes on open, applies optional stride downsampling, worker updated to create ASCII source.
Public Exports / Protocol
packages/pointcloud/src/index.ts, packages/pointcloud/src/streaming/protocol.ts, packages/pointcloud/src/streaming/worker-client.ts
Re-exports ASCII utilities and types; extends worker format unions to include 'pts'/'xyz'.
Viewer Ingest Integration
apps/viewer/src/hooks/ingest/pointCloudIngest.ts, apps/viewer/src/components/viewer/MainToolbar.tsx, apps/viewer/src/components/viewer/ViewportContainer.tsx
Detects/accepts .pts/.xyz, detectPointCloudFormat probes magic/ASCII before falling back to extensions, file-picker accept lists updated.

E57 Refactor + ScaledInteger + Multi-scan Poses

Layer / File(s) Summary
Page/IO & Bit utilities
packages/pointcloud/src/formats/e57-page.ts
E57 file header parsing, CRC-stripped logical buffer creation, page-offset helpers and bounds validation.
XML Prototype & Pose Parsing
packages/pointcloud/src/formats/e57-xml.ts, packages/pointcloud/src/formats/e57.test.ts
Parses prototypes and optional per-scan <pose> into typed E57Pose; tests updated for pose presence and applyPose behavior.
Binary Decoding
packages/pointcloud/src/formats/e57-decode.ts
Implements decodeE57Scan including ScaledInteger LSB bit-walk (readBitsLE), Float/Integer/ScaledInteger decoding for cartesian/intensity/color/classification, computeBBox robustness.
Orchestration & Re-exports
packages/pointcloud/src/formats/e57.ts
Orchestrator now applies per-scan poses (applyPoseInPlace), re-exports delegated modules; updated tests for ScaledInteger and pose application.

LAZ wasm fetch robustness

Layer / File(s) Summary
WASM Fetch & Factory
packages/pointcloud/src/streaming/laz-source.ts, packages/pointcloud/src/streaming/vendor.d.ts
Pre-fetches laz-perf wasm via resolved ?url, provides wasm bytes to factory (wasmBinary) and improves error messages; ambient module declaration for ?url added.

GPU Deviation Pipeline (BVH + Compute Shader)

Layer / File(s) Summary
BVH Builder
packages/renderer/src/deviation/triangle-bvh.ts
Adds triangle BVH builder that flattens triangles, computes normals/AABBs, and emits GPU-ready node/triangle buffers.
Compute Pipeline
packages/renderer/src/deviation/deviation-pipeline.ts, packages/renderer/src/deviation/deviation-shader.wgsl.ts
Implements GPU compute pipeline and WGSL shader to compute signed point-to-triangle distances with optional clamping; exposes dispatch/upload/destroy APIs.
Renderer Integration
packages/renderer/src/index.ts
Renderer.computeDeviations() orchestrates BVH build/upload, per-chunk dispatch, waits for GPU completion, returns stats and suggested range; BVH fingerprint caching added; pipeline destroyed on cleanup.
Node Storage
packages/renderer/src/pointcloud/point-cloud-node.ts
Adds per-chunk deviationBuffer and allocates/initializes it; vertex buffer usage updated to include STORAGE.

Point-cloud Rendering Uniforms, Shader, and Buffers

Layer / File(s) Summary
Uniform Writer & Types
packages/renderer/src/pointcloud/point-cloud-uniforms.ts
New centralized uniform layout and writer (writePointCloudUniforms), adds color/size mode indices, class bitmask, preview stride, deviationRange fields.
Renderer Wiring
packages/renderer/src/pointcloud/point-cloud-renderer.ts, packages/renderer/src/pointcloud/point-pipeline.ts
PointCloudRenderOptions expanded (classMask, previewStride, deviation fields), sanitization in setOptions, uniform size increased (POINT_UNIFORM_SIZE 208→240), renderer binds deviation buffer and exposes getInternalNodes().
Shader Changes
packages/renderer/src/pointcloud/point-shader.wgsl.ts
Vertex stage culls based on instance_index % stride != 0, applies per-class visibility using flags.w bitmask, and implements deviation color ramp when deviation color mode selected.
Pipeline Vertex Inputs
packages/renderer/src/pointcloud/point-pipeline.ts
Adds instanced deviation attribute at location 4 and updates uniform block size.

Picking, Rectangle Selection & Hover XYZ

Layer / File(s) Summary
Picker Core
packages/renderer/src/picker.ts
Refactors pick pass, reads both pick id and depth (copyTextureToBuffer), unprojects to world-space (unprojectPickSample) returning worldXYZ; adds pickRect for rectangle GPU readback and updateUniforms.
Picking Manager
packages/renderer/src/picking-manager.ts
Adds pickRect delegating to Picker, applies same visibility filters, returns deduped Set of expressIds.
Renderer API
packages/renderer/src/index.ts
Adds Renderer.pickRect() delegating to picking manager and returns Set of ids; PickResult augmented with optional worldXYZ in packages/renderer/src/types.ts.
Mouse/Viewport UX
apps/viewer/src/components/viewer/useMouseControls.ts, apps/viewer/src/components/viewer/Viewport.tsx, apps/viewer/src/components/viewer/RectSelectionOverlay.tsx
Adds Ctrl/⌘+LMB rectangle-select gesture: overlay rendering, suppressed orbit/pan while selecting, setRectSelection updates overlay, handleMouseUp calls renderer.pickRect and updates selection; hover picks forward worldXYZ to hover state.
Store Types
apps/viewer/src/store/types.ts
HoverState.worldXYZ optional field added.

Viewer Store, UI Components & Sync

Layer / File(s) Summary
Store Slice Fields & Actions
apps/viewer/src/store/slices/pointCloudSlice.ts, apps/viewer/src/store/slices/loadingSlice.ts
Adds pointCloudClassMask, pointCloudPreviewStride, deviation params and flags, setActiveStreamCanceller, toggles and setters with validation, default initializations.
Sync Hook
apps/viewer/src/components/viewer/usePointCloudSync.ts
Forwards classMask, previewStride, and deviation range from store to renderer.setPointCloudOptions; effect deps expanded.
UI Panels & Controls
apps/viewer/src/components/viewer/PointCloudPanel.tsx, PointCloudLegend.tsx, PointCloudClasses.tsx, DeviationPanel.tsx, HoverTooltip.tsx
PointCloudPanel integrated with legend, class checklist (32 LAS IDs), fixed-color picker (hex↔rgba), DeviationPanel (compute button, half-range slider, ramp legend), hover tooltip shows formatted world-space xyz when available.
Section Preview Hookup
apps/viewer/src/components/viewer/tools/SectionPanel.tsx
Section slider sets pointCloudPreviewStride to 4 while dragging (when point assets present) and restores to 1 on end/unmount.
Streaming Cancellation UX
apps/viewer/src/hooks/useIfcFederation.ts, apps/viewer/src/hooks/useIfcLoader.ts, apps/viewer/src/components/viewer/StatusBar.tsx
Ingest registers active stream canceller in store; cancellation via AbortError handled as user-cancelled (progress set to Cancelled); StatusBar shows Cancel button while active.
Viewer Overlays
apps/viewer/src/components/viewer/ViewportOverlays.tsx
Passes triangleCount into PointCloudPanel to enable DeviationPanel behavior.

Sequence Diagram(s)

sequenceDiagram
    participant User as User
    participant Panel as DeviationPanel
    participant Renderer as Renderer
    participant GPU as GPU Device
    participant Result as Store/Return

    User->>Panel: Click "Compute Deviations"
    Panel->>Renderer: computeDeviations({maxRange: 1.0})
    Renderer->>Renderer: collect meshes via scene.forEachMeshData
    Renderer->>Renderer: buildTriangleBVH(meshes)
    Renderer->>GPU: upload BVH nodes & triangles
    Renderer->>GPU: dispatch compute shader for point chunks
    GPU->>GPU: compute signed distances, write per-chunk deviation buffers
    GPU-->>Renderer: queue.onSubmittedWorkDone()
    Renderer->>Result: return stats + suggestedHalfRange
    Panel->>Panel: update store, set deviation computed, switch colorMode
sequenceDiagram
    participant User as User
    participant Mouse as useMouseControls
    participant Viewport as Viewport
    participant Renderer as Renderer.pickRect
    participant Picker as Picker
    participant GPU as GPU
    participant Store as Selection Store

    User->>Mouse: Ctrl+LMB drag to select
    Mouse->>Viewport: setRectSelection(rect)
    Viewport->>Viewport: RectSelectionOverlay renders
    User->>Mouse: Release LMB
    Mouse->>Renderer: pickRect(x0,y0,x1,y1)
    Renderer->>Picker: pickRect(...)
    Picker->>GPU: render pick pass for rect
    GPU->>GPU: write pick id texture
    Picker->>GPU: copyTextureToBuffer readback
    GPU-->>Picker: pick buffer data
    Picker->>Renderer: Set<expressId>
    Renderer->>Store: setSelectedEntityIds(Set)
    Mouse->>Viewport: setRectSelection(null)




@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 11698d3678


Comment on lines +493 to +495
try {
await ingest.done;
} finally {

P2: Treat cancelled federated streams as cancellation, not errors

Cancelling a point-cloud stream in the federated loader now calls ingest.streamHandle.cancel(), but this await ingest.done block only has finally cleanup and no AbortError handling, so cancellation bubbles to the outer catch and is surfaced via setError(err.message) as a load failure. This is inconsistent with the single-model loader path (which maps aborts to a benign cancelled state) and causes user-initiated cancel actions to look like real parse errors in federated workflows.


Codex flagged that useIfcFederation's outer catch was reporting a
user-initiated stream cancel as a load failure (setError(message)),
inconsistent with useIfcLoader.loadFile which maps AbortError to a
benign cancelled state.

Detect DOMException with name === "AbortError" before the general
catch path, set progress to "Cancelled" and clear error.
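The fix described above — map AbortError to a benign "Cancelled" state before the general error path — can be sketched as follows. The store setters here are stand-ins for the real actions in useIfcFederation, not the actual code.

```typescript
// Sketch: user-initiated cancel (AbortError) should not surface as a load failure.
// setProgress/setError are hypothetical stand-ins for the viewer store actions.
async function awaitIngest(
  done: Promise<void>,
  setProgress: (msg: string) => void,
  setError: (msg: string | null) => void,
): Promise<boolean> {
  try {
    await done;
    return true;
  } catch (err) {
    if (err instanceof DOMException && err.name === "AbortError") {
      setProgress("Cancelled"); // benign: the user pressed Cancel
      setError(null);
      return false;
    }
    // Everything else really is a load failure.
    setError(err instanceof Error ? err.message : String(err));
    return false;
  }
}
```

This mirrors the single-model loader path, so both ingest sites treat cancellation consistently.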
claude added 14 commits May 1, 2026 11:10
Both formats are line-oriented plain-text scans common in legacy
survey workflows. Shared decoder + streaming source handle both.

Auto-detected layouts by column count of the first data line:
  3  → X Y Z
  4  → X Y Z I
  6  → X Y Z R G B
  7  → X Y Z I R G B (canonical PTS)
  9  → X Y Z R G B Nx Ny Nz       (normals dropped)
  10 → X Y Z I R G B Nx Ny Nz     (PTS-with-normals)
  XYZ with ≥3 unknown cols → positions only, rest skipped.

Comment lines (# or //) and blank lines skipped. Intensity / RGB
range auto-detected (0..1 vs 0..255) per file from observed max.

Wired into:
  - packages/pointcloud: decoder, streaming source, decode-worker,
    protocol, worker-client, index re-exports
  - apps/viewer: format detection, useIfcLoader + useIfcFederation
    pointcloud branches, file picker accept lists in MainToolbar
    and ViewportContainer drop handlers

The "PTS / XYZ not yet supported" toast in describeUnsupportedFormat
is removed.

10 new unit tests cover layout probing, decoder round-trips for the
common shapes, and the comment / header-count edge cases.

Verified: 77 pointcloud tests pass, full repo typecheck (24/24),
viewer Vite build green.
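The column-count table above can be sketched as a small lookup. The layout type and helper name are illustrative — the real parser also auto-detects 0..1 vs 0..255 intensity/RGB ranges per file, which this sketch omits.

```typescript
// Sketch: infer a PTS/XYZ column layout from the first data line's column count.
// Names are hypothetical; normals, when present, are parsed past and dropped.
type AsciiLayout = { intensity: boolean; rgb: boolean; normalsDropped: boolean };

function inferLayout(columnCount: number): AsciiLayout | null {
  switch (columnCount) {
    case 3:  return { intensity: false, rgb: false, normalsDropped: false }; // X Y Z
    case 4:  return { intensity: true,  rgb: false, normalsDropped: false }; // X Y Z I
    case 6:  return { intensity: false, rgb: true,  normalsDropped: false }; // X Y Z R G B
    case 7:  return { intensity: true,  rgb: true,  normalsDropped: false }; // canonical PTS
    case 9:  return { intensity: false, rgb: true,  normalsDropped: true  }; // + Nx Ny Nz
    case 10: return { intensity: true,  rgb: true,  normalsDropped: true  }; // PTS-with-normals
    default:
      // XYZ with >= 3 unknown columns: positions only, rest skipped.
      return columnCount >= 3
        ? { intensity: false, rgb: false, normalsDropped: false }
        : null;
  }
}
```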
…tensity / colour

ScaledInteger is the more compact E57 encoding most real-world
Faro / Trimble / Leica exports use. Previously the decoder threw
on these files; now it loads them directly.

Per ASTM E2807-11 §6.3.4:
  bitsPerRecord = ceil(log2(maximum - minimum + 1))
  bytestream stores raw_int = original - minimum LSB-first,
  decoded float = (raw_int + minimum) * scale + offset.

New readBitsLE() walks a byte buffer and reconstructs each value
using Math.pow(2, n) so precision holds up to 53 bits. Wider fields
throw clearly. readCartesianStream + readIntensityStream now branch
on kind; writeColorChannel gets a ScaledInteger remap. Per-axis
packet capacity computation varies by field kind via
floatOrSiPointCapacity.

The earlier multi-scan pose rejection stays — that's a separate
piece of work.

2 new tests:
- 8-bit ScaledInteger round-trip across X/Y/Z
- 12-bit ScaledInteger crossing byte boundaries (proves the bit
  walk is correct for non-multiple-of-8 widths)

Verified: 63 pointcloud unit tests, full repo typecheck (24/24),
viewer Vite build green.
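The bit-walk and decode formula described above can be sketched like this. It is an illustration of the technique, not the real readBitsLE: the exact signature and error behavior in e57-decode.ts may differ.

```typescript
// Sketch: read `count` bits LSB-first starting at `bitOffset`.
// Math.pow keeps precision where 32-bit shifts would overflow; valid up to 53 bits.
function readBitsLE(bytes: Uint8Array, bitOffset: number, count: number): number {
  if (count > 53) throw new Error(`bit width ${count} exceeds 53-bit float precision`);
  let value = 0;
  for (let i = 0; i < count; i++) {
    const bit = bitOffset + i;
    const byte = bytes[bit >> 3];
    if (byte === undefined) throw new RangeError("read past end of bytestream");
    value += ((byte >> (bit & 7)) & 1) * Math.pow(2, i);
  }
  return value;
}

// ASTM E2807 ScaledInteger: decoded float = (raw + minimum) * scale + offset
function decodeScaledInteger(raw: number, minimum: number, scale: number, offset: number): number {
  return (raw + minimum) * scale + offset;
}
```

A 12-bit field spanning a byte boundary exercises exactly the non-multiple-of-8 case the commit's second test covers.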
Parse each Data3D's <pose> (unit quaternion + translation) and apply
it before merging. Registered multi-scan E57 files now load directly
instead of throwing the "re-export as merged" rejection.

  Data3DEntry.hasPose: boolean → Data3DEntry.pose?: E57Pose
  parsePoseElement walks <pose><rotation/><translation/></pose>;
    non-finite values fall through to identity.
  applyPoseInPlace derives R from the quaternion (Hamilton w+xi+yj+zk),
    computes out = R · in + T per point.
  decodeE57 applies pose post-decode + recomputes bbox.

3 new tests:
  - <pose> XML extraction with 90°-around-Z quat + translation
  - applyPoseInPlace 90°-around-Z + translation per-axis check
  - identity pose round-trip

Verified: 64 pointcloud tests, full repo typecheck (24/24), viewer
Vite build green.
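The per-point pose application described above — out = R · in + T with R derived from a Hamilton-convention unit quaternion — can be sketched as follows. The E57Pose shape here is illustrative; the real type lives in e57-xml.ts.

```typescript
// Sketch: apply a unit-quaternion + translation pose in place.
// Hamilton convention (w + xi + yj + zk); q = [w, x, y, z]. Shape is illustrative.
interface E57Pose { q: [number, number, number, number]; t: [number, number, number] }

function applyPoseInPlace(positions: Float32Array, pose: E57Pose): void {
  const [w, x, y, z] = pose.q;
  const [tx, ty, tz] = pose.t;
  // Standard quaternion-to-rotation-matrix expansion.
  const r00 = 1 - 2 * (y * y + z * z), r01 = 2 * (x * y - w * z), r02 = 2 * (x * z + w * y);
  const r10 = 2 * (x * y + w * z), r11 = 1 - 2 * (x * x + z * z), r12 = 2 * (y * z - w * x);
  const r20 = 2 * (x * z - w * y), r21 = 2 * (y * z + w * x), r22 = 1 - 2 * (x * x + y * y);
  for (let i = 0; i < positions.length; i += 3) {
    const px = positions[i], py = positions[i + 1], pz = positions[i + 2];
    positions[i]     = r00 * px + r01 * py + r02 * pz + tx;
    positions[i + 1] = r10 * px + r11 * py + r12 * pz + ty;
    positions[i + 2] = r20 * px + r21 * py + r22 * pz + tz;
  }
}
```

The 90°-around-Z case from the tests: q = [√2/2, 0, 0, √2/2] maps (1, 0, 0) to (0, 1, 0) before translation.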
…ints

Ctrl/⌘ + LMB drag in the select tool draws a rectangle; release
selects every entity (mesh + point cloud) whose pixel lies inside.
A teal-dashed SVG outline tracks the drag.

  Picker.pickRect(x0,y0,x1,y1,…) renders the same pick pass as
    pick(), reads back the texel rect, dedupes hits to Set<expressId>.
  Picker.renderPickPass extracts the shared render-pass setup.
  PickingManager.pickRect applies hiddenIds + isolatedIds filtering.
  Renderer.pickRect exposes it through the public facade.
  RectSelectionOverlay renders the dashed SVG box.
  useMouseControls grows mouseState.isRectSelecting; suppresses
    orbit/pan during the drag; runs pickRect + setSelectedEntityIds
    on mouseup. 4-pixel minimum rect avoids stray Ctrl-click hits.

CPU-raycast and dynamic-mesh-creation fallbacks that pick() uses
for huge batched models are skipped — rect only sees hydrated
meshes. Acceptable MVP; issue #611's lasso (polygonal) and
per-class isolation are follow-ups.

Verified: full repo typecheck (24/24), 655 viewer tests pass,
viewer Vite build green.
A new Classes section in the point cloud panel checkboxes every
LAS 1.4 class. Hidden classes get culled in the splat shader.

  pointCloudClassMask: number (u32 bitmask, default 0xFFFFFFFF) on
    the point cloud slice; togglePointCloudClass + setMask actions.
  PointCloudRenderOptions.classMask plumbed through; stored in
    uniform flags.w (was unused).
  Splat shader: per vertex check (flags.w >> classId) & 1; hidden
    classes get a degenerate clipPos so they're culled pre-raster.
  New PointCloudClasses component renders a <details> collapsible
    with "Show all" + per-class checkboxes. Swatch colours mirror
    the shader's classification palette.

Class ids ≥32 always show — mask only covers ASPRS standard range.

Verified: full repo typecheck (24/24), 655 viewer tests, viewer
Vite build green.
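The u32 bitmask semantics described above can be sketched on the CPU side; the visibility test mirrors the shader's `(flags.w >> classId) & 1` check. Function names here are illustrative, not the actual slice actions.

```typescript
// Sketch: 32-bit ASPRS class visibility mask. Hypothetical helpers mirroring
// the shader test `(flags.w >> classId) & 1`; class ids >= 32 always show.
const ALL_CLASSES = 0xffffffff;

function toggleClass(mask: number, classId: number): number {
  if (classId < 0 || classId > 31) return mask; // outside ASPRS standard range
  return (mask ^ (1 << classId)) >>> 0; // >>> 0 keeps the value an unsigned u32
}

function isClassVisible(mask: number, classId: number): boolean {
  if (classId >= 32) return true; // mask only covers the standard range
  return ((mask >>> classId) & 1) === 1;
}
```

The `>>> 0` matters in JavaScript: bitwise ops work on signed 32-bit integers, so toggling bit 31 would otherwise yield a negative number that no longer round-trips cleanly into the u32 uniform slot.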
…der drag

Splat shader gains a previewStride uniform that culls every Nth
instance at the start of vs_main. Section-plane slider wires
pointer/key down → stride 4, up → stride 1, so dragging over huge
scans stays responsive.

  POINT_UNIFORM_SIZE 208 → 224 (new extras: vec4<u32> slot;
    extras.x = previewStride, yzw reserved).
  PointCloudRenderOptions.previewStride? clamped to [1, 256].
  Vertex shader culls hidden instances via clipPos = (0, 0, -2, 1).
  pointCloudPreviewStride on the slice (default 1) +
    setPointCloudPreviewStride action.
  usePointCloudSync forwards the stride.
  SectionOverlay slider sets stride 4 on drag start, 1 on release.
    Gated on pointCloudAssetCount > 0 so IFC-only sessions are
    unaffected.

Triangle meshes ignore the stride — they were already smooth.

Verified: typecheck (24/24), 655 viewer tests, viewer build green.
GPU compute pipeline that colours each scan point by signed distance
to the nearest mesh surface. Works with every IFC ingest path
(STEP / IFCx / GLB / federated) and every point cloud format —
anywhere Scene.forEachMeshData reaches and the splat pipeline can
already render.

Pipeline:
  - Per-triangle BVH built from Scene.forEachMeshData. Median split,
    max 16 tris/leaf, flattened to 32-byte nodes during build.
  - Two GPU storage buffers (nodes + triangles), uploaded once per
    mesh-set change, cached by (meshCount, totalPositions).
  - Compute shader (workgroup-size 64): stack-based BVH descent
    pruning by squared point-AABB distance, Ericson §5.1.5
    closest-point-on-triangle, signed distance via face normal.
  - Per-chunk deviation buffer (4 bytes/point, STORAGE+VERTEX) sits
    alongside the splat vertex buffer. Compute reads positions
    straight from the vertex buffer — no CPU copy for streamed.
  - Splat shader: 2nd vertex buffer (location 4 = f32 deviation),
    new 'deviation' color mode, diverging blue→white→red ramp.
    POINT_UNIFORM_SIZE 208→224 for new deviationRange uniform slot.
  - Renderer.computeDeviations({maxRange?, forceRebuild?}) →
    {bvhTriangles, bvhNodes, chunksProcessed, pointsProcessed,
     bounds, suggestedHalfRange}. Awaits onSubmittedWorkDone.
  - DeviationPanel inside PointCloudPanel: compute button (gated
    on triangleCount > 0), duration readout, mm-scale range slider,
    inline gradient legend. Auto-suggests half-range from BVH bbox
    and auto-switches colour mode on success.

Sign: positive = point on outward-normal side of nearest face.
Negative = inside/behind. Per-surface front/back is always defined
even on non-watertight BIM.

Verified: full repo typecheck (24/24), 655 viewer tests, viewer
Vite build green.
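The sign convention described above can be isolated into a small sketch. This shows only the final signing step; the BVH descent and Ericson closest-point-on-triangle computation that produce `closest` and `faceNormal` are assumed and not shown.

```typescript
// Sketch of the sign convention only: positive = point on the outward-normal
// side of the nearest face, negative = inside/behind. Names are illustrative.
type Vec3 = [number, number, number];

function signedDeviation(point: Vec3, closest: Vec3, faceNormal: Vec3): number {
  const dx = point[0] - closest[0];
  const dy = point[1] - closest[1];
  const dz = point[2] - closest[2];
  const dist = Math.hypot(dx, dy, dz);
  // Sign from the direction of the offset relative to the face normal.
  const side = dx * faceNormal[0] + dy * faceNormal[1] + dz * faceNormal[2];
  return side >= 0 ? dist : -dist;
}
```

Because the sign comes from each nearest face's own normal, front/back is defined per surface, which is why the scheme still behaves on non-watertight BIM geometry.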
# Conflicts:
#	apps/viewer/src/hooks/ingest/pointCloudIngest.ts
# Conflicts:
#	packages/pointcloud/src/formats/e57.ts
# Conflicts:
#	packages/pointcloud/src/formats/e57.ts
# Conflicts:
#	apps/viewer/src/components/viewer/useMouseControls.ts
#	packages/renderer/src/picker.ts
# Conflicts:
#	apps/viewer/src/components/viewer/PointCloudPanel.tsx
#	packages/renderer/src/pointcloud/point-cloud-renderer.ts
# Conflicts:
#	apps/viewer/src/components/viewer/usePointCloudSync.ts
#	apps/viewer/src/store/slices/pointCloudSlice.ts
#	packages/renderer/src/pointcloud/point-cloud-renderer.ts
# Conflicts:
#	apps/viewer/src/components/viewer/PointCloudPanel.tsx
#	apps/viewer/src/components/viewer/usePointCloudSync.ts
#	apps/viewer/src/store/slices/pointCloudSlice.ts
#	packages/renderer/src/pointcloud/point-cloud-renderer.ts
#	packages/renderer/src/pointcloud/point-pipeline.ts
#	packages/renderer/src/pointcloud/point-shader.wgsl.ts

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 13

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
apps/viewer/src/hooks/useIfcLoader.ts (1)

1593-1629: ⚠️ Potential issue | 🟠 Major | ⚡ Quick win

Don't let an older stream clear a newer stream's Cancel action.

activeStreamCanceller is cleared unconditionally on every exit path here. If the user starts a second point-cloud load before the first settles, the first load can wipe the second load's canceller and leave the current stream un-cancellable.

♻️ Suggested ownership-safe cleanup
-        const setCanceller = useViewerStore.getState().setActiveStreamCanceller;
-        setCanceller(() => ingest.streamHandle.cancel());
+        const { setActiveStreamCanceller } = useViewerStore.getState();
+        const cancelStream = () => ingest.streamHandle.cancel();
+        setActiveStreamCanceller(cancelStream);
         try {
           await ingest.done;
         } catch (err) {
           ...
-            setCanceller(null);
             return;
           }
           ...
-          setCanceller(null);
           setLoading(false);
           return;
+        } finally {
+          if (useViewerStore.getState().activeStreamCanceller === cancelStream) {
+            setActiveStreamCanceller(null);
+          }
         }
-        setCanceller(null);
🧹 Nitpick comments (5)
packages/pointcloud/src/formats/e57.test.ts (1)

168-284: ⚡ Quick win

Add one malformed-packet regression next to these happy-path decoder cases.

The PR also hardens decodeE57Scan() against invalid length tables and payload overruns, but this file only locks in successful decode paths. A tiny DataPacket that runs past payloadEnd would keep those new bounds checks from regressing silently.

apps/viewer/src/components/viewer/PointCloudClasses.tsx (1)

23-41: ⚡ Quick win

Extract the class palette into a shared module.

This table is duplicated here and in PointCloudLegend.tsx, and both files also need to stay aligned with the shader palette. One shared source of truth will prevent UI drift the next time a class color or label changes.

apps/viewer/src/components/viewer/DeviationPanel.tsx (1)

76-78: 💤 Low value

Hardcoded default check is fragile.

The condition halfRange === 0.05 to detect "user hasn't touched the slider" couples this component to the initial value defined elsewhere in the store. If the store default changes, this logic silently breaks.

Consider extracting the default to a shared constant or using a separate "touched" flag in the store.

Example: use a shared constant
// In a shared constants file or the store slice:
export const DEFAULT_DEVIATION_HALF_RANGE = 0.05;

// Then in this component:
if (halfRange === DEFAULT_DEVIATION_HALF_RANGE && result.suggestedHalfRange !== DEFAULT_DEVIATION_HALF_RANGE) {
  setHalfRange(result.suggestedHalfRange);
}
packages/renderer/src/deviation/deviation-shader.wgsl.ts (1)

209-217: 💤 Low value

Stack overflow silently drops BVH children.

When sp + 2 > STACK_SIZE, both children are silently dropped. While 64 levels is generous for balanced BVHs (supports billions of triangles), a degenerate or unbalanced tree could exceed this depth and produce incorrect nearest-neighbor results without any indication.

This is acceptable for an MVP given typical BVH depths, but consider adding a debug flag or logging mechanism for detecting stack exhaustion in future iterations if deviation results seem incorrect on pathological meshes.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/renderer/src/deviation/deviation-shader.wgsl.ts` around lines 209 -
217, The code silently drops BVH children when sp + 2u > STACK_SIZE; modify the
traversal in the block using sp, STACK_SIZE, and stack so that it never silently
discards both children: if only one slot remains push the nearer child (based on
lDist vs rDist), if zero slots remain set a diagnostic flag (e.g.,
stackOverflowDetected or stackOverflowCount) in your shader-visible output/UBO
and break out of traversal to avoid incorrect results; add the diagnostic flag
so host code can detect/expose stack exhaustion (optionally gated by a debug
compile flag).
packages/pointcloud/src/formats/e57-decode.ts (1)

358-374: Add optional bounds checking in readBitsLE for malformed input.

The concern is valid but limited to corrupt files: while upstream validation ensures bytestream offsets fit within the packet payload, it doesn't guarantee that metadata-declared byte counts match actual storage. If a field header claims more bytes than available, the capacity calculation (Math.floor((lengthBytes * 8) / bitsPerRecord)) could compute a take value that causes readBitsLE to read past bytes.length. Accessing undefined indices silently returns undefined, which converts to 0 in the bit arithmetic, corrupting the decoded value.

For well-formed files, the existing validation prevents this. A defensive guard—such as if (cur >= bytes.length) return 0; in the loop or a pre-flight check on startBit + bitsPerRecord * take—would safely handle malformed input without affecting normal operation.
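A minimal sketch of the suggested pre-flight guard. The function name, bit layout, and signature here are illustrative — the real readBitsLE in e57-decode.ts may differ:

```typescript
// Hypothetical bounds-guarded little-endian bit reader. If the requested
// record would extend past the end of `bytes`, return 0 instead of reading
// undefined indices (which would silently coerce to 0 mid-arithmetic anyway,
// but unpredictably).
function readBitsLEGuarded(bytes: Uint8Array, bitOffset: number, bitsPerRecord: number): number {
  // Pre-flight check: the whole record must fit inside the buffer.
  if (bitOffset + bitsPerRecord > bytes.length * 8) return 0;
  let value = 0;
  for (let i = 0; i < bitsPerRecord; i++) {
    const bit = bitOffset + i;
    const cur = bit >> 3;      // byte index
    const inByte = bit & 7;    // bit index within that byte (LSB-first)
    value |= ((bytes[cur] >> inByte) & 1) << i;
  }
  return value;
}
```

The pre-flight form is cheaper than a per-byte check inside the loop, since it runs once per record rather than once per bit.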

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/pointcloud/src/formats/e57-decode.ts` around lines 358 - 374, The
readBitsLE function can read past the provided bytes on malformed input; add
defensive bounds checking: before reading each byte in readBitsLE (using
parameters bytes, bitOffset, bitsPerRecord and locals cur/inByte), ensure cur <
bytes.length (or precompute that startBit + bitsPerRecord <= bytes.length*8) and
if the check fails return 0 (or a safe default) to avoid accessing undefined
bytes and corrupting decoded values; update readBitsLE to perform this guard
either as a pre-flight validation or inside the while loop.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.changeset/pointcloud-e57-scaled-integer.md:
- Around line 1-3: The changeset currently only bumps "@ifc-lite/pointcloud" but
the PR also modifies packages/renderer; add a corresponding changeset for
"@ifc-lite/renderer" (or a companion changeset) so the renderer package is
versioned and included in the release; run pnpm changeset and select or create
an entry for `@ifc-lite/renderer` (minor/patch as appropriate), commit the new
changeset file alongside the existing one, and ensure you do not manually edit
package versions or CHANGELOG.md.

In @.changeset/pointcloud-near-term-correctness.md:
- Around line 2-3: The changeset currently updates "@ifc-lite/pointcloud" and
"@ifc-lite/viewer" but missing "@ifc-lite/renderer", so add
"@ifc-lite/renderer": patch to the changeset file so the renderer package is
versioned and released alongside the API changes; ensure the entry covers the
changes to Renderer.computeDeviations, Renderer.pickRect and any exported
point-cloud symbols so the release metadata reflects the updated renderer API.

In `@apps/viewer/src/components/viewer/MainToolbar.tsx`:
- Around line 426-429: The file filter in MainToolbar.tsx (the supportedFiles
const) inconsistently lowercases only some branches; refactor by extracting a
single normalized allow-list (e.g., a SUPPORTED_EXTENSIONS Set or an
isSupportedFile(name: string) helper) that contains all extensions in lowercase,
lowercase the file.name once (const name = file.name.toLowerCase()), and test
membership using name.endsWith(ext) or checking ext against the set; replace the
current inline predicate in supportedFiles with that helper and reuse it in the
other similar checks in this component.

In `@apps/viewer/src/components/viewer/tools/SectionPanel.tsx`:
- Around line 46-54: The temporary preview stride may remain at 4 if a drag is
cancelled or the panel unmounts; update the slider drag logic to reset stride on
all cancellation/teardown paths: add event handlers for pointercancel and
blur/lost focus (e.g. wire onPointerCancel and onBlur to call
setPreviewStride(1) alongside handleSliderDragEnd), and add a useEffect cleanup
that calls setPreviewStride(1) on unmount or when the panel visibility changes;
reference the existing handleSliderDragStart, handleSliderDragEnd and
setPreviewStride functions and ensure any code path that exits a drag
(pointercancel, blur, component unmount) calls setPreviewStride(1).

In `@apps/viewer/src/components/viewer/useMouseControls.ts`:
- Around line 507-509: renderer.pickRect(...) currently returns a promise that
can reject (e.g. WebGPU device loss) but is left unhandled; wrap the pickRect
call in proper error handling (either await inside an async handler with
try/catch or append .catch) so rejections are consumed, log the error via the
existing logger/context and avoid leaving selection in an indeterminate state
(for example, only call
useViewerStore.getState().setSelectedEntityIds(Array.from(ids)) on success; on
failure either leave selection unchanged or explicitly clear it depending on
desired UX). Ensure you modify the call site referencing renderer.pickRect,
getPickOptions, and useViewerStore.getState().setSelectedEntityIds so no
unhandled promise rejection occurs.

In `@apps/viewer/src/components/viewer/ViewportContainer.tsx`:
- Around line 313-315: The extension checks in ViewportContainer.tsx are
inconsistent and duplicated: some checks use case-sensitive literals (e.g.,
'.ifc', '.ifcx', '.glb') while others call toLowerCase(), and the same list
appears in multiple places. Fix by introducing a single shared lowercase
allowlist array (e.g., ALLOWED_EXTENSIONS = ['.ifc','.ifcx','.glb','.las',...])
at the top of the component/module and replace the inline predicates (the
anonymous filter lambda that does f.name.endsWith(...) ) with a normalized check
using file.name.toLowerCase().endsWith(...) or fileNameLower.endsWith for
membership against ALLOWED_EXTENSIONS; update all occurrences pointed out (the
three duplicated locations) and remove duplicate entries from the list.

In `@apps/viewer/src/hooks/ingest/pointCloudIngest.ts`:
- Around line 207-221: The PCD sniffing is too narrow—currently it only detects
a "# .PCD" pattern at byte 0; update the probe in pointCloudIngest (the code
using view/getUint8 and b0,b1,b2) to scan the sniff window for ASCII sequences
indicating PCD (for example ".PCD", "PCD", or "VERSION") anywhere within the
first N bytes instead of only at offset 0, and only fall back to extension
checks using fileName when the buffer is too short or no PCD signatures are
found.
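A hedged sketch of the broader probe: decode the sniff window to ASCII and search for PCD header keywords anywhere in it, not only at offset 0. The keyword list follows the PCD header grammar (`# .PCD`, `VERSION`, `FIELDS`); the 256-byte window is an assumption, and a bare `VERSION` match can false-positive on other text formats, so in practice you may want to require two of the three markers:

```typescript
function sniffPcd(buf: Uint8Array, window = 256): boolean {
  const n = Math.min(buf.length, window);
  let text = '';
  for (let i = 0; i < n; i++) text += String.fromCharCode(buf[i]);
  // Match any PCD header marker within the sniff window.
  return text.includes('.PCD') || text.includes('VERSION') || text.includes('FIELDS');
}
```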

In `@apps/viewer/src/hooks/useIfcFederation.ts`:
- Around line 485-497: Create a dedicated canceller function variable (e.g.,
const myCanceller = () => ingest.streamHandle.cancel()) and pass that to
useViewerStore.getState().setActiveStreamCanceller instead of an anonymous
closure; in the finally block, read the current store value
(useViewerStore.getState().activeStreamCanceller) and only call
setActiveStreamCanceller(null) if the stored function is referentially equal to
myCanceller so you don't clear a newer stream's canceller; keep using
ingest.done and do not duplicate asset cleanup.
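The ownership check can be sketched as a small helper (store shape assumed; the real code goes through useViewerStore):

```typescript
type Canceller = () => void;

interface StreamStore {
  activeStreamCanceller: Canceller | null;
}

// Registers a canceller and returns a cleanup that only clears the slot if
// we still own it — so an older stream settling never wipes the Cancel
// button of a newer stream that started in the meantime.
function registerCanceller(store: StreamStore, cancel: Canceller): () => void {
  const myCanceller: Canceller = () => cancel();
  store.activeStreamCanceller = myCanceller;
  return () => {
    if (store.activeStreamCanceller === myCanceller) {
      store.activeStreamCanceller = null;
    }
  };
}
```

The referential-equality check is the whole point: each registration creates a distinct function object, so `===` is a reliable ownership token.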

In `@packages/pointcloud/src/formats/e57-xml.ts`:
- Around line 82-123: When building the Data3DEntry in the loop, validate
numeric XML metadata before pushing into entries: ensure Number(recordCountAttr)
yields a finite non-negative integer (>0 if appropriate) and
Number(fileOffsetAttr) yields a finite non-negative number; if either check
fails (NaN, Infinity, negative, or recordCount non-integer/zero), skip this
scan/points entry (continue) instead of pushing; do these checks where you
currently read fileOffsetAttr and recordCountAttr (refer to variables
fileOffsetAttr, recordCountAttr and the entries.push call) and keep existing
parsing of prototype fields and parsePoseElement as-is.
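The numeric guards can be sketched as two predicates (a sketch of the suggested checks, not the actual e57-xml.ts code): only finite, non-negative offsets and positive-integer record counts survive into a Data3DEntry.

```typescript
function isValidFileOffset(raw: string | null): boolean {
  if (raw === null) return false;
  const n = Number(raw);
  return Number.isFinite(n) && n >= 0; // rejects NaN, Infinity, negatives
}

function isValidRecordCount(raw: string | null): boolean {
  if (raw === null) return false;
  const n = Number(raw);
  return Number.isInteger(n) && n > 0; // rejects NaN, Infinity, 0, fractions
}
```

In the scan loop, a failed check maps to `continue` rather than `entries.push(...)`, so one poisoned scan header can't abort the other scans in the file.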

In `@packages/renderer/src/index.ts`:
- Around line 461-468: The current fingerprint
`${meshes.length}:${totalPositionsLength}` can collide and cause a stale BVH;
update the logic in collectAllSceneMeshes / computeDeviations where
deviationBvhFingerprint is compared so the fingerprint incorporates per-mesh
stable identifiers or content hashes (e.g., mesh.uuid or mesh.id plus a checksum
of positions / vertex buffer, or mesh bounding box + vertexCount) before
deciding to reuse the BVH built by buildTriangleBVH and uploaded via
deviationPipeline.uploadBvh; compute a deterministic combined fingerprint from
each mesh (referencing mesh.positions, mesh.uuid/id, and/or mesh.bounds) and set
deviationBvhFingerprint to that value so rebuilds occur whenever any mesh’s
geometry changes.
- Around line 240-243: The new DeviationPipeline instance (created in init via
new DeviationPipeline(this.device.getDevice()) and stored on
this.deviationPipeline) is never disposed, causing GPU buffer/cache leaks;
update the renderer teardown path (destroy method) to check for
this.deviationPipeline, call its disposal/cleanup method (e.g., dispose(),
destroy(), or close() on DeviationPipeline depending on the API) and null out
this.deviationPipeline so GPU BVH buffers and cache are released when the viewer
is torn down. Ensure the call happens before other device teardown so the
pipeline can access the device during its cleanup.
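The fingerprint idea from the first bullet can be sketched as a pure function. Field names like `expressId`, `modelIndex`, `positions`, and `indices` come from the comment above; the real mesh type in the renderer may differ:

```typescript
interface MeshLike {
  expressId: number;
  modelIndex: number;
  positions: Float32Array;
  indices?: Uint32Array;
}

// Per-mesh identity + buffer lengths, joined deterministically. Two scenes
// that merely share the same aggregate positions length now produce
// different fingerprints, so a stale BVH can no longer be reused.
function bvhFingerprint(meshes: MeshLike[]): string {
  return meshes
    .map((m) => `${m.modelIndex}/${m.expressId}:${m.positions.length}:${m.indices?.length ?? 0}`)
    .join('|');
}
```

A content hash of the vertex data would be stricter still (it would also catch in-place edits), at the cost of hashing every buffer on each compare.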

In `@packages/renderer/src/picker.ts`:
- Around line 212-236: The pick() implementation can call copyTextureToBuffer()
with invalid origin coordinates because Math.floor(x)/Math.floor(y) may be -1 or
equal to width/height on border clicks; clamp the texel origin before using it
(e.g., compute ix = clamp(Math.floor(x), 0, texWidth-1) and iy =
clamp(Math.floor(y), 0, texHeight-1)) and use ix/iy for both the colorTexture
and depthTexture copyTextureToBuffer() calls; ensure you reference the same
clamped values when building the origin objects in pick() (matching the guard
already present in pickRect()) so no out-of-range copy is attempted.
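The clamp itself is one line; a sketch of the suggested helper:

```typescript
// Math.floor of a border-click coordinate can land at -1 or at width/height,
// both of which copyTextureToBuffer rejects. Clamp into [0, size - 1].
function clampTexel(v: number, size: number): number {
  return Math.min(Math.max(Math.floor(v), 0), size - 1);
}
```

pick() would then compute `ix = clampTexel(x, texWidth)` and `iy = clampTexel(y, texHeight)` once and use them in both the colorTexture and depthTexture copy origins.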

In `@packages/renderer/src/scene.ts`:
- Around line 343-346: forEachMeshData currently iterates
this.meshDataMap.values() and can call visit on the same MeshData object
multiple times when that MeshData appears under multiple keys; change
forEachMeshData to deduplicate by tracking seen MeshData instances (e.g., use a
Set to store visited MeshData references) and only call visit(piece) the first
time an instance is encountered (check the Set before visiting and add after
visiting) so each MeshData is processed exactly once.
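The dedup pattern, sketched generically (the real forEachMeshData iterates this.meshDataMap with its own MeshData type):

```typescript
// A merged batch can live under several keys of the same map. Track visited
// instances by reference so each value is visited exactly once.
function forEachUnique<T extends object>(map: Map<number, T>, visit: (item: T) => void): void {
  const seen = new Set<T>();
  for (const item of map.values()) {
    if (seen.has(item)) continue;
    seen.add(item);
    visit(item);
  }
}
```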

---

Outside diff comments:
In `@apps/viewer/src/hooks/useIfcLoader.ts`:
- Around line 1593-1629: The cancel handler is being cleared unconditionally and
can wipe a newer session's canceller; change the logic to only clear the
activeStreamCanceller if the session that set it still owns it: when you call
setActiveStreamCanceller (via setCanceller(() => ingest.streamHandle.cancel())),
capture the currentSession id (or create a canceller wrapper that checks
loadSessionRef.current) and when clearing (all setCanceller(null) calls) only
call setCanceller(null) if loadSessionRef.current === currentSession; likewise
in the early-abort branch where you call
renderer.removePointCloudAsset(ingest.rendererHandle) keep the same guard so you
don't clear a newer session's canceller. Ensure the same ownership check is used
in both the success path (after await ingest.done) and all error/return paths so
a later load retains its cancel action.

---

Nitpick comments:
In `@apps/viewer/src/components/viewer/DeviationPanel.tsx`:
- Around line 76-78: The check that uses the hardcoded 0.05 to detect an
untouched slider is fragile; replace the literal with a shared constant (e.g.,
DEFAULT_DEVIATION_HALF_RANGE) exported from the store or constants module, or
alternatively add a "touched" flag in the store and use that to decide whether
to call setHalfRange; update the condition in DeviationPanel (the halfRange
check and the setHalfRange call that references result.suggestedHalfRange) to
compare against DEFAULT_DEVIATION_HALF_RANGE or to check the touched flag
instead so the logic no longer depends on a magic number.

In `@apps/viewer/src/components/viewer/PointCloudClasses.tsx`:
- Around line 23-41: Extract the CLASSES constant and its ClassEntry type into a
new shared module (e.g., export const CLASSES and export type ClassEntry) and
replace the local CLASSES definitions in PointCloudClasses.tsx and
PointCloudLegend.tsx by importing the shared symbols; update any references to
CLASSES, ClassEntry, and usages in those files to use the imported names. Also
ensure the shader palette consumer imports the same shared CLASSES (or a derived
rgb array) so the UI and shader use the single source of truth; run type checks
to adjust any import paths/exports accordingly.

In `@packages/pointcloud/src/formats/e57-decode.ts`:
- Around line 358-374: The readBitsLE function can read past the provided bytes
on malformed input; add defensive bounds checking: before reading each byte in
readBitsLE (using parameters bytes, bitOffset, bitsPerRecord and locals
cur/inByte), ensure cur < bytes.length (or precompute that startBit +
bitsPerRecord <= bytes.length*8) and if the check fails return 0 (or a safe
default) to avoid accessing undefined bytes and corrupting decoded values;
update readBitsLE to perform this guard either as a pre-flight validation or
inside the while loop.

In `@packages/pointcloud/src/formats/e57.test.ts`:
- Around line 168-284: Add a negative test next to the existing ScaledInteger
cases that constructs a deliberately malformed DataPacket which declares
bytestream lengths that run past the packet payload end and asserts
decodeE57Scan detects the overrun (e.g., throws or returns an error). Build a
small ArrayBuffer/DataView like the other tests but set packetLogicalLength
(header at indices written by view.setUint16(2, ...)) too small for the declared
bytestream lengths (or set one bytestream length larger than remaining bytes),
reuse a Data3DEntry prototype similar to the other tests, call decodeE57Scan
with that buffer, and expect it to throw (or otherwise signal the invalid-packet
condition) to lock in the bounds-checking behavior for DataPacket handling in
decodeE57Scan.

In `@packages/renderer/src/deviation/deviation-shader.wgsl.ts`:
- Around line 209-217: The code silently drops BVH children when sp + 2u >
STACK_SIZE; modify the traversal in the block using sp, STACK_SIZE, and stack so
that it never silently discards both children: if only one slot remains push the
nearer child (based on lDist vs rDist), if zero slots remain set a diagnostic
flag (e.g., stackOverflowDetected or stackOverflowCount) in your shader-visible
output/UBO and break out of traversal to avoid incorrect results; add the
diagnostic flag so host code can detect/expose stack exhaustion (optionally
gated by a debug compile flag).

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: f4b3cc72-80f9-438e-8522-bc49a62e34b7

📥 Commits

Reviewing files that changed from the base of the PR and between f87c40d and 4fe8f56.

📒 Files selected for processing (55)
  • .changeset/pointcloud-class-isolation.md
  • .changeset/pointcloud-deviation-heatmap.md
  • .changeset/pointcloud-e57-multi-scan-pose.md
  • .changeset/pointcloud-e57-scaled-integer.md
  • .changeset/pointcloud-near-term-correctness.md
  • .changeset/pointcloud-near-term-ux.md
  • .changeset/pointcloud-pts-xyz.md
  • .changeset/pointcloud-rect-pick.md
  • .changeset/pointcloud-section-preview.md
  • apps/viewer/src/components/viewer/DeviationPanel.tsx
  • apps/viewer/src/components/viewer/HoverTooltip.tsx
  • apps/viewer/src/components/viewer/MainToolbar.tsx
  • apps/viewer/src/components/viewer/PointCloudClasses.tsx
  • apps/viewer/src/components/viewer/PointCloudLegend.tsx
  • apps/viewer/src/components/viewer/PointCloudPanel.tsx
  • apps/viewer/src/components/viewer/RectSelectionOverlay.tsx
  • apps/viewer/src/components/viewer/StatusBar.tsx
  • apps/viewer/src/components/viewer/Viewport.tsx
  • apps/viewer/src/components/viewer/ViewportContainer.tsx
  • apps/viewer/src/components/viewer/ViewportOverlays.tsx
  • apps/viewer/src/components/viewer/tools/SectionPanel.tsx
  • apps/viewer/src/components/viewer/useMouseControls.ts
  • apps/viewer/src/components/viewer/usePointCloudSync.ts
  • apps/viewer/src/hooks/ingest/pointCloudIngest.ts
  • apps/viewer/src/hooks/useIfcFederation.ts
  • apps/viewer/src/hooks/useIfcLoader.ts
  • apps/viewer/src/store/slices/loadingSlice.ts
  • apps/viewer/src/store/slices/pointCloudSlice.ts
  • apps/viewer/src/store/types.ts
  • packages/pointcloud/src/formats/ascii-points.test.ts
  • packages/pointcloud/src/formats/ascii-points.ts
  • packages/pointcloud/src/formats/e57-decode.ts
  • packages/pointcloud/src/formats/e57-page.ts
  • packages/pointcloud/src/formats/e57-xml.ts
  • packages/pointcloud/src/formats/e57.test.ts
  • packages/pointcloud/src/formats/e57.ts
  • packages/pointcloud/src/formats/ifcx-points.ts
  • packages/pointcloud/src/index.ts
  • packages/pointcloud/src/streaming/ascii-points-source.ts
  • packages/pointcloud/src/streaming/decode-worker.ts
  • packages/pointcloud/src/streaming/protocol.ts
  • packages/pointcloud/src/streaming/worker-client.ts
  • packages/renderer/src/deviation/deviation-pipeline.ts
  • packages/renderer/src/deviation/deviation-shader.wgsl.ts
  • packages/renderer/src/deviation/triangle-bvh.ts
  • packages/renderer/src/index.ts
  • packages/renderer/src/picker.ts
  • packages/renderer/src/picking-manager.ts
  • packages/renderer/src/pointcloud/point-cloud-node.ts
  • packages/renderer/src/pointcloud/point-cloud-renderer.ts
  • packages/renderer/src/pointcloud/point-cloud-uniforms.ts
  • packages/renderer/src/pointcloud/point-pipeline.ts
  • packages/renderer/src/pointcloud/point-shader.wgsl.ts
  • packages/renderer/src/scene.ts
  • packages/renderer/src/types.ts

Comment on lines 426 to 429
const supportedFiles = Array.from(files).filter(
f => f.name.endsWith('.ifc') || f.name.endsWith('.ifcx') || f.name.endsWith('.glb')
|| f.name.toLowerCase().endsWith('.las') || f.name.toLowerCase().endsWith('.laz') || f.name.toLowerCase().endsWith('.ply') || f.name.toLowerCase().endsWith('.pcd') || f.name.toLowerCase().endsWith('.e57')
|| f.name.toLowerCase().endsWith('.las') || f.name.toLowerCase().endsWith('.laz') || f.name.toLowerCase().endsWith('.ply') || f.name.toLowerCase().endsWith('.pcd') || f.name.toLowerCase().endsWith('.e57') || f.name.toLowerCase().endsWith('.pts') || f.name.toLowerCase().endsWith('.xyz')
);

⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Normalize the supported-extension allow-list once.

The current filter lowercases only the point-cloud branches, so MODEL.IFC / SCAN.GLB still get rejected here while .las / .pts do not. Pulling the extensions into one lowercased helper/accept string fixes that bug and avoids the four copies drifting further in this already-large component.

Suggested direction
+const SUPPORTED_OPEN_EXTENSIONS = [
+  '.ifc', '.ifcx', '.glb',
+  '.las', '.laz', '.ply', '.pcd', '.e57', '.pts', '.xyz',
+] as const;
+
+function hasSupportedOpenExtension(name: string): boolean {
+  const lower = name.toLowerCase();
+  return SUPPORTED_OPEN_EXTENSIONS.some((ext) => lower.endsWith(ext));
+}
+
+const SUPPORTED_OPEN_ACCEPT = SUPPORTED_OPEN_EXTENSIONS.join(',');
+
-    const supportedFiles = Array.from(files).filter(
-      f => f.name.endsWith('.ifc') || f.name.endsWith('.ifcx') || f.name.endsWith('.glb')
-        || f.name.toLowerCase().endsWith('.las') || f.name.toLowerCase().endsWith('.laz') || f.name.toLowerCase().endsWith('.ply') || f.name.toLowerCase().endsWith('.pcd') || f.name.toLowerCase().endsWith('.e57') || f.name.toLowerCase().endsWith('.pts') || f.name.toLowerCase().endsWith('.xyz')
-    );
+    const supportedFiles = Array.from(files).filter((f) => hasSupportedOpenExtension(f.name));

As per coding guidelines: Limit non-generated modules to ~400 lines; split larger files and extract cohesive helpers into separate modules.

Also applies to: 467-470, 784-792

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/viewer/src/components/viewer/MainToolbar.tsx` around lines 426 - 429,
The file filter in MainToolbar.tsx (the supportedFiles const) inconsistently
lowercases only some branches; refactor by extracting a single normalized
allow-list (e.g., a SUPPORTED_EXTENSIONS Set or an isSupportedFile(name: string)
helper) that contains all extensions in lowercase, lowercase the file.name once
(const name = file.name.toLowerCase()), and test membership using
name.endsWith(ext) or checking ext against the set; replace the current inline
predicate in supportedFiles with that helper and reuse it in the other similar
checks in this component.

CodeRabbit review on the consolidated PR. Real-bug fixes:

  - Canceller ownership: useIfcLoader + useIfcFederation now only
    null the store-wide canceller when the stored function is still
    ours. A second load that started before the first settles
    keeps its own Cancel button.
  - Picker.pick origin clamping: Math.floor(x/y) could be -1 or
    width/height on border clicks, making copyTextureToBuffer
    reject the submit. Now clamps to [0, width-1] / [0, height-1]
    matching pickRect's existing guard.
  - Scene.forEachMeshData dedup: a colour-merged batch lives under
    every contributor's expressId. The deviation BVH was double-
    counting those triangles. Set<MeshData> guard.
  - DeviationPipeline disposal: destroy() now releases the BVH GPU
    buffers + clears the fingerprint cache.
  - SectionPanel preview-stride leak: pointercancel + onBlur
    handlers + unmount cleanup so the splat shader can't stay
    stuck at stride 4 when the slider loses focus mid-drag.
  - useMouseControls.pickRect rejection: WebGPU device-loss now
    logs and leaves selection unchanged instead of unhandled-
    rejecting from a pointer event.
  - e57-xml: reject non-finite / negative fileOffset + recordCount
    before pushing into Data3DEntry. Malformed XML metadata can no
    longer poison the binary decode path.
  - Renderer BVH fingerprint: includes per-mesh expressId /
    modelIndex / positions length / indices length so two distinct
    mesh sets that share the same aggregate position-length total
    can't alias each other and reuse a stale BVH.

Changeset metadata:
  - pointcloud-near-term-correctness: + @ifc-lite/renderer patch
  - pointcloud-e57-scaled-integer: + @ifc-lite/renderer patch
  Both PRs touched the renderer's published surface; the metadata
  needed to reflect that for the version bump.

Verified: full repo typecheck (24/24), 80 pointcloud unit tests,
655 viewer tests, viewer Vite build green.
Real-world Faro / Trimble exports (e.g. Union_Station.e57) failed
with "bytestream colorBlue (2600 bytes) runs past packet payload at
offset 46900" because the round-2 packet-bounds guard subtracted
4 bytes for a phantom packet-trailing CRC.

Per ASTM E2807-11 §6.3, CRCs in E57 live at the PAGE level (4 bytes
per 1024-byte physical page), not per DataPacket. `stripPageCrc`
already removes them before the packet walk runs. The packet's
bytestreams fill the full `packetLogicalLength`, so payloadEnd =
packetEnd. Subtracting an extra 4 bytes false-positives whenever a
real exporter packs the last bytestream tight against packetEnd.

Fix: payloadEnd = packetEnd. The header / length-table / per-stream
guards still catch genuinely malformed files; they were just being
overzealous on legal ones.

Regression test: a 3-point packet with bytestreams that fill the
packet exactly (no trailing slack). Would have failed before the
fix; passes now.

Verified: 81 pointcloud unit tests, full repo typecheck (24/24).
Classification mode renders everything as "unclassified" grey for
some files and we don't yet know whether the issue is upstream
(decoder didn't emit classes) or downstream (packing / shader).
Two cross-checkable diagnostic logs (capped at 3 chunks each) on
the boundaries:

  - ingestPointCloud.onChunk: per-chunk presence + 16-bin class
    histogram. Tells us whether classifications arrived at all and
    what their distribution is.

  - point-cloud-node.appendChunkToNode: reads back the packed
    class byte (vertex offset +15) for the first 8 points and logs
    them. Tells us whether the bytes that the splat shader will
    sample match what the host saw.

If the host log shows e.g. hist={0=10,2=4500,6=2000} but the vertex
log shows all 0, the packing is wrong. If both show all 0, the
decoder isn't emitting classes (look at the format's streaming
source). If both show a non-trivial spread, the shader read or
colour palette path is wrong.

Logs cap at 3 per page session to keep the console readable; they
auto-reset on HMR.
The diagnostic confirmed: E57 files don't emit classifications
because ASTM E2807 doesn't define one in the standard prototype.
Stock Faro / Leica / Trimble scans simply have no per-point class
IDs to decode — classification colour mode rendering everything as
class 0 ("Created, never classified") is correct behaviour for
those files.

Two improvements:

  - decodeE57Scan now looks for a non-standard `classification`
    field (Integer or ScaledInteger) in the prototype. CloudCompare
    + custom LIDAR pipelines occasionally include one; we honour
    it when present so the splat shader's classification mode
    works on those files. New readClassificationStream helper
    handles both encodings.
  - E57StreamingSource.hasClassification now reflects whether the
    chunk actually has classifications (was hardcoded false).
  - Debug log message now spells out the E57-vs-LAS distinction so
    users can see "this is correct, your file just doesn't carry
    class data" rather than wondering if it's a viewer bug.

Verified: 81 pointcloud unit tests pass, full repo typecheck (24/24).
laz-perf's emscripten shim called fetch("laz-perf.wasm") relative to
the worker's script directory, which under Vite hits a 404 and the
SPA index.html fallback served as text/plain — making the WebAssembly
streaming compile reject the Response and the ArrayBuffer fallback
abort the same way.

Resolve the wasm via Vite's ?url asset pipeline, pre-fetch the bytes,
and hand them to emscripten as Module.wasmBinary so the shim skips its
own (broken-by-default) resolution path entirely.
@coderabbitai coderabbitai Bot left a comment

Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (2)
packages/pointcloud/src/streaming/e57-source.ts (1)

59-88: ⚠️ Potential issue | 🟠 Major | ⚡ Quick win

Classifications are not preserved during stride downsampling.

The applyStride function handles positions, colors, and intensities, but omits classifications. When stride > 1, classification data is silently dropped, and hasClassification will incorrectly report false even when the source E57 contains classifications.

🐛 Proposed fix
 function applyStride(chunk: DecodedPointChunk, stride: number): DecodedPointChunk {
   const s = Math.max(1, stride | 0);
   if (s === 1) return chunk;
   const newCount = Math.ceil(chunk.pointCount / s);
   const positions = new Float32Array(newCount * 3);
   const colors = chunk.colors ? new Float32Array(newCount * 3) : undefined;
   const intensities = chunk.intensities ? new Uint16Array(newCount) : undefined;
+  const classifications = chunk.classifications ? new Uint8Array(newCount) : undefined;
   let dst = 0;
   for (let i = 0; i < chunk.pointCount; i += s) {
     positions[dst * 3] = chunk.positions[i * 3];
     positions[dst * 3 + 1] = chunk.positions[i * 3 + 1];
     positions[dst * 3 + 2] = chunk.positions[i * 3 + 2];
     if (colors && chunk.colors) {
       colors[dst * 3] = chunk.colors[i * 3];
       colors[dst * 3 + 1] = chunk.colors[i * 3 + 1];
       colors[dst * 3 + 2] = chunk.colors[i * 3 + 2];
     }
     if (intensities && chunk.intensities) {
       intensities[dst] = chunk.intensities[i];
     }
+    if (classifications && chunk.classifications) {
+      classifications[dst] = chunk.classifications[i];
+    }
     dst++;
   }
   return {
     positions,
     colors,
     intensities,
+    classifications,
     pointCount: newCount,
     bbox: chunk.bbox,
   };
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/pointcloud/src/streaming/e57-source.ts` around lines 59 - 88,
applyStride currently downsamples positions, colors, and intensities but omits
copying classifications, causing DecodedPointChunk.classifications to be lost
and hasClassification to become false; update the applyStride function to
allocate a new classifications array when chunk.classifications exists (matching
newCount length and appropriate type), fill it in the same loop using
chunk.classifications[i] aligned with the sampled indices, and return this
classifications array in the resulting DecodedPointChunk alongside positions,
colors, intensities, pointCount, and bbox so downstream callers see preserved
classification data.
apps/viewer/src/hooks/useIfcLoader.ts (1)

1612-1637: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Log this rejected ingest before returning.

This catch updates model/UI state and returns, but it never logs either the abort path or the real parsing failure. That makes point-cloud ingest failures hard to diagnose and misses the repo rule for catch blocks.

As per coding guidelines, "Every catch block must either log the error or re-throw; never use bare catch {} without a /* cleanup — safe to ignore */ comment for truly irrelevant failures".

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/viewer/src/hooks/useIfcLoader.ts` around lines 1612 - 1637, The catch
block in useIfcLoader around the point-cloud ingest never logs the rejection;
add logging before each early return so failures are recorded: when detecting a
stale session (loadSessionRef.current !== currentSession) call the repo logger
(or console.error) with context including ingest.rendererHandle and the error,
then removePointCloudAsset and clearOwnedCanceller as currently done; for the
abort branch (isAbort) log that the ingest was cancelled along with
primaryModelId and ingest.rendererHandle; for the real failure branch log the
full error/message plus format and primaryModelId before calling
updateModel/setError/clearOwnedCanceller/setLoading. Ensure you reference the
existing symbols (loadSessionRef, currentSession,
renderer.removePointCloudAsset, ingest.rendererHandle, isAbort, updateModel,
setError, setLoading, clearOwnedCanceller, format) and do not change the
existing state-updating flow.
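The three branches the prompt describes reduce to a small decision function. This is an illustrative sketch only — `classifyIngestFailure` and its parameters are placeholders, not symbols from `useIfcLoader.ts`; the real fix logs inline inside the existing catch block without changing its state updates.

```typescript
// Hedged sketch of the per-branch logging rule: every exit path from the
// catch records why the ingest was dropped. Names are placeholders.
function classifyIngestFailure(
  err: unknown,
  sessionMatches: boolean,
): 'stale' | 'aborted' | 'failed' {
  if (!sessionMatches) {
    // A newer load superseded this one; log and drop the result.
    console.error('pointcloud ingest: stale session, dropping result', err);
    return 'stale';
  }
  if (err instanceof Error && err.name === 'AbortError') {
    // User cancellation is expected; log at a lower severity.
    console.warn('pointcloud ingest: cancelled', err);
    return 'aborted';
  }
  // Genuine parse/ingest failure: log the full error before updating UI state.
  console.error('pointcloud ingest failed', err);
  return 'failed';
}
```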
♻️ Duplicate comments (1)
apps/viewer/src/hooks/ingest/pointCloudIngest.ts (1)

207-210: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Broaden the PCD probe beyond # .PCD at byte 0.

Line 210 still recognizes only one comment-style PCD header. Valid PCD files that start with VERSION or FIELDS will fall through to extension-based detection, so a renamed PCD can still be routed to the wrong decoder.

♻️ Suggested change
-    const b0 = view.getUint8(0), b1 = view.getUint8(1), b2 = view.getUint8(2);
-    if (b0 === 0x70 /* p */ && b1 === 0x6c /* l */ && b2 === 0x79 /* y */) return 'ply';
-    if (b0 === 0x23 /* # */ && view.byteLength > 4 && view.getUint8(2) === 0x2e /* . */) return 'pcd';
+    const b0 = view.getUint8(0), b1 = view.getUint8(1), b2 = view.getUint8(2);
+    if (b0 === 0x70 /* p */ && b1 === 0x6c /* l */ && b2 === 0x79 /* y */) return 'ply';
+    const header = new TextDecoder().decode(new Uint8Array(buffer, 0, Math.min(view.byteLength, 256))).toUpperCase();
+    if (header.includes('.PCD') || header.startsWith('VERSION ') || header.startsWith('FIELDS ')) return 'pcd';
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@apps/viewer/src/hooks/ingest/pointCloudIngest.ts` around lines 207 - 210, The
current ASCII probe only detects PCD when the file begins with "# .PCD" and
misses valid PCDs that start with keywords like "VERSION" or "FIELDS"; update
the detection in the ingest probe (the code using view.getUint8 and variables
b0,b1,b2) to also recognize ASCII starts for "VERSION" (bytes 'V','E','R') and
"FIELDS" (bytes 'F','I','E') — e.g., check b0/b1/b2 for 0x56/0x45/0x52 and
0x46/0x49/0x45 respectively (in addition to the existing '#' + '.' + 'PCD'
check), and consider skipping initial ASCII whitespace/newline when reading
b0/b1/b2 so files that begin with whitespace still detect as PCD.
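Combining the whitespace-skip and the three header shapes, a broadened probe could look like this. `detectPcdHeader` is a hypothetical helper for illustration, not the function in `pointCloudIngest.ts`; the bounded 64-byte decode is an assumption so a huge binary buffer is never decoded wholesale.

```typescript
// Sketch of a broader PCD probe: skip leading ASCII whitespace, then
// check a small bounded window for the three known header starts.
function detectPcdHeader(buffer: ArrayBufferLike): boolean {
  const bytes = new Uint8Array(buffer);
  // Skip spaces/tabs/newlines so "\nVERSION 0.7…" still matches.
  let p = 0;
  while (
    p < bytes.length &&
    (bytes[p] === 0x20 || bytes[p] === 0x09 || bytes[p] === 0x0a || bytes[p] === 0x0d)
  ) p++;
  // Decode only a small window, never the whole (possibly huge) file.
  const head = new TextDecoder()
    .decode(bytes.subarray(p, Math.min(p + 64, bytes.length)))
    .toUpperCase();
  return head.includes('.PCD') || head.startsWith('VERSION ') || head.startsWith('FIELDS ');
}
```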
🧹 Nitpick comments (2)
packages/pointcloud/src/formats/e57.test.ts (1)

269-282: 💤 Low value

Dead code from earlier test iteration.

Lines 269-282 create buf, view, and bytes variables that are immediately shadowed by fullBuf, fv, and fb at lines 288-291. The abandoned setup is never used. Consider removing to improve test clarity.

♻️ Suggested cleanup
   it('decodes ScaledInteger streams with bitsPerRecord that crosses byte boundaries', () => {
-    // bitsPerRecord = 12 (min=0, max=4095, span=4095). Two values
-    // pack into 3 bytes LSB-first:
-    //   value0 = 0xABC, value1 = 0xDEF
-    //   byte 0 = value0 & 0xFF                   = 0xBC
-    //   byte 1 = (value0 >> 8) | ((value1 & 0xF) << 4) = 0xA | 0xF0 = 0xFA
-    //   byte 2 = value1 >> 4                     = 0xDE
-    // With scale=1, offset=0, minimum=0 the decoded floats round-trip
-    // to the original raw_int, so we just check the cartesian values.
-    const packetLogicalLength = 4 + 2 + 2 + 3 + 4; // header + count + length + payload + CRC
-    const totalBytes = packetLogicalLength;
-    const buf = new ArrayBuffer(totalBytes);
-    const view = new DataView(buf);
-    const bytes = new Uint8Array(buf);
-    view.setUint8(0, 1);
-    view.setUint8(1, 0);
-    view.setUint16(2, packetLogicalLength - 1, true);
-    view.setUint16(4, 1, true);          // 1 bytestream (we'll use only X to keep it minimal)
-    view.setUint16(6, 3, true);          // bytestream length = 3
-    bytes[8] = 0xBC;
-    bytes[9] = 0xFA;
-    bytes[10] = 0xDE;
-
-    // recordCount=2; Y/Z would be required by decodeE57Scan, so use a
-    // 3-axis prototype and stuff Y/Z into the same bytestream by
-    // declaring 3 streams of 1 byte each. Easier: drop down to a
-    // 1-stream test that only has X — but decodeE57Scan demands all
-    // three. Build a complete 3-stream packet with the bit-packed X.
+    // bitsPerRecord = 12 for X (min=0, max=4095). Two 12-bit values
+    // pack into 3 bytes LSB-first: [0xABC, 0xDEF] → [0xBC, 0xFA, 0xDE]
+    // Y/Z use 4-bit values to keep the test compact.
     const fullLen = 4 + 2 + 2*3 + 3 + 1 + 1 + 4;
     const fullBuf = new ArrayBuffer(fullLen);
     const fv = new DataView(fullBuf);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/pointcloud/src/formats/e57.test.ts` around lines 269 - 282, Remove
the unused setup that creates buf, view, and bytes (variables
packetLogicalLength, buf, view, bytes) which are immediately shadowed by
fullBuf, fv, and fb; either delete the dead block (lines creating
packetLogicalLength, new ArrayBuffer, DataView, and the bytes assignments) or
reuse those same variables later (fullBuf, fv, fb) so there is no
shadowing—ensure the test still constructs the same byte content and lengths but
only uses one set of buffer/view/Uint8Array variables (referencing
packetLogicalLength, buf/view/bytes or fullBuf/fv/fb consistently).
packages/pointcloud/src/formats/e57-decode.ts (1)

441-465: 💤 Low value

Classification Integer handling differs from ScaledInteger when minimum ≠ 0.

For Integer (line 452): raw - min shifts the value.
For ScaledInteger (line 464): raw + minimum recovers the original.

If an Integer classification field has minimum: 5, class ID 7 would become 2 after raw - min. This is inconsistent with ScaledInteger behavior which correctly recovers the original value. In practice, classification fields typically have min=0 so this works, but the semantic inconsistency could cause issues with unusual producers.

Consider using just raw for Integer classification (since class IDs are absolute values, not range-normalized):

🔧 Suggested fix
     for (let i = 0; i < take; i++) {
       const off = start + i * stride;
       const raw = stride === 2
         ? (signed ? view.getInt16(off, true) : view.getUint16(off, true))
         : (signed ? view.getInt8(off) : view.getUint8(off));
-      classifications[written + i] = Math.max(0, Math.min(255, raw - min));
+      classifications[written + i] = Math.max(0, Math.min(255, raw));
     }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@packages/pointcloud/src/formats/e57-decode.ts` around lines 441 - 465, The
Integer branch in the classification decode uses raw - min which shifts class
IDs when field.minimum ≠ 0; change the Integer handling in the same block that
computes raw (inside the field.kind === 'Integer' branch) to treat class IDs as
absolute values (use raw directly, then clamp to 0..255) so its behavior matches
the ScaledInteger path (see symbols: field.kind, min, raw, classifications,
scaledIntegerBitsPerRecord, readBitsLE) — update the assignment to
classifications[written + i] accordingly and keep the existing clamping.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@apps/viewer/src/hooks/useIfcFederation.ts`:
- Around line 669-678: The AbortError handler in useIfcFederation.ts currently
calls setError, setProgress and setLoading unconditionally and can stomp a newer
load; modify the DOMException AbortError branch (inside the addModel() / load
routine) to first check the per-load ownership token/ID used elsewhere (e.g.,
currentLoadId, loadOwner or similar) and only perform setError(null),
setProgress({ phase: 'Cancelled', percent: 0 }) and setLoading(false) if the
token matches the one for the cancelled request; otherwise skip those state
updates so a later addModel() instance isn't overwritten. Ensure you reference
the same ownership variable used by the rest of the loader to keep behavior
consistent.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: cf24565e-c572-4e56-a9b2-3590d26e0231

📥 Commits

Reviewing files that changed from the base of the PR and between 4fe8f56 and 6013ac6.

📒 Files selected for processing (18)
  • .changeset/pointcloud-e57-scaled-integer.md
  • .changeset/pointcloud-laz-wasm-mime.md
  • .changeset/pointcloud-near-term-correctness.md
  • apps/viewer/src/components/viewer/tools/SectionPanel.tsx
  • apps/viewer/src/components/viewer/useMouseControls.ts
  • apps/viewer/src/hooks/ingest/pointCloudIngest.ts
  • apps/viewer/src/hooks/useIfcFederation.ts
  • apps/viewer/src/hooks/useIfcLoader.ts
  • packages/pointcloud/src/formats/e57-decode.ts
  • packages/pointcloud/src/formats/e57-xml.ts
  • packages/pointcloud/src/formats/e57.test.ts
  • packages/pointcloud/src/streaming/e57-source.ts
  • packages/pointcloud/src/streaming/laz-source.ts
  • packages/pointcloud/src/streaming/vendor.d.ts
  • packages/renderer/src/index.ts
  • packages/renderer/src/picker.ts
  • packages/renderer/src/pointcloud/point-cloud-node.ts
  • packages/renderer/src/scene.ts
✅ Files skipped from review due to trivial changes (4)
  • .changeset/pointcloud-laz-wasm-mime.md
  • packages/pointcloud/src/streaming/vendor.d.ts
  • .changeset/pointcloud-e57-scaled-integer.md
  • .changeset/pointcloud-near-term-correctness.md
🚧 Files skipped from review as they are similar to previous changes (4)
  • packages/renderer/src/pointcloud/point-cloud-node.ts
  • packages/renderer/src/picker.ts
  • packages/pointcloud/src/formats/e57-xml.ts
  • apps/viewer/src/components/viewer/useMouseControls.ts

Comment thread apps/viewer/src/hooks/useIfcFederation.ts
- E57 applyStride now preserves classifications across stride > 1.
- useIfcFederation guards addModel state writes with a loadSessionRef
  token so a cancelled load can't stomp a newer load's spinner.
- E57 Integer classification keeps raw class IDs (no min subtraction)
  to match ScaledInteger semantics — class IDs are absolute labels.
- PCD probe now also recognizes VERSION / FIELDS first-line headers,
  not just the # .PCD comment style.
- useIfcLoader logs the pointcloud ingest catch on each branch
  (stale session, abort, real failure) per repo catch-block rule.
- Test: drop shadowed buffer setup in the bit-packed ScaledInteger
  test so only the live fullBuf path remains.