fix(pointcloud): near-term batch — correctness + code-org#614
Conversation
Four items from issue #611's near-term list:
1. computeBBox empty + non-finite guards (e57 + ifcx-points). Empty or NaN-poisoned chunks no longer produce ±Infinity bounds (see the sketch after this list).
2. Magic-byte-first format detection. detectPointCloudFormat now probes the buffer before falling back to extension, so a LAS file named *.ply still loads correctly.
3. E57 packet-bounds + per-stream guards. Validate that the DataPacket header, length table, and each bytestream fit inside payloadEnd before reading.
4. Code-org splits:
   - e57.ts (631 lines) → e57-page.ts + e57-xml.ts + e57-decode.ts + e57.ts orchestrator (all <400 lines)
   - point-cloud-renderer.ts (420 lines) → extract uniform writer to point-cloud-uniforms.ts (renderer drops to 364 lines)
Verified: 62 pointcloud unit tests, full repo typecheck (24/24).
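A minimal sketch of the guard in item 1, referenced above: skip non-finite samples and signal an empty result instead of returning ±Infinity bounds. The function name and return shape are assumptions, not the shipped code.

```ts
// Hypothetical bbox helper: ignore NaN/Infinity samples, and return null when
// the chunk has no usable points so the caller can fall back to a default.
function computeBBox(positions: Float32Array): { min: [number, number, number]; max: [number, number, number] } | null {
  let minX = Infinity, minY = Infinity, minZ = Infinity;
  let maxX = -Infinity, maxY = -Infinity, maxZ = -Infinity;
  let valid = 0;
  for (let i = 0; i + 2 < positions.length; i += 3) {
    const x = positions[i], y = positions[i + 1], z = positions[i + 2];
    if (!Number.isFinite(x) || !Number.isFinite(y) || !Number.isFinite(z)) continue;
    if (x < minX) minX = x; if (x > maxX) maxX = x;
    if (y < minY) minY = y; if (y > maxY) maxY = y;
    if (z < minZ) minZ = z; if (z > maxZ) maxZ = z;
    valid++;
  }
  // Empty or fully NaN-poisoned chunk: no finite bounds exist.
  if (valid === 0) return null;
  return { min: [minX, minY, minZ], max: [maxX, maxY, maxZ] };
}
```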
Four UX items from issue #611:
1. Hover XYZ readback. Picker now copies the depth texel alongside the color texel, unprojects through inverse(viewProj), and adds PickResult.worldXYZ. HoverTooltip shows x/y/z under the entity id. Reverse-Z aware (depth=1 near, 0 far/miss). A sketch of the unprojection follows after this list.
2. Solid-color picker. Native <input type="color"> in the point cloud panel when colour mode is "fixed". Hex round-trips through the [r,g,b,a] store tuple.
3. Colour-mode legend. New PointCloudLegend component renders inline beneath the mode buttons:
   - Classification → ASPRS LAS 1.4 class id/color/label list
   - Intensity → black-to-white gradient bar
   - Height → cool-warm gradient bar
   Palettes mirror point-shader.wgsl.ts exactly.
4. Cancel button. New activeStreamCanceller on the loading slice; both ingest sites register/clear streamHandle.cancel(). StatusBar shows a Cancel button while non-null. AbortError → "Cancelled" instead of an error toast.
Verified: full repo typecheck (24/24), 655/655 tests pass, viewer Vite build green.
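A minimal sketch of the hover unprojection in item 1 above, assuming reverse-Z (depth 1 = near, 0 = far/miss) and a column-major inverse view-projection matrix; the helper name and signature are illustrative, not the actual picker code.

```ts
// Unproject a picked pixel + depth texel back to world space through inverse(viewProj).
function unprojectDepthToWorld(
  px: number, py: number, depth: number,
  width: number, height: number,
  invViewProj: Float32Array, // column-major 4x4, inverse(viewProj)
): [number, number, number] | null {
  if (depth <= 0) return null; // reverse-Z: 0 means far plane / nothing under the cursor
  // Pixel -> NDC (flip y); under reverse-Z the raw depth value is the NDC z.
  const ndc = [(px / width) * 2 - 1, 1 - (py / height) * 2, depth, 1];
  const out = [0, 0, 0, 0];
  for (let r = 0; r < 4; r++) {
    out[r] =
      invViewProj[r] * ndc[0] +
      invViewProj[4 + r] * ndc[1] +
      invViewProj[8 + r] * ndc[2] +
      invViewProj[12 + r] * ndc[3];
  }
  // Perspective divide back to world coordinates.
  return [out[0] / out[3], out[1] / out[3], out[2] / out[3]];
}
```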
Walkthrough
Adds broad point-cloud functionality: ASCII PTS/XYZ decoding and streaming, E57 multi-scan pose and ScaledInteger decoding, per-class LAS visibility mask, preview-density throttling, GPU BIM↔scan deviation compute with BVH, rectangle (marquee) GPU picking, hover world-space readback, streaming cancellation, renderer uniform/shader plumbing, and viewer UI/state to control and surface these features.
Changes
ASCII PTS/XYZ Format & Streaming
E57 Refactor + ScaledInteger + Multi-scan Poses
LAZ wasm fetch robustness
GPU Deviation Pipeline (BVH + Compute Shader)
Point-cloud Rendering Uniforms, Shader, and Buffers
Picking, Rectangle Selection & Hover XYZ
Viewer Store, UI Components & Sync
Sequence Diagram(s)
sequenceDiagram
participant User as User
participant Panel as DeviationPanel
participant Renderer as Renderer
participant GPU as GPU Device
participant Result as Store/Return
User->>Panel: Click "Compute Deviations"
Panel->>Renderer: computeDeviations({maxRange: 1.0})
Renderer->>Renderer: collect meshes via scene.forEachMeshData
Renderer->>Renderer: buildTriangleBVH(meshes)
Renderer->>GPU: upload BVH nodes & triangles
Renderer->>GPU: dispatch compute shader for point chunks
GPU->>GPU: compute signed distances, write per-chunk deviation buffers
GPU-->>Renderer: queue.onSubmittedWorkDone()
Renderer->>Result: return stats + suggestedHalfRange
Panel->>Panel: update store, set deviation computed, switch colorMode
sequenceDiagram
participant User as User
participant Mouse as useMouseControls
participant Viewport as Viewport
participant Renderer as Renderer.pickRect
participant Picker as Picker
participant GPU as GPU
participant Store as Selection Store
User->>Mouse: Ctrl+LMB drag to select
Mouse->>Viewport: setRectSelection(rect)
Viewport->>Viewport: RectSelectionOverlay renders
User->>Mouse: Release LMB
Mouse->>Renderer: pickRect(x0,y0,x1,y1)
Renderer->>Picker: pickRect(...)
Picker->>GPU: render pick pass for rect
GPU->>GPU: write pick id texture
Picker->>GPU: copyTextureToBuffer readback
GPU-->>Picker: pick buffer data
Picker->>Renderer: Set<expressId>
Renderer->>Store: setSelectedEntityIds(Set)
Mouse->>Viewport: setRectSelection(null)
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 11698d3678
try {
  await ingest.done;
} finally {
Treat cancelled federated streams as cancellation, not errors
Cancelling a point-cloud stream in the federated loader now calls ingest.streamHandle.cancel(), but this await ingest.done block only has finally cleanup and no AbortError handling, so cancellation bubbles to the outer catch and is surfaced via setError(err.message) as a load failure. This is inconsistent with the single-model loader path (which maps aborts to a benign cancelled state) and causes user-initiated cancel actions to look like real parse errors in federated workflows.
Codex flagged that useIfcFederation's outer catch was reporting a user-initiated stream cancel as a load failure (setError(message)), inconsistent with useIfcLoader.loadFile which maps AbortError to a benign cancelled state. Detect DOMException with name === "AbortError" before the general catch path, set progress to "Cancelled" and clear error.
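A hedged sketch of that fix's shape; the store action names are taken from the surrounding description, and the wrapper function is purely illustrative.

```ts
// Illustrative catch ordering: treat a user-initiated cancel as benign,
// everything else as a real load failure.
async function awaitIngest(
  done: Promise<void>,
  ui: {
    setError(msg: string | null): void;
    setProgress(p: { phase: string; percent: number }): void;
    setLoading(v: boolean): void;
  },
): Promise<void> {
  try {
    await done;
  } catch (err) {
    // A stream cancel surfaces as DOMException named "AbortError": not an error.
    if (err instanceof DOMException && err.name === 'AbortError') {
      ui.setError(null);
      ui.setProgress({ phase: 'Cancelled', percent: 0 }); // assumed progress shape
      ui.setLoading(false);
      return;
    }
    ui.setError(err instanceof Error ? err.message : String(err));
    ui.setLoading(false);
  }
}
```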
Both formats are line-oriented plain-text scans common in legacy
survey workflows. Shared decoder + streaming source handle both.
Auto-detected layouts by column count of the first data line:
3 → X Y Z
4 → X Y Z I
6 → X Y Z R G B
7 → X Y Z I R G B (canonical PTS)
9 → X Y Z R G B Nx Ny Nz (normals dropped)
10 → X Y Z I R G B Nx Ny Nz (PTS-with-normals)
XYZ with ≥3 unknown cols → positions only, rest skipped.
Comment lines (# or //) and blank lines skipped. Intensity / RGB
range auto-detected (0..1 vs 0..255) per file from observed max.
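A sketch of the column-count layout probe described above; the layout names and the exact fallback policy are illustrative, not the package's actual identifiers.

```ts
type AsciiLayout = 'xyz' | 'xyzi' | 'xyzrgb' | 'xyzirgb' | 'xyzrgbn' | 'xyzirgbn';

// Map the first data line's column count onto a layout (normals are parsed but dropped).
function probeLayout(firstDataLine: string): AsciiLayout | null {
  const cols = firstDataLine.trim().split(/\s+/).length;
  switch (cols) {
    case 3: return 'xyz';        // X Y Z
    case 4: return 'xyzi';       // X Y Z I
    case 6: return 'xyzrgb';     // X Y Z R G B
    case 7: return 'xyzirgb';    // X Y Z I R G B (canonical PTS)
    case 9: return 'xyzrgbn';    // X Y Z R G B Nx Ny Nz
    case 10: return 'xyzirgbn';  // X Y Z I R G B Nx Ny Nz
    default:
      // Unknown extra columns: keep positions only, skip the rest.
      return cols > 3 ? 'xyz' : null;
  }
}
```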
Wired into:
- packages/pointcloud: decoder, streaming source, decode-worker,
protocol, worker-client, index re-exports
- apps/viewer: format detection, useIfcLoader + useIfcFederation
pointcloud branches, file picker accept lists in MainToolbar
and ViewportContainer drop handlers
The "PTS / XYZ not yet supported" toast in describeUnsupportedFormat
is removed.
10 new unit tests cover layout probing, decoder round-trips for the
common shapes, and the comment / header-count edge cases.
Verified: 77 pointcloud tests pass, full repo typecheck (24/24),
viewer Vite build green.
…tensity / colour

ScaledInteger is the more compact E57 encoding most real-world Faro / Trimble / Leica exports use. Previously the decoder threw on these files; now it loads them directly.

Per ASTM E2807-11 §6.3.4:
  bitsPerRecord = ceil(log2(maximum - minimum + 1))
  the bytestream stores raw_int = original - minimum, packed LSB-first
  decoded float = (raw_int + minimum) * scale + offset

New readBitsLE() walks a byte buffer and reconstructs each value using Math.pow(2, n) so precision holds up to 53 bits. Wider fields throw clearly.

readCartesianStream + readIntensityStream now branch on kind; writeColorChannel gets a ScaledInteger remap. Per-axis packet capacity computation varies by field kind via floatOrSiPointCapacity.

The earlier multi-scan pose rejection stays — that's a separate piece of work.

2 new tests:
- 8-bit ScaledInteger round-trip across X/Y/Z
- 12-bit ScaledInteger crossing byte boundaries (proves the bit walk is correct for non-multiple-of-8 widths)

Verified: 63 pointcloud unit tests, full repo typecheck (24/24), viewer Vite build green.
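A sketch of the LSB-first bit walk and the §6.3.4 decode formula described above; the names and signatures are assumptions, not the shipped readBitsLE.

```ts
// Hypothetical LSB-first bit reader: accumulate with Math.pow(2, n) so values
// wider than 32 bits stay exact up to 53 bits.
function readBitsLE(bytes: Uint8Array, bitOffset: number, bitCount: number): number {
  let value = 0;
  for (let n = 0; n < bitCount; n++) {
    const bit = bitOffset + n;
    if (((bytes[bit >> 3] >> (bit & 7)) & 1) === 1) {
      value += Math.pow(2, n);
    }
  }
  return value;
}

// ScaledInteger raw value -> float, per ASTM E2807-11 §6.3.4.
function decodeScaledInteger(raw: number, minimum: number, scale: number, offset: number): number {
  return (raw + minimum) * scale + offset;
}
```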
Parse each Data3D's <pose> (unit quaternion + translation) and apply
it before merging. Registered multi-scan E57 files now load directly
instead of throwing the "re-export as merged" rejection.
Data3DEntry.hasPose: boolean → Data3DEntry.pose?: E57Pose
parsePoseElement walks <pose><rotation/><translation/></pose>;
non-finite values fall through to identity.
applyPoseInPlace derives R from the quaternion (Hamilton w+xi+yj+zk),
computes out = R · in + T per point.
decodeE57 applies pose post-decode + recomputes bbox.
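An illustrative version of the pose application just described (Hamilton quaternion w + xi + yj + zk, out = R · in + T); the real function's signature may differ.

```ts
// Rotate and translate an interleaved XYZ position buffer in place.
function applyPoseInPlace(
  positions: Float32Array,
  q: { w: number; x: number; y: number; z: number }, // unit quaternion
  t: { x: number; y: number; z: number },
): void {
  const { w, x, y, z } = q;
  // Rotation matrix derived from the unit quaternion.
  const r00 = 1 - 2 * (y * y + z * z), r01 = 2 * (x * y - w * z), r02 = 2 * (x * z + w * y);
  const r10 = 2 * (x * y + w * z), r11 = 1 - 2 * (x * x + z * z), r12 = 2 * (y * z - w * x);
  const r20 = 2 * (x * z - w * y), r21 = 2 * (y * z + w * x), r22 = 1 - 2 * (x * x + y * y);
  for (let i = 0; i + 2 < positions.length; i += 3) {
    const px = positions[i], py = positions[i + 1], pz = positions[i + 2];
    positions[i]     = r00 * px + r01 * py + r02 * pz + t.x; // out = R · in + T
    positions[i + 1] = r10 * px + r11 * py + r12 * pz + t.y;
    positions[i + 2] = r20 * px + r21 * py + r22 * pz + t.z;
  }
}
```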
3 new tests:
- <pose> XML extraction with 90°-around-Z quat + translation
- applyPoseInPlace 90°-around-Z + translation per-axis check
- identity pose round-trip
Verified: 64 pointcloud tests, full repo typecheck (24/24), viewer
Vite build green.
…ints
Ctrl/⌘ + LMB drag in the select tool draws a rectangle; release
selects every entity (mesh + point cloud) whose pixel lies inside.
A teal-dashed SVG outline tracks the drag.
Picker.pickRect(x0,y0,x1,y1,…) renders the same pick pass as
pick(), reads back the texel rect, dedupes hits to Set<expressId>.
Picker.renderPickPass extracts the shared render-pass setup.
PickingManager.pickRect applies hiddenIds + isolatedIds filtering.
Renderer.pickRect exposes it through the public facade.
RectSelectionOverlay renders the dashed SVG box.
useMouseControls grows mouseState.isRectSelecting; suppresses
orbit/pan during the drag; runs pickRect + setSelectedEntityIds
on mouseup. 4-pixel minimum rect avoids stray Ctrl-click hits.
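A sketch of the readback dedupe step described above, assuming the pick pass writes one u32 expressId per texel, 0 means no hit, and the copy pads bytesPerRow to a 256-byte multiple (a WebGPU requirement); the surrounding buffer plumbing is omitted.

```ts
// Collapse a rectangle of pick-id texels (read back via copyTextureToBuffer)
// into a unique id set, walking row by row because rows are padded.
function dedupePickIds(
  readback: ArrayBuffer,
  widthTexels: number,
  heightTexels: number,
  bytesPerRow: number, // padded to a multiple of 256 by the copy
): Set<number> {
  const ids = new Set<number>();
  for (let row = 0; row < heightTexels; row++) {
    const rowWords = new Uint32Array(readback, row * bytesPerRow, widthTexels);
    for (const id of rowWords) {
      if (id !== 0) ids.add(id); // 0 = background / no hit (assumption)
    }
  }
  return ids;
}
```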
CPU-raycast and dynamic-mesh-creation fallbacks that pick() uses
for huge batched models are skipped — rect only sees hydrated
meshes. Acceptable MVP; issue #611's lasso (polygonal) and
per-class isolation are follow-ups.
Verified: full repo typecheck (24/24), 655 viewer tests pass,
viewer Vite build green.
A new Classes section in the point cloud panel checkboxes every
LAS 1.4 class. Hidden classes get culled in the splat shader.
pointCloudClassMask: number (u32 bitmask, default 0xFFFFFFFF) on
the point cloud slice; togglePointCloudClass + setMask actions.
PointCloudRenderOptions.classMask plumbed through; stored in
uniform flags.w (was unused).
Splat shader: per vertex check (flags.w >> classId) & 1; hidden
classes get a degenerate clipPos so they're culled pre-raster.
New PointCloudClasses component renders a <details> collapsible
with "Show all" + per-class checkboxes. Swatch colours mirror
the shader's classification palette.
Class ids ≥32 always show — mask only covers ASPRS standard range.
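An illustrative version of the mask manipulation described above; the action lives in the store slice in reality, but the bit logic is the interesting part.

```ts
// Flip one ASPRS class bit in the u32 visibility mask. Class ids >= 32 sit
// outside the mask and always render, matching the description above.
function togglePointCloudClass(mask: number, classId: number): number {
  if (classId < 0 || classId > 31) return mask;  // outside the standard ASPRS range
  return (mask ^ (1 << classId)) >>> 0;          // >>> 0 keeps the value unsigned 32-bit
}

// Example: hide class 2 (Ground) starting from "show all":
// togglePointCloudClass(0xFFFFFFFF, 2) === 0xFFFFFFFB
```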
Verified: full repo typecheck (24/24), 655 viewer tests, viewer
Vite build green.
…der drag
Splat shader gains a previewStride uniform that culls every Nth
instance at the start of vs_main. Section-plane slider wires
pointer/key down → stride 4, up → stride 1, so dragging over huge
scans stays responsive.
POINT_UNIFORM_SIZE 208 → 224 (new extras: vec4<u32> slot;
extras.x = previewStride, yzw reserved).
PointCloudRenderOptions.previewStride? clamped to [1, 256].
Vertex shader culls hidden instances via clipPos = (0, 0, -2, 1).
pointCloudPreviewStride on the slice (default 1) +
setPointCloudPreviewStride action.
usePointCloudSync forwards the stride.
SectionOverlay slider sets stride 4 on drag start, 1 on release.
Gated on pointCloudAssetCount > 0 so IFC-only sessions are
unaffected.
Triangle meshes ignore the stride — they were already smooth.
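A hedged sketch of the host-side plumbing: clamp the stride to the documented range and write it into the uniform's extras slot. The byte offset 208 is an assumption inferred from the 208→224 growth above, not a verified layout.

```ts
// Clamp to the documented [1, 256] range; non-finite or missing values fall back to 1.
function clampPreviewStride(stride?: number): number {
  if (!Number.isFinite(stride)) return 1;
  return Math.min(256, Math.max(1, Math.floor(stride as number)));
}

// Write extras.x = previewStride into the tail of the 224-byte uniform block
// (assumes the new vec4<u32> extras slot starts where the old 208-byte block ended).
function writePreviewStride(uniformData: ArrayBuffer, stride: number): void {
  new Uint32Array(uniformData, 208, 4)[0] = clampPreviewStride(stride);
}
```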
Verified: typecheck (24/24), 655 viewer tests, viewer build green.
GPU compute pipeline that colours each scan point by signed distance
to the nearest mesh surface. Works with every IFC ingest path
(STEP / IFCx / GLB / federated) and every point cloud format —
anywhere Scene.forEachMeshData reaches and the splat pipeline can
already render.
Pipeline:
- Per-triangle BVH built from Scene.forEachMeshData. Median split,
max 16 tris/leaf, flattened to 32-byte nodes during build.
- Two GPU storage buffers (nodes + triangles), uploaded once per
mesh-set change, cached by (meshCount, totalPositions).
- Compute shader (workgroup-size 64): stack-based BVH descent
pruning by squared point-AABB distance, Ericson §5.1.5
closest-point-on-triangle, signed distance via face normal.
- Per-chunk deviation buffer (4 bytes/point, STORAGE+VERTEX) sits
alongside the splat vertex buffer. Compute reads positions
straight from the vertex buffer — no CPU copy for streamed.
- Splat shader: 2nd vertex buffer (location 4 = f32 deviation),
new 'deviation' color mode, diverging blue→white→red ramp.
POINT_UNIFORM_SIZE 208→224 for new deviationRange uniform slot.
- Renderer.computeDeviations({maxRange?, forceRebuild?}) →
{bvhTriangles, bvhNodes, chunksProcessed, pointsProcessed,
bounds, suggestedHalfRange}. Awaits onSubmittedWorkDone.
- DeviationPanel inside PointCloudPanel: compute button (gated
on triangleCount > 0), duration readout, mm-scale range slider,
inline gradient legend. Auto-suggests half-range from BVH bbox
and auto-switches colour mode on success.
Sign: positive = point on outward-normal side of nearest face.
Negative = inside/behind. Per-surface front/back is always defined
even on non-watertight BIM.
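A sketch of that sign convention (the real computation runs in the WGSL compute shader): the sign comes from the dot product of the point-to-closest-point vector with the face normal.

```ts
// Positive when the point lies on the outward-normal side of its nearest triangle,
// negative when inside/behind it. Vector math inlined for brevity.
function signedDistance(
  p: [number, number, number],
  closest: [number, number, number],       // closest point on the nearest triangle
  faceNormal: [number, number, number],    // assumed unit length
): number {
  const dx = p[0] - closest[0], dy = p[1] - closest[1], dz = p[2] - closest[2];
  const dist = Math.hypot(dx, dy, dz);
  const side = dx * faceNormal[0] + dy * faceNormal[1] + dz * faceNormal[2];
  return side >= 0 ? dist : -dist;
}
```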
Verified: full repo typecheck (24/24), 655 viewer tests, viewer
Vite build green.
# Conflicts: # apps/viewer/src/hooks/ingest/pointCloudIngest.ts
# Conflicts: # packages/pointcloud/src/formats/e57.ts
# Conflicts: # packages/pointcloud/src/formats/e57.ts
# Conflicts: # apps/viewer/src/components/viewer/useMouseControls.ts # packages/renderer/src/picker.ts
# Conflicts: # apps/viewer/src/components/viewer/PointCloudPanel.tsx # packages/renderer/src/pointcloud/point-cloud-renderer.ts
# Conflicts: # apps/viewer/src/components/viewer/usePointCloudSync.ts # apps/viewer/src/store/slices/pointCloudSlice.ts # packages/renderer/src/pointcloud/point-cloud-renderer.ts
# Conflicts: # apps/viewer/src/components/viewer/PointCloudPanel.tsx # apps/viewer/src/components/viewer/usePointCloudSync.ts # apps/viewer/src/store/slices/pointCloudSlice.ts # packages/renderer/src/pointcloud/point-cloud-renderer.ts # packages/renderer/src/pointcloud/point-pipeline.ts # packages/renderer/src/pointcloud/point-shader.wgsl.ts
There was a problem hiding this comment.
Actionable comments posted: 13
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
apps/viewer/src/hooks/useIfcLoader.ts (1)
1593-1629: ⚠️ Potential issue | 🟠 Major | ⚡ Quick win
Don't let an older stream clear a newer stream's Cancel action.
activeStreamCanceller is cleared unconditionally on every exit path here. If the user starts a second point-cloud load before the first settles, the first load can wipe the second load's canceller and leave the current stream un-cancellable.
♻️ Suggested ownership-safe cleanup
- const setCanceller = useViewerStore.getState().setActiveStreamCanceller;
- setCanceller(() => ingest.streamHandle.cancel());
+ const { setActiveStreamCanceller } = useViewerStore.getState();
+ const cancelStream = () => ingest.streamHandle.cancel();
+ setActiveStreamCanceller(cancelStream);
  try {
    await ingest.done;
  } catch (err) {
    ...
-   setCanceller(null);
    return;
  }
  ...
- setCanceller(null);
  setLoading(false);
  return;
+ } finally {
+   if (useViewerStore.getState().activeStreamCanceller === cancelStream) {
+     setActiveStreamCanceller(null);
+   }
+ }
- setCanceller(null);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/viewer/src/hooks/useIfcLoader.ts` around lines 1593 - 1629, The cancel handler is being cleared unconditionally and can wipe a newer session's canceller; change the logic to only clear the activeStreamCanceller if the session that set it still owns it: when you call setActiveStreamCanceller (via setCanceller(() => ingest.streamHandle.cancel())), capture the currentSession id (or create a canceller wrapper that checks loadSessionRef.current) and when clearing (all setCanceller(null) calls) only call setCanceller(null) if loadSessionRef.current === currentSession; likewise in the early-abort branch where you call renderer.removePointCloudAsset(ingest.rendererHandle) keep the same guard so you don't clear a newer session's canceller. Ensure the same ownership check is used in both the success path (after await ingest.done) and all error/return paths so a later load retains its cancel action.
🧹 Nitpick comments (5)
packages/pointcloud/src/formats/e57.test.ts (1)
168-284: ⚡ Quick win
Add one malformed-packet regression next to these happy-path decoder cases.
The PR also hardens decodeE57Scan() against invalid length tables and payload overruns, but this file only locks in successful decode paths. A tiny DataPacket that runs past payloadEnd would keep those new bounds checks from regressing silently.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/pointcloud/src/formats/e57.test.ts` around lines 168 - 284, Add a negative test next to the existing ScaledInteger cases that constructs a deliberately malformed DataPacket which declares bytestream lengths that run past the packet payload end and asserts decodeE57Scan detects the overrun (e.g., throws or returns an error). Build a small ArrayBuffer/DataView like the other tests but set packetLogicalLength (header at indices written by view.setUint16(2, ...)) too small for the declared bytestream lengths (or set one bytestream length larger than remaining bytes), reuse a Data3DEntry prototype similar to the other tests, call decodeE57Scan with that buffer, and expect it to throw (or otherwise signal the invalid-packet condition) to lock in the bounds-checking behavior for DataPacket handling in decodeE57Scan.
apps/viewer/src/components/viewer/PointCloudClasses.tsx (1)
23-41: ⚡ Quick win
Extract the class palette into a shared module.
This table is duplicated here and in PointCloudLegend.tsx, and both files also need to stay aligned with the shader palette. One shared source of truth will prevent UI drift the next time a class color or label changes.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/viewer/src/components/viewer/PointCloudClasses.tsx` around lines 23 - 41, Extract the CLASSES constant and its ClassEntry type into a new shared module (e.g., export const CLASSES and export type ClassEntry) and replace the local CLASSES definitions in PointCloudClasses.tsx and PointCloudLegend.tsx by importing the shared symbols; update any references to CLASSES, ClassEntry, and usages in those files to use the imported names. Also ensure the shader palette consumer imports the same shared CLASSES (or a derived rgb array) so the UI and shader use the single source of truth; run type checks to adjust any import paths/exports accordingly.
apps/viewer/src/components/viewer/DeviationPanel.tsx (1)
76-78: 💤 Low value
Hardcoded default check is fragile.
The condition halfRange === 0.05 to detect "user hasn't touched the slider" couples this component to the initial value defined elsewhere in the store. If the store default changes, this logic silently breaks.
Consider extracting the default to a shared constant or using a separate "touched" flag in the store.
Example: use a shared constant
// In a shared constants file or the store slice:
export const DEFAULT_DEVIATION_HALF_RANGE = 0.05;

// Then in this component:
if (halfRange === DEFAULT_DEVIATION_HALF_RANGE && result.suggestedHalfRange !== DEFAULT_DEVIATION_HALF_RANGE) {
  setHalfRange(result.suggestedHalfRange);
}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/viewer/src/components/viewer/DeviationPanel.tsx` around lines 76 - 78, The check that uses the hardcoded 0.05 to detect an untouched slider is fragile; replace the literal with a shared constant (e.g., DEFAULT_DEVIATION_HALF_RANGE) exported from the store or constants module, or alternatively add a "touched" flag in the store and use that to decide whether to call setHalfRange; update the condition in DeviationPanel (the halfRange check and the setHalfRange call that references result.suggestedHalfRange) to compare against DEFAULT_DEVIATION_HALF_RANGE or to check the touched flag instead so the logic no longer depends on a magic number.
packages/renderer/src/deviation/deviation-shader.wgsl.ts (1)
209-217: 💤 Low value
Stack overflow silently drops BVH children.
When sp + 2 > STACK_SIZE, both children are silently dropped. While 64 levels is generous for balanced BVHs (supports billions of triangles), a degenerate or unbalanced tree could exceed this depth and produce incorrect nearest-neighbor results without any indication.
This is acceptable for an MVP given typical BVH depths, but consider adding a debug flag or logging mechanism for detecting stack exhaustion in future iterations if deviation results seem incorrect on pathological meshes.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/renderer/src/deviation/deviation-shader.wgsl.ts` around lines 209 - 217, The code silently drops BVH children when sp + 2u > STACK_SIZE; modify the traversal in the block using sp, STACK_SIZE, and stack so that it never silently discards both children: if only one slot remains push the nearer child (based on lDist vs rDist), if zero slots remain set a diagnostic flag (e.g., stackOverflowDetected or stackOverflowCount) in your shader-visible output/UBO and break out of traversal to avoid incorrect results; add the diagnostic flag so host code can detect/expose stack exhaustion (optionally gated by a debug compile flag).
packages/pointcloud/src/formats/e57-decode.ts (1)
358-374: Add optional bounds checking in readBitsLE for malformed input.
The concern is valid but limited to corrupt files: while upstream validation ensures bytestream offsets fit within the packet payload, it doesn't guarantee that metadata-declared byte counts match actual storage. If a field header claims more bytes than available, the capacity calculation (Math.floor((lengthBytes * 8) / bitsPerRecord)) could compute a take value that causes readBitsLE to read past bytes.length. Accessing undefined indices silently returns undefined, which converts to 0 in the bit arithmetic, corrupting the decoded value.
For well-formed files, the existing validation prevents this. A defensive guard—such as if (cur >= bytes.length) return 0; in the loop or a pre-flight check on startBit + bitsPerRecord * take—would safely handle malformed input without affecting normal operation.
Verify each finding against the current code and only fix it if needed. In `@packages/pointcloud/src/formats/e57-decode.ts` around lines 358 - 374, The readBitsLE function can read past the provided bytes on malformed input; add defensive bounds checking: before reading each byte in readBitsLE (using parameters bytes, bitOffset, bitsPerRecord and locals cur/inByte), ensure cur < bytes.length (or precompute that startBit + bitsPerRecord <= bytes.length*8) and if the check fails return 0 (or a safe default) to avoid accessing undefined bytes and corrupting decoded values; update readBitsLE to perform this guard either as a pre-flight validation or inside the while loop.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.changeset/pointcloud-e57-scaled-integer.md:
- Around line 1-3: The changeset currently only bumps "@ifc-lite/pointcloud" but
the PR also modifies packages/renderer; add a corresponding changeset for
"@ifc-lite/renderer" (or a companion changeset) so the renderer package is
versioned and included in the release; run pnpm changeset and select or create
an entry for `@ifc-lite/renderer` (minor/patch as appropriate), commit the new
changeset file alongside the existing one, and ensure you do not manually edit
package versions or CHANGELOG.md.
In @.changeset/pointcloud-near-term-correctness.md:
- Around line 2-3: The changeset currently updates "@ifc-lite/pointcloud" and
"@ifc-lite/viewer" but missing "@ifc-lite/renderer", so add
"@ifc-lite/renderer": patch to the changeset file so the renderer package is
versioned and released alongside the API changes; ensure the entry covers the
changes to Renderer.computeDeviations, Renderer.pickRect and any exported
point-cloud symbols so the release metadata reflects the updated renderer API.
In `@apps/viewer/src/components/viewer/MainToolbar.tsx`:
- Around line 426-429: The file filter in MainToolbar.tsx (the supportedFiles
const) inconsistently lowercases only some branches; refactor by extracting a
single normalized allow-list (e.g., a SUPPORTED_EXTENSIONS Set or an
isSupportedFile(name: string) helper) that contains all extensions in lowercase,
lowercase the file.name once (const name = file.name.toLowerCase()), and test
membership using name.endsWith(ext) or checking ext against the set; replace the
current inline predicate in supportedFiles with that helper and reuse it in the
other similar checks in this component.
In `@apps/viewer/src/components/viewer/tools/SectionPanel.tsx`:
- Around line 46-54: The temporary preview stride may remain at 4 if a drag is
cancelled or the panel unmounts; update the slider drag logic to reset stride on
all cancellation/teardown paths: add event handlers for pointercancel and
blur/lost focus (e.g. wire onPointerCancel and onBlur to call
setPreviewStride(1) alongside handleSliderDragEnd), and add a useEffect cleanup
that calls setPreviewStride(1) on unmount or when the panel visibility changes;
reference the existing handleSliderDragStart, handleSliderDragEnd and
setPreviewStride functions and ensure any code path that exits a drag
(pointercancel, blur, component unmount) calls setPreviewStride(1).
In `@apps/viewer/src/components/viewer/useMouseControls.ts`:
- Around line 507-509: renderer.pickRect(...) currently returns a promise that
can reject (e.g. WebGPU device loss) but is left unhandled; wrap the pickRect
call in proper error handling (either await inside an async handler with
try/catch or append .catch) so rejections are consumed, log the error via the
existing logger/context and avoid leaving selection in an indeterminate state
(for example, only call
useViewerStore.getState().setSelectedEntityIds(Array.from(ids)) on success; on
failure either leave selection unchanged or explicitly clear it depending on
desired UX). Ensure you modify the call site referencing renderer.pickRect,
getPickOptions, and useViewerStore.getState().setSelectedEntityIds so no
unhandled promise rejection occurs.
In `@apps/viewer/src/components/viewer/ViewportContainer.tsx`:
- Around line 313-315: The extension checks in ViewportContainer.tsx are
inconsistent and duplicated: some checks use case-sensitive literals (e.g.,
'.ifc', '.ifcx', '.glb') while others call toLowerCase(), and the same list
appears in multiple places. Fix by introducing a single shared lowercase
allowlist array (e.g., ALLOWED_EXTENSIONS = ['.ifc','.ifcx','.glb','.las',...])
at the top of the component/module and replace the inline predicates (the
anonymous filter lambda that does f.name.endsWith(...) ) with a normalized check
using file.name.toLowerCase().endsWith(...) or fileNameLower.endsWith for
membership against ALLOWED_EXTENSIONS; update all occurrences pointed out (the
three duplicated locations) and remove duplicate entries from the list.
In `@apps/viewer/src/hooks/ingest/pointCloudIngest.ts`:
- Around line 207-221: The PCD sniffing is too narrow—currently it only detects
a "# .PCD" pattern at byte 0; update the probe in pointCloudIngest (the code
using view/getUint8 and b0,b1,b2) to scan the sniff window for ASCII sequences
indicating PCD (for example ".PCD", "PCD", or "VERSION") anywhere within the
first N bytes instead of only at offset 0, and only fall back to extension
checks using fileName when the buffer is too short or no PCD signatures are
found.
In `@apps/viewer/src/hooks/useIfcFederation.ts`:
- Around line 485-497: Create a dedicated canceller function variable (e.g.,
const myCanceller = () => ingest.streamHandle.cancel()) and pass that to
useViewerStore.getState().setActiveStreamCanceller instead of an anonymous
closure; in the finally block, read the current store value
(useViewerStore.getState().activeStreamCanceller) and only call
setActiveStreamCanceller(null) if the stored function is referentially equal to
myCanceller so you don't clear a newer stream's canceller; keep using
ingest.done and do not duplicate asset cleanup.
In `@packages/pointcloud/src/formats/e57-xml.ts`:
- Around line 82-123: When building the Data3DEntry in the loop, validate
numeric XML metadata before pushing into entries: ensure Number(recordCountAttr)
yields a finite non-negative integer (>0 if appropriate) and
Number(fileOffsetAttr) yields a finite non-negative number; if either check
fails (NaN, Infinity, negative, or recordCount non-integer/zero), skip this
scan/points entry (continue) instead of pushing; do these checks where you
currently read fileOffsetAttr and recordCountAttr (refer to variables
fileOffsetAttr, recordCountAttr and the entries.push call) and keep existing
parsing of prototype fields and parsePoseElement as-is.
In `@packages/renderer/src/index.ts`:
- Around line 461-468: The current fingerprint
`${meshes.length}:${totalPositionsLength}` can collide and cause a stale BVH;
update the logic in collectAllSceneMeshes / computeDeviations where
deviationBvhFingerprint is compared so the fingerprint incorporates per-mesh
stable identifiers or content hashes (e.g., mesh.uuid or mesh.id plus a checksum
of positions / vertex buffer, or mesh bounding box + vertexCount) before
deciding to reuse the BVH built by buildTriangleBVH and uploaded via
deviationPipeline.uploadBvh; compute a deterministic combined fingerprint from
each mesh (referencing mesh.positions, mesh.uuid/id, and/or mesh.bounds) and set
deviationBvhFingerprint to that value so rebuilds occur whenever any mesh’s
geometry changes.
- Around line 240-243: The new DeviationPipeline instance (created in init via
new DeviationPipeline(this.device.getDevice()) and stored on
this.deviationPipeline) is never disposed, causing GPU buffer/cache leaks;
update the renderer teardown path (destroy method) to check for
this.deviationPipeline, call its disposal/cleanup method (e.g., dispose(),
destroy(), or close() on DeviationPipeline depending on the API) and null out
this.deviationPipeline so GPU BVH buffers and cache are released when the viewer
is torn down. Ensure the call happens before other device teardown so the
pipeline can access the device during its cleanup.
In `@packages/renderer/src/picker.ts`:
- Around line 212-236: The pick() implementation can call copyTextureToBuffer()
with invalid origin coordinates because Math.floor(x)/Math.floor(y) may be -1 or
equal to width/height on border clicks; clamp the texel origin before using it
(e.g., compute ix = clamp(Math.floor(x), 0, texWidth-1) and iy =
clamp(Math.floor(y), 0, texHeight-1)) and use ix/iy for both the colorTexture
and depthTexture copyTextureToBuffer() calls; ensure you reference the same
clamped values when building the origin objects in pick() (matching the guard
already present in pickRect()) so no out-of-range copy is attempted.
In `@packages/renderer/src/scene.ts`:
- Around line 343-346: forEachMeshData currently iterates
this.meshDataMap.values() and can call visit on the same MeshData object
multiple times when that MeshData appears under multiple keys; change
forEachMeshData to deduplicate by tracking seen MeshData instances (e.g., use a
Set to store visited MeshData references) and only call visit(piece) the first
time an instance is encountered (check the Set before visiting and add after
visiting) so each MeshData is processed exactly once.
---
Outside diff comments:
In `@apps/viewer/src/hooks/useIfcLoader.ts`:
- Around line 1593-1629: The cancel handler is being cleared unconditionally and
can wipe a newer session's canceller; change the logic to only clear the
activeStreamCanceller if the session that set it still owns it: when you call
setActiveStreamCanceller (via setCanceller(() => ingest.streamHandle.cancel())),
capture the currentSession id (or create a canceller wrapper that checks
loadSessionRef.current) and when clearing (all setCanceller(null) calls) only
call setCanceller(null) if loadSessionRef.current === currentSession; likewise
in the early-abort branch where you call
renderer.removePointCloudAsset(ingest.rendererHandle) keep the same guard so you
don't clear a newer session's canceller. Ensure the same ownership check is used
in both the success path (after await ingest.done) and all error/return paths so
a later load retains its cancel action.
---
Nitpick comments:
In `@apps/viewer/src/components/viewer/DeviationPanel.tsx`:
- Around line 76-78: The check that uses the hardcoded 0.05 to detect an
untouched slider is fragile; replace the literal with a shared constant (e.g.,
DEFAULT_DEVIATION_HALF_RANGE) exported from the store or constants module, or
alternatively add a "touched" flag in the store and use that to decide whether
to call setHalfRange; update the condition in DeviationPanel (the halfRange
check and the setHalfRange call that references result.suggestedHalfRange) to
compare against DEFAULT_DEVIATION_HALF_RANGE or to check the touched flag
instead so the logic no longer depends on a magic number.
In `@apps/viewer/src/components/viewer/PointCloudClasses.tsx`:
- Around line 23-41: Extract the CLASSES constant and its ClassEntry type into a
new shared module (e.g., export const CLASSES and export type ClassEntry) and
replace the local CLASSES definitions in PointCloudClasses.tsx and
PointCloudLegend.tsx by importing the shared symbols; update any references to
CLASSES, ClassEntry, and usages in those files to use the imported names. Also
ensure the shader palette consumer imports the same shared CLASSES (or a derived
rgb array) so the UI and shader use the single source of truth; run type checks
to adjust any import paths/exports accordingly.
In `@packages/pointcloud/src/formats/e57-decode.ts`:
- Around line 358-374: The readBitsLE function can read past the provided bytes
on malformed input; add defensive bounds checking: before reading each byte in
readBitsLE (using parameters bytes, bitOffset, bitsPerRecord and locals
cur/inByte), ensure cur < bytes.length (or precompute that startBit +
bitsPerRecord <= bytes.length*8) and if the check fails return 0 (or a safe
default) to avoid accessing undefined bytes and corrupting decoded values;
update readBitsLE to perform this guard either as a pre-flight validation or
inside the while loop.
In `@packages/pointcloud/src/formats/e57.test.ts`:
- Around line 168-284: Add a negative test next to the existing ScaledInteger
cases that constructs a deliberately malformed DataPacket which declares
bytestream lengths that run past the packet payload end and asserts
decodeE57Scan detects the overrun (e.g., throws or returns an error). Build a
small ArrayBuffer/DataView like the other tests but set packetLogicalLength
(header at indices written by view.setUint16(2, ...)) too small for the declared
bytestream lengths (or set one bytestream length larger than remaining bytes),
reuse a Data3DEntry prototype similar to the other tests, call decodeE57Scan
with that buffer, and expect it to throw (or otherwise signal the invalid-packet
condition) to lock in the bounds-checking behavior for DataPacket handling in
decodeE57Scan.
In `@packages/renderer/src/deviation/deviation-shader.wgsl.ts`:
- Around line 209-217: The code silently drops BVH children when sp + 2u >
STACK_SIZE; modify the traversal in the block using sp, STACK_SIZE, and stack so
that it never silently discards both children: if only one slot remains push the
nearer child (based on lDist vs rDist), if zero slots remain set a diagnostic
flag (e.g., stackOverflowDetected or stackOverflowCount) in your shader-visible
output/UBO and break out of traversal to avoid incorrect results; add the
diagnostic flag so host code can detect/expose stack exhaustion (optionally
gated by a debug compile flag).
  const supportedFiles = Array.from(files).filter(
    f => f.name.endsWith('.ifc') || f.name.endsWith('.ifcx') || f.name.endsWith('.glb')
-     || f.name.toLowerCase().endsWith('.las') || f.name.toLowerCase().endsWith('.laz') || f.name.toLowerCase().endsWith('.ply') || f.name.toLowerCase().endsWith('.pcd') || f.name.toLowerCase().endsWith('.e57')
+     || f.name.toLowerCase().endsWith('.las') || f.name.toLowerCase().endsWith('.laz') || f.name.toLowerCase().endsWith('.ply') || f.name.toLowerCase().endsWith('.pcd') || f.name.toLowerCase().endsWith('.e57') || f.name.toLowerCase().endsWith('.pts') || f.name.toLowerCase().endsWith('.xyz')
  );
Normalize the supported-extension allow-list once.
The current filter lowercases only the point-cloud branches, so MODEL.IFC / SCAN.GLB still get rejected here while .las / .pts do not. Pulling the extensions into one lowercased helper/accept string fixes that bug and avoids the four copies drifting further in this already-large component.
Suggested direction
+const SUPPORTED_OPEN_EXTENSIONS = [
+ '.ifc', '.ifcx', '.glb',
+ '.las', '.laz', '.ply', '.pcd', '.e57', '.pts', '.xyz',
+] as const;
+
+function hasSupportedOpenExtension(name: string): boolean {
+ const lower = name.toLowerCase();
+ return SUPPORTED_OPEN_EXTENSIONS.some((ext) => lower.endsWith(ext));
+}
+
+const SUPPORTED_OPEN_ACCEPT = SUPPORTED_OPEN_EXTENSIONS.join(',');
+
- const supportedFiles = Array.from(files).filter(
- f => f.name.endsWith('.ifc') || f.name.endsWith('.ifcx') || f.name.endsWith('.glb')
- || f.name.toLowerCase().endsWith('.las') || f.name.toLowerCase().endsWith('.laz') || f.name.toLowerCase().endsWith('.ply') || f.name.toLowerCase().endsWith('.pcd') || f.name.toLowerCase().endsWith('.e57') || f.name.toLowerCase().endsWith('.pts') || f.name.toLowerCase().endsWith('.xyz')
- );
+ const supportedFiles = Array.from(files).filter((f) => hasSupportedOpenExtension(f.name));
As per coding guidelines: Limit non-generated modules to ~400 lines; split larger files and extract cohesive helpers into separate modules.
Also applies to: 467-470, 784-792
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/viewer/src/components/viewer/MainToolbar.tsx` around lines 426 - 429,
The file filter in MainToolbar.tsx (the supportedFiles const) inconsistently
lowercases only some branches; refactor by extracting a single normalized
allow-list (e.g., a SUPPORTED_EXTENSIONS Set or an isSupportedFile(name: string)
helper) that contains all extensions in lowercase, lowercase the file.name once
(const name = file.name.toLowerCase()), and test membership using
name.endsWith(ext) or checking ext against the set; replace the current inline
predicate in supportedFiles with that helper and reuse it in the other similar
checks in this component.
CodeRabbit review on the consolidated PR. Real-bug fixes:
- Canceller ownership: useIfcLoader + useIfcFederation now only
null the store-wide canceller when the stored function is still
ours. A second load that started before the first settles
keeps its own Cancel button.
- Picker.pick origin clamping: Math.floor(x/y) could be -1 or
width/height on border clicks, making copyTextureToBuffer
reject the submit. Now clamps to [0, width-1] / [0, height-1]
matching pickRect's existing guard.
- Scene.forEachMeshData dedup: a colour-merged batch lives under
every contributor's expressId. The deviation BVH was double-
counting those triangles. Set<MeshData> guard.
- DeviationPipeline disposal: destroy() now releases the BVH GPU
buffers + clears the fingerprint cache.
- SectionPanel preview-stride leak: pointercancel + onBlur
handlers + unmount cleanup so the splat shader can't stay
stuck at stride 4 when the slider loses focus mid-drag.
- useMouseControls.pickRect rejection: WebGPU device-loss now
logs and leaves selection unchanged instead of unhandled-
rejecting from a pointer event.
- e57-xml: reject non-finite / negative fileOffset + recordCount
before pushing into Data3DEntry. Malformed XML metadata can no
longer poison the binary decode path.
- Renderer BVH fingerprint: includes per-mesh expressId /
modelIndex / positions length / indices length so two distinct
mesh sets that share the same aggregate position-length total
can't alias each other and reuse a stale BVH.
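A sketch of the strengthened fingerprint described in the last bullet above; the field names are assumptions about the mesh record shape, not the renderer's actual types.

```ts
// Fingerprint that changes whenever any mesh's identity or geometry size changes,
// so two different mesh sets can no longer alias on aggregate totals alone.
interface MeshLike {
  expressId: number;
  modelIndex: number;
  positions: Float32Array;
  indices?: Uint32Array;
}

function bvhFingerprint(meshes: MeshLike[]): string {
  const parts: string[] = [String(meshes.length)];
  for (const m of meshes) {
    parts.push(`${m.modelIndex}:${m.expressId}:${m.positions.length}:${m.indices?.length ?? 0}`);
  }
  return parts.join('|');
}
```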
Changeset metadata:
- pointcloud-near-term-correctness: + @ifc-lite/renderer patch
- pointcloud-e57-scaled-integer: + @ifc-lite/renderer patch
Both PRs touched the renderer's published surface; the metadata
needed to reflect that for the version bump.
Verified: full repo typecheck (24/24), 80 pointcloud unit tests,
655 viewer tests, viewer Vite build green.
Real-world Faro / Trimble exports (e.g. Union_Station.e57) failed with "bytestream colorBlue (2600 bytes) runs past packet payload at offset 46900" because the round-2 packet-bounds guard subtracted 4 bytes for a phantom packet-trailing CRC.

Per ASTM E2807-11 §6.3, CRCs in E57 live at the PAGE level (4 bytes per 1024-byte physical page), not per DataPacket. `stripPageCrc` already removes them before the packet walk runs. The packet's bytestreams fill the full `packetLogicalLength`, so payloadEnd = packetEnd. Subtracting an extra 4 bytes false-positives whenever a real exporter packs the last bytestream tight against packetEnd.

Fix: payloadEnd = packetEnd. The header / length-table / per-stream guards still catch genuinely malformed files; they were just being overzealous on legal ones.

Regression test: a 3-point packet with bytestreams that fill the packet exactly (no trailing slack). Would have failed before the fix; passes now.

Verified: 81 pointcloud unit tests, full repo typecheck (24/24).
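A sketch of the corrected bounds walk: page-level CRCs are already stripped, so the payload legitimately runs to the end of the packet. Names are illustrative, not the real decoder's.

```ts
// Validate that every declared bytestream fits inside the packet payload,
// with payloadEnd = packetEnd (no phantom 4-byte CRC subtracted).
function checkBytestreamBounds(
  packetStart: number,
  packetLogicalLength: number,
  payloadStart: number,
  bytestreamLengths: number[],
): void {
  const payloadEnd = packetStart + packetLogicalLength;
  let offset = payloadStart;
  for (let i = 0; i < bytestreamLengths.length; i++) {
    const len = bytestreamLengths[i];
    if (offset + len > payloadEnd) {
      throw new Error(`bytestream ${i} (${len} bytes) runs past packet payload at offset ${offset}`);
    }
    offset += len;
  }
}
```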
Classification mode renders everything as "unclassified" grey for
some files and we don't yet know whether the issue is upstream
(decoder didn't emit classes) or downstream (packing / shader).
Two cross-checkable diagnostic logs (capped at 3 chunks each) on
the boundaries:
- ingestPointCloud.onChunk: per-chunk presence + 16-bin class
histogram. Tells us whether classifications arrived at all and
what their distribution is.
- point-cloud-node.appendChunkToNode: reads back the packed
class byte (vertex offset +15) for the first 8 points and logs
them. Tells us whether the bytes that the splat shader will
sample match what the host saw.
If the host log shows e.g. hist={0=10,2=4500,6=2000} but the vertex
log shows all 0, the packing is wrong. If both show all 0, the
decoder isn't emitting classes (look at the format's streaming
source). If both show a non-trivial spread, the shader read or
colour palette path is wrong.
Logs cap at 3 per page session to keep the console readable; they
auto-reset on HMR.
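A sketch of the histogram side of the diagnostic described above; the binning and return shape are illustrative.

```ts
// Per-chunk diagnostic: presence flag plus a 16-bin class histogram for the log line.
function classHistogram(classifications: Uint8Array | undefined): Record<number, number> | null {
  if (!classifications || classifications.length === 0) return null;
  const hist: Record<number, number> = {};
  for (const c of classifications) {
    const bin = Math.min(c, 15); // cap at 16 bins to keep the console output readable
    hist[bin] = (hist[bin] ?? 0) + 1;
  }
  return hist;
}
```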
The diagnostic confirmed: E57 files don't emit classifications
because ASTM E2807 doesn't define one in the standard prototype.
Stock Faro / Leica / Trimble scans simply have no per-point class
IDs to decode — classification colour mode rendering everything as
class 0 ("Created, never classified") is correct behaviour for
those files.
Two improvements:
- decodeE57Scan now looks for a non-standard `classification`
field (Integer or ScaledInteger) in the prototype. CloudCompare
+ custom LIDAR pipelines occasionally include one; we honour
it when present so the splat shader's classification mode
works on those files. New readClassificationStream helper
handles both encodings.
- E57StreamingSource.hasClassification now reflects whether the
chunk actually has classifications (was hardcoded false).
- Debug log message now spells out the E57-vs-LAS distinction so
users can see "this is correct, your file just doesn't carry
class data" rather than wondering if it's a viewer bug.
Verified: 81 pointcloud unit tests pass, full repo typecheck (24/24).
laz-perf's emscripten shim called fetch("laz-perf.wasm") relative to
the worker's script directory, which under Vite hits a 404 and the
SPA index.html fallback served as text/plain — making the WebAssembly
streaming compile reject the Response and the ArrayBuffer fallback
abort the same way.
Resolve the wasm via Vite's ?url asset pipeline, pre-fetch the bytes,
and hand them to emscripten as Module.wasmBinary so the shim skips its
own (broken-by-default) resolution path entirely.
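A sketch of the described wiring, assuming laz-perf's createLazPerf factory accepts emscripten module overrides and that the wasm can be imported through Vite's ?url asset pipeline; the exact import specifier is an assumption.

```ts
// Resolve the wasm through Vite's asset pipeline and hand the bytes to emscripten
// directly, so the shim never attempts its own relative (404-prone) fetch.
import { createLazPerf } from 'laz-perf';
import lazPerfWasmUrl from 'laz-perf/lib/laz-perf.wasm?url'; // path is an assumption

export async function initLazPerf() {
  const response = await fetch(lazPerfWasmUrl);
  if (!response.ok) throw new Error(`laz-perf wasm fetch failed: ${response.status}`);
  const wasmBinary = await response.arrayBuffer();
  // Module.wasmBinary makes the emscripten shim skip its own resolution path.
  return createLazPerf({ wasmBinary });
}
```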
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
packages/pointcloud/src/streaming/e57-source.ts (1)
59-88: ⚠️ Potential issue | 🟠 Major | ⚡ Quick win
Classifications are not preserved during stride downsampling.
The applyStride function handles positions, colors, and intensities, but omits classifications. When stride > 1, classification data is silently dropped, and hasClassification will incorrectly report false even when the source E57 contains classifications.
🐛 Proposed fix
function applyStride(chunk: DecodedPointChunk, stride: number): DecodedPointChunk {
  const s = Math.max(1, stride | 0);
  if (s === 1) return chunk;
  const newCount = Math.ceil(chunk.pointCount / s);
  const positions = new Float32Array(newCount * 3);
  const colors = chunk.colors ? new Float32Array(newCount * 3) : undefined;
  const intensities = chunk.intensities ? new Uint16Array(newCount) : undefined;
+ const classifications = chunk.classifications ? new Uint8Array(newCount) : undefined;
  let dst = 0;
  for (let i = 0; i < chunk.pointCount; i += s) {
    positions[dst * 3] = chunk.positions[i * 3];
    positions[dst * 3 + 1] = chunk.positions[i * 3 + 1];
    positions[dst * 3 + 2] = chunk.positions[i * 3 + 2];
    if (colors && chunk.colors) {
      colors[dst * 3] = chunk.colors[i * 3];
      colors[dst * 3 + 1] = chunk.colors[i * 3 + 1];
      colors[dst * 3 + 2] = chunk.colors[i * 3 + 2];
    }
    if (intensities && chunk.intensities) {
      intensities[dst] = chunk.intensities[i];
    }
+   if (classifications && chunk.classifications) {
+     classifications[dst] = chunk.classifications[i];
+   }
    dst++;
  }
  return {
    positions,
    colors,
    intensities,
+   classifications,
    pointCount: newCount,
    bbox: chunk.bbox,
  };
}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/pointcloud/src/streaming/e57-source.ts` around lines 59 - 88, applyStride currently downsamples positions, colors, and intensities but omits copying classifications, causing DecodedPointChunk.classifications to be lost and hasClassification to become false; update the applyStride function to allocate a new classifications array when chunk.classifications exists (matching newCount length and appropriate type), fill it in the same loop using chunk.classifications[i] aligned with the sampled indices, and return this classifications array in the resulting DecodedPointChunk alongside positions, colors, intensities, pointCount, and bbox so downstream callers see preserved classification data.
apps/viewer/src/hooks/useIfcLoader.ts (1)
1612-1637: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Log this rejected ingest before returning.
This catch updates model/UI state and returns, but it never logs either the abort path or the real parsing failure. That makes point-cloud ingest failures hard to diagnose and misses the repo rule for catch blocks.
As per coding guidelines, "Every catch block must either log the error or re-throw; never use bare catch {} without a /* cleanup — safe to ignore */ comment for truly irrelevant failures".
Verify each finding against the current code and only fix it if needed. In `@apps/viewer/src/hooks/useIfcLoader.ts` around lines 1612 - 1637, The catch block in useIfcLoader around the point-cloud ingest never logs the rejection; add logging before each early return so failures are recorded: when detecting a stale session (loadSessionRef.current !== currentSession) call the repo logger (or console.error) with context including ingest.rendererHandle and the error, then removePointCloudAsset and clearOwnedCanceller as currently done; for the abort branch (isAbort) log that the ingest was cancelled along with primaryModelId and ingest.rendererHandle; for the real failure branch log the full error/message plus format and primaryModelId before calling updateModel/setError/clearOwnedCanceller/setLoading. Ensure you reference the existing symbols (loadSessionRef, currentSession, renderer.removePointCloudAsset, ingest.rendererHandle, isAbort, updateModel, setError, setLoading, clearOwnedCanceller, format) and do not change the existing state-updating flow.
♻️ Duplicate comments (1)
apps/viewer/src/hooks/ingest/pointCloudIngest.ts (1)
207-210: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Broaden the PCD probe beyond # .PCD at byte 0.
Line 210 still recognizes only one comment-style PCD header. Valid PCD files that start with VERSION or FIELDS will fall through to extension-based detection, so a renamed PCD can still be routed to the wrong decoder.
♻️ Suggested change
- const b0 = view.getUint8(0), b1 = view.getUint8(1), b2 = view.getUint8(2);
- if (b0 === 0x70 /* p */ && b1 === 0x6c /* l */ && b2 === 0x79 /* y */) return 'ply';
- if (b0 === 0x23 /* # */ && view.byteLength > 4 && view.getUint8(2) === 0x2e /* . */) return 'pcd';
+ const b0 = view.getUint8(0), b1 = view.getUint8(1), b2 = view.getUint8(2);
+ if (b0 === 0x70 /* p */ && b1 === 0x6c /* l */ && b2 === 0x79 /* y */) return 'ply';
+ const header = new TextDecoder().decode(new Uint8Array(buffer, 0, view.byteLength)).toUpperCase();
+ if (header.includes('.PCD') || header.startsWith('VERSION ') || header.startsWith('FIELDS ')) return 'pcd';
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/viewer/src/hooks/ingest/pointCloudIngest.ts` around lines 207 - 210, The current ASCII probe only detects PCD when the file begins with "# .PCD" and misses valid PCDs that start with keywords like "VERSION" or "FIELDS"; update the detection in the ingest probe (the code using view.getUint8 and variables b0,b1,b2) to also recognize ASCII starts for "VERSION" (bytes 'V','E','R') and "FIELDS" (bytes 'F','I','E') — e.g., check b0/b1/b2 for 0x56/0x45/0x52 and 0x46/0x49/0x45 respectively (in addition to the existing '#' + '.' + 'PCD' check), and consider skipping initial ASCII whitespace/newline when reading b0/b1/b2 so files that begin with whitespace still detect as PCD.
🧹 Nitpick comments (2)
packages/pointcloud/src/formats/e57.test.ts (1)
269-282: 💤 Low value
Dead code from earlier test iteration.
Lines 269-282 create buf, view, and bytes variables that are immediately shadowed by fullBuf, fv, and fb at lines 288-291. The abandoned setup is never used. Consider removing to improve test clarity.
♻️ Suggested cleanup
  it('decodes ScaledInteger streams with bitsPerRecord that crosses byte boundaries', () => {
-   // bitsPerRecord = 12 (min=0, max=4095, span=4095). Two values
-   // pack into 3 bytes LSB-first:
-   // value0 = 0xABC, value1 = 0xDEF
-   // byte 0 = value0 & 0xFF = 0xBC
-   // byte 1 = (value0 >> 8) | ((value1 & 0xF) << 4) = 0xA | 0xF0 = 0xFA
-   // byte 2 = value1 >> 4 = 0xDE
-   // With scale=1, offset=0, minimum=0 the decoded floats round-trip
-   // to the original raw_int, so we just check the cartesian values.
-   const packetLogicalLength = 4 + 2 + 2 + 3 + 4; // header + count + length + payload + CRC
-   const totalBytes = packetLogicalLength;
-   const buf = new ArrayBuffer(totalBytes);
-   const view = new DataView(buf);
-   const bytes = new Uint8Array(buf);
-   view.setUint8(0, 1);
-   view.setUint8(1, 0);
-   view.setUint16(2, packetLogicalLength - 1, true);
-   view.setUint16(4, 1, true); // 1 bytestream (we'll use only X to keep it minimal)
-   view.setUint16(6, 3, true); // bytestream length = 3
-   bytes[8] = 0xBC;
-   bytes[9] = 0xFA;
-   bytes[10] = 0xDE;
-
-   // recordCount=2; Y/Z would be required by decodeE57Scan, so use a
-   // 3-axis prototype and stuff Y/Z into the same bytestream by
-   // declaring 3 streams of 1 byte each. Easier: drop down to a
-   // 1-stream test that only has X — but decodeE57Scan demands all
-   // three. Build a complete 3-stream packet with the bit-packed X.
+   // bitsPerRecord = 12 for X (min=0, max=4095). Two 12-bit values
+   // pack into 3 bytes LSB-first: [0xABC, 0xDEF] → [0xBC, 0xFA, 0xDE]
+   // Y/Z use 4-bit values to keep the test compact.
    const fullLen = 4 + 2 + 2*3 + 3 + 1 + 1 + 4;
    const fullBuf = new ArrayBuffer(fullLen);
    const fv = new DataView(fullBuf);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/pointcloud/src/formats/e57.test.ts` around lines 269 - 282, Remove the unused setup that creates buf, view, and bytes (variables packetLogicalLength, buf, view, bytes) which are immediately shadowed by fullBuf, fv, and fb; either delete the dead block (lines creating packetLogicalLength, new ArrayBuffer, DataView, and the bytes assignments) or reuse those same variables later (fullBuf, fv, fb) so there is no shadowing—ensure the test still constructs the same byte content and lengths but only uses one set of buffer/view/Uint8Array variables (referencing packetLogicalLength, buf/view/bytes or fullBuf/fv/fb consistently).packages/pointcloud/src/formats/e57-decode.ts (1)
441-465: 💤 Low value
Classification Integer handling differs from ScaledInteger when minimum ≠ 0.
For Integer (line 452): `raw - min` shifts the value.
For ScaledInteger (line 464): `raw + minimum` recovers the original.
If an Integer classification field has `minimum: 5`, class ID 7 would become 2 after `raw - min`. This is inconsistent with the ScaledInteger behavior, which correctly recovers the original value. In practice, classification fields typically have `min = 0`, so this works, but the semantic inconsistency could cause issues with unusual producers.
Consider using just `raw` for Integer classification (since class IDs are absolute values, not range-normalized):
🔧 Suggested fix
```diff
       for (let i = 0; i < take; i++) {
         const off = start + i * stride;
         const raw = stride === 2
           ? (signed ? view.getInt16(off, true) : view.getUint16(off, true))
           : (signed ? view.getInt8(off) : view.getUint8(off));
-        classifications[written + i] = Math.max(0, Math.min(255, raw - min));
+        classifications[written + i] = Math.max(0, Math.min(255, raw));
       }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/pointcloud/src/formats/e57-decode.ts` around lines 441 - 465, The Integer branch in the classification decode uses raw - min which shifts class IDs when field.minimum ≠ 0; change the Integer handling in the same block that computes raw (inside the field.kind === 'Integer' branch) to treat class IDs as absolute values (use raw directly, then clamp to 0..255) so its behavior matches the ScaledInteger path (see symbols: field.kind, min, raw, classifications, scaledIntegerBitsPerRecord, readBitsLE) — update the assignment to classifications[written + i] accordingly and keep the existing clamping.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@apps/viewer/src/hooks/useIfcFederation.ts`:
- Around line 669-678: The AbortError handler in useIfcFederation.ts currently
calls setError, setProgress and setLoading unconditionally and can stomp a newer
load; modify the DOMException AbortError branch (inside the addModel() / load
routine) to first check the per-load ownership token/ID used elsewhere (e.g.,
currentLoadId, loadOwner or similar) and only perform setError(null),
setProgress({ phase: 'Cancelled', percent: 0 }) and setLoading(false) if the
token matches the one for the cancelled request; otherwise skip those state
updates so a later addModel() instance isn't overwritten. Ensure you reference
the same ownership variable used by the rest of the loader to keep behavior
consistent.
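As a rough shape of that guard, here is a minimal sketch assuming a monotonically increasing load id held in a ref. `runLoad`, `LoaderState`, and the callback names are placeholders, not the actual hook API.

```ts
// Illustrative only: the real hook wires these callbacks through React state.
type LoaderState = {
  setError: (e: string | null) => void;
  setProgress: (p: { phase: string; percent: number }) => void;
  setLoading: (v: boolean) => void;
};

async function runLoad(
  loadIdRef: { current: number },
  doLoad: (signal: AbortSignal) => Promise<void>,
  state: LoaderState,
  controller: AbortController,
): Promise<void> {
  const myLoadId = ++loadIdRef.current; // token owned by this addModel() call
  try {
    await doLoad(controller.signal);
  } catch (err) {
    if (err instanceof DOMException && err.name === 'AbortError') {
      // Only reset shared UI state if no newer load has taken ownership,
      // so a stale cancellation cannot stomp the newer load's spinner.
      if (loadIdRef.current === myLoadId) {
        state.setError(null);
        state.setProgress({ phase: 'Cancelled', percent: 0 });
        state.setLoading(false);
      }
      return;
    }
    throw err; // real failures still propagate to the normal error path
  }
}
```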
---
Outside diff comments:
In `@apps/viewer/src/hooks/useIfcLoader.ts`:
- Around line 1612-1637: The catch block in useIfcLoader around the point-cloud
ingest never logs the rejection; add logging before each early return so
failures are recorded: when detecting a stale session (loadSessionRef.current
!== currentSession) call the repo logger (or console.error) with context
including ingest.rendererHandle and the error, then removePointCloudAsset and
clearOwnedCanceller as currently done; for the abort branch (isAbort) log that
the ingest was cancelled along with primaryModelId and ingest.rendererHandle;
for the real failure branch log the full error/message plus format and
primaryModelId before calling
updateModel/setError/clearOwnedCanceller/setLoading. Ensure you reference the
existing symbols (loadSessionRef, currentSession,
renderer.removePointCloudAsset, ingest.rendererHandle, isAbort, updateModel,
setError, setLoading, clearOwnedCanceller, format) and do not change the
existing state-updating flow.
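A sketch of the three-branch logging that comment asks for, extracted into a standalone function for illustration. `Ctx`, `handleIngestFailure`, and the plain `console.error` calls are assumptions standing in for the loader's real refs and logger.

```ts
// Hypothetical shape of the catch-block logic; the real code lives inline in useIfcLoader.
type Ctx = {
  currentSession: number;
  loadSessionRef: { current: number };
  rendererHandle: string;
  primaryModelId: string;
  format: string;
};

function handleIngestFailure(err: unknown, ctx: Ctx, cleanup: () => void, fail: (msg: string) => void): void {
  const isAbort = err instanceof DOMException && err.name === 'AbortError';

  if (ctx.loadSessionRef.current !== ctx.currentSession) {
    // Stale session: log before the early return so the rejection is recorded.
    console.error('pointcloud ingest finished for a stale session', { handle: ctx.rendererHandle, err });
    cleanup();
    return;
  }
  if (isAbort) {
    console.error('pointcloud ingest cancelled', { model: ctx.primaryModelId, handle: ctx.rendererHandle });
    cleanup();
    return;
  }
  // Real failure: log the error plus format/model context before updating state.
  console.error('pointcloud ingest failed', { err, format: ctx.format, model: ctx.primaryModelId });
  fail(err instanceof Error ? err.message : String(err));
  cleanup();
}
```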
In `@packages/pointcloud/src/streaming/e57-source.ts`:
- Around line 59-88: applyStride currently downsamples positions, colors, and
intensities but omits copying classifications, causing
DecodedPointChunk.classifications to be lost and hasClassification to become
false; update the applyStride function to allocate a new classifications array
when chunk.classifications exists (matching newCount length and appropriate
type), fill it in the same loop using chunk.classifications[i] aligned with the
sampled indices, and return this classifications array in the resulting
DecodedPointChunk alongside positions, colors, intensities, pointCount, and bbox
so downstream callers see preserved classification data.
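A minimal sketch of what that looks like, assuming a simplified `DecodedPointChunk` (bbox recomputation omitted). The field names mirror the comment, but the implementation is illustrative rather than the actual e57-source.ts code.

```ts
// Simplified chunk shape for illustration; the real type carries bbox and more.
interface DecodedPointChunk {
  pointCount: number;
  positions: Float32Array;   // xyz triplets
  colors?: Uint8Array;       // rgb triplets
  intensities?: Float32Array;
  classifications?: Uint8Array;
}

function applyStride(chunk: DecodedPointChunk, stride: number): DecodedPointChunk {
  if (stride <= 1) return chunk;
  const newCount = Math.ceil(chunk.pointCount / stride);
  const positions = new Float32Array(newCount * 3);
  const colors = chunk.colors ? new Uint8Array(newCount * 3) : undefined;
  const intensities = chunk.intensities ? new Float32Array(newCount) : undefined;
  // The fix: allocate classifications alongside the other attributes so they survive striding.
  const classifications = chunk.classifications ? new Uint8Array(newCount) : undefined;

  for (let out = 0, i = 0; i < chunk.pointCount; i += stride, out++) {
    positions.set(chunk.positions.subarray(i * 3, i * 3 + 3), out * 3);
    if (colors && chunk.colors) colors.set(chunk.colors.subarray(i * 3, i * 3 + 3), out * 3);
    if (intensities && chunk.intensities) intensities[out] = chunk.intensities[i];
    if (classifications && chunk.classifications) classifications[out] = chunk.classifications[i];
  }
  return { ...chunk, pointCount: newCount, positions, colors, intensities, classifications };
}
```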
---
Duplicate comments:
In `@apps/viewer/src/hooks/ingest/pointCloudIngest.ts`:
- Around line 207-210: The current ASCII probe only detects PCD when the file
begins with "# .PCD" and misses valid PCDs that start with keywords like
"VERSION" or "FIELDS"; update the detection in the ingest probe (the code using
view.getUint8 and variables b0,b1,b2) to also recognize ASCII starts for
"VERSION" (bytes 'V','E','R') and "FIELDS" (bytes 'F','I','E') — e.g., check
b0/b1/b2 for 0x56/0x45/0x52 and 0x46/0x49/0x45 respectively (in addition to the
existing '#' + '.' + 'PCD' check), and consider skipping initial ASCII
whitespace/newline when reading b0/b1/b2 so files that begin with whitespace
still detect as PCD.
---
Nitpick comments:
In `@packages/pointcloud/src/formats/e57-decode.ts`:
- Around line 441-465: The Integer branch in the classification decode uses raw
- min which shifts class IDs when field.minimum ≠ 0; change the Integer handling
in the same block that computes raw (inside the field.kind === 'Integer' branch)
to treat class IDs as absolute values (use raw directly, then clamp to 0..255)
so its behavior matches the ScaledInteger path (see symbols: field.kind, min,
raw, classifications, scaledIntegerBitsPerRecord, readBitsLE) — update the
assignment to classifications[written + i] accordingly and keep the existing
clamping.
In `@packages/pointcloud/src/formats/e57.test.ts`:
- Around line 269-282: Remove the unused setup that creates buf, view, and bytes
(variables packetLogicalLength, buf, view, bytes) which are immediately shadowed
by fullBuf, fv, and fb; either delete the dead block (lines creating
packetLogicalLength, new ArrayBuffer, DataView, and the bytes assignments) or
reuse those same variables later (fullBuf, fv, fb) so there is no
shadowing—ensure the test still constructs the same byte content and lengths but
only uses one set of buffer/view/Uint8Array variables (referencing
packetLogicalLength, buf/view/bytes or fullBuf/fv/fb consistently).
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: cf24565e-c572-4e56-a9b2-3590d26e0231
📒 Files selected for processing (18)
- .changeset/pointcloud-e57-scaled-integer.md
- .changeset/pointcloud-laz-wasm-mime.md
- .changeset/pointcloud-near-term-correctness.md
- apps/viewer/src/components/viewer/tools/SectionPanel.tsx
- apps/viewer/src/components/viewer/useMouseControls.ts
- apps/viewer/src/hooks/ingest/pointCloudIngest.ts
- apps/viewer/src/hooks/useIfcFederation.ts
- apps/viewer/src/hooks/useIfcLoader.ts
- packages/pointcloud/src/formats/e57-decode.ts
- packages/pointcloud/src/formats/e57-xml.ts
- packages/pointcloud/src/formats/e57.test.ts
- packages/pointcloud/src/streaming/e57-source.ts
- packages/pointcloud/src/streaming/laz-source.ts
- packages/pointcloud/src/streaming/vendor.d.ts
- packages/renderer/src/index.ts
- packages/renderer/src/picker.ts
- packages/renderer/src/pointcloud/point-cloud-node.ts
- packages/renderer/src/scene.ts
✅ Files skipped from review due to trivial changes (4)
- .changeset/pointcloud-laz-wasm-mime.md
- packages/pointcloud/src/streaming/vendor.d.ts
- .changeset/pointcloud-e57-scaled-integer.md
- .changeset/pointcloud-near-term-correctness.md
🚧 Files skipped from review as they are similar to previous changes (4)
- packages/renderer/src/pointcloud/point-cloud-node.ts
- packages/renderer/src/picker.ts
- packages/pointcloud/src/formats/e57-xml.ts
- apps/viewer/src/components/viewer/useMouseControls.ts
- E57 applyStride now preserves classifications across stride > 1.
- useIfcFederation guards addModel state writes with a loadSessionRef token so a cancelled load can't stomp a newer load's spinner.
- E57 Integer classification keeps raw class IDs (no min subtraction) to match ScaledInteger semantics — class IDs are absolute labels.
- PCD probe now also recognizes VERSION / FIELDS first-line headers, not just the # .PCD comment style.
- useIfcLoader logs the pointcloud ingest catch on each branch (stale session, abort, real failure) per repo catch-block rule.
- Test: drop shadowed buffer setup in the bit-packed ScaledInteger test so only the live fullBuf path remains.