
Conversation

@alexnorell (Contributor) commented Nov 5, 2025

Problem

The Jetson 6.2.0 base image already exists and works, but it's published under a different name:

  • Existing: roboflow/roboflow-inference-server-jetson:jetpack-6.2.0 (built Oct 17, 2025)
  • Needed: roboflow/roboflow-inference-server-jetson-6.2.0:latest (for downstream projects)

Trying to rebuild from scratch hits Docker's max layer depth limit and various dependency issues.

Solution

Create a workflow that retags the existing working image instead of rebuilding:

Workflow (.github/workflows/docker.jetson.6.2.0.yml)

  • Trigger: Manual workflow dispatch
  • Action: Pull, retag, and push
  • Source: roboflow/roboflow-inference-server-jetson:jetpack-6.2.0
  • Targets:
    • roboflow/roboflow-inference-server-jetson-6.2.0:latest
    • roboflow/roboflow-inference-server-jetson-6.2.0:$VERSION
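The pull/retag/push step can be sketched in a few docker commands. This is a sketch, not the workflow's actual script: the `docker` stub below just echoes each command so the example runs anywhere (drop it to execute for real), and `VERSION` is assumed to be resolved by the workflow (placeholder default here).

```shell
# Stub: echo docker commands instead of running them (illustration only).
docker() { echo "docker $*"; }

# Image names are from this PR; VERSION is assumed to come from the workflow.
SOURCE="roboflow/roboflow-inference-server-jetson:jetpack-6.2.0"
TARGET="roboflow/roboflow-inference-server-jetson-6.2.0"
VERSION="${VERSION:-0.0.0}"

docker pull "$SOURCE"
docker tag "$SOURCE" "$TARGET:latest"
docker tag "$SOURCE" "$TARGET:$VERSION"
docker push "$TARGET:latest"
docker push "$TARGET:$VERSION"
```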

Dockerfile (docker/dockerfiles/Dockerfile.onnx.jetson.6.2.0)

  • Restored to match the original working version from the remote machine
  • Uses the roboflow/l4t-ml:r36.4.tegra-aarch64-cu126-22.04 base
  • Simple single-stage build (~50 lines)
  • No layer flattening or dependency workarounds needed (kept for reference only)
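The rough shape of that single-stage Dockerfile, as a sketch: only the base image tag and the SAM setting are taken from this PR; the install step is illustrative, not the file's actual contents.

```dockerfile
# Base image already ships CUDA, ONNX Runtime, and the ML dependencies.
FROM roboflow/l4t-ml:r36.4.tegra-aarch64-cu126-22.04

WORKDIR /app
COPY . .

# Illustrative install step; the real file installs the inference packages.
RUN pip install --no-cache-dir .

# Per a later commit in this PR: SAM disabled in this build.
ENV CORE_MODEL_SAM_ENABLED=False
```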

Why Retag Instead of Rebuild?

  1. Existing image works perfectly - Built on Oct 17, 2025, 14GB, fully functional
  2. Avoids Docker layer depth issues - The l4t-ml base has too many layers
  3. Avoids dependency problems - opencv-contrib-python, structlog, etc. are not available in the Jetson pip index
  4. Fast and reliable - Pull/tag/push takes seconds vs hours of building
  5. Same binary - Exact same image, just with additional tags

Usage

To publish the retagged image:

gh workflow run docker.jetson.6.2.0.yml --repo roboflow/inference --field force_push=true

Impact

  • Enables downstream PR: roboflow/inference-for-manufacturing#201
  • Provides consistent naming for Jetson 6.2.0 images
  • No risk of introducing build issues
  • Can be run anytime to update tags

Source

Dockerfile sourced from: roboflow@ubuntu:/mnt/nvme/inference
Branch: feat/jetson-62-minimal-dockerfile
Commit: a2329b4b4

…io version

- Change image tags from roboflow-inference-server-jetson-6.2.0 to roboflow-inference-server-jetson:jetpack-6.2.0
- This matches the existing working image published by the Jetson 6.X workflow
- Downgrade rasterio from ~=1.4.0 to ~=1.3.0 for GDAL 3.4.1 compatibility in Jetpack 6.2.0 base image
- Fixes: rasterio 1.4+ requires GDAL >= 3.5, but l4t-jetpack:r36.4.0 has GDAL 3.4.1
…0 with version tags

Container name: roboflow/roboflow-inference-server-jetson-6.2.0
Tags: :latest and :$VERSION

This matches the naming convention from the original PR #1671.
The build will now succeed with the rasterio 1.3.0 fix for GDAL compatibility.
@alexnorell (Contributor, Author) commented:

Updated to use correct container naming:

  • Container name: roboflow/roboflow-inference-server-jetson-6.2.0
  • Tags: :latest and :$VERSION

This matches the naming convention from PR #1671. The build will now succeed because we've fixed the rasterio GDAL compatibility issue (downgraded to 1.3.0).

Replace complex multi-stage build with the simple, working Dockerfile
from the remote machine at /mnt/nvme/inference.

Changes:
- FROM roboflow/l4t-ml:r36.4.tegra-aarch64-cu126-22.04 (pre-built base)
- Single-stage build (~50 lines vs 177)
- No ONNX Runtime compilation (pre-included in base)
- CORE_MODEL_SAM_ENABLED=False (was SAM2_ENABLED=True)
- Matches the version that successfully built on Oct 14
@alexnorell alexnorell changed the title Fix Jetson 6.2.0 workflow image tags and rasterio GDAL compatibility Add Jetson 6.2.0 workflow with optimized l4t-ml base image Nov 6, 2025
Use multi-stage build to copy entire filesystem from l4t-ml base image
into a fresh scratch layer, then build from that flattened layer.

This avoids Docker's max layer depth limit while maintaining all the
pre-built dependencies from the roboflow/l4t-ml base image.

Technique:
1. FROM base image AS base
2. FROM scratch + COPY --from=base / /
3. FROM flattened layer

This resets the layer count to 1 before adding application layers.
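Expressed as a Dockerfile skeleton (a sketch; stage names are illustrative, the base tag and the python-dotenv step are from this PR):

```dockerfile
# Stage 1: the many-layered base image.
FROM roboflow/l4t-ml:r36.4.tegra-aarch64-cu126-22.04 AS base

# Stage 2: copy the entire filesystem into a single fresh layer.
FROM scratch AS flattened
COPY --from=base / /

# Stage 3: build application layers on top of the one flattened layer.
FROM flattened
RUN pip install python-dotenv  # needed by the inference CLI version module (per the commit message)
```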
The inference CLI build requires python-dotenv to import the version module.
Add this dependency installation after flattening the base image.
Instead of rebuilding, this workflow retags the existing working image:
- Source: roboflow/roboflow-inference-server-jetson:jetpack-6.2.0 (built Oct 17, 2025)
- Target: roboflow/roboflow-inference-server-jetson-6.2.0:latest
- Also tags with version: roboflow/roboflow-inference-server-jetson-6.2.0:$VERSION

This approach:
- Avoids Docker layer depth issues when building
- Uses the proven working image from Oct 17
- Provides consistent naming for downstream projects
- Can be run manually via workflow_dispatch

Dockerfile updated to match the original working version from remote machine
(commit a2329b4b4 on feat/jetson-62-minimal-dockerfile branch).
@alexnorell alexnorell changed the title Add Jetson 6.2.0 workflow with optimized l4t-ml base image Add Jetson 6.2.0 workflow to retag existing working image Nov 6, 2025
@alexnorell (Contributor, Author) commented:

Updated approach:

  • ✅ Workflow now retags existing working image instead of rebuilding
  • ✅ Dockerfile restored to original working version from remote machine
  • ✅ Removed layer flattening and dependency workarounds (not needed)
  • ✅ Can be run manually via workflow dispatch

To publish: Run the workflow with force_push=true to push the retagged images to Docker Hub.

The original approach of building from roboflow/l4t-ml:r36.4.tegra-aarch64-cu126-22.04
fails because it exceeds Docker's max layer depth (the base image has 186 layers).

Solution: Build incrementally from the existing working image
roboflow/roboflow-inference-server-jetson:jetpack-6.2.0 which was built
on Oct 17. This approach:
- Avoids Docker's layer depth limit
- Updates inference packages with latest code
- Maintains all functionality of the base image

The workflow now actually builds the image instead of just retagging.
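A sketch of that incremental build: the FROM tag is from this PR, while the update step is illustrative (the real Dockerfile's install commands are not shown here).

```dockerfile
# Start from the proven working image instead of the 186-layer l4t-ml base.
FROM roboflow/roboflow-inference-server-jetson:jetpack-6.2.0

# Illustrative: refresh only the inference packages with the latest code.
COPY . /app
RUN pip install --no-cache-dir --upgrade /app
```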
@PawelPeczek-Roboflow (Collaborator) commented:

@alexnorell is that ready?

- Use Depot infrastructure for linux/arm64 platform builds
- Align with other Jetson workflow patterns (6.0.0, 5.1.1, etc.)
- Auto-push on releases and main branch
- Increase timeout to 120 minutes
- Use same Depot project ID as other Jetson builds
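A hedged sketch of what such a workflow job could look like, assuming Depot's published GitHub actions (depot/setup-action, depot/build-push-action); the action versions, project ID placeholder, and push condition are illustrative, not taken from the PR:

```yaml
# Sketch only: versions and project ID below are illustrative.
jobs:
  build:
    runs-on: ubuntu-latest
    timeout-minutes: 120
    steps:
      - uses: actions/checkout@v4
      - uses: depot/setup-action@v1
      - uses: depot/build-push-action@v1
        with:
          project: <depot-project-id>   # same project as the other Jetson builds
          file: docker/dockerfiles/Dockerfile.onnx.jetson.6.2.0
          platforms: linux/arm64
          tags: roboflow/roboflow-inference-server-jetson-6.2.0:latest
          push: ${{ github.ref == 'refs/heads/main' }}
```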