Conversation

gforsyth (Contributor) commented Apr 3, 2025

Port all conda-build recipes over to use rattler-build instead.

Contributes to rapidsai/build-planning#47

gforsyth added the improvement (Improves an existing functionality) and non-breaking (Introduces a non-breaking change) labels on Apr 3, 2025
copy-pr-bot (bot) commented Apr 3, 2025

Auto-sync is disabled for draft pull requests in this repository. Workflows must be run manually.

Contributors can view more details about this message here.

gforsyth (Contributor, Author) commented Apr 3, 2025

/ok to test

3 similar comments
gforsyth (Contributor, Author) commented Apr 3, 2025

/ok to test

gforsyth (Contributor, Author) commented Apr 3, 2025

/ok to test

gforsyth (Contributor, Author) commented Apr 3, 2025

/ok to test

gforsyth (Contributor, Author) commented Apr 7, 2025

/ok to test

gforsyth (Contributor, Author) commented

/ok to test 7ab3a96

gforsyth (Contributor, Author) commented

/ok to test

jakirkham (Member) commented

We discussed this again at the cuCIM meeting this week, but didn't see an obvious cause for the test failures.

Diffing the changes relative to a recent passing PR (#867) didn't provide any obvious clues.

Tried updating this PR and opening a fresh testing PR (#868). The latter passed whereas this one still failed.

I'm starting to wonder if there is something unique to the changes in this PR that leads to the CI failures we see. Still not sure exactly which change causes it, though.

jakirkham changed the title from "Port all conda recipes to rattler-build" to "Port all conda recipes to rattler-build & use strict channel priority" on Apr 16, 2025
gforsyth (Contributor, Author) commented

> We discussed this again at the cuCIM meeting this week, but didn't see an obvious cause for the test failures.
>
> Diffing the changes relative to a recent passing PR (#867) didn't provide any obvious clues.
>
> Tried updating this PR and opening a fresh testing PR (#868). The latter passed whereas this one still failed.
>
> I'm starting to wonder if there is something unique to the changes in this PR that leads to the CI failures we see. Still not sure exactly which change causes it, though.

Ok, thanks for looking into that, @jakirkham! I'll go through the recipe with a fine-toothed comb.

gforsyth (Contributor, Author) commented

/ok to test

gforsyth changed the base branch from branch-25.06 to branch-25.08 on May 27, 2025 at 15:54
gforsyth (Contributor, Author) commented

/ok to test

jakirkham (Member) left a review comment

Thanks Gil! 🙏

Noticed something different about how the S3 environment variables are set. Could you please take a closer look?

Comment on lines -40 to -39
- SCCACHE_S3_KEY_PREFIX=libcucim-aarch64 # [aarch64]
- SCCACHE_S3_KEY_PREFIX=libcucim-linux64 # [linux64]
jakirkham (Member) commented May 28, 2025

Previously we set the SCCACHE_S3_KEY_PREFIX like so

Note: The same is true of the cucim recipe

SCCACHE_REGION: ${{ env.get("SCCACHE_REGION") }}
SCCACHE_S3_USE_SSL: ${{ env.get("SCCACHE_S3_USE_SSL") }}
SCCACHE_S3_NO_CREDENTIALS: ${{ env.get("SCCACHE_S3_NO_CREDENTIALS") }}
SCCACHE_S3_KEY_PREFIX: libcucim/${{ env.get("RAPIDS_CONDA_ARCH") }}/cuda${{ cuda_major }}
jakirkham (Member) commented May 28, 2025

However it now looks like SCCACHE_S3_KEY_PREFIX is set differently?

Note: The same is true of the cucim recipe

gforsyth (Contributor, Author) commented

Hey @jakirkham! Yes, this was a deliberate change across all of the rattler ports, to give slightly more convenient access patterns in the cache bucket.
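
For illustration, the two styles compare roughly as follows. The old selector-based prefixes and the new templated prefix come from the diff quoted in this thread; the surrounding recipe keys are an assumption, sketched for context:

```yaml
# Old conda-build meta.yaml style (sketch): one flat prefix per platform,
# chosen by preprocessing selectors in trailing comments.
build:
  script_env:
    - SCCACHE_S3_KEY_PREFIX=libcucim-aarch64  # [aarch64]
    - SCCACHE_S3_KEY_PREFIX=libcucim-linux64  # [linux64]
---
# New rattler-build style (sketch): a single templated prefix encoding the
# architecture and CUDA major version, so all cache entries group under a
# common libcucim/ path in the bucket.
build:
  script:
    env:
      SCCACHE_S3_KEY_PREFIX: libcucim/${{ env.get("RAPIDS_CONDA_ARCH") }}/cuda${{ cuda_major }}
```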

- CMAKE_CUDA_COMPILER_LAUNCHER
- CMAKE_CXX_COMPILER_LAUNCHER
- CMAKE_GENERATOR
- PARALLEL_LEVEL
jakirkham (Member) commented

Also should we still set PARALLEL_LEVEL?

Looks like it was dropped from the new recipe
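
If it should be kept, one minimal way to restore it in the rattler-build recipe might look like this. This is a sketch only: it follows the `env.get` pass-through pattern quoted earlier in this review, and the exact placement under `build.script.env` is an assumption:

```yaml
build:
  script:
    env:
      # Pass the CI build parallelism through to the build script,
      # like the other pass-through variables (assumption: same
      # env.get pattern as the SCCACHE_* settings).
      PARALLEL_LEVEL: ${{ env.get("PARALLEL_LEVEL") }}
```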

gforsyth (Contributor, Author) commented

Hmm, let me check.

jakirkham (Member) commented

When you have a moment Gil, could you please resolve the conflicts here?

gforsyth (Contributor, Author) commented Jul 1, 2025

/ok to test

gforsyth (Contributor, Author) commented Jul 2, 2025

/ok to test

jakirkham changed the base branch from branch-25.08 to branch-25.10 on July 31, 2025 at 19:13
jakirkham (Member) commented

/ok to test

jakirkham (Member) commented

Have moved this to 25.10

rapids-bot (bot) pushed a commit referencing this pull request on Aug 5, 2025
This is the last project in RAPIDS that hasn't converted its conda builds from `conda-build` to `rattler-build`.

That's happening in #864, but that PR isn't yet close to passing.

In rapidsai/gha-tools#145, I'm working on reducing complexity in `gha-tools` by removing unused tools. This repo is the last RAPIDS project using `rapids-configure-conda-channels`, and that usage won't be removed until #864 is complete.

To unblock that cleanup in `gha-tools`, this PR proposes checking in a copy of `rapids-configure-conda-channels` in this repo. That'll allow the work in `gha-tools` to proceed without being blocked by `cucim`'s migration to `rattler-build`.

Authors:
  - James Lamb (https://github.com/jameslamb)

Approvers:
  - Gil Forsyth (https://github.com/gforsyth)

URL: #923
jameslamb (Member) commented

Adding a note that #923 should be reverted on this branch whenever this PR is revived.

jakirkham (Member) commented

> Adding a note that #923 should be reverted on this branch whenever this PR is revived.

I've gone ahead and made this change to this PR.

It would be good to make sure this PR doesn't fall too far behind. We would like to merge it once it passes.

jakirkham (Member) commented

/ok to test

jakirkham (Member) commented

/ok to test
