
Conversation

@jdries jdries commented Feb 19, 2025

These tests show how a user would typically use run_udf.

They also seem to expose a remaining issue in the current implementation: the UDF continues to receive the full input datacube instead of only a chunk.

Forcing parallelization with dask does surface error messages, so this is probably what we need to get behaviour similar to a 'real' backend.

@ValentinaHutter (Collaborator)

Pre-commit should be included in your poetry setup, but you can also resolve the pre-commit hook issue manually by running: `pre-commit run --all-files`

https://pre-commit.com/
