
Conversation

thehiddenwaffle
Contributor

Purpose

https://discuss.luxonis.com/d/6203-are-depthai-core-contributions-welcome/9
A target built by the CMake configuration, installed as a separate (dependent on core) shared lib. This allows other C++ projects to include the additional lib and use Python to build their pipeline. Currently incompatible with depthai_nodes.
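
For illustration, here is a minimal sketch of how a consuming C++ project might use this in the simplest case, where Python only builds the pipeline. It assumes the embedded interpreter can import the depthai bindings, that the script leaves a depthai.Pipeline in a variable named `pipeline`, and that this object can be cast back to `dai::Pipeline`; the script name `pipeline_setup.py` is a placeholder, not the API added by this PR.

```cpp
// Hedged sketch only, not the exact API introduced here.
#include <pybind11/embed.h>
#include <depthai/depthai.hpp>

namespace py = pybind11;

int main() {
    dai::Pipeline pipeline;
    {
        py::scoped_interpreter guard{};  // start the embedded interpreter
        py::dict scope = py::module_::import("__main__").attr("__dict__");
        // The Python script builds the pipeline and leaves it in `pipeline`.
        py::eval_file("pipeline_setup.py", scope);
        pipeline = scope["pipeline"].cast<dai::Pipeline>();
    }  // interpreter destroyed here; fine as long as no Python host nodes exist

    // ...run and consume the pipeline from C++ as usual...
    return 0;
}
```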

Specification

Strictly additive; disabled by default.

Testing & Validation

Successfully used from a separate project locally.

@moratom
Collaborator


Thanks @thehiddenwaffle !

Would you be open to also adding an example and/or a test for this?
I'm guessing something like a custom host node that is essentially an "on-host Script node" would make sense here.

@thehiddenwaffle
Contributor Author

I'm guessing something like a custom host node that is essentially an "on-host Script node" would make sense here.

@moratom I've been thinking about it, and I feel like maybe two examples are needed, one for each of the following cases:

  1. My case, where the C++ application does not need any Python custom nodes (Python is merely a pipeline "setup" script). I would note that the interpreter can be destroyed once the pipeline is fully built, as long as no Python-defined host nodes are present (my reason for rewriting depthai_nodes in C++). I believe this will be the most appealing option for production and real-time systems where speed is critical (as long as the speed of pipeline setup itself is not, because the pybind11 embedded interpreter is SO SLOW).

  2. The case where C++ wants to use Python both to build the pipeline and to manage its running (and optionally to use Python-derived HostNodes). Then two things seem true to me: the interpreter must be kept alive, and C++ must be able to acquire the data. I think the best way to do this is to give the Python script a C++ callback so that it can push frames back to the C++ application for further use (see the sketch after this list).
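
A minimal sketch of what case 2 might look like from the C++ side, assuming the interpreter is kept alive for the lifetime of the pipeline and the script is handed a C++ callback. The names `pipeline_run.py` and `on_frame` are placeholders, and GIL/threading details are glossed over:

```cpp
// Hedged sketch of case 2, not the exact API from this PR.
#include <pybind11/embed.h>

namespace py = pybind11;

int main() {
    py::scoped_interpreter guard{};  // must outlive the running pipeline
    py::dict scope = py::module_::import("__main__").attr("__dict__");

    // Hand the script a C++ callback; Python-derived HostNodes can call it
    // to push frames or other data back to the C++ application.
    scope["on_frame"] = py::cpp_function([](py::object frame) {
        // convert/copy the data for further use on the C++ side
    });

    // The script builds the pipeline, runs it, and calls on_frame(...) as
    // results become available.
    py::eval_file("pipeline_run.py", scope);
    return 0;
}
```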

@thehiddenwaffle
Contributor Author

thehiddenwaffle commented Oct 14, 2025

@moratom I've added both examples; however, the one that uses Python-derived host nodes doesn't work yet, and it seems the synchronization between the state of the Python class and what it needs to represent on the C++ side is not tight enough. Currently it complains that the datatype Python sets does not exist, and other approaches I've tried fail due to similar mismatches between Python and C++ state. Unsure what to do. The first example (the one I developed this feature for) works perfectly.
