
[WIP] Update Jetpack to 4.6.1 #15

Merged
mhejrati merged 5 commits into galliot-us:main from lanthorn-ai:Update-Jetpack-4.6.1
Nov 8, 2021
Conversation

@renzodgc
Contributor

@renzodgc renzodgc commented Oct 5, 2021

Updates the JetPack version in the Dockerfile to 4.6.1, which also bumps the TensorRT version to 8.

This is a work in progress: a model can currently be generated (using trt_exporter.py), but it fails when used in the Pose Detection environment (see galliot-us/adaptive-pose-estimation#8).

The raised error appears to be related to the batch size (even though it is set correctly). I reviewed the TensorRT documentation and tried a few approaches, but to no avail.

python3 inference/inference.py --device jetson --input_video softbio_vid.mp4 --out_dir inference/ --detector_model_path adaptive_object_detection/detectors/data/frozen_inference_graph.bin --label_map train_outputs/label_map.pbtxt --batch_size 2 --pose_model_path models/data/fast_pose_fp16_b2.trt 
2021-10-05 10:00:38.348486: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.2
WARNING:tensorflow:Deprecation warnings have been disabled. Set TF_ENABLE_DEPRECATION_WARNINGS=1 to re-enable them.
opened video  softbio_vid.mp4
WARNING:adaptive_object_detection.detectors.jetson_detector:
WARNING:adaptive_object_detection.detectors.jetson_detector:adaptive_object_detection/detectors/data/frozen_inference_graph.bin
WARNING:adaptive_object_detection.detectors.jetson_detector:
WARNING:root: 
WARNING:root:adaptive_object_detection/detectors/data/frozen_inference_graph.bin
WARNING:root: 
[TensorRT] WARNING: The logger passed into createInferRuntime differs from one already provided for an existing builder, runtime, or refitter. TensorRT maintains only a single logger pointer at any given time, so the existing value, which can be retrieved with getLogger(), will be used instead. In order to use a new logger, first destroy all existing builder, runner or refitter objects.

WARNING:root:started deserialize
WARNING:root:<tensorrt.tensorrt.ICudaEngine object at 0x7f73e99340>
[TensorRT] ERROR: 3: [executionContext.cpp::execute::473] Error Code 3: Internal Error (Parameter check failed at: runtime/api/executionContext.cpp::execute::473, condition: batchSize > 0 && batchSize <= mEngine.getMaxBatchSize()
)
[TensorRT] ERROR: 3: [executionContext.cpp::execute::473] Error Code 3: Internal Error (Parameter check failed at: runtime/api/executionContext.cpp::execute::473, condition: batchSize > 0 && batchSize <= mEngine.getMaxBatchSize()
)
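For context on the error above: in TensorRT 8, engines built with the explicit-batch flag carry the batch size in the binding shapes and report a max batch size of 1, so the legacy `context.execute(batch_size, bindings)` call fails exactly this `batchSize <= mEngine.getMaxBatchSize()` check; the usual fix is to call `execute_v2` (or `execute_async_v2`) instead. Below is a minimal sketch of that dispatch logic; `StubEngine` and `StubContext` are hypothetical stand-ins for `tensorrt.ICudaEngine` / `IExecutionContext` so the control flow can run without a Jetson, and only the attribute and method names (`has_implicit_batch_dimension`, `execute`, `execute_v2`) mirror the real TensorRT Python API.

```python
# Sketch of the TensorRT 7 -> 8 inference-call migration that typically resolves
# "condition: batchSize > 0 && batchSize <= mEngine.getMaxBatchSize()".
# StubEngine/StubContext are hypothetical stand-ins for the real TensorRT classes.

class StubEngine:
    def __init__(self, explicit_batch):
        # Real attribute name: ICudaEngine.has_implicit_batch_dimension
        self.has_implicit_batch_dimension = not explicit_batch
        # Explicit-batch engines report max_batch_size == 1, which is why
        # the legacy execute(batch_size=2, ...) call fails its parameter check.
        self.max_batch_size = 1 if explicit_batch else 8


class StubContext:
    def __init__(self, engine):
        self.engine = engine
        self.calls = []

    def execute(self, batch_size, bindings):
        # Legacy implicit-batch path, mirroring TensorRT's parameter check.
        if not (0 < batch_size <= self.engine.max_batch_size):
            raise RuntimeError("Parameter check failed: batchSize")
        self.calls.append(("execute", batch_size))
        return True

    def execute_v2(self, bindings):
        # TensorRT 8 explicit-batch path: batch size lives in binding shapes.
        self.calls.append(("execute_v2", None))
        return True


def run(context, bindings, batch_size):
    """Dispatch to the execute call matching the engine's batch mode."""
    if context.engine.has_implicit_batch_dimension:
        return context.execute(batch_size, bindings)
    # Explicit batch: the size is already baked into the binding shapes.
    return context.execute_v2(bindings)


engine = StubEngine(explicit_batch=True)
ctx = StubContext(engine)
assert run(ctx, bindings=[], batch_size=2)
assert ctx.calls == [("execute_v2", None)]
```

In the real code the change amounts to replacing `context.execute(batch_size=..., bindings=...)` with `context.execute_v2(bindings=...)` (and sizing the input buffers for the full batch), since TensorRT 8 deprecates the implicit-batch path.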

@mahsa-shams
Contributor

Hi @renzodgc, I solved the issue and sent this PR to you.

@renzodgc
Contributor Author

renzodgc commented Oct 25, 2021

Hi @renzodgc, I solved the issue and sent this PR to you.

Hi @mahsa-shams, awesome! I'll merge it into this PR and add you as a reviewer for my two drafted ones :)

EDIT: It seems I can't add you as a reviewer, but I marked the PRs as ready for review.

fixed jetpack 4.6.1 incompatibility issue
@renzodgc renzodgc marked this pull request as ready for review October 25, 2021 14:54
@mhejrati mhejrati merged commit e015a99 into galliot-us:main Nov 8, 2021
