RidgeRun GstInference support for TensorRT
- Jul 8, 2020
- 1 min read
Updated: Apr 29, 2024
RidgeRun has added a new backend to GstInference, the GStreamer plugin for easy development of deep learning inference applications using GStreamer pipelines: NVIDIA’s TensorRT SDK. TensorRT is built on CUDA, which enables high-performance inference on Jetson embedded platforms and NVIDIA GPU-based workstations. Models trained in other frameworks can also be converted and optimized for TensorRT.
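As a rough sketch of what such a pipeline can look like (element names, pad names, and properties here follow GstInference conventions but are not taken from this post; the camera device and model file are placeholders), a detection pipeline using the TensorRT backend might be launched like this:

```shell
# Hypothetical sketch: TinyYOLOv2 detection with the GstInference TensorRT
# backend. Element/pad names (tinyyolov2, detectionoverlay, sink_model,
# sink_bypass) are assumed from GstInference conventions; device and
# model-location are placeholders to replace with your own.
gst-launch-1.0 \
  v4l2src device=/dev/video0 ! videoconvert ! tee name=t \
  t. ! videoscale ! queue ! net.sink_model \
  t. ! queue ! net.sink_bypass \
  tinyyolov2 name=net backend=tensorrt model-location=graph_tinyyolov2.trt \
  net.src_bypass ! detectionoverlay ! videoconvert ! autovideosink
```

For models trained in another framework, one common route (an assumption here, not something this post prescribes) is to export the model to ONNX and build a TensorRT engine with NVIDIA’s `trtexec` tool, e.g. `trtexec --onnx=model.onnx --saveEngine=model.trt`, then point `model-location` at the resulting engine file.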
Figures 1 & 2. Classification result for the label ‘coffee mug’ (left) and detection results for the labels ‘bottle’ and ‘chair’ (right) on the Jetson TX2 platform.
Check out the following guides from RidgeRun:
Any questions? support@ridgerun.com
Contact Us:
Visit our main website for the RidgeRun online store and for pricing information on RidgeRun products and professional services. Please email support@ridgerun.com with technical questions. Contact details for sponsoring the RidgeRun GStreamer projects are available on the Sponsor Projects page.