RidgeRun GstInference extended support for OpenVINO
GstInference now offers more hardware execution options through support for the OpenVINO backend in ONNX Runtime. GstInference is a GStreamer plugin that allows easy integration of deep learning models into GStreamer pipelines for inference tasks. RidgeRun has extended its functionality to enable hardware acceleration via Intel® Integrated Graphics, Intel® Neural Compute Sticks, or Intel® Arria® 10 FPGAs. This feature enables seamless integration of the hardware acceleration offered by OpenVINO into a GStreamer media processing pipeline on IoT edge devices, such as the Raspberry Pi or other ARM-based platforms.
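As a rough sketch of what such a pipeline could look like, the command below runs a detection model on a webcam feed with GstInference and selects ONNX Runtime as the inference backend (which can in turn dispatch to OpenVINO). The element names, property names, and model file are illustrative assumptions; consult the GstInference documentation for the exact syntax supported by your version.

```shell
# Hypothetical sketch, not a verified command: element and property
# names (tinyyolov2, backend, model-location, inferenceoverlay) are
# assumptions based on typical GstInference usage.
gst-launch-1.0 \
  v4l2src device=/dev/video0 ! videoconvert ! \
  tinyyolov2 name=net backend=onnxrt model-location=tinyyolov2.onnx ! \
  inferenceoverlay ! videoconvert ! autovideosink
```

With a layout like this, swapping hardware targets (CPU, integrated GPU, Neural Compute Stick, or FPGA) is a matter of backend configuration rather than pipeline restructuring, which is the main appeal of routing inference through ONNX Runtime's OpenVINO support.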
Figure 1. Intel® Neural Compute Stick 2 attached to a Google Coral dev board.
Check out the following guides from RidgeRun:
Any questions? Email firstname.lastname@example.org
Visit our Main Website for the RidgeRun online store and pricing information on RidgeRun products and Professional Services. For technical questions, please email email@example.com. Contact details for sponsoring the RidgeRun GStreamer projects are available on the Sponsor Projects page.