RidgeRun’s GstInference support for Google’s EdgeTPU
Updated: Jun 16
Explore the new EdgeTPU backend for RidgeRun's GstInference project, developed on Google's Coral Dev Board. This new feature enables inference of quantized TensorFlow Lite models on the Edge TPU, Google's Tensor Processing Unit machine-learning accelerator.
Check out the following guides from RidgeRun:
GstInference EdgeTPU support: https://developer.ridgerun.com/wiki/index.php?title=GstInference/Supported_backends/EdgeTPU
GstInference benchmarks: https://developer.ridgerun.com/wiki/index.php?title=GstInference/Benchmarks#CPU_usage_measurement
R2Inference EdgeTPU backend: https://developer.ridgerun.com/wiki/index.php?title=R2Inference/Supported_backends/EdgeTPU
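As a rough illustration of how the new backend is selected, a GstInference pipeline can be pointed at the Edge TPU through the element's backend property. The sketch below is an assumption based on GstInference's usual pipeline shape (a `tee` feeding the model and bypass pads of an inference element such as `mobilenetv2`); the exact element names, pad names, and model file are placeholders, so consult the GstInference EdgeTPU wiki page above for the supported properties on your release.

```shell
# Hypothetical sketch: run a quantized MobileNetV2 TensorFlow Lite model
# on the Edge TPU from a camera source. Model path and device are placeholders.
gst-launch-1.0 \
  v4l2src device=/dev/video0 ! videoconvert ! tee name=t \
  t. ! queue ! videoscale ! net.sink_model \
  t. ! queue ! net.sink_bypass \
  mobilenetv2 name=net backend=edgetpu \
    model-location=graph_mobilenetv2_quant_edgetpu.tflite \
  net.src_bypass ! videoconvert ! autovideosink
```

Note that the model must be a quantized `.tflite` file compiled for the Edge TPU; a plain TensorFlow Lite model will not run on the accelerator.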
Google Coral Dev Board:
Any questions? firstname.lastname@example.org
Visit our main website for the RidgeRun online store, where you will find pricing information on RidgeRun products and professional services. Please email email@example.com for technical questions. Contact details for sponsoring the RidgeRun GStreamer projects are available on the Sponsor Projects page.