Coral from Google - GstInference - Why use GstInference?

From RidgeRun Developer Connection



Previous: GstInference/Introduction Index Next: GstInference/Example Pipelines





GstInference maintains the modularity, flexibility, and scalability of the GStreamer pipeline architecture by encapsulating the inference logic in a dedicated processing element. To demonstrate how this approach eases the problems described in the introduction, consider the following image.

[Image: GStreamer pipeline with the inference performed in-pipeline by Detector and Classifier elements]


Note how the inference is performed within the pipeline, honoring the pipeline architecture. This allows the graph to be easily extended if desired. The processing elements, marked as Detector and Classifier respectively, encapsulate the underlying machine learning framework, which keeps the business logic stable and spares developers from learning these APIs. Finally, the application/pipeline data roundtrip is eliminated, leaving a one-directional flow in which the business logic simply reacts to the predictions.
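
As a minimal sketch of this in-pipeline approach, the C program below builds such a graph with gst_parse_launch and lets GStreamer run it; the application never touches raw buffers or the TensorFlow Lite API. The GstInference element, pad, and property names used in the pipeline string (tinyyolov2, sink_model/sink_bypass, model-location, backend=edgetpu, inferenceoverlay) are assumptions based on typical GstInference usage and should be verified against GstInference/Example Pipelines.

  /* Minimal sketch: build and run a GStreamer graph that performs the
   * detection inside the pipeline. The GstInference element, pad, and
   * property names in the pipeline string are assumptions; verify them
   * against GstInference/Example Pipelines. */
  #include <gst/gst.h>

  int
  main (int argc, char *argv[])
  {
    GstElement *pipeline;
    GstBus *bus;
    GstMessage *msg;
    GError *error = NULL;

    gst_init (&argc, &argv);

    /* The detector and the overlay run as pipeline elements: raw video
     * buffers never leave GStreamer, and the application never calls the
     * TensorFlow Lite API directly. */
    pipeline = gst_parse_launch (
        "v4l2src ! videoconvert ! tee name=t "
        "t. ! videoscale ! queue ! net.sink_model "
        "t. ! queue ! net.sink_bypass "
        "tinyyolov2 name=net model-location=tinyyolov2_edgetpu.tflite backend=edgetpu "
        "net.src_bypass ! videoconvert ! inferenceoverlay ! videoconvert ! autovideosink",
        &error);
    if (error != NULL) {
      g_printerr ("Failed to build pipeline: %s\n", error->message);
      g_clear_error (&error);
      return 1;
    }

    gst_element_set_state (pipeline, GST_STATE_PLAYING);

    /* Block until an error occurs or the stream ends. */
    bus = gst_element_get_bus (pipeline);
    msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
        GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
    if (msg != NULL)
      gst_message_unref (msg);

    gst_object_unref (bus);
    gst_element_set_state (pipeline, GST_STATE_NULL);
    gst_object_unref (pipeline);

    return 0;
  }

Because gst_parse_launch accepts the same syntax as gst-launch-1.0, the same pipeline description can be prototyped first on the command line and then moved into application code unchanged.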

By using GstInference as the interface between Coral and GStreamer, users can:

  • Easily prototype GStreamer pipelines with common and basic GStreamer tools such as gst-launch and GStreamer Daemon.
  • Easily test and benchmark TFLite models using GStreamer with Coral.
  • Use Coral with video feeds from cameras, video files, and network streams, and process the prediction information (detection, classification, estimation, segmentation) to monitor events and trigger actions (see the sketch after this list).
  • Develop intelligent media servers with recording, streaming, capture, playback, and display features.
  • Abstract GStreamer complexity in terms of buffer and event handling.
  • Abstract TensorFlow Lite complexity and configuration.
  • Make use of GstInference helper elements and API to visualize predictions and extract readable prediction information.
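
As a rough sketch of how an application could monitor events and trigger actions on predictions (referenced in the list above), the program below attaches a callback to the inference element and reacts each time a new prediction is produced, without ever handling video buffers. The signal name ("new-prediction"), its callback signature, and the GstInference element and property names are assumptions made for illustration; the exact API is described in the GstInference documentation.

  /* Rough sketch: react to predictions from the application. The signal
   * name ("new-prediction") and the callback arguments below are
   * assumptions made for illustration; check the GstInference API
   * reference for the actual signal and signature. */
  #include <gst/gst.h>

  static void
  on_new_prediction (GstElement *element, gpointer prediction, gpointer user_data)
  {
    /* Business logic goes here: log the prediction, raise an alert,
     * start a recording, etc. No raw video buffers are handled. */
    g_print ("New prediction from %s\n", GST_ELEMENT_NAME (element));
  }

  int
  main (int argc, char *argv[])
  {
    GstElement *pipeline, *net;
    GMainLoop *loop;
    GError *error = NULL;

    gst_init (&argc, &argv);

    /* Headless monitoring pipeline: same assumed GstInference element and
     * property names as in the previous sketch. */
    pipeline = gst_parse_launch (
        "v4l2src ! videoconvert ! tee name=t "
        "t. ! videoscale ! queue ! net.sink_model "
        "t. ! queue ! net.sink_bypass "
        "tinyyolov2 name=net model-location=tinyyolov2_edgetpu.tflite backend=edgetpu "
        "net.src_bypass ! fakesink sync=false",
        &error);
    if (error != NULL) {
      g_printerr ("Failed to build pipeline: %s\n", error->message);
      g_clear_error (&error);
      return 1;
    }

    net = gst_bin_get_by_name (GST_BIN (pipeline), "net");
    g_signal_connect (net, "new-prediction", G_CALLBACK (on_new_prediction), NULL);
    gst_object_unref (net);

    gst_element_set_state (pipeline, GST_STATE_PLAYING);

    loop = g_main_loop_new (NULL, FALSE);
    g_main_loop_run (loop);

    g_main_loop_unref (loop);
    gst_element_set_state (pipeline, GST_STATE_NULL);
    gst_object_unref (pipeline);

    return 0;
  }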

RidgeRun Support

GstInference is an open-source project created and fully maintained by RidgeRun. By integrating GstInference as one of the main interfaces to Coral, customers can:

  1. Take advantage of the ongoing support provided by RidgeRun engineers and other GStreamer community experts, and benefit from new features, bug fixes, and the integration of new networks or models.
  2. Get access to pipelines, examples, and demos that exercise the Coral board's potential and GStreamer's capabilities.
  3. Get access to benchmarking information showing Coral’s Edge TPU performance using GStreamer.
  4. Contact RidgeRun for professional services in case custom features need to be added, or help is needed to integrate the GStreamer pipeline into a full-fledged GStreamer application ready for production systems.


Previous: GstInference/Introduction Index Next: GstInference/Example Pipelines