Coral from Google/GstInference/Introduction

== GstInference Description ==
Deep Learning (DL) has revolutionized classic computer vision techniques, enabling even more intelligent and autonomous systems. A multimedia framework such as GStreamer eases the software development burden for complex embedded visual Deep Learning applications. The open-source GStreamer audio/video streaming framework is a good choice because it separates the complexities of handling streaming video from the inference models that process the individual frames. GstInference is an open-source GStreamer project sponsored by RidgeRun that allows easy integration of deep learning networks into your video streaming application.
  
Please check our developer wiki for more information about the project:
[[File:Coral example.png|thumb|center|600px|Example of GstInference in use on the Coral. The model used was TinyYolo V2. Video taken from: https://pixabay.com/videos/road-autobahn-motorway-highway-11018/]]
  
* [https://developer.ridgerun.com/wiki/index.php?title=GstInference GstInference]
[[GstInference|GstInference]] and [[R2Inference|R2Inference]] are supported on the Coral. To install them, follow these guides (a condensed sketch of the build flow is shown after the list):
* [[R2Inference/Getting_started/Getting_the_code|Getting R2Inference code]]
* [[R2Inference/Getting_started/Building_the_library|Building R2Inference]]: the Google Coral is optimized for TensorFlow Lite. To install TFLite for ARM64, follow this [https://developer.ridgerun.com/wiki/index.php?title=R2Inference/Supported_backends/TensorFlow-Lite#Cross-compile_for_ARM64 guide].
* [[GstInference/Getting_started/Getting_the_code|Getting GstInference code]]
* [[GstInference/Getting_started/Building_the_plugin|Building GstInference]]
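For orientation, the overall flow condenses roughly to the sketch below. This is a minimal sketch only: it assumes the RidgeRun GitHub repositories and a Meson-based build, and it omits the backend-specific options (such as enabling TensorFlow Lite support), so follow the guides above for the exact commands and versions for your board.

<pre>
# Minimal sketch, not the authoritative procedure: repository URLs and the
# Meson-based build flow are assumptions; see the guides above for details.

# R2Inference: the inference abstraction library used by GstInference
git clone https://github.com/RidgeRun/r2inference.git
cd r2inference
meson build        # enable the TensorFlow Lite backend as described in the building guide
ninja -C build
sudo ninja -C build install
cd ..

# GstInference: the GStreamer plugin built on top of R2Inference
git clone https://github.com/RidgeRun/gst-inference.git
cd gst-inference
meson build
ninja -C build
sudo ninja -C build install

# Quick check that the inference elements are visible to GStreamer
gst-inspect-1.0 tinyyolov2
</pre>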
After the installation is complete, you can generate GStreamer pipelines for the Coral using the different GstInference elements with this [[GstInference/Example_pipelines_with_hierarchical_metadata|pipeline generation tool]]; a sample pipeline is sketched below.
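As a rough idea of what such a pipeline looks like, the sketch below runs a TinyYolo V2 detection model over a video file and overlays the detections. The element, pad, and property names (tinyyolov2, sink_model/sink_bypass, model-location, backend, inferenceoverlay) follow the GstInference examples, but the input video and model file are placeholders and the exact properties depend on your GstInference/R2Inference versions, so generate the pipeline for your setup with the tool linked above.

<pre>
# Illustrative only: road.mp4 and graph_tinyyolov2.tflite are placeholder files,
# and property names may differ between GstInference releases.
gst-launch-1.0 \
  filesrc location=road.mp4 ! decodebin ! videoconvert ! tee name=t \
  t. ! videoscale ! queue ! net.sink_model \
  t. ! queue ! net.sink_bypass \
  tinyyolov2 name=net model-location=graph_tinyyolov2.tflite backend=tflite \
  net.src_bypass ! inferenceoverlay ! videoconvert ! autovideosink sync=false
</pre>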
  
  
