GstWebRTC - GstWebRTC Basics

From RidgeRun Developer Connection
== Architecture ==
 
 
=== Direction ===
 
==== Unidirectional Elements ====
 
RidgeRun's GstWebRTCSink and GstWebRTCSrc are used as standard GStreamer sink and source elements, respectively. If a pipeline uses the GstWebRTCSink element, it becomes a send-only endpoint, as shown in Figure 1; conversely, a pipeline using GstWebRTCSrc becomes a receive-only endpoint.
 
<br />
 
[[File:Gstwebrtc-sendonly.png|600px|thumb|center|Figure 1. Pipeline as a WebRTC send-only endpoint.]]
 
<br />
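For illustration, a send-only video endpoint could be sketched with gst-launch as below. Note that the element factory name (assumed here to be <code>webrtcsink</code>) and any signaling properties are placeholders, not names confirmed by this page; refer to RidgeRun's plug-in documentation for the actual options.

```
# Hypothetical send-only pipeline: encode a test pattern as H.264
# and hand it to the WebRTC sink. "webrtcsink" is an assumed
# factory name; real usage also requires signaling configuration.
gst-launch-1.0 videotestsrc ! x264enc ! rtph264pay ! webrtcsink
```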
 
 
==== Bidirectional Element ====
 
RidgeRun's GstWebRTCBin can be used as a send-receive endpoint, as shown in Figure 3.
 
<br />
 
[[File:Gstwebrtc-sendreceive.png|400px|thumb|center|Figure 3. Pipeline as a WebRTC send-receive endpoint.]]
 
<br />
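A bidirectional endpoint built around GstWebRTCBin could be sketched as follows; the factory name (<code>webrtcbin</code>) and the pad name (<code>video_sink</code>) are assumptions for illustration only and may differ in RidgeRun's actual plug-in.

```
# Hypothetical send-receive pipeline: send an H.264 test stream
# while decoding and displaying the incoming remote stream.
# Factory and pad names are placeholders, not confirmed APIs.
gst-launch-1.0 \
  videotestsrc ! x264enc ! rtph264pay ! bin.video_sink \
  webrtcbin name=bin \
  bin. ! rtph264depay ! avdec_h264 ! autovideosink
```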
 
=== Media Type ===
 
Both the sink and source elements may send or receive audio, video, or both simultaneously. The supported capabilities are determined at runtime based on the pads that were requested from the elements. Simply put, if a GstWebRTCSink element was created with a single audio pad, it will only be capable of sending audio. Similarly, if a GstWebRTCSrc was created with video and audio pads, it will be capable of receiving both media types. Table 1 summarizes the complete set of possible configurations. At the moment, only one pad of each media type can be created.
 
 
<html>
 
<center>
 
<table class="wikitable">
 
  <tr>
 
    <th>Element</th>
 
    <th>Configuration</th>
 
    <th>Graphical Description</th>
 
  </tr>
 
  <tr>
 
    <td rowspan="3"> GstWebRTCSink</td>
 
    <td>Audio Only</td>
 
    <td><br /><img src="http://developer.ridgerun.com/wiki/images/e/ef/Gstwebrtc-audio.png" width="600" /><br /></td>
 
  </tr>
 
<tr>
 
  <td>Video Only</td>
 
  <td><br /><img src="http://developer.ridgerun.com/wiki/images/2/21/Gstwebrtc-video.png" width="600" /><br /></td>
 
</tr>
 
<tr>
 
  <td>Audio+Video</td>
 
  <td><br /><img src="http://developer.ridgerun.com/wiki/images/a/a6/Gstwebrtc-audiovideo.png" width="600" /><br /></td>
 
</tr>
 
<tr>
 
  <td rowspan="3"> GstWebRTCSrc</td>
 
  <td>Audio Only</td>
 
  <td><br /><img src="http://developer.ridgerun.com/wiki/images/8/89/Gstwebrtcsrc-audio.png" width="600" /><br /></td>
 
</tr>
 
<tr>
 
  <td>Video Only</td>
 
  <td><br /><img src="http://developer.ridgerun.com/wiki/images/6/69/Gstwebrtcsrc-video.png" width="600" /><br /></td>
 
</tr>
 
<tr>
 
  <td>Audio+Video</td>
 
  <td><br /><img src="http://developer.ridgerun.com/wiki/images/e/e2/Gstwebrtcsrc-audiovideo.png" width="600" /><br /></td>
 
</tr>
 
<caption>Table 1. Supported media configurations for the GstWebRTC elements.</caption>
 
</table>
 
</center>
 
</html>
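Since an element's capabilities follow from the pads requested when it is created, an audio+video sender corresponds to requesting one pad of each media type. A hedged gst-launch sketch of that configuration (the factory name <code>webrtcsink</code> and the pad names <code>audio</code>/<code>video</code> are assumptions, not names confirmed by this page):

```
# Hypothetical audio+video sender: one audio and one video pad
# are requested on the sink, so the endpoint can send both media
# types. Element and pad names are placeholders.
gst-launch-1.0 \
  audiotestsrc ! opusenc ! rtpopuspay ! sink.audio \
  videotestsrc ! x264enc ! rtph264pay ! sink.video \
  webrtcsink name=sink
```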
 
 

Revision as of 12:59, 7 July 2017





This page describes the basic features of RidgeRun's GstWebRTC GStreamer plug-in.

What is GstWebRTC?

GstWebRTC is a GStreamer plug-in that turns pipelines into WebRTC-compliant endpoints. The plug-in provides three elements:

  • GstWebRTCSrc
  • GstWebRTCSink
  • GstWebRTCBin

These elements allow audio and/or video streaming using the WebRTC protocol.
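Once the plug-in is installed, the registered elements can be verified with gst-inspect-1.0. The plug-in and factory names used below are assumptions for illustration; use the names from RidgeRun's installation instructions.

```
# List the elements the plug-in registers
# ("gstwebrtc" is an assumed plug-in name).
gst-inspect-1.0 gstwebrtc

# Inspect a single element, e.g. the sink
# ("webrtcsink" is an assumed factory name).
gst-inspect-1.0 webrtcsink
```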


Why GstWebRTC?

Other WebRTC solutions automatically detect the video and audio sources, as well as the decoders/encoders and other elements used to build the pipeline. This may be convenient for many applications, but it proves limiting for several other use cases, such as:

  • Extending an existing pipeline to support WebRTC streaming
  • Using non-standard pipeline configurations
  • High-performance pipeline tuning for resource-critical systems
  • Dynamic stream handling in a running pipeline
  • Fine-grained pipeline control
  • Quick gst-launch prototyping

GstWebRTC was developed with these criteria in mind. As such, the plug-in is ideal for:

  • Embedded platforms
  • Existing media servers/applications
  • Advanced multimedia solutions



