GstWebRTC - GstWebRTC Basics

What is GstWebRTC?

GstWebRTC is a GStreamer plug-in that turns pipelines into WebRTC-compliant endpoints. The plug-in provides three elements:

  • GstWebRTCSrc
  • GstWebRTCSink
  • GstWebRTCBin

These elements allow audio and/or video streaming using the WebRTC protocol.
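
As a quick sanity check, the following minimal C sketch creates the sink and source elements through the standard GStreamer factory API. The lowercase factory names "webrtcsink" and "webrtcsrc" are assumptions made for illustration; use gst-inspect-1.0 to list the exact names registered by your GstWebRTC installation.

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *sink, *src;

  gst_init (&argc, &argv);

  /* Factory names are assumed here; verify them with gst-inspect-1.0 */
  sink = gst_element_factory_make ("webrtcsink", "webrtc-sender");
  src = gst_element_factory_make ("webrtcsrc", "webrtc-receiver");

  if (!sink || !src) {
    g_printerr ("GstWebRTC elements not found. Is the plug-in installed?\n");
    return -1;
  }

  g_print ("GstWebRTC elements created successfully\n");

  gst_object_unref (sink);
  gst_object_unref (src);

  return 0;
}

If both elements can be created, the plug-in is installed and ready for the pipeline configurations described in the following sections.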


Why GstWebRTC?

Other WebRTC solutions automatically detect the video and audio sources, as well as the decoders/encoders and other elements used to build the pipeline. This may be convenient for many applications, but it becomes limiting for several other use cases, among them:

  • Extending an existing pipeline to support WebRTC streaming
  • Using non-standard pipeline configurations
  • High-performance pipeline tuning for resource-critical systems
  • Dynamic stream handling in a running pipeline
  • Fine-grained pipeline control
  • Quick gst-launch prototyping

GstWebRTC was developed with these criteria in mind. As such, the plug-in is ideal for:

  • Embedded platforms
  • Existing media servers/applications
  • Advanced multimedia solutions

Architecture

Direction

RidgeRun's GstWebRTCSink and GstWebRTCSrc are used as standard GStreamer sink and source elements, respectively. If a pipeline only uses the sink element, it becomes a send-only endpoint. Similarly, if a pipeline only uses the source element, it becomes a receive-only endpoint. Finally, by using both elements the pipeline behaves as a bidirectional endpoint. Figures 1, 2 and 3 show these scenarios, respectively.

Figure 1. Pipeline as a WebRTC send-only endpoint.
Figure 2. Pipeline as a WebRTC receive-only endpoint.
Figure 3. Pipeline as a WebRTC send-receive endpoint.
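
The C sketch below illustrates the send-only case by building a pipeline that contains only the sink element. The pipeline description is an assumption for illustration (the "webrtcsink" factory name and the audio encoding stage may differ in your setup), and a real session would additionally require configuring the element's signaling, which is omitted here.

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline;
  GError *error = NULL;

  gst_init (&argc, &argv);

  /* Send-only endpoint: only the sink element is present, so the
   * pipeline can transmit media but never receive it. The element
   * and encoder names are illustrative assumptions. */
  pipeline = gst_parse_launch (
      "audiotestsrc is-live=true ! audioconvert ! opusenc ! webrtcsink",
      &error);

  if (error != NULL) {
    g_printerr ("Failed to build pipeline: %s\n", error->message);
    g_clear_error (&error);
    return -1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_usleep (10 * G_USEC_PER_SEC);   /* stream for ten seconds */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);

  return 0;
}

A receive-only endpoint would be built the same way using only the source element, and a bidirectional endpoint by placing both elements in the same pipeline.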

Media Type

Both the sink and source elements may send/receive audio, video, or both simultaneously. The supported capabilities are determined at runtime based on the pads that were requested from the elements. Simply put, if a GstWebRTCSink element was created with a single audio pad, then it will only be capable of sending audio. Similarly, if a GstWebRTCSrc was created with both video and audio pads, it will be capable of receiving both media types. Table 1 summarizes the complete set of possible configurations. For the time being, only one pad of each media type can be created.

Element         Configuration   Graphical Description
GstWebRTCSink   Audio Only      Pipeline diagram: Gstwebrtc-audio.png
GstWebRTCSink   Video Only      Pipeline diagram: Gstwebrtc-video.png
GstWebRTCSink   Audio+Video     Pipeline diagram: Gstwebrtc-audiovideo.png
GstWebRTCSrc    Audio Only      Pipeline diagram: Gstwebrtcsrc-audio.png
GstWebRTCSrc    Video Only      Pipeline diagram: Gstwebrtcsrc-video.png
GstWebRTCSrc    Audio+Video     Pipeline diagram: Gstwebrtcsrc-audiovideo.png
Table 1. Supported media configurations for GstWebRTCSink and GstWebRTCSrc.
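
Since the capabilities are fixed by the pads requested at creation time, an application selects a row of Table 1 simply by choosing which pads to request. The sketch below requests one audio and one video pad from a sink element, yielding the Audio+Video configuration. The factory name "webrtcsink" and the pad template names "audio_sink" and "video_sink" are assumptions; check the actual templates with gst-inspect-1.0.

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *sink;
  GstPad *audio_pad, *video_pad;

  gst_init (&argc, &argv);

  /* Factory name assumed; verify with gst-inspect-1.0 */
  sink = gst_element_factory_make ("webrtcsink", NULL);
  if (sink == NULL) {
    g_printerr ("GstWebRTC sink element not found\n");
    return -1;
  }

  /* Requesting one audio and one video pad makes this an
   * Audio+Video endpoint (pad template names are assumed). */
  audio_pad = gst_element_get_request_pad (sink, "audio_sink");
  video_pad = gst_element_get_request_pad (sink, "video_sink");

  g_print ("audio pad: %s, video pad: %s\n",
      audio_pad ? GST_PAD_NAME (audio_pad) : "not available",
      video_pad ? GST_PAD_NAME (video_pad) : "not available");

  /* Release the request pads before disposing of the element */
  if (audio_pad) {
    gst_element_release_request_pad (sink, audio_pad);
    gst_object_unref (audio_pad);
  }
  if (video_pad) {
    gst_element_release_request_pad (sink, video_pad);
    gst_object_unref (video_pad);
  }
  gst_object_unref (sink);

  return 0;
}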


