GstWebRTC - OpenWebRTC Signaling Examples - x86

From RidgeRun Developer Connection
{{GstWebRTC Page|
[[GstWebRTC - GstWebRTC Basics|GstWebRTC Basics]]|
 
[[GstWebRTC - GstWebRTC Pipelines |GstWebRTC Pipelines]]|
 
  
This page presents some examples of GstWebRTC using OpenWebRTC's signaler on the x86 platform.
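All of the pipelines on this page share the same signaler settings (server URL and session id) and differ mainly in the start-call property and the media branches. As a minimal sketch, the common part can be factored into a small shell function; the <code>signaler_args</code> helper is hypothetical (not part of GstRrWebRTC), and the command is printed rather than executed so it can be reviewed first:

```shell
#!/bin/sh
# Shared signaler settings used by every example on this page.
SERVER="http://webrtc.ridgerun.com:8080"
SESSION="1234ridgerun"

# Hypothetical helper: build the properties common to webrtcsink,
# webrtcsrc and webrtcbin. $1 is "true" for the peer that starts the
# call and "false" for the peer that joins it.
signaler_args() {
  printf 'start-call=%s signaler::server_url=%s signaler::session_id=%s' \
    "$1" "$SERVER" "$SESSION"
}

# Assemble the Opus audio sender from the first example below; echo it
# so the full command can be inspected (copy/paste it to run).
CMD="gst-launch-1.0 webrtcsink $(signaler_args true) name=web \
audiotestsrc is-live=true wave=8 ! audioconvert ! audioresample ! \
queue ! opusenc ! rtpopuspay ! web.audio"
echo "$CMD"
```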
 
==Only Audio==
 
===Opus===
 
====Unidirectional====
 
In this example, we use webrtcsink to send an audio stream and webrtcsrc to receive it.
 
  
The following pipeline will send periodic ticks:
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcsink start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web audiotestsrc is-live=true wave=8 ! audioconvert ! audioresample \
! queue ! opusenc ! rtpopuspay ! web.audio
</syntaxhighlight>
The following pipeline will receive the periodic ticks:
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcsrc start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web web.audio ! rtpopusdepay ! opusdec ! audioconvert ! \
alsasink async=false
</syntaxhighlight>
 
 
 
When executing the two previous pipelines, you should be able to hear the ticks on the receiving computer.
 
 
 
====Bidirectional====
 
In this example, we use two webrtcbin elements; each sends its own audio stream and receives the other's.
 
 
 
The following pipeline will send a white noise audio stream and receive the ticks audio stream sent by the next pipeline. This pipeline starts the call.
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcbin start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web audiotestsrc is-live=true wave=5 ! audioconvert ! \
audioresample ! queue ! opusenc ! rtpopuspay ! web.audio_sink web.audio_src ! rtpopusdepay ! \
opusdec ! audioconvert ! alsasink sync=false async=false
</syntaxhighlight>
 
 
 
The following pipeline will send a ticks audio stream and receive the white noise audio stream sent by the previous pipeline. This pipeline joins the call.
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcbin start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web audiotestsrc is-live=true wave=8 ! audioconvert ! \
audioresample ! queue ! opusenc ! rtpopuspay ! web.audio_sink web.audio_src ! rtpopusdepay ! \
opusdec ! audioconvert ! alsasink sync=false async=false
</syntaxhighlight>
 
 
 
When executing the two previous pipelines, you should be able to hear both the ticks and the white noise.
 
 
 
==Only Video==
 
===H264===
 
====Unidirectional====
 
In this example, we use webrtcsink to send a video stream and webrtcsrc to receive it.
 
 
 
The following pipeline will send a color bars h264 video stream:
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcsink start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
x264enc key-int-max=2 ! rtph264pay ! queue ! web.video
</syntaxhighlight>
 
 
 
The following pipeline will receive the video stream and display it:
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcsrc start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web web.video ! rtph264depay ! avdec_h264 ! videoconvert \
! ximagesink async=true
</syntaxhighlight>
 
 
 
You should be able to see a video pattern similar to Fig.1.
 
[[File:ColorbarsWebRTC.png|thumbnail|center|Fig.1 Snapshot of video received]]
 
 
 
====Bidirectional====
 
In this example, we use two webrtcbin elements; each sends its own video stream and receives the other's.
 
 
 
The following pipeline starts the call:
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcbin start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
x264enc key-int-max=2 ! rtph264pay ! queue ! identity silent=false ! web.video_sink web.video_src \
! rtph264depay ! avdec_h264 ! videoconvert ! ximagesink async=true
</syntaxhighlight>
 
 
 
The following pipeline joins the call:
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcbin start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
x264enc key-int-max=2 ! rtph264pay ! queue ! identity silent=false ! web.video_sink web.video_src \
! rtph264depay ! avdec_h264 ! videoconvert ! ximagesink async=true
</syntaxhighlight>
 
 
 
You should be able to see two windows with a video pattern similar to Fig.1.
 
 
 
===Vp8===
 
====Unidirectional====
 
In this example, we use webrtcsink to send a video stream and webrtcsrc to receive it.
 
 
 
The following pipeline will send a color bars vp8 video stream:
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcsink start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
vp8enc ! rtpvp8pay ! queue ! identity ! web.video
</syntaxhighlight>
 
 
 
The following pipeline will receive the video stream and display it:
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcsrc start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web web.video ! rtpvp8depay ! vp8dec ! videoconvert ! \
ximagesink async=true
</syntaxhighlight>
 
You should be able to see a video pattern similar to Fig.1.
 
 
 
 
 
====Bidirectional====
 
In this example, we use two webrtcbin elements; each sends its own video stream and receives the other's.
 
 
 
The following pipeline starts the call:
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcbin start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
vp8enc ! rtpvp8pay ! queue ! identity silent=false ! web.video_sink web.video_src ! rtpvp8depay ! \
vp8dec ! videoconvert ! ximagesink async=true
</syntaxhighlight>
 
 
 
The following pipeline joins the call:
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcbin start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
vp8enc ! rtpvp8pay ! queue ! identity silent=false ! web.video_sink web.video_src ! rtpvp8depay \
! vp8dec ! videoconvert ! ximagesink async=true
</syntaxhighlight>
 
You should be able to see two windows with a video pattern similar to Fig.1.
 
 
 
==Audio and Video==
 
===Opus + H264===
 
====Unidirectional====
 
In this example, we use webrtcsink to send video and audio streams and webrtcsrc to receive them.
 
 
 
The following pipeline will send periodic ticks and a video stream:
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcsink start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! x264enc key-int-max=2 ! \
rtph264pay ! web.video audiotestsrc is-live=true wave=8 ! audioconvert \
! audioresample ! queue ! opusenc ! rtpopuspay ! web.audio
</syntaxhighlight>
 
 
 
 
 
The following pipeline will receive the video stream and the ticks and play them back:
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcsrc start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web web.video ! rtph264depay ! avdec_h264 ! videoconvert ! \
ximagesink async=false web.audio ! rtpopusdepay ! opusdec ! audioconvert ! alsasink async=false
</syntaxhighlight>
 
 
 
When executing the two previous pipelines, you should be able to hear the ticks on the receiving computer and see a video pattern similar to Fig.1.
 
 
 
====Bidirectional====
 
In this example, we use two webrtcbin elements; each sends its own video and audio streams and receives the other's.
 
 
 
The following pipeline starts a call, sends a white noise audio stream and a color bars video stream, and receives audio and video streams.
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcbin start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
x264enc key-int-max=2 ! rtph264pay ! queue ! identity silent=false ! web.video_sink web.video_src \
! rtph264depay ! avdec_h264 ! videoconvert ! ximagesink async=true audiotestsrc is-live=true wave=5 \
! audioconvert ! audioresample ! queue ! opusenc ! rtpopuspay ! web.audio_sink web.audio_src ! \
rtpopusdepay ! opusdec ! audioconvert ! alsasink async=false
</syntaxhighlight>
 
 
 
The following pipeline joins the call, sends a ticks audio stream and a color bars video stream, and receives audio and video streams.
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcbin start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
x264enc key-int-max=2 ! rtph264pay ! queue ! identity silent=false ! web.video_sink web.video_src \
! rtph264depay ! avdec_h264 ! videoconvert ! ximagesink async=true web.audio_src ! rtpopusdepay ! \
opusdec ! audioconvert ! alsasink async=false audiotestsrc is-live=true wave=8 ! audioconvert ! \
audioresample ! queue ! opusenc ! rtpopuspay ! web.audio_sink
</syntaxhighlight>
 
When executing the two previous pipelines, you should be able to hear the ticks and white noise audio streams and see two windows with video patterns similar to Fig.1.
 
 
 
In this example, we use two webrtcbin elements: one sends an audio stream and receives a video stream, while the other sends a video stream and receives an audio stream.
 
 
 
The following pipeline sends an audio stream and receives a video stream; it also starts the call.
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcbin start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web web.video_src ! rtph264depay ! avdec_h264 ! \
videoconvert ! ximagesink async=true audiotestsrc is-live=true wave=8 \
! audioconvert ! audioresample ! queue ! opusenc ! rtpopuspay ! web.audio_sink
</syntaxhighlight>
 
 
 
The following pipeline sends a video stream and receives an audio stream; it also joins the call.
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcbin start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
x264enc key-int-max=2 ! rtph264pay ! queue ! identity silent=false ! web.video_sink \
web.audio_src ! rtpopusdepay ! opusdec ! audioconvert ! alsasink async=false
</syntaxhighlight>
 
When executing the two previous pipelines, you should be able to hear the ticks audio stream and see a video pattern similar to Fig.1.
 
 
 
===Opus + Vp8===
 
====Unidirectional====
 
In this example, we use webrtcsink to send video and audio streams and webrtcsrc to receive them.
 
 
 
The following pipeline generates a color bars video stream and a ticks audio stream, and starts the call:
 
 
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcsink start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
vp8enc ! rtpvp8pay ! queue ! identity ! web.video audiotestsrc is-live=true wave=8 ! audioconvert \
! audioresample ! queue ! opusenc ! rtpopuspay ! web.audio
</syntaxhighlight>
 
 
 
The following pipeline receives the video and audio streams and plays them back:
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcsrc start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web web.audio ! rtpopusdepay ! opusdec ! audioconvert ! \
queue ! alsasink async=false web.video ! rtpvp8depay ! vp8dec ! videoconvert ! ximagesink async=true
</syntaxhighlight>
 
 
 
After executing the two previous pipelines you should be able to see a window with a pattern like Fig.1, and hear the ticks audio stream.
 
====Bidirectional====
 
In this example, we use two webrtcbin elements; each sends its own video and audio streams and receives the other's.
 
The following pipeline starts a call, sends a ticks audio stream and a color bars video stream, and receives audio and video streams.
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcbin start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
vp8enc ! rtpvp8pay ! queue ! identity silent=false ! web.video_sink web.video_src ! rtpvp8depay \
! vp8dec ! videoconvert ! ximagesink async=true web.audio_src ! rtpopusdepay ! opusdec ! \
audioconvert ! alsasink async=false audiotestsrc is-live=true wave=8 ! audioconvert ! audioresample \
! queue ! opusenc ! rtpopuspay ! web.audio_sink
</syntaxhighlight>
 
 
 
The following pipeline joins the call, sends a white noise audio stream and a color bars video stream, and receives audio and video streams.
 
 
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcbin start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
vp8enc ! rtpvp8pay ! queue ! identity silent=false ! web.video_sink web.video_src ! rtpvp8depay \
! vp8dec ! videoconvert ! ximagesink async=true web.audio_src ! rtpopusdepay ! opusdec ! audioconvert \
! alsasink async=false audiotestsrc is-live=true wave=5 ! audioconvert ! audioresample ! queue ! \
opusenc ! rtpopuspay ! web.audio_sink
</syntaxhighlight>
 
After executing the two previous pipelines, you should be able to see two windows with a pattern like Fig.1 and hear the white noise and ticks audio streams.
 
 
 
 
 
In this example, we use two webrtcbin elements: one sends a video stream and receives an audio stream, while the other sends an audio stream and receives a video stream.
 
 
 
The following pipeline sends a video stream and receives an audio stream; it also starts the call.
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcbin start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
vp8enc ! rtpvp8pay ! queue ! identity silent=false ! web.video_sink web.audio_src ! \
rtpopusdepay ! opusdec ! audioconvert ! alsasink async=false
</syntaxhighlight>
 
 
 
The following pipeline sends an audio stream and receives a video stream; it also joins the call.
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 webrtcbin start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web web.video_src ! rtpvp8depay ! vp8dec ! videoconvert \
! ximagesink async=true audiotestsrc is-live=true wave=8 ! audioconvert ! audioresample ! queue ! \
opusenc ! rtpopuspay ! web.audio_sink
</syntaxhighlight>
 
When executing the two previous pipelines, you should be able to hear the ticks audio stream and see a video pattern similar to Fig.1.
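If a call does not connect, raising GStreamer's debug level for the WebRTC elements usually shows where signaling or negotiation stops. A minimal sketch, assuming the plugin's debug categories match the usual <code>*webrtc*</code> naming (an assumption, not confirmed by this page):

```shell
# Log at DEBUG level (5) for any category matching *webrtc*; write the
# log to a file so it does not clutter the console. Run any pipeline
# from this page with these variables set.
export GST_DEBUG="*webrtc*:5"
export GST_DEBUG_FILE="/tmp/gstwebrtc.log"
```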
 
}}
 

Latest revision as of 11:46, 9 March 2023