GstWebRTC - OpenWebRTC Signaling Examples - x86



This page presents some examples of GstWebRTC using OpenWebRTC's signaler.
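All of the examples on this page assume that the GstWebRTC plug-in with the OpenWebRTC signaler is installed on both machines. A quick way to verify that the three elements used throughout the page are available is to query them with gst-inspect-1.0, the standard GStreamer inspection tool; if any of the commands below reports that the element does not exist, the plug-in is not in your GStreamer plugin path:

gst-inspect-1.0 webrtcbin
gst-inspect-1.0 webrtcsink
gst-inspect-1.0 webrtcsrc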

Only Audio

Opus

Unidirectional

Example: In this example we use webrtcsink to send an audio stream and webrtcsrc to receive the audio stream.

The following pipeline will send periodic ticks (audiotestsrc wave=8 generates the ticks test signal):

gst-launch-1.0  webrtcsink start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web audiotestsrc is-live=true wave=8 ! audioconvert ! audioresample \
! queue ! opusenc ! rtpopuspay ! web.audio


The following pipeline will receive the periodic ticks:

gst-launch-1.0 webrtcsrc start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web web.audio ! rtpopusdepay ! opusdec ! audioconvert ! \
alsasink async=false

When executing the two previous pipelines, you should be able to hear the ticks on the receiving computer.
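If ALSA is not the audio output you want on the receiving machine, the receiver can be written with the standard autoaudiosink auto-detection element instead of alsasink. This variant is only a suggestion and is not part of the original example; an audioresample element is added so the decoded audio can be adjusted to whatever rate the selected sink supports:

gst-launch-1.0 webrtcsrc start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web web.audio ! rtpopusdepay ! opusdec ! audioconvert ! \
audioresample ! autoaudiosink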

Bidirectional

Example: In this example we use two webrtcbin elements; each one sends an audio stream and receives the stream sent by the other.

The following pipeline will send a white noise audio stream and receive the ticks audio stream sent by the next pipeline. This pipeline starts the call.

gst-launch-1.0 webrtcbin start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web audiotestsrc is-live=true wave=5 ! audioconvert ! \
audioresample ! queue ! opusenc ! rtpopuspay ! web.audio_sink web.audio_src ! rtpopusdepay ! \
opusdec ! audioconvert ! alsasink sync=false async=false

The following pipeline will send a ticks audio stream and receive the white noise audio stream sent by the previous pipeline. This pipeline joins the call.

gst-launch-1.0 webrtcbin start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web audiotestsrc is-live=true wave=8 ! audioconvert ! \
audioresample ! queue ! opusenc ! rtpopuspay ! web.audio_sink web.audio_src ! rtpopusdepay ! \
opusdec ! audioconvert ! alsasink sync=false async=false

When executing the two previous pipelines, you should be able to hear the ticks and the white noise.
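If the call is never established, raising the GStreamer debug level usually gives a hint about what went wrong (for example, whether the signaling server could be reached). GST_DEBUG is the standard GStreamer debug environment variable; the level shown below is only a suggestion, and the rest of the command is the caller pipeline from this example:

GST_DEBUG=3 gst-launch-1.0 webrtcbin start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web audiotestsrc is-live=true wave=5 ! audioconvert ! \
audioresample ! queue ! opusenc ! rtpopuspay ! web.audio_sink web.audio_src ! rtpopusdepay ! \
opusdec ! audioconvert ! alsasink sync=false async=false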

Only Video

H264

Unidirectional

Example: In this example we use webrtcsink to send a video stream and webrtcsrc to receive the video stream.

The following pipeline will send a color-bars H.264 video stream:

gst-launch-1.0 webrtcsink start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
x264enc key-int-max=2 ! rtph264pay ! queue ! web.video

The following pipeline will receive the video stream and display it:

gst-launch-1.0 webrtcsrc start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web web.video ! rtph264depay ! avdec_h264 ! videoconvert \
! ximagesink async=true

You should be able to see a video pattern similar to Fig.1.

Fig.1 Snapshot of video received
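With its default settings x264enc buffers several frames before producing output, which can add noticeable latency to a live call. A possible low-latency variant of the sender is shown below: tune=zerolatency and speed-preset=ultrafast are standard x264enc properties, and the caps filter pins the test source to a fixed resolution and frame rate. These values are suggestions and are not part of the original example:

gst-launch-1.0 webrtcsink start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! \
video/x-raw,width=640,height=480,framerate=30/1 ! queue ! videoconvert ! \
x264enc key-int-max=2 tune=zerolatency speed-preset=ultrafast ! rtph264pay ! queue ! web.video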

Bidirectional

Example: In this example we use two webrtcbin elements; each one sends a video stream and receives the stream sent by the other.

The following pipeline starts the call:

gst-launch-1.0 webrtcbin start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
x264enc key-int-max=2 ! rtph264pay ! queue ! identity silent=false ! web.video_sink web.video_src \
! rtph264depay ! avdec_h264 ! videoconvert ! ximagesink async=true

The following pipeline joins the call:

gst-launch-1.0 webrtcbin start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
x264enc key-int-max=2 ! rtph264pay ! queue ! identity silent=false ! web.video_sink web.video_src \
! rtph264depay ! avdec_h264 ! videoconvert ! ximagesink async=true

You should be able to see two windows with a video pattern similar to Fig.1.
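The identity silent=false element in these pipelines only produces visible output when gst-launch-1.0 is run with the -v flag, which prints property change notifications, including identity's last-message update for every buffer that passes through it. This is a quick way to confirm that encoded video is actually reaching the webrtcbin element. For example, the caller pipeline above can be run as:

gst-launch-1.0 -v webrtcbin start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
x264enc key-int-max=2 ! rtph264pay ! queue ! identity silent=false ! web.video_sink web.video_src \
! rtph264depay ! avdec_h264 ! videoconvert ! ximagesink async=true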

VP8

Unidirectional

Example: In this example we use webrtcsink to send a video stream and webrtcsrc to receive the video stream.

The following pipeline will send a color-bars VP8 video stream:

gst-launch-1.0 webrtcsink start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
vp8enc ! rtpvp8pay ! queue ! identity ! web.video

The following pipeline will receive the video stream and display it:

gst-launch-1.0 webrtcsrc start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web web.video ! rtpvp8depay ! vp8dec ! videoconvert ! \
ximagesink async=true

You should be able to see a video pattern similar to Fig.1.
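Depending on the GStreamer version, vp8enc may default to its best-quality deadline, which can be too slow for real-time encoding on modest hardware. Setting the standard vp8enc deadline property to 1 (microseconds per frame) requests the fastest encoding mode; the value is a suggestion and is not part of the original example:

gst-launch-1.0 webrtcsink start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
vp8enc deadline=1 ! rtpvp8pay ! queue ! identity ! web.video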


Bidirectional

Example: In this example we use two webrtcbin elements; each one sends a video stream and receives the stream sent by the other.

The following pipeline starts the call:

gst-launch-1.0 webrtcbin start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
vp8enc ! rtpvp8pay ! queue ! identity silent=false ! web.video_sink web.video_src ! rtpvp8depay ! \
vp8dec ! videoconvert ! ximagesink async=true

The following pipeline joins the call:

gst-launch-1.0 webrtcbin start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
vp8enc ! rtpvp8pay ! queue ! identity silent=false ! web.video_sink web.video_src ! rtpvp8depay \
! vp8dec ! videoconvert ! ximagesink async=true

You should be able to see two windows with a video pattern similar to Fig.1.

Audio and Video

Opus + H264

Unidirectional

Example: In this example we use webrtcsink to send a video stream and an audio stream, and we use webrtcsrc to receive the video and audio streams.

The following pipeline will send periodic ticks and a video stream:

gst-launch-1.0  webrtcsink start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! x264enc key-int-max=2 ! \
rtph264pay ! web.video audiotestsrc is-live=true wave=8 ! audioconvert \
! audioresample ! queue ! opusenc ! rtpopuspay ! web.audio


The following pipeline will receive the video stream and the ticks and play them back:

gst-launch-1.0 webrtcsrc start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web web.video ! rtph264depay ! avdec_h264 ! videoconvert ! \
ximagesink async=false web.audio ! rtpopusdepay ! opusdec ! audioconvert ! alsasink async=false

When executing the two previous pipelines, you should be able to hear the ticks on the receiving computer and see a video pattern similar to Fig.1.

Bidirectional

Example: In this example we use two webrtcbin elements; each one sends a video stream and an audio stream and receives the streams sent by the other.

The following pipeline starts a call, sends a white noise audio stream and a color-bars video stream, and receives audio and video streams.

gst-launch-1.0 webrtcbin start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
x264enc key-int-max=2 ! rtph264pay ! queue ! identity silent=false ! web.video_sink web.video_src \
! rtph264depay ! avdec_h264 ! videoconvert ! ximagesink async=true audiotestsrc is-live=true wave=5 \
! audioconvert ! audioresample ! queue ! opusenc ! rtpopuspay ! web.audio_sink web.audio_src ! \
rtpopusdepay ! opusdec ! audioconvert ! alsasink async=false

The following pipeline joins the call, sends a ticks audio stream and a color-bars video stream, and receives audio and video streams.

gst-launch-1.0 webrtcbin start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
x264enc key-int-max=2 ! rtph264pay ! queue ! identity silent=false ! web.video_sink web.video_src \
! rtph264depay ! avdec_h264 ! videoconvert ! ximagesink async=true web.audio_src ! rtpopusdepay ! \
opusdec ! audioconvert ! alsasink async=false audiotestsrc is-live=true wave=8 ! audioconvert ! \
audioresample ! queue ! opusenc ! rtpopuspay ! web.audio_sink

When executing the two previous pipelines, you should be able to hear the ticks and white noise audio streams and see two windows with video patterns similar to Fig.1.




Example: In this example we use two webrtcbin elements: one sends an audio stream and receives a video stream, while the other sends a video stream and receives an audio stream.

The following pipeline sends an audio stream and receives a video stream; it also starts the call.

gst-launch-1.0 webrtcbin start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web web.video_src ! rtph264depay ! avdec_h264 ! \
videoconvert ! ximagesink async=true audiotestsrc is-live=true wave=8 \
! audioconvert ! audioresample ! queue ! opusenc ! rtpopuspay ! web.audio_sink

The following pipeline sends a video stream and receives an audio stream; it also joins the call.

gst-launch-1.0 webrtcbin start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
x264enc key-int-max=2 ! rtph264pay ! queue ! identity silent=false ! web.video_sink \
web.audio_src ! rtpopusdepay ! opusdec ! audioconvert ! alsasink async=false

When executing the two previous pipelines, you should be able to hear the ticks audio stream and see a video pattern similar to Fig.1.

Opus + VP8

Unidirectional

Example: In this example we use webrtcsink to send a video stream and an audio stream, and we use webrtcsrc to receive the video and audio streams.

The following pipeline generates a color-bars video stream and a ticks audio stream and starts the call:

gst-launch-1.0 webrtcsink start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
vp8enc ! rtpvp8pay ! queue ! identity ! web.video audiotestsrc is-live=true wave=8 ! audioconvert \
! audioresample ! queue ! opusenc ! rtpopuspay ! web.audio

The following pipeline receives the video stream and the audio stream.

gst-launch-1.0 webrtcsrc start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web web.audio ! rtpopusdepay ! opusdec ! audioconvert ! \
queue ! alsasink async=false web.video ! rtpvp8depay ! vp8dec ! videoconvert ! ximagesink async=true

After executing the two previous pipelines you should be able to see a window with a pattern like Fig.1, and hear the ticks audio stream.

Bidirectional

Example: In this example we use two webrtcbin elements; each one sends a video stream and an audio stream and receives the streams sent by the other.

The following pipeline starts a call, sends a ticks audio stream and a color-bars video stream, and receives audio and video streams.

gst-launch-1.0 webrtcbin start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
vp8enc ! rtpvp8pay ! queue ! identity silent=false ! web.video_sink web.video_src ! rtpvp8depay \
! vp8dec ! videoconvert ! ximagesink async=true web.audio_src ! rtpopusdepay ! opusdec ! \
audioconvert ! alsasink async=false audiotestsrc is-live=true wave=8 ! audioconvert ! audioresample \
! queue ! opusenc ! rtpopuspay ! web.audio_sink

The following pipeline joins the call, sends a white noise audio stream and a color-bars video stream, and receives audio and video streams.

gst-launch-1.0 webrtcbin start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
vp8enc ! rtpvp8pay ! queue ! identity silent=false ! web.video_sink web.video_src ! rtpvp8depay \
! vp8dec ! videoconvert ! ximagesink async=true web.audio_src ! rtpopusdepay ! opusdec ! audioconvert \
! alsasink async=false audiotestsrc is-live=true wave=5 ! audioconvert ! audioresample ! queue ! \
opusenc ! rtpopuspay ! web.audio_sink

After executing the two previous pipelines you should be able to see two windows with a pattern like Fig.1 and hear the white noise and ticks audio streams.


Example: In this example we use two webrtcbin elements: one sends a video stream and receives an audio stream, while the other sends an audio stream and receives a video stream.

The following pipeline sends a video stream and receives an audio stream; it also starts the call.

gst-launch-1.0 webrtcbin start-call=true signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! queue ! videoconvert ! \
vp8enc ! rtpvp8pay ! queue ! identity silent=false ! web.video_sink web.audio_src ! \
 rtpopusdepay ! opusdec ! audioconvert ! alsasink async=false

The following pipeline sends an audio stream and receives a video stream; it also joins the call.

gst-launch-1.0 webrtcbin start-call=false signaler::server_url=http://webrtc.ridgerun.com:8080 \
signaler::session_id=1234ridgerun name=web web.video_src ! rtpvp8depay ! vp8dec ! videoconvert \
! ximagesink async=true audiotestsrc is-live=true wave=8 ! audioconvert ! audioresample ! queue ! \
opusenc ! rtpopuspay ! web.audio_sink

When executing the two previous pipelines, you should be able to hear the ticks audio stream and see a video pattern similar to Fig.1.


