GstWebRTC - AppRTC Audio + Video Examples - x86

Problems running the pipelines shown on this page? Please see our GStreamer Debugging guide for help.

This page presents the GstRrWebRTC audio and video examples for the x86 platform using AppRTC.

Server Setup

To run the examples, first start the WebSocket server:

$GOPATH/bin/collidermain -port=8089 -tls=false

Then, start the AppRTC Node server in a different terminal window:

cd <PATH>/apprtc-node-server
node ./bin/www

Note: Make sure you have already installed the dependencies needed to run the servers. If you haven't, follow this link: [AppRTC Node Server with our websocket server]
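
If you still need to set up those dependencies, a minimal sketch is shown below. The npm step assumes the Node server lives at the same <PATH> used above, and the Collider build assumes its sources are already linked into your Go workspace per the upstream AppRTC instructions; adjust to your own installation.

# Hypothetical setup steps; adjust paths to your installation.
# Install the AppRTC Node server dependencies listed in its package.json:
cd <PATH>/apprtc-node-server
npm install

# Build Collider into $GOPATH/bin (assumes the collidermain sources
# are already available in $GOPATH/src per the AppRTC instructions):
go get collidermain
go install collidermain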


H264 + Opus

Unidirectional Elements

Example

In this example we use rrwebrtcbin to send a video stream and an audio stream, and we use rrwebrtcbin to receive the video and audio streams.

Send Pipeline

The following pipeline will send periodic ticks and a video stream:

gst-launch-1.0 rrwebrtcbin start-call=true signaler=GstApprtcSignaler signaler::server_url=http://localhost:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! videoconvert ! \
x264enc key-int-max=2 ! rtph264pay ! web.video_sink \
audiotestsrc is-live=true ! opusenc ! rtpopuspay ! web.audio_sink
Receive Pipeline

The following pipeline will receive the video stream and the ticks and play them back:

gst-launch-1.0 rrwebrtcbin start-call=false signaler=GstApprtcSignaler signaler::server_url=http://localhost:8080 \
signaler::session_id=1234ridgerun name=web web.video_src ! rtph264depay ! avdec_h264 ! videoconvert ! \
xvimagesink async=true web.audio_src ! rtpopusdepay ! opusdec ! autoaudiosink

When executing the two previous pipelines, you should be able to hear the ticks on the receiving computer and see a video pattern similar to Fig.1.

Fig.1. Snapshot of the received video.
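
Both pipelines above use signaler::server_url=http://localhost:8080, which assumes the sender, the receiver, and the AppRTC servers all run on the same machine. If the receiver runs on a different host, point server_url at the machine running the servers. A hedged sketch of the receive pipeline follows; 192.168.0.10 is a placeholder address, not part of the original example.

# Replace 192.168.0.10 with the address of the host running the AppRTC Node server and Collider.
gst-launch-1.0 rrwebrtcbin start-call=false signaler=GstApprtcSignaler signaler::server_url=http://192.168.0.10:8080 \
signaler::session_id=1234ridgerun name=web web.video_src ! rtph264depay ! avdec_h264 ! videoconvert ! \
xvimagesink async=true web.audio_src ! rtpopusdepay ! opusdec ! autoaudiosink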



Bidirectional Element

Example

In this example we use two rrwebrtcbin elements; each sends a video stream and an audio stream, and receives the other's video and audio streams.

Send-Receive Pipeline

The following pipeline starts a call, sends a white-noise audio stream and a color-bar video stream, and receives audio and video streams.

gst-launch-1.0 rrwebrtcbin start-call=true signaler=GstApprtcSignaler signaler::server_url=http://localhost:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! videoconvert ! \
x264enc key-int-max=2 ! rtph264pay ! web.video_sink \
web.video_src ! rtph264depay ! avdec_h264 ! videoconvert ! xvimagesink async=true \
web.audio_src ! rtpopusdepay ! opusdec ! autoaudiosink \
audiotestsrc is-live=true ! opusenc ! rtpopuspay ! web.audio_sink
Send-Receive Pipeline

The following pipeline joins the call, sends a ticks audio stream and a color-bar video stream, and receives audio and video streams.

gst-launch-1.0 rrwebrtcbin start-call=false signaler=GstApprtcSignaler signaler::server_url=http://localhost:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! videoconvert ! \
x264enc key-int-max=2 ! rtph264pay ! web.video_sink \
web.video_src ! rtph264depay ! avdec_h264 ! videoconvert ! xvimagesink async=true \
web.audio_src ! rtpopusdepay ! opusdec ! autoaudiosink \
audiotestsrc is-live=true ! opusenc ! rtpopuspay ! web.audio_sink

When executing the two previous pipelines, you should be able to hear the audio streams and see two windows with video patterns similar to Fig.1.

VP8 + Opus

Unidirectional Elements

Example

In this example, we use rrwebrtcbin to send a video stream and an audio stream, and we use rrwebrtcbin to receive the video and audio streams.

Send Pipeline

The following pipeline generates a color-bar video stream and a ticks audio stream, and starts the call:

gst-launch-1.0 rrwebrtcbin start-call=true signaler=GstApprtcSignaler signaler::server_url=http://localhost:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! vp8enc ! rtpvp8pay ! web.video_sink \
audiotestsrc is-live=true ! opusenc ! rtpopuspay ! web.audio_sink
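
vp8enc exposes a deadline property (the per-frame encoding deadline, in microseconds). If your build defaults to best-quality encoding, a live source can show noticeable latency; the following hedged variant of the send pipeline switches the encoder to realtime mode. The deadline=1 setting is a tuning suggestion, not part of the original example.

# deadline=1 selects realtime encoding in vp8enc.
gst-launch-1.0 rrwebrtcbin start-call=true signaler=GstApprtcSignaler signaler::server_url=http://localhost:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! vp8enc deadline=1 ! rtpvp8pay ! web.video_sink \
audiotestsrc is-live=true ! opusenc ! rtpopuspay ! web.audio_sink
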
Receive Pipeline

The following pipeline receives the video stream and the audio stream.

gst-launch-1.0 rrwebrtcbin start-call=false signaler=GstApprtcSignaler signaler::server_url=http://localhost:8080 \
signaler::session_id=1234ridgerun name=web web.video_src ! rtpvp8depay ! vp8dec ! videoconvert ! xvimagesink async=true \
web.audio_src ! rtpopusdepay ! opusdec ! autoaudiosink

After executing the two previous pipelines you should be able to see a window with a pattern like Fig.1, and hear the ticks audio stream.



Bidirectional Element

Example

In this example we use two rrwebrtcbin elements; each sends a video stream and an audio stream, and receives the other's video and audio streams.

Send-Receive Pipeline

The following pipeline starts a call, sends a ticks audio stream and a color-bar video stream, and receives audio and video streams.

gst-launch-1.0 rrwebrtcbin start-call=true signaler=GstApprtcSignaler signaler::server_url=http://localhost:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! vp8enc ! rtpvp8pay ! web.video_sink \
web.video_src ! rtpvp8depay ! vp8dec ! videoconvert ! xvimagesink async=true \
web.audio_src ! rtpopusdepay ! opusdec ! autoaudiosink \
audiotestsrc is-live=true ! opusenc ! rtpopuspay ! web.audio_sink
Send-Receive Pipeline

The following pipeline joins the call, sends a white-noise audio stream and a color-bar video stream, and receives audio and video streams.

gst-launch-1.0 rrwebrtcbin start-call=false signaler=GstApprtcSignaler signaler::server_url=http://localhost:8080 \
signaler::session_id=1234ridgerun name=web videotestsrc is-live=true ! vp8enc ! rtpvp8pay ! web.video_sink \
web.video_src ! rtpvp8depay ! vp8dec ! videoconvert ! xvimagesink async=true \
web.audio_src ! rtpopusdepay ! opusdec ! autoaudiosink \
audiotestsrc is-live=true ! opusenc ! rtpopuspay ! web.audio_sink

After executing the two previous pipelines you should be able to see two windows with a pattern like Fig.1, and hear audio streams.



