Gst-1.0 pipelines for DM816x and DM814x

From RidgeRun Developer Connection
Revision as of 11:46, 18 March 2016

Introduction

DM81xx

Encode videotest pattern in H.264 (without container)

 
gst-launch-1.0 videotestsrc num-buffers=1000 ! omxbufferalloc num-buffers=8 ! omxh264enc ! perf print-arm-load=true ! filesink location=sample.h264
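The pipeline above writes a raw H.264 Annex-B byte stream with no container. As a quick sanity check after copying sample.h264 to a PC, the sketch below (plain POSIX shell using od; the helper name is made up here, not part of the original pipelines) tests for the 00 00 00 01 start code that precedes the first NAL unit:

```shell
# is_annexb FILE: succeed if FILE begins with the 4-byte Annex-B start code
# 00 00 00 01, which a raw H.264 byte stream such as sample.h264 should have.
is_annexb() {
  head_bytes=$(od -An -tx1 -N4 "$1")
  case "$head_bytes" in
    *"00 00 00 01"*) return 0 ;;
    *) return 1 ;;
  esac
}
```

For example: is_annexb sample.h264 && echo "looks like an Annex-B byte stream"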

Scale the QVGA video test pattern to VGA

gst-launch-1.0 videotestsrc num-buffers=1000 ! 'video/x-raw,format=(string)NV12,width=320,height=240,framerate=(fraction)60/1' ! omxbufferalloc num-buffers=8 ! omxscaler  \
! 'video/x-raw,format=(string)YUY2,width=640,height=480,framerate=(fraction)60/1' ! perf print-arm-load=true ! fakesink

DM8148

V4l2 Capture, H264 encoding and mp4 recording

gst-launch-1.0 v4l2src io-mode=3 device=/dev/video0 num-buffers=1000 ! 'video/x-raw,format=(string)NV12,width=1280,height=720,framerate=(fraction)60/1' ! perf print-arm-load=true ! omxbufferalloc num-buffers=12 ! omxh264enc i-period=30 idr-period=30 ! queue ! rrh264parser single-nalu=true ! mp4mux dts-method=0 ! filesink location=test_v4l2src_1.0.mp4 -v
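Unlike the raw .h264 dump, this recording is wrapped in an MP4 container. As a hedged sketch for checking the result on a PC (the helper name and use of dd are assumptions, not part of the original pipelines), ISO-BMFF files normally carry the 'ftyp' box tag at byte offset 4:

```shell
# check_mp4 FILE: succeed if bytes 5-8 of FILE spell 'ftyp', the box tag that
# usually opens an ISO-BMFF/MP4 file such as test_v4l2src_1.0.mp4.
check_mp4() {
  sig=$(dd if="$1" bs=1 skip=4 count=4 2>/dev/null)
  [ "$sig" = "ftyp" ]
}
```

For example: check_mp4 test_v4l2src_1.0.mp4 && echo "looks like an MP4 container"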


Single Video test source RTP streaming

These instructions show how to do video streaming over the network: a test video generated on the board is displayed on the host. These pipelines send the packets to a configured port.

Stream H.264 video test pattern over RTP

  • Server: DM81xx
HOST=<The client (host PC) IP address>

PORT=<The port to use>

gst-launch-1.0 videotestsrc ! 'video/x-raw,format=(string)NV12,width=1280,height=720,framerate=(fraction)60/1' ! perf print-arm-load=true ! omxbufferalloc num-buffers=12 ! omxh264enc \
i-period=30 idr-period=30 ! rtph264pay ! udpsink host=$HOST port=$PORT sync=false async=false -v

Thanks to the -v option, this pipeline prints the capabilities negotiated on each element's pad. The output should look similar to this:

.
.
.
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:src: caps = "video/x-h264\,\ stream-format\=\(string\)byte-stream\,\ alignment\=\(string\)au\,\ profile\=\(string\)baseline\,\ level\=\(string\)4.2\,\ width\=\(int\)1280\,\ height\=\(int\)720\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ framerate\=\(fraction\)60/1"
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = "video/x-h264\,\ stream-format\=\(string\)byte-stream\,\ alignment\=\(string\)au\,\ profile\=\(string\)baseline\,\ level\=\(string\)4.2\,\ width\=\(int\)1280\,\ height\=\(int\)720\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ framerate\=\(fraction\)60/1"
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = "application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H264\,\ ssrc\=\(uint\)2286224513\,\ timestamp-offset\=\(uint\)3885907970\,\ seqnum-offset\=\(uint\)24314"
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = "application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H264\,\ ssrc\=\(uint\)2286224513\,\ timestamp-offset\=\(uint\)3885907970\,\ seqnum-offset\=\(uint\)24314"
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = "application/x-rtp\,\ media\=\(string\)video\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H264\,\ packetization-mode\=\(string\)1\,\ sprop-parameter-sets\=\(string\)\"J0KAKouVAKALcgA\\\=\\\,KN4BriAA\"\,\ payload\=\(int\)96\,\ seqnum-offset\=\(uint\)24314\,\ timestamp-offset\=\(uint\)3885907970\,\ ssrc\=\(uint\)2286224513"
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = "application/x-rtp\,\ media\=\(string\)video\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H264\,\ packetization-mode\=\(string\)1\,\ sprop-parameter-sets\=\(string\)\"J0KAKouVAKALcgA\\\=\\\,KN4BriAA\"\,\ payload\=\(int\)96\,\ seqnum-offset\=\(uint\)24314\,\ timestamp-offset\=\(uint\)3885907970\,\ ssrc\=\(uint\)2286224513"
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 3885907970
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 24314
.
.
.    

You need the udpsink:sink capabilities for the client pipeline.
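Picking that line out of the log by hand is easy to get wrong. As a small sketch (assuming the server's -v output was saved to a file, called server.log here; the helper name is made up), the last udpsink sink caps line in the log carries the final negotiated caps the client needs:

```shell
# last_udpsink_caps LOG: print the most recent udpsink sink caps line from a
# saved gst-launch -v log; caps are printed again when renegotiated, so the
# last match is the one to copy into the client pipeline.
last_udpsink_caps() {
  grep 'GstUDPSink:udpsink0.GstPad:sink: caps' "$1" | tail -n 1
}
```

On the board, append 2>&1 | tee server.log to the server command, then run last_udpsink_caps server.log on the saved file.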

  • Client: Ubuntu PC

Copy the udpsink caps given by the server pipeline, remove the spaces, and change the (uint) casts to (int).

CAPS="application/x-rtp,media=(string)video,payload=(int)96,clock-rate=(int)90000,encoding-name=(string)H264, \
ssrc=(int)2286224513,timestamp-offset=(int)3885907970,seqnum-offset=(int)24314"

PORT=<Port configured in the server>

gst-launch-1.0 udpsrc port=$PORT ! $CAPS  ! rtph264depay ! queue ! avdec_h264 ! xvimagesink sync=true async=false
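The manual caps cleanup described above can also be scripted. This is only a sketch of the text transformation (RAW is a shortened example of what the -v log prints, with its backslash escaping): strip the backslashes, remove the spaces after the commas, and map the (uint) casts to (int):

```shell
# Shortened caps fragment as it appears, escaped, in the gst-launch -v log.
RAW='application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,\ ssrc\=\(uint\)2286224513'

# Strip backslashes, drop the spaces after commas, and change (uint) to (int)
# so the resulting string can be pasted straight into the client pipeline.
CAPS=$(printf '%s' "$RAW" | sed -e 's/\\//g' -e 's/, /,/g' -e 's/(uint)/(int)/g')
echo "$CAPS"
```

For this RAW example the result is application/x-rtp,media=(string)video,payload=(int)96,ssrc=(int)2286224513.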

Stream H.264 encoded video file over RTP

These pipelines send a video file over the network. Any H.264 video in an MP4 (QuickTime) container will work, since the pipeline demuxes it with qtdemux.

  • Server: DM81xx
CLIENT_IP=<The client (host PC) IP address>

FILE=sintel_trailer-1080p.mp4

gst-launch-1.0 filesrc location=$FILE ! qtdemux ! queue ! h264parse ! perf print-arm-load=true ! rtph264pay ! udpsink host=$CLIENT_IP -v

As before, you need the udpsink:sink capabilities for the client pipeline.

  • Client: Ubuntu PC

Copy the udpsink caps given by the server pipeline, then remove the spaces and the (uint) casts.

CAPS=application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,\
sprop-parameter-sets=(string)\"Z2QAMqw05gHgCJ+WEAAAAwAQAAADAwDxgxmg\\,aOl4TLIs\",payload=(int)96,\
ssrc=2152503956,clock-base=4043051310,seqnum-base=10306

PORT=4951

gst-launch-1.0 udpsrc port=$PORT ! $CAPS ! rtph264depay ! queue ! avdec_h264 ! xvimagesink sync=false -v

V4l2 Capture and H264 encoded UDP streaming

These instructions show how to do video streaming over the network: video captured on the board is displayed on the host. The pipelines send the packets to the port you configure; set the HOST and PORT variables (used as host=$HOST and port=$PORT) to your own values.

  • Server: DM81xx

Set the variables HOST and PORT to your own, for example:

HOST=10.251.101.43
PORT=3001

And run the pipeline:

gst-launch-1.0 v4l2src io-mode=3 device=/dev/video0 ! 'video/x-raw,format=(string)NV12,width=1280,height=720,framerate=(fraction)60/1' ! perf print-arm-load=true ! omxbufferalloc num-buffers=12 ! omxh264enc i-period=30 idr-period=30 ! mpegtsmux ! udpsink host=$HOST port=$PORT sync=false async=false
  • Client: Ubuntu PC

Set the same port configured in the server:

PORT=3001

And run the pipeline:

gst-launch-1.0 udpsrc port=$PORT ! tsdemux ! h264parse ! avdec_h264 ! xvimagesink sync=true async=false