Gst-1.0 pipelines for DM816x and DM814x

Introduction

This page collects GStreamer 1.0 (gst-launch-1.0) pipeline examples for the TI DM816x and DM814x (DM81xx) platforms: H264 encoding, scaling with omxscaler, RTP and UDP streaming, mp4 recording, and H264 decoding.

DM81xx

Encode a video test pattern in H.264 (without container)

 
gst-launch-1.0 videotestsrc num-buffers=1000 ! omxbufferalloc num-buffers=8 ! omxh264enc ! perf print-arm-load=true ! filesink location=sample.h264
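
To verify the recorded stream on an Ubuntu host, a playback sketch along these lines should work (an assumption on our part, not from the original page; it relies on avdec_h264 from gst-libav being installed):

gst-launch-1.0 filesrc location=sample.h264 ! h264parse ! avdec_h264 ! xvimagesink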

Omxscaler

With omxscaler you can down-scale or up-scale any video source.

Up-scaling the QVGA video test pattern to VGA

gst-launch-1.0 videotestsrc num-buffers=1000 ! 'video/x-raw,format=(string)NV12,width=320,height=240,framerate=(fraction)60/1' ! omxbufferalloc num-buffers=8 ! omxscaler  \
! 'video/x-raw,format=(string)YUY2,width=640,height=480,framerate=(fraction)60/1' ! perf print-arm-load=true ! fakesink silent=false -v

Down-scaling the v4l2src captured video to VGA

gst-launch-1.0 v4l2src io-mode=3 num-buffers=100 ! 'video/x-raw,format=(string)NV12,width=1280,height=720,framerate=(fraction)60/1' ! omxbufferalloc num-buffers=12 ! \
omxscaler ! 'video/x-raw,format=(string)YUY2,width=640,height=480,framerate=(fraction)60/1' ! perf print-arm-load=true ! fakesink silent=false -v
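
Both scaler examples discard the output in fakesink. As a hedged variation (not from the original page; the output file name is hypothetical), fakesink can be swapped for filesink to dump the scaled YUY2 frames for inspection:

gst-launch-1.0 videotestsrc num-buffers=100 ! 'video/x-raw,format=(string)NV12,width=320,height=240,framerate=(fraction)60/1' ! omxbufferalloc num-buffers=8 ! omxscaler \
! 'video/x-raw,format=(string)YUY2,width=640,height=480,framerate=(fraction)60/1' ! filesink location=scaled_vga.yuy2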

Single Video test source RTP streaming

These instructions show how to stream video over the network: a test video is generated on the board and viewed on the host. The pipelines send their packets to the port you configure.

Stream H.264 video test pattern over RTP

  • Server: DM81xx
HOST=<Your IP address>

PORT=<The port to use>

gst-launch-1.0 videotestsrc ! 'video/x-raw,format=(string)NV12,width=1280,height=720,framerate=(fraction)60/1' ! perf print-arm-load=true ! omxbufferalloc num-buffers=12 ! omxh264enc \
i-period=30 idr-period=30 ! rtph264pay ! udpsink host=$HOST port=$PORT sync=false async=false -v

Thanks to the -v option, this pipeline prints the capabilities negotiated on each element's pads. It should print something similar to this output:

.
.
.
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:src: caps = "video/x-h264\,\ stream-format\=\(string\)byte-stream\,\ alignment\=\(string\)au\,\ profile\=\(string\)baseline\,\ level\=\(string\)4.2\,\ width\=\(int\)1280\,\ height\=\(int\)720\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ framerate\=\(fraction\)60/1"
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = "video/x-h264\,\ stream-format\=\(string\)byte-stream\,\ alignment\=\(string\)au\,\ profile\=\(string\)baseline\,\ level\=\(string\)4.2\,\ width\=\(int\)1280\,\ height\=\(int\)720\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ framerate\=\(fraction\)60/1"
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = "application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H264\,\ ssrc\=\(uint\)2286224513\,\ timestamp-offset\=\(uint\)3885907970\,\ seqnum-offset\=\(uint\)24314"
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = "application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H264\,\ ssrc\=\(uint\)2286224513\,\ timestamp-offset\=\(uint\)3885907970\,\ seqnum-offset\=\(uint\)24314"
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = "application/x-rtp\,\ media\=\(string\)video\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H264\,\ packetization-mode\=\(string\)1\,\ sprop-parameter-sets\=\(string\)\"J0KAKouVAKALcgA\\\=\\\,KN4BriAA\"\,\ payload\=\(int\)96\,\ seqnum-offset\=\(uint\)24314\,\ timestamp-offset\=\(uint\)3885907970\,\ ssrc\=\(uint\)2286224513"
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = "application/x-rtp\,\ media\=\(string\)video\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H264\,\ packetization-mode\=\(string\)1\,\ sprop-parameter-sets\=\(string\)\"J0KAKouVAKALcgA\\\=\\\,KN4BriAA\"\,\ payload\=\(int\)96\,\ seqnum-offset\=\(uint\)24314\,\ timestamp-offset\=\(uint\)3885907970\,\ ssrc\=\(uint\)2286224513"
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 3885907970
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 24314
.
.
.    

You need the udpsink:sink capabilities for the client pipeline.

  • Client: Ubuntu PC

Copy the udpsink sink caps printed by the server pipeline, remove the escaping and spaces, and change the (uint) casts to (int):

CAPS="application/x-rtp,media=(string)video,payload=(int)96,clock-rate=(int)90000,encoding-name=(string)H264, \
ssrc=(int)2286224513,timestamp-offset=(int)3885907970,seqnum-offset=(int)24314"

PORT=<Port configured in the server>

gst-launch-1.0 udpsrc port=$PORT ! $CAPS  ! rtph264depay ! queue ! avdec_h264 ! xvimagesink sync=true async=false
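
If the host lacks Xv support, xvimagesink can be replaced with autovideosink, which picks a working video sink automatically (a hedged substitution, not part of the original page):

gst-launch-1.0 udpsrc port=$PORT ! $CAPS ! rtph264depay ! queue ! avdec_h264 ! autovideosink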

V4l2 Capture and H264 encoded UDP streaming

These instructions show how to stream video over the network: video is captured on the board and viewed on the host. The pipelines send their packets to the port you configure; set the variables HOST and PORT (used as host=$HOST and port=$PORT) to values matching your setup.

  • Server: DM81xx

Set the variables HOST and PORT to your own values, for example:

HOST=10.251.101.43
PORT=3001

And run the pipeline:

gst-launch-1.0 v4l2src io-mode=3 device=/dev/video0 ! 'video/x-raw,format=(string)NV12,width=1280,height=720,framerate=(fraction)60/1' ! perf print-arm-load=true ! omxbufferalloc num-buffers=12 ! \
omxh264enc i-period=30 idr-period=30 ! mpegtsmux ! udpsink host=$HOST port=$PORT sync=false async=false
  • Client: Ubuntu PC

Set the same port configured in the server:

PORT=3001

And run the pipeline:

gst-launch-1.0 udpsrc port=$PORT ! tsdemux ! h264parse ! avdec_h264 ! xvimagesink sync=false async=false
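
Alternatively, the incoming transport stream can be written straight to disk for later inspection (a minimal sketch, not from the original page; the capture file name is hypothetical):

gst-launch-1.0 udpsrc port=$PORT ! filesink location=capture.ts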

V4l2 Capture, H264 encoding and mp4 recording

With this example you can record an mp4 file from a camera or any other device that provides video. The pipeline sets the omxh264enc i-period and idr-period properties to 10 in order to keep good video quality when the file is decoded by another pipeline or application.

gst-launch-1.0 v4l2src io-mode=3 device=/dev/video0 num-buffers=1000 ! 'video/x-raw,format=(string)NV12,width=1280,height=720,framerate=(fraction)60/1' ! perf print-arm-load=true ! \
 omxbufferalloc num-buffers=12 ! omxh264enc i-period=10 idr-period=10 ! queue ! rrh264parser single-nalu=true ! mp4mux dts-method=0 ! filesink location=test_v4l2src_1.0.mp4
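
To check the resulting recording on an Ubuntu host, a generic playback sketch such as the following should work (an assumption, not from the original page; decodebin is left to pick a software H264 decoder):

gst-launch-1.0 filesrc location=test_v4l2src_1.0.mp4 ! decodebin ! videoconvert ! autovideosink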

H264 video decoding

With the omxh264dec decoder you can decode an H264 video stream or a previously recorded video in order to obtain raw video for further processing:

Decode an H264 video stream

On the host side, generate a test video pattern and send it over UDP:

  • Server: Ubuntu PC

Set the variables HOST and PORT to your own values, for example:

HOST=10.251.101.120
PORT=3001

And run the pipeline:

gst-launch-1.0 videotestsrc is-live=true ! 'video/x-raw,format=(string)NV12,width=1280,height=720,framerate=(fraction)30/1'  ! x264enc byte-stream=true key-int-max=10 ! \
h264parse ! mpegtsmux ! udpsink host=$HOST port=$PORT sync=false -v

  • Client: DM81xx

Set the same port configured in the server:

PORT=3001

And run the pipeline:

gst-launch-1.0 udpsrc port=$PORT ! tsdemux ! h264parse ! queue ! omxh264dec ! fakesink silent=false -v
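
fakesink discards the decoded frames. As a hedged variation (not from the original page; the output file name is hypothetical), filesink can be substituted to dump the raw decoded frames to disk:

gst-launch-1.0 udpsrc port=$PORT ! tsdemux ! h264parse ! queue ! omxh264dec ! filesink location=decoded.raw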

Decode H264 mp4 file

Set the file name to be decoded, for example:

FILE=test_v4l2src_1.0.mp4

And run the pipeline:

gst-launch-1.0 filesrc location=$FILE ! qtdemux ! h264parse ! perf ! queue ! omxh264dec ! fakesink -v
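
To measure the achieved decode rate without a display, fpsdisplaysink from gst-plugins-bad can wrap fakesink (a hedged sketch, assuming the element is available on the target; not from the original page):

gst-launch-1.0 filesrc location=$FILE ! qtdemux ! h264parse ! queue ! omxh264dec ! fpsdisplaysink video-sink=fakesink text-overlay=false -v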