Gst-1.0 pipelines for DM816x and DM814x

Problems running the pipelines shown on this page? Please see our GStreamer Debugging guide for help: http://developer.ridgerun.com/wiki/index.php?title=GStreamer_Debugging
Introduction
On this page you will find a set of GStreamer 1.0 pipelines used on the DM8168 (EVM and Z3 boards) and the DM8148 (EVM). The pipelines cover H264 encoding and decoding, up and down scaling, video capture through V4L2 and OMX, and video streaming. All of these pipelines were tested on the DM8168 and DM8148 platforms with several resolutions and framerates.
DM81xx
Omx H264 encoding
With the omxh264enc encoder we are able to encode raw video into H264 frames, using different parameters to obtain the desired configuration.
Encode videotest pattern in H.264 (without container)
gst-launch-1.0 videotestsrc num-buffers=1000 ! omxbufferalloc num-buffers=8 ! omxh264enc ! perf print-arm-load=true ! filesink location=sample.h264
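Since the resulting sample.h264 file has no container, it must be parsed before it can be decoded. As a quick sanity check, a sketch like the following can play it back on a host PC (it assumes the GStreamer libav plugins are installed on the host):
gst-launch-1.0 filesrc location=sample.h264 ! h264parse ! avdec_h264 ! autovideosink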
Encode videotest pattern in H.264 (with container)
gst-launch-1.0 videotestsrc num-buffers=1000 ! omxbufferalloc num-buffers=8 ! omxh264enc ! perf print-arm-load=true ! queue ! rrh264parser single-nalu=true \
! mp4mux dts-method=0 ! filesink location=test.mp4 -v
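The recording can then be verified on a host PC with a software decoder; this is only a sketch and assumes qtdemux and avdec_h264 are available on the host:
gst-launch-1.0 filesrc location=test.mp4 ! qtdemux ! h264parse ! avdec_h264 ! autovideosink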
Single Video test source RTP/H264 streaming
These instructions show how to do video streaming over the network: a test video is generated on the board and viewed on the host. These pipelines send the packets to a configured host and port.
Stream H.264 video test pattern over RTP
- Server: DM81xx
HOST=<Your IP address>
PORT=<The port to use>

gst-launch-1.0 videotestsrc ! 'video/x-raw,format=(string)NV12,width=1280,height=720,framerate=(fraction)60/1' ! perf print-arm-load=true ! omxbufferalloc num-buffers=12 ! omxh264enc \
i-period=30 idr-period=30 ! rtph264pay ! udpsink host=$HOST port=$PORT sync=false async=false -v
Thanks to the -v option, this pipeline prints the capabilities of each element's pads. It should print something similar to this output:
. . .
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:src: caps = "video/x-h264\,\ stream-format\=\(string\)byte-stream\,\ alignment\=\(string\)au\,\ profile\=\(string\)baseline\,\ level\=\(string\)4.2\,\ width\=\(int\)1280\,\ height\=\(int\)720\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ framerate\=\(fraction\)60/1"
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = "video/x-h264\,\ stream-format\=\(string\)byte-stream\,\ alignment\=\(string\)au\,\ profile\=\(string\)baseline\,\ level\=\(string\)4.2\,\ width\=\(int\)1280\,\ height\=\(int\)720\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ framerate\=\(fraction\)60/1"
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = "application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H264\,\ ssrc\=\(uint\)2286224513\,\ timestamp-offset\=\(uint\)3885907970\,\ seqnum-offset\=\(uint\)24314"
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = "application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H264\,\ ssrc\=\(uint\)2286224513\,\ timestamp-offset\=\(uint\)3885907970\,\ seqnum-offset\=\(uint\)24314"
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = "application/x-rtp\,\ media\=\(string\)video\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H264\,\ packetization-mode\=\(string\)1\,\ sprop-parameter-sets\=\(string\)\"J0KAKouVAKALcgA\\\=\\\,KN4BriAA\"\,\ payload\=\(int\)96\,\ seqnum-offset\=\(uint\)24314\,\ timestamp-offset\=\(uint\)3885907970\,\ ssrc\=\(uint\)2286224513"
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = "application/x-rtp\,\ media\=\(string\)video\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H264\,\ packetization-mode\=\(string\)1\,\ sprop-parameter-sets\=\(string\)\"J0KAKouVAKALcgA\\\=\\\,KN4BriAA\"\,\ payload\=\(int\)96\,\ seqnum-offset\=\(uint\)24314\,\ timestamp-offset\=\(uint\)3885907970\,\ ssrc\=\(uint\)2286224513"
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 3885907970
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 24314
. . .
You need the udpsink:sink capabilities for the client pipeline.
- Client: Ubuntu PC
Copy the udpsink sink caps printed by the server pipeline, remove the spaces, and change the (uint) casts to (int). Then set PORT to the port configured on the server and run the pipeline:
CAPS="application/x-rtp,media=(string)video,payload=(int)96,clock-rate=(int)90000,encoding-name=(string)H264, \ ssrc=(int)2286224513,timestamp-offset=(int)3885907970,seqnum-offset=(int)24314" PORT=<Port configured in the server> gst-launch-1.0 udpsrc port=$PORT ! $CAPS ! rtph264depay ! queue ! avdec_h264 ! xvimagesink sync=true async=false
Omx Scaling
With the omxscaler element you can do down-scaling and up-scaling of any video source.
Up-scaling the QVGA video test pattern to VGA
gst-launch-1.0 videotestsrc is-live=true num-buffers=1000 ! 'video/x-raw,format=(string)NV12,width=320,height=240,framerate=(fraction)60/1' ! omxscaler \
! 'video/x-raw,format=(string)YUY2,width=640,height=480,framerate=(fraction)60/1' ! perf print-arm-load=true ! fakesink silent=false -v
Down-scaling the v4l2src captured video to VGA
gst-launch-1.0 v4l2src io-mode=3 num-buffers=100 ! 'video/x-raw,format=(string)NV12,width=1280,height=720,framerate=(fraction)60/1' ! omxbufferalloc num-buffers=12 ! \
omxscaler ! 'video/x-raw,format=(string)YUY2,width=640,height=480,framerate=(fraction)60/1' ! perf print-arm-load=true ! fakesink silent=false -v
V4l2src video capture
RidgeRun has created some patches in order to have the gst-1.0 v4l2src plugin working on the DM81xx platforms. You can find more information about the changes needed to get it working in the Gst-1.0_v4l2src_in_DM81xx wiki page. Here are 2 useful examples using v4l2src:
V4l2 Capture and H264 encoded UDP streaming
These instructions show how to do video streaming over the network: video is captured on the board and viewed on the host. These pipelines send the packets to a configured host and port, so set the HOST and PORT variables to your own values.
- Server: DM81xx
Set the variables HOST and PORT to your own, for example:
HOST=10.251.101.43 PORT=3001
And run the pipeline:
gst-launch-1.0 v4l2src io-mode=3 device=/dev/video0 ! 'video/x-raw,format=(string)NV12,width=1280,height=720,framerate=(fraction)60/1' ! perf print-arm-load=true ! omxbufferalloc num-buffers=12 ! \
omxh264enc i-period=30 idr-period=30 ! mpegtsmux ! udpsink host=$HOST port=$PORT sync=false async=false
- Client: Ubuntu PC
Set the same port configured in the server:
PORT=3001
And run the pipeline:
gst-launch-1.0 udpsrc port=$PORT ! tsdemux ! h264parse ! avdec_h264 ! xvimagesink sync=false async=false
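Alternatively, since the UDP payload is already an MPEG-TS stream, it can simply be dumped to a file on the host and played back later with any TS-capable player (a minimal sketch; capture.ts is just an example name):
gst-launch-1.0 udpsrc port=$PORT ! filesink location=capture.ts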
V4l2 Capture, H264 encoding and mp4 recording
With this example you can create an MP4 file recording from a camera or any device that provides video. This pipeline uses the omxh264enc with its i-period and idr-period properties set to 10 in order to keep a good video quality when the file is decoded by another pipeline or software.
gst-launch-1.0 v4l2src io-mode=3 device=/dev/video0 num-buffers=1000 ! 'video/x-raw,format=(string)NV12,width=1280,height=720,framerate=(fraction)60/1' ! perf print-arm-load=true ! \
omxbufferalloc num-buffers=12 ! omxh264enc i-period=10 idr-period=10 ! queue ! rrh264parser single-nalu=true ! mp4mux dts-method=0 ! filesink location=test_v4l2src_1.0.mp4
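To confirm the codec, resolution and framerate of the recording, the gst-discoverer-1.0 tool (part of the GStreamer base tools, assuming it is installed) can be run against the file:
gst-discoverer-1.0 test_v4l2src_1.0.mp4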
Omx H264 decoding
With the omxh264dec decoder you are able to decode an H264 video stream or a previously recorded video in order to have raw video to be processed:
Decode H264 video streaming
From the host side we have to generate a test video pattern to send via UDP:
- Server: Ubuntu PC
Set the variables HOST and PORT to your own, for example:
HOST=10.251.101.120 PORT=3001
And run the pipeline:
gst-launch-1.0 videotestsrc is-live=true ! 'video/x-raw,format=(string)NV12,width=1280,height=720,framerate=(fraction)30/1' ! x264enc byte-stream=true key-int-max=2 ! \
h264parse ! mpegtsmux ! udpsink host=$HOST port=$PORT sync=false -v
- Client: DM81xx
Set the same port configured in the server:
PORT=3001
And run the pipeline:
gst-launch-1.0 udpsrc port=$PORT ! tsdemux ! h264parse ! queue ! omxh264dec ! fakesink silent=false -v
Decode H264 mp4 file
Set the file name to be decoded, for example:
FILE=test_v4l2src_1.0.mp4
And run the pipeline:
gst-launch-1.0 filesrc location=$FILE ! qtdemux ! h264parse ! perf ! queue ! omxh264dec ! fakesink -v
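If the decoded video also needs to be resized, the omxscaler element shown above can in principle be placed after the decoder. The following is an untested sketch that combines elements from this page, so the caps and buffer handling may need adjustment on your setup:
gst-launch-1.0 filesrc location=$FILE ! qtdemux ! h264parse ! queue ! omxh264dec ! omxscaler ! 'video/x-raw,format=(string)YUY2,width=640,height=480' ! perf ! fakesink -v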
Omx video capture
You have to set the CAPS with the resolution and framerate of your video source, for example:
CAPS='video/x-raw,format=(string)NV12,width=1280,height=720,framerate=(fraction)60/1'
Simple video capture
gst-launch-1.0 omxcamera num-buffers=100 ! $CAPS ! perf print-arm-load=true ! fakesink silent=true -v
Capture and scaling
You have to set the resolution to scale to, for example:
SCALECAPS='video/x-raw,format=(string)NV12,width=640,height=480,framerate=(fraction)60/1'
And run the pipeline:
gst-launch-1.0 omxcamera num-buffers=1000 ! $CAPS ! omxscaler ! $SCALECAPS ! perf print-arm-load=true ! fakesink silent=true -v
We can also encode the captured and scaled video:
gst-launch-1.0 omxcamera num-buffers=1000 ! $CAPS ! omxscaler ! $SCALECAPS ! perf print-arm-load=true ! videoconvert ! omxh264enc ! fakesink silent=true -v
Omx Capture, H264 encoding and mp4 recording
gst-launch-1.0 omxcamera num-buffers=300 ! 'video/x-raw,format=(string)NV12,width=1280,height=720,framerate=(fraction)60/1' ! perf print-arm-load=true ! \
omxh264enc i-period=10 idr-period=10 ! queue ! rrh264parser single-nalu=true ! mp4mux dts-method=0 ! filesink location=test_omxcamera_1.0.mp4
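The resulting file can be copied to a host PC and played back, for example with playbin (a sketch; replace the path with the actual location of the file, and note that it assumes a software H264 decoder is available on the host):
gst-launch-1.0 playbin uri=file:///path/to/test_omxcamera_1.0.mp4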