NVIDIA Advanced GStreamer Pipelines with Gstd

From RidgeRun Developer Connection

<seo title="NVIDIA Advanced GStreamer Pipelines with Gstd | NVIDIA accelerated pipelines | RidgeRun" titlemode="replace" keywords="GStreamer, Linux SDK, Linux BSP,  Embedded Linux, Device Drivers, NVIDIA, Xilinx, TI, NXP, Freescale, Embedded Linux driver development, Linux Software development, Embedded Linux SDK, Embedded Linux Application development, GStreamer Multimedia Framework, NVIDIA accelerated pipelines, GStreamer Daemon, Gstd, GstInterpipe, H264, H265, H264 encoding, H265 encoding, NVIDIA_H265_Encoding, Jetson Xavier NX, Jetson Xavier, Xavier, Xavier NX, Xavier NX Pipelines"  description="This wiki intends to show how to handle different NVIDIA accelerated pipelines using GStreamer Daemon Gstd along with GstInterpipe."></seo>
<table>
<tr>
<td><div class="clear; float:right">__TOC__</div></td>
<td>
{{NVIDIA Preferred Partner logo}}
</td>
<td>
{{GStreamer debug}}
</td>
<td>
<center>
{{ContactUs Button}}
</center>
</td>
</tr>
</table>
== Introduction ==
This wiki intends to show how to handle different NVIDIA accelerated pipelines using [[GStreamer_Daemon|Gstd]] along with [[GstInterpipe|GstInterpipe]]. We will use different pipelines to describe the system shown in the following figure:
<br>
<br>
[[File:System pipelines design.png|1344px|thumb|center|<span style="color:blue">System pipelines design</span>]]
<br>
<br>
We will explain this example using the shell interface of Gstd on a Jetson Xavier NX device. However, you can check the other [[GStreamer_Daemon_-_API_Reference|available APIs]] for GStreamer Daemon, which have a very similar syntax.
  
== Procedure ==

First of all, we need to initialize the GStreamer Daemon. To do this, open a terminal and run:
  
 
<source lang=bash>
gstd
</source>
  
=== Creating the Sources ===

==== Video RAW Source ====
  
Here we will use a videotestsrc for compatibility, but you could use either v4l2src or nvarguscamerasrc, depending on your camera. The raw video pipeline will have a resolution of 1280x720@60fps, to demonstrate later how to decrease the framerate. This will be one of the branches that we can choose with the input selector.
  
 
<source lang=bash>
WIDTH=1280
HEIGHT=720
FRAMERATE=60
FORMAT="I420"
gst-client pipeline_create test_src videotestsrc is-live=true ! "video/x-raw,width=${WIDTH},height=${HEIGHT},framerate=${FRAMERATE}/1,format=${FORMAT}" ! interpipesink name=test_src sync=true
</source>
  
{{Ambox
|type=notice
|small=left
|issue='''NOTE''': For some use cases it might be useful to set a format of '''UYVP''' to simulate the output of an HD-SDI converter. If this is your case, set the format variable to <code>FORMAT="UYVP"</code>.
|style=width:unset;
}}
  
==== UDP Source ====

In this example we will generate a UDP stream to use as a source on the same Xavier NX; however, you could modify the udpsrc element properties to listen to any address you need.

Similar to the raw video pipeline, we will use a videotestsrc stream with a different pattern and send it locally through UDP, so that it can be received by the actual UDP source branch.
  
 
<source lang=bash>
PORT_H265=8000
WIDTH=1280
HEIGHT=720
FRAMERATE=60
gst-client pipeline_create udp_sink videotestsrc pattern=ball flip=true is-live=true ! "video/x-raw,width=${WIDTH},height=${HEIGHT},framerate=${FRAMERATE}/1" ! nvvidconv ! queue max-size-buffers=3 leaky=downstream ! nvv4l2h265enc bitrate=2000000 iframeinterval=300 vbv-size=33333 insert-sps-pps=true control-rate=constant_bitrate profile=Main num-B-Frames=0 ratecontrol-enable=true preset-level=UltraFastPreset EnableTwopassCBR=false maxperf-enable=true ! h265parse ! mpegtsmux alignment=7 ! queue ! udpsink host=127.0.0.1 port=${PORT_H265}
</source>

And now we create the actual UDP source branch that listens to the locally generated UDP stream.

<source lang=bash>
gst-client pipeline_create udp_src udpsrc name=input_udp port=${PORT_H265} ! tsdemux ! h265parse ! nvv4l2decoder ! nvvidconv ! "video/x-raw" ! interpipesink name=udp_src sync=true
</source>
  
=== Creating the Processing Pipeline ===

In this stage, our processing pipeline will be able to choose either the raw video or the UDP stream and apply some processing. The processing here includes:
 
  
 
* Framerate decrease (60 -> 30)
* ROI cropping (80px from the right)
* Scaling (640x480)
* Encoding (H265)
* Performance monitoring (requires [https://github.com/RidgeRun/gst-perf gst-perf])
* UDP streaming with multicast
  
<source lang=bash>
MULTICAST_IP="224.1.1.1"
PORT="12345"
WIDTH=640
HEIGHT=480
ROI_TOP=0
ROI_BOTTOM=720
ROI_LEFT=0
ROI_RIGHT=1200
FRAMERATE=30
gst-client pipeline_create proc_pipe interpipesrc name=interpipe listen-to=test_src is-live=true format=time ! videorate name=framerate_filter max-rate=${FRAMERATE} drop-only=true ! nvvidconv name=cropper top=${ROI_TOP} bottom=${ROI_BOTTOM} left=${ROI_LEFT} right=${ROI_RIGHT} ! capsfilter name=scale_filter caps="video/x-raw(memory:NVMM),width=${WIDTH},height=${HEIGHT}" ! queue max-size-buffers=3 leaky=downstream ! nvv4l2h265enc name=encoder bitrate=2000000 iframeinterval=300 vbv-size=33333 insert-sps-pps=true control-rate=constant_bitrate profile=Main num-B-Frames=0 ratecontrol-enable=true preset-level=UltraFastPreset EnableTwopassCBR=false maxperf-enable=true ! perf name=perf_monitor ! h265parse ! mpegtsmux alignment=7 ! queue ! udpsink name=output_udp host="${MULTICAST_IP}" port=${PORT} auto-multicast=true sync=false
</source>
  
{{Ambox
|type=notice
|small=left
|issue='''NOTE''': If you set an input format of '''UYVP''' to simulate the output of an HD-SDI converter, then you need to add a '''videoconvert''' element before the '''nvvidconv''' element, so the pipeline would look like this:
|style=width:unset;
}}
  
<source lang=bash>
gst-client pipeline_create proc_pipe interpipesrc name=interpipe listen-to=test_src is-live=true format=time ! videorate name=framerate_filter max-rate=${FRAMERATE} drop-only=true ! videoconvert ! nvvidconv name=cropper top=${ROI_TOP} bottom=${ROI_BOTTOM} left=${ROI_LEFT} right=${ROI_RIGHT} ! capsfilter name=scale_filter caps="video/x-raw(memory:NVMM),width=${WIDTH},height=${HEIGHT}" ! queue max-size-buffers=3 leaky=downstream ! nvv4l2h265enc name=encoder bitrate=2000000 iframeinterval=300 vbv-size=33333 insert-sps-pps=true control-rate=constant_bitrate profile=Main num-B-Frames=0 ratecontrol-enable=true preset-level=UltraFastPreset EnableTwopassCBR=false maxperf-enable=true ! perf name=perf_monitor ! h265parse ! mpegtsmux alignment=7 ! queue ! udpsink name=output_udp host="${MULTICAST_IP}" port=${PORT} auto-multicast=true sync=false
</source>
=== Interacting with the Pipelines ===

==== Basic Controls ====

* Playing the pipelines
<source lang=bash>
gst-client pipeline_play udp_sink
gst-client pipeline_play udp_src
gst-client pipeline_play test_src
gst-client pipeline_play proc_pipe
</source>

* Pausing the pipelines
<source lang=bash>
gst-client pipeline_pause udp_sink
gst-client pipeline_pause udp_src
gst-client pipeline_pause test_src
gst-client pipeline_pause proc_pipe
</source>

* Stopping the pipelines
<source lang=bash>
gst-client pipeline_stop udp_sink
gst-client pipeline_stop udp_src
gst-client pipeline_stop test_src
gst-client pipeline_stop proc_pipe
</source>

* Deleting the pipelines
<source lang=bash>
gst-client pipeline_delete udp_sink
gst-client pipeline_delete udp_src
gst-client pipeline_delete test_src
gst-client pipeline_delete proc_pipe
</source>
  
==== Changing between Sources ====

You can choose to listen to the UDP source with:

<source lang=bash>
gst-client element_set proc_pipe interpipe listen-to udp_src
</source>

Or to the raw video source with:

<source lang=bash>
gst-client element_set proc_pipe interpipe listen-to test_src
</source>
  
==== Setting the Output Resolution ====

The output resolution is controlled by the scaler block, and we can modify it by setting the scale_filter caps.

<source lang=bash>
NEW_WIDTH=320
NEW_HEIGHT=240
gst-client element_set proc_pipe scale_filter caps "video/x-raw(memory:NVMM),width=${NEW_WIDTH},height=${NEW_HEIGHT}"
</source>
  
==== Setting the Encoder Input Format ====

The encoder input format is also controlled by the scaler block, and we can modify it by setting the scale_filter caps. The supported values are '''NV12''', '''I420''' and '''P010_10LE'''.

<source lang=bash>
NEW_FORMAT="P010_10LE"
gst-client element_set proc_pipe scale_filter caps "video/x-raw(memory:NVMM),format=${NEW_FORMAT}"
</source>
==== Decreasing the Framerate ====

The output framerate is controlled by the framerate filter, and we can modify it by setting the maximum rate it is allowed to pass.

<source lang=bash>
gst-client element_set proc_pipe framerate_filter max-rate 30
</source>
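To make the effect of the drop-only rate limit concrete, here is a small sketch of which input frames survive a 60 -> 30 limit. This is only an illustration of the observable behavior, not videorate's actual implementation, and the helper name is hypothetical.

```python
import math

def kept_frames(input_fps, max_rate, n_frames):
    """Indices of input frames that survive a drop-only rate limit."""
    step = math.ceil(input_fps / max_rate)  # keep every step-th frame
    return list(range(0, n_frames, step))

# Limiting a 60 fps stream to 30 fps drops every other frame
print(kept_frames(60, 30, 8))  # [0, 2, 4, 6]
```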
  
==== Modifying UDP Input and Output Settings ====

To modify the input settings, just modify the udp_src pipeline:

<source lang=bash>
NEW_PORT=12346
gst-client element_set udp_src input_udp port ${NEW_PORT}
</source>

And for the output, modify the proc_pipe pipeline:

<source lang=bash>
NEW_ADDRESS=225.1.1.1
NEW_PORT=12346
gst-client element_set proc_pipe output_udp host ${NEW_ADDRESS}
gst-client element_set proc_pipe output_udp port ${NEW_PORT}
</source>
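Since the output host must stay inside the multicast range (224.0.0.0/4) for auto-multicast to make sense, a quick sanity check can catch typos before applying a new address. This helper is an illustrative sketch, not part of Gstd:

```python
import ipaddress

def is_multicast(address):
    """True if the address falls in the IPv4/IPv6 multicast range."""
    return ipaddress.ip_address(address).is_multicast

print(is_multicast("225.1.1.1"))     # True
print(is_multicast("192.168.1.10"))  # False
```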
  
==== Setting a ROI ====

You can set a specific ROI with the following command:

<source lang=bash>
NEW_TOP=100
NEW_BOTTOM=600
NEW_LEFT=100
NEW_RIGHT=1200
gst-client element_set proc_pipe cropper top ${NEW_TOP}
gst-client element_set proc_pipe cropper bottom ${NEW_BOTTOM}
gst-client element_set proc_pipe cropper left ${NEW_LEFT}
gst-client element_set proc_pipe cropper right ${NEW_RIGHT}
</source>
  
{{Ambox
|type=notice
|small=left
|issue='''NOTE''': Keep in mind that the output ROI must be greater than 120x68.
|style=width:unset;
}}
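The cropper properties are pixel coordinates, so the resulting ROI size is (right - left) x (bottom - top). A minimal sketch (hypothetical helper names) to verify the 120x68 constraint before applying new values:

```python
def roi_size(top, bottom, left, right):
    """Output ROI dimensions implied by the cropper coordinates."""
    return right - left, bottom - top

def roi_is_valid(top, bottom, left, right, min_w=120, min_h=68):
    width, height = roi_size(top, bottom, left, right)
    return width > min_w and height > min_h

# The values from the example above yield a valid 1100x500 ROI
print(roi_is_valid(100, 600, 100, 1200))  # True
```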
  
==== Tuning the Encoder Settings ====

You can modify any of the available encoder settings. You can check the [[NVIDIA_H265_Encoding_Configurations|NVIDIA H265 Encoding Configurations]] wiki to see which settings are available using GStreamer.
  
 
An example of setting the bitrate would be:

<source lang=bash>
# 10Mbit/s
gst-client element_set proc_pipe encoder bitrate 10000000

# 1Mbit/s
gst-client element_set proc_pipe encoder bitrate 1000000
</source>
  
==== Querying Bitrate and Framerate ====

You can get the current and average bitrate and framerate values by querying the perf element with the following command:

<source lang=bash>
gst-client element_get proc_pipe perf_monitor last-info
</source>
  
And it should give you an output similar to this:

<source lang=json>
{
  "code" : 0,
  "description" : "Success",
  "response" : {
    "name" : "last-info",
    "value" : "\"perf:\\ perf_monitor\\;\\ timestamp:\\ 1:03:21.318198410\\;\\ bps:\\ 3981424.000\\;\\ mean_bps:\\ 4627982.149\\;\\ fps:\\ 59.995\\;\\ mean_fps:\\ 60.023\"",
    "param" : {
      "description" : "A string containing the performance information posted to the GStreamer bus (timestamp, bps, mean_bps, fps, mean_fps)",
      "type" : "gchararray",
      "access" : "((GstdParamFlags) READ | 224)"
    }
  }
}
</source>
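If you want to consume those numbers programmatically, the ''value'' string can be unescaped and split into fields. This is a minimal parsing sketch that assumes exactly the escaping shown above; the field names (bps, mean_bps, fps, mean_fps) come from the gst-perf message:

```python
def parse_last_info(value):
    """Turn the escaped last-info string into a dict of floats."""
    # Drop the backslash escapes and the surrounding quotes
    text = value.replace("\\", "").strip('"')
    stats = {}
    for field in text.split("; "):
        key, _, val = field.partition(": ")
        if key in ("bps", "mean_bps", "fps", "mean_fps"):
            stats[key] = float(val)
    return stats

# The value string from the example response above
raw = ("\"perf:\\ perf_monitor\\;\\ timestamp:\\ 1:03:21.318198410\\;"
       "\\ bps:\\ 3981424.000\\;\\ mean_bps:\\ 4627982.149\\;"
       "\\ fps:\\ 59.995\\;\\ mean_fps:\\ 60.023\"")
print(parse_last_info(raw))
```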
== UDP Clients ==

The clients for the final stream just need to connect to the correct multicast group. This can be done on the client side by running the following pipelines:

=== x86 ===
 
<source lang=bash>
ADDRESS=224.1.1.1
PORT=12345
gst-launch-1.0 udpsrc port=${PORT} address=${ADDRESS} ! tsdemux name=demux demux. ! queue ! h265parse ! avdec_h265 ! videoconvert ! autovideosink sync=false
</source>
=== Jetson ===

<source lang=bash>
ADDRESS=224.1.1.1
PORT=12345
gst-launch-1.0 udpsrc port=${PORT} address=${ADDRESS} ! tsdemux name=demux demux. ! queue ! h265parse ! nvv4l2decoder ! nvvidconv ! xvimagesink sync=false
</source>
== Adding Metadata ==

Sometimes it might be required to include metadata in the stream to carry additional information. Our [[GStreamer_In-Band_Metadata_for_MPEG_Transport_Stream|GStreamer In-Band Metadata for MPEG Transport Stream]] solution makes this straightforward.
=== Sending Metadata through TCP ===

In this example, we will show how to send/receive metadata through a TCP socket. First, we create the TCP metadata source with the following pipeline:

<source lang=bash>
META_SOURCE_IP="10.251.101.238"
META_SOURCE_PORT="3001"
gst-client pipeline_create tcp_meta metasrc is-live=true name=meta ! tcpserversink host=${META_SOURCE_IP} port=${META_SOURCE_PORT}
</source>
The metasrc element may send any kind of binary metadata, but that requires an application. Instead, we will show how to send string data specifically, using Gstd. To do this, just set the ''metadata'' property with the text you would like to send:

<source lang=bash>
gst-client element_set tcp_meta meta metadata Hello_TCP
</source>
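For reference, the string case is tiny on the wire. Judging from the metasink dump shown later in this wiki, each transmission is the string bytes followed by a NUL terminator; a quick sketch:

```python
# Assumed wire format for the string case: the text plus a trailing NUL,
# matching the metasink hexdump (48 65 ... 50 00) shown further below
payload = "Hello_TCP".encode("ascii") + b"\x00"
print(len(payload), payload.hex(" "))  # 10 48 65 6c 6c 6f 5f 54 43 50 00
```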
This will send the metadata just once, but if we want to send it periodically we just need to set the ''period'' property of the metasrc element. For example, to send it every second:

<source lang=bash>
gst-client element_set tcp_meta meta period 1
</source>
Then we play the pipeline:

<source lang=bash>
gst-client pipeline_play tcp_meta
</source>
=== Modifying the Processing Pipe to Support Metadata ===

We can add metadata support by slightly modifying the '''proc_pipe''' from our last example.
<source lang=bash>
MULTICAST_IP="224.1.1.1"
PORT="12345"
META_SOURCE_IP="10.251.101.238"
META_SOURCE_PORT="3001"
WIDTH=640
HEIGHT=480
ROI_TOP=0
ROI_BOTTOM=720
ROI_LEFT=0
ROI_RIGHT=1200
FRAMERATE=30
gst-client pipeline_create proc_pipe interpipesrc name=interpipe listen-to=test_src is-live=true format=time ! videorate name=framerate_filter max-rate=${FRAMERATE} drop-only=true ! nvvidconv name=cropper top=${ROI_TOP} bottom=${ROI_BOTTOM} left=${ROI_LEFT} right=${ROI_RIGHT} ! capsfilter name=scale_filter caps="video/x-raw(memory:NVMM),width=${WIDTH},height=${HEIGHT}" ! queue max-size-buffers=3 leaky=downstream ! nvv4l2h265enc name=encoder bitrate=2000000 iframeinterval=300 vbv-size=33333 insert-sps-pps=true control-rate=constant_bitrate profile=Main num-B-Frames=0 ratecontrol-enable=true preset-level=UltraFastPreset EnableTwopassCBR=false maxperf-enable=true ! h265parse ! mpegtsmux alignment=7 name=mux ! queue ! udpsink name=output_udp host="${MULTICAST_IP}" port=${PORT} auto-multicast=true sync=false \
tcpclientsrc host=${META_SOURCE_IP} port=${META_SOURCE_PORT} ! queue ! mux.meta_54
</source>
Here we are adding the incoming TCP stream to the MPEG-TS multiplexer so that receiver applications can also process the metadata separately. The '''meta_54''' is an identifier for the stream, used to demux the content on the receiving side in case there are multiple metadata streams muxed.

Then we play the processing pipeline:

<source lang=bash>
gst-client pipeline_play proc_pipe
</source>
=== Receiving Metadata ===

In order to receive the metadata, we also need to modify the UDP receiver clients to demux each of the incoming streams:

<source lang=bash>
ADDRESS=224.1.1.1
PORT=12345
gst-launch-1.0 udpsrc port=${PORT} address=${ADDRESS} ! tsdemux name=demux ! queue ! h265parse ! avdec_h265 ! queue ! videoconvert ! autovideosink sync=false demux.private_0_0036 ! queue ! 'meta/x-klv' ! metasink -v
</source>
Note that in the '''private_0_0036''' identifier, the '''36''' corresponds to the hexadecimal representation of the '''meta_54''' identifier we used in the muxer.

Here the ''metasink'' element allows us to see the sent metadata without an extra application, since it dumps the contents to the standard output in a way similar to this:
<source lang=bash>
00000000 (0x7fd0f002d590): 48 65 6c 6c 6f 5f 54 43 50 00                    Hello_TCP.
00000000 (0x7fd0f002d550): 48 65 6c 6c 6f 5f 54 43 50 00                    Hello_TCP.
00000000 (0x7fd0f002d5b0): 48 65 6c 6c 6f 5f 54 43 50 00                    Hello_TCP.
00000000 (0x7fd0f002d650): 48 65 6c 6c 6f 5f 54 43 50 00                    Hello_TCP.
00000000 (0x7fd0f002d670): 48 65 6c 6c 6f 5f 54 43 50 00                    Hello_TCP.
00000000 (0x7fd0f002d690): 48 65 6c 6c 6f 5f 54 43 50 00                    Hello_TCP.
00000000 (0x7fd0f002d6b0): 48 65 6c 6c 6f 5f 54 43 50 00                    Hello_TCP.
</source>
=== Extracting and Processing Metadata ===

In case you want to apply some processing rather than just printing to the standard output, you can create an application instead of the gst-launch-1.0 pipeline and use '''appsink''' instead of ''metasink'' to extract the data. Here's an example using the last pipeline with appsink:
<source lang=python>
import sys

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib
import numpy


def message_handler(bus, msg, loop):
    """Handle gracefully the EOS and errors"""
    if msg.type in [Gst.MessageType.EOS, Gst.MessageType.ERROR]:
        loop.quit()


def on_new_sample(sink, data):
    """Get the KLV data on every buffer the appsink receives"""
    sample = sink.emit("pull-sample")
    buffer = sample.get_buffer()
    size = buffer.get_size()

    # Extract the KLV data into an array and do your processing
    klv_array = numpy.ndarray(
        size, buffer=buffer.extract_dup(0, size), dtype=numpy.uint8)

    print("\nMeta: ", end="")
    for byte in klv_array:
        print(chr(byte), end="")

    return Gst.FlowReturn.OK


def main(args):
    Gst.init(args)

    timeout_seconds = 3

    pipeline = Gst.parse_launch(
        "udpsrc address=224.1.1.1 port=12345 ! tsdemux name=demux ! queue ! h265parse ! avdec_h265 "
        "! queue ! videoconvert ! autovideosink sync=false demux.private_0_0036 "
        "! queue ! meta/x-klv ! appsink name=sink emit-signals=true")

    sink = pipeline.get_by_name("sink")
    sink.connect("new-sample", on_new_sample, sink)

    # Init GLib main loop to handle GStreamer bus events
    loop = GLib.MainLoop()

    # Listen to bus messages to handle errors and EOS
    bus = pipeline.get_bus()
    bus.add_signal_watch()
    bus.connect("message", message_handler, loop)

    print("Playing...\nPress Ctrl+C to exit\n")

    pipeline.set_state(Gst.State.PLAYING)
    pipeline.get_state(timeout_seconds * Gst.SECOND)

    try:
        loop.run()
    except BaseException:
        loop.quit()

    print("\nClosing app...")
    pipeline.set_state(Gst.State.NULL)
    pipeline.get_state(timeout_seconds * Gst.SECOND)


if __name__ == "__main__":
    sys.exit(main(sys.argv))
</source>
{{ContactUs}}

[[Category:GStreamer]][[Category:Jetson]][[Category:JetsonNano]][[Category:JetsonTX2]][[Category:NVIDIA Xavier]][[Category:JetsonXavierNX]]

Latest revision as of 05:11, 24 August 2022

Nvidia-preferred-partner-badge-rgb-for-screen.png

Error something wrong.jpg Problems running the pipelines shown on this page?
Please see our GStreamer Debugging guide for help.

RR Contact Us.png

Introduction

This wiki intends to show how to handle different NVIDIA accelerated pipelines using Gstd along with GstInterpipe. We will be using different pipelines to describe the system shown in the next figure:

Error creating thumbnail: Unable to save thumbnail to destination
System pipelines design



We will explain this example using the shell interface of Gstd on a Jetson Xavier NX device. However, you can check the other available APIs for GStreamer Daemon, which has a very similar syntax.

Procedure

First of all, we need to initialize the GStreamer Daemon. To do this open a terminal and run:

gstd

Creating the Sources

Video RAW Source

Here we will be using a videotestsrc for compatibility but you could use either v4l2src or nvarguscamerasrc depending on your camera. The raw video pipeline will have a resolution of 1280x720@60fps, to demonstrate later how to decrease the framerate. This will be one of the branches that we can choose with the input selector.

WIDTH=1280
HEIGHT=720
FRAMERATE=60
FORMAT="I420"
gst-client pipeline_create test_src videotestsrc is-live=true ! "video/x-raw,width=${WIDTH},height=${HEIGHT},framerate=${FRAMERATE}/1,format=${FORMAT}" ! interpipesink name=test_src sync=true

UDP Source

In this example we will generate a UDP stream to use as a source in the same Xavier NX, however, you could modify the udpsrc element properties to listen to any address you need.

Similar to the raw video pipeline we will use a videotestsrc stream with another pattern and send it locally through UDP so that it can be listened by the actual UDP source branch from which we choose.

PORT_H265=8000
WIDTH=1280
HEIGHT=720
FRAMERATE=60 
gst-client pipeline_create udp_sink videotestsrc pattern=ball flip=true is-live=true ! "video/x-raw,width=${WIDTH},height=${HEIGHT},framerate=${FRAMERATE}/1" ! nvvidconv ! queue max-size-buffers=3 leaky=downstream ! nvv4l2h265enc bitrate=2000000 iframeinterval=300 vbv-size=33333 insert-sps-pps=true control-rate=constant_bitrate profile=Main num-B-Frames=0 ratecontrol-enable=true preset-level=UltraFastPreset EnableTwopassCBR=false maxperf-enable=true ! h265parse ! mpegtsmux alignment=7 ! queue ! udpsink host=127.0.0.1 port=${PORT_H265}

And now we create the actual UDP source branch that is listening to the locally generated UDP stream.

gst-client pipeline_create udp_src udpsrc port=${PORT_H265} ! tsdemux ! h265parse ! nvv4l2decoder ! nvvidconv ! "video/x-raw" ! interpipesink name=udp_src sync=true

Creating the Processing Pipeline

In this stage, our processing pipeline will be able to choose from either of the raw video or UDP streams and apply some processing. The processing here includes:

  • Framerate decrease (60 -> 30)
  • ROI cropping (80px from the right)
  • Scaling (640x480)
  • Encoding (H265)
  • Performance monitoring (requires gst-perf)
  • UDP streaming with multicast
MULTICAST_IP="224.1.1.1"
PORT="12345"
WIDTH=640
HEIGHT=480
ROI_TOP=0
ROI_BOTTOM=720
ROI_LEFT=0
ROI_RIGHT=1200
FRAMERATE=30
gst-client pipeline_create proc_pipe interpipesrc name=interpipe listen-to=test_src is-live=true format=time ! videorate name=framerate_filter max-rate=${FRAMERATE} drop-only=true ! nvvidconv name=cropper top=${ROI_TOP} bottom=${ROI_BOTTOM} left=${ROI_LEFT} right=${ROI_RIGHT} ! capsfilter name=scale_filter caps="video/x-raw(memory:NVMM),width=${WIDTH},height=${HEIGHT}" ! queue max-size-buffers=3 leaky=downstream ! nvv4l2h265enc name=encoder bitrate=2000000 iframeinterval=300 vbv-size=33333 insert-sps-pps=true control-rate=constant_bitrate profile=Main num-B-Frames=0 ratecontrol-enable=true preset-level=UltraFastPreset EnableTwopassCBR=false maxperf-enable=true ! perf name=perf_monitor ! h265parse ! mpegtsmux alignment=7 ! queue ! udpsink name=output_udp host="${MULTICAST_IP}" port=${PORT} auto-multicast=true sync=false
gst-client pipeline_create proc_pipe interpipesrc name=interpipe listen-to=test_src is-live=true format=time ! videorate name=framerate_filter max-rate=${FRAMERATE} drop-only=true ! videoconvert ! nvvidconv name=cropper top=${ROI_TOP} bottom=${ROI_BOTTOM} left=${ROI_LEFT} right=${ROI_RIGHT} ! capsfilter name=scale_filter caps="video/x-raw(memory:NVMM),width=${WIDTH},height=${HEIGHT}" ! queue max-size-buffers=3 leaky=downstream ! nvv4l2h265enc name=encoder bitrate=2000000 iframeinterval=300 vbv-size=33333 insert-sps-pps=true control-rate=constant_bitrate profile=Main num-B-Frames=0 ratecontrol-enable=true preset-level=UltraFastPreset EnableTwopassCBR=false maxperf-enable=true ! perf name=perf_monitor ! h265parse ! mpegtsmux alignment=7 ! queue ! udpsink name=output_udp host="${MULTICAST_IP}" port=${PORT} auto-multicast=true sync=false

Interacting with the Pipelines

Basic Controls

  • Playing the pipelines
gst-client pipeline_play udp_sink
gst-client pipeline_play udp_src
gst-client pipeline_play test_src
gst-client pipeline_play proc_pipe
  • Pausing the pipelines
gst-client pipeline_pause udp_sink
gst-client pipeline_pause udp_src
gst-client pipeline_pause test_src
gst-client pipeline_pause proc_pipe
  • Stopping the pipelines
gst-client pipeline_stop udp_sink
gst-client pipeline_stop udp_src
gst-client pipeline_stop test_src
gst-client pipeline_stop proc_pipe
  • Deleting the pipelines
gst-client pipeline_delete udp_sink
gst-client pipeline_delete udp_src
gst-client pipeline_delete test_src
gst-client pipeline_delete proc_pipe

Changing between Sources

You can choose to listen from the UDP source with:

gst-client element_set proc_pipe interpipe listen-to udp_src

Or from the raw video source with:

gst-client element_set proc_pipe interpipe listen-to test_src

Setting the Output Resolution

The output resolution is controlled by the scaler block and we can modify it by setting the scale_filter caps.

NEW_WIDTH=320
NEW_HEIGHT=240
gst-client element_set proc_pipe scale_filter caps "video/x-raw(memory:NVMM),width=${NEW_WIDTH},height=${NEW_HEIGHT}"

Setting the Encoder Input Format

The encoder input format is controlled also by the scaler block and we can modify it by setting the scale_filter caps. The supported values are NV12, I420 and P010_10LE.

NEW_FORMAT="P010_10LE"
gst-client element_set proc_pipe scale_filter caps "video/x-raw(memory:NVMM),format=${NEW_FORMAT}"

Decreasing the Framerate

The output framerate is controlled by the framerate filter and we can modify it by setting the max rate that it is allowed to pass.

gst-client element_set proc_pipe framerate_filter max-rate 30

Modifying UDP Input and Output Settings

To modify the input settings just modify the udp_src pipeline:

NEW_PORT=12346
gst-client element_set udp_src input_udp port ${NEW_PORT}

And for the output modify the proc_pipe pipeline:

NEW_ADDRESS=225.1.1.1
NEW_PORT=12346
gst-client element_set proc_pipe output_udp host ${NEW_ADDRESS}
gst-client element_set proc_pipe output_udp port ${NEW_PORT}

Setting a ROI

You can set a specific ROI with the following command:

NEW_TOP=100
NEW_BOTTOM=600
NEW_LEFT=100
NEW_RIGHT=1200
gst-client element_set proc_pipe cropper top ${NEW_TOP}
gst-client element_set proc_pipe cropper bottom ${NEW_BOTTOM}
gst-client element_set proc_pipe cropper left ${NEW_LEFT}
gst-client element_set proc_pipe cropper right ${NEW_RIGHT}

Tuning the Encoder Settings

You can modify any of the available encoder settings. You can check this NVIDIA_H265_Encoding_Configurations to see which settings are available using GStreamer.

An example for setting the bitrate would be:

# 10Mbit/s
gst-client element_set proc_pipe encoder bitrate 10000000

# 1Mbit/s
gst-client element_set proc_pipe encoder bitrate 1000000

Querying Bitrate and Framerate

You can get the values of current and average bitrate and framerate by querying the perf element with the following command:

gst-client element_get proc_pipe perf_monitor last-info

And it should give you an output similar to this:

{
  "code" : 0,
  "description" : "Success",
  "response" : {
    "name" : "last-info",
    "value" : "\"perf:\\ perf_monitor\\;\\ timestamp:\\ 1:03:21.318198410\\;\\ bps:\\ 3981424.000\\;\\ mean_bps:\\ 4627982.149\\;\\ fps:\\ 59.995\\;\\ mean_fps:\\ 60.023\"",
    "param" : {
        "description" : "A string containing the performance information posted to the GStreamer bus (timestamp, bps, mean_bps, fps, mean_fps)",
        "type" : "gchararray",
        "access" : "((GstdParamFlags) READ | 224)"
    }
}
}

UDP Clients

The clients for the final stream just need to connect to the correct multicast group. This can be done on the client-side by running the following pipelines:

x86

ADDRESS=224.1.1.1
PORT=12345
gst-launch-1.0 udpsrc port=${PORT} address=${ADDRESS} ! tsdemux name=demux demux. ! queue !  h265parse ! avdec_h265 ! videoconvert ! autovideosink sync=false

Jetson

ADDRESS=224.1.1.1
PORT=12345
gst-launch-1.0 udpsrc port=${PORT} address=${ADDRESS} ! tsdemux name=demux demux. ! queue !  h265parse ! nvv4l2decoder ! nvvidconv ! xvimagesink sync=false


Adding Metadata

Sometimes it might be required to include metadata in the stream to add any additional information. Our solution GStreamer In-Band Metadata for MPEG Transport Stream makes it easier for your convenience.

Sending Metadata through TCP

In this example, we will show how to send/receive the metadata through a TCP socket. First, we create the TCP metadata source with the following pipeline:

META_SOURCE_IP="10.251.101.238"
META_SOURCE_PORT="3001"
gst-client pipeline_create tcp_meta metasrc is-live=true name=meta ! tcpserversink host=${META_SOURCE_IP} port=${META_SOURCE_PORT}

The metasrc element may send any kind of binary metadata, but it requires an application to do that. Instead, we will show how to send specifically string data using gstd. To do this just set the metadata property with the text you would like to send:

gst-client element_set tcp_meta meta metadata Hello_TCP

This will send the metadata just once, but if we want it to send it periodically we just need to set the period property of the metasrc element. For example, if we want to send it every second it would be like this:

gst-client element_set tcp_meta meta period 1

Then we play the pipeline:

gst-client pipeline_play tcp_meta
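Once this pipeline is playing, tcpserversink serves the raw metadata bytes to any TCP client, so the output can be checked without GStreamer at all. The sketch below is a hypothetical checker (the host and port are the same assumed values as above); it splits the stream on the NUL terminator that metasrc appends to each string message, as the metasink dump further down confirms:

```python
import socket


def split_messages(data: bytes) -> list:
    """Split a raw metasrc byte stream into its NUL-terminated text messages."""
    return [chunk.decode("utf-8", errors="replace")
            for chunk in data.split(b"\x00") if chunk]


def read_metadata(host: str, port: int, nbytes: int = 64) -> list:
    """Connect to the tcpserversink and decode one chunk of the metadata stream."""
    with socket.create_connection((host, port), timeout=5) as sock:
        return split_messages(sock.recv(nbytes))


if __name__ == "__main__":
    # Requires the tcp_meta pipeline above to be playing on this host/port
    try:
        print(read_metadata("10.251.101.238", 3001))
    except OSError as err:
        print("no metadata server reachable:", err)
```

With the period property set as above, repeated connections should each return the configured string.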

=== Modifying the Processing Pipe to Support Metadata ===

We can add metadata support by slightly modifying the proc_pipe of our last example.

MULTICAST_IP="224.1.1.1"
PORT="12345"
META_SOURCE_IP="10.251.101.238"
META_SOURCE_PORT="3001"
WIDTH=640
HEIGHT=480
ROI_TOP=0
ROI_BOTTOM=720
ROI_LEFT=0
ROI_RIGHT=1200
FRAMERATE=30
gst-client pipeline_create proc_pipe interpipesrc name=interpipe listen-to=test_src is-live=true format=time ! videorate name=framerate_filter max-rate=${FRAMERATE} drop-only=true ! nvvidconv name=cropper top=${ROI_TOP} bottom=${ROI_BOTTOM} left=${ROI_LEFT} right=${ROI_RIGHT} ! capsfilter name=scale_filter caps="video/x-raw(memory:NVMM),width=${WIDTH},height=${HEIGHT}" ! queue max-size-buffers=3 leaky=downstream ! nvv4l2h265enc name=encoder bitrate=2000000 iframeinterval=300 vbv-size=33333 insert-sps-pps=true control-rate=constant_bitrate profile=Main num-B-Frames=0 ratecontrol-enable=true preset-level=UltraFastPreset EnableTwopassCBR=false maxperf-enable=true ! h265parse ! mpegtsmux alignment=7 name=mux ! queue ! udpsink name=output_udp host="${MULTICAST_IP}" port=${PORT} auto-multicast=true sync=false \
tcpclientsrc host=${META_SOURCE_IP} port=${META_SOURCE_PORT} ! queue ! mux.meta_54

Here we add the incoming TCP stream to the MPEG-TS multiplexer so that receiver applications can also process the metadata separately. meta_54 is an identifier for the stream, used to demux the content on the receiving side in case multiple metadata streams are muxed.

Then we play the processing pipeline:

gst-client pipeline_play proc_pipe

=== Receiving Metadata ===

In order to receive the metadata we also need to modify the UDP receiver clients to demux each of the incoming streams:

ADDRESS=224.1.1.1
PORT=12345
gst-launch-1.0 udpsrc port=${PORT} address=${ADDRESS} ! tsdemux name=demux ! queue ! h265parse ! avdec_h265 ! queue ! videoconvert ! autovideosink sync=false demux.private_0_0036 ! queue ! 'meta/x-klv' ! metasink -v

Note that in the private_0_0036 identifier, the 36 is the hexadecimal representation of the meta_54 identifier (54 in decimal) we used in the muxer.

Here the metasink element lets us inspect the sent metadata without an extra application, since it dumps the contents to the standard output similar to this:

00000000 (0x7fd0f002d590): 48 65 6c 6c 6f 5f 54 43 50 00                    Hello_TCP.      
00000000 (0x7fd0f002d550): 48 65 6c 6c 6f 5f 54 43 50 00                    Hello_TCP.      
00000000 (0x7fd0f002d5b0): 48 65 6c 6c 6f 5f 54 43 50 00                    Hello_TCP.      
00000000 (0x7fd0f002d650): 48 65 6c 6c 6f 5f 54 43 50 00                    Hello_TCP.      
00000000 (0x7fd0f002d670): 48 65 6c 6c 6f 5f 54 43 50 00                    Hello_TCP.      
00000000 (0x7fd0f002d690): 48 65 6c 6c 6f 5f 54 43 50 00                    Hello_TCP.      
00000000 (0x7fd0f002d6b0): 48 65 6c 6c 6f 5f 54 43 50 00                    Hello_TCP.

=== Extracting and Processing Metadata ===

If you want to process the metadata rather than just print it to the standard output, you can replace the gst-launch-1.0 pipeline with an application that uses appsink instead of metasink. Here is a Python example based on the last pipeline:

import sys

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib
import numpy


def message_handler(bus, msg, loop):
    """Handle the EOS and error messages gracefully"""
    if msg.type in [Gst.MessageType.EOS, Gst.MessageType.ERROR]:
        loop.quit()


def on_new_sample(sink, data):
    """Get the KLV data from every buffer the appsink receives"""
    sample = sink.emit("pull-sample")
    buffer = sample.get_buffer()
    size = buffer.get_size()

    # Extract the KLV data into an array and do your processing
    klv_array = numpy.ndarray(
        size, buffer=buffer.extract_dup(0, size), dtype=numpy.uint8)

    print("\nMeta: ", end="")
    for byte in klv_array:
        print(chr(byte), end="")

    return Gst.FlowReturn.OK


def main(args):
    Gst.init(args)

    timeout_seconds = 3

    pipeline = Gst.parse_launch(
        "udpsrc address=224.1.1.1 port=12345 ! tsdemux name=demux ! queue ! h265parse ! avdec_h265 "
        "! queue ! videoconvert ! autovideosink sync=false demux.private_0_0036 "
        "! queue ! meta/x-klv ! appsink name=sink emit-signals=true")

    sink = pipeline.get_by_name("sink")
    sink.connect("new-sample", on_new_sample, sink)

    # Init the GLib main loop to handle GStreamer bus events
    loop = GLib.MainLoop()

    # Listen to bus messages to handle errors and EOS
    bus = pipeline.get_bus()
    bus.add_signal_watch()
    bus.connect("message", message_handler, loop)

    print("Playing...\nPress Ctrl+C to exit\n")

    pipeline.set_state(Gst.State.PLAYING)
    pipeline.get_state(timeout_seconds * Gst.SECOND)

    try:
        loop.run()
    except BaseException:
        loop.quit()

    print("\nClosing app...")
    pipeline.set_state(Gst.State.NULL)
    pipeline.get_state(timeout_seconds * Gst.SECOND)


if __name__ == "__main__":
    sys.exit(main(sys.argv))
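When the metadata is known to be text, the per-byte chr() loop in on_new_sample can be replaced by decoding the extracted bytes directly. The helper below is a small illustrative sketch; stripping the trailing NUL matches the terminating 00 byte seen in the metasink dump above:

```python
def klv_to_text(data: bytes) -> str:
    """Decode a text metadata buffer, dropping the trailing NUL terminator."""
    return data.rstrip(b"\x00").decode("utf-8", errors="replace")


# Inside on_new_sample the extracted buffer could then be handled as:
#     text = klv_to_text(buffer.extract_dup(0, size))
print(klv_to_text(b"Hello_TCP\x00"))  # -> Hello_TCP
```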

