GStreamer pipelines for Jetson with GstQtOverlay



Previous: Examples/i.MX6 Index Next: Examples/User_Cases






All the examples presented for the PC-based platform run without issues on an NVIDIA Jetson. Additionally, you can use NVMM support on Jetson provided it runs JetPack 4.x, which can be even faster since it reduces the number of memory copies between userspace and device memory. NVMM is not yet supported on Jetson with JetPack 5.x.
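
A quick way to confirm that NVMM buffers are actually being negotiated is to run a short pipeline with the -v flag and look for memory:NVMM in the printed caps. The following is only a sketch; the exact caps are an assumption and may differ on your board:

# Look for video/x-raw(memory:NVMM) in the verbose output
gst-launch-1.0 -v nvarguscamerasrc num-buffers=60 ! nvvidconv ! 'video/x-raw(memory:NVMM),format=RGBA' ! fakesink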

Depending on your setup (whether you have a display connected or not), take into account configuring the display server by setting the variables presented in Running without a graphics server. The examples in this page were executed over SSH, using the following variables:

export QT_QPA_PLATFORM=eglfs
export DISPLAY=:1.0
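
If you prefer not to export the variables for the whole SSH session, they can also be set per command. For example, reusing one of the pipelines shown later in this page (purely illustrative):

QT_QPA_PLATFORM=eglfs DISPLAY=:1.0 gst-launch-1.0 videotestsrc ! nvvidconv ! 'video/x-raw(memory:NVMM)' ! qtoverlay qml=main.qml ! nvoverlaysink sync=false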

Display a text overlay

Take the file main.qml used in the PC example.

On Jetson, you can use the nvoverlaysink element, which removes a memory copy (from NVMM to userspace) and a CPU-based videoconvert:

gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! qtoverlay qml=main.qml ! nvoverlaysink sync=false

To use nvarguscamerasrc, your Jetson must have a camera connected. An equivalent pipeline that is explicit about the format and memory delivered to the sink is the following:

gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! qtoverlay qml=main.qml ! nvvidconv ! "video/x-raw(memory:NVMM),format=I420" ! nvoverlaysink sync=false

The nvvidconv elements convert to RGBA and back because of Qt compatibility.
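
For reference, the following sketch is the same pipeline with the RGBA conversion written explicitly; the exact caps may need adjusting on your setup:

# qtoverlay receives RGBA in NVMM memory; the sink receives I420 in NVMM memory
gst-launch-1.0 nvarguscamerasrc \
   ! nvvidconv ! "video/x-raw(memory:NVMM),format=RGBA" \
   ! qtoverlay qml=main.qml \
   ! nvvidconv ! "video/x-raw(memory:NVMM),format=I420" \
   ! nvoverlaysink sync=false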

You can still use NVMM when combining it with user-space elements, such as videoconvert or videotestsrc:

gst-launch-1.0 videotestsrc ! nvvidconv ! 'video/x-raw(memory:NVMM)' ! qtoverlay qml=main.qml ! nvvidconv ! xvimagesink sync=false
gst-launch-1.0 videotestsrc ! nvvidconv ! 'video/x-raw(memory:NVMM)' ! qtoverlay qml=main.qml ! nvoverlaysink sync=false

Any of these pipelines will produce the same result, as shown in Figure 1.

Figure 1. Simple example with label and camera source


Animated gif overlay

Take the file animation.qml used in the PC example.

Run the example shown above using NVMM:

gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! qtoverlay qml=animation.qml ! nvoverlaysink

You can also use the pipelines shown in the other examples; the only change is the qml property of qtoverlay.

gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! qtoverlay qml=animation.qml ! nvvidconv ! "video/x-raw(memory:NVMM),format=I420" ! nvoverlaysink
Figure 2. Simple example with gif and camera source

NvArgus camera to display

Take the file main.qml used in the PC example.

This example manages the memory throughout the pipeline in NVMM format, never bringing the data to userspace.

gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! qtoverlay qml=main.qml ! nvoverlaysink
gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! qtoverlay qml=main.qml ! nvvidconv ! "video/x-raw(memory:NVMM),format=I420" ! nvoverlaysink

If you want to use X for displaying:

gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! qtoverlay qml=main.qml ! nvvidconv ! nvvidconv ! ximagesink sync=false

The first nvvidconv converts from any format to RGBA. The two nvvidconv elements at the end handle the conversion from RGBA to a format accepted by ximagesink, and the transfer from NVMM memory to user-space memory.
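
The same chain with the intermediate caps spelled out may make this clearer. This is only a sketch; the final format is left open so that ximagesink can negotiate whatever your X server supports:

# RGBA (NVMM) into qtoverlay, then back to a system-memory format for ximagesink
gst-launch-1.0 nvarguscamerasrc \
   ! nvvidconv ! "video/x-raw(memory:NVMM),format=RGBA" \
   ! qtoverlay qml=main.qml \
   ! nvvidconv ! "video/x-raw(memory:NVMM)" \
   ! nvvidconv ! "video/x-raw" \
   ! ximagesink sync=false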

V4L2 camera to display

Take the file main.qml used in the PC example.

Except for v4l2src, this example manages the memory in NVMM format, never bringing the data to userspace.

gst-launch-1.0 v4l2src ! nvvidconv ! qtoverlay qml=main.qml ! nvoverlaysink

Capturing and saving into a file

A display is not needed to use the GstQtOverlay plug-in. However, the display variables should still be set so that the Qt engine has a reference to the available GPU resources, especially for EGL.

You can even have pipelines with multiple paths. The following example uses a 4K camera as the capture device:

With NVMM support:

gst-launch-1.0 nvarguscamerasrc \
   ! nvvidconv \
   ! qtoverlay qml=main.qml \
   ! nvvidconv \
   ! nvv4l2h264enc  maxperf-enable=1 \
   ! h264parse \
   ! qtmux ! filesink location=test.mp4 -e

Some important aspects of NVMM support can be observed from the pipeline shown above:

1. Each qtoverlay element should receive a fresh NVMM buffer, since it works in-place. This is achieved by placing an nvvidconv element before it; see the sketch after this list.

2. The storage can become a bottleneck. It is recommended to dump the file to a fast storage unit such as an SSD or a RAM disk.
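
To illustrate point 1, the following sketch splits the capture with a tee into a display branch and a recording branch. The nvvidconv in front of qtoverlay hands it a fresh NVMM buffer, so the in-place overlay drawing does not leak into the display branch. The file name and encoder settings are assumptions based on the recording example above:

# Display branch shows the raw camera; recording branch draws the overlay on its own buffer
gst-launch-1.0 -e nvarguscamerasrc ! tee name=t \
   t. ! queue ! nvoverlaysink sync=false \
   t. ! queue ! nvvidconv ! qtoverlay qml=main.qml ! nvvidconv \
      ! nvv4l2h264enc maxperf-enable=1 ! h264parse ! qtmux \
      ! filesink location=test_tee.mp4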

Without NVMM

The following pipeline uses qtoverlay in non-NVMM mode. In this case, take into account that the video conversion and the encoding are done on the CPU. Additionally, a video conversion from RGBA is needed because of Qt compatibility.

gst-launch-1.0 nvarguscamerasrc \
   ! nvvidconv \
   ! qtoverlay qml=main.qml \
   ! videoconvert \
   ! x264enc \
   ! h264parse \
   ! qtmux ! filesink location=test.mp4 -e

For an even simpler case:

gst-launch-1.0 videotestsrc \
   ! qtoverlay qml=main.qml \
   ! videoconvert \
   ! x264enc \
   ! h264parse \
   ! qtmux ! filesink location=test.mp4 -e

The pipelines shown above were tested on a Jetson Nano, performing faster than 30 fps.
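
If you want to measure the frame rate of a given pipeline on your own board, one option is the standard fpsdisplaysink element. The following is a minimal sketch, not specific to GstQtOverlay:

# With -v, fpsdisplaysink periodically reports rendered/dropped frames and the current fps
gst-launch-1.0 -v videotestsrc ! qtoverlay qml=main.qml ! videoconvert \
   ! fpsdisplaysink video-sink=fakesink text-overlay=false sync=false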

Example with property set

The same fourth example from the PC examples is used here, just with a pipeline change.

  • Pipeline:
nvarguscamerasrc sensor-id=0 ! video/x-raw(memory:NVMM),width=2880,height=1860,framerate=20/1,format=NV12 ! nvvidconv ! video/x-raw,width=600,height=500,framerate=20/1,format=RGBA ! qtoverlay qml=main.qml name=my_overlay ! nvvidconv ! nvv4l2h265enc ! h265parse ! qtmux ! filesink location=video_out.mp4
Figure 3. Example with property set and video source


Stream video over the network from an RTSP source

Let's assume:

SOURCE_ADRESS=rtsp://10.251.101.176:5004/test   # origin RTSP video
DEST_ADRESS=10.251.101.57                       # destination computer/board
PORT=5004                                       # source/destination port


On an NVIDIA Jetson board:

gst-launch-1.0 rtspsrc location=$SOURCE_ADRESS ! rtph264depay ! h264parse  ! omxh264dec ! nvvidconv ! video/x-raw ! qtoverlay qml=main.qml ! videoconvert ! video/x-raw,format=I420 ! queue ! jpegenc ! rtpjpegpay ! udpsink host=$DEST_ADRESS port=$PORT sync=false enable-last-sample=false max-lateness=00000000  -v
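
On the destination computer ($DEST_ADRESS), a matching receiver could look like the following sketch; the RTP caps are an assumption based on what rtpjpegpay typically produces:

# Receive the RTP/JPEG stream and display it
gst-launch-1.0 udpsrc port=$PORT \
   ! "application/x-rtp,media=video,encoding-name=JPEG,payload=26,clock-rate=90000" \
   ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink sync=false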


Previous: Examples/i.MX6 Index Next: Examples/User_Cases