Qualcomm Robotics RB5 - Encoding GStreamer Pipelines
Revision as of 12:43, 19 July 2023
Qualcomm Robotics RB5 RidgeRun documentation is currently under development.
In this section, we present some GStreamer pipelines that capture from the MIPI interface main camera (IMX577) on the Qualcomm Robotics RB5 development kit and encode the video in H264 and H265[1]. The encoding is done in two ways: hardware accelerated encoding using OpenMAX, and encoding without hardware acceleration[2]. We also measure key performance indicators for each pipeline. Check our GStreamer Pipelines section for more information about how we extracted the performance metrics presented here. For latency, we point the camera at a stopwatch while running the encoding pipeline and check the stopwatch reading in the last frame of the generated video.
Hardware Accelerated Encoding
The Qualcomm Robotics RB5 uses OpenMAX to perform hardware accelerated encoding. Because of the camera architecture, the source element qtiqmmfsrc can deliver already-encoded video thanks to the QMMF server. We therefore explore two encoding options. The first is setting the caps of the source element directly, so that it requests encoded video from the QMMF server. The second is using the omxh264enc element from the gst-omx plugin, which maps the OpenMAX APIs and states to those of GStreamer. In both cases, we encode in both H264 and H265.
Encoding directly from camera
Here we encode the video by setting the caps of the source element qtiqmmfsrc so that it uses the QMMF server, which allows it to stream already-encoded video.
H264
The following pipeline captures from the main camera and sets the caps of the source element to request H264 encoding. We then parse the stream and save it as an MP4 file at /data/HW_H264_camera.mp4. The capture has a resolution of 1920x1080 at 30 fps. Table 1 shows the performance metrics for this pipeline.
gst-launch-1.0 -e -v qtiqmmfsrc camera=0 ! "video/x-h264,format=NV12,width=1920,height=1080,framerate=30/1" ! h264parse \
    ! queue ! mp4mux ! filesink location="/data/HW_H264_camera.mp4"
Table 1. Performance metrics for direct H264 encoding from the camera.

| Operation Mode  | CPU (%) | FPS    |
|-----------------|---------|--------|
| Max performance | 15.4    | 29.996 |
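To check the resulting file, a quick inspection and playback sketch. This assumes the recording above exists, that the gst-plugins-base tools are installed, and that a software H264 decoder such as avdec_h264 from gst-libav is available; on the RB5, a hardware decoder element could be used instead.

```shell
# Print container, codec, resolution, and framerate of the recording
gst-discoverer-1.0 /data/HW_H264_camera.mp4

# Decode and display it; avdec_h264 is a software decoder from gst-libav
gst-launch-1.0 filesrc location=/data/HW_H264_camera.mp4 ! qtdemux ! \
    h264parse ! avdec_h264 ! videoconvert ! autovideosink
```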
H265
The following pipeline captures from the main camera and sets the caps of the source element to request H265 encoding. We then parse the stream and save it as an MP4 file at /data/HW_H265_camera.mp4. The capture has a resolution of 1920x1080 at 30 fps. Table 2 shows the performance metrics for this pipeline.
gst-launch-1.0 -e -v qtiqmmfsrc camera=0 ! "video/x-h265,format=NV12,profile=main,level=(string)5.2,width=1920,height=1080,framerate=30/1" ! h265parse \
    ! queue ! mp4mux ! filesink location="/data/HW_H265_camera.mp4"
Table 2. Performance metrics for direct H265 encoding from the camera.

| Operation Mode  | CPU (%) | FPS    |
|-----------------|---------|--------|
| Max performance | 15.3    | 29.999 |
Encoding using OpenMAX plugins
Here we encode the video using the OpenMAX plugins, which let the pipeline use the platform's hardware accelerated encoders. These elements expose encoder properties such as target-bitrate, which we set in the example pipelines.
H264
The following pipeline captures from the main camera, and then the OMX plugin performs the H264 encoding. We then parse the stream and save it as an MP4 file at /data/HW_H264_OpenMax.mp4. The capture has a resolution of 1920x1080 at 30 fps. Table 3 shows the performance metrics for this pipeline.
gst-launch-1.0 -e -v qtiqmmfsrc camera=0 ! "video/x-raw(memory:GBM),format=NV12,width=1920,height=1080,framerate=30/1,profile=high,level=(string)5.1" \
    ! omxh264enc target-bitrate=8000000 ! h264parse ! queue ! mp4mux ! filesink location="/data/HW_H264_OpenMax.mp4"
Table 3. Performance metrics for OpenMAX H264 encoding.

| Operation Mode  | CPU (%) | FPS    | ProcTime (ms) |
|-----------------|---------|--------|---------------|
| Max performance | 16.7    | 29.985 | 9.949         |
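The target-bitrate property is given in bits per second, so the pipelines above request roughly 8 Mbps. A back-of-the-envelope sketch of the expected file size, assuming a hypothetical 60 s recording and ignoring container overhead:

```shell
# Estimate the video payload size produced at a given target-bitrate.
# target-bitrate is in bits per second; 8000000 matches the pipelines above.
BITRATE=8000000
DURATION=60   # hypothetical 60 s recording, for illustration only
BYTES=$((BITRATE * DURATION / 8))
echo "${BYTES} bytes, about $((BYTES / 1000000)) MB"
# → 60000000 bytes, about 60 MB
```

At 30 fps this also means an average budget of about 33 KB per frame, which the encoder distributes unevenly between key frames and predicted frames.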
H265
The following pipeline captures from the main camera, and then the OMX plugin performs the H265 encoding. We then parse the stream and save it as an MP4 file at /data/HW_H265_OpenMax.mp4. The capture has a resolution of 1920x1080 at 30 fps. Table 4 shows the performance metrics for this pipeline.
gst-launch-1.0 -e -v qtiqmmfsrc camera=0 ! "video/x-raw(memory:GBM),level=(string)5.2,format=NV12,width=1920,height=1080,framerate=30/1" \
    ! omxh265enc target-bitrate=8000000 ! h265parse ! queue ! mp4mux ! filesink location="/data/HW_H265_OpenMax.mp4"
Table 4. Performance metrics for OpenMAX H265 encoding.

| Operation Mode  | CPU (%) | FPS    | ProcTime (ms) |
|-----------------|---------|--------|---------------|
| Max performance | 16.4    | 30.005 | 12.787        |
Software encoding
H264 Composed
Now we show an example pipeline that encodes video in H264 format using software encoding. To do this, we use the x264enc element from GStreamer. The pipeline captures from both cameras on the Qualcomm Robotics RB5 Development Kit, the IMX577 and the OV9282, then uses the compositor element to place both frames side by side in the output video. Finally, the composed video is encoded to H264 and saved to an MP4 file.
gst-launch-1.0 -v -e qtiqmmfsrc camera=0 name=imx577 ! "video/x-raw,format=NV12,width=640,height=400,framerate=15/1" ! videoconvert ! compositor.sink_0 \
qtiqmmfsrc camera=1 name=ov9282 ! "video/x-raw,format=NV12,width=640,height=400,framerate=15/1" ! videoconvert ! compositor.sink_1 \
compositor background=1 name=compositor sink_0::xpos=0 sink_0::ypos=0 sink_1::xpos=640 sink_1::ypos=0 \
! "video/x-raw,format=NV12,width=1280,height=400,framerate=15/1" ! videoconvert ! x264enc ! h264parse \
! mp4mux ! queue ! filesink location=/data/SW_composed_H264.mp4
| Operation Mode  | CPU (%) | FPS    |
|-----------------|---------|--------|
| Max performance | 25.2    | 15.008 |
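Software encoding cost can be traded against quality through x264enc properties. A hedged single-camera variant using the synthetic videotestsrc so it runs without the RB5 cameras; speed-preset, tune, and bitrate (in kbit/s) are standard x264enc properties:

```shell
# Faster preset and zero-latency tuning reduce CPU load at some quality cost
gst-launch-1.0 -e videotestsrc num-buffers=150 ! \
    "video/x-raw,format=NV12,width=1280,height=400,framerate=15/1" ! \
    x264enc speed-preset=ultrafast tune=zerolatency bitrate=2000 ! \
    h264parse ! mp4mux ! filesink location=/tmp/SW_test_H264.mp4
```

The same properties can be set on x264enc in the dual-camera pipeline above; ultrafast with zerolatency is the cheapest configuration, at the price of lower compression efficiency.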
Taking Snapshots
Another useful encoding format is JPEG. With a JPEG encoder, we can use the cameras on the Qualcomm Robotics RB5 to take multiple snapshots. For this, we use the jpegenc element from GStreamer's good plugins. In the pipeline, we still use Qualcomm's qtiqmmfsrc element to capture from the camera; every captured frame is then encoded and saved to its own file.
gst-launch-1.0 -v -e qtiqmmfsrc camera=0 ! "video/x-raw,width=1280,height=800,framerate=30/1" ! videoconvert ! jpegenc ! queue ! multifilesink location="/data/capture%05d.jpg"
| Operation Mode  | CPU (%) | FPS    |
|-----------------|---------|--------|
| Max performance | 16.4    | 29.867 |
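The snapshot pipeline above keeps writing JPEG files until it is stopped. A hedged sketch of bounding it: num-buffers is a standard GstBaseSrc property (assumed here to also work on qtiqmmfsrc) that stops the source after N frames, and multifilesink's max-files property keeps only the newest N files on disk:

```shell
# Capture exactly 10 frames, then send EOS and stop;
# keep at most 10 JPEG files on disk (oldest are deleted first)
gst-launch-1.0 -e -v qtiqmmfsrc camera=0 num-buffers=10 ! \
    "video/x-raw,width=1280,height=800,framerate=30/1" ! videoconvert ! \
    jpegenc ! queue ! multifilesink location="/data/capture%05d.jpg" max-files=10
```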
References

[1] Camera Capture/Encode. Retrieved February 17, 2023, from https://developer.qualcomm.com/qualcomm-robotics-rb5-kit/software-reference-manual/camera-and-video/camera-capture-and-encode
[2] GStreamer Plugins, omxh264enc. Retrieved February 17, 2023, from https://developer.qualcomm.com/qualcomm-robotics-rb5-kit/software-reference-manual/application-semantics/gstreamer-plugins/omxh264enc