Sony IMX264 Linux Driver



Problems running the pipelines shown on this page? Please see our GStreamer Debugging guide for help.



Sony IMX264LQR Features

The IMX264 is a CMOS image sensor with a 3.45 µm pixel and a global shutter function for industrial applications. This small 3.45 µm pixel achieves higher sensitivity and lower noise than the existing 5.86 µm pixel products, delivering high picture quality, high resolution, and high-speed imaging without focal-plane distortion. In addition, these CMOS image sensors provide a variety of functions, such as a trigger mode that controls the storage time through an external trigger signal and an ROI (region of interest) mode. (Taken from Sony's webpage.)

Sony IMX264LLR/LQR, IMX265LLR/LQR Sensor Specifications

Sony IMX264 Linux Driver Support

  • NVIDIA Jetson Xavier NX

IMX264 Linux Driver Features

Xavier NX

Feature                            | Details             | SDK Support
2656x2088@35fps                    | 2 Lanes             | L4T 32.4.4 / JetPack 4.4.1
1920x1106@30fps                    | 2 Lanes             | L4T 32.4.4 / JetPack 4.4.1
V4L2 Media Controller driver       |                     | L4T 32.4.4 / JetPack 4.4.1
ISP usage through nvarguscamerasrc | Only RGGB12 support | L4T 32.4.4 / JetPack 4.4.1
Gain control                       |                     | L4T 32.4.4 / JetPack 4.4.1
Exposure control                   |                     | L4T 32.4.4 / JetPack 4.4.1
Gain delay control                 |                     | L4T 32.4.4 / JetPack 4.4.1
Black level control                |                     | L4T 32.4.4 / JetPack 4.4.1
2 simultaneous cameras             |                     | L4T 32.4.4 / JetPack 4.4.1

Enabling the IMX264 Linux Driver

In order to use this driver, you have to patch and compile the kernel sources provided by JetPack:

  • Once you have the source code, apply the following patch in order to add the changes required by the IMX264 camera at the kernel and device tree (DTB) level (a sketch of this step is shown after this list):
4.4.1_imx264.patch
  • Follow the instructions in Build Kernel to build the kernel, and then flash the image.
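
A minimal sketch of the patch step, assuming the L4T 32.4.4 public sources downloaded by JetPack are extracted under Linux_for_Tegra/source/public (the exact path and -p level depend on your setup and on how the patch was generated):

# Apply the IMX264 kernel and device tree changes on top of the L4T sources
cd Linux_for_Tegra/source/public
patch -p1 < /path/to/4.4.1_imx264.patch   # adjust the -p level if the patch does not apply cleanly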

Make sure to enable IMX264 driver support:

make menuconfig
-> Device Drivers                                                                                                                        
  -> Multimedia support                                                                                           
    -> NVIDIA overlay Encoders, decoders, sensors and other helper chips 
       -> <*> IMX264 camera sensor support
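
After flashing, a quick way to confirm that the driver probed and that the expected modes are exposed is through v4l2-ctl. The commands below assume the sensor enumerates as /dev/video0; bypass_mode is NVIDIA's standard V4L2 control to stream directly from the sensor without the ISP:

# List the video devices registered on the system
v4l2-ctl --list-devices

# List the resolutions and pixel formats exposed by the IMX264 node
v4l2-ctl -d /dev/video0 --list-formats-ext

# Capture 100 raw Bayer frames directly from V4L2 (no ISP involved)
v4l2-ctl -d /dev/video0 --set-ctrl bypass_mode=0 --stream-mmap --stream-count=100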

GStreamer Examples: Testing the IMX264 Linux Driver

Capture and Display

  • 2656x2088@35fps
gst-launch-1.0 nvarguscamerasrc sensor-id=0 sensor-mode=0 ! queue ! nvvidconv ! autovideosink
  • 1920x1106@30fps
gst-launch-1.0 nvarguscamerasrc sensor-id=0 sensor-mode=1 ! queue ! nvvidconv ! autovideosink
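
The driver supports two simultaneous cameras, so both sensors can be displayed from a single pipeline. A minimal dual-capture sketch, assuming the second camera enumerates as sensor-id=1:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 sensor-mode=0 ! queue ! nvvidconv ! autovideosink nvarguscamerasrc sensor-id=1 sensor-mode=0 ! queue ! nvvidconv ! autovideosink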

Capture and Stream

  • 2656x2088@35fps
HOST=<Host ip>
PORT1=<port1>
gst-launch-1.0 nvarguscamerasrc sensor-id=0 sensor-mode=0 ! nvvidconv !  nvv4l2h264enc ! rtph264pay config-interval=10  ! udpsink host=$HOST port=$PORT1

And on the host side:

PORT1=<port1>
gst-launch-1.0 udpsrc port=$PORT1 ! 'application/x-rtp, media=(string)video, encoding-name=(string)H264' !  queue ! rtph264depay ! avdec_h264 ! videoconvert ! xvimagesink
  • 1920x1106@30fps
HOST=<Host ip>
PORT1=<port1>
gst-launch-1.0 nvarguscamerasrc sensor-id=0 sensor-mode=1 ! nvvidconv !  nvv4l2h264enc ! rtph264pay config-interval=10  ! udpsink host=$HOST port=$PORT1

And on the host side:

PORT1=<port1>
gst-launch-1.0 udpsrc port=$PORT1 ! 'application/x-rtp, media=(string)video, encoding-name=(string)H264' !  queue ! rtph264depay ! avdec_h264 ! videoconvert ! xvimagesink
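
To save the capture to a file instead of streaming it over the network, a minimal recording sketch (the output filename is arbitrary; the -e flag makes gst-launch-1.0 finalize the MP4 file on Ctrl+C):

gst-launch-1.0 -e nvarguscamerasrc sensor-id=0 sensor-mode=0 ! nvvidconv ! nvv4l2h264enc ! h264parse ! qtmux ! filesink location=imx264_capture.mp4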

Analog gain

Analog gain can range from 1 to 480.

v4l2-ctl -c gain=<value>

Exposure

Exposure time is specified in microseconds (µs). The minimum value is 52 µs in all-pixel mode and 43 µs in Full-HD mode; the maximum value is 36091 µs in all-pixel mode and 16105 µs in Full-HD mode. The command to set this value is:

v4l2-ctl -c exposure=<value>

Black level

This control sets the black level offset, which is applied in addition to the analog gain. It ranges from 0 to 511. The command to use this control is:

v4l2-ctl -c black_level=<value> 

Gain delay

This control determines whether gain changes take effect on the current frame or on the following frame.

The command to use this control is:

v4l2-ctl -c gain_delay=0

which is the default, or:

v4l2-ctl -c gain_delay=1
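
All of the controls above can be inspected and set with a single v4l2-ctl invocation. For example (the control values are arbitrary and the device node is assumed to be /dev/video0):

# Show every control exposed by the driver, with ranges and default values
v4l2-ctl -d /dev/video0 --list-ctrls

# Set several controls in one call
v4l2-ctl -d /dev/video0 -c gain=100,exposure=10000,black_level=60,gain_delay=0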

Performance

ARM Load

For a simple pipeline like:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 sensor-mode=0 ! queue ! nvvidconv ! autovideosink

Tegrastats displays the following output when capturing with the sensor driver:

RAM 1534/7771MB (lfb 1293x4MB) CPU [31%@1190,27%@1190,off,off,off,off]
RAM 1534/7771MB (lfb 1293x4MB) CPU [35%@1190,28%@1190,off,off,off,off]
RAM 1534/7771MB (lfb 1293x4MB) CPU [32%@1190,30%@1190,off,off,off,off]
RAM 1534/7771MB (lfb 1293x4MB) CPU [31%@1190,31%@1190,off,off,off,off]
RAM 1534/7771MB (lfb 1293x4MB) CPU [33%@1190,29%@1190,off,off,off,off]
RAM 1534/7771MB (lfb 1293x4MB) CPU [32%@1190,27%@1190,off,off,off,off]
RAM 1534/7771MB (lfb 1293x4MB) CPU [32%@1190,28%@1190,off,off,off,off]
RAM 1534/7771MB (lfb 1293x4MB) CPU [31%@1190,25%@1190,off,off,off,off]
RAM 1534/7771MB (lfb 1293x4MB) CPU [31%@1190,25%@1190,off,off,off,off]
RAM 1534/7771MB (lfb 1293x4MB) CPU [36%@1190,27%@1190,off,off,off,off]
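
These numbers can be reproduced by running tegrastats on the board while the capture pipeline is active, for example:

# Print RAM and CPU statistics once per second
sudo tegrastats --interval 1000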

Framerate

Using the following pipeline, we measured the framerate of a single capture with the perf element (see the note on installing it after the measurements below):

gst-launch-1.0 nvarguscamerasrc sensor-id=0 sensor-mode=0 ! perf  ! fakesink
INFO:
perf: perf0; timestamp: 0:01:08.024467133; bps: 193536.000; mean_bps: 193536.000; fps: 35.685; mean_fps: 35.685
INFO:
perf: perf0; timestamp: 0:01:09.032803056; bps: 282240.000; mean_bps: 237888.000; fps: 35.702; mean_fps: 35.694
INFO:
perf: perf0; timestamp: 0:01:10.048219125; bps: 290304.000; mean_bps: 255360.000; fps: 35.453; mean_fps: 35.614
INFO:
perf: perf0; timestamp: 0:01:11.052939940; bps: 290304.000; mean_bps: 264096.000; fps: 35.831; mean_fps: 35.668
INFO:
perf: perf0; timestamp: 0:01:12.066355695; bps: 290304.000; mean_bps: 269337.600; fps: 35.523; mean_fps: 35.639
INFO:
perf: perf0; timestamp: 0:01:13.068997375; bps: 282240.000; mean_bps: 271488.000; fps: 35.905; mean_fps: 35.683
INFO:
perf: perf0; timestamp: 0:01:14.080078073; bps: 290304.000; mean_bps: 274176.000; fps: 35.605; mean_fps: 35.672

The results show a constant framerate of about 35 fps when using nvarguscamerasrc, with frames passing through the ISP to convert from Bayer to YUV.

Similarly, for the pipeline:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 sensor-mode=1 ! perf  ! fakesink

we obtain:

INFO:
perf: perf0; timestamp: 0:02:48.161150666; bps: 169344.000; mean_bps: 169344.000; fps: 30.150; mean_fps: 30.150
INFO:
perf: perf0; timestamp: 0:02:49.161479732; bps: 241920.000; mean_bps: 205632.000; fps: 29.990; mean_fps: 30.070
INFO:
perf: perf0; timestamp: 0:02:50.193516084; bps: 241920.000; mean_bps: 217728.000; fps: 30.038; mean_fps: 30.059
INFO:
perf: perf0; timestamp: 0:02:51.226902182; bps: 241920.000; mean_bps: 223776.000; fps: 29.998; mean_fps: 30.044
INFO:
perf: perf0; timestamp: 0:02:52.260178724; bps: 241920.000; mean_bps: 227404.800; fps: 30.002; mean_fps: 30.036
INFO:
perf: perf0; timestamp: 0:02:53.260523866; bps: 241920.000; mean_bps: 229824.000; fps: 29.990; mean_fps: 30.028
INFO:
perf: perf0; timestamp: 0:02:54.293488648; bps: 241920.000; mean_bps: 231552.000; fps: 30.011; mean_fps: 30.025
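
Note that perf is not shipped with the stock GStreamer packages; it is provided by RidgeRun's gst-perf plugin, which must be built and installed separately. A typical autotools-based build sketch (the prefix and library directory shown are assumptions for a Jetson board):

./autogen.sh --prefix=/usr --libdir=/usr/lib/aarch64-linux-gnu/
make
sudo make install

# Confirm that GStreamer can find the element
gst-inspect-1.0 perf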


RidgeRun Resources



Visit our Main Website for the RidgeRun Products and Online Store. RidgeRun Engineering information is available on the RidgeRun Professional Services, RidgeRun Subscription Model and Client Engagement Process wiki pages. Please email support@ridgerun.com for technical questions and contactus@ridgerun.com for other queries. Contact details for sponsoring the RidgeRun GStreamer projects are available on the Sponsor Projects page.