OmniVision OS08A10 Linux driver

Problems running the pipelines shown on this page? Please see our GStreamer Debugging guide for help.



OmniVision OS08A10 image sensor features

The OmniVision OS08A10 is an image sensor with the following features:

  • 2 µm x 2 µm pixel
  • Optical size of 1/1.8"
  • Programmable controls for:
    • Frame rate
    • Mirror and flip
    • Cropping
    • Windowing
  • Supports output formats:
    • 12-/10-bit RGB RAW
  • Supports image sizes:
    • 4K2K (3840x2160)
    • 2560 x 1440
    • 1080p (1920x1080)
    • 720p (1280x720)
  • Supports 2x2 binning
  • Standard serial SCCB interface
  • 12-bit ADC
  • Up to 4-lane MIPI/LVDS serial output interface (supports maximum speed up to 1500 Mbps/lane)
  • 2-exposure staggered HDR support
  • Programmable I/O drive capability
  • Light sensing mode (LSM)
  • PLL with SCC support
  • Support for FSIN

RidgeRun has developed a driver for the Jetson TX1 platform with the following support:

  • L4T 28.2.1 and Jetpack 3.3
  • V4L2 Media Controller driver
  • Tested resolution 3840 x 2160 @ 30 fps
  • Capture with nvcamerasrc using the ISP.
  • Three cameras capturing at the same time.

Enabling the OmniVision OS08A10 Linux driver

In order to use this driver, you have to patch and compile the kernel source using JetPack:

  • Follow the instructions in (Downloading sources) to get the kernel source code.
  • Once you have the source code, apply the following three patches in order to add the changes required for the os08a10 cameras at the kernel and DTB level (see the sketch after this list).
3.2.1_os08a10.patch
  • Follow the instructions in (Build Kernel) for building the kernel, and then flash the image.
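
A minimal sketch of the patching step, assuming the L4T 28.2.1 kernel sources were unpacked under ~/kernel/kernel-4.4 and the patch was copied to ~/patches (both paths are placeholders; adjust them to your setup):

# Assumed paths; adjust to where JetPack unpacked the kernel sources and where the patch lives
cd ~/kernel/kernel-4.4

# Check that the patch applies cleanly before modifying the tree
patch -p1 --dry-run < ~/patches/3.2.1_os08a10.patch

# Apply the OS08A10 kernel and device tree changes
patch -p1 < ~/patches/3.2.1_os08a10.patch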

Make sure to enable os08a10 driver support:

make menuconfig
-> Device Drivers                                                                                                                        
  -> Multimedia support                                                                                           
    -> Encoders, decoders, sensors and other helper chips
       -> <*> OS08A10 camera sensor support
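
After flashing, you can confirm that the driver probed correctly before running any pipelines. A minimal check, assuming v4l2-ctl from the v4l-utils package is installed and the first camera enumerates as /dev/video0:

# Look for the OS08A10 probe messages in the kernel log
dmesg | grep -i os08a10

# List the video devices and the formats exposed by the driver
v4l2-ctl --list-devices
v4l2-ctl -d /dev/video0 --list-formats-ext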

Using the driver

GStreamer examples

Important Note: When you are accessing the board through serial or SSH and you want to run a pipeline that displays with autovideosink, nveglglessink, xvimagesink, or any other video sink, you have to prefix your pipeline description with DISPLAY=:0:

DISPLAY=:0 gst-launch-1.0 ...

Capture

Nvcamerasrc
  • 3840x2160
gst-launch-1.0 nvcamerasrc sensor-id=0 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, format=(string)I420, framerate=(fraction)30/1' ! autovideosink

Dual Capture

Using the following pipeline, we can test the performance of the Jetson TX2 when doing dual capture:

Nvcamerasrc
gst-launch-1.0 nvcamerasrc sensor-id=0 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, format=(string)I420, framerate=(fraction)30/1' ! \
fakesink nvcamerasrc sensor-id=1 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, format=(string)I420, framerate=(fraction)30/1' ! \
 fakesink

We noticed that when using two cameras at the maximum resolution of 3840x2160, the CPU load measured with tegrastats does not change considerably and remains almost the same:

  • Tegrastats in normal operation:
RAM 2037/7847MB (lfb 1268x4MB) CPU [0%@345,off,off,0%@345,0%@345,0%@345] 
RAM 2037/7847MB (lfb 1268x4MB) CPU [3%@345,off,off,1%@345,2%@345,0%@345] 
RAM 2037/7847MB (lfb 1268x4MB) CPU [6%@345,off,off,0%@345,2%@345,1%@345] 
RAM 2037/7847MB (lfb 1268x4MB) CPU [0%@345,off,off,4%@345,0%@345,0%@345] 
RAM 2037/7847MB (lfb 1268x4MB) CPU [6%@345,off,off,3%@345,3%@345,2%@345] 
RAM 2037/7847MB (lfb 1268x4MB) CPU [2%@345,off,off,0%@345,1%@345,3%@345] 
RAM 2037/7847MB (lfb 1268x4MB) CPU [1%@345,off,off,0%@345,2%@345,1%@345]
  • Tegrastats with the above pipeline running
RAM 1999/7847MB (lfb 1305x4MB) CPU [31%@345,off,off,29%@345,26%@345,26%@345] 
RAM 1999/7847MB (lfb 1305x4MB) CPU [28%@345,off,off,31%@345,26%@345,30%@345] 
RAM 1999/7847MB (lfb 1305x4MB) CPU [27%@345,off,off,28%@345,28%@345,33%@345] 
RAM 2000/7847MB (lfb 1305x4MB) CPU [26%@345,off,off,29%@345,26%@345,26%@345] 
RAM 2000/7847MB (lfb 1305x4MB) CPU [26%@345,off,off,29%@345,30%@345,28%@345] 
RAM 2000/7847MB (lfb 1305x4MB) CPU [30%@345,off,off,28%@345,28%@345,27%@345] 
RAM 2001/7847MB (lfb 1304x4MB) CPU [32%@345,off,off,22%@345,27%@345,28%@345] 
RAM 2001/7847MB (lfb 1304x4MB) CPU [33%@345,off,off,26%@345,26%@345,28%@345] 
RAM 2001/7847MB (lfb 1304x4MB) CPU [30%@345,off,off,26%@345,29%@345,22%@345] 
RAM 2001/7847MB (lfb 1304x4MB) CPU [28%@345,off,off,29%@345,27%@345,28%@345] 
RAM 2001/7847MB (lfb 1304x4MB) CPU [29%@345,off,off,33%@345,28%@345,26%@345]
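
For reference, the tegrastats figures shown on this page were collected by leaving the tool running in a second terminal while the pipelines were active. A minimal sketch, assuming the stock location of the binary in the default user's home directory on L4T 28.x:

# Run from a second terminal while the capture pipeline is running
cd ~
sudo ./tegrastats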

Three cameras

Using the following pipeline, we can test the performance of the Jetson TX2 when capturing with three cameras:

Nvcamerasrc
gst-launch-1.0 nvcamerasrc sensor-id=0 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, format=(string)I420, \
framerate=(fraction)30/1' ! fakesink nvcamerasrc sensor-id=1 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, \
format=(string)I420, framerate=(fraction)30/1' ! fakesink nvcamerasrc sensor-id=2 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)3840, \
height=(int)2160, format=(string)I420, framerate=(fraction)30/1' ! fakesink

We noticed that when using three cameras at the maximum resolution of 3840x2160, the CPU load measured with tegrastats does not change considerably and remains almost the same:

  • Tegrastats in normal operation:
RAM 2037/7847MB (lfb 1268x4MB) CPU [0%@345,off,off,0%@345,0%@345,0%@345] 
RAM 2037/7847MB (lfb 1268x4MB) CPU [3%@345,off,off,1%@345,2%@345,0%@345] 
RAM 2037/7847MB (lfb 1268x4MB) CPU [6%@345,off,off,0%@345,2%@345,1%@345] 
RAM 2037/7847MB (lfb 1268x4MB) CPU [0%@345,off,off,4%@345,0%@345,0%@345] 
RAM 2037/7847MB (lfb 1268x4MB) CPU [6%@345,off,off,3%@345,3%@345,2%@345] 
RAM 2037/7847MB (lfb 1268x4MB) CPU [2%@345,off,off,0%@345,1%@345,3%@345] 
RAM 2037/7847MB (lfb 1268x4MB) CPU [1%@345,off,off,0%@345,2%@345,1%@345]
  • Tegrastats with the above pipeline running
RAM 2685/7855MB (lfb 1135x4MB) CPU [41%@345,off,off,38%@345,40%@345,43%@345]
RAM 2686/7855MB (lfb 1135x4MB) CPU [44%@345,off,off,43%@345,39%@345,40%@345]
RAM 2687/7855MB (lfb 1133x4MB) CPU [44%@345,off,off,41%@345,40%@345,41%@345]
RAM 2687/7855MB (lfb 1133x4MB) CPU [45%@345,off,off,43%@345,43%@345,36%@345]
RAM 2688/7855MB (lfb 1133x4MB) CPU [43%@345,off,off,39%@345,43%@345,42%@345]
RAM 2689/7855MB (lfb 1133x4MB) CPU [43%@345,off,off,41%@345,38%@345,41%@345]
RAM 2689/7855MB (lfb 1133x4MB) CPU [45%@345,off,off,40%@345,38%@345,41%@345]

Dual capture and dual display

DISPLAY=:0 gst-launch-1.0 nvcamerasrc sensor-id=0 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)1640, height=(int)1232, format=(string)I420, \
framerate=(fraction)21/1' ! nvegltransform ! nveglglessink nvcamerasrc sensor-id=2 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)1640, \
height=(int)1232, format=(string)I420, framerate=(fraction)21/1' ! nvegltransform ! nveglglessink -e

Video Encoding Transport Stream 3840x2160@30fps

CAPS="video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, format=(string)I420, framerate=(fraction)30/1"

gst-launch-1.0 nvcamerasrc sensor-id=1 fpsRange="30 30" num-buffers=500 ! capsfilter caps="$CAPS" ! omxh264enc ! \
               mpegtsmux ! filesink location=test.ts
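
To sanity-check the recording, the resulting transport stream can be inspected or played back. A minimal sketch, where test.ts is the file produced by the pipeline above and playback assumes a local display:

# Inspect the container, codec, and duration of the recording
gst-discoverer-1.0 test.ts

# Play the recording back on the board's display
DISPLAY=:0 gst-launch-1.0 playbin uri=file://$PWD/test.ts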

Snapshots

gst-launch-1.0 -v nvcamerasrc sensor-id=1 fpsRange="30 30" num-buffers=10 ! 'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, format=(string)I420, \
framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw, width=(int)3840, height=(int)2160, format=(string)I420, framerate=(fraction)30/1' ! multifilesink location=test_%d.yuv
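
The snapshots are stored as raw I420 frames, so most image viewers will not open them directly. A minimal sketch for converting the first snapshot to a PNG for inspection, assuming the videoparse element from gst-plugins-bad is available:

# Wrap the raw I420 frame with its geometry so it can be converted and encoded
gst-launch-1.0 filesrc location=test_0.yuv ! videoparse format=i420 width=3840 height=2160 framerate=30/1 ! \
videoconvert ! pngenc ! filesink location=test_0.png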

Framerate

Using the following pipeline, we were able to measure the framerate for single capture with the perf element:

gst-launch-1.0 nvcamerasrc sensor-id=0 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, format=(string)I420, framerate=(fraction)30/1' ! perf  ! fakesink
nvidia@tegra-ubuntu:~$ gst-launch-1.0 nvcamerasrc sensor-id=0 fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, format=(string)I420, framerate=(fraction)30/1' ! perf  ! fakesink
Setting pipeline to PAUSED ...

Available Sensor modes : 
3840 x 2160 FR=30.000000 CF=0x1109208a10 SensorModeType=4 CSIPixelBitDepth=12 DynPixelBitDepth=12
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock

NvCameraSrc: Trying To Set Default Camera Resolution. Selected sensorModeIndex = 0 WxH = 3840x2160 FrameRate = 30.000000 ...

GST-PERF INFO -->  Timestamp: 0:16:43.009877206; Bps: 0; fps: 0.0 
GST-PERF INFO -->  Timestamp: 0:16:44.011800118; Bps: 807; fps: 34.96 
GST-PERF INFO -->  Timestamp: 0:16:45.039982896; Bps: 785; fps: 34.4 
GST-PERF INFO -->  Timestamp: 0:16:46.068519581; Bps: 785; fps: 34.4 
GST-PERF INFO -->  Timestamp: 0:16:47.097418847; Bps: 785; fps: 34.4 
GST-PERF INFO -->  Timestamp: 0:16:48.125239768; Bps: 786; fps: 34.7 
GST-PERF INFO -->  Timestamp: 0:16:49.153094886; Bps: 786; fps: 34.7 
GST-PERF INFO -->  Timestamp: 0:16:50.181542408; Bps: 785; fps: 34.4 
GST-PERF INFO -->  Timestamp: 0:16:51.211657663; Bps: 784; fps: 33.98 
GST-PERF INFO -->  Timestamp: 0:16:52.236531858; Bps: 789; fps: 34.17 
GST-PERF INFO -->  Timestamp: 0:16:53.239527665; Bps: 806; fps: 33.93 
GST-PERF INFO -->  Timestamp: 0:16:54.266731117; Bps: 786; fps: 34.7 
GST-PERF INFO -->  Timestamp: 0:16:55.293416318; Bps: 787; fps: 34.11 
GST-PERF INFO -->  Timestamp: 0:16:56.321500133; Bps: 785; fps: 34.4 
GST-PERF INFO -->  Timestamp: 0:16:57.350354047; Bps: 785; fps: 34.4 
GST-PERF INFO -->  Timestamp: 0:16:58.377735892; Bps: 786; fps: 34.7 
GST-PERF INFO -->  Timestamp: 0:16:59.380256442; Bps: 806; fps: 33.93 
GST-PERF INFO -->  Timestamp: 0:17:00.406086717; Bps: 788; fps: 34.14 
GST-PERF INFO -->  Timestamp: 0:17:01.433547314; Bps: 786; fps: 34.7 
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:19.198055964
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

The results show a constant framerate of around 34 fps when capturing with nvcamerasrc and passing the frames through the ISP to convert from Bayer to YUV.
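
Note that the perf element used above is not part of the stock GStreamer packages; it is provided by RidgeRun's open-source gst-perf plugin. Once the plugin is installed, you can confirm that GStreamer can find it:

# Should print the element description if the gst-perf plugin is installed
gst-inspect-1.0 perf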

Jetson OS08A10 glass to glass latency

The glass-to-glass latency for the OS08A10 camera is around 130 ms. The image below is one of the pictures taken while the latency was measured: the left chronometer is the image captured by the camera, and the right chronometer is the real-time chronometer that the camera was capturing at that moment.

[Image: OS08A10 glass-to-glass latency]



Contact Us

Visit our Main Website for the RidgeRun Products and Online Store. RidgeRun Engineering information is available on the RidgeRun Professional Services, RidgeRun Subscription Model, and Client Engagement Process wiki pages. Please email support@ridgerun.com for technical questions and contactus@ridgerun.com for other inquiries. Contact details for sponsoring the RidgeRun GStreamer projects are available on the Sponsor Projects page.