Image Stitching for NVIDIA Jetson/User Guide/Controlling the Stitcher

From RidgeRun Developer Connection
 
<noinclude>
{{Image_Stitching_for_NVIDIA_Jetson/Head|previous=User Guide|next=User Guide/Homography estimation|metakeywords=Image Stitching, CUDA, Stitcher, OpenCV, Panorama}}
</noinclude>
  
{{DISPLAYTITLE: User Guide on Controlling the Stitcher|noerror}}

This page provides a basic description of the parameters required when building a <code>cudastitcher</code> pipeline, as well as an explanation of the workflow used when working with the stitcher.
== Workflow diagram ==
The following diagram provides a visual representation of the workflow needed when using the stitcher, as well as the auxiliary tools required. The [[Image_Stitching_for_NVIDIA_Jetson/User_Guide/Controlling_the_Stitcher#Workflow_overview| workflow overview]] provides a more detailed written version of the workflow and links to the required wikis.
  
[[File:Stitcher workflow diagram.png|700px|center|none|2 Images Stitching Example]]
== Workflow parameters ==
  
When using the stitcher, parameter acquisition and selection are crucial steps in order to get the expected output. These parameters can be obtained from scripts provided within the stitcher itself.

These parameters are:
  
==== Undistort parameters ====

:If you are using the stitcher along with the cuda-undistort element, there are additional parameters to be obtained. Information about those and how to set them can be found in the [[CUDA_Accelerated_GStreamer_Camera_Undistort/User_Guide | cuda undistort wiki]].
  
==== Homography List ====

:This parameter defines the transformations between pairs of images. It is specified with the <code>homography-list</code> option and is set as a JSON-formatted string. The JSON is constructed manually from the individual homographies calculated with the homography estimation tool.
  
:Read the [[Image_Stitching_for_NVIDIA_Jetson/User_Guide/Homography_estimation|Homography estimation guide]] on how to calculate the homography between two images.

:Then visit the [[Image_Stitching_for_NVIDIA_Jetson/User_Guide/Homography_list|Homography list guide]] to better understand its format and how to construct it based on individual homographies.
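:As a quick illustration, one individual homography can be kept in a shell variable of a launch script before being assembled into the list. The matrix values below are placeholders (not a real calibration), the <code>h00</code>..<code>h22</code> keys name the entries of the 3x3 matrix, and <code>python3</code> is used here only to sanity-check the string; the full <code>homography-list</code> schema is documented in the Homography list guide.

```bash
# A single homography as produced by the estimation tool, stored in a
# shell variable. The h00..h22 keys name the entries of the 3x3 matrix;
# the values below are placeholders, not a real calibration.
H_MATRIX='{
  "h00": 7.3851e-01, "h01": 1.0431e-01, "h02": 1.4347e+03,
  "h10": -1.0795e-01, "h11": 9.8914e-01, "h12": -9.3916e+00,
  "h20": -2.3449e-04, "h21": 3.3206e-05, "h22": 1.0000e+00
}'

# Sanity-check that the string is valid JSON before assembling it into the
# homography list; a malformed string would make the element fail at startup.
echo "$H_MATRIX" | python3 -m json.tool > /dev/null && echo "valid JSON"
```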
  
==== Refinement parameters ====

:The stitcher is capable of refining the homographies during execution. There are multiple parameters associated with this feature; see the [[Image_Stitching_for_NVIDIA_Jetson/User_Guide/Homography_Refinement|Homography refinement guide]] for more information.
  
==== Blending Width ====

:This parameter sets the number of pixels to be blended between two images. It can be set with the <code>border-width</code> option. [[Image_Stitching_for_NVIDIA_Jetson/User_Guide/Blending|This guide]] provides more information on the topic.
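:Since the blended seam is taken from the region where adjacent images overlap, the blending width cannot usefully exceed that overlap. The clamp below is a sketch of a sanity check for launch scripts, under assumed placeholder values (it is an illustration of this constraint, not a rule taken from the stitcher itself).

```bash
# Hypothetical values: overlap between adjacent images, in pixels,
# and the requested blending width.
OVERLAP_PX=120
BORDER_WIDTH=200

# The blended seam must fit inside the overlap region, so clamp the
# requested width to the available overlap.
if [ "$BORDER_WIDTH" -gt "$OVERLAP_PX" ]; then
  BORDER_WIDTH=$OVERLAP_PX
fi

echo "border-width=$BORDER_WIDTH"
```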
  
== Workflow overview ==

This section presents the basic steps, in execution order, that need to be followed in order to configure the stitcher properly and acquire the parameters for its usage.
  
#Know your input sources (N)
#Apply distortion correction to the inputs (only if necessary), see [[CUDA_Accelerated_GStreamer_Camera_Undistort/User_Guide/Camera_Calibration | CUDA Accelerated GStreamer Camera Undistort Camera Calibration User Guide]] for more details
#*Run the calibration tool for each source that requires it
#**'''input''': Multiple images of a calibration pattern
#**'''output''': Camera matrix and distortion parameters
#*Save the camera matrix and distortion parameters for each camera since they will be required to build the pipelines
#*Repeat until every input has been corrected
#Calculate all (N-1) homographies between pairs of adjacent images, see [[Image_Stitching_for_NVIDia_Jetson/User_Guide/Homography_estimation|Image Stitching Homography estimation User Guide]] for more details
#*Run the homography estimation tool for each image (target) and its reference (fixed)
#**'''input''': Two still images from adjacent sources with overlap and a JSON config file
#**'''output''': Homography matrix that describes the transformation between input sources
#*Save the homography matrices; they will be required in the next steps
#*Repeat until every one of the (N-1) non-reference images has been a target
#Assemble the homography list JSON file
#*This step is done manually, see [[Image_Stitching_for_NVIDIA_Jetson/User_Guide/Homography_list|Image Stitching Homography list User Guide]] for more details
#Set the blending width, see [[Image_Stitching_for_NVIDIA_Jetson/User_Guide/Blending|Image Stitching Blending User Guide]] for more details
#Build and launch the stitcher pipeline, see [[Image Stitching for NVIDIA Jetson/Examples|Image Stitching GStreamer Pipeline Examples]]
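The steps above can be sketched as a single launch script. This is a minimal sketch rather than a tested configuration: the <code>cudastitcher</code> element and its <code>homography-list</code> and <code>border-width</code> properties follow this guide, <code>homographies.json</code> is a hypothetical file name holding the list assembled in step 4, and the capture/display elements assume a three-camera Jetson setup.

```bash
# Load the homography list assembled in step 4 (see the Homography list
# guide for the exact JSON format); homographies.json is a placeholder name.
HOMOGRAPHY_LIST="$(cat homographies.json)"

# Blending width in pixels (step 5); 10 is an arbitrary starting value.
BORDER_WIDTH=10

# Three-camera pipeline (step 6): each nvarguscamerasrc feeds one stitcher
# sink pad, and the stitched output is displayed with nvoverlaysink.
gst-launch-1.0 -e cudastitcher name=stitcher \
  homography-list="$HOMOGRAPHY_LIST" \
  border-width=$BORDER_WIDTH \
  nvarguscamerasrc maxperf=true sensor-id=0 ! nvvidconv ! stitcher.sink_0 \
  nvarguscamerasrc maxperf=true sensor-id=1 ! nvvidconv ! stitcher.sink_1 \
  nvarguscamerasrc maxperf=true sensor-id=2 ! nvvidconv ! stitcher.sink_2 \
  stitcher. ! queue ! nvvidconv ! nvoverlaysink
```

This mirrors the structure of the wiki's pipeline examples; on a different camera count, add or remove one <code>nvarguscamerasrc ... ! stitcher.sink_N</code> branch per input.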
  
 
  
 
<noinclude>
{{Image_Stitching_for_NVIDIA_Jetson/Foot|User Guide|User Guide/Homography estimation}}
</noinclude>

Latest revision as of 12:00, 26 February 2023


