Getting started with AI on NXP i.MX8M Plus - Development - Integrating Artificial Intelligence software stack - Installing GstInference

From RidgeRun Developer Connection
Previous: Development/Integrating Artificial Intelligence software stack/Installing R2Inference Index Next: Development/Developing software for the board





Configuring GstInference

Since GstInference requires R2Inference to build properly, follow the Installing R2Inference guide before going through this wiki page.

Locate the local.conf file in your build directory:

cd $HOME/<your image folder>/<your build directory>/conf/

Then, add the following line to local.conf (note the leading space inside the quotes; the _append operator does not insert one for you, and it should not be combined with +=):

IMAGE_INSTALL_append = " gst-inference"
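As a sketch, the same edit can be made from the command line. The default path below is hypothetical; point CONF at your actual `<your image folder>/<your build directory>/conf/local.conf`. The grep guard keeps the line from being duplicated if the snippet is run more than once:

```shell
# Hypothetical default path; override CONF with your real local.conf location.
CONF="${CONF:-$HOME/imx-yocto-bsp/build/conf/local.conf}"
mkdir -p "$(dirname "$CONF")"   # ensure the conf directory exists for this sketch
touch "$CONF"
# Only append the package if it is not already listed, so re-running is safe.
grep -q 'gst-inference' "$CONF" || \
    echo 'IMAGE_INSTALL_append = " gst-inference"' >> "$CONF"
```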


Finally, ensure that the gst-inference recipe looks like the following:

SUMMARY = "GstInference"
DESCRIPTION = "A GStreamer deep learning inference framework"
HOMEPAGE = "https://developer.ridgerun.com/wiki/index.php?title=GstInference"
SECTION = "multimedia"
LICENSE = "LGPL2.1"

LIC_FILES_CHKSUM = "file://COPYING;md5=b5ec61ada91a1aad4812666edfd8c57e"

DEPENDS = "gstreamer1.0 gstreamer1.0-plugins-base r2inference opencv"

SRCBRANCH ?= "master"
SRCREV = "55cad49398398f1fcc424f76244e65585c7cd8f4"
SRC_URI = "git://github.com/RidgeRun/gst-inference;protocol=https;branch=${SRCBRANCH}"

EXTRA_OEMESON += " -Denable-gtk-doc=false"

S = "${WORKDIR}/git"

FILES_${PN} += "${libdir}/gstreamer-1.0/libgstinferenceoverlay.so "
FILES_${PN} += "${libdir}/gstreamer-1.0/libgstinference.so "
FILES_${PN} += "${libdir}/gstreamer-1.0/libgstinferenceutils.so "

inherit meson pkgconfig gettext
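If the recipe is not already provided by one of your layers, it can be saved into a custom layer. A minimal sketch, assuming a hypothetical layer named meta-mylayer that is already registered in bblayers.conf (the recipe above declares SECTION = "multimedia", so a recipes-multimedia subdirectory is the conventional home):

```shell
# Hypothetical layer path; use a layer that is listed in your bblayers.conf.
LAYER="${LAYER:-$HOME/imx-yocto-bsp/sources/meta-mylayer}"
# Create the conventional directory for a multimedia recipe.
mkdir -p "$LAYER/recipes-multimedia/gst-inference"
# Save the recipe contents shown above into this file:
echo "Save the recipe as: $LAYER/recipes-multimedia/gst-inference/gst-inference_git.bb"
```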

Building the image with the changes in local.conf and bblayers.conf

cd $HOME/<your image folder>

source setup-environment <your build directory>

bitbake imx-image-core

Testing the GstInference installation

Ensure that GstInference is integrated in GStreamer

After flashing the image with the changes described here onto the SD card, we can test it on the i.MX8M Plus.

On the board, execute the gst-inspect command as follows:

gst-inspect-1.0 inference

This command will show something like the following output:

Plugin Details:
  Name                     inference
  Description              Infer pre-trained model on incomming image frames on a variety of architectures and different backends
  Filename                 /usr/lib/gstreamer-1.0/libgstinference.so
  Version                  0.11.0.1
  License                  LGPL
  Source module            gst-inference
  Binary package           GstInference
  Origin URL               Unknown package origin

  resnet50v1: resnet50v1
  inceptionv1: inceptionv1
  inceptionv2: inceptionv2
  mobilenetv2: mobilenetv2
  inceptionv3: inceptionv3
  inceptionv4: inceptionv4
  tinyyolov2: tinyyolov2
  tinyyolov3: tinyyolov3
  mobilenetv2ssd: mobilenetv2ssd
  rosetta: rosetta

  10 features:
  +-- 10 elements
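To check the element list mechanically rather than by eye, the sketch below compares the expected element names against saved gst-inspect output. The here-document reuses the sample listing above; on the board you would instead capture live output with OUT="$(gst-inspect-1.0 inference)":

```shell
# Sample listing taken from the gst-inspect-1.0 output shown above.
OUT="$(cat <<'EOF'
  resnet50v1: resnet50v1
  inceptionv1: inceptionv1
  inceptionv2: inceptionv2
  mobilenetv2: mobilenetv2
  inceptionv3: inceptionv3
  inceptionv4: inceptionv4
  tinyyolov2: tinyyolov2
  tinyyolov3: tinyyolov3
  mobilenetv2ssd: mobilenetv2ssd
  rosetta: rosetta
EOF
)"

# Flag any expected inference element that is missing from the listing.
MISSING=0
for e in resnet50v1 inceptionv1 inceptionv2 inceptionv3 inceptionv4 \
         mobilenetv2 mobilenetv2ssd tinyyolov2 tinyyolov3 rosetta; do
    echo "$OUT" | grep -q "^  $e:" || { echo "missing: $e"; MISSING=1; }
done
[ "$MISSING" -eq 0 ] && echo "All 10 inference elements present"
```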

Executing some inferences

For this section, please refer to the Video inferences section.


