GStreamer Debugging
Revision as of 11:58, 6 September 2010
Contents
- 1 Links
- 2 GStreamer debugging approaches
- 2.1 Use standard GStreamer debug output with filter
- 2.2 Use gst-tracelib library to log key pipeline behavior
- 2.3 Watch system interrupts
- 2.4 Capturing a core dump
- 2.5 Use gdb to attach to GStreamer application and examine all the threads
- 2.6 Exit locked GStreamer application and see what works stand alone
- 2.7 Examine a history of what transpired just prior to lockup
- 2.8 Look for memory leaks
Links
- Good collection of useful hints (including core dumps, using gdb) for GStreamer applications - http://www.buzztard.org/index.php/Debugging
- GStreamer application debugging - http://www.gstreamer.net/data/doc/gstreamer/head/gstreamer/html/gst-running.html
- GStreamer debugging (helpful syntax) - http://www.gstreamer.net/data/doc/gstreamer/head/manual/html/section-checklist-debug.html
- Flumotion approach - http://www.flumotion.net/doc/flumotion/manual/en/trunk/html/chapter-debug.html
GStreamer debugging approaches
Some of these approaches are only useful when you are running a pipeline and audio and/or video stops at an unexpected place in the data stream.
Use standard GStreamer debug output with filter
gst-launch videotestsrc num-buffers=3 ! fakesink --gst-debug=GST_REFCOUNTING:5 --gst-debug-no-color=1 2>&1 | grep "\->0" > log.txt
This gets useful data, but typically slows pipeline performance to the point of being unusable.
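To see what that grep filter keeps, here is a self-contained illustration; the sample lines only mimic the shape of GST_REFCOUNTING output (the exact format varies by GStreamer version), so treat this as a sketch:

```shell
# Feed two fake refcount-trace lines through the same filter used above.
# Only transitions that reach a refcount of 0 ("->0") survive the grep.
printf '%s\n' \
  'GST_REFCOUNTING gstobject.c: unref ... 1->0' \
  'GST_REFCOUNTING gstobject.c: ref ... 1->2' \
  | grep '\->0'
```

Lines ending in ->0 mark objects whose last reference was dropped, which is the interesting event when hunting premature unrefs.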
Use gst-tracelib library to log key pipeline behavior
The gst-tracelib library hooks into key GStreamer functions and logs their behavior. When the application exits, it displays some general statistics. Further analysis can be done based on the data written to the log file. A detailed usage description is in the source code README file.
gst-tracelib logs
- dataflow, messages, queries and events.
- caps set|get
- pipeline topology changes
- resource usage
Example usage:
export GSTTL_HIDE="caps;chk;topo"
export GSTTL_LOG_SIZE=1048576
AV_FILE=/SD/content/MoMen-dm365.mov
LD_PRELOAD=/usr/lib/gst-tracelib/libgsttracelib.so gst-launch filesrc location=$AV_FILE ! \
  qtdemux name=demux ! queue ! dmaidec_h264 numOutputBufs=12 ! priority nice=-10 ! queue ! \
  priority nice=-10 ! dmaiperf ! TIDmaiVideoSink accelFrameCopy=true videoOutput=DVI \
  videoStd=720P_60 demux.audio_00 ! queue ! priority nice=-5 ! dmaidec_aac ! alsasink
By default, the log file is named /tmp/gsttl.log.
Watch system interrupts
Run the telnet daemon on the target - likely:
/etc/init.d/inetd start # or inetd
telnet into the target hardware and watch the interrupt count
while sleep 1 ; do cat /proc/interrupts ; done
If the pipeline is supposed to be running, changes in the interrupt counts may provide clues as to what is going on.
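Raw counter values are hard to read by eye; diffing two snapshots shows which interrupts are actually firing. The snippet below is a sketch: the two printf lines stand in for successive reads of /proc/interrupts, and the awk field positions assume the usual "IRQ: count name" layout:

```shell
# Two fake snapshots of the interrupt counters, taken one second apart.
printf '29: 1000 timer\n45: 200 edma\n' > /tmp/irq.1
printf '29: 1060 timer\n45: 200 edma\n' > /tmp/irq.2
# Pair the snapshots line by line and print only the counters that moved.
paste /tmp/irq.1 /tmp/irq.2 | awk '$5 > $2 { print $1, "+" ($5 - $2), $3 }'
```

On the target, replace the printf lines with two reads of /proc/interrupts separated by a sleep; a supposedly running pipeline whose DMA or codec interrupt count has stopped moving is a strong clue.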
Capturing a core dump
If your GStreamer application is crashing with a seg fault or similar condition, enable saving a core dump before running the application.
ulimit -c 10000
mkdir -m 777 /root/dumps
echo "/root/dumps/%e.core" > /proc/sys/kernel/core_pattern
Once you have a /root/dumps/*.core file, copy it to your host and inspect it with
ddd -debugger arm-linux-gnueabi-gdb $GSTREAMER_APPLICATION
Then in gdb,
target core <core file>
bt
and see what function caused the core dump.
Use gdb to attach to GStreamer application and examine all the threads
The SDK Debugging Guide provides detailed instructions.
Build your application with symbols (-g) and no optimization (-O0). Use GStreamer libraries that are built with symbols.
Attach to your running GStreamer application using gdbserver
ps
PID=4512   # set the right value based on your application's PID
gdbserver :2345 --attach $PID
Then start the cross-compiled version of the GNU debugger, like
ddd -debugger arm-linux-gnueabi-gdb $GSTREAMER_APP
and get the gdb debugger connected to gdb server on the target
set solib-absolute-prefix <path to devdir>/fs/fs
file <path to file>
target remote <ip address>:2345
List the threads and do a back trace on each one
info threads
bt
thread 2
bt
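gdb can also walk every thread in a single command with thread apply all bt. A command file like the one below (the paths and address are the same placeholders used above) could be passed to the cross gdb with -x to capture all the backtraces non-interactively:

```
set solib-absolute-prefix <path to devdir>/fs/fs
file <path to file>
target remote <ip address>:2345
thread apply all bt
```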
Exit locked GStreamer application and see what works stand alone
If you have a GStreamer application that locks up and doesn't run correctly even after you exit the program (possibly with Ctrl-C) and restart it, then it is possible some kernel-provided resource is the culprit. For example, if you are using a defective ALSA audio output driver, you might find the GStreamer pipeline locks up in the middle. If you exit the GStreamer application and try a simple audio application, like aplay, you might be able to identify the source of your problem.
Examine a history of what transpired just prior to lockup
If you have a GStreamer application that locks up, you could change the pipeline to include a (at this point mythical) recent-activity history logger. Such a logger element could be put anywhere in the pipeline. The logger would have circular buffers to keep track of all potentially interesting recent history, such as pad activity, bus activity, and any other relevant information. The circular buffer entries would all be timestamped. When some event occurs (a file exists, a message/signal is received, etc.), the element would dump the history and continue capturing new data.
The idea is that after the pipeline locks up, you could cause the history logger to dump its data and get an idea of what is supposed to be happening but isn't.
Look for memory leaks
If you have a GStreamer application that runs for a while, or maybe a long time, and then fails, the problem could be due to a memory leak. Use Valgrind to verify all allocated memory is freed properly.
valgrind --tool=memcheck --trace-children=yes $GSTREAMER_APP
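Valgrind's output can be long; the "definitely lost" lines of its LEAK SUMMARY are usually the ones to chase first. The following is an illustration only: the printf lines mimic the shape of a valgrind leak summary rather than coming from a real run:

```shell
# Fake valgrind LEAK SUMMARY lines, filtered down to the definite leaks.
printf '%s\n' \
  '==123== LEAK SUMMARY:' \
  '==123==    definitely lost: 1,024 bytes in 2 blocks' \
  '==123==    still reachable: 64 bytes in 1 blocks' \
  | grep 'definitely lost'
```

In a real session, adding --leak-check=full to the valgrind command prints a backtrace for each definitely-lost block, which points at the allocation site.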