IMX6 GStreamer Pipelines - SDK Turrialba

From RidgeRun Developer Connection
Revision as of 17:29, 12 June 2013 by Efernandez (Talk | contribs)


*********************PAGE UNDER CONSTRUCTION*********************

The following examples show the usage of GStreamer with the RidgeRun i.MX6 SDK Turrialba.

Live Preview

Display videotest pattern

gst-launch -v videotestsrc ! mfw_v4lsink

tested: 20130530

Display only video

This pipeline supports all the containers supported by aiurdemux and formats supported by vpudec.

Containers: Ogg, Matroska, WebM, QuickTime, M4A, 3GP, MPEG and FLV.

Formats: H.264, H.263 and MPEG.

gst-launch -v filesrc location= <testvideo> typefind=true ! aiurdemux ! queue ! vpudec ! mfw_v4lsink

tested: 20130530

Video loopback

gst-launch -v mfw_v4lsrc ! mfw_v4lsink
tested: 20130601

Debug log messages while running the pipeline:

/ # gst-launch -v mfw_v4lsrc ! mfw_v4lsink
MFW_GST_V4LSRC_PLUGIN 3.0.5 build on May 23 2013 16:08:33.
ERROR: v4l2 capture: slave not found!
ERROR: v4l2 capture: slave not found!
MFW_GST_V4LSINK_PLUGIN 3.0.5 build on May 23 2013 16:08:31.
ERROR: v4l2 capture: slave not found!
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
Setting pipeline to NULL ...
Total rendered:0
Freeing pipeline ...
[--->FINALIZE v4l_sink

Note: this error typically means that no camera (capture source) is connected to the board.
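Before launching the loopback, you can check that a V4L2 capture node actually exists. This is a minimal sketch; the default /dev/video0 path is an assumption, since the capture node number varies between i.MX6 BSPs:

```shell
# check_capture_dev: verify that a V4L2 character-device node exists
# before starting a capture pipeline. /dev/video0 is only a guess;
# pass the node your BSP registers for the camera.
check_capture_dev() {
    dev=${1:-/dev/video0}
    if [ -c "$dev" ]; then
        echo "capture device $dev found"
    else
        echo "no capture device at $dev: check the camera connection and driver" >&2
        return 1
    fi
}

# Example:
# check_capture_dev /dev/video0 && gst-launch -v mfw_v4lsrc ! mfw_v4lsink
```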

Video Decoding

Video + Audio decoding

gst-launch filesrc location= <testvideo> typefind=true ! aiurdemux name=demux demux. \
! queue max-size-buffers=0 max-size-time=0 ! vpudec ! mfw_v4lsink demux. ! queue max-size-buffers=0 max-size-time=0 ! \
beepdec ! audioconvert ! 'audio/x-raw-int, channels=2' ! alsasink

Video Encoding

Videotestsrc H.264 in Matroska container

gst-launch videotestsrc ! queue ! vpuenc codec=6 ! matroskamux ! filesink location=test.mkv

tested: 20130531

H.264 in Matroska container, 720p30

gst-launch mfw_v4lsrc capture-mode=4 fps-n=30 ! queue ! vpuenc codec=6 ! matroskamux ! filesink location=test.mkv

H.264 in Matroska container, 720p60

gst-launch mfw_v4lsrc capture-mode=4 fps-n=60 ! queue ! vpuenc codec=6 ! matroskamux ! filesink location=test.mkv

H.264 in Matroska container, 1080p15

gst-launch mfw_v4lsrc capture-mode=5 fps-n=15 ! queue ! vpuenc codec=6 ! matroskamux ! filesink location=test.mkv

Video Transcoding

QuickTime to mkv

gst-launch filesrc location= input.mov typefind=true ! aiurdemux ! vpudec ! vpuenc ! matroskamux ! filesink location= output.mkv 

tested: 20130601

Debug Message Log while running the above pipeline:

/ # gst-launch filesrc location= sample_sorenson.mov typefind=true ! aiurdemux ! vpudec ! vpuenc ! matroskamux ! filesink location= output.mkv


Setting pipeline to PAUSED ...
[INFO] Product Info: i.MX6Q/D/S
vpuenc versions :)
plugin: 3.0.5
wrapper: 1.0.28(VPUWRAPPER_ARM_LINUX Build on May 23 2013 16:08:16)
vpulib: 5.4.10
firmware: 2.1.8.34588
[INFO] Product Info: i.MX6Q/D/S
vpudec versions :)
plugin: 3.0.5
wrapper: 1.0.28(VPUWRAPPER_ARM_LINUX Build on May 23 2013 16:08:16)
vpulib: 5.4.10
firmware: 2.1.8.34588
Pipeline is PREROLLING ...
Aiur: 3.0.5
Core: MPEG4PARSER_06.04.25 build on Dec 10 2012 16:29:48
mime: video/quicktime; audio/x-m4a; application/x-3gp
file: /usr/lib/imx-mm/parser/lib_mp4_parser_arm11_elinux.so.3.1
Content Info:
URI:
file:///sample_sorenson.mov
Idx File:
//.aiur/.sample_sorenson.mov.aidx
Seekable : Yes
Size(byte): 82395


ERROR: from element /GstPipeline:pipeline0/GstAiurDemux:aiurdemux0: GStreamer encountered a general stream error.
Additional debug info:
../../../../../src/src/parser/aiur/src/aiurdemux.c(4332): aiurdemux_pull_task (): /GstPipeline:pipeline0/GstAiurDemux:aiurdemux0:
streaming stopped, reason error, state 1
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
[--->FINALIZE aiurdemux



Multi-Display

Multiple video playback on multiple screens (one video per screen).

2x1080p video + capture

Video 1 to monitor 1

gst-launch filesrc location= <VIDEO1> typefind=true ! aiurdemux ! queue ! vpudec ! mfw_v4lsink device=/dev/video16 &

Video 2 to monitor 2

gst-launch filesrc location= <VIDEO2> typefind=true ! aiurdemux ! queue ! vpudec ! mfw_v4lsink device=/dev/video18 &

Video capture to monitor 3

gst-launch mfw_v4lsrc ! mfw_v4lsink device=/dev/video20 &

3x720p video

Video 1 to monitor 1

gst-launch filesrc location= <VIDEO1> typefind=true ! aiurdemux ! queue ! vpudec ! mfw_v4lsink device=/dev/video16 &

Video 2 to monitor 2

gst-launch filesrc location= <VIDEO2> typefind=true ! aiurdemux ! queue ! vpudec ! mfw_v4lsink device=/dev/video18 &

Video 3 to monitor 3

gst-launch filesrc location= <VIDEO3> typefind=true ! aiurdemux ! queue ! vpudec ! mfw_v4lsink device=/dev/video20 &

Multi-Overlay

Several video playbacks on a single screen.

3 720p video overlay on HDMI monitor

To select the first monitor, use the following command:

DISP=mon1

Note: for the other monitors use:

  • Second monitor: DISP=mon2
  • Third monitor: DISP=mon3

Video 1 in left corner

gst-launch filesrc location= <VIDEO1> typefind=true ! aiurdemux ! queue ! vpudec ! mfw_isink display=$DISP axis-left=0 axis-top=0 disp-width=320 disp-height=240 &

Video 2 in the middle

gst-launch filesrc location= <VIDEO2> typefind=true ! aiurdemux ! queue ! vpudec ! mfw_isink display=$DISP axis-left=340 axis-top=0 disp-width=320 disp-height=240 &

Video 3 in right corner

gst-launch filesrc location= <VIDEO3> typefind=true ! aiurdemux ! queue ! vpudec ! mfw_isink display=$DISP axis-left=680 axis-top=0 disp-width=320 disp-height=240 &

Multi-Display and Multi-Overlay

Several video playbacks on multiple screens (multiple videos per screen).

The following pipelines play three videos on the first monitor, and two videos plus one live capture on the second monitor, for a total of six video windows across two monitors.

First Monitor

Left

gst-launch filesrc location= <VIDEO1> typefind=true ! aiurdemux ! queue ! vpudec ! mfw_isink display=mon1 axis-left=0 axis-top=0 disp-width=320 disp-height=240 &

Center

gst-launch filesrc location= <VIDEO2> typefind=true ! aiurdemux ! queue ! vpudec ! mfw_isink display=mon1 axis-left=340 axis-top=0 disp-width=320 disp-height=240 &

Right

gst-launch filesrc location= <VIDEO3> typefind=true ! aiurdemux ! queue ! vpudec ! mfw_isink display=mon1 axis-left=680 axis-top=0 disp-width=320 disp-height=240 &

Second Monitor

Left

gst-launch filesrc location= <VIDEO4> typefind=true ! aiurdemux ! queue ! vpudec ! mfw_isink display=mon2 axis-left=0 axis-top=0 disp-width=320 disp-height=240 &

Capture

gst-launch mfw_v4lsrc ! mfw_isink display=mon2 axis-left=340 axis-top=0 disp-width=320 disp-height=240 &

Right

gst-launch filesrc location= <VIDEO5> typefind=true ! aiurdemux ! queue ! vpudec ! mfw_isink display=mon2 axis-left=680 axis-top=0 disp-width=320 disp-height=240 &
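The per-window pipelines above differ only in the input file, the display, and axis-left (0, 340, 680). A small helper script can generate the tiling; this is a sketch that assumes the same 320x240 windows and 340-pixel horizontal step as the examples:

```shell
# tile_offsets: print the axis-left value for each of N side-by-side
# 320x240 windows, using the 340-pixel step from the examples above.
tile_offsets() {
    i=0
    while [ "$i" -lt "$1" ]; do
        echo $((i * 340))
        i=$((i + 1))
    done
}

# launch_tiled DISPLAY FILE...: one mfw_isink playback per file,
# tiled left to right on the given display (mon1, mon2, ...).
launch_tiled() {
    display=$1; shift
    left=0
    for video in "$@"; do
        gst-launch filesrc location="$video" typefind=true ! aiurdemux ! \
            queue ! vpudec ! mfw_isink display="$display" \
            axis-left="$left" axis-top=0 disp-width=320 disp-height=240 &
        left=$((left + 340))
    done
    wait
}

# Example: launch_tiled mon1 video1.mp4 video2.mp4 video3.mp4
```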

Audio Playback

MultiFormat audio Playback

This pipeline decodes the formats supported by the beepdec decoder.

Formats: MPEG audio, AAC and Vorbis.

Note: the mpegaudioparse element below limits this particular pipeline to MPEG audio (e.g. MP3); for AAC or Vorbis streams, use the matching parser or a demuxer instead.

gst-launch filesrc location= <audiotest> ! mpegaudioparse ! beepdec ! audioconvert ! alsasink

Audio Encoding

WAV to mp3

gst-launch filesrc location= test.wav ! wavparse ! mfw_mp3encoder ! filesink location= test.mp3

tested: 20130601

/ # gst-launch filesrc location=sample14.wav ! wavparse ! mfw_mp3encoder ! filesink location= test.mp3

BLN_MAD-MMCODECS_MP3E_ARM_02.02.00_ARM12 build on Mar 21 2012 17:13:02.
MFW_GST_MP3_ENCODER_PLUGIN 3.0.5 build on May 23 2013 16:08:41.
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: CODEC ERROR code 0x192, please consult Freescale codec team for more information
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 5921001 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Note: despite the codec error printed above, test.wav is successfully encoded to test.mp3.


RTP Video Streaming

These instructions show how to stream video over the network: the video plays on the board and is viewed on the host. The pipelines send packets to the default port (4951); to change the port, add the port property to udpsink (e.g. udpsink port=$PORT host=$CLIENT_IP).

Stream H.264 video test pattern over RTP

  • Server: iMX6
CLIENT_IP=10.251.101.58

gst-launch videotestsrc ! vpuenc  codec=6 ! queue ! h264parse ! rtph264pay ! udpsink host=$CLIENT_IP -v

tested: 20130531

Thanks to the -v option, this pipeline prints the capabilities of each element's pads. The output should look similar to this:

.
.
.
/GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)NV12, 
color-matrix=(string)sdtv, chroma-site=(string)mpeg2, width=(int)320, height=(int)240, framerate=(fraction)30/1
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstVpuEnc:vpuenc0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)NV12, 
color-matrix=(string)sdtv, chroma-site=(string)mpeg2, width=(int)320, height=(int)240, framerate=(fraction)30/1
[INFO]	chromaInterleave 1, mapType 0, linear2TiledEnable 0
/GstPipeline:pipeline0/GstVpuEnc:vpuenc0.GstPad:src: caps = video/x-h264, width=(int)320, 
height=(int)240, framerate=(fraction)30/1, framed=(boolean)true
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, width=(int)320, height=(int)240, 
framerate=(fraction)30/1, framed=(boolean)true, parsed=(boolean)true, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, width=(int)320, height=(int)240, 
framerate=(fraction)30/1, framed=(boolean)true
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h264, width=(int)320, height=(int)240, 
framerate=(fraction)30/1, framed=(boolean)true, parsed=(boolean)true, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-h264, width=(int)320, height=(int)240, 
framerate=(fraction)30/1, framed=(boolean)true, parsed=(boolean)true, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, width=(int)320, height=(int)240, 
framerate=(fraction)30/1, framed=(boolean)true, parsed=(boolean)true, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, 
clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0JAFKaBQfkA\\,aM4wpIAA\", payload=(int)96, 
ssrc=(uint)87645921, clock-base=(uint)1548379595, seqnum-base=(uint)847
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 1548379595
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 847
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, 
clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0JAFKaBQfkA\\,aM4wpIAA\", payload=(int)96, 
ssrc=(uint)87645921, clock-base=(uint)1548379595, seqnum-base=(uint)847  
.
.
.                                                                       

You need the udpsink:sink capabilities for the client pipeline.

  • Client: Ubuntu PC

Copy the udpsink caps printed by the server pipeline, then remove the spaces and the (uint) casts.

CAPS='application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,sprop-parameter-sets=(string)"Z0JAFKaBQfkA\,aM4wpIAA",payload=(int)96,ssrc=87645921,clock-base=1548379595,seqnum-base=847'

PORT=4951

gst-launch udpsrc port=$PORT ! $CAPS ! rtph264depay ! queue ! ffdec_h264 ! xvimagesink sync=false -v

tested: 20130531
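The manual caps cleanup can also be scripted. This sketch joins the lines of the pasted server caps, strips the spaces, and drops the (uint) casts; the caps value below is a trimmed example, not real server output:

```shell
# Paste the udpsink caps from the server's -v output (shortened example;
# your real caps will include sprop-parameter-sets as well).
server_caps='application/x-rtp, media=(string)video,
clock-rate=(int)90000, encoding-name=(string)H264,
payload=(int)96, ssrc=(uint)87645921, clock-base=(uint)1548379595, seqnum-base=(uint)847'

# Join lines, remove spaces, remove the (uint) casts.
CAPS=$(printf '%s' "$server_caps" | tr -d '\n' | sed -e 's/ //g' -e 's/(uint)//g')
echo "$CAPS"
```

The resulting $CAPS value can then be used directly in the client pipeline above.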

Stream MPEG4 encoded video file over RTP

These pipelines take a video file and send it over the network; any file encoded in MPEG-4 can be used.

  • Server: iMX6
CLIENT_IP=10.251.101.58

FILE=bbb_twomin_1080p.avi

gst-launch filesrc location=$FILE   typefind=true ! aiurdemux  ! queue ! mpeg4videoparse ! rtpmp4vpay ! udpsink host=$CLIENT_IP -v

tested: 20130531

As before, you need the udpsink:sink capabilities for the client pipeline.

  • Client: Ubuntu PC

Copy the udpsink caps printed by the server pipeline, then remove the spaces and the (uint) casts.

CAPS='application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP4V-ES,profile-level-id=(string)1,config=(string)000001b001000001b58913000001000000012000c48d8800c53c04871463000001b24c61766335312e34342e30,payload=(int)96,ssrc=1303818908,clock-base=1090442294,seqnum-base=63287'

PORT=4951

gst-launch udpsrc port=$PORT ! $CAPS ! rtpmp4vdepay ! queue ! ffdec_mpeg4 ! xvimagesink sync=false -v

tested: 20130531

Stream H.264 encoded video capture over RTP

These pipelines capture live video from the camera and send it over the network.

  • Server: iMX6
CLIENT_IP=10.251.101.58

gst-launch mfw_v4lsrc  capture-mode=5 fps-n=15 ! vpuenc codec=6 ! queue ! rtph264pay ! udpsink host=$CLIENT_IP -v

As before, you need the udpsink:sink capabilities for the client pipeline.

  • Client: Ubuntu PC

Copy the udpsink caps printed by the server pipeline, then remove the spaces and the (uint) casts.

CAPS='application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,sprop-parameter-sets=(string)"Z0JAKKaAeAIn5UAA\,aM4wpIAA",payload=(int)96,ssrc=121555752,clock-base=1624542567,seqnum-base=15553'

PORT=4951

gst-launch udpsrc port=$PORT ! $CAPS ! rtph264depay ! queue ! ffdec_h264 ! xvimagesink sync=false -v