Video Encoding and Playback (Linux)

iMX6 Modules

Encoding

Display a video on Apalis iMX6Q from a CSI Camera Module 5MP OV5640 source and concurrently store it H.264 encoded to a file:

root@apalis-imx6:~# gst-launch-1.0 imxv4l2videosrc device=/dev/video2 ! tee ! queue2 ! vpuenc_h264 ! qtmux ! filesink location=temp.mp4 tee0. ! imxeglvivsink -e

Take Still Image

Using the same setup with Apalis iMX6Q from a CSI Camera Module 5MP OV5640 source, one can capture a still image:

root@apalis-imx6:~# gst-launch-1.0 v4l2src num-buffers=1 device=/dev/video1 ! jpegenc ! filesink location=test.jpg

Take a Series of Still Images

Using the same setup with Apalis iMX6Q from a CSI Camera Module 5MP OV5640 source, one can capture a series of still images:

root@apalis-imx6:~# gst-launch-1.0 v4l2src device=/dev/video1 ! jpegenc ! multifilesink location=test%d.jpg
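multifilesink substitutes an incrementing frame counter (starting at 0) for the %d placeholder in the location. The resulting file names can be sketched without any camera attached (purely illustrative):

```shell
# multifilesink replaces %d with a frame counter starting at 0,
# so the pipeline above writes test0.jpg, test1.jpg, test2.jpg, ...
for i in 0 1 2; do printf 'test%d.jpg\n' "$i"; done
```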

Apalis TK1 Module

Encoding

The next pipeline will stream video from a UVC webcam.

root@apalis-tk1:~# gst-launch-0.10 v4l2src ! xvimagesink

The examples below show how to store still images or videos coming from a UVC webcam or our CSI Camera Module 5MP OV5640.

Take Still Image

Note: The following pipeline runs on our CSI Camera Module 5MP OV5640. Unfortunately the first few frames are currently black or greenish and need to be discarded.

root@apalis-tk1:~# gst-launch-0.10 v4l2src decimate=5 ! 'video/x-raw-yuv,format=(fourcc)UYVY,width=640,height=480' ! ffmpegcolorspace ! video/x-raw-rgb,framerate=1/1 ! ffmpegcolorspace ! pngenc ! filesink location=test.png

If you are using a UVC webcam, the following pipeline captures a still image:

root@apalis-tk1:~# gst-launch-0.10 v4l2src num-buffers=1 ! jpegenc ! filesink location=test.jpg

Take a Series of Still Images

As our CSI Camera Module 5MP OV5640 uses YUV as its colour encoding format, the following pipeline converts to RGB before encoding the frames as PNG files:

root@apalis-tk1:~# gst-launch-0.10 v4l2src queue-size=1 ! 'video/x-raw-yuv,format=(fourcc)UYVY,width=640,height=480' ! ffmpegcolorspace ! videorate ! video/x-raw-rgb,framerate=1/1 ! ffmpegcolorspace ! pngenc snapshot=false ! multifilesink location=test%d.png

Note: For GStreamer 1.0 the gstreamer1.0-plugins-good-multifile package is also required. It was not yet part of our Embedded Linux demo images as of versions 2.7b5 and 2.8b1.

root@apalis-tk1:~# gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw,format={UYVY},width=640,height=480' ! videorate max-rate=10 ! videoconvert ! avenc_png ! multifilesink location=test%d.png

VP8 Video Encoding

root@apalis-tk1:~# gst-launch-1.0 v4l2src ! 'video/x-raw,format={UYVY},width=1280,height=720,framerate=30/1' ! videoconvert ! 'video/x-raw,format={I420}' ! nvvidconv ! 'video/x-raw(memory:NVMM)' ! omxvp8enc bitrate=4000000 ! avimux ! filesink location=test.avi

H.264 Video Encoding

Note: For the bitrate property to take effect, the low-latency property and the rc-mode (GStreamer 0.10) or control-rate (GStreamer 1.0) property also need to be set, as in the example pipelines below.

root@apalis-tk1:~# gst-launch-0.10 v4l2src queue-size=1 ! 'video/x-raw-yuv,format=(fourcc)UYVY,width=1280,height=960' ! ffmpegcolorspace ! 'video/x-raw-yuv, format=(fourcc)I420' ! nv_omx_h264enc low-latency=true rc-mode=0 bitrate=4000000 ! qtmux ! filesink location=test.mp4 -e
root@apalis-tk1:~# gst-launch-1.0 v4l2src ! 'video/x-raw,format={UYVY},width=1280,height=720,framerate=30/1' ! videoconvert ! 'video/x-raw,format={I420}' ! nvvidconv ! 'video/x-raw(memory:NVMM)' ! omxh264enc low-latency=true bitrate=4000000 control-rate=2 ! 'video/x-h264,stream-format=(string)byte-stream' ! h264parse ! avimux ! filesink location=test.avi

Verify the formats supported by your camera using the command below:

root@apalis-tk1:~# v4l2-ctl --list-formats-ext
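To check for a particular pixel format in a script, the listing can be saved to a file and searched; the has_format helper and the formats.txt file name below are purely illustrative:

```shell
# has_format FILE FOURCC -- check a saved `v4l2-ctl --list-formats-ext`
# listing for a pixel format such as UYVY (format lines look like
# "[0]: 'UYVY' (UYVY 4:2:2)")
has_format() { grep -q "'$2'" "$1"; }

# e.g.: v4l2-ctl --list-formats-ext > formats.txt && has_format formats.txt UYVY
```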

To ensure format compatibility, also inspect the formats supported by videoconvert in the GStreamer version used in the pipeline:

root@apalis-tk1:~# gst-inspect-1.0 videoconvert

For further information and other GStreamer pipeline examples, see the Jetson TK1/Tegra Linux Driver Package Multimedia User Guide.

T20/T30 Modules

Gstreamer OpenMAX Wrapper Format Conversion

While regular GStreamer plugins usually use x-raw-yuv and NVIDIA's GStreamer-wrapped OpenMAX elements usually expect x-nvrm-yuv, there is a third colour format representation called x-nv-yuv, which is what nv_omx_videomixer requires. Some examples:

root@colibri_t20:~# gst-launch videotestsrc ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nv-yuv, width=(int)640, height=(int)480, format=(fourcc)I420' ! nv_omx_videomixer ! nv_gl_eglimagesink
root@colibri_t20:~# gst-launch videotestsrc ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nvxvimagesink
root@colibri_t20:~# gst-launch videotestsrc ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! nvxvimagesink

Video BSP V1.x

The following example shows how to play back video through GStreamer using a Colibri T20 module. Please note that the two numbers at the end specify which ALSA card and device to use for audio (e.g. alsasink device=hw:1,0 for SPDIF through HDMI and alsasink device=hw:2,0 for WM9715L AC97 through headphone). Further note that, since the example uses a Tegra module, the pipeline contains NVIDIA GStreamer elements; other modules may require different elements.

root@colibri_t20:~# gst-launch filesrc location=/home/root/nv_medusa_h264_720_6M_cbr_2p_key60_q90_aac128_44.mp4 ! qtdemux name=demux demux.video_00 ! nv_omx_h264dec ! nv_gl_videosink rendertarget=0 demux.audio_00 ! nv_omx_aacdec ! alsasink device=hw:1,0
[ 2388.624904] unknown ioctl code
Setting pipeline to PAUSED ...
[ 2388.677056] unknown ioctl code
Pipeline is PREROLLING ...
[ 2388.840327] avp_init: read firmware from 'nvrm_avp.bin' (36528 bytes)
[ 2388.846957] avp_init: Loading AVP kernel at vaddr=d8c00000 paddr=1ff00000
[ 2388.854848] avp_reset: Resetting AVP: reset_addr=100000
[ 2388.878221] avp_init: avp init done
[ 2388.905824] avp_svc_thread: got remote peer
[ 2388.910754] avp_lib: Successfully loaded library nvmm_manager.axf (lib_id=115da8)
[ 2388.953837] avp_lib: Successfully loaded library nvmm_service.axf (lib_id=117bb0)
[ 2389.017941] avp_lib: Successfully loaded library nvmm_h264dec.axf (lib_id=119a20)
[ 2389.036379] tegra_dvfs: rate 721500000 too high for dvfs on emc
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstAudioSinkClock
Got EOS from element "pipeline0".
Execution ended after 105287328998 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
[ 2494.724074] avp_lib: Successfully unloaded 'nvmm_h264dec.axf'
[ 2494.764073] avp_lib: Successfully unloaded 'nvmm_service.axf'
Setting pipeline to NULL ...
Freeing pipeline ...
[ 2494.804076] avp_lib: Successfully unloaded 'nvmm_manager.axf'
[ 2494.809853] avp_svc_thread: couldn't receive msg
[ 2494.814710] avp_svc_thread: done
[ 2494.817958] avp_uninit: avp teardown done

Video BSP V2.x

The following example shows how to play back video through GStreamer using a Colibri T20 module. Please note that the two numbers at the end specify which ALSA card and device to use for audio (e.g. alsasink device=hw:0,0 for WM9715L AC97 through headphone and alsasink device=hw:1,0 for SPDIF through HDMI). Further note that, since the example uses a Tegra module, the pipeline contains NVIDIA GStreamer elements; other modules may require a different pipeline.

Note: In case you experience banding issues, this is likely due to the 16-bit colour depth of our default image. To enable 24-bit colour depth, consult the Framebuffer (Linux) and X-Server (Linux) articles.

root@colibri-t20:~# gst-launch filesrc location=/home/root/nv_medusa_h264_720_6M_cbr_2p_key60_q90_aac128_44.mp4 ! qtdemux name=demux demux.video_00 ! nv_omx_h264dec ! nv_gl_eglimagesink demux.audio_00 ! nv_omx_aacdec ! alsasink device=hw:0,0
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...

omx_setup error while setting FilterTimestamp
[32309.790818] avp_init: Using AVP MMU to relocate AVP kernel
[32309.821369] avp_init: Reading firmware from 'nvrm_avp.bin' (46612 bytes)
[32309.828541] avp_init: Loading AVP kernel at vaddr=d7a00000 paddr=18100000
[32309.852155] avp_reset: Resetting AVP: reset_addr=100000
[32309.869479] avp_node_try_connect: trying connect from RPC_AVP_PORT
[32309.875957] process_connect_locked: got connect (111794)
[32309.881278] avp_svc_thread: got remote peer
[32309.885664] [AVP]: AVP kernel (Nov  2 2012 16:47:31)
[32309.896950] avp_node_try_connect: got conn ack 'RPC_AVP_PORT' (cc10dd80 <-> 111758)
[32309.904691] avp_init: avp init done
[32309.908175] avp_lib: loading library 'nvmm_manager.axf'
[32309.928749] avp_lib: Successfully loaded library nvmm_manager.axf (lib_id=118960)
[32309.936494] avp_node_try_connect: trying connect from NVMM_MANAGER_SRV
[32309.951837] avp_node_try_connect: got conn ack 'NVMM_MANAGER_SRV' (cc10db00 <-> 119c00)
[32309.961486] avp_lib: loading library 'nvmm_service.axf'
[32309.983984] avp_lib: Successfully loaded library nvmm_service.axf (lib_id=11a770)
[32309.991654] avp_node_try_connect: trying connect from nbaaaaaa+
[32310.006858] avp_node_try_connect: got conn ack 'nbaaaaaa+' (cc10d200 <-> 11bab8)
[32310.025884] avp_lib: loading library 'nvmm_h264dec.axf'
[32310.140791] avp_lib: Successfully loaded library nvmm_h264dec.axf (lib_id=11c628)
[32310.149179] avp_node_try_connect: trying connect from obaaaaaa+
[32310.161838] avp_node_try_connect: got conn ack 'obaaaaaa+' (cd049540 <-> 11cbe0)
Allocating new output: 1280x720 (x 9)
[32310.234392] tegra20_ac97_hw_params(): dai->id=0, play
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstAudioSinkClock
[32310.491623] tegra20_ac97_trigger()
[32310.495032] tegra20_ac97_start_playback()
[32310.499038] ac97_fifo_set_attention_level()
[32310.503216] ac97_slot_enable()
Got EOS from element "pipeline0".
Execution ended after 114565608000 ns.
Setting pipeline to PAUSED ...
[32425.070926] tegra20_ac97_trigger()
[32425.074368] tegra20_ac97_stop_playback()
[32425.078311] ac97_fifo_set_attention_level()
[32425.083534] ac97_slot_enable()
Setting pipeline to READY ...
[32425.129970] avp_trpc_close: closing 'obaaaaaa+' (11cbe0)
[32425.146843] _send_disconnect: sent disconnect msg for 11cbe0
[32425.182387] avp_lib: Successfully unloaded 'nvmm_h264dec.axf'
[32425.188896] avp_lib: unloaded 'nvmm_h264dec.axf'
[32425.212583] avp_trpc_close: closing 'nbaaaaaa+' (11bab8)
[32425.227385] _send_disconnect: sent disconnect msg for 11bab8
[32425.261971] avp_lib: Successfully unloaded 'nvmm_service.axf'
[32425.267764] avp_lib: unloaded 'nvmm_service.axf'
[32425.273879] process_disconnect_locked: got disconnect (cc10db00)
[32425.311926] avp_lib: Successfully unloaded 'nvmm_manager.axf'
[32425.317732] avp_lib: unloaded 'nvmm_manager.axf'
Setting pipeline to NULL ...
Freeing pipeline ...
[32425.405907] avp_svc_thread: AVP seems to be down; wait for kthread_stop
[32425.413156] avp_svc_thread: exiting
[32425.416771] avp_uninit: avp teardown done

Encoding

Our T20 BSP V2.x supports video encoding as well.

Note the required '-e' option to gst-launch for mp4 containers. It sends an end-of-stream (EOS) event through the pipeline and ensures a correctly written mp4 file. The tests were done with a Logitech C920 webcam; the possible resolutions etc. depend on the webcam used.

Resolutions other than 640x480 require the GStreamer plugins from NVIDIA's L4T R16.3, which will only be part of our images released after June 2013.

VGA V4L2 source in YUV 4:2:2 format encoded to H.264 and stored to a file:

root@colibri_t20:~# gst-launch -e v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4
gst-launch v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! video/x-h264 ! avimux ! filesink location=temp.avi

800x448 V4L2 source in YUV 4:2:2 format encoded to H.264 and stored to a file:

root@colibri_t20:~# gst-launch -e v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)800, height=(int)448, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4

720p V4L2 source in YUV 4:2:2 format encoded to H.264 and stored to a file:

root@colibri_t20:~# gst-launch -e v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)1280, height=(int)720, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4

1280x1024 videotestsrc source in YUV 4:2:2 format encoded to H.264 and stored to a file:

root@colibri_t20:~# gst-launch -e videotestsrc ! 'video/x-raw-yuv, width=(int)1280, height=(int)1024, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4

videotestsrc source in YUV 4:2:0 format encoded to H.264 and stored to a file:

root@colibri_t20:~# gst-launch -e videotestsrc ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)I420' ! nvvidconv ! 'video/x-nvrm-yuv' ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4

Display a video from a VGA V4L2 source and concurrently store it H.264 encoded to a file:

root@colibri_t20:~# gst-launch -e v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! tee ! nvxvimagesink tee0. ! queue2 ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4

Display a video on Apalis T30 from an OV5640 CSI-2 full-HD V4L2 source and concurrently store it H.264 encoded to a file:

root@apalis-t30:~# gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv, framerate=15/1, width=1920, height=1088, format=(fourcc)I420' ! tee ! nv_omx_videomixer ! nv_gl_eglimagesink tee0. ! queue2 ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4 -e

Note: The encoder is limited to resolutions divisible by 16 (e.g. 1920x1088 instead of 1920x1080).

Note 2: On T30, the achievable frame rate for full HD is around 15 FPS.
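The rounding to the next multiple of 16 can be done with plain shell arithmetic; round16 is a hypothetical helper name, not part of any tool:

```shell
# Round a dimension up to the next multiple of 16, as the encoder
# requires (e.g. 1080 -> 1088, while 1920 stays 1920)
round16() { echo $(( ($1 + 15) / 16 * 16 )); }
round16 1080
round16 1920
```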

RTP

# Testrun to find out the used video capabilities
root@colibri-t20:~# gst-launch -v filesrc location=temp.mp4 ! qtdemux name=demux demux.video_00 ! queue ! rtph264pay pt=96 ! udpsink host=localhost port=5000 demux.audio_00 ! queue ! rtpmp4apay pt=97 ! udpsink host=localhost port=5001     
Setting pipeline to PAUSED ...                                                 
Pipeline is PREROLLING ...                                                     
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)2.1, profile=(string)constrained-baseline, codec_data=(buffer)01424015030100096742802895a0280f4401000468ce3c80, width=(int)640, height=(int)480, framerate=(fraction)125/4, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)2.1, profile=(string)constrained-baseline, codec_data=(buffer)01424015030100096742802895a0280f4401000468ce3c80, width=(int)640, height=(int)480, framerate=(fraction)125/4, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0KAKJWgKA9E\\,aM48gA\\=\\=\", payload=(int)96, ssrc=(uint)3804678311, clock-base=(uint)311926741, seqnum-base=(uint)59376
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)2.1, profile=(string)constrained-baseline, codec_data=(buffer)01424015030100096742802895a0280f4401000468ce3c80, width=(int)640, height=(int)480, framerate=(fraction)125/4, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 311926741
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 59376
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0KAKJWgKA9E\\,aM48gA\\=\\=\", payload=(int)96, ssrc=(uint)3804678311, clock-base=(uint)311926741, seqnum-base=(uint)59376
^CCaught interrupt -- handling interrupt.                                      
Interrupt: Stopping pipeline ...                                               
ERROR: pipeline doesn't want to preroll.                                       
Setting pipeline to NULL ...                                                   
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = NULL            
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = NULL
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = NULL                 
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = NULL                
/GstPipeline:pipeline0/GstQTDemux:demux.GstPad:video_00: caps = NULL           
Freeing pipeline ...                                                           

# use the displayed video capabilities
VCAPS="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0KAKJWgKA9E\\,aM48gA\\=\\=\", payload=(int)96, ssrc=(uint)3804678311, clock-base=(uint)311926741, seqnum-base=(uint)59376"
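Instead of copying the caps by hand, they can be pulled out of a saved verbose gst-launch log with sed; the gst.log file name is an assumption for this sketch:

```shell
# Extract the RTP caps from a saved `gst-launch -v` log instead of
# copying them manually (gst.log is an example file name)
VCAPS=$(sed -n 's/.*udpsink0\.GstPad:sink: caps = \(application\/x-rtp.*\)/\1/p' gst.log 2>/dev/null | head -n 1)
echo "$VCAPS"
```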

# start the receiver with the found video capabilities
root@colibri_t20:~# gst-launch udpsrc port=5000 ! $VCAPS ! rtph264depay ! nv_omx_h264dec ! nv_gl_eglimagesink

# launch the sender again
root@colibri_t20:~# gst-launch-0.10 -e v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! rtph264pay pt=96 ! udpsink host=localhost port=5000

Fullscreen via alternate Video Sink (BSP V1.x)

nv_omx_hdmioverlaysink
=> fullscreen via HDMI aka DVI-D interface

nv_omx_lvdsoverlaysink
=> fullscreen via TFT LCD aka LVDS, VGA or DVI-A interface
Note: So far this only works as long as one uses a display manager that is limited to one single resolution (e.g. the libnvodm_disp.so.vgaonly one). This is currently being investigated by NVIDIA.

Fullscreen via alternate Video Sink (since BSP V2.1 Beta 3)

nv_omx_hdmi_videosink
=> fullscreen via HDMI-1 aka DVI-D interface

nv_omx_videosink
=> fullscreen via LVDS-1 aka parallel RGB, VGA or DVI-A interface

Video Codecs (BSP V1.x)

root@colibri_t20:~# gst-launch filesrc location=/home/root/media/MatrixXQ.avi ! avidemux name=demux demux.video_00 ! nv_omx_mpeg4dec ! nv_gl_videosink rendertarget=0 demux.audio_00 ! nv_omx_mp3dec ! alsasink device=hw:2,0

root@colibri_t20:~# gst-launch filesrc location=/home/root/media/Avatar_-_Featurette_HD_1080p.mov ! qtdemux name=demux demux.video_00 ! nv_omx_h264dec ! nv_gl_videosink rendertarget=0 demux.audio_00 ! nv_omx_aacdec ! alsasink device=hw:2,0

root@colibri_t20:~# gst-launch filesrc location=/home/root/media/bourne_ultimatum_trailer.mp4 ! qtdemux name=demux demux.video_00 ! nv_omx_h264dec ! nv_gl_videosink rendertarget=0 demux.audio_00 ! nv_omx_aacdec ! alsasink device=hw:2,0

Video Codecs (BSP V2.x)

The easiest way is to use NVIDIA's proprietary nvgstplayer application as follows:

nvgstplayer -i /media/sda1/nv_medusa_h264_720_6M_cbr_2p_key60_q90_aac128_44.mp4 --svs nv_omx_videosink

The audio output can be chosen via ~/.asoundrc (e.g. the following is similar to specifying alsasink device=hw:0,1):

pcm.!default {
    type hw
    card 0
    device 1
}
ctl.!default {
    type hw
    card 0
    device 1
}
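Analogously, to route audio to SPDIF through HDMI (i.e. the equivalent of alsasink device=hw:1,0 on BSP V2.x), the configuration would be:

```
pcm.!default {
    type hw
    card 1
    device 0
}
ctl.!default {
    type hw
    card 1
    device 0
}
```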

Alternatively, one can explicitly specify a GStreamer pipeline as follows:

root@colibri_t20:~# gst-launch filesrc location=/home/root/media/MatrixXQ.avi ! avidemux name=demux demux.video_00 ! nv_omx_mpeg4dec ! nv_gl_videosink rendertarget=0 demux.audio_00 ! nv_omx_mp3dec ! alsasink device=hw:2,0

root@colibri_t20:~# gst-launch filesrc location=/media/sda1/nv_medusa_h264_720_6M_cbr_2p_key60_q90_aac128_44.mp4 ! qtdemux name=demux demux.video_00 ! nv_omx_h264dec ! nvxvimagesink demux.audio_00 ! nv_omx_aacdec ! alsasink device=hw:0,0

Video Playback on Colibri i.MX7 Modules

Attention: Colibri i.MX7 modules lack hardware-accelerated video decoding and their memory bandwidth is limited. Therefore, software decoders must be used. Depending on the video's resolution and encoding, playback performance may be poor.

Encoding

JPEG Image Encoding

Although Toradex Embedded Linux BSPs 2.7.4 and 2.8 do not ship a JPEG encoder for Colibri i.MX7 by default, the required GStreamer plugin can be downloaded and installed:

root@colibri-imx7:~# opkg update
root@colibri-imx7:~# opkg install gstreamer1.0-plugins-good-jpeg

Take Still Image

To take a still image with the JPEG encoder using a UVC webcam, one can use the following GStreamer pipeline:

root@colibri-imx7:~# gst-launch-1.0 v4l2src num-buffers=1 ! jpegenc ! filesink location=test.jpg

MKV and OGG Video Encoding

Toradex Embedded Linux BSPs 2.7.4 and 2.8 currently ship only one video encoder (theoraenc) and one video decoder (theoradec), as the commands below show:

root@colibri-imx7:~# gst-inspect-1.0 | grep enc
vorbis:  vorbisenc: Vorbis audio encoder
theora:  theoraenc: Theora video encoder
encoding:  encodebin: Encoder Bin
wavenc:  wavenc: WAV audio muxer
imxmp3enc.imx:  imxmp3enc: imx mp3 audio encoder
coretracers:  latency (GstTracerFactory)

root@colibri-imx7:~# gst-inspect-1.0 | grep dec
vorbis:  vorbisdec: Vorbis audio decoder
ivorbisdec:  ivorbisdec: Vorbis audio decoder
theora:  theoradec: Theora video decoder
beep.imx:  beepdec: Beep universal decoder
playback:  uridecodebin: URI Decoder
playback:  decodebin: Decoder Bin

To record a video into an MKV container using a UVC webcam, one can use the following GStreamer pipeline:

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, framerate=30/1, width=640,height=480' ! videoconvert ! theoraenc ! matroskamux ! filesink location=test.mkv

To record a video into an OGG container using a UVC webcam, one can use the following GStreamer pipeline:

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, framerate=30/1, width=640,height=480' ! videoconvert ! theoraenc ! oggmux ! filesink location=test.ogg

Video Playback

To play the recorded videos, one can use the following GStreamer pipelines:

gst-launch-1.0 filesrc location=test.mkv ! matroskademux ! theoradec ! videoconvert ! autovideosink

gst-launch-1.0 filesrc location=test.ogg ! oggdemux ! theoradec ! videoconvert ! autovideosink

Video Encoding on Colibri i.MX6 ULL Modules

Attention: Colibri i.MX6 ULL modules lack hardware-accelerated video encoding and decoding and their memory bandwidth is limited. Therefore, software codecs must be used. Depending on the video's resolution and encoding, performance may be poor.

Encoding

To encode videos, it is first necessary to install GStreamer and its plugins with the following command:

root@colibri-imx6ull:~# opkg install gstreamer1.0-plugins-base-ximagesink gstreamer1.0-plugins-good-video4linux2 gstreamer1.0-plugins-base-videoconvert gstreamer1.0-plugins-bad-fbdevsink gstreamer1.0-plugins-base-theora gstreamer1.0-plugins-good-matroska gstreamer1.0-plugins-base-ogg

To encode and record MKV videos, one can use the following command:

root@colibri-imx6ull:~# gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, framerate=30/1' ! videoconvert ! theoraenc ! matroskamux ! filesink location=./test1.mkv

To encode and record OGG videos, one can use the following command:

root@colibri-imx6ull:~# gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=1280,height=720 ! videoconvert ! theoraenc ! oggmux ! filesink location=videotestsrc.ogg

Video Playback on Vybrid Modules

Attention: Vybrid modules lack hardware-accelerated video decoding and their memory bandwidth is limited. Therefore, software decoders must be used. Depending on the video's resolution and encoding, playback performance may be poor.

A collection of open decoders, along with other useful GStreamer packages, may be installed using the following command:

opkg install gst-ffmpeg gst-plugins-base-ffmpegcolorspace gst-plugins-base-ximagesink gst-plugins-base-alsa gst-plugins-good-isomp4 gst-plugins-good-matroska 

With these additions, an H.264-encoded mp4 may, for example, be played back as follows:

gst-launch-0.10 filesrc location=test_vid.mp4 ! qtdemux name=demux demux.video_00 ! queue ! ffdec_h264 ! ffmpegcolorspace ! ximagesink demux.audio_00 ! queue ! ffdec_aac ! alsasink device=hw:0,0

Note: Software video decoding is very CPU-intensive, which significantly limits the resolution and bitrate at which video can be smoothly played back.

Note: The VF50's flash storage is particularly size-constrained; plan accordingly by minimizing the OS and installed packages or by utilizing SD card or other external storage.

Video over HTTP

A video may be played from an HTTP source by installing the souphttpsrc plugin for GStreamer:

opkg install gst-plugins-good-souphttpsrc

An example GStreamer pipeline for playback over HTTP (additionally requiring the vorbis decoder and audioconvert GStreamer plugin packages):

gst-launch-0.10 souphttpsrc location=http://upload.wikimedia.org/wikipedia/commons/4/4b/MS_Diana_genom_Bergs_slussar_16_maj_2014.webm ! matroskademux name=demux demux.video_00 ! queue ! ffdec_vp8 ! ffmpegcolorspace ! ximagesink demux.audio_00 ! queue ! vorbisdec ! audioconvert ! alsasink device=hw:0,0