
Video Encoding and Playback (Linux)

 






Tegra Modules

Gstreamer OpenMAX Wrapper Format Conversion

While regular GStreamer plugins usually use x-raw-yuv and NVIDIA's GStreamer-wrapped OpenMAX elements usually expect x-nvrm-yuv, there is a third colour format representation called x-nv-yuv, which is what nv_omx_videomixer requires. Some examples:

gst-launch videotestsrc ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nv-yuv, width=(int)640, height=(int)480, format=(fourcc)I420' ! nv_omx_videomixer ! nv_gl_eglimagesink
gst-launch videotestsrc ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nvxvimagesink
gst-launch videotestsrc ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! nvxvimagesink

Video BSP V1.x

The following example shows how to play back video through GStreamer on the Colibri T20. Please note that the two numbers at the end specify which ALSA card and device to use for audio (e.g. alsasink device=hw:1,0 for S/PDIF through HDMI and alsasink device=hw:2,0 for WM9715L AC97 through the headphone output). Further note that, since the example uses a Tegra module, the pipeline contains NVIDIA GStreamer elements; other modules may require different elements.

root@colibri_t20:~# gst-launch filesrc location=/home/root/nv_medusa_h264_720_6M_cbr_2p_key60_q90_aac128_44.mp4 ! qtdemux name=demux demux.video_00 ! nv_omx_h264dec ! nv_gl_videosink rendertarget=0 demux.audio_00 ! nv_omx_aacdec ! alsasink device=hw:1,0
[ 2388.624904] unknown ioctl code
Setting pipeline to PAUSED ...
[ 2388.677056] unknown ioctl code
Pipeline is PREROLLING ...
[ 2388.840327] avp_init: read firmware from 'nvrm_avp.bin' (36528 bytes)
[ 2388.846957] avp_init: Loading AVP kernel at vaddr=d8c00000 paddr=1ff00000
[ 2388.854848] avp_reset: Resetting AVP: reset_addr=100000
[ 2388.878221] avp_init: avp init done
[ 2388.905824] avp_svc_thread: got remote peer
[ 2388.910754] avp_lib: Successfully loaded library nvmm_manager.axf (lib_id=115da8)
[ 2388.953837] avp_lib: Successfully loaded library nvmm_service.axf (lib_id=117bb0)
[ 2389.017941] avp_lib: Successfully loaded library nvmm_h264dec.axf (lib_id=119a20)
[ 2389.036379] tegra_dvfs: rate 721500000 too high for dvfs on emc
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstAudioSinkClock
Got EOS from element "pipeline0".
Execution ended after 105287328998 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
[ 2494.724074] avp_lib: Successfully unloaded 'nvmm_h264dec.axf'
[ 2494.764073] avp_lib: Successfully unloaded 'nvmm_service.axf'
Setting pipeline to NULL ...
Freeing pipeline ...
[ 2494.804076] avp_lib: Successfully unloaded 'nvmm_manager.axf'
[ 2494.809853] avp_svc_thread: couldn't receive msg
[ 2494.814710] avp_svc_thread: done
[ 2494.817958] avp_uninit: avp teardown done
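The hw:card,device strings used above can be taken apart in plain shell, which is handy when scripting pipelines for different audio outputs; a minimal sketch (variable names are illustrative, and `aplay -l` lists the cards actually present on a given module):

```shell
# Split an ALSA device string such as hw:1,0 into its card and device numbers
DEV="hw:1,0"
CARD=${DEV#hw:}; CARD=${CARD%,*}   # strip the "hw:" prefix and the ",device" suffix
DEVICE=${DEV##*,}                  # keep everything after the last comma
echo "card=$CARD device=$DEVICE"   # prints: card=1 device=0
```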

Video BSP V2.x

The following example shows how to play back video through GStreamer on the Colibri T20. Please note that the two numbers at the end specify which ALSA card and device to use for audio (e.g. alsasink device=hw:0,0 for WM9715L AC97 through the headphone output and alsasink device=hw:1,0 for S/PDIF through HDMI). Further note that, since the example uses a Tegra module, the pipeline contains NVIDIA GStreamer elements; other modules may require an alternate pipeline.

Note: If you experience colour banding, this is likely due to the 16-bit colour depth of our default image. To enable 24-bit colour depth, consult the Framebuffer (Linux) and X-Server (Linux) articles.

root@colibri-t20:~# gst-launch filesrc location=/home/root/nv_medusa_h264_720_6M_cbr_2p_key60_q90_aac128_44.mp4 ! qtdemux name=demux demux.video_00 ! nv_omx_h264dec ! nv_gl_eglimagesink demux.audio_00 ! nv_omx_aacdec ! alsasink device=hw:0,0
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...

omx_setup error while setting FilterTimestamp
[32309.790818] avp_init: Using AVP MMU to relocate AVP kernel
[32309.821369] avp_init: Reading firmware from 'nvrm_avp.bin' (46612 bytes)
[32309.828541] avp_init: Loading AVP kernel at vaddr=d7a00000 paddr=18100000
[32309.852155] avp_reset: Resetting AVP: reset_addr=100000
[32309.869479] avp_node_try_connect: trying connect from RPC_AVP_PORT
[32309.875957] process_connect_locked: got connect (111794)
[32309.881278] avp_svc_thread: got remote peer
[32309.885664] [AVP]: AVP kernel (Nov  2 2012 16:47:31)
[32309.896950] avp_node_try_connect: got conn ack 'RPC_AVP_PORT' (cc10dd80 <-> 111758)
[32309.904691] avp_init: avp init done
[32309.908175] avp_lib: loading library 'nvmm_manager.axf'
[32309.928749] avp_lib: Successfully loaded library nvmm_manager.axf (lib_id=118960)
[32309.936494] avp_node_try_connect: trying connect from NVMM_MANAGER_SRV
[32309.951837] avp_node_try_connect: got conn ack 'NVMM_MANAGER_SRV' (cc10db00 <-> 119c00)
[32309.961486] avp_lib: loading library 'nvmm_service.axf'
[32309.983984] avp_lib: Successfully loaded library nvmm_service.axf (lib_id=11a770)
[32309.991654] avp_node_try_connect: trying connect from nbaaaaaa+
[32310.006858] avp_node_try_connect: got conn ack 'nbaaaaaa+' (cc10d200 <-> 11bab8)
[32310.025884] avp_lib: loading library 'nvmm_h264dec.axf'
[32310.140791] avp_lib: Successfully loaded library nvmm_h264dec.axf (lib_id=11c628)
[32310.149179] avp_node_try_connect: trying connect from obaaaaaa+
[32310.161838] avp_node_try_connect: got conn ack 'obaaaaaa+' (cd049540 <-> 11cbe0)
Allocating new output: 1280x720 (x 9)
[32310.234392] tegra20_ac97_hw_params(): dai->id=0, play
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstAudioSinkClock
[32310.491623] tegra20_ac97_trigger()
[32310.495032] tegra20_ac97_start_playback()
[32310.499038] ac97_fifo_set_attention_level()
[32310.503216] ac97_slot_enable()
Got EOS from element "pipeline0".
Execution ended after 114565608000 ns.
Setting pipeline to PAUSED ...
[32425.070926] tegra20_ac97_trigger()
[32425.074368] tegra20_ac97_stop_playback()
[32425.078311] ac97_fifo_set_attention_level()
[32425.083534] ac97_slot_enable()
Setting pipeline to READY ...
[32425.129970] avp_trpc_close: closing 'obaaaaaa+' (11cbe0)
[32425.146843] _send_disconnect: sent disconnect msg for 11cbe0
[32425.182387] avp_lib: Successfully unloaded 'nvmm_h264dec.axf'
[32425.188896] avp_lib: unloaded 'nvmm_h264dec.axf'
[32425.212583] avp_trpc_close: closing 'nbaaaaaa+' (11bab8)
[32425.227385] _send_disconnect: sent disconnect msg for 11bab8
[32425.261971] avp_lib: Successfully unloaded 'nvmm_service.axf'
[32425.267764] avp_lib: unloaded 'nvmm_service.axf'
[32425.273879] process_disconnect_locked: got disconnect (cc10db00)
[32425.311926] avp_lib: Successfully unloaded 'nvmm_manager.axf'
[32425.317732] avp_lib: unloaded 'nvmm_manager.axf'
Setting pipeline to NULL ...
Freeing pipeline ...
[32425.405907] avp_svc_thread: AVP seems to be down; wait for kthread_stop
[32425.413156] avp_svc_thread: exiting
[32425.416771] avp_uninit: avp teardown done

Encoding

Our T20 BSP V2.x supports video encoding as well.

Note the '-e' option to gst-launch, which is required for MP4 containers: it sends an end-of-stream event through the pipeline and ensures a correctly finalized MP4 file. The tests were done with a Logitech C920; possible resolutions etc. depend on the webcam used.

Resolutions other than 640x480 require the GStreamer plugins from NVIDIA's L4T R16.3, which are only part of our images released after June 2013.

VGA V4L2 source in YUV 4:2:2 format encoded to H264 and stored to a file:

gst-launch -e v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4
gst-launch v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! video/x-h264 ! avimux ! filesink location=temp.avi

800x448 V4L2 source in YUV 4:2:2 format encoded to H264 and stored to a file:

gst-launch -e v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)800, height=(int)448, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4

720p V4L2 source in YUV 4:2:2 format encoded to H264 and stored to a file:

gst-launch -e v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)1280, height=(int)720, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4

1280x1024 videotestsource in YUV 4:2:2 format encoded to H264 and stored to a file:

gst-launch -e videotestsrc ! 'video/x-raw-yuv, width=(int)1280, height=(int)1024, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4

videotestsource in YUV 4:2:0 format encoded to H264 and stored to a file:

gst-launch -e videotestsrc ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)I420' ! nvvidconv ! 'video/x-nvrm-yuv' ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4

Display a video from a VGA V4L2 source and concurrently store it H264 encoded to a file:

gst-launch -e v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! tee ! nvxvimagesink tee0. ! queue2 ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4

Display a video on Apalis T30 from an OV5640 CSI-2 full HD V4L2 source and concurrently store it H264 encoded to a file:

gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv, framerate=15/1, width=1920, height=1088, format=(fourcc)I420' ! tee ! nv_omx_videomixer ! nv_gl_eglimagesink tee0. ! queue2 ! nv_omx_h264enc ! qtmux ! filesink location=temp.mp4 -e

Note: The encoder is limited to resolutions divisible by 16 (e.g. 1920x1088 instead of 1920x1080).

Note 2: On the T30 the achievable frame rate for full HD is around 15 FPS.
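To satisfy the divisible-by-16 constraint, a requested resolution can be rounded up to the next multiple of 16 before building the pipeline; a minimal shell sketch (variable names are illustrative):

```shell
# Round a requested height of 1080 up to the next multiple of 16,
# as required by the hardware encoder (1080 -> 1088)
HEIGHT=1080
ALIGNED_HEIGHT=$(( (HEIGHT + 15) / 16 * 16 ))
echo $ALIGNED_HEIGHT   # prints: 1088
```

The aligned value can then be substituted into the height=(int)... capability of the pipeline.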

Display a video on Apalis iMX6Q from an OV5640 CSI-2 VGA V4L2 source and concurrently store it H264 encoded to a file:

gst-launch-1.0 imxv4l2videosrc device=/dev/video2 ! tee ! queue2 ! vpuenc_h264 ! qtmux ! filesink location=temp.mp4 tee0. ! imxeglvivsink -e

RTP

# Testrun to find out the used video capabilities
root@colibri-t20:~# gst-launch -v filesrc location=temp.mp4 ! qtdemux name=demux demux.video_00 ! queue ! rtph264pay pt=96 ! udpsink host=localhost port=5000 demux.audio_00 ! queue ! rtpmp4apay pt=97 ! udpsink host=localhost port=5001     
Setting pipeline to PAUSED ...                                                 
Pipeline is PREROLLING ...                                                     
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)2.1, profile=(string)constrained-baseline, codec_data=(buffer)01424015030100096742802895a0280f4401000468ce3c80, width=(int)640, height=(int)480, framerate=(fraction)125/4, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)2.1, profile=(string)constrained-baseline, codec_data=(buffer)01424015030100096742802895a0280f4401000468ce3c80, width=(int)640, height=(int)480, framerate=(fraction)125/4, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0KAKJWgKA9E\\,aM48gA\\=\\=\", payload=(int)96, ssrc=(uint)3804678311, clock-base=(uint)311926741, seqnum-base=(uint)59376
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)2.1, profile=(string)constrained-baseline, codec_data=(buffer)01424015030100096742802895a0280f4401000468ce3c80, width=(int)640, height=(int)480, framerate=(fraction)125/4, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 311926741
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 59376
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0KAKJWgKA9E\\,aM48gA\\=\\=\", payload=(int)96, ssrc=(uint)3804678311, clock-base=(uint)311926741, seqnum-base=(uint)59376
^CCaught interrupt -- handling interrupt.                                      
Interrupt: Stopping pipeline ...                                               
ERROR: pipeline doesn't want to preroll.                                       
Setting pipeline to NULL ...                                                   
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = NULL            
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = NULL      
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = NULL       
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = NULL                 
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = NULL                
/GstPipeline:pipeline0/GstQTDemux:demux.GstPad:video_00: caps = NULL           
Freeing pipeline ...                                                           

# use the displayed video capabilities
VCAPS="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0KAKJWgKA9E\\,aM48gA\\=\\=\", payload=(int)96, ssrc=(uint)3804678311, clock-base=(uint)311926741, seqnum-base=(uint)59376"

# start the receiver with the found video capabilities
gst-launch udpsrc port=5000 ! $VCAPS ! rtph264depay ! nv_omx_h264dec ! nv_gl_eglimagesink

# launch the sender again
gst-launch-0.10 -e v4l2src device="/dev/video0" ! 'video/x-raw-yuv, width=(int)640, height=(int)480, format=(fourcc)YUY2' ! nvvidconv ! 'video/x-nvrm-yuv, format=(fourcc)I420' ! nv_omx_h264enc ! rtph264pay pt=96 ! udpsink host=localhost port=5000
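Copying the caps string by hand from the verbose testrun is error-prone; if the testrun output is saved to a file (testrun.log is an assumed file name, redirect the output of the testrun into it first), the same string can be extracted with standard tools, e.g.:

```shell
# Pull the negotiated RTP video caps out of a saved gst-launch -v log
# (testrun.log is an assumed file name)
VCAPS=$(grep -m1 'udpsink0.GstPad:sink: caps = application/x-rtp' testrun.log \
        | sed 's/.*caps = //')
echo "$VCAPS"
```

The extracted string can then be passed to the receiver in the same way as the hand-copied one.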

Fullscreen via alternate Video Sink (BSP V1.x)

nv_omx_hdmioverlaysink
=> fullscreen via the HDMI (DVI-D) interface

nv_omx_lvdsoverlaysink
=> fullscreen via the TFT LCD (LVDS), VGA, or DVI-A interface
Note: So far this only works with a display manager that is limited to one single resolution (e.g. the libnvodm_disp.so.vgaonly one). This is currently being investigated by NVIDIA.

Fullscreen via alternate Video Sink (since BSP V2.1 Beta 3)

nv_omx_hdmi_videosink
=> fullscreen via the HDMI-1 (DVI-D) interface

nv_omx_videosink
=> fullscreen via LVDS-1 (parallel RGB), VGA, or DVI-A interface

Video Codecs (BSP V1.x)

gst-launch filesrc location=/home/root/media/MatrixXQ.avi ! avidemux name=demux demux.video_00 ! nv_omx_mpeg4dec ! nv_gl_videosink rendertarget=0 demux.audio_00 ! nv_omx_mp3dec ! alsasink device=hw:2,0

gst-launch filesrc location=/home/root/media/Avatar_-_Featurette_HD_1080p.mov ! qtdemux name=demux demux.video_00 ! nv_omx_h264dec ! nv_gl_videosink rendertarget=0 demux.audio_00 ! nv_omx_aacdec ! alsasink device=hw:2,0

gst-launch filesrc location=/home/root/media/bourne_ultimatum_trailer.mp4 ! qtdemux name=demux demux.video_00 ! nv_omx_h264dec ! nv_gl_videosink rendertarget=0 demux.audio_00 ! nv_omx_aacdec ! alsasink device=hw:2,0

Video Codecs (BSP V2.x)

The easiest way is to use NVIDIA's proprietary nvgstplayer application as follows:

nvgstplayer -i /media/sda1/nv_medusa_h264_720_6M_cbr_2p_key60_q90_aac128_44.mp4 --svs nv_omx_videosink

The audio output can be chosen via ~/.asoundrc (e.g. equivalent to alsasink device=hw:0,1) as follows:

pcm.!default {
    type hw
    card 0
    device 1
}
ctl.!default {
    type hw
    card 0
    device 1
}

Alternatively, one can explicitly specify a GStreamer pipeline as follows:

gst-launch filesrc location=/home/root/media/MatrixXQ.avi ! avidemux name=demux demux.video_00 ! nv_omx_mpeg4dec ! nv_gl_videosink rendertarget=0 demux.audio_00 ! nv_omx_mp3dec ! alsasink device=hw:2,0

gst-launch filesrc location=/media/sda1/nv_medusa_h264_720_6M_cbr_2p_key60_q90_aac128_44.mp4 ! qtdemux name=demux demux.video_00 ! nv_omx_h264dec ! nvxvimagesink demux.audio_00 ! nv_omx_aacdec ! alsasink device=hw:0,0

Video Playback on Vybrid Modules

Video playback on Vybrid modules is quite similar to the Tegra modules described in the previous sections; however, Vybrid modules lack the hardware-accelerated decoding provided by the Tegra modules, so software decoders must be used instead.

A collection of open decoders, along with other useful GStreamer packages, may be installed using the following command:

opkg install gst-ffmpeg gst-plugins-base-ffmpegcolorspace gst-plugins-base-ximagesink gst-plugins-base-alsa gst-plugins-good-isomp4 gst-plugins-good-matroska 

With these additions, an H.264-encoded MP4 file may, for example, be played back as follows:

gst-launch-0.10 filesrc location=test_vid.mp4 ! qtdemux name=demux demux.video_00 ! queue ! ffdec_h264 ! ffmpegcolorspace ! ximagesink demux.audio_00 ! queue ! ffdec_aac ! alsasink device=hw:0,0

Note: Software video decoding is very CPU intensive, which significantly limits the resolution and bitrate at which video can be played back smoothly.

Note: The flash of the VF50 is particularly size-constrained; plan accordingly by minimizing the OS and installed packages or by utilizing an SD card or other external storage.

Video over HTTP

A video may be played from an HTTP source after installing the souphttpsrc plugin for GStreamer:

opkg install gst-plugins-good-souphttpsrc

An example GStreamer pipeline for playback over HTTP (additionally requiring the Vorbis decoder and audioconvert GStreamer plugin packages):

gst-launch-0.10 souphttpsrc location=http://upload.wikimedia.org/wikipedia/commons/4/4b/MS_Diana_genom_Bergs_slussar_16_maj_2014.webm ! matroskademux name=demux demux.video_00 ! queue ! ffdec_vp8 ! ffmpegcolorspace ! ximagesink demux.audio_00 ! queue ! vorbisdec ! audioconvert ! alsasink device=hw:0,0