Cameras on Toradex System on Modules
Introduction
This article focuses on the following topics:
- Interacting with cameras on embedded Linux.
- Capturing video data with GStreamer.
To learn which cameras and interfaces you can use with Toradex hardware, see our article Cameras with Toradex, which covers the topic from the hardware perspective.
Note that the steps presented here may vary slightly depending on your hardware setup. USB cameras are usually plug-and-play, while CSI cameras may require enabling drivers via Yocto layers. The available interfaces are specific to each Toradex SoM. Always consult the datasheet to confirm that an interface is available.
The articles below provide a walkthrough on using CSI cameras on both Toradex BSPs and Torizon OS. They cover setting up the hardware, enabling Yocto layers, and collecting camera data.
- First Steps with Framos FSM:GO Optical Sensor Modules (Linux)
- First Steps with CSI Camera Set 5MP AR0521 Color (Linux)
- First Steps with CSI Camera Set 5MP AR0521 Color (Torizon)
We also provide a Peripheral Database with cameras already tested on Toradex hardware.
Video4Linux Command-line Tools
On Linux, video capture devices are usually managed by Video4Linux (V4L2) drivers. These devices are exposed as character devices and are accessible through files named /dev/video*. You can use the utilities provided by the v4l-utils package to interact with those devices from the command line.
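For a quick check of which video device nodes are currently present on your module, you can simply list them (the exact nodes depend on the drivers loaded on your system):
# ls /dev/video*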
List Available Video Capture Devices
To list the available video capture devices and identify which file corresponds to your device, run the following command:
# v4l2-ctl --list-devices
The snippet below shows an output example for a USB webcam:
Microsoft® LifeCam HD-3000: Mi (usb-xhci-hcd.1.auto-1.2):
    /dev/video2
    /dev/video3
    /dev/media0
Note that there are multiple files associated with the same device. These files represent different data types and functionalities, as follows:
- /dev/video2: Video frame capture.
- /dev/video3: Metadata capture.
- /dev/media0: Represents the media controller and is used to handle more complex multimedia pipelines.
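If you want to inspect the media controller node, the media-ctl utility, which is also part of the v4l-utils project (its availability depends on your image), can print the topology of the media device. A minimal sketch, assuming the media controller is exposed as /dev/media0:
# media-ctl --device /dev/media0 --print-topology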
In this article, we will use GStreamer to interact with video capture devices and collect data. The video capture node (in our case, /dev/video2) will be used as the source of video frames.
Identify a Video Capture Device
As presented in the previous section, multiple files may be associated with your video capture device. To identify which one you should use to capture video frames, proceed as follows:
1. Run the following command, replacing /dev/video2 with the file you want to inspect. It displays detailed information about a specific V4L2 device:
# v4l2-ctl --device /dev/video2 -D
2. Look for the Device Caps property in the displayed information and check whether the device reports the Video Capture capability. If it does, this is the device node you will interact with to capture and manipulate video frames. The following snippet shows an output example for a video capture device:
Output example
# v4l2-ctl --device /dev/video2 -D
Driver Info:
    Driver name      : uvcvideo
    Card type        : Microsoft® LifeCam HD-3000: Mi
    Bus info         : usb-xhci-hcd.1.auto-1.2
    Driver version   : 5.15.148
    Capabilities     : 0x84a00001
        Video Capture
        Metadata Capture
        Streaming
        Extended Pix Format
        Device Capabilities
    Device Caps      : 0x04200001
        Video Capture
        Streaming
        Extended Pix Format
Media Driver Info:
    Driver name      : uvcvideo
    Model            : Microsoft® LifeCam HD-3000: Mi
    Serial           :
    Bus info         : usb-xhci-hcd.1.auto-1.2
    Media version    : 5.15.148
    Hardware revision: 0x00000106 (262)
    Driver version   : 5.15.148
Interface Info:
    ID               : 0x03000002
    Type             : V4L Video
Entity Info:
    ID               : 0x00000001 (1)
    Name             : Microsoft® LifeCam HD-3000: Mi
    Function         : V4L2 I/O
    Flags            : default
    Pad 0x01000007   : 0: Sink
      Link 0x02000013: from remote pad 0x100000a of entity 'Extension 5' (Video Pixel Formatter): Data, Enabled, Immutable
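If your system exposes several /dev/video* nodes, you can loop over them and print only the Device Caps section of each one to quickly spot the capture node. This is a minimal sketch that assumes v4l2-ctl and a grep supporting the -A option are available on the target:
# for dev in /dev/video*; do echo "== $dev =="; v4l2-ctl --device "$dev" -D | grep -A 4 "Device Caps"; done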
List Available Formats and Frame Rates
Once you have discovered which file corresponds to your camera, you can list the formats and frame rates it supports with the following command:
# v4l2-ctl --device=<video-capturing-device> --list-formats-ext
Output example
# v4l2-ctl --device /dev/video2 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
    Type: Video Capture

    [0]: 'YUYV' (YUYV 4:2:2)
        Size: Discrete 640x480
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.067s (15.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
            Interval: Discrete 0.133s (7.500 fps)
        Size: Discrete 1280x720
            Interval: Discrete 0.100s (10.000 fps)
            Interval: Discrete 0.133s (7.500 fps)
        Size: Discrete 960x544
            Interval: Discrete 0.067s (15.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
            Interval: Discrete 0.133s (7.500 fps)
        Size: Discrete 800x448
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.067s (15.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
            Interval: Discrete 0.133s (7.500 fps)
        Size: Discrete 640x360
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.067s (15.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
            Interval: Discrete 0.133s (7.500 fps)
        Size: Discrete 424x240
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.067s (15.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
            Interval: Discrete 0.133s (7.500 fps)
        Size: Discrete 352x288
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.067s (15.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
            Interval: Discrete 0.133s (7.500 fps)
        Size: Discrete 320x240
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.067s (15.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
            Interval: Discrete 0.133s (7.500 fps)
        Size: Discrete 800x600
            Interval: Discrete 0.067s (15.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
            Interval: Discrete 0.133s (7.500 fps)
        Size: Discrete 176x144
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.067s (15.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
            Interval: Discrete 0.133s (7.500 fps)
        Size: Discrete 160x120
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.067s (15.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
            Interval: Discrete 0.133s (7.500 fps)
        Size: Discrete 1280x800
            Interval: Discrete 0.100s (10.000 fps)
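With a supported format in hand, you can optionally sanity-check the device before building a GStreamer pipeline by letting v4l2-ctl capture a few frames to a file. The sketch below uses the 640x480 YUYV mode from the listing above; the frame count and output path are arbitrary:
# v4l2-ctl --device /dev/video2 --set-fmt-video=width=640,height=480,pixelformat=YUYV
# v4l2-ctl --device /dev/video2 --stream-mmap --stream-count=30 --stream-to=/tmp/test.raw
If these commands succeed, the driver and device node are working, and any remaining issues are likely in the pipeline configuration.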
Create a GStreamer Pipeline
GStreamer is a powerful framework for capturing, storing, and displaying video data. For more information, see Video Processing - GStreamer.
As an example, we will create a GStreamer pipeline that takes video from a camera and displays it in a window. The pipeline will have the following structure:
# gst-launch-1.0 <videosrc> ! <capsfilter> ! <videosink>
- videosrc: The element that produces data.
- capsfilter: Defines properties of the produced data, such as height, width and framerate.
- videosink: The element that consumes data.
Or more specifically:
# gst-launch-1.0 v4l2src device='/dev/video2' ! "video/x-raw, format=YUY2, framerate=30/1, width=640, height=480" ! fpsdisplaysink video-sink=waylandsink sync=false
- v4l2src: Captures video from V4L2 devices. The data from cameras consists of raw video frames. To control the video properties, we use the video/x-raw capabilities format, framerate, width, and height. To see which properties your camera supports, refer to List Available Formats and Frame Rates.
- fpsdisplaysink: Displays raw video frames while showing the current and average frame rates. By setting its video-sink property to waylandsink, it renders the video on a Wayland display server (Weston).
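As a variation, you can record the camera stream to a file instead of displaying it, for example by encoding the frames as Motion JPEG. This is only a sketch; it assumes the videoconvert, jpegenc, and avimux elements are available in your image, and the -e flag makes gst-launch-1.0 finalize the file when you stop the pipeline with Ctrl+C:
# gst-launch-1.0 -e v4l2src device='/dev/video2' ! "video/x-raw, format=YUY2, framerate=30/1, width=640, height=480" ! videoconvert ! jpegenc ! avimux ! filesink location=/tmp/capture.avi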
Depending on your hardware setup, the pipeline configuration your camera supports may vary. If your pipeline fails, you can use the GStreamer debugging tools to get more detailed error messages.
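For example, raising the log level with the GST_DEBUG environment variable prints warnings and errors from every element, which often reveals an unsupported caps combination or a missing plugin:
# GST_DEBUG=3 gst-launch-1.0 v4l2src device='/dev/video2' ! "video/x-raw, format=YUY2, framerate=30/1, width=640, height=480" ! fpsdisplaysink video-sink=waylandsink sync=false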
Additional Resources
Check the following articles for information on different use cases:
Container Environment
- How to use Cameras on Torizon OS: Describes how to identify capture devices on Torizon OS, access cameras from within containers and handle camera data with GStreamer.
- How to use GStreamer on Torizon OS: Contains information on handling and displaying video data with GStreamer within container environments. It also provides pipeline examples for different SoMs, including those that support hardware acceleration with VPU.
CSI Cameras
- First Steps with Framos FSM:GO Optical Sensor Modules (Linux): A hands-on guide that shows how to connect the Framos FSM:GO cameras, enable Yocto layers and handle camera data with GStreamer on Toradex BSPs.
- First Steps with CSI Camera Set 5MP AR0521 Color (Linux): A hands-on guide that shows how to connect cameras, enable Yocto layers and handle camera data with GStreamer on Toradex BSPs.
- First Steps with CSI Camera Set 5MP AR0521 Color (Torizon): A hands-on guide that shows how to connect cameras, enable Yocto layers and handle camera data with GStreamer on Torizon OS.
Machine Learning
- Torizon Sample: Real-Time Object Detection with Tensorflow Lite: Contains steps for running an object detection application with TensorFlow Lite on Torizon OS. In that application, GStreamer is used to handle video frames that serve as input to a Machine Learning model.