This document collects my notes written while trying to get some different cameras (IMX219, IMX708, AR1335) working on the Toradex Verdin IMX8MP modules. They are mostly brain-dump / journal style, but shared here in case they are useful for anyone else...
Overview: https://developer.toradex.com/hardware/accessories/cameras/csi-camera-set-5mp-ar0521-color/
Setting up the devicetree to get a v4l device, and also some info on running a gstreamer container: https://developer.toradex.com/torizon/application-development/multimedia/first-steps-with-csi-camera-set-5mp-ar0521-color-torizon
*Edit 2023-10-18: It seems the docs were updated to use a custom Yocto layer instead of just a DT overlay. The Yocto layer contains kernel patches and a driver for the ar0521 sensor, which is apparently not included in mainline (but was previously probably included in the default Toradex builds?)
The new yocto layer includes device tree overlays that are set (by conf/layer.conf) to be loaded during boot, but it seems that this loading is disabled when customizing the build using tcbuild.yaml (at least when adding overlays there). To make it work anyway, you can point tcbuild.yaml to the devicetree source directly and have it integrated there.
About using a gstreamer container: https://developer.toradex.com/torizon/application-development/multimedia/how-to-use-cameras-on-torizon/#nxpi.mx8/8x/8mmini-basedsoms
Cable must be FFC 24-pin same side contacts (so RPI camera cable and our OLED interfaceboard cable are wrong). For example https://www.vanallesenmeer.nl/24Pin-100mm-AWM-20624-80%C2%B0C-60V-VW-1-0,5mm-FFC-Flexible-Flat-Cable-(A-Type) or https://nl.rs-online.com/web/p/ribbon-cable/1792560 (link wrong, should be 24p)
These are various commands used later in this document, but put together here for easy copy-pasting (but without docs...)
cd ~/torizon-samples/gstreamer/bash/simple-pipeline
docker build --build-arg BASE_NAME=wayland-base-vivante --build-arg IMAGE_ARCH=linux/arm64/v8 -t matthijskooijman/torizon_gst_example .
docker push matthijskooijman/torizon_gst_example
docker run --rm -it -v /tmp:/tmp -v /var/run/dbus:/var/run/dbus -v /dev:/dev -v /sys:/sys --device /dev/video0 --device /dev/video1 --device /dev/video2 --device /dev/media0 --device /dev/v4l-subdev0 --device /dev/v4l-subdev1 --device /dev/v4l-subdev2 --device-cgroup-rule='c 199:* rmw' --volume /home/torizon:/home/torizon --env ACCEPT_FSL_EULA=1 --name gst-example matthijskooijman/torizon_gst_example
docker exec -it $(docker ps |grep gst_example | cut -f 1 -d\ ) bash
docker run -d --rm --name=weston --net=host --cap-add CAP_SYS_TTY_CONFIG -v /dev:/dev -v /tmp:/tmp -v /run/udev/:/run/udev/ --device-cgroup-rule='c 4:* rmw' --device-cgroup-rule='c 13:* rmw' --device-cgroup-rule='c 226:* rmw' torizon/weston:$CT_TAG_WESTON --developer --tty=/dev/tty7
v4l2-ctl --list-devices
v4l2-ctl --device /dev/video2 -D
media-ctl -p
v4l2-ctl --device /dev/video2 --list-formats-ex
gst-launch-1.0 v4l2src device='/dev/video2' ! "video/x-raw, format=YUY2, framerate=5/1, width=640, height=480" ! fpsdisplaysink video-sink=waylandsink text-overlay=false sync=false
gst-launch-1.0 v4l2src device='/dev/video2' ! "video/x-raw, format=RGB, framerate=5/1, width=1536, height=864" ! videoconvert ! fpsdisplaysink video-sink=waylandsink text-overlay=false sync=false
docker run --rm -it -v /tmp:/tmp -v /var/run/dbus:/var/run/dbus -v /dev:/dev -v /sys:/sys --device /dev/video0 --device /dev/video1 --device /dev/video2 --device /dev/media0 --device /dev/v4l-subdev0 --device /dev/v4l-subdev1 --device /dev/v4l-subdev2 --device-cgroup-rule='c 199:* rmw' --volume /home/torizon:/home/torizon --env ACCEPT_FSL_EULA=1 --name gst-example matthijskooijman/torizon_gst_example gst-launch-1.0 v4l2src device='/dev/video2' ! "video/x-raw, format=RGB, framerate=5/1, width=1920, height=1080" ! videoconvert ! fpsdisplaysink video-sink=waylandsink text-overlay=false sync=false
gst-launch-1.0 v4l2src device='/dev/video2' ! "video/x-raw, format=YUY2, framerate=5/1, width=640, height=480" ! videoconvert ! x264enc ! flvmux ! filesink location=/home/torizon/xyz.flv
gst-launch-1.0 v4l2src device='/dev/video2' num-buffers=1 ! "video/x-raw, format=RGB, framerate=5/1, width=1536, height=864" ! pngenc ! filesink location=/home/torizon/capture.png
docker run --rm -it -v /tmp:/tmp -v /var/run/dbus:/var/run/dbus -v /dev:/dev -v /sys:/sys --device /dev/video0 --device /dev/video1 --device /dev/video2 --device /dev/media0 --device /dev/v4l-subdev0 --device /dev/v4l-subdev1 --device /dev/v4l-subdev2 --device-cgroup-rule='c 199:* rmw' --volume /home/torizon:/home/torizon --env ACCEPT_FSL_EULA=1 --name gst-example matthijskooijman/torizon_gst_example
gst-launch-1.0 v4l2src device='/dev/video2' num-buffers=1 ! "video/x-raw, format=RGB, framerate=5/1, width=1920, height=1080" ! pngenc ! filesink location=/home/torizon/capture.png
v4l2-ctl --device /dev/video2 --set-fmt-video=width=1920,height=1080,pixelformat=RGB3,sizeimage=0,bytesperline=0 --stream-mmap --stream-to=frame.raw --stream-count=1
convert -size 1920x1080 -depth 8 rgb:frame.raw frame.png
(from https://raspberrypi.stackexchange.com/questions/112743/v4l2-ctl-single-frame-capture-produces-image-with-green-ending)
v4l2-ctl --device /dev/video2 --set-fmt-video=width=4608,height=2592,pixelformat=RGB3,sizeimage=0,bytesperline=0 --stream-mmap --stream-to=frame.raw --stream-count=1
v4l2-ctl --device /dev/video2 --set-fmt-video=width=4608,height=2592,pixelformat=RG10,sizeimage=0,bytesperline=0 --stream-mmap --stream-to=/home/torizon/frame.raw --stream-count=1
v4l2-ctl --device /dev/video2 --set-fmt-video=width=1920,height=1080,pixelformat=RG10,sizeimage=0,bytesperline=0 --stream-mmap --stream-to=/home/torizon/frame.raw --stream-count=1
# Note that the ISI is limited to 4096 width, so the output ends up being truncated
ssh verdin6 docker cp app-filamentsensor-1:/usr/src/app/frame.png . && scp verdin6:frame.png . && feh frame.png
ssh verdin8 docker cp app-filamentsensor-1:/usr/src/app/frame.raw . && scp verdin8:frame.raw . && convert -size 4096x2592 -depth 8 rgb:frame.raw frame.png && feh frame.png
scp verdin:frame.raw . && convert -size 4096x2592 -depth 8 rgb:frame.raw frame.png && feh frame.png
scp verdin:frame.raw . && convert -size 4096x2592 -depth 16 gray:frame.raw frame.png && feh frame.png
v4l2-ctl -d /dev/v4l-subdev2 -l
v4l2-ctl -d /dev/v4l-subdev2 -c test_pattern=1
v4l2-ctl -d /dev/v4l-subdev2 -c exposure=2602,analogue_gain=960,digital_gain=1000
# Settings used on rpi:
v4l2-ctl -d /dev/v4l-subdev2 -c exposure=190,analogue_gain=921,digital_gain=256
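For setting the same controls from application code instead of v4l2-ctl, something along these lines should work (a minimal sketch using the standard V4L2 ext-controls ioctl; the values and the /dev/v4l-subdev2 path are just the ones from the commands above and will differ per setup):

```c
/* set_sensor_ctrls.c - sketch: set exposure/gain on a V4L2 subdev,
 * roughly equivalent to the v4l2-ctl -d /dev/v4l-subdev2 -c ... lines above. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/v4l2-controls.h>
#include <linux/videodev2.h>

int main(void)
{
	int fd = open("/dev/v4l-subdev2", O_RDWR);
	if (fd < 0) {
		perror("open");
		return 1;
	}

	struct v4l2_ext_control ctrl[3];
	memset(ctrl, 0, sizeof(ctrl));
	ctrl[0].id = V4L2_CID_EXPOSURE;      ctrl[0].value = 2602;
	ctrl[1].id = V4L2_CID_ANALOGUE_GAIN; ctrl[1].value = 960;
	ctrl[2].id = V4L2_CID_DIGITAL_GAIN;  ctrl[2].value = 1000;

	struct v4l2_ext_controls ctrls;
	memset(&ctrls, 0, sizeof(ctrls));
	ctrls.which = V4L2_CTRL_WHICH_CUR_VAL; /* controls come from mixed classes */
	ctrls.count = 3;
	ctrls.controls = ctrl;

	if (ioctl(fd, VIDIOC_S_EXT_CTRLS, &ctrls) < 0)
		perror("VIDIOC_S_EXT_CTRLS");

	close(fd);
	return 0;
}
```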
The adapter board delivered by Toradex (or really, e-con), connected with a cable to the bottom camera module board, contains a 1.8V regulator and some level shifters. The schematic can be downloaded (behind an e-mail validation wall) from the e-con website: [[e-con-acc-ixora-wtb-adaptor-board-document.pdf]]
Note that the schematic for this adapter board was supplied with their AR1335 camera (e-CAM131), but that camera actually has the Toradex/iMX 24-pin FFC pinout directly on the bottom board of the camera stack.
- Docs:
- The overlay for the camera configures three parts:
  - An I²C AR0521 device. This driver configures the sensor through I²C and probably also orchestrates the configuration for the CSI datastream. Driver is `drivers/media/i2c/ar0521.c` in an upstream linux kernel, but does not seem to exist in the Toradex kernel. Huh?
  - The CSI device. This configures some timings and the number of lanes to use. Driver is `drivers/staging/media/imx/imx8-mipi-csi2-sam.c`
  - An ISI (Image Sensing Interface) device. This seems to be a hardware block that can handle video data processing. This likely does the RAW->RGB Bayer filtering (or maybe just colorspace conversion; it seems the ISP module can do Bayer filtering, but that does not seem to be configured or even supported by the kernel at first glance). Driver is `drivers/staging/media/imx/imx8-isi-core.c`
- The I²C and CSI device nodes are connected together in the overlay; the ISI seems independent. It is also unclear how the ISI device is set up, how it is told what it should convert into what, and where it should send its stream. The overlay only enables the ISI node, and the original ISI node mostly configures clocks. The AR0521 driver also does not seem to contain any references to the raw encoding (UYVY) used.
- The targets of the overlay are defined in `linux/arch/arm64/boot/dts/freescale/imx8mp.dtsi`
- Arducam Camarray seems to be a board that actively combines multiple cameras into a single image. It seems it runs firmware on the board, which probably talks to the cameras. The firmware needs to have support for the cameras; unclear if this firmware and the design are open. https://www.arducam.com/camarray-release-multiple-mipi-camera-solution/
- Many solutions seem to use an FPGA with CSI Aggregator modules.
Questions:
- When does v4l link setup happen?
- What would be achievable frame rates?
- Can we use CSI at all given the cable lengths?
From https://www.arducam.com/mipi-csi-2-converters-bridges-other-interfaces/:
Cable Length: For a stable connection, the maximum cable length you get with CSI-2 sits around 30 centimeters/11.8 inches, this is way too short for when the camera needs to be placed far away from the host.
- Good overview over CSI/DSI/D-PHY protocol: https://www.nxp.com/docs/en/application-note/AN13573.pdf
- Verdin Plus hardware supports CSI-2, D-PHY 1.2 (C-PHY not supported), quad lane.
- CSI unit 1 supports 400/500Mhz pixel clock, CSI unit 2 supports 277Mhz, using both supports 266Mhz on both. Pixel clock is probably a bit higher than the actual processed pixels because of HSYNC/VSYNC intervals.
- ISP image processing unit supports 375Mbit/s (12MP@30fps, 4k@45fps, 2x1080p@80fps).
- The IMX8MP ISI module seems to be able to process two streams, no indication of being able to handle VC interleaving.
- The IMX8MP also has an ISP (Image Signal Processing) module that can receive CSI data (as an alternative to the ISI module).
- The IMX8MP CSI module handles the D-PHY differential signals, merging 1-4 data lanes and CSI data packets. It produces a single stream of pixel data (with a pixel clock, hsync and vsync signals) to be processed by ISP/ISI (or maybe also written to RAM).
- According to this post the CSI interface should support 1.5Gbps throughput per lane, which would be 7.5FPS per lane (so 30FPS with four lanes) at 4k (which is a bit weird, given the ISP specs above state 45FPS for 375Mbit/s, but the math for 7.5FPS seems to work out correctly). That post also talks about added latency resulting from memory copying, and links to a solution.
From the i.MX 8M Plus Applications Processor Reference Manual, Rev. 1, 06/2021
13.14.1.2.1 ISP Module
The following features are supported by the main ISP submodule:
- ISP input interface is ITU-R BT.601 compatible
- Variable sensor interface for RGB-Bayer Sensors
- Input sampling on positive or negative sample clocks
- Cropping of the output picture (to crop interpolation artifacts), also used for windowing
- Black level compensation
- Bad pixel detection/correction
- Lens shade correction
- Denoising pre-filter
- Bayer de-mosaic filter
- Chromatic Aberration Correction
- Filter (Noise reduction, Sharpness, Blurring)
- Programmable gamma correction for sensor adaptation and display correction
- Sensor crosstalk compensation
- Enhanced Chroma Noise Reduction (CNR)
- Automatic white balance measurements (AWB)
- Exposure measurement for AE (AEC/AGC)
- Auto focus measurement (AF)
- Histogram calculation
- Mechanical shutter control
- Flash light control
- Video Stabilization
The D-PHY module inside the CSI has its own PLL, which is fed from a central clock ("M_XI" according to the D-PHY section), but it is unclear which clock. The CSI or D-PHY do not seem to be listed in the "5.1.4 System Clocks" section, and neither is the `MEDIA_CAM1_PIX_CLK` that is referenced in the devicetree (in the verdin dt for CSI, not the overlay), so that (or one of the other clocks mentioned in the DT) is probably the PLL source clock. Turns out the camera actually drives the CSI clock, not the imx8mp, so the PLL is probably used only for DSI, where the imx8mp drives the clock.
Setting the CSI frequency seems to be done with the `link-frequencies` tag on the link between sensor and CSI.
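For reference, this is roughly how a sensor or CSI receiver driver picks that property up from its endpoint (a generic fwnode-parsing sketch, not code from any of the drivers discussed here; this is the code path that produces the "link-frequencies 0 value 456000000" debug line further down):

```c
#include <linux/device.h>
#include <linux/property.h>
#include <media/v4l2-fwnode.h>

/* Sketch: parse the OF-graph endpoint of this device and print the first
 * link frequency listed in the link-frequencies property. */
static int parse_link_freq(struct device *dev)
{
	struct v4l2_fwnode_endpoint ep = {
		.bus_type = V4L2_MBUS_CSI2_DPHY,
	};
	struct fwnode_handle *endpoint;
	int ret;

	endpoint = fwnode_graph_get_next_endpoint(dev_fwnode(dev), NULL);
	if (!endpoint)
		return -EINVAL;

	/* alloc_parse also fills ep.link_frequencies from the DT property */
	ret = v4l2_fwnode_endpoint_alloc_parse(endpoint, &ep);
	fwnode_handle_put(endpoint);
	if (ret)
		return ret;

	dev_info(dev, "first link frequency: %llu Hz\n",
		 ep.nr_of_link_frequencies ?
			(unsigned long long)ep.link_frequencies[0] : 0ULL);

	v4l2_fwnode_endpoint_free(&ep);
	return 0;
}
```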
The overlay mentions IPP_DO_CLKO2, which seems to be clockout on the SoC that is routed to the CSI connector as CSI1_MCLK.
[ 6.922585] mxc-mipi-csi2.0: is_entity_link_setup, No remote pad found!
`media-ctl -p -d1` should list media entities and pads, but needs /dev/mediaX devices that do not seem to exist. Maybe this is because the pad setup fails?
This error seems to be because `mxc_md_create_links` in imx8-media-dev.c creates a link from the camera source pad to the ISI sink pad (which works), and then calls `media_entity_call(sink, link_setup, ...)` on the camera, which I think calls `imx219->sd.entity.ops->link_setup()`, which seems to be missing. Likely the Rpi CSI/mediadev implementation does not do this, so it works there.
These threads show the same thing for another sensor, suggesting a patch with a dummy link_setup function: https://community.nxp.com/t5/i-MX-Processors/iMX8MP-MIPI-CSI2-Problem-with-ADV7280-and-Camera/m-p/1501840/highlight/true#M193500 and https://community.nxp.com/t5/i-MX-Processors/Integrate-ADV7280-M-with-IMX8X/m-p/1661895/highlight/true#M206946 and https://community.toradex.com/t/adv7280-m-problem-with-apalis-imx8/17291/27 The AR0521 driver provided by Toradex for their camera contains the same dummy function: https://github.com/toradex/meta-toradex-econ/blob/330100f0357b4daf40cd805a6b8b34f44f00a4b6/meta-ar0521/recipes-kernel/linux/linux-toradex/0002-add-driver.patch#L2934
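Such a dummy amounts to something like this in the sensor driver (a sketch of the workaround used in those threads and in the AR0521 patch, not a verbatim copy):

```c
#include <media/media-entity.h>
#include <media/v4l2-subdev.h>

/* No-op link_setup: the Toradex/NXP imx8-media-dev code calls this on the
 * sensor entity when it creates the sensor -> CSI link, and fails the whole
 * registration if the callback is missing. */
static int sensor_link_setup(struct media_entity *entity,
			     const struct media_pad *local,
			     const struct media_pad *remote, u32 flags)
{
	return 0;
}

static const struct media_entity_operations sensor_media_ops = {
	.link_setup = sensor_link_setup,
};

/* In probe, before media_entity_pads_init():
 *	sd->entity.ops = &sensor_media_ops;
 */
```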
I wonder if imx8-media-dev.c should instead just ignore a missing link_setup function?
There are two modules in `drivers/media/i2c` that have a dummy `link_setup`, and one that has an actual implementation (that configures output connectors on a video decoder chip). Most other `link_setup` functions around the tree have actual contents. `mc-entity.c` also calls `link_setup`, and ignores the `-ENOIOCTLCMD` error. `drivers/media/platform/exynos4-is/media-dev.c` calls it and does not ignore the error. There seem to be no other calls to `link_setup()`.
The calls in `mc-entity.c` seem to be the main way to call `link_setup`. They happen in `media_entity_setup_link`, which is called to enable or disable a link at runtime. `imx8-media-dev` enables the link at link creation time (passing `MEDIA_LNK_FL_ENABLED` to `media_create_pad_link`), with the `MEDIA_LNK_FL_IMMUTABLE` flag that prevents calling `media_entity_setup_link` on it later. But that does not seem to call `link_setup` (which may be a bug?). Other users of `MEDIA_LNK_FL_IMMUTABLE` seem to just not bother about link_setup.
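Ignoring it in imx8-media-dev.c instead would look roughly like this (a hypothetical helper, not the actual code from that file):

```c
#include <media/media-entity.h>

/* Sketch: create the sensor -> CSI link and notify the entities, but treat
 * a missing link_setup callback as success instead of failing the whole
 * media device setup. media_entity_call() returns -ENOIOCTLCMD when the
 * entity does not implement the op, like the mainline imx219 driver. */
static int create_link_tolerant(struct media_entity *source, u16 src_pad,
				struct media_entity *sink, u16 sink_pad)
{
	int ret;

	ret = media_create_pad_link(source, src_pad, sink, sink_pad,
				    MEDIA_LNK_FL_ENABLED |
				    MEDIA_LNK_FL_IMMUTABLE);
	if (ret)
		return ret;

	ret = media_entity_call(source, link_setup, &source->pads[src_pad],
				&sink->pads[sink_pad], MEDIA_LNK_FL_ENABLED);
	if (ret && ret != -ENOIOCTLCMD)
		return ret;

	return 0;
}
```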
With debug info, I get this output (the `is_entity_link_setup, No remote pad found!` no longer happens, it seems):
[ 5.746828] imx8_media_dev: module is from the staging directory, the quality is unknown, you have been warned.
[ 5.754232] ===== begin parsing endpoint /soc@0/bus@30800000/i2c@30a40000/imx219@10/port/endpoint
[ 5.754404] fwnode video bus type not specified (0), mbus type MIPI CSI-2 D-PHY (5)
[ 5.754417] no lane mapping given, using defaults
[ 5.754422] data-lanes property exists; disabling default mapping
[ 5.754426] lane 0 position 1
[ 5.754429] lane 1 position 2
[ 5.754434] clock lane position 0
[ 5.754438] non-continuous clock
[ 5.754441] no lane polarities defined, assuming not inverted
[ 5.754448] link-frequencies 0 value 456000000
[ 5.754452] ===== end parsing endpoint /soc@0/bus@30800000/i2c@30a40000/imx219@10/port/endpoint
[ 5.760789] mxc-md 32c00000.bus:camera: Media device initialized
[ 5.760893] mxc-md 32c00000.bus:camera: media_gobj_create id 1: entity 'mxc_isi.0'
[ 5.760900] mxc-md 32c00000.bus:camera: media_gobj_create id 2: sink pad 'mxc_isi.0':0
[ 5.760906] mxc-md 32c00000.bus:camera: media_gobj_create id 3: sink pad 'mxc_isi.0':1
[ 5.760912] mxc-md 32c00000.bus:camera: media_gobj_create id 4: sink pad 'mxc_isi.0':2
[ 5.760918] mxc-md 32c00000.bus:camera: media_gobj_create id 5: sink pad 'mxc_isi.0':3
[ 5.760923] mxc-md 32c00000.bus:camera: media_gobj_create id 6: sink pad 'mxc_isi.0':4
[ 5.760929] mxc-md 32c00000.bus:camera: media_gobj_create id 7: sink pad 'mxc_isi.0':5
[ 5.760935] mxc-md 32c00000.bus:camera: media_gobj_create id 8: sink pad 'mxc_isi.0':6
[ 5.760941] mxc-md 32c00000.bus:camera: media_gobj_create id 9: sink pad 'mxc_isi.0':7
[ 5.760947] mxc-md 32c00000.bus:camera: media_gobj_create id 10: sink pad 'mxc_isi.0':8
[ 5.760953] mxc-md 32c00000.bus:camera: media_gobj_create id 11: sink pad 'mxc_isi.0':9
[ 5.760959] mxc-md 32c00000.bus:camera: media_gobj_create id 12: sink pad 'mxc_isi.0':10
[ 5.760965] mxc-md 32c00000.bus:camera: media_gobj_create id 13: sink pad 'mxc_isi.0':11
[ 5.760971] mxc-md 32c00000.bus:camera: media_gobj_create id 14: source pad 'mxc_isi.0':12
[ 5.760977] mxc-md 32c00000.bus:camera: media_gobj_create id 15: source pad 'mxc_isi.0':13
[ 5.760983] mxc-md 32c00000.bus:camera: media_gobj_create id 16: source pad 'mxc_isi.0':14
[ 5.760989] mxc-md 32c00000.bus:camera: media_gobj_create id 17: sink pad 'mxc_isi.0':15
[ 5.760999] isi-capture 32e00000.isi:cap_device: mxc_isi_subdev_registered
[ 5.761003] isi-capture 32e00000.isi:cap_device: mxc_isi_register_cap_device
[ 5.768354] atmel_mxt_ts 3-004a: __mxt_read_reg: i2c transfer failed (-6)
[ 5.768583] atmel_mxt_ts 3-004a: mxt_bootloader_read: i2c recv failed (-6)
[ 5.768593] atmel_mxt_ts 3-004a: Trying alternate bootloader address
[ 5.768798] atmel_mxt_ts 3-004a: mxt_bootloader_read: i2c recv failed (-6)
[ 5.790900] mxc-md 32c00000.bus:camera: media_gobj_create id 18: entity 'mxc_isi.0.capture'
[ 5.790917] mxc-md 32c00000.bus:camera: media_gobj_create id 19: sink pad 'mxc_isi.0.capture':0
[ 5.790925] mxc-md 32c00000.bus:camera: media_gobj_create id 20: intf_devnode v4l-video - major: 81, minor: 2
[ 5.790932] mxc-md 32c00000.bus:camera: media_gobj_create id 21: interface link id 20 ==> id 18
[ 5.790940] mx8-img-md: Registered mxc_isi.0.capture as /dev/video2
[ 5.791001] mxc-md 32c00000.bus:camera: media_gobj_create id 22: entity 'mxc-mipi-csi2.0'
[ 5.791007] mxc-md 32c00000.bus:camera: media_gobj_create id 23: sink pad 'mxc-mipi-csi2.0':0
[ 5.791014] mxc-md 32c00000.bus:camera: media_gobj_create id 24: sink pad 'mxc-mipi-csi2.0':1
[ 5.791020] mxc-md 32c00000.bus:camera: media_gobj_create id 25: sink pad 'mxc-mipi-csi2.0':2
[ 5.791026] mxc-md 32c00000.bus:camera: media_gobj_create id 26: sink pad 'mxc-mipi-csi2.0':3
[ 5.791032] mxc-md 32c00000.bus:camera: media_gobj_create id 27: source pad 'mxc-mipi-csi2.0':4
[ 5.791038] mxc-md 32c00000.bus:camera: media_gobj_create id 28: source pad 'mxc-mipi-csi2.0':5
[ 5.791044] mxc-md 32c00000.bus:camera: media_gobj_create id 29: source pad 'mxc-mipi-csi2.0':6
[ 5.791049] mxc-md 32c00000.bus:camera: media_gobj_create id 30: source pad 'mxc-mipi-csi2.0':7
[ 5.791065] ===== begin parsing endpoint /soc@0/bus@32c00000/camera/csi@32e40000/port@0/endpoint
[ 5.791086] fwnode video bus type not specified (0), mbus type not specified (0)
[ 5.791092] lane 0 position 1
[ 5.791096] lane 1 position 2
[ 5.791100] clock lane position 0
[ 5.791104] non-continuous clock
[ 5.791107] no lane polarities defined, assuming not inverted
[ 5.791110] assuming media bus type MIPI CSI-2 D-PHY (5)
[ 5.791115] ===== end parsing endpoint /soc@0/bus@32c00000/camera/csi@32e40000/port@0/endpoint
[ 5.791188] isi-capture 32e00000.isi:cap_device: mxc_isi_subdev_unregistered
[ 5.792426] imx-sdma 30e10000.dma-controller: firmware found.
[ 5.835386] imx219 2-0010: device orientation: 2
[ 5.835407] imx219 2-0010: device rotation: 180
[ 5.835435] mxc-md 32c00000.bus:camera: media_gobj_create id 31: entity 'imx219 2-0010'
[ 5.835442] mxc-md 32c00000.bus:camera: media_gobj_create id 32: source pad 'imx219 2-0010':0
[ 5.835450] mxc-md 32c00000.bus:camera: subdev_notifier_bound
[ 5.835455] mx8-img-md: Registered sensor subdevice: imx219 2-0010 (1)
[ 5.835461] mxc-md 32c00000.bus:camera: subdev_notifier_complete
[ 5.835466] mxc-md 32c00000.bus:camera: mxc_isi.0 entity is found
[ 5.835470] mxc-md 32c00000.bus:camera: mxc_isi.0.capture entity is found
[ 5.835476] mxc-md 32c00000.bus:camera: media_gobj_create id 33: data link id 14 ==> id 19
[ 5.835482] mxc-md 32c00000.bus:camera: media_gobj_create id 34: data link id 14 ==> id 19
[ 5.835487] mx8-img-md: created link [mxc_isi.0] => [mxc_isi.0.capture]
[ 5.835492] mxc-md 32c00000.bus:camera: mxc_isi.0 entity is found
[ 5.835496] mxc-md 32c00000.bus:camera: mxc-mipi-csi2.0 entity is found
[ 5.835501] mxc-md 32c00000.bus:camera: media_gobj_create id 35: data link id 27 ==> id 2
[ 5.835507] mxc-md 32c00000.bus:camera: media_gobj_create id 36: data link id 27 ==> id 2
[ 5.835513] mx8-img-md: created link [mxc-mipi-csi2.0] => [mxc_isi.0]
[ 5.835518] mxc-md 32c00000.bus:camera: mxc-mipi-csi2.0 entity is found
[ 5.835522] mxc-md 32c00000.bus:camera: media_gobj_create id 37: data link id 32 ==> id 23
[ 5.835528] mxc-md 32c00000.bus:camera: media_gobj_create id 38: data link id 32 ==> id 23
[ 5.835534] mx8-img-md: subdev_notifier_complete error exit
[ 5.835545] mxc-md 32c00000.bus:camera: media_gobj_destroy id 38: data link id 32 ==> id 23
[ 5.835551] mxc-md 32c00000.bus:camera: media_gobj_destroy id 37: data link id 32 ==> id 23
[ 5.835558] mxc-md 32c00000.bus:camera: media_gobj_destroy id 32: source pad 'imx219 2-0010':0
[ 5.835564] mxc-md 32c00000.bus:camera: media_gobj_destroy id 31: entity 'imx219 2-0010'
[ 5.835571] imx219 2-0010: failed to register sensor sub-device: -515
[ 5.835979] imx219: probe of 2-0010 failed with error -515
[ 5.836454] mxc-md 32c00000.bus:camera: media_gobj_destroy id 21: interface link id 20 ==> id 18
[ 5.836473] mxc-md 32c00000.bus:camera: media_gobj_destroy id 20: intf_devnode v4l-video - major: 81, minor: 2
[ 5.836485] mxc-md 32c00000.bus:camera: media_gobj_destroy id 33: data link id 14 ==> id 19
[ 5.836492] mxc-md 32c00000.bus:camera: media_gobj_destroy id 34: data link id 14 ==> id 19
[ 5.836499] mxc-md 32c00000.bus:camera: media_gobj_destroy id 19: sink pad 'mxc_isi.0.capture':0
[ 5.836506] mxc-md 32c00000.bus:camera: media_gobj_destroy id 18: entity 'mxc_isi.0.capture'
[ 5.836519] mxc-md 32c00000.bus:camera: media_gobj_destroy id 35: data link id 27 ==> id 2
[ 5.836525] mxc-md 32c00000.bus:camera: media_gobj_destroy id 36: data link id 27 ==> id 2
[ 5.836531] mxc-md 32c00000.bus:camera: media_gobj_destroy id 2: sink pad 'mxc_isi.0':0
[ 5.836537] mxc-md 32c00000.bus:camera: media_gobj_destroy id 3: sink pad 'mxc_isi.0':1
[ 5.836543] mxc-md 32c00000.bus:camera: media_gobj_destroy id 4: sink pad 'mxc_isi.0':2
[ 5.836549] mxc-md 32c00000.bus:camera: media_gobj_destroy id 5: sink pad 'mxc_isi.0':3
[ 5.836555] mxc-md 32c00000.bus:camera: media_gobj_destroy id 6: sink pad 'mxc_isi.0':4
[ 5.836561] mxc-md 32c00000.bus:camera: media_gobj_destroy id 7: sink pad 'mxc_isi.0':5
[ 5.836567] mxc-md 32c00000.bus:camera: media_gobj_destroy id 8: sink pad 'mxc_isi.0':6
[ 5.836573] mxc-md 32c00000.bus:camera: media_gobj_destroy id 9: sink pad 'mxc_isi.0':7
[ 5.836578] mxc-md 32c00000.bus:camera: media_gobj_destroy id 10: sink pad 'mxc_isi.0':8
[ 5.836584] mxc-md 32c00000.bus:camera: media_gobj_destroy id 11: sink pad 'mxc_isi.0':9
[ 5.836590] mxc-md 32c00000.bus:camera: media_gobj_destroy id 12: sink pad 'mxc_isi.0':10
[ 5.836596] mxc-md 32c00000.bus:camera: media_gobj_destroy id 13: sink pad 'mxc_isi.0':11
[ 5.836602] mxc-md 32c00000.bus:camera: media_gobj_destroy id 14: source pad 'mxc_isi.0':12
[ 5.836608] mxc-md 32c00000.bus:camera: media_gobj_destroy id 15: source pad 'mxc_isi.0':13
[ 5.836614] mxc-md 32c00000.bus:camera: media_gobj_destroy id 16: source pad 'mxc_isi.0':14
[ 5.836620] mxc-md 32c00000.bus:camera: media_gobj_destroy id 17: sink pad 'mxc_isi.0':15
[ 5.836626] mxc-md 32c00000.bus:camera: media_gobj_destroy id 1: entity 'mxc_isi.0'
[ 5.836631] unregister ISI channel: mxc_isi.0
Patching the kernel to ignore the missing link_setup helps to create /dev/media0, allowing the topology to be enumerated (and it seems correct):
/# media-ctl -p
Media controller API version 5.15.129
Media device information
------------------------
driver mxc-md
model FSL Capture Media Device
serial
bus info
hw revision 0x0
driver version 5.15.129
Device topology
- entity 1: mxc_isi.0 (16 pads, 2 links)
type V4L2 subdev subtype Unknown flags 0
pad0: Sink
<- "mxc-mipi-csi2.0":4 [ENABLED]
pad1: Sink
pad2: Sink
pad3: Sink
pad4: Sink
pad5: Sink
pad6: Sink
pad7: Sink
pad8: Sink
pad9: Sink
pad10: Sink
pad11: Sink
pad12: Source
-> "mxc_isi.0.capture":0 [ENABLED]
pad13: Source
pad14: Source
pad15: Sink
- entity 18: mxc_isi.0.capture (1 pad, 1 link)
type Node subtype V4L flags 0
device node name /dev/video2
pad0: Sink
<- "mxc_isi.0":12 [ENABLED]
- entity 22: mxc-mipi-csi2.0 (8 pads, 2 links)
type Node subtype V4L flags 0
device node name /dev/v4l-subdev0
pad0: Sink
<- "imx219 2-0010":0 [ENABLED,IMMUTABLE]
pad1: Sink
pad2: Sink
pad3: Sink
pad4: Source
-> "mxc_isi.0":0 [ENABLED]
pad5: Source
pad6: Source
pad7: Source
- entity 31: imx219 2-0010 (1 pad, 1 link)
type V4L2 subdev subtype Sensor flags 0
device node name /dev/v4l-subdev1
pad0: Source
-> "mxc-mipi-csi2.0":0 [ENABLED,IMMUTABLE]
But actually capturing from the device seems to fail:
# gst-launch-1.0 v4l2src device='/dev/video2' ! "video/x-raw, format=YUY2, framerate=5/1, width=640, height=480" ! videoconvert ! x264enc ! flvmux ! filesink location=xyz.flv
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Redistribute latency...
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Failed to allocate required memory.
Additional debug info:
../sys/v4l2/gstv4l2src.c(976): gst_v4l2src_decide_allocation (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
Buffer pool activation failed
Execution ended after 0:00:00.016572551
Setting pipeline to NULL ...
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
../libs/gst/base/gstbasesrc.c(3132): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
Freeing pipeline ...
Then dmesg says:
[ 1275.702029] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_g_selection
[ 1275.702065] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1275.702076] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1275.702101] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1275.702109] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1275.702117] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1275.702125] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1275.702133] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1275.702141] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1275.702149] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1275.702157] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1275.702164] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1275.702177] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_g_selection
[ 1275.702211] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.702220] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.702665] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.702687] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.702760] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.702769] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.702829] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.702838] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.702886] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.702894] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.702950] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.702958] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.703005] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.703013] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.703070] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.703078] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.704069] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.715361] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_g_selection
[ 1275.715399] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_s_fmt_mplane, fmt=0x56595559
[ 1275.715407] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_try_fmt_mplane
[ 1275.715440] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_g_selection
[ 1275.716133] isi-capture 32e00000.isi:cap_device: cap_vb2_queue_setup, buf_n=2, size=614400
[ 1275.718121] isi-capture 32e00000.isi:cap_device: cap_vb2_buffer_prepare
[ 1275.718167] isi-capture 32e00000.isi:cap_device: cap_vb2_buffer_prepare
[ 1275.718199] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_streamon
[ 1275.718207] mxc_isi.0: Call subdev s_power fail!
imx8-isi-cap.c calls `s_power`, and then errors out about it. imx219 does not implement the (deprecated) `v4l2_subdev_core_ops.s_power`, but does runtime power management instead. atmel-isi.c also calls `s_power`, but ignores `-ENOIOCTLCMD` results as well.
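The fix mentioned below boils down to this pattern (a sketch with made-up names, not the actual imx8-isi-cap.c code):

```c
#include <linux/errno.h>
#include <media/v4l2-subdev.h>

/* Sketch: power up the source subdev like the ISI capture driver does, but
 * treat -ENOIOCTLCMD (sensor has no s_power op, e.g. imx219 which uses
 * runtime PM instead) as success. */
static int isi_source_power_on(struct v4l2_subdev *src_sd)
{
	int ret;

	ret = v4l2_subdev_call(src_sd, core, s_power, 1);
	if (ret < 0 && ret != -ENOIOCTLCMD)
		return ret;

	return 0;
}
```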
With that fixed in the kernel (to ignore ENOIOCTLCMD), the error goes away, but streaming still does not work. Everything is set up and streaming seems to start, but no frames are returned.
strace v4l2-ctl --device /dev/video2 --set-fmt-video=width=1280,height=720,pixelformat=RGB3 --stream-mmap --stream-to=frame.raw --stream-count=1
... Hangs on:
ioctl(3, VIDIOC_DQBUF, {type=V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE
VIDIOC_DQBUF dequeues a buffer that the driver has filled with video data, so apparently no frames ever arrive and streaming never actually starts.
[ 1471.454986] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_g_selection
[ 1471.455172] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_g_fmt_mplane
[ 1471.455185] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1471.455195] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1471.455205] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1471.455218] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1471.455228] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1471.455239] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1471.455247] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1471.455257] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1471.455270] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1471.455278] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1471.455288] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_enum_fmt
[ 1471.455358] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_g_selection
[ 1471.455368] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_g_selection
[ 1471.455528] isi-capture 32e00000.isi:cap_device: cap_vb2_queue_setup, buf_n=4, size=16163840
[ 1471.483631] isi-capture 32e00000.isi:cap_device: cap_vb2_buffer_prepare
[ 1471.483655] isi-capture 32e00000.isi:cap_device: cap_vb2_buffer_prepare
[ 1471.483667] isi-capture 32e00000.isi:cap_device: cap_vb2_buffer_prepare
[ 1471.483677] isi-capture 32e00000.isi:cap_device: cap_vb2_buffer_prepare
[ 1471.483690] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_g_fmt_mplane
[ 1471.483702] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_streamon
[ 1471.488720] bypass csc
[ 1471.488728] input fmt RGB4
[ 1471.488731] output fmt RGB3
[ 1471.488737] mxc-isi 32e00000.isi: input_size(1640,1232), output_size(1280,720)
[ 1471.488753] isi-capture 32e00000.isi:cap_device: cap_vb2_start_streaming
[ 1471.492936] isi-capture 32e00000.isi:cap_device: cap_vb2_start_streaming: num_plane=0 discard_size=16166912 discard_buffer=00000000f2984b39
[ 1471.806008] mxc-md 32c00000.bus:camera: begin graph walk at 'mxc_isi.0.capture'
[ 1471.806025] mxc-md 32c00000.bus:camera: walk: pushing 'mxc_isi.0' on stack
[ 1471.806032] mxc-md 32c00000.bus:camera: walk: skipping entity 'mxc_isi.0.capture' (already seen)
[ 1471.806038] mxc-md 32c00000.bus:camera: walk: pushing 'mxc-mipi-csi2.0' on stack
[ 1471.806044] mxc-md 32c00000.bus:camera: walk: skipping entity 'mxc_isi.0' (already seen)
[ 1471.806050] mxc-md 32c00000.bus:camera: walk: pushing 'imx219 2-0010' on stack
[ 1471.806056] mxc-md 32c00000.bus:camera: walk: skipping entity 'mxc-mipi-csi2.0' (already seen)
[ 1471.806064] mxc-md 32c00000.bus:camera: walk: returning entity 'imx219 2-0010'
[ 1471.879590] mxc-md 32c00000.bus:camera: walk: returning entity 'mxc-mipi-csi2.0'
[ 1471.915] mxc-md 32c00000.bus:camera: walk: returning entity 'mxc_isi.0'
[ 1471.915723] mxc-md 32c00000.bus:camera: walk: returning entity 'mxc_isi.0.capture'
[ 1471.915730] isi-capture 32e00000.isi:cap_device: mxc_isi.0.capture is no v4l2 subdev
[ 1471.915757] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_g_fmt_mplane
That last message might be a problem that gets ignored somewhere? Though it seems that that code just calls enable on all v4l2_subdevs in the graph, and skips over other entities (with debug logging only, no error), so this is probably expected. Though the `media-ctl -p` output suggests that `mxc-mipi-csi2.0` might also not be a subdev (it has type `Node` like the capture entity, though it does have /dev/v4l-subdev0, so maybe it is a subdev).
Runtime power for the IMX219 seems to work as expected: /sys/class/video4linux/v4l-subdev1/device/power/runtime_status switches from `suspended` to `active` when trying to capture.
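For reference, the runtime-PM pattern that mainline sensor drivers use instead of s_power looks roughly like this (a sketch, not the actual imx219 code):

```c
#include <linux/i2c.h>
#include <linux/pm_runtime.h>
#include <media/v4l2-subdev.h>

/* Sketch: sensors like imx219 power themselves via runtime PM from
 * s_stream, which is why runtime_status flips to "active" on capture. */
static int sensor_s_stream(struct v4l2_subdev *sd, int enable)
{
	struct i2c_client *client = v4l2_get_subdevdata(sd);
	int ret = 0;

	if (enable) {
		ret = pm_runtime_resume_and_get(&client->dev);
		if (ret < 0)
			return ret;
		/* write streaming-on registers here */
	} else {
		/* write streaming-off registers here */
		pm_runtime_put(&client->dev);
	}

	return ret;
}
```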
Interestingly enough it also says the input is RGB4, but that is not a supported format according to v4l2-ctl, so you cannot capture directly in RGB4?
Adding imx219 debug output shows it gets a command to start streaming:
[ 248.166614] isi-capture 32e00000.isi:cap_device: cap_vb2_start_streaming
[ 248.171112] isi-capture 32e00000.isi:cap_device: cap_vb2_start_streaming: num_plane=0 discard_size=16166912 discard_buffer=000000007ed5d8be
[ 248.483772] mxc-md 32c00000.bus:camera: begin graph walk at 'mxc_isi.0.capture'
[ 248.483790] mxc-md 32c00000.bus:camera: walk: pushing 'mxc_isi.0' on stack
[ 248.483796] mxc-md 32c00000.bus:camera: walk: skipping entity 'mxc_isi.0.capture' (already seen)
[ 248.483803] mxc-md 32c00000.bus:camera: walk: pushing 'mxc-mipi-csi2.0' on stack
[ 248.483809] mxc-md 32c00000.bus:camera: walk: skipping entity 'mxc_isi.0' (already seen)
[ 248.483815] mxc-md 32c00000.bus:camera: walk: pushing 'imx219 2-0010' on stack
[ 248.483821] mxc-md 32c00000.bus:camera: walk: skipping entity 'mxc-mipi-csi2.0' (already seen)
[ 248.483827] mxc-md 32c00000.bus:camera: walk: returning entity 'imx219 2-0010'
[ 248.483837] imx219 2-0010: imx219_start_streaming: start
[ 248.491973] imx219 2-0010: imx219_write_reg: 0x0100 = 0x00000000 (len = 1)
[ 248.493734] imx219 2-0010: imx219_write_reg: 0x30eb = 0x0000000c (len = 1)
[ 248.495327] imx219 2-0010: imx219_write_reg: 0x30eb = 0x00000005 (len = 1)
[ 248.495668] imx219 2-0010: imx219_write_reg: 0x300a = 0x000000ff (len = 1)
[ 248.495989] imx219 2-0010: imx219_write_reg: 0x300b = 0x000000ff (len = 1)
[ 248.496308] imx219 2-0010: imx219_write_reg: 0x30eb = 0x00000005 (len = 1)
[ 248.496629] imx219 2-0010: imx219_write_reg: 0x30eb = 0x00000009 (len = 1)
[ 248.496949] imx219 2-0010: imx219_write_reg: 0x0301 = 0x00000005 (len = 1)
[ 248.497268] imx219 2-0010: imx219_write_reg: 0x0303 = 0x00000001 (len = 1)
[ 248.497590] imx219 2-0010: imx219_write_reg: 0x0304 = 0x00000003 (len = 1)
[ 248.497910] imx219 2-0010: imx219_write_reg: 0x0305 = 0x00000003 (len = 1)
[ 248.498228] imx219 2-0010: imx219_write_reg: 0x0306 = 0x00000000 (len = 1)
[ 248.498548] imx219 2-0010: imx219_write_reg: 0x0307 = 0x00000039 (len = 1)
[ 248.498866] imx219 2-0010: imx219_write_reg: 0x030b = 0x00000001 (len = 1)
[ 248.499187] imx219 2-0010: imx219_write_reg: 0x030c = 0x00000000 (len = 1)
[ 248.499508] imx219 2-0010: imx219_write_reg: 0x030d = 0x00000072 (len = 1)
[ 248.499829] imx219 2-0010: imx219_write_reg: 0x455e = 0x00000000 (len = 1)
[ 248.500151] imx219 2-0010: imx219_write_reg: 0x471e = 0x0000004b (len = 1)
[ 248.500471] imx219 2-0010: imx219_write_reg: 0x4767 = 0x0000000f (len = 1)
[ 248.500789] imx219 2-0010: imx219_write_reg: 0x4750 = 0x00000014 (len = 1)
[ 248.501110] imx219 2-0010: imx219_write_reg: 0x4540 = 0x00000000 (len = 1)
[ 248.501430] imx219 2-0010: imx219_write_reg: 0x47b4 = 0x00000014 (len = 1)
[ 248.501750] imx219 2-0010: imx219_write_reg: 0x4713 = 0x00000030 (len = 1)
[ 248.502070] imx219 2-0010: imx219_write_reg: 0x478b = 0x00000010 (len = 1)
[ 248.502388] imx219 2-0010: imx219_write_reg: 0x478f = 0x00000010 (len = 1)
[ 248.502707] imx219 2-0010: imx219_write_reg: 0x4793 = 0x00000010 (len = 1)
[ 248.503027] imx219 2-0010: imx219_write_reg: 0x4797 = 0x0000000e (len = 1)
[ 248.503346] imx219 2-0010: imx219_write_reg: 0x479b = 0x0000000e (len = 1)
[ 248.503611] imx219 2-0010: imx219_write_reg: 0x0162 = 0x0000000d (len = 1)
[ 248.503957] imx219 2-0010: imx219_write_reg: 0x0163 = 0x00000078 (len = 1)
[ 248.504279] imx219 2-0010: imx219_write_reg: 0x0170 = 0x00000001 (len = 1)
[ 248.504937] imx219 2-0010: imx219_write_reg: 0x0171 = 0x00000001 (len = 1)
[ 248.505257] imx219 2-0010: imx219_write_reg: 0x0114 = 0x00000001 (len = 1)
[ 248.505565] imx219 2-0010: imx219_write_reg: 0x0128 = 0x00000000 (len = 1)
[ 248.505871] imx219 2-0010: imx219_write_reg: 0x012a = 0x00000018 (len = 1)
[ 248.506179] imx219 2-0010: imx219_write_reg: 0x012b = 0x00000000 (len = 1)
[ 248.506485] imx219 2-0010: imx219_write_reg: 0x0164 = 0x00000000 (len = 1)
[ 248.506789] imx219 2-0010: imx219_write_reg: 0x0165 = 0x00000000 (len = 1)
[ 248.507095] imx219 2-0010: imx219_write_reg: 0x0166 = 0x0000000c (len = 1)
[ 248.507401] imx219 2-0010: imx219_write_reg: 0x0167 = 0x000000cf (len = 1)
[ 248.507708] imx219 2-0010: imx219_write_reg: 0x0168 = 0x00000000 (len = 1)
[ 248.508012] imx219 2-0010: imx219_write_reg: 0x0169 = 0x00000000 (len = 1)
[ 248.508318] imx219 2-0010: imx219_write_reg: 0x016a = 0x00000009 (len = 1)
[ 248.508623] imx219 2-0010: imx219_write_reg: 0x016b = 0x0000009f (len = 1)
[ 248.508926] imx219 2-0010: imx219_write_reg: 0x016c = 0x00000006 (len = 1)
[ 248.509231] imx219 2-0010: imx219_write_reg: 0x016d = 0x00000068 (len = 1)
[ 248.509538] imx219 2-0010: imx219_write_reg: 0x016e = 0x00000004 (len = 1)
[ 248.509841] imx219 2-0010: imx219_write_reg: 0x016f = 0x000000d0 (len = 1)
[ 248.510162] imx219 2-0010: imx219_write_reg: 0x0624 = 0x00000006 (len = 1)
[ 248.510471] imx219 2-0010: imx219_write_reg: 0x0625 = 0x00000068 (len = 1)
[ 248.510775] imx219 2-0010: imx219_write_reg: 0x0626 = 0x00000004 (len = 1)
[ 248.511081] imx219 2-0010: imx219_write_reg: 0x0627 = 0x000000d0 (len = 1)
[ 248.511389] imx219 2-0010: imx219_write_reg: 0x018c = 0x0000000a (len = 1)
[ 248.511698] imx219 2-0010: imx219_write_reg: 0x018d = 0x0000000a (len = 1)
[ 248.512005] imx219 2-0010: imx219_write_reg: 0x0309 = 0x0000000a (len = 1)
[ 248.512316] imx219 2-0010: imx219_write_reg: 0x0174 = 0x00000101 (len = 2)
[ 248.512663] imx219 2-0010: imx219_write_reg: 0x0160 = 0x000006e3 (len = 2)
[ 248.513007] imx219 2-0010: imx219_write_reg: 0x015a = 0x00000640 (len = 2)
[ 248.513349] imx219 2-0010: imx219_write_reg: 0x0157 = 0x00000000 (len = 1)
[ 248.513599] imx219 2-0010: imx219_write_reg: 0x0158 = 0x00000100 (len = 2)
[ 248.513955] imx219 2-0010: imx219_write_reg: 0x0172 = 0x00000000 (len = 1)
[ 248.514265] imx219 2-0010: imx219_write_reg: 0x0172 = 0x00000000 (len = 1)
[ 248.514583] imx219 2-0010: imx219_write_reg: 0x0600 = 0x00000000 (len = 2)
[ 248.514932] imx219 2-0010: imx219_write_reg: 0x0602 = 0x000003ff (len = 2)
[ 248.515284] imx219 2-0010: imx219_write_reg: 0x0604 = 0x000003ff (len = 2)
[ 248.515622] imx219 2-0010: imx219_write_reg: 0x0606 = 0x000003ff (len = 2)
[ 248.515959] imx219 2-0010: imx219_write_reg: 0x0608 = 0x000003ff (len = 2)
[ 248.516297] imx219 2-0010: imx219_write_reg: 0x0100 = 0x00000001 (len = 1)
[ 248.516604] imx219 2-0010: imx219_start_streaming: success
[ 248.516612] mxc-md 32c00000.bus:camera: walk: returning entity 'mxc-mipi-csi2.0'
[ 248.543485] mxc-md 32c00000.bus:camera: walk: returning entity 'mxc_isi.0'
[ 248.543498] mxc-md 32c00000.bus:camera: walk: returning entity 'mxc_isi.0.capture'
[ 248.543503] isi-capture 32e00000.isi:cap_device: mxc_isi.0.capture is no v4l2 subdev
[ 248.543529] isi-capture 32e00000.isi:cap_device: mxc_isi_cap_g_fmt_mplane
https://community.nxp.com/t5/i-MX-Processors/Creating-a-MIPI-CSI-Camera-driver-on-i-MX-8MP-EVK/m-p/1382985/page/2 has debug info for similar problems. It talks about imx8-isp / start_isp.sh and .drv and .xml files (for camera-specific settings tuning?) that might be relevant (later?). It also suggests there is a MIPI frame counter that can be used to check whether (valid) frames are received by the CSI. It also lists some DT values that, when wrong, prevent CSI frames from being valid.
2024-03-27: Modifying the devicetree overlay based on the IMX708 DT that I later got working (mostly setting `data-lanes = <2>` in the CSI endpoint) also makes the IMX219 work (with the kernel patches applied).
Current prototype uses the IMX708, which is not present in the Toradex 5.15 or mainline master kernel. The Rpi 6.1 kernel has a driver, so we might need to port it over.
This is the camera supplied by Toradex. Tried to get it working again using the yocto custom layer, which compiled okay (with one fix), but could also not record an image (maybe displaying something on the screen did work, but I did not have an LVDS screen ready and did not get the HDMI working directly - probably needs an overlay).
Using a self-compiled Yocto image (according to Toradex instructions) with the econ ar0521 layer, installed using easy installer (so no tcbuild modifications that disable overlays), with weston on the HDMI output, with the gstreamer docker as shown above, with the gstreamer command from Toradex, I get a streaming image. At one point, after removing the lenscap while streaming, the stream halted. After a restart, everything would initialize without errors, but streaming would not start, just like with the IMX219. Replugging all cables seems to have solved this, so it seems a bad connection can prevent the stream from starting. Using the gstreamer command above for capturing video to a flv file produces a working capture. Using v4l2-ctl produces a working single frame, except it is truncated.
Clean yocto image (installed via easyinstaller) produces working images. Adding our tcbuild.yaml to it breaks things again (playback is set up without errors, but then does not start rolling). Reverting basically all devicetree changes (removing actuatorboard, gpio and hmp-fix) does not fix things (but the hdmi and ar0521 devicetrees might be different from the clean image). Reapplying clean yocto (with ostree) still does not work. A powercycle (without further changes) fixes things, so it looks like something breaks it and needs a powercycle (not just working code) to fix it. Switching to the minimally modified image with a powercycle does not work, so there are further changes. Switching back to the clean image with a powercycle (just shutdown and pressing the powerbutton this time, not removing power from the board entirely) fixes things again.
Tried finding differences. /proc/device-tree differences (dt.working vs dt.broken) were numerous (unexpectedly). Tried minimizing differences (dt.broken2). Unsetting fdtfile in uEnv.txt for the modified image minimized devicetree differences, but did not fix things. Switching to the clean initramfs file (by modifying uEnv.txt, s/ramdisk_image2=/ramdisk_image=/) seemed to fix things, booted twice in a row, but then switching to the original initramfs still worked once, then broke a second and third time :-S Confirmed in u-boot output that all three times used the original initramfs... Switched to clean initramfs again - works twice, then fails once. Seems unreliable...
Also errors in different amounts and combinations, no obvious link to working/non-working:
Starting version 250.5+
[ 11.248507] fsl-aud2htx 30cb0000.aud2htx: failed to pcm register
[ 11.294698] fsl-aud2htx 30cb0000.aud2htx: failed to pcm register
[ 11.397927] fsl-aud2htx 30cb0000.aud2htx: failed to pcm register
[ 11.576068] imx-hdmi sound-hdmi: snd_soc_register_card failed (-517)
Next steps:
- Check with scope if data flows in both cases?
- Try again with new cable - Did not help
- Panic?
- Create test script (on verdin) to automate docker and produce a result, and test script on laptop to power on (via FTDI GPIO?) or reboot via ssh, run verdin test script, save result, output, uEnv.txt, ostree info and serial log to do repeated tests - DONE
FOUND IT: The SODIMM 222 pin was used as CS pin for the INA current sensors, but that pin was actually connected to CSI_GPIO8 == CAM_1_CON_PWRCTRL. If that pin was high (which CS pins are by default), no data flows somehow (but apparently the sensor does still respond to I2C - it seems this pin is connected to the CAM_TRIGGER pin, which is an output from the camera, but somehow pulling it high apparently influences the camera).
Modules from arducam and econ seem to have an integrated ISP, which needs a binary firmware blob and preprocesses the data somehow, so their drivers might not really be drivers for the AR1335 itself (i.e. maybe they do not even talk to the AR1335 directly?).
This seems to have two links to driver code that does talk to the AR1335 directly: https://community.nxp.com/t5/i-MX-Processors/imx8m-plus-Onsemi-AR1335-camera/m-p/1505442#M193779 (but also no suggestions on modules for it).
- The toradex_5.4-2.3.x-imx kernel branch has an ar1335.c driver included (as well as ar0521.c, added in this commit, which adds these two drivers plus firmware blobs plus seemingly boilerplate changes to `imx8-mipi-csi2.c`), but not anymore in the toradex_5.15-2.2.x-imx. It seems that Toradex just decided to not port the ar1335 driver into their fork and stick closer to the NXP version. Note that the 2.2.x and 2.3.x version numbers are only valid within the kernel version (5.4 and 5.15), so these versions do not indicate a downgrade from 2.3 to 2.2 in any way. See https://community.toradex.com/t/kernel-version-numbering/21558/2
- The econ driver bundle patches the ar1335.c in toradex_5.4-2.3.x-imx (so does not apply to the 5.15 branch). The patch applies cleanly. The patch only changes ar1335.c fairly trivially (it just sets a default "auto" value for `V4L2_CID_EXPOSURE_AUTO=0x009a0901`); it mostly adds some extra (seemingly boilerplate) functions to imx8-mipi-csi2-sam.c, which look the same as the original changes to `imx8-mipi-csi2.c`.
- The prebuilt image from econ for the ar1335 does not work with tc-builder: `No root file system content section found in <_io.TextIOWrapper name='/storage/tezi/image.json' mode='r' encoding='UTF-8'>`. Maybe try with tezi?
- ar1335.c:
  - Seems to talk only to the MCU
  - Does minimal configuration of the ISP (there is an ISP configure command, which seems to be used only to switch between 2 and 4-lane mode, which seems hardcoded to 4-lane in `ar1335_probe`), plus argumentless ISP power up and power down commands.
  - The MCU is also queried for v4l ctrl, ui and formats information, which is mostly passed to v4l verbatim (which seems fragile - I am not sure if v4l makes any guarantees about its internal driver API, especially when not using the proper header constants). This might mean the MCU firmware is tied to a particular kernel version.
  - Code is overly verbose, lots of boilerplate documentation that does not seem necessary at first glance. Separation between MCU access functions and v4l callbacks does seem proper.
  - Weirdly enough, `ar1335_s_stream`, which seems to be for switching streaming on or off, does not do anything.
  - Driver contains empty `s_power` and `link_setup` callbacks, which prevents issues with the CSI driver not handling the omission of these.
  - MCU firmware update seems to use a bootloader, which is triggered by fiddling the pdown and reset pins (so it is likely that this will work even after a failed update).
  - reset and powerdown (`pwn-gpios` in dt) seem to be the only gpios used.
  - It seems `ar1335_probe` does not do any regulator setup because it is not needed on some board, but unclear if this means it will break if it is needed (or maybe there is some higher level driver that handles it already).
  - There is no indication of what type of MCU is present.
- The patches also patch:
  - The v4l2 subdev driver to allow a sensor to expose controls
  - The ISI driver to forward controls to the remote subdev (probably the CSI)
  - The CSI driver (which does not normally expose controls) to forward controls to the remote subdev (probably the sensor).
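For reference, the usual mechanism for such forwarding is to merge the sensor subdev's control handler into the capture device's handler when the subdev is bound, roughly like this (a sketch of that common pattern, not the econ code):

```c
#include <media/v4l2-ctrls.h>
#include <media/v4l2-subdev.h>

/* Sketch: expose the sensor's controls (exposure, gain, ...) on the capture
 * video node by adding the sensor subdev's control handler to the capture
 * device's own handler, typically from the async notifier "bound" callback. */
static int forward_sensor_controls(struct v4l2_ctrl_handler *cap_hdl,
				   struct v4l2_subdev *sensor_sd)
{
	/* NULL filter = take all controls; true = the control lives on
	 * another device, so set/get calls are routed to the sensor. */
	return v4l2_ctrl_add_handler(cap_hdl, sensor_sd->ctrl_handler,
				     NULL, true);
}
```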
- Note that `imx8-mipi-csi2-sam.c` and `imx8-mipi-csi2.c` use e.g. `mxc-mipi-csi2-sam` as the driver name in the source, so that's probably what is printed in dmesg.
- dmesg in the prebuilt image shows `mxc-mipi-csi2-sam`, so it probably uses `imx8-mipi-csi2-sam.c`, which is the "MIPI CSI2 Samsung driver for i.MX8MN platform" (as opposed to the non-sam version, which is for the "i.MX8QM/QXP platform"). Unclear what these platform names refer to exactly.
- Tried prebuilt image from econ: Works, but is a BSP image (not torizoncore, no ostree) based on kernel 5.4.
- Ported over kernel driver to 5.15 (including some ISI/ISP changes, leaving behind some mxc platform changes and other sensors): https://github.com/matthijskooijman/linux/tree/toradex_5.15-2.2.x-imx-ar1335
- Tried with overlay from econ ftp: Fails. Creates /dev/video2, but that seems to be the isi m2m device, not the capture (ar0521 enables only cap, ar1335 also enables m2m - overlays are otherwise pretty much identical). Also no /dev/media0 is created.
- Next steps:
  - Enable dyndbg for more info
  - Check with i2c-detect if the sensor responds
  - Compare ar1335 and ar0521 drivers, and compare isi/isp drivers in both branches (available inside yocto as different branches - maybe rebase the ar0521 to make them more comparable)
  - Try disabling the m2m device
2023-01-22 Sent e-con a message through their support form:
Hi,
We're currently evaluating your e-CAM131 module for the IMX8 platform, to include it in a product we are developing. We've been looking at the Linux drivers provided, and saw that these drivers talk to an on-board microcontroller that configures the camera and the ISP. The firmware for this microcontroller is included as a binary blob with the Linux driver.
Is the source code for this microcontroller firmware available, or could it be made available to us? In order to better understand the module and possible limitations, and to prevent a situation where we run into problems we cannot solve or diagnose without knowing what this microcontroller does, it is important to us to have access to this firmware.
Please let us know if this is possible.
Kind regards,
Matthijs Kooijman
P.S. Our SO# is 38660000334747174
- Official docs at https://www.arducam.com/faq/kernel-camera-driver/, which refer to https://github.com/ArduCAM/Arducam_OBISP_MIPI_Camera_Module and https://github.com/ArduCAM/MIPI_Camera for Jetson.
- https://github.com/ArduCAM/MIPI_Camera says it provides a userspace-only driver for Rpi (in the RPI subfolder), which seems to support other cameras only (not the OBISP cameras). This repo is littered with precompiled libraries, unclear if all sources are included. It also talks about Jetvariety and Pivariety for a single kernel driver that can handle all camera modules, but it is unclear if that is just the OBISP cameras or really all of their cameras. Also, for Jetvariety it points to the Jetson subfolder and the OBISP repository, so that's a bit vague. For PiVariety it points only to the OBISP repository (https://github.com/ArduCAM/Arducam_OBISP_MIPI_Camera_Module).
- https://github.com/ArduCAM/Arducam_OBISP_MIPI_Camera_Module has some source and docs. There is no mention of AR1335 in the source, just a generic arducam.c, so maybe the MCU just presents a generic camera to the kernel which does not need to know about any camera.
- There does not seem to be any firmware included in these repositories, so maybe the idea is to have a (never updateable?) factory-programmed firmware? This page talks about pivariety firmware updates, but only for IMX230/IMX298 and only with an ELF binary blob for updating.
- Both repositories have a deprecation notice saying they are deprecated on bullseye or later, but without pointing to an alternative.
- https://docs.arducam.com/Raspberry-Pi-Camera/Pivariety-Camera/Introduction/ further documents the pivariety system, arguing it supports more sensors, higher resolutions, and more automatic features (autofocus, autowhitebalance, global shutter). It seems the arducam folks have trouble configuring the (closed? badly documented?) Rpi ISP module, so they are dependent on the rpi foundation to do this. Adding an MCU in between makes it easier for them to implement stuff in their own MCU and ISP. Risk: that means things get implemented without proper review...
- arducam.c:
  - At first glance it seems to define a fixed list of controls (exposure, digital and analog gain and test pattern) in `arducam_init_controls` and map that to MCU registers explicitly in `arducam_set_ctrl`. However, these functions are unused (referenced only in commented code or not at all) and instead `arducam_enum_controls` and `arducam_s_ctrl` enumerate and set ctrls from the MCU, passing values verbatim just like econ does.
  - There is no code for firmware update, or even a firmware version check. There is a device ID register to be read, but there is only one valid value which is checked for (but the defines suggest there is also a SENSOR_ID_REG, but it is not currently used).
  - The included overlay is full of hardcoded values and seems to be processed already (with explicit fragment override stuff), so hard to read and adapt.

Conclusion:
- Arducam driver and docs look more fragile, e-con likely has better support and better engineering, so try e-con first (but is also not ideal).
Options:
- https://github.com/bogsen/STLinux-Kernel/blob/master/drivers/media/platform/tegra/ar1335.c For tegra
- https://source.codeaurora.org/external/imx/isp-vvcam/tree/vvcam/v4l2/sensor/ar1335?h=lf-5.15.y_2.0.0 Broken link
- https://github.com/nxp-imx/isp-vvcam/tree/lf-5.15.y_2.0.0/vvcam/v4l2/sensor/ar1335 (also needs isp-imx apparently, see https://community.nxp.com/t5/i-MX-RT/AR1335-camera-sensor-driver/m-p/1668563/highlight/true#M25510) was made to work by malik_cisse on imx.
This driver is reported to work on imx8 with ar1335 on the nxp forum by malik cisse. They also provided us with an example dts file.
- The repo has branches for the different NXP BSP versions (matching toradex kernel branches).
- The repo contains a `build-all-vvcam.sh` script that:
  - Needs to be pointed to the kernel source dir
  - Has two buildmodes, native and v4l2, which respectively set ENABLE_IRQ to no and yes (`vvcam/readme.txt` suggests that originally there was also a `vvcam/native` directory). This is forwarded to the preprocessor, but also decides between using `isp_driver.c` and `dwe_driver.c` when `ENABLE_IRQ=no` and the `_of` variants when `ENABLE_IRQ=yes`.
  - Defers to the makefile in `vvcam/v4l2` for building and then copies some of the built modules.
  - That makefile:
    - Includes various objects from `vvcam/isp` and `vvcam/dwe` as well, and adds these and `vvcam/common` to the include path.
    - Then defers to the main kernel makefile to build a module.
    - Which probably includes `vvcam/v4l2/Makefile` again, as well as makefiles in subdirectories.
    - Which seems to produce a dozen or two kernel modules, of which ar1335.c is one. Unclear if that actually depends on the rest (for e.g. driving the ISP?) or can just function on its own.
- It does seem that the ISP module is defined in the DT, but there is no driver in the main kernel that handles `compatible=fsl,imx8mp-isp`, which is what the isp-vvcam driver does handle. So it seems if we want to use the ISP (mostly for bayer it seems, maybe lens correction) we need that code as well. The other cameras use the ISI instead of the ISP, which can do downscaling, color space conversions (but not bayer), flipping, cropping, deinterlacing. But maybe we could do bayer in software as well (or not at all?).
To compile with torizoncore-builder:
torizoncore-builder --log-level debug kernel build_module ~/docs/MKIT/3Devo/Toradex/AR1335-bare/isp-vvcam/vvcam/v4l2/
The isp-vvcam Makefile has an issue with using PWD (which is not set when running through torizoncore-builder), so some Makefile changes are needed. Unclear if this is something to be merged with isp-vvcam, since their makefile for 5.15 is a bit of a mess, and this might be (accidentally) resolved for 6.x.
This produces an ar1335.ko plus a bunch of other drivers (for the ISP?) that we might not need.
- Has some OTP memory "for storing shading correction coefficients, individual module, and sensor specific information. The user may program which set to be used". The AR1335 datasheet does not specify the format, though it does list on-chip shading correction as a feature, so it must support a particular format.
- The sensor supports reading a subset of pixels, but only allows defining one window (with x/y start/end registers).
- Sensor is available in two CRA (Chief Ray Angle?) variations, unclear what we would need (depends on lens I think).
- Datasheet has info on programming the sensor (i.e. register addresses and values), but not a complete list of registers. In particular, OTP memory programming and readout is not documented, nor is how to configure shading correction.
To connect to Dahlia/dev board: Use a different-side FFC cable (displayboard cable) to connect to Koen's converterboard (this is mirrored vs the econ converter board) and a different-side FFC rpi cable to connect to the camera.
With kernel driver ported over from rpi kernel (needs disabling some things that mainline/toradex does not support), and IMX patches made for IMX219 earlier, the driver loads and recognizes the camera.
Like with the IMX219, gstreamer starts the capture, but the clock never starts.
IMX708 driver has seen one attempt at mainlining: https://lore.kernel.org/linux-media/[email protected]/ Improvements from that attempt have been merged at rpi: https://github.com/raspberrypi/linux/commit/7be3d49d5abbffe3d16d98c823b5228b4a07eb13
On the scope, superficially valid data is seen on CLK, D0 and D1 lanes (matches data-lanes = <1 2> in the DT and seems to be the only supported number by driver or sensor). Data looks like a valid SoT sequence followed by HS differential data. The HS data is more clear on the data lines than the clock lines, probably because my 100Mhz scope is too slow and the clock line is faster/more regular so averages out. Polarity of all three lanes seems correct (based on the HS-0 settle seen during the SoT) on the devboard side (which means connections are correct - sensor drives all D-PHY lanes).
Changing the link-frequency in the DT is not possible - imx708.c (so maybe the sensor) only allows 450Mhz. The xclk (changed to inclk, though the driver just gets the single configured clock ignoring the name, it seems) for the sensor must be 24Mhz according to the source, which is not explicitly configured but just happens to be the default (default parent clock is osc_24m).
Scope shows the xclk (CSI_1_MCLK pin) is not enabled (always high). /sys/kernel/debug/clk/clk_summary does show that the clock gets enabled when capture starts. Problem is that nothing configures the pinmux for this pin (and for the econ cameras that the DT was based on, the MCLK is probably not needed).
Setting up the pin mux (for now by overriding &pinctrl_gpio_hog3 that sets it to GPIO mode normally) lets a 24Mhz clock show up on the scope. Still no data in gstreamer.
24Mhz does not show up on scope on sensor module end, turns out Koen's adapter board does not actually connect it. However, it seems the IMX708 module also does not actually use the mclk input pin at all (checked PCB and did some continuity testing, but seems unused). PE pin is connected, but is also hardwired to 3V3 on connector.
Setting gstreamer resolution to max supported by sensor (according to driver) width=4608, height=2592, gives ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Using the gstreamer RGB format (to prevent CSC) and a smaller resolution (to prevent ISI chaining) also does not help.
While "streaming", some CSI-related interrupt is firing according to /proc/interrupts
Changing data-lanes = <1 2> to data-lanes = <2> (apparently the imx csi driver mipi_csis_parse_dt expects a single int with the number of lanes, instead of a list of lanes like in the CSI endpoint DT) causes a frame to be captured. Still all black (or all green when doing RGB->YUV conversion), but it's something at least.
With gstreamer RGB format, I could capture images but not stream to the screen. Fixed by adding videoconvert (waylandsink probably accepts YUV but not RGB video).
With the default settings, only black images are produced. Bright light into the lens produces a blue image, so there is probably an exposure issue. Also, the blueness is probably because (10-bit?) raw pixel data is put LSB-aligned into RGB, feeding only the blue pixel (changing gstreamer format to BGR does not change the coloring, probably because then the ISI just shuffles bits).
With some focus adjustments and live streaming on screen, an actual image can be produced from looking at the outside window too (lots of contrast between window and walls).
Input from sensor is RAWx. The ISI driver supports only RGB->YUV and v.v. We can probably fool it (and v4l2/gstreamer) into doing passthrough and do the bayer processing in software. Width is limited to 4096 pixels.
ISI format setup in kernel:
mxc_isi_subdev_set_fmt:
- Looks in mxc_isi_out_formats_size (in imx8-isi-fmt.c) for the requested format (based on mbus_code) and sets it in dst_f.
- The list of output formats does not include a RAW format, but the hardware does seem to have a code for it (according to imx8-isi-core.h), so adding RAW output is likely a matter of adding it in imx8-isi-fmt.c.
mxc_isi_cap_streamon calls mxc_isi_config_parm, which calls mxc_isi_source_fmt_init, which:
- tries to configure MEDIA_BUS_FMT_UYVY8_2X8, calling the sensor set_fmt (imx708_set_pad_format). The sensor configures a mode based on the requested size only, ignoring the pixel format. edit: This might actually call the CSI set_fmt, which maybe forwards to the sensor?
- queries the resulting format (using get_fmt). imx708 always returns MEDIA_BUS_FMT_SRGGB10_1X10 (or different orders depending on flip settings).
- Calls mxc_isi_get_src_fmt to figure out the corresponding ISI src format, which is V4L2_PIX_FMT_YUV32 if the sensor produces something YUV-like, and V4L2_PIX_FMT_RGB32 otherwise. This is stored in src_f->fmt.
- Of src_f->fmt, only the fourcc field seems to be used for the CSC settings below, and the mbus_code is exposed through the get_fmt API when the source pad is queried (but that is always unset). All other fields seem to be unused.
- At startup, src_f is copied from dst_fmt, so get_fmt returns RGB565_1X16 for all pads (sinks and sources). After doing an RGB3 capture, the source pads show RGB888_1X24 as expected, and because the mbus_code field is indeed never set on the source formats, sink pads now show [fmt:unknown/1920x1080 colorspace:srgb in the media-ctl -p output.
mxc_isi_config_parm then calls mxc_isi_channel_config to apply src_f and dst_f, which:
- Sets up chaining if resolution is > 2048
- Sets up YUV->RGB or RGB->YUV CSC if needed (based on the fourcc fields). This also configures the output format, and the conversion apparently implies the input format.
- Sets up scaling and cropping if needed
- Sets up the source selection and line pitch
- Sets up flip (marked as TODO) and some others
- If not doing CSC or scaling, sets a channel bypass flag
- TODO: How is number of bits-per-pixel configured for the input? Maybe the CSI module normalizes into a single alignment format somehow, but the datasheet suggests it packs bits rather than using a fixed alignment. How does the ISI know what the input data is? Maybe it just reads the configured format code? That would be somewhat weird (since that seems a CSI detail, and the source can also be from other modules).
CSI formats (imx8-mipi-csi2-sam.c)
- This driver has private ioctls on the v4l subdev to reset, stream/power on/off, set format, set HDR and query caps. The format setting calls csis_s_fmt, which accepts RAW10 and RAW12 formats with various pixel orders. This sets state->csis_fmt and applies the format directly as well. This code path is probably not used in practice right now.
- The v4l subdev pad op set_fmt (implemented by mipi_csis_set_fmt) gets a format (probably from the ISI driver), seems to overwrite it with MEDIA_BUS_FMT_UYVY8_2X8 and passes that to the sensor set_fmt, which updates the format with whatever it actually supports (in case of the imx708, it ignores the passed format and just returns RAW10). Then it sets state->csis_fmt based on that overwritten config. Note that the get_fmt function just forwards to the sensor, without looking at state->csis_fmt at all. Also note that the set_fmt handlers updating their arguments seems buggy, but it actually seems to be how this was intended (e.g. __ceu_try_fmt in renesas-ceu.c in mainline relies on this and comments "Apply size returned by sensor" after calling set_fmt).
- Then mipi_csis_start_stream calls disp_mix_gasket_config and mipi_csis_set_params, which apply the csis_fmt value to the hardware. This writes the format to the MIPI_CSIx_ISP_CONFIGn register (using the fmt_reg field from mipi_csis_formats, values match CSI DT values) and to the DISP_MIX_GASKET_0_CTRL register. The latter might be a sort of shared register between the CSI module and the ISI/ISP (it also contains some alignment and single/dual pixel stuff)? Or the former? Not clearly documented. Datasheet CSI section Figure 13-23 shows how each of these CSI datatypes is mapped into (max) 56 bits (when configured for dual/quad pixel, it seems to group multiple pixels into one sample). The driver uses dual pixel mode only for MEDIA_BUS_FMT_YUYV8_2X8, single pixel mode otherwise.
Overall format setting
- It seems the general flow is that a set_fmt call trickles down from the ISI to the CSI to the sensor, each essentially ignoring the input value (except the ISI, which uses it to configure the output format) and replacing it with a default input format to pass down. Then on the way back up, the passed format variable is updated and each of the modules configures itself based on the format value passed back up.
- In mxc_isi_cap_s_fmt_mplane, the bytesperline and sizeimage fields are only calculated when they are not already set. Since v4l2-ctl seems to get the current format and only overwrite explicitly specified fields, you have to explicitly set these to 0 in the v4l2-ctl call, otherwise the output file size will remain unchanged. The pyrav4l2 lib does this by default.
MEDIA_BUS_FMT constants
MEDIA_BUS_FMT constants have a suffix; according to media-bus-format.h, e.g. 2X8 means one pixel is transferred in two 8-bit samples, while 1X16 means 16 bits in a single transfer. All values are 1X except 2X8 and 3X8 (but there is also 1X16 and 1X24 with an otherwise identical format).
- https://www.kernel.org/doc/html/v4.15/media/uapi/v4l/subdev-formats.html has an extensive list with fully specified bit ordering.
- The CSI module uses various suffixes: 2X8 for 16 bits, but also 1X10, 1X12 and 1X24 for other sizes.
Format matching with gstreamer:
- This defines the list of names for the gstreamer-supported formats: https://gstreamer.freedesktop.org/documentation/additional/design/mediatype-video-raw.html?gi-language=c. With gst-inspect-1.0 v4l2src you get all formats supported by the v4l2 source; you can get the actual advertised formats from debug output: GST_DEBUG=v4l2src:6 gst-launch-1.0 v4l2src device=/dev/video2 num-buffers=0 ! fakesink 2>&1 | grep 'sorted and normalized caps' | sed 's/;/\n/g'
- drivers/staging/media/imx/imx8-isi-fmt.c lists the formats exposed by the ISI driver (which are formats also listed by gstreamer, but using different naming). These are also listed by v4l2-ctl --device /dev/video2 --list-formats-ex
- The v4l2-ctl pixelformat option uses PIX_FMT_XXX formats by specifying the corresponding fourcc codes as listed on https://docs.kernel.org/userspace-api/media/v4l/pixfmt.html
- The hardware supports additional formats (in particular RAW8/10/12 with different packing), but the code does not expose these yet. Here is some info about modifying the format lists in the source for using RAW: https://community.nxp.com/t5/i-MX-Processors/How-to-set-up-the-MIPI-CSI-2-and-ISI-to-transfer-RAW10-images-on/m-p/1149510
- Patching the ISI to add a RAW10 format to the dst formats list seems to work and allows full-resolution capture (tested with IMX219) with:
v4l2-ctl --device /dev/video2 --set-fmt-video=width=3280,height=2464,pixelformat=RG10,sizeimage=0,bytesperline=0 --stream-mmap --stream-to=frame.raw --stream-count=1
This produces 16-bit aligned 10-bit RAW output, with LSB-zero-padding (which is a bit weird, since this page suggests the RG10 format should have MSB-zero-padding). This used .color = MXC_ISI_OUT_FMT_RAW10, even though the forum post suggests that produces only 8 bits of data with 8 bits of padding, needing MXC_ISI_OUT_FMT_RAW16 to get all 10 bits, but that does not seem to be the case here.
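A quick way to sanity-check such a capture from Python (a minimal sketch; it assumes bytesperline has no extra padding so each line is exactly width * 2 bytes, and that the 10-bit sample sits in the top bits of each 16-bit word as observed above):
import numpy as np
width, height = 3280, 2464  # IMX219 full resolution from the v4l2-ctl command above
raw = np.fromfile("frame.raw", dtype="<u2").reshape(height, width)
samples = raw >> 6  # undo the LSB zero-padding to get the 10-bit values
print(hex(samples.min()), hex(samples.max()))  # max should approach 0x3ff when saturated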
While streaming, I got once:
[10054.456915] mxc-isi 32e00000.isi: mxc_isi_irq_handler, IRQ Panic OFLW Error stat=0x40080100
A while later, streaming did not want to start (but maybe that is related to having chosen a test pattern as well). After a reboot, streaming only worked once again, so it seems unrelated to the overflow error, and probably related to this error:
isi-capture 32e00000.isi:cap_device: mxc_isi.0.capture is no v4l2 subdev
Nope, that is probably just a debug message that one element in the tree walk is skipped; it also shows up the first time, when streaming does work.
Turns out this "streaming not starting" is caused when the stream is terminated while test_pattern=1
is still effect. Setting it to 0 before stopping prevents the issue, setting it to 0 after stopping no longer helps. This also means that capturing a single test_pattern frame needs a reboot after every frame...
Test pattern shows 0x3f (6-bit) blue pixels, clearly showing the bayer pattern. According to the driver, the default (non-flipped) pixel order is RGGB, meaning the test pattern has white (all pixels 3f), yellow (RGG 3f), cyan (GGB 3f), green (GG 3f), magenta (RB 3f), red (R 3f), blue (B 3f), black (this matches the test pattern in the AR1335 datasheet, so probably standard). Assuming that the original test pattern has full-intensity color, that suggests some scaling is happening somewhere.
Test pattern returns max 0x3f. On an actual capture, with max exposure and analogue gain, the max intensity also shows up as 0x3f, so there is probably some truncation happening.
The ISP_CONFIG register in the CSI module is set to FORMAT=0x2B (RAW10) as sort of expected, single pixel mode, except that how mipi_csis_set_fmt is implemented would suggest it would be set to some YUV format...
/workdir/torizon/build-torizon/tmp/work/verdin_imx8mp-tdx-linux/linux-toradex/5.15.129+gitAUTOINC+4cd71d3b7d_6f32493eb8-r0/build/.config
/workdir/torizon/build-torizon/tmp/work/verdin_imx8mp-tdx-linux/linux-toradex/5.15.129+gitAUTOINC+4cd71d3b7d_6f32493eb8-r0/temp/log.do_kernel_metadata
It seems the ISI returns 6 bits, where the camera supplies 10, so where are the other 4 bits? Also, do we get decent exposure already with these extra bits, or only when we use an unnaturally large amount of gain?
Maybe teach ISI to work with RAW first, and then see if any bits remain missing (likely they are a victim of the interpret-RAW-as-RGB issue).
Useful links:
- https://community.nxp.com/t5/i-MX-Processors-Knowledge-Base/Guide-for-building-imx8mp-isp-standalone/ta-p/1368850
- It seems the ISP driver is intended to be built using a yocto recipe: https://github.com/nxp-imx/meta-imx/blob/mickledore-6.1.22-2.0.0/meta-bsp/recipes-kernel/kernel-modules/kernel-module-isp-vvcam_4.2.2.22.0.bb The same recipe seems to exist in the meta-freescale layer used by Toradex: https://github.com/Freescale/meta-freescale/blob/kirkstone/recipes-kernel/kernel-modules/kernel-module-isp-vvcam_4.2.2.19.0.bb
- This topic suggests there is an ISP mediaserver https://community.nxp.com/t5/i-MX-Processors/i-MX8M-Plus-ISP-standalone/m-p/1466354
- This topic talks about needing .drv and .xml files to configure the ISP, and also mentions the mediaserver with json configuration. It also suggests the sensor code might need modifications for VVCAM-specific ioctls: https://community.nxp.com/t5/i-MX-Processors/ISP-working-in-i-MX8M-Plus/m-p/1287753/highlight/true#M175066
- The ISP apparently also needs the isp-imx driver/yocto layer, e.g. from https://www.nxp.com/lgfiles/NMG/MAD/YOCTO/isp-imx-4.2.2.16.0.bin This is a self-unpacking archive, with an NXP (probably proprietary) license / EULA that you need to accept (which is probably why it is not just on github). It seems to contain some vvcam code as well (but not the ar1335 sensor driver). It also contains scripts to start the mediaserver, but not the mediaserver itself, it seems. Unclear what parts are in vvcam and what parts are in isp-imx.
- The isp-imx license has a bunch of limitations, in particular that software is only usable in relation to NXP hardware. Other limitations also arise from the particular wordings that might or might not be problematic for us - I dislike the license in any case.
- Topic for someone using the ISP for a custom sensor: https://community.nxp.com/t5/i-MX-Processors/Creating-a-MIPI-CSI-Camera-driver-on-i-MX-8MP-EVK/m-p/1385111
- There are also two camera porting guides (search for "camera" in the doc list on this page), but they do not want to load for me (might require signin, but I get no login prompt).
- https://colour-demosaicing.readthedocs.io
- https://github.com/cruxopen/openISP
- OpenCV also has debayering and other operations
- https://github.com/antmicro/pyrav4l2 (https://opensource.antmicro.com/projects/pyrav4l2/)
- Recent commits
- Supports MMAP streaming API, but only single-planar version
- Python-only implementation, using ctypes.
- https://pypi.org/project/v4l2capture/ (https://www.arducam.com/faq/opencv-v4l2-python-rpi/)
- Source on launchpad, last revision from 2016
- Supports MMAP streaming API, but only single-planar version
- Use C file to call V4L2 API, with higher-level python wrapping
- https://github.com/antmicro/python3-v4l2
- Last commit 3 years ago
- Defines only structs and constants, expects user to call the right ioctls
- https://pypi.org/project/v4l2py/
- No longer maintained, version 3 just wraps linuxpy.video
- https://github.com/tiagocoutinho/linuxpy
- Recent commits
- Supports video as well as other stuff (input, usb, midi)
- Supports MMAP streaming and read/write API, but only single-planar version
- Python-only implementation with ctypes. Generates constants and structs from header files instead of duplicating info.
- Contains a few layers of abstraction objects, maybe too much to easily add multi-plane.
- V4L2 API supports different capture methods:
- Read/write from file descriptor. This is very basic and involves at least one data copy from kernel to user space.
- Streaming mode. This enqueues a number of buffers with the V4L2 driver, which will then fill buffers in turn. Using an ioctl, the application can dequeue filled buffers (blocking until one is available), process it and queue it again. This has three variants:
- Using mmap to map device or kernel memory into the application space
- Using user pointers to hand userspace memory to the kernel to write data into
- Using DMABUF objects to have buffers that are (I think) opaque/inaccessible to the application, but can be used to move data from one device to another. Apparently the user pointers method is simplest and should be preferred (mmap is similar, but apparently locks data in physical memory instead of virtual memory, which could cause issues?).
- Our ISI driver supports streaming mode, but not read/write (according to v4l2-ctl). The driver (mxc_isi_register_cap_device) indicates only MMAP and DMABUF mode is supported.
- The ISI driver configures output buffer addresses in mxc_isi_channel_set_outbuf. It always seems to configure three buffers (YUV), even though RGB data is written to just the Y buffer according to the datasheet. However, it seems the buf->dma_addr is only initialized for the used memplanes, so the other addresses are probably just written as zeroes (and unused by the hardware). There is a field to mark an incoming buffer as "discard", which does not come from userspace it seems, making it write to dummy memory. It seems that's used to discard data if userspace does not supply empty buffers fast enough.
- The v4l2-ctl output only lists multi-planar capture, not single-planar, but it seems that is just a more generic API that can also work with a single plane.
- OpenCV implements both single and multiplanar capture. The v4l2 driver (modules/videoio/src/cap_v4l.cpp) does a memcpy on the frame (it dequeues a buffer, then processes it without copying, but then copies into a spare buffer in retrieveFrame() and then requeues the original buffer. Missed opportunity...)
Tried pyrav4l2, modified to support multi-planar, which works in mmap mode. This did use 200% CPU for capturing for some reason, which is weird because Python should be busy waiting for a new frame most of the time and no data should be copied (in theory). Turns out this was the dummy filament sensor Python script; the actual capturing only takes 0.3% in this case.
Also tried opencv python bindings, which support multi-planar out of the box. It returns a numpy array, unclear if that is/can be zero-copy (I suspect it can, numpy arrays can be backed by memory buffers). OpenCV defaults to BGR, but that is probably just the default mode and the reordering done by the ISI. Have not figured out how to set the format/mode yet, setting frame w/h did not work (always returns 640x480).
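For reference, this is roughly how a format would normally be requested through OpenCV's V4L2 backend (a sketch of the sort of thing tried; the device index, fourcc and sizes are assumptions, and on this setup the size change was not accepted yet):
import cv2
cap = cv2.VideoCapture(2, cv2.CAP_V4L2)  # /dev/video2, force the V4L2 backend
cap.set(cv2.CAP_PROP_FOURCC, cv2.VideoWriter_fourcc(*"YUYV"))  # request a pixel format
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1080)
cap.set(cv2.CAP_PROP_CONVERT_RGB, 0)  # hand back the raw buffer instead of converting to BGR
ok, frame = cap.read()
print(ok, None if not ok else (frame.shape, frame.dtype))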
Exposure settings set with v4l2-ctl do not actually seem to affect the output levels; gain settings do (not sure if both settings work).
Conclusion: probably best to just cherry-pick code from pyrav4l2 and not use the libraries at all.
Setting frame rate (or frame interval) with pyrav4l2 did not seem to work - maybe needs to be set on subdev? Maybe easier with opencv?
Seems that configuring the frame interval can be done using the S_PARM ioctl and the timeperframe parameter, which the isi driver forwards (via the generic v4l2_s_parm_cap) to the CSI s_frame_interval, which forwards to the sensor s_frame_interval, which is not implemented by IMX708 or AR1335.
For the IMX708, it seems the only way to influence the framerate is by changing the vblank time. Looking at imx708_set_framing_limits() in imx708.c, it seems that for a given mode, the pixel rate and number of pixels per line are fixed (so hblank is fixed to line_length_pix - width and does not influence framerate).
At max resolution, the framerate using python (without processing) is around 14FPS, which matches the expected rate. Changing the vblank so the framerate is expected to halve (v4l2-ctl -d /dev/v4l-subdev2 -c vertical_blanking=2708) indeed produces around 7FPS:
>>> line_length_pix = 0x3d20
>>> vblank_min = 58
>>> height = 2592
>>> pixel_rate = 595200000
>>> pixel_rate / (line_length_pix * (height + vblank))
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'vblank' is not defined
>>> pixel_rate / (line_length_pix * (height + vblank_min))
14.353513138094687
>>> pixel_rate / (line_length_pix * (height + 2708))
7.176756569047344
See also https://forums.raspberrypi.com/viewtopic.php?t=281994#p1708104 Apparently exposure time is also in units of lines.
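To go the other way (pick a vblank value for a target framerate), the same numbers can be rearranged; a small helper using the mode constants from the session above (the driver will clamp the result to its vblank limits):
LINE_LENGTH_PIX = 0x3d20
PIXEL_RATE = 595200000
HEIGHT = 2592

def fps_for_vblank(vblank):
    # one frame takes (height + vblank) lines of line_length_pix pixel clocks each
    return PIXEL_RATE / (LINE_LENGTH_PIX * (HEIGHT + vblank))

def vblank_for_fps(fps):
    return round(PIXEL_RATE / (LINE_LENGTH_PIX * fps) - HEIGHT)

print(fps_for_vblank(58))    # ~14.35, matching the measurement above
print(vblank_for_fps(7.18))  # ~2706, close to the 2708 used above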
One thought is that the ISI has two channels, so maybe both could be used to do two different crop regions, discarding a lot of unused data already in the ISI. This is likely possible by configuring both channels to use the same input and different output cropping, but breaks when the input resolution is more than 2048 pixels, because then both channels need to be chained to make a bigger line buffer. The kernel source shows that in this case, only the first channel is configured (the second one just has its clock enabled), so no opportunity to configure a different crop on the second channel. Unless the cropping happens on the input side (and cropping to < 2048 pixels removes the need for channel chaining), but this is not documented.
- The IMX ISI and CSI drivers are not in mainline yet. It seems they might originate in this repository: https://github.com/Freescale/linux-fslc/commits/6.1-2.2.x-imx/drivers/staging/media/imx
- Online I have seen somewhere that this is the "Community BSP" repository, and that the official "Release BSP" releases are based on that.
- That repo also has mainline branches (without the -2.2 BSP version numbering), but that contains mainline with just important bugfixes it seems (no ISI/CSI drivers).
- That repo has a lot of merged PRs, usually in k.k-b.b.x-imx branches, so submitting patches to 6.1-2.2.x (the most recent current branch) is probably good. Maybe also see if we can make a build with that version?
- 2024-04-03: It seems a 6.6-1.0.x branch has appeared, so a new BSP was recently released (2024-03-29 according to https://www.nxp.com/docs/en/release-note/IMX_LINUX_RELEASE_NOTES.pdf)
- Since 6.4, mainline linux contains a driver for the ISI as well (commit). The parent directory contains two CSI drivers: imx8mq-mipi-csi2.c for imx8mq and imx-mipi-csis.c, which has compatible strings for imx7 and imx8-mm, but the register definitions are equal to the freescale-tree staging imx8-mipi-csi2-sam.c, so it is likely usable (directly or with some small definitions). The CSI driver was added in this commit and generalized to include imx8 in this commit.
- The meta-freescale layer is still starting to be updated to the new BSP: Freescale/meta-freescale#1777 (that issue also has a nice list of things taken from the BSP).
- This page shows the various kernel and BSP versions used by Toradex: https://developer.toradex.com/software/toradex-embedded-software/embedded-linux-release-matrix
- This page suggests there is an experimental mainline kernel for IMX8MP as well: https://developer.toradex.com/software/toradex-embedded-software/toradex-embedded-linux-support-strategy/
Next steps
- Figure out why python needs 200% CPU for just capturing at 12FPS into mmap buffers. Copying data after all? Or is kernel CPU time accounted to Python (even then, the kernel should have an easy time at 12FPS just exchanging buffer pointers). Looking at the dmesg output, I see 15FPS, so maybe some frames are discarded, but not much (and maybe none - could be rounding errors). See if OpenCV has the same or not.
- Try debayering, in python -> Works with opencv, though still too green
- Kailaptech camera on IMX219 board? -> Nope, different connector
- Put kailaptech connectors with cameras -> Done (at Erwin's desk)
- Submit kernel patches -> But where? Latest version has different driver... -> https://community.toradex.com/t/plans-for-updating-nxp-freescale-bsp-kernel-versions/22303/2
- Test -EINVAL for enum and get format for pad 15 (mxc_isi.0: mxc_isi_subdev_get_fmt, Pad is not support now!) -> Done, both work.
- Figure out exposure and missing bits -> Try RAW ISI patch and rpi settings -> Updated torizon image did not include 708 driver, forgot to reconfig kernel -> Done building yocto full image build
- Try v4l2-compliance.
Capturing an image from python and using opencv to debayer produces a reasonable image:
- Using the rpi exposure settings gives reasonable exposure
- Color is still dominantly green, maybe bayer needs different weightings?
- OTOH, using a test pattern and processing it in the same way gives a clean test pattern with exactly the expected colors.
- Maybe the gains notify control is relevant, it is not supported by our kernel so was removed. I put hardcoded register writes for them with the default values in the driver, which seems to have improved the output a little (but not sure - did not save the old image), but did not resolve the green-ness.
- Looking at the raw file, in some areas it seems the green pixels are not equal in intensity, which is somewhat weird. But the pixel order should be good? Later it seemed that the green pixels are close enough.
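For reference, a minimal sketch of the opencv debayer step described above (the resolution, file name and 6-bit shift are assumptions based on the RAW capture notes; note that OpenCV names its Bayer constants after the 2x2 block starting at the second row/column, so an RGGB sensor typically needs the BayerBG code - worth double-checking against a test pattern):
import cv2
import numpy as np
w, h = 1536, 864  # one of the imx708 modes; use whatever was actually captured
raw = (np.fromfile("frame.raw", dtype="<u2").reshape(h, w) >> 6).astype(np.uint16)
bgr = cv2.cvtColor(raw, cv2.COLOR_BayerBG2BGR)  # for an RGGB sensor order, see note above
cv2.imwrite("frame.png", (bgr >> 2).astype(np.uint8))  # crude 10-bit to 8-bit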
Comparing with rpi
- Tried to compare with rpi with same settings, but could not produce a workable raw image so far.
- Libcamera has different units for exposure, 5000 -> 190 in v4l2 and gain 10 -> 921 in v4l.
- Using picam2 pythonscript (
picam2_dump_raw.py
) produced a raw image, seems that has the same values (about 0x60 for green pixels) as the raw image produced on toradex. This script does produce 10-bit values, which are shifted 6 bits manually in the script. Maybe this is wrong and the Toradex ISI does the same thing wrong? But if there is indeed 2 lost bits, then the max (green) pixel value should be 0x40, not 0x60? - Using
libcamera-jpeg -o test.jpg -r --shutter 5000 --gain 10
gives white pixels with around 0xe0 brightness, which is a lot more than the input 0x60 green pixels (and even darker red and blue pixels). Maybe the debayering or other filtering raises the brightness? - Using explicit whitebalance gains with libcamera (
libcamera-jpeg -o test.jpg -r --shutter 5000 --gain 10 --awb custom --awbgains 1,1
) produces similar results as the opencv-debayered raw from the toradex setup. Using red,blue gains 1,1 has a similar image (but a little greener, or maybe just brighter), using 0.1,0.1 produces much greener image, using 8,8 produces a purple image, using 2,2 produces a reasonably white image. The difference in red/blue pixel values between 1,1 and 2,2 is actually a factor 2, so the WB can probably heavily influence brightness. - Conclusion: The WB can bring the red and blue pixels on the same level as green, and then there is still a factor 2 brightness that is happening maybe in the debayering/white balance filtering?
- But since the RAW images are pretty much equal (both in color balance as well as in absolute values), the captured raw images can probably be used as-is.
- Applying manual whitebalance (multiplying rggb pixels with numpy slices) improves the balance (values 2.1, 1, 1.5 for RGB seem about right, though the black is still a bit reddish). Maybe doing black-level correction first would help a bit more.
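A minimal sketch of that manual white balance on the (already unpacked) RGGB data using numpy slices; the gains and the assumption that the pattern starts with R at (0,0) come from the notes above:
import numpy as np

def white_balance_rggb(raw, r_gain=2.1, g_gain=1.0, b_gain=1.5):
    # raw: 2D array of unpacked 10-bit values, RGGB pattern starting at (0,0)
    out = raw.astype(np.float32)
    out[0::2, 0::2] *= r_gain  # R
    out[0::2, 1::2] *= g_gain  # G (in the R rows)
    out[1::2, 0::2] *= g_gain  # G (in the B rows)
    out[1::2, 1::2] *= b_gain  # B
    return np.clip(out, 0, 1023).astype(np.uint16)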
- Example DT for ISI in
arch/arm64/boot/dts/freescale/imx8mp.dtsi
- ISI driver consists of xbar subdev with num_ports (2) sink and source pads (plus one sink for m2m), num_channels (2) pipeline subdevs with 1 sink and source and a capture subdev for each channel with 1 sink. The xbar supports internal routing of sinks to sources (default is 1-to-1 routing), each xbar port/source pad is connected to a single pipeline, the pipeline sources are connected to the capture sinks. All this happens in code. DT connects CSI to ISI (xbar).
- Made start with backporting in https://github.com/3devo/FM23/tree/wip/imx8-isi-from-6.6 see commit message for progress.
- With custom v4l code, removing all copies and undistortion, processing time is about 30ms per frame (so can run at full 14fps at 50% single core CPU). Adding undistortion raises processing time to 200ms (so 5fps) at 280% CPU. Adding the two frame copies that were in there (but probably not needed) raises time to 40ms, still 14fps. Removing all pixel processing (just cropping and scanlines) has 8ms processing.
- Recorder produces 8-bit PNG, but unclear how the 10-bit (or probably 16-bit LSB-padded produced by v4l) to 8-bit conversion is done. Test pattern shows clear bayer patterns with full white or full black pixels, but that might still be only 8 LSBs.
- The ISI only supports 4096 pixel wide images, so implicitly crops
- Cropping can be explicitly configured using the selection API (there is also an old and deprecated CROP API that is not supported by imx708/ISI).
- IMX708 only supports get_selection, to query the hardcoded cropping of black pixels
- ISI supports set_selection on its capture device, but that only allows configuring the output / dst_f cropping.
- ISI also supports subdev_set_selection on its subdev, which allows setting both src_f and dst_f.
- subdev_set_selection sets
c_width
and c_height
, while subdev_get_selection returns width
and height
, so the configured size cannot be queried back. This is probably a bug. The non-subdev g_selection does not have this problem. The actual hw writes (mxc_isi_channel_set_crop
) use thec_*
versions. - In practice, setting either CROP (src_f) or COMPOSE (dst_f)
left=256
does not actually seem to produce cropping, the image is not shifted.
- Solution: Apply image cropping in IMX708 driver. Info on relevant registers: https://github.com/Hermann-SW/imx708_regs_annotated
- imx274.c can serve as example. imx296 is a bit simpler (only allows setting crop selection, not compose/output), but that also does not support binning, so is probably not appropriate for us. imx335.c only supports get_selection.
Next steps:
- Implement cropping in IMX708. Example from imx274.c -> TODO
- Fix background calibration. Does it work right now? -> Works
- Remove background images -> Done
- Serve live images from memory? -> Save for golang
- Also serve background calibration image? -> Done
2024-07: Tested AR1335 raw module ordered by Koen. Code at https://github.com/3devo/FM23-kernel-modules/tree/ar1335, overlay in commit message. Works right away (after some hardware patches, R19 and all 10k resistors were accidentally 680k, replaced R19 with 2.7k to make it work).
Capture in RGB (i.e. RAW-interpreted-as-RGB, so a blue image) works. Exposure and gain seem somewhat low; hardcoded a higher gain in the code (the driver only supports private ioctls for this). Capture in RAW10 does not work yet. Capturing in RG10 like the imx708 just stalls waiting for data. The AR1335 actually has GRBG/BA10 order instead of RGGB/RG10. Capturing with pixelformat=RG10 fails with invalid format, because the isi driver hardcodes RGGB. Modifying the isi driver for GRBG and capturing with BA10 stalls again. Did not investigate further.
Focus did not seem entirely correct with the used lens, but the blueness and limited bit depth (due to some incorrect translation somewhere probably) makes it hard to really see how the focus is.