
MIPI camera MCAM400 (OV4689) on NanoPC-T6 (RK3588)


The MCAM400 camera module (https://wiki.friendlyelec.com/wiki/index.php/Matrix_-_MCAM400) with the OV4689 sensor was designed for the NanoPi-M4 and similar boards.
But it is also compatible with the NanoPC-T6.

See the attached archive with the DTS include file rk3588-nanopi6-ov4689.dtsi.
Add the line:
#include "rk3588-nanopi6-ov4689.dtsi"
at the beginning of the file arch/arm64/boot/dts/rockchip/rk3588-nanopi6-rev01.dts and put rk3588-nanopi6-ov4689.dtsi in the same directory. Compile the kernel and flash kernel.img and resource.img (see https://wiki.friendlyelec.com/wiki/index.php/NanoPC-T6#How_to_Compile).
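For example (a rough sketch; the kernel tree and archive paths below are placeholders, adjust them to your setup):
cd ~/kernel-rockchip    # assumed checkout location of the FriendlyElec kernel
cp ~/Downloads/rk3588-nanopi6-ov4689.dtsi arch/arm64/boot/dts/rockchip/
sed -i '1i #include "rk3588-nanopi6-ov4689.dtsi"' arch/arm64/boot/dts/rockchip/rk3588-nanopi6-rev01.dts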
After reboot the MCAM400 camera will be available on connector CSI-0 (in that case use /dev/video22 in the commands below) or CSI-1 (use /dev/video31).
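If you are not sure which node the camera got, you can list all V4L2 devices first (v4l2-ctl is part of the v4l-utils package):
v4l2-ctl --list-devices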

Capture 100 frames of uncompressed video to a file:
v4l2-ctl -d /dev/video22 --set-fmt-video=width=1920,height=1080,pixelformat='NV12' --stream-mmap=4 --stream-poll --stream-to=test.yuv --stream-count=100
or
gst-launch-1.0 v4l2src device=/dev/video22 io-mode=4 num-buffers=100 ! video/x-raw,format=NV12,width=1920,height=1080,framerate=30/1 ! filesink location=test.yuv
Then you can convert and compress test.yuv:
ffmpeg -y -pix_fmt nv12 -s 1920x1080 -r 30 -i test.yuv -c:v libx264 -pix_fmt yuv420p test.mp4
or
ffmpeg -y -pix_fmt nv12 -s 1920x1080 -r 30 -i test.yuv -c:v mjpeg -pix_fmt yuv420p test.avi
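To quickly inspect a single frame from the raw capture, you can also extract it as an image (frame.png is just an example output name):
ffmpeg -y -pix_fmt nv12 -s 1920x1080 -i test.yuv -frames:v 1 frame.png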
Note that capturing uncompressed video requires a fast "disk"; use an NVMe SSD or a RAM disk (tmpfs).
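For example, a RAM disk can be created like this (the 2 GB size and the mount point are arbitrary choices):
sudo mkdir -p /mnt/ramdisk
sudo mount -t tmpfs -o size=2G tmpfs /mnt/ramdisk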

Capture video and compress it in a single command:
gst-launch-1.0 v4l2src device=/dev/video22 io-mode=4 ! video/x-raw,width=1920,height=1080,format=NV12,framerate=30/1 ! mpph264enc ! queue ! h264parse ! mpegtsmux ! filesink location=test.ts

Show video on the display:
gst-launch-1.0 v4l2src device=/dev/video22 io-mode=4 ! video/x-raw,width=1920,height=1080,format=NV12,framerate=30/1 ! videoconvert ! xvimagesink

If you want high speed video (1920x1080 at 120 fps), you must replace the file drivers/media/i2c/ov4689.c (keep a copy of the old one!) with the provided ov4689.c (the sensor register values in this file were taken from https://blog.csdn.net/u010018991/article/details/102687614), then recompile and flash the kernel (keep the old kernel with the original driver so you can restore it later!).
To measure the speed:
v4l2-ctl -d /dev/video22 --set-fmt-video=width=1920,height=1080,pixelformat='NV12' --stream-mmap=4 --stream-skip=5 --stream-count=1000 --stream-poll --stream-to=/dev/null
If you want to save the video to a file instead of /dev/null, use a very fast disk, as the transfer rate is about 360 MB/sec.
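For reference, NV12 uses 1.5 bytes per pixel, so at 1920x1080 and 120 fps the raw stream is roughly 1920 * 1080 * 1.5 * 120 ≈ 373 MB/sec, which matches this figure.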
Restore the old kernel if you no longer need the high speed mode.

I also included the file rk3588-nanopi6-hdmiclk.dtsi for those who use a 2K display and cannot set the resolution 2560x1440. I use this overlay (I do not remember where I first found it) on different RK3588 boards. To apply it, add the line:
#include "rk3588-nanopi6-hdmiclk.dtsi"
to rk3588-nanopi6-rev01.dts and put the file in the same directory. Recompile and flash resource.img.

Attachments

Some more notes.

What is described above is not a complete solution. If you capture video, you will see that the picture is too dark and green.
To partially fight the darkness :) you can increase the sensor's gain (see below). But color correction is performed by the ISP (image signal processor), and you must set up the ISP to do that. Unfortunately, that last task (setting up the ISP) is not solved yet.

But let's go step by step.

The camera sensor has additional controls; the most interesting are exposure and gain.
First of all we must find the V4L subdevice corresponding to the sensor. The simplest way is to look in sysfs for an I2C device with address 0x36 (the address used by the MCAM400):
ls /sys/bus/i2c/drivers/ov4689/*-0036/video4linux/
If the camera is present you will get, for example, v4l-subdev2. This is our sensor's subdevice.
But we are not looking for easy ways. We can also find the subdevice in a more involved, but more correct (and more instructive) way: by investigating the topology of the media devices. The commands:
media-ctl -d /dev/media0 -p
or
media-ctl -d /dev/media0 --print-dot
output the topology (graph) of elements (called "entities") for the media device /dev/media0. We should run the same commands for the remaining media devices (/dev/media1 and so on) until we find an entity whose name contains "ov4689". It will be the last entity in the graph. For example:
- entity 63: m02_f_ov4689 7-0036 (1 pad, 1 link)
             type V4L2 subdev subtype Sensor flags 0
             device node name /dev/v4l-subdev2
        pad0: Source
                [fmt:SBGGR10_1X10/2688x1520@10000/300000 field:none]
                -> "rockchip-csi2-dphy3":0 [ENABLED]

Here we see that the name of this entity is "m02_f_ov4689" and its subdevice is /dev/v4l-subdev2 (we also see the format of the video coming from the camera). You can view the rest of the graph to make sure that the entities are interconnected according to the dtsi file from my previous post.
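If Graphviz is installed, the --print-dot output can also be rendered to a PDF for easier viewing (graph0.pdf is just an example output name):
media-ctl -d /dev/media0 --print-dot | dot -Tpdf -o graph0.pdf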
Now we can query the additional controls of the sensor's subdevice:
v4l2-ctl -d /dev/v4l-subdev2 -l
We see "exposure", "analogue_gain" and other controls. We can get current value:
v4l2-ctl -d /dev/v4l-subdev2 -C analogue_gain
and set a new value:
v4l2-ctl -d /dev/v4l-subdev2 -c analogue_gain=1500
You can do this dynamically, that is, while video is being captured.
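For example, a crude sketch that steps the gain while a capture pipeline is running in another terminal (the gain values and delay are arbitrary):
for g in 500 1000 1500 2000; do
    v4l2-ctl -d /dev/v4l-subdev2 -c analogue_gain=$g
    sleep 2
done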

Now about the ISP.
I'm not an expert in this topic. Unfortunately it is poorly documented, so I will describe it in some detail, almost everything I know :)
Different Rockchip SoCs have different ISPs (image signal processors), or, as they say, different versions of the ISP. You can look at the file drivers/media/platform/rockchip/isp/Kconfig in the kernel sources and learn that version 1 was used in SoCs up to the rk3399, version 2.0 in the rv1126/rv1109, and version 2.1 in the rk356x. The ISP in the rk3588 is version 3.0. These ISP versions use different APIs that are incompatible with each other. Of course, if you know the appropriate API, you can set up the ISP directly with your own application, but ordinary people :) use what already exists. Typically the OS already ships a tool to handle the ISP; in our case this tool is called rkaiq_3A_server. I use the FriendlyCore image dated 20230515 and found rkaiq_3A_server there (or it was installed later with some package), but it does not seem usable: it always crashes with a segmentation fault. So I compiled rkaiq_3A_server from the Firefly repo (see the attached archive).

But that is not enough: we must teach the ISP how to process data from our sensor. The bad news is that this is hard to do! Sensors and ISPs have many specific parameters, so settings for one sensor are not suitable for another. Usually we take already existing settings files made for a specific sensor and, moreover, for a specific ISP. I have not yet come across a settings file for the OV4689 sensor for ISP version 3.0, so we cannot capture nice pictures from the MCAM400. But we can at least learn how it works :)
When rkaiq_3A_server starts, it searches for available sensors and tries to find settings for them. The settings are JSON files (for ISP version 3.0) located in the directory /etc/iqfiles. The filename is built from the driver name ("ov4689" in our case), the camera module name (defined by the parameter "rockchip,camera-module-name" in the DTS file) and the lens name (parameter "rockchip,camera-module-lens-name" in the DTS). As I set the value "MCAM400" for both of these parameters in the DTS, the filename must be ov4689_MCAM400_MCAM400.json. Without this file the ISP will not properly process data from the camera.
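You can check which IQ files are already shipped with your image (just a directory listing, nothing more):
ls /etc/iqfiles/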
Unfortunately we do not have a JSON file for any OV4689 module. An OV4689 settings file does exist for the old version of the ISP, since the sensor works on rk3399 boards, but it is in XML format, not JSON. There is also a special converter, iqConverTer, that translates settings from XML to JSON, but the current version of this converter produces JSON files for ISP versions 2.0 and 2.1 only, and I'm not sure that translation to version 3.0 is easy to do. So we must hope that somebody will do this work for us :)
But for learning purposes we can do a hack: take the settings for some other sensor and see what happens :)
I took the JSON file imx464_CMK-OT1980-PX1_SHG102.json (obviously, for the IMX464 sensor) from the same Firefly repo (the file with the same name from FriendlyCore is not suitable) and renamed it to ov4689_MCAM400_MCAM400.json. Then I put this file in /etc/iqfiles, put rkaiq_3A_server and librkaiq.so in some directory and started the server from that directory:
sudo LD_LIBRARY_PATH=. ./rkaiq_3A_server
The server started successfully and began waiting for a capture stream. I set the sensor's analogue_gain (see above) to 2000 and started capturing video (see the previous post for how to do it). The picture from the camera was bright, but it occupied only about half of the frame height and was mirrored. So we can conclude that settings for the IMX464 are not suitable for the OV4689 :)

That's all.
I cannot attach the archive as it is too large, so download it here: https://disk.yandex.ru/d/tPnXrLB5xSy0vg
It contains rkaiq_3A_server, librkaiq.so and the fake sensor settings file ov4689_MCAM400_MCAM400.json.
The next step.

Here is a preliminary version of the JSON settings file for the MCAM400 on the RK3588 (that is, for ISP 3.0).
Put the JSON file from the attached archive into the directory /etc/iqfiles and run rkaiq_3A_server (see the previous post).
Now you can capture video from your camera. For example:
gst-launch-1.0 v4l2src device=/dev/video22 io-mode=4 ! video/x-raw,width=2688,height=1520,format=NV12,framerate=30/1 ! xvimagesink
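To see the frame rate that is actually rendered, fpsdisplaysink from gst-plugins-bad can be wrapped around the video sink (a sketch, not from the original post; with -v the measurements are printed to the console):
gst-launch-1.0 -v v4l2src device=/dev/video22 io-mode=4 ! video/x-raw,width=2688,height=1520,format=NV12,framerate=30/1 ! fpsdisplaysink video-sink=xvimagesink text-overlay=false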

The colors are not perfect, but at least you will be able to understand what the camera is showing :)

This JSON file uses static exposure and gain. You can find their values in the section "LinearAeCtrl" / "InitExp":
"InitTimeValue": 0.03,
"InitGainValue": 8,
These values correspond to exposure 1399 and analogue_gain 1012, which you can read back from the sensor with the command:
v4l2-ctl -d /dev/v4l-subdev2 -C analogue_gain -C exposure
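As a rough sanity check (assuming the driver reports analogue gain in units where 128 = 1x and exposure in sensor line periods with about 1536 lines per frame at 30 fps; these units are my assumption, not from the post):
1012 / 128 ≈ 7.9, close to InitGainValue 8
1399 * 1/(30 * 1536) s ≈ 0.030 s, close to InitTimeValue 0.03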
You can change these values in the JSON file (and restart rkaiq_3A_server) or modify them on the fly:
v4l2-ctl -d /dev/v4l-subdev2 -c analogue_gain=2000 -c exposure=1500

I do not know how to set up the ISP for automatic control of exposure/gain, color enhancement and so on. There are too many parameters here that I don't understand :) Maybe a more qualified person can do it for us.

Attachments

Nice work.
How is your progress on the json file?

I have a NanoPi-M4 with a dual camera (ov4689) working fine; maybe there is a way to convert the XML to the new JSON version (I am just speculating). I have seen someone trying to do this with another sensor. He had some success, but it was surely a lot of work.

I see you have set 120 fps as the default; is it working at 120 fps? My board is on the way, I hope I get it next month.

Can you share some pictures? (to get an idea of how the image looks)
Awesome work so far @sergei_gagarin, congrats. I tried to get this to work based on the instructions given, but I'm missing something, probably an additional kernel config?

I am using the provided DTSI and the ov4689.c file on the rockchip kernel 5.10.160, and the MCAM400 shows up on the I2C bus and gets initialized. The driver also seems to be loaded for the devices.

However, so far I can neither capture a video with v4l2-ctl as you described, nor use gst-launch. I tried to visualize the media trees with this outcome:

https://www.dropbox.com/scl/fi/5e2evasrczfkznhyp55xx/graph0.pdf?rlkey=cv9ycjosjq1glduoemta2f2vv&dl=0
https://www.dropbox.com/scl/fi/31z11c5xrgj8qdj7arh5n/graph1.pdf?rlkey=mwmxcefkgexfxmttb3q90r5k0&dl=0
https://www.dropbox.com/scl/fi/3gbyknh3f27espy0sb1pa/graph2.pdf?rlkey=kwddds004u8qcwkehamb4il1s&dl=0
https://www.dropbox.com/scl/fi/7karx6d673vk339ylrwpn/graph3.pdf?rlkey=704on7pz86qbk9ld13gdu5mdt&dl=0

The OV4689 doesn't show up anywhere as a /dev/videoXX device; there are only the main paths from the RKISPs... In the kernel log I get something like this:


[    9.927317] rockchip-csi2-dphy0: No link between dphy and sensor
[    9.927333] rkcif-mipi-lvds2: rkcif_update_sensor_info: stream[0] get remote terminal sensor failed!
[    9.927341] stream_cif_mipi_id0: update sensor info failed -19
[    9.928508] rockchip-csi2-dphy0: No link between dphy and sensor
[    9.928527] rkcif-mipi-lvds2: rkcif_update_sensor_info: stream[3] get remote terminal sensor failed!
[    9.928532] stream_cif_mipi_id3: update sensor info failed -19
[    9.928858] rockchip-csi2-dphy0: No link between dphy and sensor
[    9.928873] rkcif-mipi-lvds2: rkcif_update_sensor_info: stream[2] get remote terminal sensor failed!
[    9.928878] stream_cif_mipi_id2: update sensor info failed -19
[    9.928984] rockchip-csi2-dphy0: No link between dphy and sensor
[    9.929003] rkcif-mipi-lvds2: rkcif_update_sensor_info: stream[2] get remote terminal sensor failed!
[    9.929010] rkcif_tools_id2: update sensor info failed -19
[    9.929611] rockchip-csi2-dphy0: No link between dphy and sensor
[    9.929619] rkcif-mipi-lvds2: rkcif_update_sensor_info: stream[1] get remote terminal sensor failed!
[    9.929623] stream_cif_mipi_id1: update sensor info failed -19
[    9.929868] rockchip-csi2-dphy0: No link between dphy and sensor
[    9.929875] rkcif-mipi-lvds2: rkcif_update_sensor_info: stream[0] get remote terminal sensor failed!
[    9.929879] rkcif_scale_ch0: update sensor info failed -19
[    9.930421] rockchip-csi2-dphy0: No link between dphy and sensor
[    9.930448] rkcif-mipi-lvds2: rkcif_update_sensor_info: stream[1] get remote terminal sensor failed!
[    9.930453] rkcif_scale_ch1: update sensor info failed -19
[    9.930851] rockchip-csi2-dphy0: No link between dphy and sensor
[    9.930861] rkcif-mipi-lvds2: rkcif_update_sensor_info: stream[2] get remote terminal sensor failed!
[    9.930865] rkcif_scale_ch2: update sensor info failed -19
[    9.931232] rockchip-csi2-dphy0: No link between dphy and sensor
[    9.931240] rkcif-mipi-lvds2: rkcif_update_sensor_info: stream[3] get remote terminal sensor failed!
[    9.931244] rkcif_scale_ch3: update sensor info failed -19
[    9.931386] rockchip-csi2-dphy0: No link between dphy and sensor
[    9.931394] rkcif-mipi-lvds2: rkcif_update_sensor_info: stream[0] get remote terminal sensor failed!
[    9.931397] rkcif_tools_id0: update sensor info failed -19
[    9.931411] rockchip-csi2-dphy0: No link between dphy and sensor
[    9.931418] rkcif-mipi-lvds2: rkcif_update_sensor_info: stream[1] get remote terminal sensor failed!
[    9.931421] rkcif_tools_id1: update sensor info failed -19
[    9.931923] rockchip-csi2-dphy3: No link between dphy and sensor
[    9.931936] rkcif-mipi-lvds4: rkcif_update_sensor_info: stream[0] get remote terminal sensor failed!
[    9.931939] stream_cif_mipi_id0: update sensor info failed -19
[    9.933034] rockchip-csi2-dphy3: No link between dphy and sensor
[    9.933035] rockchip-csi2-dphy3: No link between dphy and sensor
[    9.933051] rkcif-mipi-lvds4: rkcif_update_sensor_info: stream[1] get remote terminal sensor failed!
[    9.933056] stream_cif_mipi_id1: update sensor info failed -19
[    9.933058] rkcif-mipi-lvds4: rkcif_update_sensor_info: stream[2] get remote terminal sensor failed!
[    9.933060] stream_cif_mipi_id2: update sensor info failed -19
[    9.933263] ov4689 3-0036: driver version: 00.01.08
[    9.933354] ov4689 3-0036: Looking up avdd-supply from device tree
[    9.933358] ov4689 3-0036: Looking up avdd-supply property in node /i2c@feab0000/ov4689@36 failed
[    9.933375] ov4689 3-0036: supply avdd not found, using dummy regulator
[    9.933421] ov4689 3-0036: Looking up dovdd-supply from device tree
[    9.933424] ov4689 3-0036: Looking up dovdd-supply property in node /i2c@feab0000/ov4689@36 failed
[    9.933427] ov4689 3-0036: supply dovdd not found, using dummy regulator
[    9.933437] ov4689 3-0036: Looking up dvdd-supply from device tree
[    9.933439] ov4689 3-0036: Looking up dvdd-supply property in node /i2c@feab0000/ov4689@36 failed
[    9.933442] ov4689 3-0036: supply dvdd not found, using dummy regulator
[    9.933986] rockchip-csi2-dphy3: No link between dphy and sensor
[    9.933999] rkcif-mipi-lvds4: rkcif_update_sensor_info: stream[3] get remote terminal sensor failed!
[    9.934004] stream_cif_mipi_id3: update sensor info failed -19
[    9.935247] rockchip-csi2-dphy3: No link between dphy and sensor
[    9.935248] rockchip-csi2-dphy3: No link between dphy and sensor
[    9.935250] rockchip-csi2-dphy3: No link between dphy and sensor
[    9.935257] rockchip-csi2-dphy3: No link between dphy and sensor
[    9.935260] rkcif-mipi-lvds4: rkcif_update_sensor_info: stream[3] get remote terminal sensor failed!
[    9.935265] rkcif_scale_ch3: update sensor info failed -19
[    9.935266] rkcif-mipi-lvds4: rkcif_update_sensor_info: stream[1] get remote terminal sensor failed!
[    9.935268] rkcif_scale_ch1: update sensor info failed -19
[    9.935270] rkcif-mipi-lvds4: rkcif_update_sensor_info: stream[0] get remote terminal sensor failed!
[    9.935272] rkcif_scale_ch0: update sensor info failed -19
[    9.935274] rkcif-mipi-lvds4: rkcif_update_sensor_info: stream[0] get remote terminal sensor failed!
[    9.935279] rkcif_tools_id0: update sensor info failed -19
[    9.935478] ov4689 3-0036: Detected OV004688 sensor
[    9.935505] rockchip-csi2-dphy3: No link between dphy and sensor
[    9.935516] rkcif-mipi-lvds4: rkcif_update_sensor_info: stream[2] get remote terminal sensor failed!
[    9.935523] rkcif_scale_ch2: update sensor info failed -19
[    9.935848] ov4689 7-0036: driver version: 00.01.08
[    9.935926] ov4689 7-0036: Looking up avdd-supply from device tree
[    9.935930] ov4689 7-0036: Looking up avdd-supply property in node /i2c@fec90000/ov4689@36 failed
[    9.935947] ov4689 7-0036: supply avdd not found, using dummy regulator
[    9.935994] ov4689 7-0036: Looking up dovdd-supply from device tree
[    9.935997] ov4689 7-0036: Looking up dovdd-supply property in node /i2c@fec90000/ov4689@36 failed
[    9.936000] ov4689 7-0036: supply dovdd not found, using dummy regulator
[    9.936014] ov4689 7-0036: Looking up dvdd-supply from device tree
[    9.936016] ov4689 7-0036: Looking up dvdd-supply property in node /i2c@fec90000/ov4689@36 failed
[    9.936019] ov4689 7-0036: supply dvdd not found, using dummy regulator
[    9.936173] rockchip-csi2-dphy3: No link between dphy and sensor
[    9.936180] rkcif-mipi-lvds4: rkcif_update_sensor_info: stream[1] get remote terminal sensor failed!
[    9.936183] rkcif_tools_id1: update sensor info failed -19
[    9.937264] rockchip-csi2-dphy3: No link between dphy and sensor
[    9.937274] rkcif-mipi-lvds4: rkcif_update_sensor_info: stream[2] get remote terminal sensor failed!
[    9.937279] rkcif_tools_id2: update sensor info failed -19
[    9.938114] ov4689 7-0036: Detected OV004688 sensor


Is there something I missed? I read somewhere that the cams with and without IR filter behave differently - I have the one with the IR filter. Thx...

sergei_gagarin wrote:
The MCAM400 camera module with the OV4689 sensor was designed for the NanoPi-M4 and similar boards, but it is also compatible with the NanoPC-T6.
[...]
Got it working - the OV4689 driver needs to be compiled into the kernel and should not be a module (that's what I was missing from the description) :D
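For reference, that means enabling the sensor driver as built-in rather than as a module in the kernel config (the exact symbol name below is my assumption for the Rockchip tree; check it in menuconfig):
CONFIG_VIDEO_OV4689=y    # built-in; =m (module) did not work here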

thx again
I just received the board and I was able to test it and grab some images.
I rendered the frames at ~25 fps, so 3A is possibly capped at 30 fps. Maybe with libcamera one could get 120 fps, but I am not really sure whether the driver can reach 120 fps. I kept the original driver and added the 120 fps patch; maybe I did something wrong.

I was particularly interested in the dual-camera setup, but the second camera image is kind of dark and greenish.
I think FriendlyElec should invest in tuning the camera.

Here is what I get:

Attachments

Camera 1 and Camera 2:

Attachments

Fixed my 120 FPS mistake. I can't get the other modes (or rather, I couldn't find a way to set the 120 FPS mode as the current mode).

Camera 2 (the CSI-1 sensor with IR filter), 120 FPS, rendering frames at ~50 FPS.
That's all folks.

Screenshot (JPEG, 55% quality to fit 256Kb)

Attachments

By adjusting analogue_gain and exposure we can get pretty nice images; see the cam1 vs cam2 screenshot attached.
cam1 (left): without IR filter (3.3 mm F2.2)
cam2 (right): with IR filter (3.6 mm F2.0)

[code]"InitTimeValue": 0.03,
"InitGainValue": 8,
These values correspond to exposure 1399 and analogue_gain 1012[/code]

If I can understand how to balance analogue_gain / exposure, it could be implemented in the capture program and the code released, or we wait for the official tuning file.

Attachments

