What hardware/platform were you working on?
Raspberry Pi 4
Hello,
I have got the MT9J001 working with a Rev.D shield, using both the C++ streaming demo and the Python ROS node.
However, the Python ROS node doesn’t work at full resolution (10MP); it only works with the lower-resolution cfg files.
The C++ demo does work at full resolution, so I attempted to write a ROS node around it, but it fails to compile in ROS with the following errors:
/home/ubuntu/build_ws/src/camera/src/camera_node.cpp:35:1: error: ‘ArduCamCfg’ does not name a type
ArduCamCfg cameraCfg;
^
/home/ubuntu/build_ws/src/camera/src/camera_node.cpp:51:18: error: ‘uint8’ was not declared in this scope
cv::Mat JPGToMat(uint8* bytes, int length) {
^
/home/ubuntu/build_ws/src/camera/src/camera_node.cpp:51:25: error: ‘bytes’ was not declared in this scope
cv::Mat JPGToMat(uint8* bytes, int length) {
^
/home/ubuntu/build_ws/src/camera/src/camera_node.cpp:51:32: error: expected primary-expression before ‘int’
cv::Mat JPGToMat(uint8* bytes, int length) {
^
/home/ubuntu/build_ws/src/camera/src/camera_node.cpp:51:44: error: expected ‘,’ or ‘;’ before ‘{’ token
cv::Mat JPGToMat(uint8* bytes, int length) {
^
make[2]: *** [CMakeFiles/camera_node.dir/src/camera_node.cpp.o] Error 1
make[1]: *** [CMakeFiles/camera_node.dir/all] Error 2
make: *** [all] Error 2
cd /home/ubuntu/build_ws/build/camera; catkin build --get-env camera | catkin env -si /usr/bin/make --jobserver-fds=6,7 -j; cd -
I will have to check the exact errors later; it was something like USB_CAMERA_DATA_LEN_ERROR / RECEIVE_LENGTH.
In Python I tried software-triggering the camera. It works, but the frame rate is under 1 Hz, which really isn’t enough (I would like at least 2 Hz).
I expect that the equivalent conversion in C++ (sensor_msgs::ImagePtr msg = cv_bridge::CvImage(std_msgs::Header(), "bgr8", image).toImageMsg();) is faster.
It is possible that a ROS wrapper around the C++ code would solve the problem. To do that, I would need to build C++ code linking both the Arducam libraries and ROS, and my knowledge of make and catkin is currently not sufficient for that.
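For reference, a catkin package can link the SDK’s shared library directly from its CMakeLists.txt. A minimal sketch, in which the two /path/to placeholders are assumptions that need to be replaced with wherever the Arducam header and libArduCamLib.so actually live on the system:

```cmake
cmake_minimum_required(VERSION 3.0.2)
project(camera)

find_package(catkin REQUIRED COMPONENTS roscpp sensor_msgs cv_bridge image_transport)
find_package(OpenCV REQUIRED)

catkin_package()

include_directories(
  ${catkin_INCLUDE_DIRS}
  ${OpenCV_INCLUDE_DIRS}
  /path/to/ArduCAM_USB_Camera_Shield/include   # directory containing the SDK header (assumed layout)
)

add_executable(camera_node src/camera_node.cpp)
target_link_libraries(camera_node
  ${catkin_LIBRARIES}
  ${OpenCV_LIBRARIES}
  /path/to/libArduCamLib.so                    # SDK shared library (assumed location)
)
```

With the include directory in place, the ‘ArduCamCfg does not name a type’ class of error should disappear, since the compiler can then see the SDK’s declarations.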
Hi, I had to work on something else for a while. Now I’m testing the MT9J001 again.
Previously I was using a USB 2.0 shield; now I’m using a USB 3.0 shield.
With the USB 2.0 shield the Python ROS node worked. However, with the USB 3.0 shield (UC-593 Rev.C) I can get the streaming demo to work, but the ROS node fails with ‘open fail, rtn val=65281’.
I have replaced the ArducamSDK.so in the arducam_usb2_ros/src folder with the one from the RaspberryPi/Python/Streaming_demo folder. (This is on a Raspberry Pi 4.)
Thanks, that worked. I wrote a ROS node based on the Python Streaming example as the existing ROS node can’t seem to handle large images for some reason.
Now I’m having another issue: although I have set all the analog gains (and the global analog gain) to the same value, and all the digital gains to the same value, I still get patterning in the image. The green pixels are lighter despite having exactly the same gain values as the other channels. I have tried adjusting the gains individually, but I don’t understand why that should be necessary.
I have set:
REG = 0x0204, 0x65  // global analog gain
REG = 0x3032, 0x100 // digital gains, all channels
REG = 0x3034, 0x100
REG = 0x3036, 0x100
REG = 0x3038, 0x100
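For reference, the same gains can also be applied at runtime from the Python node instead of only via the .cfg file, which makes it easier to experiment with per-channel values while watching the published image. A minimal sketch, assuming the Py_ArduCam_writeSensorReg call used by the Arducam Python demos; the sdk/handle wiring is hypothetical and would come from however the node opens the camera (e.g. Py_ArduCam_autoopen):

```python
# The (register, value) pairs from the cfg file above.
GAIN_REGS = [
    (0x0204, 0x65),   # global analog gain
    (0x3032, 0x100),  # digital gains, all channels
    (0x3034, 0x100),
    (0x3036, 0x100),
    (0x3038, 0x100),
]

def apply_gains(handle, sdk, regs=GAIN_REGS):
    """Write each (address, value) pair to the sensor via the SDK.

    'sdk' is the imported ArducamSDK module and 'handle' the opened camera;
    returns 0 on success, or the first nonzero SDK return code otherwise.
    """
    for addr, val in regs:
        rtn = sdk.Py_ArduCam_writeSensorReg(handle, addr, val)
        if rtn != 0:  # 0 is the success code in the Arducam demos
            return rtn
    return 0
```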
I tried a variety of the config files in the folder:
ArduCAM_USB_Camera_Shield-master\Config\USB3.0_UC-593 Rev.C\DVP\MT9J001
including MT9J001_MONO_12b_3664x2748.cfg.
In the demo picture on your website there aren’t any artifacts. Do you know which config file was used to make it?