Installing the Arducam Jetvariety driver fails on a brand-new Jetson Nano

From Eirikur:
The install fails with:

eiki@nano:~/camera/MIPI_Camera/Jetson/Jetvariety/driver$ sudo apt install ./arducam-nvidia-l4t-kernel_4.9.140-32.3.1-20200331173033_arm64.deb
…
arducam-nvidia-l4t-kernel : PreDepends: nvidia-l4t-ccp-t210ref but it is not installable

Arducam Support Reply:
The deb installation package of the MIPI camera driver for the Jetson Nano supports L4T 32.3.1, which is JetPack 4.3.
Our engineer has just updated it, and commit 462c762 adds support for L4T 32.4.2.

From Eirikur:
Thanks for the quick response! I had a feeling this was the case, so I downgraded my Nano to the older JetPack and got both UC-599 cameras working. Obviously I want to use the latest JetPack, so I will test the new commit later today and get back to you.

On a related note: are there any more examples of programmatically controlling the cameras? I’m planning to use the external trigger function to slightly offset capture on the two cameras and combine their frames for high-speed object analysis. I’m wondering if the same thing could be achieved in code, without the physical triggering?

Also, any good examples of streaming the cameras with GStreamer or similar real-time streaming would be appreciated. The Nano might not be powerful enough to train the deep learning model I’m building, so my backup plan is to stream the frames to a Jetson TX2 or Xavier for training and inference.
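
Not an official Arducam example, but one common approach on a Jetson is to push the V4L2 stream over RTP with GStreamer and pick it up on the TX2/Xavier with OpenCV. A minimal sketch, assuming placeholder device path, host address and port; a monochrome sensor may need the `videoconvert` before encoding:

```python
def sender_pipeline(device="/dev/video0", host="192.168.1.50", port=5000):
    """GStreamer pipeline for the Nano side: camera -> JPEG -> RTP -> UDP.
    Run it with a gst-launch-1.0 prefix."""
    return (
        f"v4l2src device={device} ! videoconvert ! jpegenc ! "
        f"rtpjpegpay ! udpsink host={host} port={port}"
    )

def receiver_pipeline(port=5000):
    """Matching pipeline for the TX2/Xavier side, ending in an appsink
    so OpenCV can pull decoded frames."""
    caps = "application/x-rtp,media=video,encoding-name=JPEG,payload=26"
    return (
        f"udpsrc port={port} caps={caps} ! rtpjpegdepay ! jpegdec ! "
        f"videoconvert ! appsink"
    )
```

On the receiver, `cv2.VideoCapture(receiver_pipeline(), cv2.CAP_GSTREAMER)` and then `.read()` in a loop should work, provided OpenCV was built with GStreamer support.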

Arducam Support Reply:

  1. The external trigger function only works with physical triggering at present.
  2. Sorry to say there are no GStreamer examples yet; we will keep you posted if there is any good news.

From Eirikur:
Thanks for that update, but where is the documentation for the API/features of the monochrome Jetson Nano cameras like the UC-599 camera board? There is just one example of using it on GitHub, and that is only for the external trigger function…
e.g. how can I configure it for the maximum fps, do single-frame captures in code, find out which pixel formats are supported, etc.? Am I missing some general Arducam examples section that works for all cameras?

Looking for something like this or better…
https://www.arducam.com/docs/cameras-for-raspberry-pi/mipi-camera-modules/camera-userland-driver-sdk-and-examples/
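
Since the Jetvariety driver exposes the camera as a standard V4L2 device, the usual V4L2 tooling applies even without Arducam-specific docs. A hedged sketch using OpenCV’s V4L2 backend; the device path, resolution and fps below are assumptions for the UC-599, and `cv2` is imported lazily so the format helper works without OpenCV installed:

```python
def fourcc_str(code):
    """Decode an integer FOURCC (as returned by CAP_PROP_FOURCC)
    into its 4-character pixel-format name, e.g. 'GREY' or 'YUYV'."""
    return "".join(chr((int(code) >> (8 * i)) & 0xFF) for i in range(4))

def grab_single_frame(device="/dev/video0", width=1280, height=800, fps=60):
    """Open the camera via V4L2, request a mode, report what was
    actually negotiated, and capture one frame (None if unavailable)."""
    import cv2  # imported here so fourcc_str stays usable without OpenCV

    cap = cv2.VideoCapture(device, cv2.CAP_V4L2)
    if not cap.isOpened():
        return None
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, width)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, height)
    cap.set(cv2.CAP_PROP_FPS, fps)
    print("format:", fourcc_str(cap.get(cv2.CAP_PROP_FOURCC)),
          "fps:", cap.get(cv2.CAP_PROP_FPS))
    ok, frame = cap.read()
    cap.release()
    return frame if ok else None
```

For the max-fps and pixel-format questions, `v4l2-ctl --list-formats-ext -d /dev/video0` (from the `v4l-utils` package) prints every format and frame-rate combination the driver advertises.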

I can confirm now that the new driver on GitHub works on JetPack 4.4.

However, I notice a strange effect when recording a stream in VLC: the recorded video has double width (half of it is black). This didn’t happen with JetPack 4.3 and the older driver; see the attached screenshot. Note that I have two cameras connected but was only showing one when I used the record function in VLC.

Also… I mentioned that I needed more coding examples. These high-speed cameras are great but useless if one cannot use them for training models or inference like other cameras on the Jetson platform. I have a Nano, a TX2 and a Xavier if you need a good alpha/beta tester. A good priority would be to get Nvidia’s official “Two Days to a Demo” working on the Jetson Nano with your Jetvariety cameras. This is what most developers will try first (it works out of the box on the TX2 with the built-in camera).

See here: https://developer.nvidia.com/embedded/twodaystoademo
and, more specifically, here is the code to compile, with lots of examples that run on various pre-trained neural networks:
https://github.com/dusty-nv/jetson-inference

The examples seem to recognize and initialize the cameras OK, but the whole thing freezes when it tries to display the camera feed live.

Here’s how you get up and running with it:

git clone --recursive https://github.com/dusty-nv/jetson-inference
cd jetson-inference/
mkdir build
cd build/
cmake ../

go through wizards

make
sudo make install
sudo ldconfig

probably reboot just in case…

Thanks!


Is the VLC playback normal? Does the problem occur only when recording?

@wong Yes, the playback is normal; the extra width is only in the recorded file.

Hi @eikish ,

Then I don’t think it’s a problem with the camera itself; it may be a problem with VLC. Can you tell me how you recorded it with VLC? I want to try to reproduce the problem.

We will continue to follow up on “Two days to a demo” and “jetson-inference”, but it will take some time.

Thanks @wong.

See my issue on GitHub for more information on the jetson-inference (GStreamer) problem:

https://github.com/ArduCAM/MIPI_Camera/issues/44

It feels like it almost works. Inference is the reason people buy the Jetson Nano.

On the VLC capture strangeness… I actually cannot reproduce it now. I may have updated some Linux packages, so the environment is not 100% the same. I’ll post again if it ever happens.


@wong I added a bunch of “working” examples with gst-launch-1.0 to the GitHub issue; please check it out. It would be great if you could look into what changes are needed on your side and/or in the jetson-inference code to have your camera work out of the box with the downloaded neural networks on the Jetson Nano.

https://github.com/ArduCAM/MIPI_Camera/issues/44
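
For readers who don’t want to click through, the pipelines in that issue are of this general shape. This one is a hypothetical reconstruction, not copied from the issue, and the GRAY8 caps are an assumption for the monochrome sensor:

```python
def display_pipeline(device="/dev/video0", width=1280, height=800):
    """A gst-launch-1.0 style display pipeline for a monochrome
    V4L2 camera: raw GRAY8 frames -> convert -> on-screen sink."""
    return (
        f"v4l2src device={device} ! "
        f"video/x-raw,format=GRAY8,width={width},height={height} ! "
        f"videoconvert ! xvimagesink"
    )
```

Prefix the returned string with `gst-launch-1.0` on the command line to try it.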

Hi @eikish ,

Thank you for the information. I will check it out, but it will take some time; please be patient.