How to use the 2328x1748 mode on the NVIDIA Jetson Nano B01?

  1. Where did you get the camera module(s)?

  1. Model number of the product(s)?


  1. What hardware/platform were you working on?

NVIDIA Jetson Nano B01.

  1. Instructions you have followed. (link/manual/etc.)

ArduCAM/Jetson_IMX519_Focus_Example (

[email protected]:~/Arducam-IMX519$ v4l2-ctl --list-formats-ext
        Index       : 0
        Type        : Video Capture
        Pixel Format: 'RG10'
        Name        : 10-bit Bayer RGRG/GBGB
                Size: Discrete 4656x3496
                        Interval: Discrete 0.100s (10.000 fps)
                Size: Discrete 3840x2160
                        Interval: Discrete 0.048s (21.000 fps)
                Size: Discrete 1920x1080
                        Interval: Discrete 0.017s (60.000 fps)
                Size: Discrete 1280x720
                        Interval: Discrete 0.008s (120.000 fps)
  1. Problems you were having?

I can’t use the 2328x1748 mode, which is available when I plug the IMX519 into a Raspberry Pi 4B. This mode is not listed on the NVIDIA Jetson Nano B01.

  1. Troubleshooting attempts you’ve made?

I use the 4656x3496 mode because it has the same aspect ratio, 1.33:1. However, it is slow: it only reaches 10 fps instead of 30 fps.

  1. What help do you need?

I need help using the 2328x1748 mode, or any mode with a 1.33:1 aspect ratio (the other available modes are 1.78:1), that achieves a higher frame rate on the NVIDIA Jetson Nano B01.
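For reference, a quick arithmetic check of the aspect ratios of the modes listed by v4l2-ctl above, plus the missing mode:

```python
# Width:height ratios for the listed modes and the missing 2328x1748 mode.
modes = [
    ("4656x3496", 4656, 3496),  # full resolution, 10 fps
    ("2328x1748", 2328, 1748),  # the mode missing on the Jetson
    ("3840x2160", 3840, 2160),
    ("1920x1080", 1920, 1080),
    ("1280x720", 1280, 720),
]
for name, w, h in modes:
    print(f"{name}: {w / h:.2f}:1")
# 4656x3496 and 2328x1748 are both ~1.33:1 (2328x1748 is exactly half
# of the full mode in each dimension); the remaining modes are ~1.78:1.
```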


First of all, thank you for the information.

There is indeed a missing resolution on the Jetson. I don’t know why this resolution is missing, so I will try it out.

Could you tell me your Jetson system version? If there is no problem, I will send you an installation package that adds this resolution:

cat /etc/nv_tegra_release

If the test is OK: because there are many versions, I will not update all of them. I will fix this problem in your version and in the latest version later; please understand.


cat /etc/nv_tegra_release


# R32 (release), REVISION: 7.1, GCID: 29818004, BOARD: t210ref, EABI: aarch64, DATE: Sat Feb 19 17:05:08 UTC 2022


I tried it out today but haven’t finished yet. Tomorrow is the weekend, so I will continue testing on Monday.
Happy weekend to you in advance.

Happy weekend.

TL;DR: I hope Arducam can be our partner in building a computer-vision product over the next year. Below are the issues I am having with the Arducam IMX519 and the Synchronized Quad-Camera Kit IMX519.

Will there be an update for IMX519?

  1. ArduCAM/Jetson_IMX519_Focus_Example (
  2. ArduCAM/MIPI_Camera (

The Jetson_IMX519_Focus_Example is buggy, so I have currently written my own C++ code to control the autofocus.

I am hoping there will be a standard C++ library to control the IMX519, since running computer vision in Python is not a good idea on small devices such as the NVIDIA Jetson Nano or Xavier NX.

I am also hoping there will be a way to extend the Synchronized Quad-Camera Kit IMX519. The currently available cable is only 30 cm, while a conveyor belt, for example, can be as long as 100 cm.

I found another bug, but the workaround is quite simple. Is this expected, or should it not be happening either?

  1. If the frame_rate is set to the frame rate suggested by $ v4l2-ctl --list-formats-ext, the captured frame will have black pixels at the bottom.

An example that will have black pixels at the bottom of the frame:

def gstreamer_pipeline(

The solution is to subtract 1 from the suggested frame rate:

def gstreamer_pipeline(
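Both snippets above are truncated. A hedged reconstruction of the second (working) one, modeled on NVIDIA's standard nvarguscamerasrc pipeline; the parameter names beyond gstreamer_pipeline itself and the default values are assumptions:

```python
# Hedged reconstruction of the truncated gstreamer_pipeline() above.
# framerate defaults to 20, i.e. the advertised 21 fps for the
# 3840x2160 mode minus 1, which is the black-border workaround.
def gstreamer_pipeline(
    sensor_id=0,
    capture_width=3840,
    capture_height=2160,
    display_width=960,
    display_height=540,
    framerate=20,  # advertised 21 fps minus 1
    flip_method=0,
):
    return (
        f"nvarguscamerasrc sensor-id={sensor_id} ! "
        f"video/x-raw(memory:NVMM), width={capture_width}, "
        f"height={capture_height}, framerate={framerate}/1 ! "
        f"nvvidconv flip-method={flip_method} ! "
        f"video/x-raw, width={display_width}, height={display_height}, "
        "format=BGRx ! videoconvert ! video/x-raw, format=BGR ! appsink"
    )
```

The resulting string can be passed to cv2.VideoCapture(..., cv2.CAP_GSTREAMER) as in NVIDIA's sample code.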


The data rate is too high, so this can happen. Reducing the frame rate is also our strategy for solving this problem.

But this black-border problem should not occur. I will try it later.

Can you explain it in detail? If there is a problem, I will update it.

Not at the moment, no.

For these questions you need to contact [email protected].

I also welcome you to become our partner, but I will not discuss matters related to cooperation here. You can contact [email protected] about that, including the two questions mentioned above; if you have ideas, you can state them in the email.

In the forum, I will mainly take care of your technical questions.


A follow-up on my 2328x1748 results on the Nano:

2328x1748 cannot meet the Jetson’s alignment requirements, so it cannot be used on the Jetson.

I have sent you an email, kindly check.


If you use X11 forwarding, the ROI window needs a delay to avoid an error. I am not sure whether the frame or cv2.imshow is the culprit.

def statsThread(camera, focuser: Focuser, focusState: FocusState):
    maxPosition = focuser.opts[focuser.OPT_FOCUS]["MAX_VALUE"]
    lastPosition = 0
    focuser.set(Focuser.OPT_FOCUS, lastPosition) # init position
    lastTime = time.time()

    sharpnessList = []

    # Known issue due to X11 forwarding.
    # the error: can't create 2 new windows at the same time.
    # solution: delay imshow("ROI") by 1 second.   
    while not focusState.isFinish():
        frame = camera.getFrame(1000)
        if frame is None:
            continue

        roi_frame = getROIFrame(focusState.roi, frame)

        if focusState.verbose:
            # Known issue due to X11 forwarding
            # the error: can't update 2 new windows at the same time.
            # solution: delay imshow("ROI") by 100ms.
            cv2.imshow("ROI", roi_frame)
            cv2.waitKey(100)  # the 100 ms delay described above

You have to add a delay before updating the ROI window the last time; otherwise, you are more likely to get a blurry frame.

    # End of stats.
    focusState.sharpnessList.put((-1, -1))

    if focusState.verbose:
        for sharpness in sharpnessList:
            print(sharpness)
        time.sleep(1)  # delay before the final ROI update to avoid a blurry frame
        frame = camera.getFrame(1000)
        roi_frame = getROIFrame(focusState.roi, frame)
        cv2.imshow("ROI", roi_frame)
        print("stats done.")

If the app crashes, you need to reboot the computer. The solution is to close the camera properly: in C++ you can use a destructor; in Python, you can use atexit.

if __name__ == "__main__":
    args = parse_cmdline()
    imageCount = args.image_count
    if args.i2c_bus == 7:
        sensor_id = 0
    elif args.i2c_bus == 8:
        sensor_id = 1
    else:
        raise ValueError("The I2C bus is not supported")
    camera = Camera(sensor_id=sensor_id)
    # close the camera when an unknown error occurs.

This is what happens if there is no atexit handler. I am not sure how to fix this. Do you have documentation regarding i2cset for the IMX519?

nvbuf_utils: Could not get EGL display connection
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:751 Failed to create CaptureSession
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (933) open OpenCV | GStreamer warning: Cannot query video position: status=0, value=-1, duration=-1
Error: Write failed
write: 0
Error: Write failed
i2cset -y 7 0xc 0x0 0x0
Error: Write failed
i2cset -y 7 0xc 0x1 0x0
Traceback (most recent call last):
  File "", line 58, in <module>
    cv2.imshow(winName, frame)
cv2.error: OpenCV(4.1.1) /home/nvidia/host/build_opencv/nv_opencv/modules/highgui/src/window.cpp:352: error: (-215:Assertion failed) size.width>0 && size.height>0 in function 'imshow'


The problem does exist, and the workaround works. You can submit a PR and I’ll merge it.

I haven’t encountered this error; can you tell me how to reproduce it?
Or you can submit a PR, and if it doesn’t affect usage, I’ll agree to merge it too.

Thanks for reporting bugs in our code, it’s very helpful.

And I’m sorry: I don’t have permission to share any manuals; they belong to the company. If you need a manual, you can raise it in the email.