Camera output as dmabuf?

1. Which seller did you purchase the product(s) from?

2. The model number of the product(s) you have purchased?
3. Which platform are you using the product(s) on?
Raspberry Pi CM4
4. Which instructions are you following?

5. Has your product ever worked properly?

6. What problems are you experiencing?
Is there any way to access the camera output as a dmabuf, so I can just pass a GPU memory handle to the next processing software?

Our use case involves passing data from the Arducam to an OpenGL processing program, then to the Raspberry Pi's JPEG encoder, then to another output program.

We would like to eliminate as much CPU copying as possible, as it's a big bottleneck in our pipeline (>90% of the time the program is copying pixel data to and from OpenGL!).

What we would like to achieve is similar to what the Raspberry Pi developers did in this repo, but of course with the data coming from the Arducam instead of the H.264 decoder.

Relevant lines in the repo, for your reference:

Binding textures to an EGLImage constructed from dmabuf:
Obtaining dmabuf handles:

7. What attempts at troubleshooting have you already made?

8. How would you like us to help you?

Realized I didn't fill this one in:

8. How would you like us to help you?
Here is the answer:

I would like to know ways of obtaining dmabuf handles for the camera capture. AFAIK, the only way to obtain camera captures is through callback functions that deliver the data in CPU memory. Is there any way to get handles to GPU memory, and are there any plans to implement this at ArduCam?


I have not tested OpenGL yet. I am not sure how to connect it with our driver.


Hello bin,

For this application, I don't need OpenGL integration from you. I just need an MMAL buffer (dmabuf) as provided by the RPi; OpenGL can then bind to the dmabuf through an extension.


I know what you need. We have a demo which hands a data buffer to the user on each frame interrupt; you can get the buffer in the callback function. Maybe you can refer to it.

Feel free to let me know if you need more help. Also, can you share your source code?

I am not very familiar with OpenGL.



I am looking for Linux DMABUF support as well. This will become more useful over time, as apps like OBS Studio and others use the GPU for video processing instead of the CPU. If there were an (open source) V4L2 kernel driver for the Arducam USB 3 Plus board, for example, then this would be possible. The existing closed-source ArduCam "SDK" is a userspace driver and won't be able to support DMABUF.

Hi @edgecase

Can you send an email to [email protected]? Maybe we can organize an online meeting to discuss this issue.

I would love to know the solution too

(or the discussion to approach the problem at least)

Hello @mendoku-anson

It seems that your problem is different from edgecase's.
You are using the OV2311 on a Raspberry Pi.
One solution to your problem is to use the OV2311 with the Jetvariety adapter board and its V4L2 driver.

You mean there are actually ways to access the dmabuf directly in OpenGL for GPU processing?

The OV2311 on a Raspberry Pi can use the standard V4L2 driver. Whether dmabuf can be used depends on whether the V4L2 framework provides the related functions.

I don't know much about OpenGL, but the dmabuf currently provided by V4L2 can be sent directly to the Raspberry Pi's ISP (no need to copy).

Oh, the ISP… thanks for the pointer, I will look further in this direction.

If you are familiar with C++, you can refer to libcamera, but it is a bit complicated.
