V4L2 UVC camera
-
I am trying to get a camera stream from an OV5640 camera (DELOCK 96368) that is connected over USB to the Omega2 using UVC.
With the provided mjpg-streamer example it works nicely, streaming about 20-30 fps at full HD. Now I want to use the camera in my C++ application, so I am using the v4l2 API. I looked at different examples and also at the mjpg-streamer source code, but I could not find a solution.
My program works: it detects the camera and also gets images, but only at 10 fps.
I am setting the image format; it is supported by the camera, and the v4l2 API reports it as available:

```cpp
v4l2_format imageFormat;
imageFormat.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
imageFormat.fmt.pix.width = 640;
imageFormat.fmt.pix.height = 480;
imageFormat.fmt.pix.pixelformat = V4L2_PIX_FMT_MJPEG;
imageFormat.fmt.pix.field = V4L2_FIELD_ANY;
// tell the device you are using this format
if (ioctl(m_fd, VIDIOC_S_FMT, &imageFormat) < 0) {
    perror("Device could not set format, VIDIOC_S_FMT");
    return -1;
}
```
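One thing worth checking here (a sketch, not code from my application; the helper name `set_and_verify_mjpeg` is made up): `VIDIOC_S_FMT` is allowed to silently adjust the requested fields to the nearest supported values, so only reading the format back with `VIDIOC_G_FMT` shows what the driver actually negotiated. If the driver fell back to an uncompressed format, the USB bandwidth alone could cap the frame rate.

```cpp
#include <cstdio>
#include <cstring>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

// Hypothetical helper: request width x height MJPEG, then read the format
// back. VIDIOC_S_FMT may silently change width/height/pixelformat, so only
// VIDIOC_G_FMT tells us what the driver really accepted.
static bool set_and_verify_mjpeg(int fd, unsigned width, unsigned height) {
    v4l2_format fmt;
    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width = width;
    fmt.fmt.pix.height = height;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_MJPEG;
    fmt.fmt.pix.field = V4L2_FIELD_ANY;
    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) {
        perror("VIDIOC_S_FMT");
        return false;
    }
    // read back the negotiated format
    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (ioctl(fd, VIDIOC_G_FMT, &fmt) < 0) {
        perror("VIDIOC_G_FMT");
        return false;
    }
    unsigned p = fmt.fmt.pix.pixelformat;
    printf("negotiated: %ux%u fourcc %c%c%c%c\n",
           fmt.fmt.pix.width, fmt.fmt.pix.height,
           p & 0xff, (p >> 8) & 0xff, (p >> 16) & 0xff, (p >> 24) & 0xff);
    return p == V4L2_PIX_FMT_MJPEG;
}
```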
I am setting the fps manually:
```cpp
struct v4l2_streamparm setfps;
memset(&setfps, 0, sizeof(struct v4l2_streamparm));
setfps.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
setfps.parm.capture.timeperframe.numerator = 1;
setfps.parm.capture.timeperframe.denominator = 30;
ret = ioctl(m_fd, VIDIOC_S_PARM, &setfps);
printf("set fps ret: %d, framerate set to %d/%d\n", ret,
       setfps.parm.capture.timeperframe.numerator,
       setfps.parm.capture.timeperframe.denominator);
```
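A driver is only required to honour `VIDIOC_S_PARM` when it reports the `V4L2_CAP_TIMEPERFRAME` capability, so besides checking the return value it may be worth reading the rate back with `VIDIOC_G_PARM`. A sketch under that assumption (`read_back_fps` is a hypothetical helper name):

```cpp
#include <cstdio>
#include <cstring>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

// Hypothetical helper: returns the frame rate the driver reports via
// VIDIOC_G_PARM, or -1.0 on error / if frame-rate control is unsupported.
static double read_back_fps(int fd) {
    v4l2_streamparm parm;
    memset(&parm, 0, sizeof(parm));
    parm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (ioctl(fd, VIDIOC_G_PARM, &parm) < 0) {
        perror("VIDIOC_G_PARM");
        return -1.0;
    }
    if (!(parm.parm.capture.capability & V4L2_CAP_TIMEPERFRAME)) {
        fprintf(stderr, "driver ignores timeperframe, S_PARM is a no-op\n");
        return -1.0;
    }
    const v4l2_fract &tpf = parm.parm.capture.timeperframe;
    if (tpf.numerator == 0)
        return -1.0;
    return (double)tpf.denominator / tpf.numerator; // e.g. 30/1 -> 30 fps
}
```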
Setting the buffer to be memory mapped:
```cpp
v4l2_requestbuffers requestBuffer = {0};
requestBuffer.count = 1; // one request buffer
// request a buffer which we can use for capturing frames
requestBuffer.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
requestBuffer.memory = V4L2_MEMORY_MMAP;
if (ioctl(m_fd, VIDIOC_REQBUFS, &requestBuffer) < 0) {
    printf("Could not request buffer from device, VIDIOC_REQBUFS");
    return 1;
}

v4l2_buffer queryBuffer = {0};
queryBuffer.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
queryBuffer.memory = V4L2_MEMORY_MMAP;
queryBuffer.index = 0;
if (ioctl(m_fd, VIDIOC_QUERYBUF, &queryBuffer) < 0) {
    perror("Device did not return the buffer information, VIDIOC_QUERYBUF");
    return 1;
}

// use a pointer to point to the newly created buffer
// mmap() will map the memory address of the device to
// an address in memory
m_buffer = (char*)mmap(NULL, queryBuffer.length, PROT_READ | PROT_WRITE,
                       MAP_SHARED, m_fd, queryBuffer.m.offset);
if (m_buffer == MAP_FAILED) {
    perror("mmap");
    return 1;
}
memset(m_buffer, 0, queryBuffer.length);
```
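With `requestBuffer.count = 1` there is only ever one buffer in flight, so the driver cannot fill frame N+1 while frame N is being handled, which serializes capture and processing. mjpg-streamer requests several buffers. A sketch of requesting and pre-queueing a small ring of buffers (the count of 4 is an arbitrary choice, the driver may grant fewer, and `setup_buffers`/`MappedBuffer` are names I made up):

```cpp
#include <cstdio>
#include <cstring>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <linux/videodev2.h>

struct MappedBuffer {
    void  *start;
    size_t length;
};

// Hypothetical helper: request `count` mmap buffers, map each one and
// queue it, so the driver always has a spare buffer to fill.
// Returns the number of buffers actually granted, or -1 on error.
static int setup_buffers(int fd, unsigned count, MappedBuffer *out) {
    v4l2_requestbuffers req;
    memset(&req, 0, sizeof(req));
    req.count = count;                        // ask for a small ring, not 1
    req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;
    if (ioctl(fd, VIDIOC_REQBUFS, &req) < 0) {
        perror("VIDIOC_REQBUFS");
        return -1;
    }
    for (unsigned i = 0; i < req.count; ++i) {
        v4l2_buffer buf;
        memset(&buf, 0, sizeof(buf));
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = i;
        if (ioctl(fd, VIDIOC_QUERYBUF, &buf) < 0) {
            perror("VIDIOC_QUERYBUF");
            return -1;
        }
        out[i].length = buf.length;
        out[i].start = mmap(NULL, buf.length, PROT_READ | PROT_WRITE,
                            MAP_SHARED, fd, buf.m.offset);
        if (out[i].start == MAP_FAILED) {
            perror("mmap");
            return -1;
        }
        if (ioctl(fd, VIDIOC_QBUF, &buf) < 0) { // pre-queue before STREAMON
            perror("VIDIOC_QBUF");
            return -1;
        }
    }
    return (int)req.count;
}
```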
Then I start streaming once (code not shown here), and then I queue and dequeue the buffer:
```cpp
// queue
if (ioctl(m_fd, VIDIOC_QBUF, &bufferinfo) < 0) {
    perror("Could not queue buffer, VIDIOC_QBUF");
    return 1;
}

// dequeue
if (ioctl(m_fd, VIDIOC_DQBUF, &bufferinfo) < 0) {
    perror("Could not dequeue the buffer, VIDIOC_DQBUF");
    return 1;
}
```
I timed the two calls: dequeuing takes about 100 ms, which explains the 10 fps. But no matter what I change, it does not get any faster.
The mjpg-streamer uses almost the same API calls; one difference is this:

```cpp
/*
 * recent linux-uvc driver (revision > ~#125) requires to use dynctrls
 * for pan/tilt/focus/...
 * dynctrls must get initialized
 */
if (dynctrls)
    initDynCtrls(cams[id].videoIn->fd);
// enumerate V4L2 controls after UVC extended mapping
enumerateControls(cams[id].videoIn, cams[id].pglobal, id);
```
But I think that's only for advanced controls.