CVPixelBuffer Questions

0

So I'm trying to record videos using AVAssetWriter, process each camera frame (e.g. adding a watermark or text overlay) by creating a CIImage from the camera buffer (CVImageBuffer), add some filter...
Hierogram asked 9/10 at 11:3
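The usual pattern for this kind of pipeline is to wrap the camera buffer in a CIImage, composite the overlay, and render the result back into the same CVPixelBuffer with a reusable CIContext before appending it to the writer. A minimal sketch, assuming the watermark is already available as a CIImage:

```swift
import AVFoundation
import CoreImage

// Reuse one CIContext for the whole recording session; creating one per frame is expensive.
let ciContext = CIContext()

/// Composites a watermark CIImage over a camera frame and writes the result back
/// into the frame's own pixel buffer, ready to be appended to an
/// AVAssetWriterInputPixelBufferAdaptor with the original presentation timestamp.
func process(_ sampleBuffer: CMSampleBuffer, watermark: CIImage) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    let frame = CIImage(cvPixelBuffer: pixelBuffer)
    let composited = watermark.composited(over: frame)   // apply any CIFilter chain here as well

    ciContext.render(composited, to: pixelBuffer)          // render in place
}
```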

1

Solved

I have a project where I use ScreenCaptureKit. For various reasons out of the scope of the question, the format that I configure ScreenCaptureKit to use is kCVPixelFormatType_32BGRA -- I need the r...
Dorladorlisa asked 2/2, 2023 at 19:31
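For a 32BGRA buffer the bytes can be read directly once the base address is locked; the main trap is that bytesPerRow usually exceeds width * 4 because of row padding. A small sketch of the access pattern, not tied to ScreenCaptureKit specifically:

```swift
import CoreVideo

/// Gives read-only access to the raw bytes of a kCVPixelFormatType_32BGRA buffer.
/// Rows must be indexed via bytesPerRow, not width * 4, because of padding.
func withBGRABytes(of pixelBuffer: CVPixelBuffer,
                   _ body: (_ base: UnsafePointer<UInt8>, _ bytesPerRow: Int, _ width: Int, _ height: Int) -> Void) {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
    body(base.assumingMemoryBound(to: UInt8.self),
         CVPixelBufferGetBytesPerRow(pixelBuffer),
         CVPixelBufferGetWidth(pixelBuffer),
         CVPixelBufferGetHeight(pixelBuffer))
}
```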

1

Solved

Is there a standard performant way to edit/draw on a CVImageBuffer/CVPixelBuffer in swift? All the video editing demos I've found online overlay the drawing (rectangles or text) on the screen and d...
Jaundice asked 18/3, 2022 at 2:15
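One performant option is to wrap the buffer's memory in a CGContext and draw into it directly, so the pixel data itself is modified rather than an on-screen overlay. A sketch assuming a 32BGRA buffer:

```swift
import CoreGraphics
import CoreVideo

/// Wraps a BGRA pixel buffer's memory in a CGContext so shapes and text can be
/// drawn straight into the pixel data. Other pixel formats need different bitmap info.
func draw(on pixelBuffer: CVPixelBuffer, _ drawing: (CGContext) -> Void) {
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    guard let context = CGContext(
        data: CVPixelBufferGetBaseAddress(pixelBuffer),
        width: CVPixelBufferGetWidth(pixelBuffer),
        height: CVPixelBufferGetHeight(pixelBuffer),
        bitsPerComponent: 8,
        bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue | CGBitmapInfo.byteOrder32Little.rawValue
    ) else { return }

    // Core Graphics uses a bottom-left origin; flip the CTM if top-left coordinates are needed.
    drawing(context)   // e.g. context.setFillColor(...); context.fill(CGRect(...))
}
```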

4

I have an IOSurface-backed CVPixelBuffer that is getting updated from an outside source at 30fps. I want to render a preview of the image data in an NSView -- what's the best way for me to do that?...
Darrondarrow asked 16/11, 2020 at 19:24
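Since the buffer is IOSurface-backed, one lightweight approach is to hand the IOSurface straight to a layer-backed NSView as its layer contents, avoiding any per-frame NSImage conversion. Treat this as a sketch of one option to verify (AVSampleBufferDisplayLayer or a Metal view are alternatives):

```swift
import AppKit
import CoreVideo

final class PixelBufferPreviewView: NSView {
    override init(frame frameRect: NSRect) {
        super.init(frame: frameRect)
        wantsLayer = true                       // back the view with a CALayer we can feed directly
    }
    required init?(coder: NSCoder) {
        super.init(coder: coder)
        wantsLayer = true
    }

    /// Assigns the buffer's IOSurface as the layer contents. For a 30 fps source,
    /// call this on the main thread whenever the buffer is updated.
    func display(_ pixelBuffer: CVPixelBuffer) {
        guard let surface = CVPixelBufferGetIOSurface(pixelBuffer)?.takeUnretainedValue() else { return }
        layer?.contents = surface
    }
}
```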

2

Solved

I'm building a video export feature for one of my apps. In essence, the video is a series of one of six different images lasting for different (short) durations. The export works fine when I expo...
Gabelle asked 22/12, 2015 at 20:53
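The usual shape of such an exporter is an AVAssetWriterInputPixelBufferAdaptor that appends one pixel buffer per image, with the presentation time advanced by each slide's duration. A rough sketch, assuming the writer session is already started and `makeBuffer` is a caller-supplied CGImage-to-CVPixelBuffer converter (ideally drawing into buffers from the adaptor's pool):

```swift
import AVFoundation
import CoreGraphics

/// Appends one frame per (image, duration) pair. The presentation time of each
/// frame marks where that slide starts; the next append implicitly ends it.
func append(slides: [(image: CGImage, duration: CMTime)],
            adaptor: AVAssetWriterInputPixelBufferAdaptor,
            input: AVAssetWriterInput,
            makeBuffer: (CGImage) -> CVPixelBuffer?) {
    var time = CMTime.zero
    for slide in slides {
        // Crude back-pressure; requestMediaDataWhenReady(on:using:) is the more robust approach.
        while !input.isReadyForMoreMediaData { usleep(10_000) }
        if let buffer = makeBuffer(slide.image) {
            adaptor.append(buffer, withPresentationTime: time)
        }
        time = CMTimeAdd(time, slide.duration)
    }
}
```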

1

I am making a Swift video app. In my app, I need to crop and horizontally flip a CVPixelBuffer and return a result whose type is also CVPixelBuffer. I tried a few things. First, I used 'CVPixelBuf...
Shopwindow asked 21/3, 2019 at 18:36
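One way to return a new CVPixelBuffer is to do the crop and mirror in Core Image and render the result into a freshly created buffer. A sketch, with the output format (32BGRA) as an assumption:

```swift
import CoreImage
import CoreVideo

let cropContext = CIContext()

/// Crops `pixelBuffer` to `rect`, mirrors it horizontally, and renders the result
/// into a new BGRA pixel buffer of the cropped size.
func cropAndFlip(_ pixelBuffer: CVPixelBuffer, to rect: CGRect) -> CVPixelBuffer? {
    var image = CIImage(cvPixelBuffer: pixelBuffer).cropped(to: rect)
    // Move the cropped region to the origin, then mirror it about its vertical axis.
    image = image.transformed(by: CGAffineTransform(translationX: -rect.origin.x, y: -rect.origin.y))
    image = image.transformed(by: CGAffineTransform(scaleX: -1, y: 1).translatedBy(x: -rect.width, y: 0))

    var output: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, Int(rect.width), Int(rect.height),
                        kCVPixelFormatType_32BGRA, nil, &output)
    guard let result = output else { return nil }
    cropContext.render(image, to: result)
    return result
}
```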

1

How do I draw into a CIImage (or maybe into a CVPixelBuffer, but I guess it's easier to add text to a CIImage), not into a UIImage? I record video (.mp4 file) using AVAssetWriter and CMSampleBuffer (from video, ...
Bronk asked 1/3, 2018 at 11:34
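Text can be produced directly as a CIImage with the built-in CITextImageGenerator filter (iOS 11+) and composited over the frame, so no UIImage round-trip is needed; the font and position below are arbitrary examples:

```swift
import CoreImage

/// Renders a text label as a CIImage and composites it over a video frame.
func addText(_ text: String, to frame: CIImage) -> CIImage {
    let generator = CIFilter(name: "CITextImageGenerator", parameters: [
        "inputText": text,
        "inputFontName": "HelveticaNeue-Bold",
        "inputFontSize": 36,
        "inputScaleFactor": 1
    ])
    guard let textImage = generator?.outputImage else { return frame }

    // Place the label near the bottom-left corner, then composite over the frame.
    let positioned = textImage.transformed(by: CGAffineTransform(translationX: 20, y: 20))
    return positioned.composited(over: frame)
}
```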

1

I'm getting the depth data from the TrueDepth camera, and converting it to a grayscale image. (I realize I could pass the AVDepthData to a CIImage constructor, however, for testing purposes, I want...
Justinajustine asked 9/12, 2020 at 3:14
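Doing the conversion by hand means reading the Float32 depth values out of the buffer and mapping them into an 8-bit range. A sketch assuming kCVPixelFormatType_DepthFloat32 and an arbitrary 0–5 m depth range:

```swift
import CoreVideo
import CoreGraphics
import Foundation

/// Converts a DepthFloat32 buffer to an 8-bit grayscale CGImage by linearly
/// mapping [minDepth, maxDepth] metres to [0, 255].
func grayscaleImage(from depthBuffer: CVPixelBuffer,
                    minDepth: Float = 0, maxDepth: Float = 5) -> CGImage? {
    CVPixelBufferLockBaseAddress(depthBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthBuffer, .readOnly) }

    let width = CVPixelBufferGetWidth(depthBuffer)
    let height = CVPixelBufferGetHeight(depthBuffer)
    let rowStride = CVPixelBufferGetBytesPerRow(depthBuffer) / MemoryLayout<Float32>.stride
    guard let base = CVPixelBufferGetBaseAddress(depthBuffer)?
        .assumingMemoryBound(to: Float32.self) else { return nil }

    var pixels = [UInt8](repeating: 0, count: width * height)
    for y in 0..<height {
        for x in 0..<width {
            let depth = base[y * rowStride + x]
            let normalized = depth.isFinite ? (depth - minDepth) / (maxDepth - minDepth) : 0
            pixels[y * width + x] = UInt8(max(0, min(255, normalized * 255)))
        }
    }

    guard let provider = CGDataProvider(data: Data(pixels) as CFData) else { return nil }
    return CGImage(width: width, height: height, bitsPerComponent: 8, bitsPerPixel: 8,
                   bytesPerRow: width, space: CGColorSpaceCreateDeviceGray(),
                   bitmapInfo: CGBitmapInfo(rawValue: 0), provider: provider,
                   decode: nil, shouldInterpolate: false, intent: .defaultIntent)
}
```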

2

Solved

A CVPixelBuffer object has one or many planes (reference). We have methods to get the number, height, and base address of a plane. So what exactly is a plane? And how is it constructed inside a CVPixelB...
Elah asked 1/11, 2016 at 2:1
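In short, a plane is one of the separately stored component images inside a planar pixel buffer: for the common 420YpCbCr8BiPlanar formats, plane 0 is the full-resolution luma (Y) and plane 1 is the half-resolution interleaved chroma (CbCr), each with its own base address and row stride. A small inspection sketch:

```swift
import CoreVideo

/// Prints the layout of each plane; a non-planar buffer (e.g. BGRA) reports zero planes.
func describePlanes(of pixelBuffer: CVPixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let planeCount = CVPixelBufferGetPlaneCount(pixelBuffer)
    guard planeCount > 0 else {
        print("non-planar: \(CVPixelBufferGetWidth(pixelBuffer))x\(CVPixelBufferGetHeight(pixelBuffer))")
        return
    }
    for plane in 0..<planeCount {
        print("plane \(plane):",
              "\(CVPixelBufferGetWidthOfPlane(pixelBuffer, plane))x\(CVPixelBufferGetHeightOfPlane(pixelBuffer, plane)),",
              "bytesPerRow \(CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, plane))")
    }
}
```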

3

Solved

I am using the iPhone camera to capture live video and feeding the pixel buffer to a network that does some object recognition. Here is the relevant code: (I won't post the code for setting up the ...
Latonia asked 25/4, 2016 at 10:54
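For reference, one common modern way to hand the capture delegate's pixel buffer to a recognition network is via Vision and Core ML; the original question may use a different inference API, so treat this only as an illustrative sketch:

```swift
import AVFoundation
import Vision

/// Runs a Core ML model over the camera frame contained in a CMSampleBuffer,
/// as delivered to captureOutput(_:didOutput:from:).
func recognizeObjects(in sampleBuffer: CMSampleBuffer, with model: VNCoreMLModel) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Inspect classification / detection results here.
        print(request.results ?? [])
    }
    // Orientation matters: raw camera buffers are typically landscape-right.
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .right, options: [:])
    try? handler.perform([request])
}
```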

3

Solved

I am recording filtered video through an iPhone camera, and there is a huge increase in CPU usage when converting a CIImage to a UIImage in real time while recording. My buffer function to make a C...
Gitt asked 24/1, 2019 at 19:36
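The CIImage → CGImage → UIImage round-trip on every frame is usually what burns the CPU. A sketch of the common fix: create one Metal-backed CIContext up front and render the filtered CIImage straight back into the pixel buffer (assumes a Metal-capable device):

```swift
import CoreImage
import Metal

// Create the CIContext once and reuse it; building a context (or converting to
// UIImage) per frame is the usual cause of the CPU spike while recording.
let metalDevice = MTLCreateSystemDefaultDevice()!
let sharedContext = CIContext(mtlDevice: metalDevice)

/// Applies a filter and writes the result straight back into the pixel buffer,
/// skipping the CGImage/UIImage conversion entirely.
func applyFilter(_ filter: CIFilter, to pixelBuffer: CVPixelBuffer) {
    filter.setValue(CIImage(cvPixelBuffer: pixelBuffer), forKey: kCIInputImageKey)
    guard let output = filter.outputImage else { return }
    sharedContext.render(output, to: pixelBuffer)
}
```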

2

Solved

I'm having a great deal of difficulty coming up with code that reliably copies a CVPixelBuffer on any iOS device. My first attempt worked fine until I tried it on an iPad Pro: extension CVPixelBuf...
Oversleep asked 3/11, 2018 at 15:11
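The device-dependent failures almost always come from assuming bytesPerRow == width * bytesPerPixel; on some hardware (the iPad Pro included) rows are padded, so a copy has to go row by row using each buffer's own stride. A plane-aware sketch:

```swift
import CoreVideo
import Foundation

/// Deep-copies a pixel buffer row by row. A single memcpy of the whole buffer breaks
/// when source and destination bytesPerRow differ because of row padding.
func duplicate(_ source: CVPixelBuffer) -> CVPixelBuffer? {
    var copyOut: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(source),
                        CVPixelBufferGetHeight(source),
                        CVPixelBufferGetPixelFormatType(source),
                        nil,                       // attachment propagation omitted for brevity
                        &copyOut)
    guard let copy = copyOut else { return nil }

    CVPixelBufferLockBaseAddress(source, .readOnly)
    CVPixelBufferLockBaseAddress(copy, [])
    defer {
        CVPixelBufferUnlockBaseAddress(copy, [])
        CVPixelBufferUnlockBaseAddress(source, .readOnly)
    }

    func copyRows(src: UnsafeRawPointer, dst: UnsafeMutableRawPointer,
                  srcStride: Int, dstStride: Int, height: Int) {
        for row in 0..<height {
            memcpy(dst + row * dstStride, src + row * srcStride, min(srcStride, dstStride))
        }
    }

    let planes = CVPixelBufferGetPlaneCount(source)
    if planes == 0 {
        // Non-planar buffer (e.g. BGRA): a single base address.
        guard let src = CVPixelBufferGetBaseAddress(source),
              let dst = CVPixelBufferGetBaseAddress(copy) else { return nil }
        copyRows(src: src, dst: dst,
                 srcStride: CVPixelBufferGetBytesPerRow(source),
                 dstStride: CVPixelBufferGetBytesPerRow(copy),
                 height: CVPixelBufferGetHeight(source))
    } else {
        for plane in 0..<planes {
            guard let src = CVPixelBufferGetBaseAddressOfPlane(source, plane),
                  let dst = CVPixelBufferGetBaseAddressOfPlane(copy, plane) else { continue }
            copyRows(src: src, dst: dst,
                     srcStride: CVPixelBufferGetBytesPerRowOfPlane(source, plane),
                     dstStride: CVPixelBufferGetBytesPerRowOfPlane(copy, plane),
                     height: CVPixelBufferGetHeightOfPlane(source, plane))
        }
    }
    return copy
}
```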

1

I am using AVCaptureSession & AVCapturePhotoOutput to capture RAW photo data from the device's camera in the kCVPixelFormatType_14Bayer_RGGB format. I have got as far as getting the raw photo samp...
Hughie asked 25/5, 2017 at 16:26

4

Solved

How can I get the RGB (or any other format) pixel value from a CVPixelBufferRef? I've tried many approaches but no success yet. func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleB...
Compton asked 2/1, 2016 at 19:21
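For a 32BGRA buffer the value of a single pixel can be read by locking the base address and indexing with bytesPerRow; note the in-memory channel order is B, G, R, A. A minimal sketch:

```swift
import CoreVideo

/// Reads the colour of one pixel from a kCVPixelFormatType_32BGRA buffer.
func pixelColor(in pixelBuffer: CVPixelBuffer, x: Int, y: Int) -> (r: UInt8, g: UInt8, b: UInt8, a: UInt8)? {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard x >= 0, y >= 0,
          x < CVPixelBufferGetWidth(pixelBuffer),
          y < CVPixelBufferGetHeight(pixelBuffer),
          let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }

    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let pixel = base.advanced(by: y * bytesPerRow + x * 4).assumingMemoryBound(to: UInt8.self)
    return (r: pixel[2], g: pixel[1], b: pixel[0], a: pixel[3])
}
```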

5

I'm trying to get a CVPixelBuffer in RGB color space from the Apple's ARKit. In func session(_ session: ARSession, didUpdate frame: ARFrame) method of ARSessionDelegate I get an instance of ARFrame...
Electroluminescence asked 7/6, 2017 at 14:53
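ARFrame.capturedImage is delivered as bi-planar YCbCr, so one straightforward conversion is to let a CIContext do the colour-space work while rendering into a BGRA buffer. A sketch:

```swift
import ARKit
import CoreImage

let conversionContext = CIContext()

/// Converts an ARFrame's capturedImage (bi-planar YCbCr) into a 32BGRA pixel buffer.
/// CIContext performs the colour conversion when it renders into the BGRA target.
func bgraPixelBuffer(from frame: ARFrame) -> CVPixelBuffer? {
    let source = frame.capturedImage
    var output: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(source),
                        CVPixelBufferGetHeight(source),
                        kCVPixelFormatType_32BGRA,
                        nil,
                        &output)
    guard let target = output else { return nil }
    conversionContext.render(CIImage(cvPixelBuffer: source), to: target)
    return target
}
```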

3

I need to create a copy of a CVPixelBufferRef in order to be able to manipulate the original pixel buffer in a bit-wise fashion using the values from the copy. I cannot seem to achieve this with CV...
Velour asked 24/5, 2016 at 15:58

3

I get pixels via an OpenGL ES call (glReadPixels) or another way, then create a CVPixelBuffer (with or without a CGImage) for video recording, but the final picture is distorted. This happens on an iPhone 6 when...
Snout asked 7/5, 2016 at 3:50

1

I have a CVPixelBuffer that I'm trying to efficiently draw on screen. The inefficient way of turning it into an NSImage works but is very slow, dropping about 40% of my frames. Therefore, I've tri...
Beardless asked 19/9, 2017 at 12:48

3

I have a temporary variable tmpPixelBuffer with pixel buffer data, which is not nil, and when metadata objects are detected I want to create an image from that buffer, so I can crop metadata images fr...
Inept asked 31/3, 2015 at 18:8

1

Solved

I am working on a function in my app to write images from my sample buffer to an AVAssetWriter. Curiously, this works fine on a 10.5" iPad Pro, but causes a crash on a 7.9" iPad Mini 2. I can't fat...
Panchromatic asked 15/2, 2018 at 22:29

0

I am currently attempting to change the orientation of a CMSampleBuffer by first converting it to a CVPixelBuffer and then using vImageRotate90_ARGB8888 to convert the buffer. The problem with my c...
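For what it's worth, a working vImageRotate90_ARGB8888 call needs the destination dimensions swapped and both buffers described with their real row strides; a sketch for a 32BGRA buffer (channel order does not matter to the rotation):

```swift
import Accelerate
import CoreVideo

/// Rotates a 32BGRA pixel buffer 90 degrees clockwise with vImage.
func rotate90(_ source: CVPixelBuffer) -> CVPixelBuffer? {
    CVPixelBufferLockBaseAddress(source, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(source, .readOnly) }

    let width = CVPixelBufferGetWidth(source)
    let height = CVPixelBufferGetHeight(source)

    var rotatedOut: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, height, width,   // width and height swap for 90/270 degrees
                        kCVPixelFormatType_32BGRA, nil, &rotatedOut)
    guard let rotated = rotatedOut else { return nil }

    CVPixelBufferLockBaseAddress(rotated, [])
    defer { CVPixelBufferUnlockBaseAddress(rotated, []) }

    var src = vImage_Buffer(data: CVPixelBufferGetBaseAddress(source),
                            height: vImagePixelCount(height),
                            width: vImagePixelCount(width),
                            rowBytes: CVPixelBufferGetBytesPerRow(source))
    var dst = vImage_Buffer(data: CVPixelBufferGetBaseAddress(rotated),
                            height: vImagePixelCount(width),
                            width: vImagePixelCount(height),
                            rowBytes: CVPixelBufferGetBytesPerRow(rotated))
    let backgroundColor: [UInt8] = [0, 0, 0, 0]

    let error = vImageRotate90_ARGB8888(&src, &dst,
                                        UInt8(kRotate90DegreesClockwise),
                                        backgroundColor,
                                        vImage_Flags(kvImageNoFlags))
    return error == kvImageNoError ? rotated : nil
}
```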

2

I'm trying to resize a CVPixelBuffer to a size of 128x128. I'm working with one that is 750x750. I'm currently using the CVPixelBuffer to create a new CGImage, which I resize then convert back into...
Marquittamarr asked 12/6, 2017 at 22:1
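The CGImage round-trip can be skipped by scaling the pixel data directly with vImage into a new 128x128 buffer; a sketch assuming 32BGRA input:

```swift
import Accelerate
import CoreVideo

/// Scales a 32BGRA pixel buffer to the target size with vImage.
func resize(_ source: CVPixelBuffer, toWidth targetWidth: Int, height targetHeight: Int) -> CVPixelBuffer? {
    CVPixelBufferLockBaseAddress(source, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(source, .readOnly) }

    var scaledOut: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, targetWidth, targetHeight,
                        kCVPixelFormatType_32BGRA, nil, &scaledOut)
    guard let scaled = scaledOut else { return nil }

    CVPixelBufferLockBaseAddress(scaled, [])
    defer { CVPixelBufferUnlockBaseAddress(scaled, []) }

    var src = vImage_Buffer(data: CVPixelBufferGetBaseAddress(source),
                            height: vImagePixelCount(CVPixelBufferGetHeight(source)),
                            width: vImagePixelCount(CVPixelBufferGetWidth(source)),
                            rowBytes: CVPixelBufferGetBytesPerRow(source))
    var dst = vImage_Buffer(data: CVPixelBufferGetBaseAddress(scaled),
                            height: vImagePixelCount(targetHeight),
                            width: vImagePixelCount(targetWidth),
                            rowBytes: CVPixelBufferGetBytesPerRow(scaled))

    let error = vImageScale_ARGB8888(&src, &dst, nil, vImage_Flags(kvImageHighQualityResampling))
    return error == kvImageNoError ? scaled : nil
}
```

For the case in the question the call would simply be `resize(buffer, toWidth: 128, height: 128)`.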

1

Solved

I want to detect a ball and have an AR model interact with it. I used OpenCV for ball detection and send the center of the ball, which I can use in hitTest to get coordinates in the sceneView. I have been converting...
Mast asked 26/1, 2018 at 4:58

1

Solved

I'm trying to add a B&W filter to the camera images of an ARSCNView and then render colored AR objects over it. I'm almost there with the following code added to the beginning of - (void)renderer:(...
Artieartifact asked 28/8, 2017 at 13:2

1

My camera app captures a photo, enhances it in a certain way, and saves it. To do so, I get the input image from the camera in the form of a CVPixelBuffer (wrapped in a CMSampleBuffer). I perform ...
Cass asked 11/7, 2017 at 15:34
