core-video Questions
3
Solved
So I've set up CVPixelBuffers and tied them to OpenGL FBOs successfully on iOS. But now trying to do the same on OS X has me snagged.
The textures from CVOpenGLTextureCacheCreateTextureFromImage r...
Kirwin asked 18/12, 2012 at 12:54
2
Solved
I'm trying to figure out how the ownership works with the function CVMetalTextureGetTexture:
CVMetalTextureRef textureRef;
// ... textureRef is created
id<MTLTexture> texture = CVMetalTextur...
Sluiter asked 6/11, 2017 at 16:3
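A minimal sketch of the ownership rule in question: CVMetalTextureGetTexture does not transfer ownership, so the MTLTexture is only valid while the backing CVMetalTexture is alive, and the ref must be retained until the GPU has finished with the texture. The helper below is illustrative; the cache, pixel buffer, and command buffer are assumed to exist already.

```swift
import Metal
import CoreVideo

func makeTexture(cache: CVMetalTextureCache,
                 pixelBuffer: CVPixelBuffer,
                 commandBuffer: MTLCommandBuffer) -> MTLTexture? {
    var cvTexture: CVMetalTexture?
    let status = CVMetalTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, cache, pixelBuffer, nil,
        .bgra8Unorm,
        CVPixelBufferGetWidth(pixelBuffer),
        CVPixelBufferGetHeight(pixelBuffer),
        0, &cvTexture)
    guard status == kCVReturnSuccess,
          let cvTexture = cvTexture,
          let texture = CVMetalTextureGetTexture(cvTexture) else { return nil }
    // Capture the CVMetalTexture in the completion handler so ARC keeps it
    // alive until this command buffer's GPU work has finished.
    commandBuffer.addCompletedHandler { _ in _ = cvTexture }
    return texture
}
```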
3
Solved
I am attempting to perform a deep clone of CMSampleBuffer to store the output of an AVCaptureSession. I am receiving the error kCMSampleBufferError_InvalidMediaFormat (OSStatus -12743) when I run th...
Blackguard asked 22/2, 2016 at 19:2
3
Solved
Is there any way using AVFoundation and CoreVideo to get color info, aperture and focal length values in real-time?
Let me explain. Say when I am shooting video I want to sample the color in a sma...
Undervalue asked 22/9, 2010 at 18:22
3
Solved
I have a project where I need to decode h264 video from a live network stream and eventually end up with a texture I can display in another framework (Unity3D) on iOS devices. I can successfully de...
Aureliaaurelian asked 20/10, 2015 at 19:14
1
Solved
In the UI of my iOS app, I display a complex hierarchy of CALayers. One of these layers is an AVPlayerLayer that displays a video with CIFilters applied in real time (using AVVideoComposition(asset:...
Piccadilly asked 24/10, 2019 at 10:12
0
I have been following Apple's live-stream camera editor code to get the hang of live video editing.
So far so good, but I need a way to crop a sample buffer into 4 pieces and then process...
Ashil asked 20/2, 2019 at 18:11
8
Solved
I'm accessing the camera in iOS and using session presets like so:
captureSession.sessionPreset = AVCaptureSessionPresetMedium;
Pretty standard stuff. However, I'd like to know ahead of time the r...
Kassab asked 17/10, 2011 at 7:6
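One way to learn the output resolution before any frames arrive is to inspect the capture device's active format after the preset has been applied; a sketch, assuming a configured session whose video input device is at hand:

```swift
import AVFoundation

// Read the dimensions the device will deliver for its current active format.
func activeResolution(of device: AVCaptureDevice) -> (width: Int32, height: Int32) {
    let desc = device.activeFormat.formatDescription
    let dims = CMVideoFormatDescriptionGetDimensions(desc)
    return (dims.width, dims.height)
}
```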
6
Solved
I'm sure something's wrong with my buffer attributes, but it's not clear to me what -- it's not well documented what's supposed to go there, so I'm guessing based on CVPixelBufferPoolCreate -- and ...
Bloodstain asked 27/4, 2011 at 21:43
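For reference, CVPixelBufferPoolCreate takes two attribute dictionaries: pool attributes (recycling behavior) and pixel buffer attributes (the format and size of the buffers the pool vends). A sketch with illustrative values:

```swift
import CoreVideo

let poolAttributes: [CFString: Any] = [
    kCVPixelBufferPoolMinimumBufferCountKey: 3
]
let pixelBufferAttributes: [CFString: Any] = [
    kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_32BGRA,
    kCVPixelBufferWidthKey: 1280,
    kCVPixelBufferHeightKey: 720,
    // An empty IOSurface dictionary requests IOSurface-backed buffers.
    kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary
]
var pool: CVPixelBufferPool?
let status = CVPixelBufferPoolCreate(kCFAllocatorDefault,
                                     poolAttributes as CFDictionary,
                                     pixelBufferAttributes as CFDictionary,
                                     &pool)
assert(status == kCVReturnSuccess)
```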
4
I'm recording video and audio using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput and in the captureOutput:didOutputSampleBuffer:fromConnection: delegate method, I want to draw text onto ea...
Goring asked 3/6, 2015 at 1:16
4
I am writing some video processing code using AVComposition. To give only the necessary background details: I receive a CVPixelBuffer from an Apple API that I do not control. This CVPixelBuffer, co...
Disannul asked 17/11, 2014 at 5:8
1
What's the difference between CVImageBuffer, which is returned by CMSampleBufferGetImageBuffer, and CVPixelBuffer? I just want to retrieve the image planes (RGBA) but I can't figure out how to retriev...
Ahl asked 30/3, 2018 at 4:55
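A short sketch of the relationship: CVPixelBuffer is the main-memory subtype of CVImageBuffer, so the buffer returned by CMSampleBufferGetImageBuffer can be used directly with the CVPixelBuffer accessors. For a non-planar (e.g. BGRA) buffer:

```swift
import CoreVideo

// Lock the buffer, hand the caller its base address and row stride,
// and unlock on exit. Planar formats would instead use the
// CVPixelBufferGetBaseAddressOfPlane family per plane.
func withBaseAddress(of pixelBuffer: CVPixelBuffer,
                     body: (UnsafeMutableRawPointer, Int) -> Void) {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }
    if let base = CVPixelBufferGetBaseAddress(pixelBuffer) {
        body(base, CVPixelBufferGetBytesPerRow(pixelBuffer))
    }
}
```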
1
Solved
I get a CVPixelBuffer from ARSessionDelegate:
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    frame.capturedImage // CVPixelBufferRef
}
But another part of my app (that I can't c...
Coppice asked 27/12, 2017 at 13:57
1
I have received a CMSampleBufferRef from a system API that contains CVPixelBufferRefs that are not RGBA (linear pixels). The buffer contains planar pixels (such as 420f aka kCVPixelFormatType_420Yp...
Jink asked 2/10, 2017 at 11:44
4
Solved
I'm creating a MTLTexture from CVImageBuffers (from camera and players) using CVMetalTextureCacheCreateTextureFromImage to get a CVMetalTexture and then CVMetalTextureGetTexture to get the MTLTextu...
Spaak asked 21/4, 2017 at 19:41
2
Solved
Is there a way to list all CVPixelBuffer formats for CVPixelBufferCreate() that will not generate error -6683: kCVReturnPixelBufferNotOpenGLCompatible when used with CVOpenGLESTextureCacheCreateTex...
Fiddle asked 26/11, 2014 at 12:33
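Rather than enumerating formats, one approach is to pass the ES-compatibility key at creation time so CVPixelBufferCreate fails up front if the requested format cannot back a GL texture; kCVPixelFormatType_32BGRA and the biplanar 420v/420f formats are the ones commonly accepted by the texture cache. A sketch:

```swift
import CoreVideo

var pixelBuffer: CVPixelBuffer?
let attrs: [CFString: Any] = [
    // Request a buffer usable with CVOpenGLESTextureCacheCreateTextureFromImage.
    kCVPixelBufferOpenGLESCompatibilityKey: true,
    kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary
]
let status = CVPixelBufferCreate(kCFAllocatorDefault, 640, 480,
                                 kCVPixelFormatType_32BGRA,
                                 attrs as CFDictionary, &pixelBuffer)
```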
1
According to Apple docs, it's an image stored in main memory, but how are those images used to make a movie?
Dunker asked 24/1, 2017 at 7:12
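In outline, pixel buffers become movie frames by appending each one with a presentation timestamp through an AVAssetWriterInputPixelBufferAdaptor. A sketch, assuming `adaptor` wraps an input on an AVAssetWriter whose session has been started:

```swift
import AVFoundation

// Append one frame: frameIndex / fps gives its presentation time.
func append(_ buffer: CVPixelBuffer,
            frameIndex: Int64,
            fps: Int32,
            to adaptor: AVAssetWriterInputPixelBufferAdaptor) {
    let time = CMTime(value: frameIndex, timescale: fps)
    if adaptor.assetWriterInput.isReadyForMoreMediaData {
        adaptor.append(buffer, withPresentationTime: time)
    }
}
```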
1
Solved
I'm trying to render I420 (planar YCbCr) via MetalKit.
Most examples use a CMSampleBuffer from the camera,
but my goal is to use raw I420 bytes.
I do something like this:
let da...
Rootstock asked 30/6, 2016 at 10:16
3
Solved
I am trying to convert a pixelBuffer extracted from AVPlayerItemVideoOutput to CIImage but I always get nil.
The Code
if([videoOutput_ hasNewPixelBufferForItemTime:player_.internalPlayer.curren...
Retool asked 9/3, 2013 at 22:15
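For context, CIImage can wrap a CVPixelBuffer directly; in Swift the initializer is non-failable, so a nil result in this situation usually means the pixel buffer itself was NULL (e.g. the video output had no new frame for the requested item time) rather than a conversion failure. A sketch:

```swift
import CoreImage
import CoreVideo

// Wrap an existing pixel buffer as a CIImage without copying the pixels.
func ciImage(from pixelBuffer: CVPixelBuffer) -> CIImage {
    CIImage(cvPixelBuffer: pixelBuffer)
}
```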
1
I'm refactoring my iOS OpenGL-based rendering pipeline. My pipeline consists of many rendering steps, hence I need a lot of intermediate textures to render to and read from. Those textures are of va...
Woodson asked 17/10, 2014 at 16:39
3
I'm trying to rotate a video to its correct orientation using an AVAssetExportSession and I always get the following error:
Error Domain=AVFoundationErrorDomain Code=-11841 "The operation couldn’t...
Rascal asked 13/9, 2012 at 15:48
1
I am trying to create a 3-channel CVOpenGLESTexture in iOS.
I can successfully create a single-channel texture by specifying kCVPixelFormatType_OneComponent8 in CVPixelBufferCreate() and GL_LUMINA...
Tyrannize asked 25/11, 2014 at 14:46
3
Solved
I'm looking for a way to retrieve the individual frames of a video using the iOS APIs.
I tried using AVAssetImageGenerator, but it seems to only provide frames to the nearest second, which is a bit too rou...
Matthia asked 21/7, 2011 at 21:45
1
I'm trying to place some text over part of a video that I'm creating.
I realize that I can use a layer with text over the entire length of the video using CALayers, but how do I do it over only par...
Stative asked 10/10, 2013 at 2:50
1
How do I choose a pixel format type (kCVPixelBufferPixelFormatTypeKey) for use with AVAssetReader?
We are using AVAssetReader and AVAssetWriter somewhat in the style noted in Video Encoding using AVAssetWriter - CRASHES, basically to read a video that we got from the photo gallery / asset libr...
Mortie asked 7/1, 2013 at 7:41
© 2022 - 2025 — McMap. All rights reserved.