core-media Questions

3

I need to convert CMSampleBuffer to Data format. I am using a third-party framework for an audio-related task. That framework gives me streaming (i.e. real-time) audio in CMSampleBuffer obj...
Crusade asked 12/7, 2017 at 10:12
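A sketch of one common approach, assuming the framework's audio samples carry their bytes in the attached CMBlockBuffer (function and variable names here are illustrative, not from the question):

```swift
import CoreMedia
import Foundation

// Copy the raw bytes of an audio CMSampleBuffer into a Data value.
// Assumes the sample data lives in the buffer's CMBlockBuffer, which is
// the usual case for compressed or interleaved PCM audio.
func data(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else {
        return nil
    }
    let length = CMBlockBufferGetDataLength(blockBuffer)
    var bytes = Data(count: length)
    let status = bytes.withUnsafeMutableBytes { raw -> OSStatus in
        guard let base = raw.baseAddress else {
            return kCMBlockBufferBadPointerParameterErr
        }
        // CMBlockBuffer data may be non-contiguous; this copies it flat.
        return CMBlockBufferCopyDataBytes(blockBuffer,
                                          atOffset: 0,
                                          dataLength: length,
                                          destination: base)
    }
    return status == kCMBlockBufferNoErr ? bytes : nil
}
```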

1

The CoreMediaIO Device Abstraction Layer (DAL) is analogous to CoreAudio’s Hardware Abstraction Layer (HAL). Just as the HAL deals with audio streams from audio hardware, the DAL handles video (and...
Siderolite asked 24/9, 2021 at 11:36
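One practical consequence of the analogy: DAL plug-in devices (for example virtual cameras) surface to client apps as ordinary capture devices. A sketch of enumerating them on macOS, assuming the pre-macOS 14 `.externalUnknown` device type:

```swift
import AVFoundation

// DAL devices appear alongside built-in cameras in a discovery session.
// `.externalUnknown` is the device type that covered DAL plug-ins before
// macOS 14 introduced `.external`.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.externalUnknown],
    mediaType: .video,
    position: .unspecified)

for device in discovery.devices {
    print(device.localizedName, device.uniqueID)
}
```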

1

Solved

This is a continuation of a previous question asked a few years ago: Are MacOS Virtual Webcams inherently incompatible with 10.14's Hardened Runtime Library Validation? I notice that above ques...
Aculeus asked 7/6, 2022 at 23:22

3

After heavy usage of my app, which runs an AVCaptureSession instance, it suffers DroppedFrameReason(P) = OutOfBuffers. These are the details from the SampleBuffer object in - (void)captureOutput:(...
Knapweed asked 14/12, 2016 at 14:36
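OutOfBuffers typically means the capture pipeline's small, fixed pool of sample buffers is exhausted because delivered buffers are held too long. A sketch of a delegate that extracts what it needs and lets each buffer return to the pool immediately (class name is illustrative):

```swift
import AVFoundation

final class FrameHandler: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    // Keep derived data, never the CMSampleBuffer itself: retaining buffers
    // past this callback starves the pool and produces OutOfBuffers drops.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        // ...process the frame synchronously here, copying out anything
        // needed later...
        _ = pts
        // sampleBuffer is released when this method returns.
    }
}
```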

3

Initial Observation Zoom for Mac 4.6.9, which addresses scary security flaws, removes the disable-library-validation entitlement. With the same release, Snap Camera, a virtual webcam app, stopped...
Haunch asked 9/4, 2020 at 5:50
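For context, the entitlement in question is a hardened-runtime opt-out that lets a process load third-party code such as a DAL plug-in. A hypothetical entitlements-file fragment (granting it weakens library-validation guarantees, which is why hardened apps dropped it):

```xml
<key>com.apple.security.cs.disable-library-validation</key>
<true/>
```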

3

Solved

I am attempting to perform a deep clone of a CMSampleBuffer to store the output of an AVCaptureSession. I am receiving the error kCMSampleBufferError_InvalidMediaFormat (OSStatus -12743) when I run th...
Blackguard asked 22/2, 2016 at 19:2
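kCMSampleBufferError_InvalidMediaFormat usually indicates that the format description handed to the new sample buffer does not match its image buffer. A sketch that copies the pixels and derives a fresh format description from the copy, assuming a single-plane (e.g. BGRA) video buffer:

```swift
import CoreMedia
import CoreVideo

// Deep-copy a video CMSampleBuffer (single-plane pixel formats only).
func deepCopy(_ sample: CMSampleBuffer) -> CMSampleBuffer? {
    guard let src = CMSampleBufferGetImageBuffer(sample) else { return nil }

    // 1. Copy the pixels into a new CVPixelBuffer, row by row, since the
    //    two buffers may use different row padding.
    var copyOut: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(src),
                        CVPixelBufferGetHeight(src),
                        CVPixelBufferGetPixelFormatType(src),
                        nil, &copyOut)
    guard let dst = copyOut else { return nil }
    CVPixelBufferLockBaseAddress(src, .readOnly)
    CVPixelBufferLockBaseAddress(dst, [])
    if let srcBase = CVPixelBufferGetBaseAddress(src),
       let dstBase = CVPixelBufferGetBaseAddress(dst) {
        let srcBPR = CVPixelBufferGetBytesPerRow(src)
        let dstBPR = CVPixelBufferGetBytesPerRow(dst)
        for row in 0..<CVPixelBufferGetHeight(src) {
            memcpy(dstBase + row * dstBPR, srcBase + row * srcBPR,
                   min(srcBPR, dstBPR))
        }
    }
    CVPixelBufferUnlockBaseAddress(dst, [])
    CVPixelBufferUnlockBaseAddress(src, .readOnly)

    // 2. A format description created *for the copy* avoids the
    //    InvalidMediaFormat mismatch with the original's description.
    var format: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: dst,
                                                 formatDescriptionOut: &format)
    guard let formatDescription = format else { return nil }

    var timing = CMSampleTimingInfo()
    CMSampleBufferGetSampleTimingInfo(sample, at: 0, timingInfoOut: &timing)

    var result: CMSampleBuffer?
    CMSampleBufferCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                       imageBuffer: dst,
                                       dataReady: true,
                                       makeDataReadyCallback: nil,
                                       refcon: nil,
                                       formatDescription: formatDescription,
                                       sampleTiming: &timing,
                                       sampleBufferOut: &result)
    return result
}
```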

4

I am writing some video processing code using AVComposition. Giving only the necessary background details: I receive a CVPixelBuffer from an Apple API that I do not control. This CVPixel buffer, co...
Disannul asked 17/11, 2014 at 5:8

1

Solved

I get a CVPixelBuffer from ARSessionDelegate: func session(_ session: ARSession, didUpdate frame: ARFrame) { frame.capturedImage // CVPixelBufferRef } But another part of my app (that I can't c...
Coppice asked 27/12, 2017 at 13:57
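ARKit's captured image is a biplanar YCbCr buffer, so a consumer that expects a different layout needs a conversion step. A sketch using Core Image to render into a BGRA buffer (function name is illustrative):

```swift
import CoreImage
import CoreVideo

let ciContext = CIContext()

// Render an ARKit-style YCbCr pixel buffer into a new 32BGRA buffer.
func convertToBGRA(_ captured: CVPixelBuffer) -> CVPixelBuffer? {
    var out: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(captured),
                        CVPixelBufferGetHeight(captured),
                        kCVPixelFormatType_32BGRA,
                        nil, &out)
    guard let bgra = out else { return nil }
    // Core Image handles the YCbCr-to-RGB color conversion.
    ciContext.render(CIImage(cvPixelBuffer: captured), to: bgra)
    return bgra
}
```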

1

I have received a CMSampleBufferRef from a system API that contains CVPixelBufferRefs that are not RGBA (linear pixels). The buffer contains planar pixels (such as 420f aka kCVPixelFormatType_420Yp...
Jink asked 2/10, 2017 at 11:44
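For a biplanar 4:2:0 buffer like 420f, plane 0 holds full-resolution luma (Y) and plane 1 holds interleaved half-resolution chroma (CbCr). A sketch of walking the planes (function name is illustrative):

```swift
import CoreVideo

// Inspect each plane of a planar CVPixelBuffer; the base address is only
// valid while the buffer is locked.
func inspectPlanes(of pixelBuffer: CVPixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    for plane in 0..<CVPixelBufferGetPlaneCount(pixelBuffer) {
        let base = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, plane)
        let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, plane)
        let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, plane)
        print("plane \(plane): \(height) rows, \(bytesPerRow) bytes/row, base \(String(describing: base))")
    }
}
```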

2

Solved

This has been asked before, but something must have changed in Swift since it was asked. I am trying to store CMSampleBuffer objects returned from an AVCaptureSession to be processed later. After s...
Flesher asked 17/2, 2016 at 21:11
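In modern Swift, CMSampleBuffer is ARC-managed, so storing one is just keeping a reference — the catch is that the capture pipeline's buffer pool is small, so unbounded retention stalls capture. A sketch of a bounded store (class name and cap are illustrative):

```swift
import CoreMedia

final class SampleStore {
    private(set) var buffers: [CMSampleBuffer] = []
    private let limit = 5  // hypothetical cap; keep well under the pool size

    // Retain only the most recent frames; dropping the oldest returns its
    // backing buffer to the capture pool.
    func store(_ sampleBuffer: CMSampleBuffer) {
        buffers.append(sampleBuffer)
        if buffers.count > limit {
            buffers.removeFirst()
        }
    }
}
```

For longer-term storage, deep-copying the data out of each buffer avoids starving the pool entirely.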

2

Solved

I have some code that creates CMBlockBuffers and then creates a CMSampleBuffer and passes it to an AVAssetWriterInput. What's the deal on memory management here? According to the Apple documentat...
Snowmobile asked 1/9, 2012 at 12:18
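The usual answer for ownership questions here: if the block buffer is created with a `nil` memory block and a real block allocator, the CMBlockBuffer allocates and later frees its own storage, so no manual `free()` is needed once every reference is released (CFRelease under manual retain/release, automatic under ARC). A sketch:

```swift
import CoreMedia

var blockBuffer: CMBlockBuffer?
let status = CMBlockBufferCreateWithMemoryBlock(
    allocator: kCFAllocatorDefault,
    memoryBlock: nil,                 // nil: let the buffer own its storage
    blockLength: 4096,
    blockAllocator: kCFAllocatorDefault,
    customBlockSource: nil,
    offsetToData: 0,
    dataLength: 4096,
    flags: kCMBlockBufferAssureMemoryNowFlag,  // allocate immediately
    blockBufferOut: &blockBuffer)
assert(status == kCMBlockBufferNoErr)
// Passing blockBuffer into a CMSampleBuffer adds a retain; the memory block
// is freed when the last reference goes away.
```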

0

What I would like to do is play a video (either from a local file or from a remote URL) along with its audio track, and retrieve the pixel buffer of each frame of the video to draw it to an OpenGL texture...
Autumnautumnal asked 1/6, 2013 at 13:14
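AVPlayerItemVideoOutput fits this shape: AVPlayer plays the audio track normally while the output vends each video frame as a CVPixelBuffer, polled from a display-link callback. A sketch (the URL is a placeholder):

```swift
import AVFoundation
import QuartzCore

let item = AVPlayerItem(url: URL(string: "https://example.com/video.mp4")!)
let output = AVPlayerItemVideoOutput(pixelBufferAttributes:
    [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
item.add(output)
let player = AVPlayer(playerItem: item)
player.play()

// Call once per display refresh (e.g. from a CADisplayLink).
func renderCurrentFrame() {
    let time = output.itemTime(forHostTime: CACurrentMediaTime())
    if output.hasNewPixelBuffer(forItemTime: time),
       let pixelBuffer = output.copyPixelBuffer(forItemTime: time,
                                                itemTimeForDisplay: nil) {
        // Upload pixelBuffer to the OpenGL texture here.
        _ = pixelBuffer
    }
}
```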

2

I'm working with AVFoundation for capturing and recording audio. There are some issues I don't quite understand. Basically I want to capture audio from an AVCaptureSession and write it using AVWriter...
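A sketch of the writer side, assuming sample buffers arrive from an AVCaptureAudioDataOutput delegate (`outputURL` and the AAC settings are placeholders):

```swift
import AVFoundation

let writer = try AVAssetWriter(outputURL: outputURL, fileType: .m4a)
let settings: [String: Any] = [AVFormatIDKey: kAudioFormatMPEG4AAC,
                               AVSampleRateKey: 44_100,
                               AVNumberOfChannelsKey: 1]
let input = AVAssetWriterInput(mediaType: .audio, outputSettings: settings)
input.expectsMediaDataInRealTime = true
writer.add(input)
writer.startWriting()

var sessionStarted = false

// Call from captureOutput(_:didOutput:from:).
func append(_ sampleBuffer: CMSampleBuffer) {
    // The session must start at the first buffer's timestamp, or early
    // samples are silently dropped.
    if !sessionStarted {
        writer.startSession(
            atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
        sessionStarted = true
    }
    if input.isReadyForMoreMediaData {
        input.append(sampleBuffer)
    }
}
```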

1

I need to decode an mp4 file and draw it using OpenGL in an iOS app. I need to extract and decode H.264 frames from the mp4 file, and I heard it is possible to do this using CoreMedia. Does anybody have any idea how to ...
Biochemistry asked 4/4, 2012 at 17:16
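One route that stays above the raw decoder: AVAssetReader decodes the H.264 track into CVPixelBuffers in a format OpenGL can upload. A sketch (`fileURL` is a placeholder):

```swift
import AVFoundation

let asset = AVAsset(url: fileURL)
let reader = try AVAssetReader(asset: asset)
guard let track = asset.tracks(withMediaType: .video).first else {
    fatalError("no video track")
}
// Ask the reader to decode to BGRA so the frames upload directly as textures.
let output = AVAssetReaderTrackOutput(track: track, outputSettings:
    [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
reader.add(output)
reader.startReading()

while let sample = output.copyNextSampleBuffer() {
    if let pixelBuffer = CMSampleBufferGetImageBuffer(sample) {
        // Upload pixelBuffer to an OpenGL texture here.
        _ = pixelBuffer
    }
}
```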

1

Solved

(Preface: This is my first audio-related question on Stack Overflow, so I'll try to word this as best as I possibly can. Edits welcome.) I'm creating an application that'll allow users to loop mus...
Ng asked 20/2, 2012 at 2:34
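For simple whole-file looping, AVAudioPlayer covers the basic case before any lower-level CoreAudio work is needed. A sketch, assuming a local file URL `loopURL`:

```swift
import AVFoundation

let player = try AVAudioPlayer(contentsOf: loopURL)
player.numberOfLoops = -1   // -1 loops indefinitely until stop() is called
player.prepareToPlay()
player.play()
```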

1

Solved

I’d like to convert a CGImage to CMSampleBufferRef and append it to an AVAssetWriterInput using the appendSampleBuffer: method. I’ve managed to get the CMSampleBufferRef using the following code, bu...
Thinking asked 20/9, 2010 at 12:57
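The common workaround is to skip hand-building a CMSampleBufferRef: wrap the CGImage in a CVPixelBuffer and append it through an AVAssetWriterInputPixelBufferAdaptor. A sketch, assuming a configured `writerInput`:

```swift
import AVFoundation
import CoreGraphics

// Draw a CGImage into a freshly created CVPixelBuffer.
func pixelBuffer(from image: CGImage) -> CVPixelBuffer? {
    var out: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, image.width, image.height,
                        kCVPixelFormatType_32ARGB, nil, &out)
    guard let buffer = out else { return nil }
    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }
    let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                            width: image.width, height: image.height,
                            bitsPerComponent: 8,
                            bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    context?.draw(image, in: CGRect(x: 0, y: 0,
                                    width: image.width, height: image.height))
    return buffer
}

let adaptor = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: writerInput,
    sourcePixelBufferAttributes: nil)
// adaptor.append(pixelBuffer(from: cgImage)!, withPresentationTime: time)
```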

© 2022 - 2024 — McMap. All rights reserved.