video-toolbox Questions
7
Solved
I had a lot of trouble figuring out how to use Apple's Hardware accelerated video framework to decompress an H.264 video stream. After a few weeks I figured it out and wanted to share an extensive ...
Suziesuzuki asked 8/4, 2015 at 20:44
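For orientation, the core of that approach is building a CMVideoFormatDescription from the stream's SPS and PPS and a VTDecompressionSession on top of it. A minimal Swift sketch, assuming the parameter sets have already been extracted from the stream (all names below are placeholders, not taken from the question):
import VideoToolbox
import CoreMedia

func makeDecompressionSession(sps: Data, pps: Data) -> VTDecompressionSession? {
    guard !sps.isEmpty, !pps.isEmpty else { return nil }

    // Build a CMVideoFormatDescription from the raw SPS/PPS NAL unit payloads.
    var formatDescription: CMVideoFormatDescription?
    let descStatus = sps.withUnsafeBytes { (spsBytes: UnsafeRawBufferPointer) -> OSStatus in
        return pps.withUnsafeBytes { (ppsBytes: UnsafeRawBufferPointer) -> OSStatus in
            let pointers: [UnsafePointer<UInt8>] = [
                spsBytes.bindMemory(to: UInt8.self).baseAddress!,
                ppsBytes.bindMemory(to: UInt8.self).baseAddress!
            ]
            let sizes = [sps.count, pps.count]
            return CMVideoFormatDescriptionCreateFromH264ParameterSets(
                allocator: kCFAllocatorDefault,
                parameterSetCount: 2,
                parameterSetPointers: pointers,
                parameterSetSizes: sizes,
                nalUnitHeaderLength: 4,      // AVCC: 4-byte length prefixes rather than start codes
                formatDescriptionOut: &formatDescription)
        }
    }
    guard descStatus == noErr, let desc = formatDescription else { return nil }

    // Create the hardware decoder session on top of that description.
    var session: VTDecompressionSession?
    let attrs = [kCVPixelBufferPixelFormatTypeKey as String:
                     kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] as CFDictionary
    let sessionStatus = VTDecompressionSessionCreate(
        allocator: kCFAllocatorDefault,
        formatDescription: desc,
        decoderSpecification: nil,
        imageBufferAttributes: attrs,
        outputCallback: nil,                 // or pass a VTDecompressionOutputCallbackRecord
        decompressionSessionOut: &session)
    return sessionStatus == noErr ? session : nil
}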
3
I'm using ffmpeg 4.3.1 to convert videos from h264 to h265 and initially I was excited to discover that I can use my Mac's GPU to speed up the conversion with the flag hevc_videotoolbox.
My Mac har...
Prehistory asked 20/11, 2020 at 6:20
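For context, the usual shape of that invocation (the file names and bitrate below are placeholders, not taken from the question) is along these lines; the hardware encoder is normally driven with an explicit bitrate rather than -crf, and -tag:v hvc1 keeps the output playable in QuickTime:
ffmpeg -i input.mp4 -c:v hevc_videotoolbox -b:v 4000k -tag:v hvc1 -c:a copy output.mp4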
3
I'm using an AVSampleBufferDisplayLayer to decode and display H.264 video streamed from a server. When my app goes into the background and then returns to the foreground, the decoding process gets ...
Debi asked 3/3, 2015 at 21:07
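A common way to recover, sketched here in Swift under the assumption that the layer is reachable from whatever object observes the notification: check the layer's status on return to the foreground and flush it if its decoder failed while the app was in the background.
import AVFoundation
import UIKit

final class PlaybackForegroundObserver {
    private let displayLayer: AVSampleBufferDisplayLayer
    private var token: NSObjectProtocol?

    init(displayLayer: AVSampleBufferDisplayLayer) {
        self.displayLayer = displayLayer
        token = NotificationCenter.default.addObserver(
            forName: UIApplication.willEnterForegroundNotification,
            object: nil, queue: .main) { [weak self] _ in
            guard let layer = self?.displayLayer else { return }
            // While the app is backgrounded the system can tear down the layer's decoder;
            // the layer then reports .failed and ignores newly enqueued buffers until it
            // has been flushed (or rebuilt).
            if layer.status == .failed {
                layer.flush()
            }
        }
    }

    deinit {
        if let token = token { NotificationCenter.default.removeObserver(token) }
    }
}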
1
I am facing the same problem as described here when trying to decode a frame on iPad Pro OS v14.3 (I am also using Olivia Stork's example):
25% of the picture data is decoded correctly, the rest of...
Meantime asked 5/1, 2021 at 9:37
3
Solved
I have a project where I need to decode h264 video from a live network stream and eventually end up with a texture I can display in another framework (Unity3D) on iOS devices. I can successfully de...
Aureliaaurelian asked 20/10, 2015 at 19:14
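One common route for the "texture I can display elsewhere" part, sketched below, is to wrap each decoded CVPixelBuffer in a Metal texture through a CVMetalTextureCache; this assumes the decompression session was configured to output 32BGRA, and the class and method names are illustrative, not from the question:
import CoreVideo
import Metal

final class PixelBufferTextureConverter {
    private var textureCache: CVMetalTextureCache?

    init?(device: MTLDevice) {
        guard CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache) == kCVReturnSuccess else {
            return nil
        }
    }

    // Wraps a decoded BGRA pixel buffer in an MTLTexture without copying the pixels.
    func makeTexture(from pixelBuffer: CVPixelBuffer) -> MTLTexture? {
        guard let cache = textureCache else { return nil }
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        var cvTexture: CVMetalTexture?
        let status = CVMetalTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault, cache, pixelBuffer, nil,
            .bgra8Unorm, width, height, 0, &cvTexture)
        guard status == kCVReturnSuccess, let texture = cvTexture else { return nil }
        return CVMetalTextureGetTexture(texture)
    }
}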
3
I am using an AVSampleBufferDisplayLayer to display CMSampleBuffers which are coming over a network connection in the h.264 format. Video playback is smooth and working correctly, however I cannot ...
Precipitous asked 13/9, 2015 at 22:00
2
I am able to compress video captured from the camera to h264 format using the VideoToolbox framework, but when I try to play that h264 file in VLC player I am not able to hear the audio of the vi...
Platelet asked 30/9, 2016 at 5:29
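The usual explanation here is that a raw .h264 file is a video-only elementary stream, so the audio was never written into it; the audio track has to be muxed into a container alongside the video. A hedged sketch using AVAssetWriter for the muxing (the file URL, dimensions and audio settings are placeholders):
import AVFoundation

func makeWriter(outputURL: URL) throws -> (AVAssetWriter, AVAssetWriterInput, AVAssetWriterInput) {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)

    // H.264 video input; AVAssetWriter drives the hardware encoder internally.
    let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 1280,
        AVVideoHeightKey: 720
    ])
    videoInput.expectsMediaDataInRealTime = true

    // AAC audio input; appending microphone sample buffers here is what VLC will play back.
    let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVSampleRateKey: 44_100,
        AVNumberOfChannelsKey: 1
    ])
    audioInput.expectsMediaDataInRealTime = true

    writer.add(videoInput)
    writer.add(audioInput)
    return (writer, videoInput, audioInput)
}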
2
I have CMSampleBufferRef(s) which I decode using VTDecompressionSessionDecodeFrame, which results in a CVImageBufferRef once decoding of a frame has completed, so my question is:
What would be the...
Segmentation asked 29/9, 2015 at 17:16
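For reference, a minimal Swift sketch of that decode step, assuming the session and sample buffer already exist; the output handler is where the decoded CVImageBufferRef arrives:
import VideoToolbox
import CoreMedia

func decode(_ sampleBuffer: CMSampleBuffer, with session: VTDecompressionSession) {
    let callStatus = VTDecompressionSessionDecodeFrameWithOutputHandler(
        session,
        sampleBuffer: sampleBuffer,
        flags: [],                   // synchronous decode; add asynchronous flags if needed
        infoFlagsOut: nil) { status, _, imageBuffer, presentationTime, _ in
        guard status == noErr, let imageBuffer = imageBuffer else { return }
        // imageBuffer is the decoded CVImageBuffer / CVPixelBuffer for this frame.
        print("decoded \(CVPixelBufferGetWidth(imageBuffer))x\(CVPixelBufferGetHeight(imageBuffer)) at \(presentationTime.seconds)s")
    }
    if callStatus != noErr {
        print("VTDecompressionSessionDecodeFrameWithOutputHandler failed: \(callStatus)")
    }
}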
1
I'm trying to create an H.264 Compression Session with the data from my screen. I've created a CGDisplayStreamRef instance like so:
displayStream = CGDisplayStreamCreateWithDispatchQueue(0, 100, 1...
Ettaettari asked 10/3, 2017 at 14:57
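The compression-session half of that can be sketched in Swift as below, with placeholder dimensions; the CGDisplayStream frames would then be fed in as CVPixelBuffers:
import VideoToolbox
import CoreMedia

func makeScreenEncoderSession(width: Int32, height: Int32) -> VTCompressionSession? {
    var session: VTCompressionSession?
    let status = VTCompressionSessionCreate(
        allocator: kCFAllocatorDefault,
        width: width,
        height: height,
        codecType: kCMVideoCodecType_H264,
        encoderSpecification: nil,
        imageBufferAttributes: nil,
        compressedDataAllocator: nil,
        outputCallback: nil,          // supply a callback, or use the output-handler encode call
        refcon: nil,
        compressionSessionOut: &session)
    guard status == noErr, let session = session else { return nil }
    // Screen capture is a live source, so ask the encoder to keep up in real time.
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime, value: kCFBooleanTrue)
    return session
}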
1
So this is a more theoretical question/discussion, as I haven't been able to come to a clear answer reading other SO posts and sources from the web. It seems like there are a lot of options:
Brad ...
Beadroll asked 14/7, 2015 at 17:46
0
I followed this post to decode my h264 video stream frames.
My data frames are as below:
My code:
NSString * const naluTypesStrings[] =
{
@"0: Unspecified (non-VCL)",
@"1: Coded slice of a n...
Monarda asked 22/10, 2016 at 16:56
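For what it's worth, the truncated table above is indexed by the NAL unit type, which is just the low five bits of the first byte after the Annex B start code. A small Swift sketch (the function name is made up for illustration):
import Foundation

func naluType(of frame: Data) -> UInt8? {
    // Expect an Annex B start code (00 00 00 01) at the beginning of the frame.
    let startCode: [UInt8] = [0x00, 0x00, 0x00, 0x01]
    guard frame.count > startCode.count,
          Array(frame.prefix(startCode.count)) == startCode else { return nil }
    // The NAL unit type is the low 5 bits of the byte after the start code.
    return frame[frame.startIndex + startCode.count] & 0x1F
}

// Example: a frame beginning 00 00 00 01 67 ... yields type 7, i.e. an SPS.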
0
I've run into a problem creating a compression session for the MPEG4 encoder with VideoToolbox after migrating to Swift 3.0. Before the migration it worked fine.
Here is my upgraded code:
let ima...
Musclebound asked 11/10, 2016 at 21:53
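The part that usually breaks in this migration is the output callback, which has to be a C-convention function (a closure that captures nothing also works). A sketch in current Swift with placeholder dimensions; in Swift 3 the same arguments were passed positionally, without labels:
import VideoToolbox
import CoreMedia

// Placeholder callback: receives each encoded CMSampleBuffer from the encoder.
let compressionOutputCallback: VTCompressionOutputCallback = { _, _, status, _, sampleBuffer in
    guard status == noErr, let sampleBuffer = sampleBuffer else { return }
    print("encoded sample of \(CMSampleBufferGetTotalSampleSize(sampleBuffer)) bytes")
}

func makeMPEG4Session(width: Int32, height: Int32) -> VTCompressionSession? {
    var session: VTCompressionSession?
    let status = VTCompressionSessionCreate(
        allocator: nil,
        width: width,
        height: height,
        codecType: kCMVideoCodecType_MPEG4Video,
        encoderSpecification: nil,
        imageBufferAttributes: nil,
        compressedDataAllocator: nil,
        outputCallback: compressionOutputCallback,
        refcon: nil,
        compressionSessionOut: &session)
    return status == noErr ? session : nil
}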
2
Did anyone experience an issue where VideoToolbox can't decode Media Foundation Transform (MFT) encoded H264 properly? The decoded frame has green block distortion across more than half of the frame. I tr...
Supersonic asked 5/6, 2015 at 22:33
1
Since VideoToolbox isn't available for tvOS, how do I decode video?
I have an app where I have frames of h.264 in memory (streamed in over the network) and I was handling the decoding with VideoToo...
Jiggered asked 16/9, 2015 at 18:52