AVFoundation record 10 bit HDR video on iPhone 12

The iPhone 12/12 Pro supports recording Dolby Vision HDR video in 10-bit format instead of 8-bit, but it is not clear from the iOS 14.1 SDK whether AVCaptureVideoDataOutput supports delivery of 10-bit sample buffers that can be appended to a video file using AVAssetWriter. Has anyone figured out whether this is possible with the SDK?

EDIT: A number of apps, such as Apple's Clips, have started supporting Dolby Vision 10-bit video recording. I tried every available API, including videoHDREnabled, but none of them work. So the clear question is: how do you record HDR (Dolby Vision) video using the AVFoundation APIs?

EDIT2: I was able to figure out the device formats that support 10-bit pixel buffers (the ones whose media subtype is 'x420', NOT the ones with 420v or 420f). On an iPhone 12 mini, 4 device formats support 10-bit pixel buffer delivery in kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange, even though the AVFoundation documentation says this is not a supported pixel format (quoting: "On iOS, the only supported key is kCVPixelBufferPixelFormatTypeKey. Supported pixel formats are kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, kCVPixelFormatType_420YpCbCr8BiPlanarFullRange and kCVPixelFormatType_32BGRA."). The next step is to identify whether the HDR format for recording can be manually chosen to be Dolby Vision, HLG, or HDR10.
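
For anyone who wants to reproduce this probing, here is a minimal sketch (the fourCC helper and the choice of the back wide-angle camera are my own illustration, not anything the SDK mandates):

import AVFoundation
import CoreMedia

// Hypothetical helper: render a FourCharCode such as 'x420' as a string.
func fourCC(_ code: FourCharCode) -> String {
    let chars: [CChar] = [
        CChar(truncatingIfNeeded: code >> 24),
        CChar(truncatingIfNeeded: code >> 16),
        CChar(truncatingIfNeeded: code >> 8),
        CChar(truncatingIfNeeded: code),
        0
    ]
    return String(cString: chars)
}

// Back wide-angle camera assumed; other cameras can be probed the same way.
if let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) {
    for format in device.formats {
        let subtype = CMFormatDescriptionGetMediaSubType(format.formatDescription)
        if subtype == kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange {  // 'x420'
            print(fourCC(subtype), format)
        }
    }
}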

Melodeemelodeon answered 14/10, 2020 at 16:17 Comment(0)

OK, none of the answers given were correct, so I did my own research after getting an iPhone 12 mini in hand, and this is what I found.

The AVFoundation documentation is silent, and even incorrect at times. One could infer from the documentation that it is not possible to get 10-bit HDR sample buffers at all, specifically from the documentation of the videoSettings property of AVCaptureVideoDataOutput:

   On iOS, the only supported key is kCVPixelBufferPixelFormatTypeKey. 
   Supported pixel formats are kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, 
   kCVPixelFormatType_420YpCbCr8BiPlanarFullRange and kCVPixelFormatType_32BGRA

It appears from the documentation that one could never get 10-bit frames. But on probing -[AVCaptureDevice formats], one finds 4 formats that are different and have 'x420' as their media subtype, which is kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange, a 10-bit format. The moment -[AVCaptureDevice activeFormat] is set to one of these 4 formats, AVCaptureVideoDataOutput changes the sample buffer format to kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange! The active color space of the AVCaptureDevice also changes to AVCaptureColorSpace_HLG_BT2020.
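
A minimal sketch of that switch, assuming the device has already been obtained (error handling abbreviated):

import AVFoundation
import CoreMedia

// Find one of the 'x420' formats and make it the active format. The device
// must be locked for configuration first.
func enableTenBitDelivery(on device: AVCaptureDevice) throws {
    guard let tenBitFormat = device.formats.first(where: { format in
        CMFormatDescriptionGetMediaSubType(format.formatDescription) ==
            kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange
    }) else { return }

    try device.lockForConfiguration()
    device.activeFormat = tenBitFormat
    device.unlockForConfiguration()

    // Per the observation above, AVCaptureVideoDataOutput now delivers
    // 10-bit buffers and the active color space switches to HLG BT.2020.
    print(device.activeColorSpace == .HLG_BT2020)
}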

Melodeemelodeon answered 26/11, 2020 at 15:25 Comment(0)

Nov 26 update.

As @Deepak posted in his own answer and comments, the 'x420'-tagged formats will let the camera work in HLG mode. All the available HLG-enabled formats from the iPhone 12 Pro are listed below.

Original answer

On iOS 14.2, I can dump all the available formats from the AVCaptureDevice instance; the log output fairly well explains itself. As commented below, setting AVCaptureDevice.activeFormat to one of the HDR + wide-color formats will hopefully do the job.

<AVCaptureDeviceFormat: 0x282d8daf0 'vide'/'x420' 1280x 720, { 1- 30 fps}, HRSI:4096x2304, fov:68.161, supports vis, max zoom:120.00 (upscales @2.91), AF System:2, ISO:34.0-3264.0, SS:0.000014-1.000000, supports wide color, supports depth>
<AVCaptureDeviceFormat: 0x282d8dac0 'vide'/'x420' 1280x 720, { 1- 60 fps}, HRSI:4096x2304, fov:68.161, supports vis, max zoom:120.00 (upscales @2.91), AF System:2, ISO:34.0-3264.0, SS:0.000014-1.000000, supports wide color, supports depth>
<AVCaptureDeviceFormat: 0x282d8da50 'vide'/'x420' 1920x1080, { 1- 30 fps}, HRSI:4096x2304, fov:68.161, supports vis, max zoom:120.00 (upscales @1.94), AF System:2, ISO:34.0-3264.0, SS:0.000014-1.000000, supports wide color, supports depth, supports multicam>
<AVCaptureDeviceFormat: 0x282d8da30 'vide'/'x420' 1920x1080, { 1- 60 fps}, HRSI:4096x2304, fov:68.161, supports vis, max zoom:120.00 (upscales @1.94), AF System:2, ISO:34.0-3264.0, SS:0.000014-1.000000, supports wide color, supports multicam>
<AVCaptureDeviceFormat: 0x282d8d9e0 'vide'/'x420' 1920x1440, { 1- 30 fps}, HRSI:4032x3024, fov:67.096, max zoom:189.00 (upscales @2.10), AF System:2, ISO:34.0-3264.0, SS:0.000014-1.000000, supports wide color, supports depth, supports multicam>
<AVCaptureDeviceFormat: 0x282d8d950 'vide'/'x420' 3840x2160, { 1- 30 fps}, HRSI:4096x2304, fov:68.161, supports vis, max zoom:125.25 (upscales @1.00), AF System:2, ISO:34.0-3264.0, SS:0.000014-1.000000, supports wide color, supports multicam>
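
For reference, a dump like the one above can be reproduced with a few lines; the back wide-angle camera is assumed here:

import AVFoundation

// Print every format that advertises HLG BT.2020 support.
if let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) {
    for format in device.formats where format.supportedColorSpaces.contains(.HLG_BT2020) {
        print(format)
    }
}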

As of Nov 23, this is still an ongoing investigation. I think some joint effort is needed, or perhaps an Apple engineer can have a look at this.

I believe I have watched all the available WWDC 17/18/19/20 sessions on this topic, and with the new iPhone 12 release, here are some findings.

Capturing HDR from the camera and saving it directly as 10-bit HLG video is only possible on the iPhone 12 and newer. This is what the product release claims, and I have sample videos from my friend's new phone; it works as expected.

In the WWDC 2020 session Export HDR media in your app with AVFoundation, it is claimed:

At this point, I’d like to briefly touch on which Apple platforms can support HDR export.

iOS supports HEVC hardware encoding on devices with Apple A10 Fusion chips or newer.

Fortunately A10 devices have been around for a while, dating back to the iPhone 7, iPads released in 2018, and the 2019 iPod touch.

In regards to Macs, both HEVC and Apple ProRes software encoders are available on all Macs.

HEVC hardware encoding is generally available on 2017 and newer Macs running the new macOS.

Hardware encoding will make the export significantly faster.

This video also claims HDR export only works with 10-bit HEVC encoding, so the A10+ SoCs should have 10-bit HEVC encoding capability. This is a guess: I can edit iPhone 12 HLG video within the official Photos app on an iPhone 11 and SE2, and the writing performance (4k@60p, HLG) is quite good, which is a good indicator. However, I have had no luck making this work in code; the sample code listed in the video may not be the full picture, and I have not found a working demo yet. In theory, older devices should also be capable of recording 10-bit HLG, unless the camera or the thermal/power budget is the limitation.

However, the only relevant HDR key in this is the VideoProfileLevelKey which must be set to HEVC_Main10_AutoLevel when exporting HDR using the HEVC codec.

Note that 8-bit HEVC HDR is not supported, and this key is not applicable to ProRes exports.

All right, let’s take the time now to summarize how you would configure the keys I just discussed when outputting to two common HDR formats: HLG and HDR10. This table shows what the relevant HDR settings are for exporting an HLG file.
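
Putting those quoted settings together, here is a sketch of what the video output settings for an HLG HEVC export might look like when using AVAssetWriter (the dimensions are placeholders, and I have not yet verified this end to end):

import AVFoundation
import VideoToolbox

// Sketch: 10-bit HEVC with HLG color tags for an AVAssetWriter video input.
let hlgOutputSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.hevc,
    AVVideoWidthKey: 3840,   // placeholder dimensions
    AVVideoHeightKey: 2160,
    // 8-bit HEVC HDR is not supported; Main10 is required.
    AVVideoProfileLevelKey: kVTProfileLevel_HEVC_Main10_AutoLevel,
    AVVideoColorPropertiesKey: [
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_2020,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_ITU_R_2100_HLG,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_2020
    ]
]

let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: hlgOutputSettings)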

Another video worth watching again and again: Edit and play back HDR video with AVFoundation

During my tests, I do get a CVPixelBuffer (format: kCVPixelFormatType_420YpCbCr10BiPlanarFullRange) that is HDR-enabled and correctly color-managed from a sample HLG video. Below is a dump from my console log. This works on any iOS 14 device, even the quite old iPhone 6s (A9), because it only involves 10-bit HEVC decoding.

_displayLinkDidRefresh():121 - Optional(<CVPixelBuffer 0x281300500 width=3840 height=2160 pixelFormat=xf20 iosurface=0x282008050 planes=2 poolName=450:decode_1>
<Plane 0 width=3840 height=2160 bytesPerRow=7680>
<Plane 1 width=1920 height=1080 bytesPerRow=7680>
<attributes={
    PixelFormatDescription =     {
        BitsPerComponent = 10;
        CGBitmapContextCompatibility = 0;
        CGImageCompatibility = 0;
        ComponentRange = FullRange;
        ContainsAlpha = 0;
        ContainsGrayscale = 0;
        ContainsRGB = 0;
        ContainsYCbCr = 1;
        FillExtendedPixelsCallback = {length = 24, bytes = 0x0000000000000000b48ab8a1010000000000000000000000};
        IOSurfaceCoreAnimationCompatibility = 1;
        IOSurfaceCoreAnimationCompatibilityHTPCOK = 1;
        IOSurfaceOpenGLESTextureCompatibility = 1;
        OpenGLESCompatibility = 1;
        PixelFormat = 2019963440;
        Planes =         (
                        {
                BitsPerBlock = 16;
                HorizontalSubsampling = 1;
                VerticalSubsampling = 1;
            },
                        {
                BitsPerBlock = 32;
                BlackBlock = {length = 4, bytes = 0x00800080};
                HorizontalSubsampling = 2;
                VerticalSubsampling = 2;
            }
        );
    };
} propagatedAttachments={
    CVFieldCount = 1;
    CVImageBufferChromaLocationBottomField = Left;
    CVImageBufferChromaLocationTopField = Left;
    CVImageBufferColorPrimaries = "ITU_R_2020";
    CVImageBufferTransferFunction = "ITU_R_2100_HLG";
    CVImageBufferYCbCrMatrix = "ITU_R_2020";
    QTMovieTime =     {
        TimeScale = 600;
        TimeValue = 12090;
    };
} nonPropagatedAttachments={
}>)
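
For completeness, here is a sketch of how such a buffer can be pulled during playback; the file path is a placeholder, and the polling would normally be driven by a CADisplayLink callback on a playing AVPlayer:

import AVFoundation
import QuartzCore

// Request 10-bit full-range buffers from the decoder during playback.
let playerItem = AVPlayerItem(url: URL(fileURLWithPath: "hlg-sample.mov"))  // placeholder path
let attributes: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String:
        kCVPixelFormatType_420YpCbCr10BiPlanarFullRange
]
let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: attributes)
playerItem.add(videoOutput)

// Inside the display link callback:
let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())
if videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
   let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil) {
    print(pixelBuffer)  // dumps attachments like the log above
}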
Unitarianism answered 24/11, 2020 at 5:9 Comment(7)
This is about the API to record HDR (Dolby Vision) video on iPhone 12 models, not editing. – Melodeemelodeon
Yeah, I am still waiting for my new phone, so for the time being I am looking into the edit/export workflow. – Unitarianism
OK, it works. Setting the AVCaptureDevice format to the one at index 27, for instance, delivers 10-bit frames! – Melodeemelodeon
Btw, the updated answer is wrong. You need to set one with media subtype 'x420', not the ones that have videoHDRSupported, which is for EDR and NOT HDR10. – Melodeemelodeon
Yes, you are right; I got the feedback from my friend who has an iPhone 12. I will post the fix later. – Unitarianism
Where does the buffer come from? I do not have an iPhone 12 at hand, so I can only test HLG from an iPhone 12 on an iPhone 11. Using CADisplayLink, I can successfully grab an RGB texture from an HLG movie during playback; AVFoundation does the conversion for you. Internally, I believe it uses the GPU to do the job, as mentioned here: developer.apple.com/documentation/metalperformanceshaders/… since texture conversion is a fairly common operation in the GPU world. – Unitarianism
For my own test, I tried these formats. Good ones: kCVPixelFormatType_420YpCbCr10BiPlanarFullRange, kCVPixelFormatType_64RGBALE, kCVPixelFormatType_32ARGB. Bad ones: kCVPixelFormatType_64ARGB (no buffer), kCVPixelFormatType_48RGB (no buffer), kCVPixelFormatType_32RGBA (wrong buffer, always green). – Unitarianism

Forcing a format with BT2020 is the correct way to ensure you're shooting in Dolby Vision. iOS 14.1 or higher is required to do this. Here's an abbreviated snippet of how I'm doing it:

// Setup your session
session.beginConfiguration()
session.sessionPreset = .hd1280x720

// Add your camera to the session
guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front) else {
    fatalError("No suitable camera found")
}
let cameraInput = try AVCaptureDeviceInput(device: camera)
session.addInput(cameraInput)

// Important! Commit the session configuration before configuring your camera
session.commitConfiguration()

// Configure camera
try camera.lockForConfiguration()

// Force HDR on
camera.automaticallyAdjustsVideoHDREnabled = false
camera.isVideoHDREnabled = true

// Find the first 720p format that supports the correct colorspace
let desiredColorSpace = AVCaptureColorSpace.HLG_BT2020
let desiredFormat = camera.formats.first { format in
    // You could of course choose a different resolution if desired
    let dimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
    return dimensions.width == 1280 && dimensions.height == 720 &&
        format.supportedColorSpaces.contains(desiredColorSpace)
}

// Set the HDR format
if let format = desiredFormat {
    camera.activeFormat = format
    camera.activeColorSpace = desiredColorSpace
} else {
    assertionFailure("Counldn't find HDR camera format")
}

camera.unlockForConfiguration()
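
To go from this configuration to an actual file on disk, one option is to attach an AVCaptureMovieFileOutput and start recording. This part is a sketch I have not verified; recordingDelegate is a placeholder for your AVCaptureFileOutputRecordingDelegate:

// Sketch: record the configured session to a file. With an HLG BT.2020
// active color space, the movie file output is expected to produce
// 10-bit HDR footage on supported devices.
let movieOutput = AVCaptureMovieFileOutput()
if session.canAddOutput(movieOutput) {
    session.addOutput(movieOutput)
}
session.startRunning()

let outputURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("hdr-test.mov")
movieOutput.startRecording(to: outputURL, recordingDelegate: recordingDelegate)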
Calycle answered 21/6, 2021 at 21:47 Comment(0)

It's hard to say without the device in hand, but I would assume that (some of) the AVCaptureDevices in an iPhone 12 will offer formats that support HDR delivery (isVideoHDRSupported).

The corresponding AVCaptureVideoDataOutput's availableVideoPixelFormatTypes will probably list kCVPixelFormatType_420YpCbCr10BiPlanarFullRange and similar types as an option.
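
A quick sketch for checking this once a device is in hand (input wiring omitted):

import AVFoundation

let session = AVCaptureSession()
let videoOutput = AVCaptureVideoDataOutput()
if session.canAddOutput(videoOutput) {
    session.addOutput(videoOutput)
}

// Dump every pixel format the output can deliver; on an iPhone 12 the
// 10-bit types should show up in this list.
for type in videoOutput.availableVideoPixelFormatTypes {
    print(String(format: "0x%08x", type))
}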

Steamheated answered 15/10, 2020 at 6:37 Comment(3)
Well, the pixel format type should be 422 rather than 420. And AVCaptureMovieFileOutput should also show an HDR video recording option, which it doesn't. The only new API in iOS 14.1 is the addition of the BT2020 color space to AVCaptureColorSpace. isVideoHDRSupported is an old property of the device format, but strangely it has nothing to do with 10-bit sample buffer delivery or recording HDR videos. – Melodeemelodeon
If I understood it correctly, only the iPhone 12 models will support HDR recording. So I would assume that you only see HDR format support on a real iPhone 12 device. What new APIs did you expect? – Steamheated
Even on the iPhone 11 Pro, more than 30% of the formats return isVideoHDRSupported as true. The documentation for isVideoHDRSupported says "EDR is a separate and distinct feature from 10-bit HDR video (first seen in 2020 iPhones)". It is clearly not the same as 10-bit HDR video. – Melodeemelodeon

In case anyone is looking for more info on this, I found the following list within the iOS SDK. Note that the header comment states CoreVideo does not provide support for all of these formats.

/*
CoreVideo pixel format type constants.
CoreVideo does not provide support for all of these formats; this list just defines their names.
*/

public var kCVPixelFormatType_1Monochrome: OSType { get } /* 1 bit indexed */
public var kCVPixelFormatType_2Indexed: OSType { get } /* 2 bit indexed */
public var kCVPixelFormatType_4Indexed: OSType { get } /* 4 bit indexed */
public var kCVPixelFormatType_8Indexed: OSType { get } /* 8 bit indexed */
public var kCVPixelFormatType_1IndexedGray_WhiteIsZero: OSType { get } /* 1 bit indexed gray, white is zero */
public var kCVPixelFormatType_2IndexedGray_WhiteIsZero: OSType { get } /* 2 bit indexed gray, white is zero */
public var kCVPixelFormatType_4IndexedGray_WhiteIsZero: OSType { get } /* 4 bit indexed gray, white is zero */
public var kCVPixelFormatType_8IndexedGray_WhiteIsZero: OSType { get } /* 8 bit indexed gray, white is zero */
public var kCVPixelFormatType_16BE555: OSType { get } /* 16 bit BE RGB 555 */
public var kCVPixelFormatType_16LE555: OSType { get } /* 16 bit LE RGB 555 */
public var kCVPixelFormatType_16LE5551: OSType { get } /* 16 bit LE RGB 5551 */
public var kCVPixelFormatType_16BE565: OSType { get } /* 16 bit BE RGB 565 */
public var kCVPixelFormatType_16LE565: OSType { get } /* 16 bit LE RGB 565 */
public var kCVPixelFormatType_24RGB: OSType { get } /* 24 bit RGB */
public var kCVPixelFormatType_24BGR: OSType { get } /* 24 bit BGR */
public var kCVPixelFormatType_32ARGB: OSType { get } /* 32 bit ARGB */
public var kCVPixelFormatType_32BGRA: OSType { get } /* 32 bit BGRA */
public var kCVPixelFormatType_32ABGR: OSType { get } /* 32 bit ABGR */
public var kCVPixelFormatType_32RGBA: OSType { get } /* 32 bit RGBA */
public var kCVPixelFormatType_64ARGB: OSType { get } /* 64 bit ARGB, 16-bit big-endian samples */
public var kCVPixelFormatType_64RGBALE: OSType { get } /* 64 bit RGBA, 16-bit little-endian full-range (0-65535) samples */
public var kCVPixelFormatType_48RGB: OSType { get } /* 48 bit RGB, 16-bit big-endian samples */
public var kCVPixelFormatType_32AlphaGray: OSType { get } /* 32 bit AlphaGray, 16-bit big-endian samples, black is zero */
public var kCVPixelFormatType_16Gray: OSType { get } /* 16 bit Grayscale, 16-bit big-endian samples, black is zero */
public var kCVPixelFormatType_30RGB: OSType { get } /* 30 bit RGB, 10-bit big-endian samples, 2 unused padding bits (at least significant end). */
public var kCVPixelFormatType_422YpCbCr8: OSType { get } /* Component Y'CbCr 8-bit 4:2:2, ordered Cb Y'0 Cr Y'1 */
public var kCVPixelFormatType_4444YpCbCrA8: OSType { get } /* Component Y'CbCrA 8-bit 4:4:4:4, ordered Cb Y' Cr A */
public var kCVPixelFormatType_4444YpCbCrA8R: OSType { get } /* Component Y'CbCrA 8-bit 4:4:4:4, rendering format. full range alpha, zero biased YUV, ordered A Y' Cb Cr */
public var kCVPixelFormatType_4444AYpCbCr8: OSType { get } /* Component Y'CbCrA 8-bit 4:4:4:4, ordered A Y' Cb Cr, full range alpha, video range Y'CbCr. */
public var kCVPixelFormatType_4444AYpCbCr16: OSType { get } /* Component Y'CbCrA 16-bit 4:4:4:4, ordered A Y' Cb Cr, full range alpha, video range Y'CbCr, 16-bit little-endian samples. */
public var kCVPixelFormatType_444YpCbCr8: OSType { get } /* Component Y'CbCr 8-bit 4:4:4 */
public var kCVPixelFormatType_422YpCbCr16: OSType { get } /* Component Y'CbCr 10,12,14,16-bit 4:2:2 */
public var kCVPixelFormatType_422YpCbCr10: OSType { get } /* Component Y'CbCr 10-bit 4:2:2 */
public var kCVPixelFormatType_444YpCbCr10: OSType { get } /* Component Y'CbCr 10-bit 4:4:4 */
public var kCVPixelFormatType_420YpCbCr8Planar: OSType { get } /* Planar Component Y'CbCr 8-bit 4:2:0.  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrPlanar struct */
public var kCVPixelFormatType_420YpCbCr8PlanarFullRange: OSType { get } /* Planar Component Y'CbCr 8-bit 4:2:0, full range.  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrPlanar struct */
public var kCVPixelFormatType_422YpCbCr_4A_8BiPlanar: OSType { get } /* First plane: Video-range Component Y'CbCr 8-bit 4:2:2, ordered Cb Y'0 Cr Y'1; second plane: alpha 8-bit 0-255 */
public var kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange: OSType { get } /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, video-range (luma=[16,235] chroma=[16,240]).  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
public var kCVPixelFormatType_420YpCbCr8BiPlanarFullRange: OSType { get } /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, full-range (luma=[0,255] chroma=[1,255]).  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
public var kCVPixelFormatType_422YpCbCr8BiPlanarVideoRange: OSType { get } /* Bi-Planar Component Y'CbCr 8-bit 4:2:2, video-range (luma=[16,235] chroma=[16,240]).  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
public var kCVPixelFormatType_422YpCbCr8BiPlanarFullRange: OSType { get } /* Bi-Planar Component Y'CbCr 8-bit 4:2:2, full-range (luma=[0,255] chroma=[1,255]).  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
public var kCVPixelFormatType_444YpCbCr8BiPlanarVideoRange: OSType { get } /* Bi-Planar Component Y'CbCr 8-bit 4:4:4, video-range (luma=[16,235] chroma=[16,240]).  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
public var kCVPixelFormatType_444YpCbCr8BiPlanarFullRange: OSType { get } /* Bi-Planar Component Y'CbCr 8-bit 4:4:4, full-range (luma=[0,255] chroma=[1,255]).  baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
public var kCVPixelFormatType_422YpCbCr8_yuvs: OSType { get } /* Component Y'CbCr 8-bit 4:2:2, ordered Y'0 Cb Y'1 Cr */
public var kCVPixelFormatType_422YpCbCr8FullRange: OSType { get } /* Component Y'CbCr 8-bit 4:2:2, full range, ordered Y'0 Cb Y'1 Cr */
public var kCVPixelFormatType_OneComponent8: OSType { get } /* 8 bit one component, black is zero */
public var kCVPixelFormatType_TwoComponent8: OSType { get } /* 8 bit two component, black is zero */
public var kCVPixelFormatType_30RGBLEPackedWideGamut: OSType { get } /* little-endian RGB101010, 2 MSB are zero, wide-gamut (384-895) */
public var kCVPixelFormatType_ARGB2101010LEPacked: OSType { get } /* little-endian ARGB2101010 full-range ARGB */
public var kCVPixelFormatType_OneComponent10: OSType { get } /* 10 bit little-endian one component, stored as 10 MSBs of 16 bits, black is zero */
public var kCVPixelFormatType_OneComponent12: OSType { get } /* 12 bit little-endian one component, stored as 12 MSBs of 16 bits, black is zero */
public var kCVPixelFormatType_OneComponent16: OSType { get } /* 16 bit little-endian one component, black is zero */
public var kCVPixelFormatType_TwoComponent16: OSType { get } /* 16 bit little-endian two component, black is zero */
public var kCVPixelFormatType_OneComponent16Half: OSType { get } /* 16 bit one component IEEE half-precision float, 16-bit little-endian samples */
public var kCVPixelFormatType_OneComponent32Float: OSType { get } /* 32 bit one component IEEE float, 32-bit little-endian samples */
public var kCVPixelFormatType_TwoComponent16Half: OSType { get } /* 16 bit two component IEEE half-precision float, 16-bit little-endian samples */
public var kCVPixelFormatType_TwoComponent32Float: OSType { get } /* 32 bit two component IEEE float, 32-bit little-endian samples */
public var kCVPixelFormatType_64RGBAHalf: OSType { get } /* 64 bit RGBA IEEE half-precision float, 16-bit little-endian samples */
public var kCVPixelFormatType_128RGBAFloat: OSType { get } /* 128 bit RGBA IEEE float, 32-bit little-endian samples */
public var kCVPixelFormatType_14Bayer_GRBG: OSType { get } /* Bayer 14-bit Little-Endian, packed in 16-bits, ordered G R G R... alternating with B G B G... */
public var kCVPixelFormatType_14Bayer_RGGB: OSType { get } /* Bayer 14-bit Little-Endian, packed in 16-bits, ordered R G R G... alternating with G B G B... */
public var kCVPixelFormatType_14Bayer_BGGR: OSType { get } /* Bayer 14-bit Little-Endian, packed in 16-bits, ordered B G B G... alternating with G R G R... */
public var kCVPixelFormatType_14Bayer_GBRG: OSType { get } /* Bayer 14-bit Little-Endian, packed in 16-bits, ordered G B G B... alternating with R G R G... */
public var kCVPixelFormatType_DisparityFloat16: OSType { get } /* IEEE754-2008 binary16 (half float), describing the normalized shift when comparing two images. Units are 1/meters: ( pixelShift / (pixelFocalLength * baselineInMeters) ) */
public var kCVPixelFormatType_DisparityFloat32: OSType { get } /* IEEE754-2008 binary32 float, describing the normalized shift when comparing two images. Units are 1/meters: ( pixelShift / (pixelFocalLength * baselineInMeters) ) */
public var kCVPixelFormatType_DepthFloat16: OSType { get } /* IEEE754-2008 binary16 (half float), describing the depth (distance to an object) in meters */
public var kCVPixelFormatType_DepthFloat32: OSType { get } /* IEEE754-2008 binary32 float, describing the depth (distance to an object) in meters */
public var kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange: OSType { get } /* 2 plane YCbCr10 4:2:0, each 10 bits in the MSBs of 16bits, video-range (luma=[64,940] chroma=[64,960]) */
public var kCVPixelFormatType_422YpCbCr10BiPlanarVideoRange: OSType { get } /* 2 plane YCbCr10 4:2:2, each 10 bits in the MSBs of 16bits, video-range (luma=[64,940] chroma=[64,960]) */
public var kCVPixelFormatType_444YpCbCr10BiPlanarVideoRange: OSType { get } /* 2 plane YCbCr10 4:4:4, each 10 bits in the MSBs of 16bits, video-range (luma=[64,940] chroma=[64,960]) */
public var kCVPixelFormatType_420YpCbCr10BiPlanarFullRange: OSType { get } /* 2 plane YCbCr10 4:2:0, each 10 bits in the MSBs of 16bits, full-range (Y range 0-1023) */
public var kCVPixelFormatType_422YpCbCr10BiPlanarFullRange: OSType { get } /* 2 plane YCbCr10 4:2:2, each 10 bits in the MSBs of 16bits, full-range (Y range 0-1023) */
public var kCVPixelFormatType_444YpCbCr10BiPlanarFullRange: OSType { get } /* 2 plane YCbCr10 4:4:4, each 10 bits in the MSBs of 16bits, full-range (Y range 0-1023) */
public var kCVPixelFormatType_420YpCbCr8VideoRange_8A_TriPlanar: OSType { get } /* first and second planes as per 420YpCbCr8BiPlanarVideoRange (420v), alpha 8 bits in third plane full-range.  No CVPlanarPixelBufferInfo struct. */
public var kCVPixelFormatType_16VersatileBayer: OSType { get } /* Single plane Bayer 16-bit little-endian sensor element ("sensel") samples from full-size decoding of ProRes RAW images; Bayer pattern (sensel ordering) and other raw conversion information is described via buffer attachments */
public var kCVPixelFormatType_64RGBA_DownscaledProResRAW: OSType { get } /* Single plane 64-bit RGBA (16-bit little-endian samples) from downscaled decoding of ProRes RAW images; components--which may not be co-sited with one another--are sensel values and require raw conversion, information for which is described via buffer attachments */
Ferreous answered 10/12, 2020 at 9:49 Comment(0)
