iOS 13's CryptoKit framework provides a rawRepresentation value for ECDSA public and private keys. I've been trying to reverse-engineer this representation so I can convert between it and JWK. Judging by the 64-byte length of the public key representation, it seems to be a simple x || y concatenation. I would guess the private key would then be x || y || d, but that doesn't seem to be the case: that would yield a 96-byte value, while the actual rawRepresentation is 144 bytes. It doesn't appear to be valid DER/ASN.1 either. I haven't found a spec that lines up with the values I'm actually getting.
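For reference, this is roughly the direction I'm assuming for the public key, treating the 64 bytes as x || y (a sketch only; the helper names are mine, and the base64url encoding follows what JWK expects per RFC 7515):

```swift
import Foundation
import CryptoKit

// Base64url without padding, as JWK uses.
func base64url(_ data: Data) -> String {
    data.base64EncodedString()
        .replacingOccurrences(of: "+", with: "-")
        .replacingOccurrences(of: "/", with: "_")
        .replacingOccurrences(of: "=", with: "")
}

// Treat the 64-byte public rawRepresentation as x || y and build the JWK fields.
func publicKeyJWK(_ key: P256.Signing.PublicKey) -> [String: String] {
    let raw = key.rawRepresentation          // 64 bytes for P-256
    let x = Data(raw.prefix(32))
    let y = Data(raw.suffix(32))
    return ["kty": "EC", "crv": "P-256", "x": base64url(x), "y": base64url(y)]
}
```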
As you might guess, Apple's docs are very descriptive:

rawRepresentation: A representation of the private key as a collection of bytes.
Here's an example key pair in hex:
Private: 988f8187ff7f00007466815b0d6b02ae1a063198fd1e4923fb1e413195126cc00d30483284186b435726c0c69cc774274ea32eb6a17cbaf2ea88dd7f3a5a2a3ce637bc4b96523c2795035bd2fbeb093b010000000000000000000000000000000000000000000000000000000000000012b2b61abe8beae5aeb6d0bda739235364de96c7f498813cfb0336198dcf9063
Public: 2774c79cc6c02657436b18843248300dc06c129531411efb23491efd9831061a3b09ebfbd25b0395273c52964bbc37e63c2a5a3a7fdd88eaf2ba7ca1b62ea34e
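A minimal sketch of the kind of hex dump I'm comparing against (the hex helper is my own; it just prints rawRepresentation byte by byte):

```swift
import Foundation
import CryptoKit

// Hex-encode a Data value for inspection.
func hex(_ data: Data) -> String {
    data.map { String(format: "%02x", $0) }.joined()
}

let key = P256.Signing.PrivateKey()
print("Private (\(key.rawRepresentation.count) bytes):", hex(key.rawRepresentation))
print("Public  (\(key.publicKey.rawRepresentation.count) bytes):", hex(key.publicKey.rawRepresentation))
```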
What format could this be?
I did try a 96-byte x || y || d value (32 bytes for x, 32 for y, and 32 for d). iOS seems to expect 144 bytes and throws an error when I try to use 96 bytes. – Tailpiece