(The answer has been updated for Swift 4 and later.)
Using the Swift types Data and String, this can be done as follows:
let myUInt32Array: [UInt32] = [72, 101, 108, 108, 111, 128049, 127465, 127466]
let data = Data(bytes: myUInt32Array, count: myUInt32Array.count * MemoryLayout<UInt32>.stride)
let myString = String(data: data, encoding: .utf32LittleEndian)!
print(myString) // Hello🐱🇩🇪
A forced unwrap is used here because the conversion from valid UTF-32 code points to a string cannot fail.
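If the array might contain values that are not valid Unicode scalar values (for example isolated surrogate values), the failable String(data:encoding:) initializer may return nil, so a minimal sketch with optional binding instead of a forced unwrap would be:
// Same `data` as above; check the result instead of force-unwrapping.
if let safeString = String(data: data, encoding: .utf32LittleEndian) {
    print(safeString)
} else {
    print("not a valid UTF-32 sequence")
}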
You can define a String extension for your convenience:
extension String {
    init(utf32chars: [UInt32]) {
        let data = Data(bytes: utf32chars, count: utf32chars.count * MemoryLayout<UInt32>.stride)
        self = String(data: data, encoding: .utf32LittleEndian)!
    }
}
and use it as
let myUInt32Array: [UInt32] = [72, 101, 108, 108, 111, 128049, 127465, 127466]
let myString = String(utf32chars: myUInt32Array)
print(myString) // Hello🐱🇩🇪
And just for completeness, the generic converter
from https://mcmap.net/q/23389/-is-there-a-way-to-create-a-string-from-utf16-array-in-swift
extension String {
    init?<C: UnicodeCodec>(codeUnits: [C.CodeUnit], codec: C) {
        var codec = codec
        var str = ""
        var generator = codeUnits.makeIterator()
        var done = false
        while !done {
            let r = codec.decode(&generator)
            switch r {
            case .emptyInput:
                done = true
            case .scalarValue(let val):
                str.unicodeScalars.append(val)
            case .error:
                return nil
            }
        }
        self = str
    }
}
can be used with UTF-8, UTF-16 and UTF-32 input. In your case it would be
let myUInt32Array: [UInt32] = [72, 101, 108, 108, 111, 128049, 127465, 127466]
let myString = String(codeUnits: myUInt32Array, codec: UTF32())!
print(myString) // Hello🐱🇩🇪
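As an example of the other encodings (assuming the generic extension above is in scope), the same initializer also decodes UTF-8 code units; the byte values below are simply the UTF-8 encoding of "Hello🐱":
// "Hello" followed by the 4-byte UTF-8 sequence for U+1F431 (🐱).
let utf8Bytes: [UInt8] = [72, 101, 108, 108, 111, 0xF0, 0x9F, 0x90, 0xB1]
let utf8String = String(codeUnits: utf8Bytes, codec: UTF8())!
print(utf8String) // Hello🐱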