This Stack Overflow question deals with 16-bit Unicode characters. I would like a similar solution that supports 32-bit characters, i.e. code points above U+FFFF. See this link for a listing of the various Unicode charts. For example, the Musical Symbols block is one range of such characters.
The answer in the question linked above doesn't work because it casts the System.Int32 value to a System.Char, which is a 16-bit type and therefore cannot hold code points above U+FFFF.
Edit: Let me clarify that I don't particularly care about displaying the 32-bit Unicode character, I just want to store the character in a string variable.
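To make the failure concrete, here is a minimal sketch of what that cast does (the exact error text may vary by PowerShell version):

[char]0x2603     # works: U+2603 fits in a single 16-bit System.Char
[char]0x1D11E    # throws: 119070 is too large for System.Char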
Edit #2: I wrote a PowerShell snippet that uses the information from the accepted answer and its comments. I would have put this in another comment, but comments can't be multi-line.
$inputValue = '1D11E'                                            # code point U+1D11E (MUSICAL SYMBOL G CLEF)
$hexValue = [int]"0x$inputValue" - 0x10000                       # offset from U+10000
# Use [math]::Floor rather than a bare [int] cast: [int] rounds to the nearest
# integer, which corrupts the high surrogate for many supplementary code points.
$highSurrogate = [int][math]::Floor($hexValue / 0x400) + 0xD800  # top 10 bits of the offset
$lowSurrogate = $hexValue % 0x400 + 0xDC00                       # bottom 10 bits of the offset
$stringValue = [char]$highSurrogate + [char]$lowSurrogate        # concatenating two [char]s yields a [string]
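As a quick sanity check, ConvertToUtf32 (the inverse .NET helper) recovers the original code point from the pair, assuming the snippet above has run:

$stringValue.Length                                  # 2 -- one code point, two UTF-16 code units
'{0:X}' -f [char]::ConvertToUtf32($stringValue, 0)   # 1D11E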
Dour High Arch still deserves credit for the answer, which helped me finally understand surrogate pairs.
â is encoded to C3 A2 in UTF-8, and to E2 in ISO-8859-1 (aka Latin-1). – Greene
echo "`u{1F44D}" or echo [char]::ConvertFromUtf32(0x1F44D) – Gonta
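Following up on those comments, a short sketch that exercises both suggestions (the `u{} string escape requires PowerShell 7 or later; ConvertFromUtf32 is available wherever .NET is):

$thumbsUp = "`u{1F44D}"                                                          # PowerShell 7+ escape for U+1F44D
$thumbsUp -eq [char]::ConvertFromUtf32(0x1F44D)                                  # True -- both yield the same surrogate pair
[System.Text.Encoding]::UTF8.GetBytes('â') | ForEach-Object { '{0:X2}' -f $_ }   # C3, A2 -- matches Greene's comment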