How to convert an Int to hex String?
In Objective-C I used to convert an unsigned integer into a hex string with:

NSString *st = [NSString stringWithFormat:@"%2X", n];

I tried for a long time to translate this into Swift but unsuccessfully.

Vescuso answered 15/6, 2014 at 12:17 Comment(1)
Based on the answer by stackoverflow.com/users/1187415/martin-r in #24986819, you can do let s = "0x" + String(n, radix: 16) – Bortz
181

You can now do:

let n = 14
var st = String(format:"%02X", n)
st += " is the hexadecimal representation of \(n)"
print(st)
0E is the hexadecimal representation of 14

Note: The 2 in this example is the field width and represents the minimum length desired. The 0 tells it to pad the result with leading 0's if necessary. (Without the 0, the result would be padded with leading spaces). Of course, if the result is larger than two characters, the field length will not be clipped to a width of 2; it will expand to whatever length is necessary to display the full result.
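For instance, the three variants compare like this (a minimal sketch; Foundation is assumed to be imported):

```swift
import Foundation

let n = 14
print(String(format: "%X", n))       // "E"   (no padding)
print(String(format: "%2X", n))      // " E"  (padded with a leading space to width 2)
print(String(format: "%02X", n))     // "0E"  (padded with a leading zero to width 2)
print(String(format: "%02X", 64206)) // "FACE" (wider than 2, so nothing is clipped)
```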

This only works if you have Foundation imported (this includes the import of Cocoa or UIKit). This isn't a problem if you're doing iOS or macOS programming.

Use uppercase X if you want A...F and lowercase x if you want a...f:

String(format: "%x %X", 64206, 64206)  // "face FACE"

If you want to print integer values larger than UInt32.max, add ll (el-el, not eleven) to the format string:

let n = UInt64.max
print(String(format: "%llX is hexadecimal for \(n)", n))
FFFFFFFFFFFFFFFF is hexadecimal for 18446744073709551615

Original Answer

You can still use NSString to do this. The format is:

var st = NSString(format:"%2X", n)

This makes st an NSString, so then things like += do not work. If you want to be able to append to the string with += make st into a String like this:

var st = NSString(format:"%2X", n) as String

or

var st = String(NSString(format:"%2X", n))

or

var st: String = NSString(format:"%2X", n)

Then you can do:

let n = 123
var st = NSString(format:"%2X", n) as String
st += " is the hexadecimal representation of \(n)"
// "7B is the hexadecimal representation of 123"
Toxophilite answered 15/6, 2014 at 12:21 Comment(10)
The hex and decimal numbers are reversed in "7B is 123 in hex", so the st expression should be corrected to: st = "\(n) is " + String(format:"%2X", n) + " in hex" – Modestamodeste
@KevinSliech, I'll admit that it is ambiguous. I meant it as if someone came up and asked, "What does 7B represent?", and you'd answer, "7B is the number 123 written in hexadecimal." – Toxophilite
I find both of those a little ambiguous, especially now that I've read both ~1000 times. Another way would be to write "0x7B = 123d" or something like that. The ambiguity might trip up folks not well versed in alternate number systems. – Modestamodeste
@KevinSliech, thanks. I made the statement clearer. – Toxophilite
I think %2X should be %2llX – Tiruchirapalli
@lbsweek, thanks for the feedback. I updated the answer. – Toxophilite
In Swift 4 you must use "%02X" instead of "%2X" if you want a leading "0" where necessary ("%2X" inserts a space). – Bomar
Thanks for the heads-up, @Bubu. I'll update my answer. – Toxophilite
Note: this won't work with Doubles (e.g. 1.0); cast to Int first. – Rosalba
To get a zero-filled UInt64 value, do this: String(format: "%016llX", n). For n = 917320791915679476, this returns "0CBAFA666B36CEF4" – Grimy
62

In Swift there is a specific init method on String for exactly this:

let hex = String(0xF, radix: 16, uppercase: false)
print("hex=\(hex)") // Output: f
Norry answered 15/2, 2015 at 22:20 Comment(6)
How can I convert this String to UInt? – Tiphani
You want to convert "0xF" to a UInt? – Norry
let number = UInt("0xF".stringByReplacingOccurrencesOfString("0x", withString: ""), radix: 16); number will be of type UInt?. If you need more, ask a question :) – Norry
Mayhap: String(0xf, radix: 0x10, uppercase: false) – Hake
@Norry Do we have this radix method for NSString in Objective-C? – Kerouac
@NareshNallamsetty A bit late, but you'd have to use the [NSString stringWithFormat:] method, e.g. [NSString stringWithFormat:@"%X", 0xF] – Norry
55

With Swift 5, depending on your needs, you may choose one of the following three methods to solve your problem.


#1. Using String's init(_:radix:uppercase:) initializer

Swift String has a init(_:radix:uppercase:) initializer with the following declaration:

init<T>(_ value: T, radix: Int = 10, uppercase: Bool = false) where T : BinaryInteger

Creates a string representing the given value in base 10, or some other specified base.

The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format by using init(_:radix:uppercase:) and without having to import Foundation:

let string1 = String(2, radix: 16)
print(string1) // prints: "2"

let string2 = String(211, radix: 16)
print(string2) // prints: "d3"

let string3 = String(211, radix: 16, uppercase: true)
print(string3) // prints: "D3"
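Going the other way, Swift's failable Int(_:radix:) initializer parses a hexadecimal string back into an integer (case-insensitively), returning nil for invalid input:

```swift
let back = Int("d3", radix: 16)     // Optional(211)
let upper = Int("D3", radix: 16)    // Optional(211)
let invalid = Int("zz", radix: 16)  // nil
```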

#2. Using String's init(format:_:) initializer

Foundation provides String with an init(format:_:) initializer, which has the following declaration:

init(format: String, _ arguments: CVarArg...)

Returns a String object initialized by using a given format string as a template into which the remaining argument values are substituted.

Apple's String Programming Guide gives a list of the format specifiers that are supported by String and NSString. Among those format specifiers, %X has the following description:

Unsigned 32-bit integer (unsigned int), printed in hexadecimal using the digits 0–9 and uppercase A–F.

The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format with init(format:_:):

import Foundation

let string1 = String(format:"%X", 2)
print(string1) // prints: "2"

let string2 = String(format:"%02X", 1)
print(string2) // prints: "01"

let string3 = String(format:"%02X", 211)
print(string3) // prints: "D3"

let string4 = String(format: "%02X, %02X, %02X", 12, 121, 255)
print(string4) // prints: "0C, 79, FF"

#3. Using String's init(format:arguments:) initializer

Foundation provides String with an init(format:arguments:) initializer, which has the following declaration:

init(format: String, arguments: [CVarArg])

Returns a String object initialized by using a given format string as a template into which the remaining argument values are substituted according to the user’s default locale.

The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format with init(format:arguments:):

import Foundation

let string1 = String(format:"%X", arguments: [2])
print(string1) // prints: "2"

let string2 = String(format:"%02X", arguments: [1])
print(string2) // prints: "01"

let string3 = String(format:"%02X", arguments: [211])
print(string3) // prints: "D3"

let string4 = String(format: "%02X, %02X, %02X", arguments: [12, 121, 255])
print(string4) // prints: "0C, 79, FF"
Nastassia answered 6/1, 2016 at 10:28 Comment(0)
7

Swift 5.2.4

let value = 200
let hexString = String(format: "%02X", value)
Clown answered 20/7, 2020 at 0:14 Comment(0)
3

The answers above work fine for values within the range of a 32-bit Int, but larger values will roll over and give the wrong result.

You need to use the ll length modifier for values greater than a 32-bit Int:

%x = Unsigned 32-bit integer (unsigned int)

ll = Length modifiers specifying that a following d, o, u, x, or X conversion specifier applies to a long long or unsigned long long argument.

let hexString = String(format:"%llX", decimalValue)
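As a quick sketch (the value here is just an example), compare the plain and ll-modified specifiers on a 64-bit value:

```swift
import Foundation

let big: UInt64 = 0x1122334455667788
print(String(format: "%llX", big)) // "1122334455667788" (the full 64-bit value)
print(String(format: "%X", big))   // only the low 32 bits survive on most platforms
```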
Fertile answered 27/7, 2017 at 21:30 Comment(3)
Please add code to convert hex back to decimal too. – Allness
@VarunNaharia Int("hexaString", radix: 16) – Kamkama
Hmm, this doesn't appear to work anymore: Argument type 'Decimal' does not conform to expected type 'CVarArg'. – Puberty
2

To use:

let string2 = String(format:"%02X", 1)
print(string2) // prints: "01"

In Swift 3, import Foundation is not required, at least not in a project; String should have all the functionality of NSString.

Floriated answered 6/11, 2016 at 0:15 Comment(0)
-1

Use this:

extension Color {
    
    /// Interprets a signed integer's 32-bit pattern as an ARGB color value.
    init?(fromInt signedDecimal: Int) {
        if signedDecimal == 0 {
            self = .clear; return
        }
        
        // Reinterpret the signed value's bit pattern as unsigned, then format it as hex.
        let input = Int32(signedDecimal)
        let inputCastUInt32 = UInt32(bitPattern: input)
        let hexColorARGB = String(inputCastUInt32, radix: 16, uppercase: true)
        self.init(hex: hexColorARGB)
    }
    
    /// Creates a Color from an "RRGGBB" or "AARRGGBB" hex string (a leading "#" is stripped).
    init?(hex: String) {
        var hexSanitized = hex.trimmingCharacters(in: .whitespacesAndNewlines)
        hexSanitized = hexSanitized.replacingOccurrences(of: "#", with: "")

        var rgb: UInt64 = 0

        var r: CGFloat = 0.0
        var g: CGFloat = 0.0
        var b: CGFloat = 0.0
        var a: CGFloat = 1.0

        let length = hexSanitized.count

        guard Scanner(string: hexSanitized).scanHexInt64(&rgb) else { return nil }

        if length == 6 {
            r = CGFloat((rgb & 0xFF0000) >> 16) / 255.0
            g = CGFloat((rgb & 0x00FF00) >> 8) / 255.0
            b = CGFloat(rgb & 0x0000FF) / 255.0

        } else if length == 8 {
            a = CGFloat((rgb & 0xFF000000) >> 24) / 255.0
            r = CGFloat((rgb & 0x00FF0000) >> 16) / 255.0
            g = CGFloat((rgb & 0x0000FF00) >> 8) / 255.0
            b = CGFloat(rgb & 0x000000FF) / 255.0

        } else {
            return nil
        }

        self.init(red: r, green: g, blue: b, opacity: a)
    }
}

It works for me.

Maroc answered 19/4 at 17:16 Comment(2)
The question is not asking how to convert a hex color code into a Color. The question is far more basic. It is asking how to convert a base-10 integer number into a base-16 representation as a string. Such as 42 to "2A". – Crocoite
Your answer is not related to the question. The user wants to know how to simply convert an Int (integer form, base 10) to a hex (hexadecimal form, base 16) string. You are showing how to get the Color representation for the hex, which is not what was asked. – Theophany

© 2022 - 2024 — McMap. All rights reserved.