In Objective-C I used to convert an unsigned integer into a hex string with:
NSString *st = [NSString stringWithFormat:@"%2X", n];
I tried for a long time to translate this into Swift but unsuccessfully.
You can now do:
let n = 14
var st = String(format: "%02X", n)
st += " is the hexadecimal representation of \(n)"
print(st)
// Output: 0E is the hexadecimal representation of 14
Note: The 2 in this example is the field width and represents the minimum length desired. The 0 tells it to pad the result with leading 0s if necessary. (Without the 0, the result would be padded with leading spaces.) Of course, if the result is larger than two characters, the field width will not clip it to two; it will expand to whatever length is necessary to display the full result.
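A quick illustration of the field width and padding behavior:
import Foundation

String(format: "%2X", 14)     // " E"   – width 2, padded with a leading space
String(format: "%02X", 14)    // "0E"   – width 2, padded with a leading zero
String(format: "%02X", 64206) // "FACE" – result wider than 2, so the field expands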
This only works if you have Foundation imported (this includes the import of Cocoa or UIKit). This isn't a problem if you're doing iOS or macOS programming.
Use uppercase X if you want A...F and lowercase x if you want a...f:
String(format: "%x %X", 64206, 64206) // "face FACE"
If you want to print integer values larger than UInt32.max, add ll (el-el, not eleven) to the format string:
let n = UInt64.max
print(String(format: "%llX is hexadecimal for \(n)", n))
// Output: FFFFFFFFFFFFFFFF is hexadecimal for 18446744073709551615
Original Answer
You can still use NSString to do this. The format is:
var st = NSString(format: "%2X", n)
This makes st an NSString, so things like += do not work. If you want to be able to append to the string with +=, make st into a String like this:
var st = NSString(format: "%2X", n) as String
or
var st = String(NSString(format: "%2X", n))
or
var st: String = NSString(format: "%2X", n)
Then you can do:
let n = 123
var st = NSString(format:"%2X", n) as String
st += " is the hexadecimal representation of \(n)"
// "7B is the hexadecimal representation of 123"
In Swift there is a specific init method on String for exactly this:
let hex = String(0xF, radix: 16, uppercase: false)
print("hex=\(hex)") // Output: hex=f
"0xF"
to a UInt
? –
Norry let number = UInt("0xF".stringByReplacingOccurrencesOfString("0x", withString:""), radix: 16)
, number
will be of type UInt?
If you need more ask a question :) –
Norry String(0xf, radix: 0x10, uppercase: false)
–
Hake [NSString stringWithFormat:]
method, e.g. [NSString stringWithFormat:@"%X", 0xF]
–
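Note that stringByReplacingOccurrencesOfString in the comment above is the Swift 2 spelling; in current Swift the same conversion looks like this (a minimal sketch, with Foundation imported for replacingOccurrences):
import Foundation

let number = UInt("0xF".replacingOccurrences(of: "0x", with: ""), radix: 16)
// number is a UInt? containing Optional(15)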
With Swift 5, depending on your needs, you may choose one of the three following methods to solve your problem.
1. Using String's init(_:radix:uppercase:) initializer
Swift String has an init(_:radix:uppercase:) initializer with the following declaration:
init<T>(_ value: T, radix: Int = 10, uppercase: Bool = false) where T : BinaryInteger
Creates a string representing the given value in base 10, or some other specified base.
The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format by using init(_:radix:uppercase:) and without having to import Foundation:
let string1 = String(2, radix: 16)
print(string1) // prints: "2"
let string2 = String(211, radix: 16)
print(string2) // prints: "d3"
let string3 = String(211, radix: 16, uppercase: true)
print(string3) // prints: "D3"
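Unlike the %02X format specifier, init(_:radix:uppercase:) does no padding. If you need a fixed width, one way is to pad manually; a minimal sketch, also without Foundation:
let value = 1
var hex = String(value, radix: 16, uppercase: true)
// Left-pad with "0" up to a width of 2
if hex.count < 2 {
    hex = String(repeating: "0", count: 2 - hex.count) + hex
}
print(hex) // prints: "01"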
2. Using String's init(format:_:) initializer
Foundation provides String an init(format:_:) initializer. init(format:_:) has the following declaration:
init(format: String, _ arguments: CVarArg...)
Returns a String object initialized by using a given format string as a template into which the remaining argument values are substituted.
Apple's String Programming Guide gives a list of the format specifiers that are supported by String and NSString. Among those format specifiers, %X has the following description:
Unsigned 32-bit integer (unsigned int), printed in hexadecimal using the digits 0–9 and uppercase A–F.
The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format with init(format:_:):
import Foundation
let string1 = String(format:"%X", 2)
print(string1) // prints: "2"
let string2 = String(format:"%02X", 1)
print(string2) // prints: "01"
let string3 = String(format:"%02X", 211)
print(string3) // prints: "D3"
let string4 = String(format: "%02X, %02X, %02X", 12, 121, 255)
print(string4) // prints: "0C, 79, FF"
3. Using String's init(format:arguments:) initializer
Foundation provides String an init(format:arguments:) initializer. init(format:arguments:) has the following declaration:
init(format: String, arguments: [CVarArg])
Returns a String object initialized by using a given format string as a template into which the remaining argument values are substituted according to the user's default locale.
The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format with init(format:arguments:):
import Foundation
let string1 = String(format:"%X", arguments: [2])
print(string1) // prints: "2"
let string2 = String(format:"%02X", arguments: [1])
print(string2) // prints: "01"
let string3 = String(format:"%02X", arguments: [211])
print(string3) // prints: "D3"
let string4 = String(format: "%02X, %02X, %02X", arguments: [12, 121, 255])
print(string4) // prints: "0C, 79, FF"
Swift 5.2.4
let value = 200
let hexString = String(format: "%02X", value)
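For reference, 200 is C8 in hex, so:
print(hexString) // prints: "C8"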
The answers above work fine for values in the range of a 32-bit Int, but values over this won't work, as the value will roll over. You need to use the length modifier for values greater than a 32-bit Int:
%x = Unsigned 32-bit integer (unsigned int)
ll = Length modifier specifying that a following d, o, u, x, or X conversion specifier applies to a long long or unsigned long long argument.
let hexString = String(format: "%llX", decimalValue)
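A quick sketch of the rollover (the truncated output shown assumes Apple's 64-bit platforms, where unsigned int is 32 bits):
import Foundation

let big: UInt64 = 0x1_0000_00FF // needs more than 32 bits
String(format: "%X", big)       // "FF" – the high bits are lost to 32-bit truncation
String(format: "%llX", big)     // "1000000FF" – the ll modifier keeps all 64 bits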
Int("hexaString", radix: 16) – Kamkama
To use zero-padding to two digits:
let string2 = String(format: "%02X", 1)
print(string2) // prints: "01"
In Swift 3, importing Foundation is not required, at least not in a project. String should have all the functionality of NSString.
Use this
import Foundation
import SwiftUI

extension Color {
    // Builds a Color from a signed 32-bit ARGB integer (0 is treated as .clear).
    init?(fromInt signedDecimal: Int) {
        if signedDecimal == 0 {
            self = .clear
            return
        }
        // Reinterpret the signed value's bits as an unsigned ARGB pattern.
        let input = Int32(signedDecimal)
        let bits = UInt32(bitPattern: input)
        // Pad to 8 hex digits so the alpha byte survives for small values.
        var hexARGB = String(bits, radix: 16, uppercase: true)
        hexARGB = String(repeating: "0", count: 8 - hexARGB.count) + hexARGB
        self.init(hex: hexARGB)
    }

    // Accepts "RRGGBB" or "AARRGGBB", with an optional leading "#".
    init?(hex: String) {
        var hexSanitized = hex.trimmingCharacters(in: .whitespacesAndNewlines)
        hexSanitized = hexSanitized.replacingOccurrences(of: "#", with: "")

        var rgb: UInt64 = 0
        var r: CGFloat = 0.0
        var g: CGFloat = 0.0
        var b: CGFloat = 0.0
        var a: CGFloat = 1.0

        let length = hexSanitized.count
        guard Scanner(string: hexSanitized).scanHexInt64(&rgb) else { return nil }

        if length == 6 {
            r = CGFloat((rgb & 0xFF0000) >> 16) / 255.0
            g = CGFloat((rgb & 0x00FF00) >> 8) / 255.0
            b = CGFloat(rgb & 0x0000FF) / 255.0
        } else if length == 8 {
            a = CGFloat((rgb & 0xFF000000) >> 24) / 255.0
            r = CGFloat((rgb & 0x00FF0000) >> 16) / 255.0
            g = CGFloat((rgb & 0x0000FF00) >> 8) / 255.0
            b = CGFloat(rgb & 0x000000FF) / 255.0
        } else {
            return nil
        }

        self.init(red: r, green: g, blue: b, opacity: a)
    }
}
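A quick usage sketch (the sample values below are arbitrary):
let red = Color(hex: "#FF0000")            // opaque red (RRGGBB)
let translucent = Color(hex: "80FF0000")   // 50% alpha red (AARRGGBB)
let fromStored = Color(fromInt: -16776961) // 0xFF0000FF as a signed Int32: opaque blue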
It works for me, converting 42 to "2A". – Crocoite
This gives a Color representation for the hex, which is not what was asked. – Theophany
let s = "0x" + String(n, radix: 16) – Bortz
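Building on that last comment, a minimal helper sketch (the hexPrefixed property name is illustrative, not from any answer above):
extension BinaryInteger {
    // "0x"-prefixed uppercase hex, e.g. 255 -> "0xFF"; no Foundation needed.
    var hexPrefixed: String {
        "0x" + String(self, radix: 16, uppercase: true)
    }
}

print(255.hexPrefixed) // prints: "0xFF"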