With Swift 5, you can pick one of the following approaches to get the ASCII numeric representation of a character.
#1. Using Character's asciiValue property
Character has an asciiValue property, which has the following declaration:
var asciiValue: UInt8? { get }
The ASCII encoding value of this character, if it is an ASCII character.
The following Playground sample code shows how to use asciiValue to get the ASCII encoding value of a character:
let character: Character = "a"
print(character.asciiValue) //prints: Optional(97)
let string = "a"
print(string.first?.asciiValue) //prints: Optional(97)
let character: Character = ""
print(character.asciiValue) //prints: nil
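Because asciiValue returns an optional, you will usually unwrap it or filter out non-ASCII characters. As a small additional sketch (not part of the examples above, but using only the same asciiValue property), you can combine it with compactMap to collect the ASCII codes of all characters in a string:

let message = "Hello!"
let asciiCodes = message.compactMap { $0.asciiValue } // drops any non-ASCII characters
print(asciiCodes) //prints: [72, 101, 108, 108, 111, 33]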
#2. Using Character's isASCII property and Unicode.Scalar's value property
As an alternative, you can check that a character is an ASCII character (using Character's isASCII property) and then get the numeric representation of its first Unicode scalar (using Unicode.Scalar's value property). The Playground sample code below shows how to proceed:
let character: Character = "a"
if character.isASCII, let scalar = character.unicodeScalars.first {
    print(scalar.value)
} else {
    print("Not an ASCII character")
}
/*
prints: 97
*/
let string = "a"
if let character = string.first, character.isASCII, let scalar = character.unicodeScalars.first {
    print(scalar.value)
} else {
    print("Not an ASCII character")
}
/*
prints: 97
*/
let character: Character = ""
if character.isASCII, let scalar = character.unicodeScalars.first {
print(scalar.value)
} else {
print("Not an ASCII character")
}
/*
prints: Not an ASCII character
*/
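If you ever need the reverse conversion, here is a short extra sketch (an addition beyond the two approaches above): Unicode.Scalar has an initializer that takes a UInt8, and Character can be created from a Unicode.Scalar, so an ASCII code can be turned back into a character:

let asciiCode: UInt8 = 97
let scalar = Unicode.Scalar(asciiCode) // always succeeds for a UInt8 value
let character = Character(scalar)
print(character) //prints: a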