UnicodeScalar
In Swift, a Character can be constructed from a UnicodeScalar value, and a UnicodeScalar can be built up from an Int. This allows many conversions between numbers and characters.
With the utf8 and utf16 properties on a String, we can access the underlying integer values (UInt8 and UInt16) of its characters. With these values, we can apply number-based conversions to characters.
Int to character
This example converts an Int that represents an ASCII char to an actual Character. The number 97 indicates a lowercase "a." We first create a UnicodeScalar instance with an init method, and then create a Character from that.

let value = 97
// Convert Int to a UnicodeScalar (the initializer takes a UInt32).
let u = UnicodeScalar(UInt32(value))!
// Convert UnicodeScalar to a Character.
let char = Character(u)
// Write results.
print(char)

Output:
a
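Going the other direction, from a Character back to its numeric value, is also possible. The following is a minimal sketch (not part of the original example) that assumes an ASCII character and reads its value through the asciiValue and unicodeScalars properties.

let letter: Character = "a"
// asciiValue returns an optional UInt8 for ASCII characters (assumed ASCII input).
if let ascii = letter.asciiValue {
    print(ascii)          // 97
}
// unicodeScalars works for any character; value is a UInt32.
if let scalar = letter.unicodeScalars.first {
    print(scalar.value)   // 97
}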
Here we apply the utf16 and utf8 properties on a String. These return integer values that represent the characters in a String.
The utf16 property yields UInt16 values. We can convert these to an Int or cast them with no errors. The utf8 property yields UInt8 chars. It is important that the string contain only ASCII characters when we use this approach, since a non-ASCII character spans multiple UTF-8 bytes and no longer maps one-to-one to a value.

let value = "abc"
for u in value.utf16 {
    // This value is a UInt16.
    print(u)
}
// Write newline.
print("")
for u in value.utf8 {
    // This value is a UInt8.
    print(u)
}

Output:
97
98
99

97
98
99
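We can also decode a sequence of these numeric values back into a String. This is a small sketch, not from the original article, assuming the UInt16 values form valid UTF-16 text and using String(decoding:as:).

let utf16Values: [UInt16] = [97, 98, 99]
// Decode the UTF-16 code units back into a String.
let decoded = String(decoding: utf16Values, as: UTF16.self)
print(decoded)   // abc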
Some algorithms, like the ROT13 substitution cipher, require converting from characters to Ints. We can alter characters based on their numeric values.
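To make that idea concrete, here is one possible ROT13 sketch (an illustration, not taken from the article) that shifts ASCII letters by 13 positions using their scalar values.

// Possible ROT13 implementation; assumes only ASCII letters need rotating.
func rot13(_ input: String) -> String {
    var result = ""
    for scalar in input.unicodeScalars {
        let v = scalar.value
        switch v {
        case 65...90:
            // Uppercase A-Z: rotate within 65...90.
            result.unicodeScalars.append(UnicodeScalar((v - 65 + 13) % 26 + 65)!)
        case 97...122:
            // Lowercase a-z: rotate within 97...122.
            result.unicodeScalars.append(UnicodeScalar((v - 97 + 13) % 26 + 97)!)
        default:
            // Leave non-letter characters unchanged.
            result.unicodeScalars.append(scalar)
        }
    }
    return result
}

print(rot13("abc"))   // nop
print(rot13("Hello")) // Uryyb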
Every character has an underlying integer value. In Swift we can use this value, along with UnicodeScalar, to create a Character. This process is logical.