How do I parse JSON like this:
let json = "{\"key\":18446744073709551616}"
struct Foo: Decodable {
    let key: UInt64
}
let coder = JSONDecoder()
let test = try! coder.decode(Foo.self, from: json.data(using: .utf8)!)
The problem is that that number is too big for UInt64. I know of no larger integer types in Swift.
Parsed JSON number <18446744073709551616> does not fit in UInt64
I wouldn't mind getting it as String or Data, but that's not allowed because JSONDecoder knows it's supposed to be a number:
Expected to decode String but found a number instead.
You can use Decimal instead:
let json = "{\"key\":184467440737095516160000001}"
struct Foo: Decodable {
    let key: Decimal
}
let coder = JSONDecoder()
let test = try! coder.decode(Foo.self, from: json.data(using: .utf8)!)
print(test) // Foo(key: 184467440737095516160000001)
Decimal is the Swift overlay type for NSDecimalNumber, which
... can represent any number that can be expressed as mantissa x 10^exponent where mantissa is a decimal integer up to 38 digits long, and exponent is an integer from –128 through 127.
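As a quick playground check of those limits (the literal values here are only illustrative):
// 38 significant digits survive exactly:
let within = Decimal(string: String(repeating: "9", count: 38))!
print(within) // 99999999999999999999999999999999999999

// Past the mantissa's capacity the value is silently rounded,
// echoing the "you still might silently lose precision" caveat below:
let beyond = Decimal(string: "1" + String(repeating: "0", count: 39) + "1")! // 10^40 + 1
print(beyond == Decimal(string: "1" + String(repeating: "0", count: 40))!) // true - the final 1 is lost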
You could also parse it as a Double if the full precision is not needed:
struct Foo: Decodable {
    let key: Double
}

let coder = JSONDecoder()
let test = try! coder.decode(Foo.self, from: json.data(using: .utf8)!)
print(test) // Foo(key: 1.8446744073709552e+26)
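This is what "if the full precision is not needed" means: Double has only 53 significant bits, so at magnitudes like these adjacent integers collapse into the same value. A quick check:
let d = Double("18446744073709551616")! // 2^64
print(d == d + 1) // true - neighboring integers are no longer distinguishable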
It seems to be the case that JSONDecoder is using NSDecimalNumber behind the scenes:
struct Foo: Decodable {
    let key: Int
}

// this is 1 + the mantissa of NSDecimalNumber.maximum, i.e. 2^128
let json = "{\"key\":340282366920938463463374607431768211456}"
let coder = JSONDecoder()
let test = try! coder.decode(Foo.self, from: json.data(using: .utf8)!) // traps with the DecodingError below
Even in the DecodingError, the number is not accurately represented:
Parsed JSON number <340282366920938463463374607431768211450> does not fit in Int.
So use Decimal if you want to be able to decode at maximum precision (though you still might silently lose precision). Otherwise you just need to yell at whoever is sending you that JSON.
Note that while the documentation says
mantissa is a decimal integer up to 38 digits
it's actually a 128-bit unsigned integer, so it can represent some 39-digit numbers as well, as seen above.
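For example, the full 39-digit mantissa maximum round-trips exactly (a playground sketch, assuming decoding really does go through NSDecimalNumber as argued above):
// 2^128 - 1, the largest 128-bit unsigned integer - 39 digits:
let json = "{\"key\":340282366920938463463374607431768211455}"
struct Foo: Decodable { let key: Decimal }
let test = try! JSONDecoder().decode(Foo.self, from: json.data(using: .utf8)!)
print(test.key) // 340282366920938463463374607431768211455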
Related
I have a JSON object with this key/value pair:
"content":""
The value of content is actually 7 BOM characters; you can copy it into Sublime Text to see them.
But when I decode it in Swift:
let object = try JSONDecoder().decode(type, from: data)
object.content only has 6 characters.
Do you know why, and how to fix it?
This full example shows the issue:
import Foundation

let bom: Character = "\u{FEFF}"
let string = String(repeating: bom, count: 7)
print(string.count) // 7

let json = #"{"content": "\#(string)" }"#.data(using: .utf8)!

struct Type: Decodable {
    let content: String
}

let decoded = try! JSONDecoder().decode(Type.self, from: json)
print(decoded.content.count) // 6
One of the special characters is lost during JSON parsing.
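Character counts can hide scalar-level changes, so one way to confirm that a whole U+FEFF scalar was dropped (rather than two scalars merging into one Character) is to count the Unicode scalars as well:
// 6 scalars here would confirm the parser really dropped one U+FEFF:
print(decoded.content.unicodeScalars.count)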
I have been working on this for a few hours and have not been able to find an answer. I have JSON with dynamic keys that I am trying to parse into a struct. I thought I could keep it simple, but I'm getting serialization errors. Please help - thanks!
{"rates":{
"btc":{"name":"Bitcoin","unit":"BTC","value":1.0,"type":"crypto"},
"eth":{"name":"Ether","unit":"ETH","value":35.69,"type":"crypto"},
}}
My struct:
struct CryptoCoins: Decodable {
    let rates: [String: [Coin]]
}

struct Coin: Decodable {
    let name: String
    let unit: String
    let value: Double
    let type: String
}
My decoder:
guard let container = try? JSONDecoder().decode(CryptoCoins.self, from: json) else {
    completion(.failure(.serializationError)) // <- failing here
    return
}
You're decoding the rates property into the wrong type: it's not a dictionary of String keys and arrays of Coin values; each value is just a single Coin.
struct CryptoCoins: Decodable {
    let rates: [String: Coin] // <- here
}
On a related note, don't hide the error with try?. Capture it and log it, if necessary:
do {
    let cryptoCoins = try JSONDecoder().decode(CryptoCoins.self, from: json)
    // ..
} catch {
    print(error)
}
Then you would have gotten a typeMismatch error for the "btc" key ("Expected to decode Array<Any> but found a dictionary instead."), which would at least have given you a hint of where to look.
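For completeness, a sketch of the corrected model run against the sample JSON:
let json = """
{"rates":{
    "btc":{"name":"Bitcoin","unit":"BTC","value":1.0,"type":"crypto"},
    "eth":{"name":"Ether","unit":"ETH","value":35.69,"type":"crypto"}
}}
""".data(using: .utf8)!

do {
    let coins = try JSONDecoder().decode(CryptoCoins.self, from: json)
    print(coins.rates["btc"]?.name ?? "missing") // Bitcoin
} catch {
    print(error)
}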
I am developing an iOS app that fetches trivia questions from the Open Trivia Database API.
After reading the docs and playing around with it, I think the best solution is to use Base64 encoding (since it seems to be supported in Swift). I have successfully fetched the data and parsed it into structs using JSONDecoder. The problem I have to solve is how to convert the values from Base64 to UTF-8. (The keys are read correctly, and therefore the response maps to my structs.)
My first idea was to use decoder.dataDecodingStrategy = .base64, but that does not seem to have any effect at all. And I am not really sure why.
Is that the right way to do it, or should I decode it myself afterwards when the strings are read in to structs?
In short, the result of the parsing is a struct containing a responseCode as an Int and an array of structs representing the questions, whose String members are what I want to convert from Base64 to UTF-8.
My code for parsing looks like this:
let urlPath = "https://opentdb.com/api.php?amount=10&encode=base64"
let apiURL = URL(string: urlPath)!

URLSession.shared.dataTask(with: apiURL) { (data, response, error) in
    guard let data = data else { return }
    do {
        let decoder = JSONDecoder()
        decoder.dataDecodingStrategy = .base64
        let questionData = try decoder.decode(Response.self, from: data)
        print(questionData)
    } catch let err {
        print("Error", err)
    }
}.resume()
The .base64 decoding strategy is used for properties you declared as Data, not as String, like so:
struct Response: Codable {
    let someBaseEncodedString: Data

    var someString: String? {
        return String(data: someBaseEncodedString, encoding: .utf8)
    }
}
So, for your example, all the properties that are returned as a Base64-encoded string should have the Data type in your struct, and then be decoded into Strings afterwards.
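A minimal sketch of that approach (the field name and payload here are made up):
let json = #"{"someBaseEncodedString":"SGVsbG8sIFdvcmxkIQ=="}"#.data(using: .utf8)!
let decoder = JSONDecoder()
decoder.dataDecodingStrategy = .base64 // this is also the default strategy for Data
let response = try! decoder.decode(Response.self, from: json)
print(response.someString ?? "not valid UTF-8") // Hello, World!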
As suggested in the other answers, you can decode the Data or Base-64 String after JSONSerialization or JSONDecoder has decoded the API results.
But if you prefer to write decoding initializer, you can make it as follows:
This may not be much different from your own Response, I guess.
struct Response: Codable {
    var responseCode: Int
    var results: [Result]

    enum CodingKeys: String, CodingKey {
        case responseCode = "response_code"
        case results
    }
}
To prepare to write a decoding initializer for Response, I would like to use some extensions:
extension KeyedDecodingContainer {
    func decodeBase64(forKey key: Key, encoding: String.Encoding) throws -> String {
        guard let string = try self.decode(String.self, forKey: key).decodeBase64(encoding: encoding) else {
            throw DecodingError.dataCorruptedError(forKey: key, in: self,
                debugDescription: "Not a valid Base-64 representing UTF-8")
        }
        return string
    }

    func decodeBase64(forKey key: Key, encoding: String.Encoding) throws -> [String] {
        var arrContainer = try self.nestedUnkeyedContainer(forKey: key)
        var strings: [String] = []
        while !arrContainer.isAtEnd {
            guard let string = try arrContainer.decode(String.self).decodeBase64(encoding: encoding) else {
                throw DecodingError.dataCorruptedError(forKey: key, in: self,
                    debugDescription: "Not a valid Base-64 representing UTF-8")
            }
            strings.append(string)
        }
        return strings
    }
}
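Note that these extensions call a decodeBase64(encoding:) method on String that Foundation does not provide; presumably it is meant to look something like this minimal reconstruction:
extension String {
    // Interpret the receiver as Base-64, then re-decode the resulting
    // bytes with the given encoding; nil if either step fails.
    func decodeBase64(encoding: String.Encoding) -> String? {
        guard let data = Data(base64Encoded: self) else { return nil }
        return String(data: data, encoding: encoding)
    }
}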
Using these extensions above, you can define the Result type as follows:
extension Response {
    struct Result: Codable {
        var category: String
        var type: String
        var difficulty: String
        var question: String
        var correctAnswer: String
        var incorrectAnswers: [String]

        enum CodingKeys: String, CodingKey {
            case category
            case type
            case difficulty
            case question
            case correctAnswer = "correct_answer"
            case incorrectAnswers = "incorrect_answers"
        }

        init(from decoder: Decoder) throws {
            let container = try decoder.container(keyedBy: CodingKeys.self)
            self.category = try container.decodeBase64(forKey: .category, encoding: .utf8)
            self.type = try container.decodeBase64(forKey: .type, encoding: .utf8)
            self.difficulty = try container.decodeBase64(forKey: .difficulty, encoding: .utf8)
            self.question = try container.decodeBase64(forKey: .question, encoding: .utf8)
            self.correctAnswer = try container.decodeBase64(forKey: .correctAnswer, encoding: .utf8)
            self.incorrectAnswers = try container.decodeBase64(forKey: .incorrectAnswers, encoding: .utf8)
        }
    }
}
(You have not mentioned whether your Response, or whatever it is actually named, defines Result as a nested type, but I think you can rename or modify it yourself.)
With all things above, you can simply decode the API response as:
do {
    let decoder = JSONDecoder()
    let questionData = try decoder.decode(Response.self, from: data)
    print(questionData)
} catch {
    print("Error", error)
}
By the way, you say "I think that the best solution is to use base64 encoding (since it seems to be supported in Swift)", but is that really true?
Base-64-to-Data is supported by JSONDecoder, but it does not do what you expect here. So using another encoding can be a better choice.
In any case, a JSON string can represent every Unicode character using only ASCII, with \uXXXX escapes (or \uHHHH\uLLLL surrogate pairs). So I do not understand why the API designers do not provide a standard-JSON option. If you can contact them, please ask them to provide one; that could simplify a lot of client-side code.
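For instance, plain JSON can carry any non-ASCII text as escapes, and JSONDecoder unescapes them with no extra work (illustrative payload):
let json = #"{"question":"Qui \u00e9crivit \u201cCandide\u201d?"}"#.data(using: .utf8)!
struct Q: Decodable { let question: String }
print(try! JSONDecoder().decode(Q.self, from: json).question)
// Qui écrivit “Candide”?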
I have a generic REST request:
struct Request<T> {…}
The T is the return type of the request, for example:
struct Animal {…}

let animalRequest = Request<Animal>(…)
let animal: Animal = sendRequest(animalRequest)
Now I would like to express that the generic type has to conform to Decodable so that I can decode the JSON response from the server:
struct Request<T> where T: Decodable {…}
struct Animal: Decodable {…}
This makes sense and works – until I arrive at a request that has no response, a Request<Void>. The compiler is not happy about that one:
Type 'Void' does not conform to protocol 'Decodable'
My mischievous attempt to solve this by adding the Decodable conformance to Void was quickly found out by the compiler:
extension Void: Decodable {…} // Error: Non-nominal type 'Void' cannot be extended
It feels right to have the request generic over the return type. Is there a way to make it work with Void return types? (For example the requests that just create something on the server and don’t return anything.)
A simple workaround is to introduce a custom “no-reply” type that would replace Void:
struct NoReply: Decodable {}
Conforming Void to Decodable is not possible. Void is just a type alias for an empty tuple, (), and tuples cannot conform to protocols at this moment, but they will, eventually.
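A quick playground check of the workaround:
// NoReply happily decodes from an empty JSON object - which is all an
// endpoint with no meaningful response body needs - and Request<NoReply>
// now satisfies the T: Decodable constraint.
let empty = try! JSONDecoder().decode(NoReply.self, from: "{}".data(using: .utf8)!)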
I found that sometimes encoded objects of other types can be successfully decoded as NoReply.self. For example, a custom Error type (an enum) can be.
Playground example of this case:
enum MyError: String, Codable {
    case general
}

// an empty "no reply" struct, as in the answer above
struct VoidResult: Codable {}

let voidInstance = VoidResult()
let errorInstance = MyError.general

let data1 = try! JSONEncoder().encode(voidInstance)
let data2 = try! JSONEncoder().encode(errorInstance)

let voidInstanceDecoded = try! JSONDecoder().decode(VoidResult.self, from: data1)
// VoidResult, as expected

let errorInstanceDecoded = try! JSONDecoder().decode(MyError.self, from: data2)
// MyError.general, as expected

let voidInstanceDecodedFromError = try! JSONDecoder().decode(VoidResult.self, from: data2)
// VoidResult - NOT EXPECTED

let errorInstanceDecodedFromVoid = try! JSONDecoder().decode(MyError.self, from: data1)
// DecodingError.typeMismatch, as expected
So my suggestion is to add "uniqueness" to NoReply (from zoul's answer):
struct VoidResult: Codable {
    var id = UUID()
}

let voidInstanceDecodedFromError = try! JSONDecoder().decode(VoidResult.self, from: data2)
// DecodingError.typeMismatch - now it's fine, as expected
In the following snippet of code I am calling an API to retrieve some data from a database.
let json = try JSONSerialization.jsonObject(with: data!, options: .allowFragments) as? [String: AnyObject]

if let dict = json?["personalLeagueStats"] as? [String: AnyObject] {
    if let dataMostWinsAgainst = dict["leaguePersonalMostWinsAgainst"] as? [[String: AnyObject]] {
        let newdictMostWinsAgainst = dataMostWinsAgainst.first!
        let tempMostWinsAgainstUserName = newdictMostWinsAgainst["member"] as? String
        let tempMostWinsAgainstNumber = newdictMostWinsAgainst["winPercentage"] as? String
        let temp2number = Int(tempMostWinsAgainstNumber!)
        print(temp2number) // nil
        self.mostWinsAgainstPlayer.text = tempMostWinsAgainstUserName
    }
}
I believe the data (winPercentage) is stored as a number, since my SQL statement returns a value of e.g. 0.67.
I want to display this value in my app as a percentage so in Swift I want to multiply it by 100. Please don't ask why don't I do this in PHP as I am trying to learn Swift and would like to know the process for future reference.
It is my understanding that in the snippet above, the value for winPercentage is stored as a String in tempMostWinsAgainstNumber.
When I try and convert this to an integer (to then perform the math) and print the value it prints nil.
Why is this?
How can I convert the value stored in ["winPercentage"] into a number so I can multiply it by 100?
You need to convert 0.67 to a Double, not an Int.
guard let tempMostWinsAgainstNumber = newdictMostWinsAgainst["winPercentage"] as? String,
      let temp2number = Double(tempMostWinsAgainstNumber) else {
    // the value is missing or the String could not be converted to a Double
    return
}

print(temp2number * 100) // 67.0
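If the end goal is a display string like "67%", NumberFormatter's percent style does both the multiply-by-100 and the percent sign for you (a sketch, reusing temp2number from above):
let formatter = NumberFormatter()
formatter.numberStyle = .percent
print(formatter.string(from: NSNumber(value: temp2number)) ?? "") // 67%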