swift-corelibs-foundation: [SR-7054] JSONDecoder Decimal precision error

Previous ID SR-7054
Radar rdar://problem/33491336
Original Reporter tiborbodecs (JIRA User)
Type Bug
Environment

Xcode 9.2 (9C40b)
Apple Swift version 4.0.3 (swiftlang-900.0.74.1 clang-900.0.39.2)

Same output with 4.1 swift-DEVELOPMENT-SNAPSHOT-2018-02-20-a-ubuntu16.04.

Additional Detail from JIRA
Votes 38
Component/s Foundation
Labels Bug
Assignee bendjones (JIRA)
Priority Medium

md5: 4d586dd9920250512d8b01ea1b3597c9

Issue Description:

Decoding decimals from a JSON structure returns incorrect numbers.

See the following example:

#!/usr/bin/env swift

import Foundation

struct DoubleItem: Codable {
    var name: String
    var price: Double
}

struct DecimalItem: Codable {
    var name: String
    var price: Decimal
}

let jsonString = """
{ "name": "Gum ball", "price": 46.984765 }
"""
let jsonData = jsonString.data(using: .utf8)!

let decoder = JSONDecoder()
do {
    let doubleItem = try decoder.decode(DoubleItem.self, from: jsonData)
    print(doubleItem)

    let decimalItem = try decoder.decode(DecimalItem.self, from: jsonData)
    print(decimalItem)
}
catch {
    print(error)
}

Output:

DoubleItem(name: "Gum ball", price: 46.984765000000003)

DecimalItem(name: "Gum ball", price: 46.98476500000001024)

The expected price for the DecimalItem should be 46.984765.

I know that floating-point values cannot be represented precisely, but decimals should keep their original values, shouldn't they? Is this a Swift Foundation bug or intended behavior? Note that encoding the same value with JSONEncoder produces the correct value in the JSON.
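For what it's worth, the drift can be reproduced without JSONDecoder at all. The sketch below is my own illustration, not decoder source: it builds one Decimal through a Double round-trip and one directly from the digits; on affected versions the decoder effectively takes the first path.

```swift
import Foundation

// Illustration of the suspected root cause: the decoder parses the JSON
// number as a Double first and only then converts it to Decimal, so the
// Double's binary rounding error survives the conversion.
let viaDouble = Decimal(Double("46.984765")!) // goes through binary floating point
let viaString = Decimal(string: "46.984765")! // parses the decimal digits directly

print(viaDouble) // drifted value on affected versions
print(viaString) // 46.984765
```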

My actual problem is that the JSONEncoder and JSONDecoder classes work inconsistently.

If the encoder outputs the value as a number, I'd expect the decoder to decode the exact same value back when I use the Decimal type in my model. My other idea is that the encoder could transform the value into a string inside the JSON instead of the current representation; this could be supported with a decimal coding strategy.

let encoder = JSONEncoder()
encoder.decimalEncodingStrategy = .string  // outputs as string: "46.984765"
encoder.decimalEncodingStrategy = .precise // outputs as number: 46.984765
encoder.decimalEncodingStrategy = .lossy   // current output, like: 46.984765


let decoder = JSONDecoder()
decoder.decimalDecodingStrategy = .string  // decoded value from JSON string: 46.984765
decoder.decimalDecodingStrategy = .precise // decoded value from JSON number: 46.984765
decoder.decimalDecodingStrategy = .lossy   // current value from number: 46.98476500000001024
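Until something like a decimalDecodingStrategy exists, the .string idea can be approximated by hand: send the price as a JSON string and convert it with Decimal(string:), which never touches binary floating point. The type and key names below are hypothetical; the Decimal(string:) conversion is the point.

```swift
import Foundation

// Hypothetical model that emulates the proposed .string strategy by hand.
struct StringPricedItem: Codable {
    var name: String
    var price: Decimal

    private enum CodingKeys: String, CodingKey {
        case name, price
    }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        name = try container.decode(String.self, forKey: .name)
        // Decode the string payload, then parse the digits directly.
        let raw = try container.decode(String.self, forKey: .price)
        guard let value = Decimal(string: raw) else {
            throw DecodingError.dataCorruptedError(
                forKey: .price, in: container,
                debugDescription: "Not a decimal number: \(raw)")
        }
        price = value
    }
}

let data = "{ \"name\": \"Gum ball\", \"price\": \"46.984765\" }".data(using: .utf8)!
let item = try! JSONDecoder().decode(StringPricedItem.self, from: data) // force-try for brevity
print(item.price) // 46.984765
```

Note that encode(to:) is still synthesized by the compiler, so the type round-trips; it would write the price back out as a number unless a matching custom encode(to:) is added.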

Please share your thoughts about this idea.

About this issue

  • Original URL
  • State: open
  • Created 6 years ago
  • Reactions: 2
  • Comments: 47 (34 by maintainers)

Most upvoted comments

@guillaumealgis

This seems to be fixed on iOS 15 and newer.

Deserializing 0.0070918203981 as Decimal:

  • on iOS 14, the result is 0.007091820398099999744 ❌
  • on iOS 15, the result is 0.0070918203981

Sadly, we target iOS 13+ in our applications and SDKs, so we'll need to wait a few years before we can drop the “string from the backend” approach 😬

@guillaumealgis @scoreyou bear in mind this is corelibs-foundation, not Apple's Foundation. It might've been fixed in this one but not in Apple's. You could try running the Gum Ball example on Linux to verify.

Soon enough it could be fixed in both, though, as Foundation is being rewritten in Swift, open-sourced, and shared across all platforms.