How to use init(from decoder:) when reading bundled JSON data

I have a JSON dictionary array that I need to decode into a data type (Hitter). The Hitter object needs to hold the raw JSON data, and I need to add properties to it throughout the life of the app. I've discovered I need to use init(from decoder:), but how do I call the init properly? I will be storing the Hitters in a League struct.
struct League {
    var Hitters: Set<Hitter>

    func loadHitters() {
        // ingest JSON from the bundle,
        // store decoded Hitter objects in a temp array,
        // then append each item from the temp array into the Hitters set.
    }
}
To be clear, my Hitter struct already has the init set up; I need help using the decoder from this point.
EDIT:
I've discovered the solution: it requires retrieving the JSON keys and linking them to your CodingKeys enum. Here's my shortened Hitter struct:
struct Hitter: Player, Decodable {
    // partial list of properties in JSON data
    let strPos: String
    let OBP: Float
    let wRAA: Float // weightedRunsAboveAverage
    // partial list of properties in JSON data - end

    // partial list of additional properties
    var valAboveReplacement: Float = 0.0
    var wOBP: Float {
        return OBP * Float(PA)
    }
    // partial list of additional properties - end
}
I declare the CodingKeys and init(from:) in an extension for compartmentalization's sake:
extension Hitter {
    enum CodingKeys: String, CodingKey {
        case strPos, OBP, wRAA
    }

    // Note: structs don't use `convenience`, and the initializer must be
    // marked `throws` because the decode calls can throw.
    init(from decoder: Decoder) throws {
        // The container links the CodingKeys to the JSON keys for proper referencing.
        // It returns the data stored in the decoder, keyed by the given key type.
        let container = try decoder.container(keyedBy: CodingKeys.self)
        let strPos = try container.decode(String.self, forKey: .strPos)
        let OBP = try container.decode(Float.self, forKey: .OBP)
        let wRAA = try container.decode(Float.self, forKey: .wRAA)
        // pass the decoded values to the memberwise initializer
        self.init(strPos: strPos, OBP: OBP, wRAA: wRAA)
    }
}
This seems to work perfectly fine so long as I explicitly link the JSON format and the CodingKeys via a container.
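With that initializer in place, the loadHitters sketch from the question can be filled in by handing the whole array to JSONDecoder, which calls init(from:) for each element; you never call init(from:) directly. This is a minimal sketch: the file name hitters.json is an assumption, Hitter must be Hashable to live in a Set, and loadHitters becomes mutating because it modifies the set.

```swift
import Foundation

struct League {
    var Hitters: Set<Hitter> = []

    mutating func loadHitters() {
        // Locate the bundled JSON file (the name is an assumption).
        guard let url = Bundle.main.url(forResource: "hitters", withExtension: "json"),
              let data = try? Data(contentsOf: url) else { return }

        // JSONDecoder invokes Hitter.init(from:) for each element of the array.
        if let decoded = try? JSONDecoder().decode([Hitter].self, from: data) {
            Hitters.formUnion(decoded)
        }
    }
}
```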

If you define a struct, you get a memberwise initializer for free, assuming you don't define other initializers yourself. But those auto-generated struct initializers are internal. When you are writing a module, you have to write the memberwise initializer yourself to make it public.
"Default Memberwise Initializers for Structure Types

The default memberwise initializer for a structure type is considered private if any of the structure's stored properties are private. Otherwise, the initializer has an access level of internal.

As with the default initializer above, if you want a public structure type to be initializable with a memberwise initializer when used in another module, you must provide a public memberwise initializer yourself as part of the type's definition."

(Swift docs, The Swift Programming Language)
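A short illustration of that rule, with a hypothetical framework type (Point and its members are made up for this example, not taken from the question):

```swift
// Hypothetical public type exposed from a framework module.
public struct Point {
    public var x: Double
    public var y: Double

    // Without this, clients of the module cannot write Point(x:y:),
    // because the free memberwise initializer is only internal.
    public init(x: Double, y: Double) {
        self.x = x
        self.y = y
    }
}
```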

Related

Using a KClass reference as a reified parameter to deserialize from JSON

I'm trying to implement a general serialization framework to convert outgoing and incoming messages to JSON using kotlinx.serialization. I'm developing a multiplatform app, so I'm trying to get it to run on Kotlin/JVM and Kotlin/JS.
For this, I add a type field to every message and use a map that maps each type string to a KClass. What's the type for that map? It contains KClass objects whose classes extend the Message class, so in Java I'd specify my map as
Map<KClass<? extends Message>, String>.
How can I do that in Kotlin?
Afterwards I need to serialize and deserialize the message based on its key and therefore its type. Java frameworks take a Class parameter for the type of the object I want to deserialize/instantiate (e.g. gson.fromJson(json, ClientMessage.class)). In Kotlin this is done using reified type parameters: Json.decodeFromString<Type>(json). I don't know the type of the message at compile time, though, and just have a reference to a KClass; how can I instantiate an object based on that?
@Serializable
open class Message(val type: String) {
    companion object {
        val messageTypes: Map<KClass<out Message>, String> = mapOf(
            ClientLoginMessage::class to "clientLoginMessage",
            Message::class to "message"
        )

        // utility for defining the type in the constructors of the individual messages
        inline fun <reified T> getMessageTypeByClass(): String = messageTypes[T::class]!!
    }

    fun toJson() = Json.encodeToString(this)

    fun fromJson(json: String): Message? {
        val plainMessage = Json.decodeFromString<Message>(json) // get type string from json
        return messageTypes.entries.find { it.value == plainMessage.type }?.let {
            // how can I use the KClass from it.key as a reified parameter?
            Json.decodeFromString<?????>(json)
        }
    }
}

@Serializable
class ClientLoginMessage
    : Message(Message.getMessageTypeByClass<ClientLoginMessage>())
Create a map of serializers keyed by class, mirroring messageTypes:
val serializers: Map<KClass<out Message>, KSerializer<out Message>> = mapOf(
    ClientLoginMessage::class to ClientLoginMessage.serializer(),
    Message::class to Message.serializer()
)
Then pass the needed serializer to Json.decodeFromString explicitly, instead of a reified type parameter. Note that the serializers map is keyed by KClass, so look it up with the matched entry's key, not with the type string:
fun fromJson(json: String): Message? {
    val plainMessage = Json.decodeFromString<Message>(json) // get type string from json
    return messageTypes.entries.find { it.value == plainMessage.type }?.let {
        // it.key is the KClass matching the type string; use its serializer
        Json.decodeFromString(serializers[it.key]!!, json)
    }
}
You might also want to have a look at the Kotlin built in handling of polymorphic classes: https://github.com/Kotlin/kotlinx.serialization/blob/master/docs/polymorphism.md

Do all attributes with a custom type in core-data have to be a relationship?

This is a follow-up question from the comments on How to map nested complex JSON objects and save them to core data?.
Imagine that I already have this code for my app.
class Passenger {
    var name: String
    var number: String
    var image: UIImage
    // init method
}

class Trip {
    let tripNumber: Int
    let passenger: Passenger

    init(tripNumber: Int, passenger: Passenger) {
        self.tripNumber = tripNumber
        self.passenger = passenger
    }
}
Now I've decided to add persistence for my app. I just want to have a table of Trips. I want to show the passengers under trips, but don't need a table to query passengers directly. It's just a custom object/property of trip. Every time I access passenger it would be through Trips.
So is there a way that I can create a new subclass of NSManagedObject named 'TripEntity' and store my passengers — WITHOUT 1. creating another NSManagedObject subclass for 'Passenger' 2. Creating a relationship with an inverse relationship between Passenger and Trip? Simply put I just want it to be an attribute. Conceptually to me it's also just an attribute. It's not really a relationship...
Or is that once you're using Core-data then every custom type needs to be explicitly a subclass of NSManagedObject? Otherwise it won't get persisted. I'm guessing this also what object graph means. That your graph needs to be complete. Anything outside the graph is ignored...
I'm asking this because the JSON object that I actually want to store is gigantic and I'm trying to reduce the work needed to be done.
You can add passengers to a Trip entity, but since attribute types are restricted you would have to use a transformable type, and archiving/unarchiving the objects can be quite expensive.
The most efficient way, if the source data is JSON, is to create Core Data entities for Passenger and Trip and add inverse relationships. Then make all NSManagedObject classes adopt Codable and add init(from decoder:) and encode(to encoder:) methods to each class.
For example, let's assume the Trip class has a to-many relationship to Passenger; it could look like
@NSManaged public var tripNumber: Int32
@NSManaged public var passengers: Set<Passenger>
and in the Passenger class there is an attribute trip
@NSManaged public var trip: Trip?
This is the required Decodable initializer in Trip. The decoder can decode arrays as well as sets.
private enum CodingKeys: String, CodingKey { case tripNumber, passengers }

public required convenience init(from decoder: Decoder) throws {
    guard let context = decoder.userInfo[.context] as? NSManagedObjectContext else {
        fatalError("Context Error")
    }
    let entity = NSEntityDescription.entity(forEntityName: "Trip", in: context)!
    self.init(entity: entity, insertInto: context)

    let values = try decoder.container(keyedBy: CodingKeys.self)
    tripNumber = try values.decode(Int32.self, forKey: .tripNumber)
    passengers = try values.decode(Set<Passenger>.self, forKey: .passengers)
    passengers.forEach { $0.trip = self }
}
The inverse relationship is set via the forEach line.
You have to add an extension of JSONDecoder to be able to pass the current NSManagedObjectContext in the userInfo object.
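That extension is not shown in the answer; a sketch of what it might look like follows. The key name context matches the userInfo[.context] lookup above, but the rawValue string is an arbitrary choice:

```swift
import CoreData

// A userInfo key for passing the managed object context into the decoder,
// so init(from:) can insert the newly decoded object into it.
extension CodingUserInfoKey {
    static let context = CodingUserInfoKey(rawValue: "managedObjectContext")!
}

extension JSONDecoder {
    // Convenience initializer that pre-populates userInfo with the context.
    convenience init(context: NSManagedObjectContext) {
        self.init()
        userInfo[.context] = context
    }
}
```

With this in place, something like `try JSONDecoder(context: viewContext).decode(Trip.self, from: data)` makes the context available inside the initializer.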
The corresponding encode(to:) method is pretty straightforward:
public func encode(to encoder: Encoder) throws {
    var container = encoder.container(keyedBy: CodingKeys.self)
    try container.encode(tripNumber, forKey: .tripNumber)
    try container.encode(passengers, forKey: .passengers)
}
NSManagedObject classes adopting Codable are very comfortable for pre-populating the database from JSON data or making JSON backups.
Any custom property needs to be something that can be represented as one of the Core Data property types. That includes obvious things like strings and numeric values. It also includes "binary", which is anything that can be transformed to/from NSData (Data in Swift).
From your description, the best approach is probably to:
1. Adopt NSCoding in your Passenger class.
2. Make the passenger property a Core Data "transformable" type. (As an aside, should this be an array of passengers? You have a single related passenger, but your question describes it as if there's more than one.)
3. After doing #2, set the "custom class" field for the passenger property to the name of your class, Passenger.
If you do these things, Core Data will automatically invoke the NSCoding methods to convert between your Passenger class and a binary blob that can be saved in Core Data.
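A sketch of what step 1 might look like for the Passenger class from the question. Two assumptions here: it uses NSSecureCoding (the modern flavor of NSCoding, which transformable attributes prefer), and it makes image optional for simplicity:

```swift
import UIKit

// Passenger archived/unarchived by Core Data's transformable
// attribute machinery via NSSecureCoding.
class Passenger: NSObject, NSSecureCoding {
    static var supportsSecureCoding: Bool { true }

    var name: String
    var number: String
    var image: UIImage?

    init(name: String, number: String, image: UIImage? = nil) {
        self.name = name
        self.number = number
        self.image = image
    }

    func encode(with coder: NSCoder) {
        coder.encode(name, forKey: "name")
        coder.encode(number, forKey: "number")
        coder.encode(image, forKey: "image")
    }

    required init?(coder: NSCoder) {
        guard let name = coder.decodeObject(of: NSString.self, forKey: "name") as String?,
              let number = coder.decodeObject(of: NSString.self, forKey: "number") as String? else {
            return nil
        }
        self.name = name
        self.number = number
        self.image = coder.decodeObject(of: UIImage.self, forKey: "image")
    }
}
```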

"Dynamic" JSON Decoding with Circe

Consider this JSON:
{
  "myDocument": {
    "static_key": "value",
    "dynamic_key": "value",
    "static_key2": "value2",
    "dynamic_key2": {
      "dynamic_key3": "value3"
    }
  }
}
The JSON documents I'm going to process have some static keys (fields that I know will always be present) but also some that may or may not be present, with names not known in advance, for mapping them to some case class.
I have the absolute path to the field in a String (retrieved from a configuration stored in a Database) with the following structure:
"myDocument.dynamic_key2.dynamic_key3"
I know that I need to have an ADT for mapping back and forth, so I came to this:
sealed trait Data

final case class StringTuple(key: String, value: String) extends Data
object StringTuple {
  implicit val encoder: Encoder[StringTuple] = deriveEncoder[StringTuple]
  implicit val decoder: Decoder[StringTuple] = deriveDecoder[StringTuple]
}

final case class NumericTuple(key: String, value: Double) extends Data
object NumericTuple {
  implicit val encoder: Encoder[NumericTuple] = deriveEncoder[NumericTuple]
  implicit val decoder: Decoder[NumericTuple] = deriveDecoder[NumericTuple]
}

final case class DateTuple(key: String, value: OffsetDateTime) extends Data
object DateTuple {
  implicit val encoder: Encoder[DateTuple] = deriveEncoder[DateTuple]
  implicit val decoder: Decoder[DateTuple] = deriveDecoder[DateTuple]
}

final case class TransformedJson(data: Data*)
object TransformedJson {
  def apply(data: Data*): TransformedJson = new TransformedJson(data: _*)
  implicit val encoder: Encoder[TransformedJson] = deriveEncoder[TransformedJson]
  implicit val decoder: Decoder[TransformedJson] = deriveDecoder[TransformedJson]
}
Based on this discussion, it makes no sense to use Map[String, Any] with circe, so I divided the three possible key-value cases that I will encounter when parsing individual fields:
A numeric field, that I'm going to parse as Double.
A String field parsed as is (String).
A date field parsed as OffsetDateTime.
For that reason, I've created three case classes that model these combinations (NumericTuple, StringTuple and DateTuple) and my idea is to produce an output JSON like this:
{
  "dynamic_key": "extractedValue",
  "dynamic_key3": "extractedValue3",
  ...
}
("Plain", with no nesting at all).
My idea is to create a list of Data objects to achieve this and I have something like this:
def extractValue(confElement: Conf, json: String) = {
  val cursor: HCursor = parse(json).getOrElse(Json.Null).hcursor
  val decodeDynamicParam = Decoder[NumericTuple].prepare(
    /*
      Here I think (not sure) that I can extract the value with the decoder,
      but how can I extract the key name and set it, alongside the extracted
      value?
    */
    _.downField(confElement.path)
  )
}
Some considerations:
Based on Travis' response to this question I'm trying to model the JSON as closely as possible for working with circe. That's why I tried with the tuples model.
Based (again) on Travis's response to this SO question, I'm trying the Decoder.prepare(...) method. And here comes my question...
Question: How do I extract the specific key name of the current cursor position and map it to the Tuple?
I need only the current key, not all the key set that .keys method of ACursor returns. With that key, I want to map manually the Tuple with the current key name and the extracted value.
To sum it up, my need is to transform a structure that has some unknown keys (name and position), extract their values based on the absolute dot-separated path that I have, and lift both the key name and the value into one of the case classes I suffixed as Tuple.
Can you shed some light about this?
Thanks

Initialize subclasses from json

I am looking for a good way of persisting arbitrary subclasses.
I am writing objects as dictionaries to JSON upon save, and init(json) them back upon load. The structure is Groups that have Units of different kinds. The Group only knows its units as something implementing UnitProtocol.
The subclasses of Unit (UA, etc.) have exactly the same data as a Unit, so data-wise, asDictionary and init(json) fit well in Unit. The subclasses only differ in logic, so when restoring them from file, I believe it has to be the exact subclass that is initialized.
(Bad) Solutions I thought of
Letting every group know that it can have Units of different subclasses by holding them not only as [UnitProtocol] but also as [UA], [UB], etc., which can be saved separately, restored by their respective sub-inits, and merged into a [UnitProtocol] upon init.
Storing subunits with their class name and creating a Unit.init(json) that somehow passes down the initialization depending on the subtype.
?? Still thinking, but I believe there has to be something I can learn here to do this in a maintainable way without breaking the single responsibility principle.
To init a class from JSON I used this technique:
// for each row in the json
for row in json {
    // the namespace is needed to resolve the class by name
    let namespace = (Bundle.main.infoDictionary!["CFBundleExecutable"] as! String).replacingOccurrences(of: " ", with: "_")
    // create an instance of your class
    if let myClass = NSClassFromString("\(namespace).\(row.name)") as? NameProtocol.Type {
        // init your class
        let classInit: NameProtocol = myClass.init(myArgument: "Hey")
        // classInit is ready for use
    }
}
Store every Unit json with its classname
func asDictionary() -> Dictionary<String, Any> {
    let className = String(describing: type(of: self))
    let d = Dictionary<String, Any>(
        dictionaryLiteral:
            ("className", className),
            ("someUnitData", someUnitData)
            // and so on, for all the data of Unit...
    )
    return d
}
And init from json with @horusdunord's solution:
for unitJson in unitsJson {
    let className = (unitJson as! [String: Any])["className"] as? String
    let namespace = (Bundle.main.infoDictionary!["CFBundleExecutable"] as! String).replacingOccurrences(of: " ", with: "_")
    if let unitSubclass = NSClassFromString("\(namespace).\(className ?? "N/A")") as? UnitProtocol.Type {
        if let unit = unitSubclass.init(json: unitJson as! [String: Any]) {
            units.append(unit)
        } else {
            return nil
        }
    } else {
        return nil
    }
}
The trick here is that casting the class unitSubclass to UnitProtocol.Type allows you to call its init(json:) declared in that protocol but implemented in the particular subclass, or in its superclass Unit if the properties are all the same.
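A minimal sketch of the whole pattern, using the question's names (UnitProtocol, Unit, UA); the someUnitData key is illustrative, not from the original code:

```swift
import Foundation

// Each unit type can restore itself from a JSON dictionary.
protocol UnitProtocol {
    init?(json: [String: Any])
    func asDictionary() -> [String: Any]
}

class Unit: UnitProtocol {
    let someUnitData: String

    required init?(json: [String: Any]) {
        guard let data = json["someUnitData"] as? String else { return nil }
        someUnitData = data
    }

    func asDictionary() -> [String: Any] {
        // Record the runtime class name so the right subclass is restored.
        ["className": String(describing: type(of: self)),
         "someUnitData": someUnitData]
    }
}

// A subclass that differs only in logic inherits the required init,
// so NSClassFromString + the protocol cast can restore it exactly.
final class UA: Unit {}
```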

Rust Json serialization overlapping responsibilities

I'm learning Json serialization in Rust, in particular, how to serialize Rust objects to Json.
Currently I see 3 methods of converting an instance of a struct to Json:
1. Deriving the Encodable trait
2. Manually implementing the ToJson trait
3. Manually implementing the Encodable trait
Below code illustrates all 3 approaches:
extern crate serialize;

use serialize::{Encoder, Encodable, json};
use serialize::json::{Json, ToJson};
use std::collections::TreeMap;

fn main() {
    let document = Document::new();
    let word_document = WordDocument::new();
    println!("1. Deriving `Encodable`: {}", json::encode(&document));
    println!("2. Manually implementing `ToJson` trait: {}", document.to_json());
    println!("3. Manually implementing `Encodable` trait: {}", json::encode(&word_document));
}

#[deriving(Encodable)]
struct Document<'a> {
    metadata: Vec<(&'a str, &'a str)>
}

impl<'a> Document<'a> {
    fn new() -> Document<'a> {
        let metadata = vec!(("Title", "Untitled Document 1"));
        Document { metadata: metadata }
    }
}

impl<'a> ToJson for Document<'a> {
    fn to_json(&self) -> Json {
        let mut tm = TreeMap::new();
        for &(ref mk, ref mv) in self.metadata.iter() {
            tm.insert(mk.to_string(), mv.to_string().to_json());
        }
        json::Object(tm)
    }
}

struct WordDocument<'a> {
    metadata: Vec<(&'a str, &'a str)>
}

impl<'a> WordDocument<'a> {
    fn new() -> WordDocument<'a> {
        let metadata = vec!(("Title", "Untitled Word Document 1"));
        WordDocument { metadata: metadata }
    }
}

impl<'a, E, S: Encoder<E>> Encodable<S, E> for WordDocument<'a> {
    fn encode(&self, s: &mut S) -> Result<(), E> {
        s.emit_map(self.metadata.len(), |e| {
            let mut i = 0;
            for &(ref key, ref val) in self.metadata.iter() {
                try!(e.emit_map_elt_key(i, |e| key.encode(e)));
                try!(e.emit_map_elt_val(i, |e| val.encode(e)));
                i += 1;
            }
            Ok(())
        })
    }
}
Rust playpen: http://is.gd/r7cYmE
Results:
1. Deriving `Encodable`: {"metadata":[["Title","Untitled Document 1"]]}
2. Manually implementing `ToJson` trait: {"Title":"Untitled Document 1"}
3. Manually implementing `Encodable` trait: {"Title":"Untitled Word Document 1"}
The first method is fully automatic but does not provide sufficient flexibility.
The second and third achieve the same level of flexibility by specifying the serialization process manually. In my case I want the document metadata to be serialized as an object, not as an array (which is what the deriving implementation gives me).
Questions:
Why do methods 2 and 3 exist at all? I don't understand the reason for the overlap between them. I would expect there to exist only one automatic (deriving) method of serialization and one manual.
If I want manual serialization, which method should I choose and why?
Am I right in assuming that method 2 will build a Json enum in memory (besides the struct itself) and is a worse fit for huge documents (multi megabytes), while method 3 is streaming and safer for huge documents?
Why does rust stdlib use method 3 even for primitives, while not using method 2 internally?
Why do methods 2 and 3 exist at all? I don't understand the reason for the overlap between them. I would expect there to exist only one automatic (deriving) method of serialization and one manual.
Method 2 (the ToJson trait) is specific to encoding JSON. It returns JSON objects, instead of writing to a stream. One example of use is for mapping to custom representations - see this example in the documentation.
Method 3 (implementing Encodable) has to exist for method 1 to work.
Am I right in assuming that method 2 will build a Json enum in memory (besides the struct itself) and is a worse fit for huge documents (multi megabytes), while method 3 is streaming and safer for huge documents?
Yes. ToJson creates a nested Json enum of the whole object, while Encodable streams to a Writer.
If I want manual serialization, which method should I choose and why?
Why does rust stdlib use method 3 even for primitives, while not using method 2 internally?
You should use Encodable. It is not specific to the JSON format, does not use an intermediate representation (so can be streamed instead of stored in memory) and should continue to work with new formats added to the library.