JSON Encoding and Decoding with Core Data - json

I am very new to Swift programming, but fairly competent in programming in other languages.
I have a project which uses NSPersistentContainer for Core Data. I would like to export and re-import the data using JSON or XML.
I can manually generate a CSV file, but that’s limited in its usefulness, so I would prefer JSON; XML if I have to.
Everything I’ve found is dated, and requires extending NSManagedObject and using Codable. I gather this would not apply if I am using NSPersistentContainer.
Is there anything built in to modern Swift, or how would I go about doing this?

Codable is the thing built into Swift to do what you want, and extending or subclassing NSManagedObject is how you use it with Core Data. Using NSPersistentContainer is orthogonal to the question: you almost certainly want it, but it has no bearing on JSON import/export. It exists to set up the Core Data stack for you, and you still use managed objects as your data model. Core Data has no built-in JSON support; it relies on Codable to provide that.
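To make this concrete, here is a minimal sketch of the export direction. It assumes a hypothetical `Note` entity with `title` and `created` attributes (these names are illustrative, not from the question); the NSManagedObject subclass adopts Encodable by hand, encoding only its attributes:

```swift
import CoreData

// Hypothetical entity: a Note with `title` and `created` attributes.
// Manual conformance avoids trying to encode Core Data's internal state.
extension Note: Encodable {
    enum CodingKeys: String, CodingKey {
        case title, created
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(title, forKey: .title)
        try container.encode(created, forKey: .created)
    }
}

// Export: fetch every Note from the container's context and serialize.
func exportNotes(context: NSManagedObjectContext) throws -> Data {
    let request = NSFetchRequest<Note>(entityName: "Note")
    let notes = try context.fetch(request)
    return try JSONEncoder().encode(notes)
}
```

The Decodable side is slightly more involved because inserting a managed object requires a context; the usual pattern is to pass the NSManagedObjectContext into the decoder via a custom CodingUserInfoKey and read it inside `init(from:)`.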

Related

Recommended way to deserialize a JSON from an httpClient response

I am trying to understand the recommended way of parsing JSON into an object, particularly from HttpClient responses (but my question may also relate to parsing JSON from streams in general).
I have scoured the Internet reading many blog posts and that's what I have come up with:
I understand that parsing a stream to a string and then parsing the string to an object is a big no-no in terms of memory usage.
And according to many blog posts I have come across, the traditional way of doing it is, or used to be, to work with streams using the Newtonsoft.Json package, as follows:
using var streamReader = new StreamReader(stream);
using var jsonTextReader = new JsonTextReader(streamReader);
var myDeserializedObject = new JsonSerializer().Deserialize<MyObject>(jsonTextReader);
But then I have come across another way of doing it:
If you are using .NET Core 3 and above (not so sure about the version) you have a built-in way of deserializing the stream using System.Text.JSON:
var myDeserializedObject = await JsonSerializer.DeserializeAsync<MyObject>(stream);
and particularly to httpClient requests (and if you are using .NET 5 and above if I am not mistaken)
you can do:
var myDeserializedObject = await httpClient.GetFromJsonAsync<MyObject>(url);
Please if someone could explain the ups and downs (if there are any) of each approach, especially in terms of performance and memory usage.
My company has been discussing something of a similar situation. We ended up with the following considerations.
Using Newtonsoft.JSON:
Pros:
A widely used library with a lot of features, including serializing and deserializing JSON.
Provides good control over the serialization and deserialization process.
Cons:
Can consume more memory due to string conversion of the JSON stream.
May have performance overhead when serializing and deserializing large JSON payloads.
Using System.Text.Json:
Pros:
Built-in and optimized for performance in .NET Core 3 and above.
Consumes less memory compared to Newtonsoft.JSON.
Has improved performance compared to Newtonsoft.JSON.
Cons:
Has limited options for customizing the serialization and deserialization process.
May not have all the features available in Newtonsoft.JSON.
For most cases, System.Text.Json should be sufficient for deserializing JSON payloads. However, for more complex serialization/deserialization requirements, Newtonsoft.Json may still be the preferred choice. Ultimately, the choice depends on the specific requirements and constraints of your project.
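Putting the pieces above together, here is a sketch of the streaming pattern with System.Text.Json and HttpClient. The `Item` record and the URL parameter are placeholders; `HttpCompletionOption.ResponseHeadersRead` lets deserialization start before the whole body has been buffered:

```csharp
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

public record Item(int Id, string Name);

public static class ItemClient
{
    private static readonly HttpClient http = new HttpClient();

    // Stream the response body straight into the deserializer instead of
    // reading it into a string first, avoiding a large intermediate buffer.
    public static async Task<Item?> GetItemAsync(string url)
    {
        using var response = await http.GetAsync(
            url, HttpCompletionOption.ResponseHeadersRead);
        response.EnsureSuccessStatusCode();
        using var stream = await response.Content.ReadAsStreamAsync();
        return await JsonSerializer.DeserializeAsync<Item>(
            stream,
            new JsonSerializerOptions { PropertyNameCaseInsensitive = true });
    }
}
```

`GetFromJsonAsync` does essentially this internally, so for simple cases the one-liner is equivalent; the explicit version is useful when you need to inspect the response or customize the serializer options.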

Automatic library generation for JSON

Is there a tool like Google's Protobuf for JSON? I know you can convert from a Protobuf format to JSON but that requires a whole lot of extra serialization/deserialization, and I was wondering if there is some kind of tool that lets you specify the structure of a JSON message and then automatically generates libraries for use in a specified language (direct serialization/deserialization not just a wrapper around Protobuf's JSON formatter class)
I know nearly all languages provide their own in house way of handling JSON, and many higher level ones even allow you to avoid the boiler plate parsing code, but I was looking for a universal tool where you would only need to specify the format once, and then just get the generated libraries for use in multiple languages.
The Protobuf equivalent would be JSON Schema, but it still depends on a serializer or code generator being available for each language, just as Protobuf does.
If you're looking at making a REST-API, then OpenAPI Spec + swagger-codegen could be an option.
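For reference, a JSON Schema document is itself just JSON. A minimal schema like the one below (a made-up `Point` type) is the kind of single source of truth that schema-aware code generators such as quicktype can turn into typed classes in multiple languages:

```json
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "title": "Point",
  "type": "object",
  "required": ["x", "y"],
  "properties": {
    "x": { "type": "number" },
    "y": { "type": "number" }
  }
}
```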

Best way to store golang map[string]interface into data store

Scenario: I have arbitrary JSON documents, ranging in size from 300 KB to 6.5 MB, read from MongoDB. Since the data is very much arbitrary/dynamic, I cannot define a struct type in Go, so I am using the map[string]interface{} type, and the JSON string is parsed using encoding/json's Unmarshal method. Somewhat similar to what is mentioned in Generic JSON with interface{}.
Issue: But the problem is that it takes a long time (around 30ms to 180ms) to parse the JSON string into map[string]interface{} (compared to PHP parsing JSON with json_encode/json_decode, igbinary, or msgpack).
Question: Is there any way to pre-process it and store in cache?
I mean, parse the string into map[string]interface{}, serialize it, and store it in some cache; then when we retrieve it, deserialization should not take much time before proceeding with execution.
Note: I am a newbie to Go; any suggestions are highly appreciated. Thanks.
Updates: Serialization using Gob, the built-in binary package, and the Msgpack implementation for Go have already been tried. No luck; no improvement in deserialization time.
The standard library package for JSON is notoriously slow. There is a good reason for that: it uses reflection to provide a really flexible interface that is really simple. Hence the unmarshalling being slower than PHP's…
Fortunately, there is an alternative, which is to implement the json.Unmarshaler interface on the types you want to use. If this interface is implemented, the package will use it instead of its reflection-based method, so you can see huge performance boosts.
And to help you, a small group of tools has appeared, among which:
https://godoc.org/github.com/benbjohnson/megajson
https://godoc.org/github.com/pquerna/ffjson
(listing here the main players from memory, there must be others)
These tools will generate tailored implementations of the json.Unmarshaler interface for the types you request. And with go:generate, you can even integrate them seamlessly into your build step.

Windows Store App: Performance of JSON parser

There are many possibilities to parse a JSON in context of a Windows Store App.
Regardless in which language (C#, JavaScript or C++).
For example: .NET 4.5 JsonObject and DataContractJsonSerializer, JavaScript Json parser or an extern one like Json.NET.
Does anybody know something about that?
I only read good things about Json.NET's performance.
But is that true, and does it matter for JSON documents containing datasets of 100k objects? Or won't the user notice a difference?
I only have experience using Json.NET ... it works fast and great! I have also used the library in enterprise projects and was never disappointed!
If it helps, and FWIW, I've recently been collecting some new JSON parsing / deserialization performance data, observed over various JSON payload "shapes" (and sizes), using four JSON libraries, here:
https://github.com/ysharplanguage/FastJsonParser#Performances
(.NET's out-of-the-box JavaScriptSerializer vs. JSON.NET vs. ServiceStack vs. JsonParser)
Please note:
these figures are for the full .NET only (i.e., the desktop / server tier; not mobile devices)
I was interested in getting new benchmark figures about parsing / deserialization performances only (i.e., not serialization)
finally, I was also especially interested (although not exclusively) in figures re: strongly typed deserialization use cases (i.e., deserializing into POCOs)
'Hope this helps,

How can I marshal JSON to/from a POJO for BlackBerry Java?

I'm writing a RIM BlackBerry client app. BlackBerry uses a simplified version of Java (no generics, no annotations, limited collections support, etc.; roughly a Java 1.3 dialect). My client will be speaking JSON to a server. We have a bunch of JAXB-generated POJOs, but they're heavily annotated, and they use various classes that aren't available on this platform (ArrayList, BigDecimal, XMLGregorianCalendar). We also have the XSD used by the JAXB-XJC compiler to generate those source files.
Being the lazy programmer that I am, I'd really rather not manually translate the existing source files to Java 1.3-compatible JSON-marshalling classes. I already tried JAXB 1.0.6 xjc. Unfortunately, it doesn't understand the XSD file well enough to emit proper classes.
Do you know of a tool that will take JAXB 2.0 XSD files and emit Java 1.3 classes? And do you know of a JSON marshalling library that works with old Java?
I think I am doomed because JSON arrived around 2006, and Java 5 was released in late 2004, meaning that people probably wouldn't be writing JSON-parsing code for old versions of Java.
However, it seems that there must be good JSON libraries for J2ME, which is why I'm holding out hope.
For the first part, good luck, but I really don't think you're going to find a better solution than to modify the code yourself. However, there is a good J2ME JSON library; you can find a link to the mirror here.
I ended up using apt (annotation processing tool) to run over the 1.5 sources and emit new 1.3-friendly source. Actually turned out to be a pretty nice solution!
I still haven't figured out an elegant way to do the actual JSON marshalling, but the apt tool can probably help write the rote code that interfaces with a JSON library like the one Jonathan pointed out.
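For a sense of what that rote marshalling code looks like under the Java 1.3 constraints (no generics, no annotations), here is a hand-rolled sketch. The `Customer` POJO is hypothetical, and string escaping is omitted for brevity; this is exactly the kind of repetitive code an apt pass or generator could emit:

```java
import java.util.Vector;

// Hypothetical Java 1.3-compatible POJO: raw Vector instead of a
// generic List, and no annotations of any kind.
public class Customer {
    private String name;
    private Vector orders; // Vector of String order IDs

    public Customer(String name) {
        this.name = name;
        this.orders = new Vector();
    }

    public void addOrder(String orderId) {
        orders.addElement(orderId);
    }

    // Hand-rolled JSON marshalling via StringBuffer. Real code would
    // also escape quotes, backslashes, and control characters.
    public String toJson() {
        StringBuffer sb = new StringBuffer();
        sb.append("{\"name\":\"").append(name).append("\",\"orders\":[");
        for (int i = 0; i < orders.size(); i++) {
            if (i > 0) sb.append(',');
            sb.append('"').append((String) orders.elementAt(i)).append('"');
        }
        sb.append("]}");
        return sb.toString();
    }

    public static void main(String[] args) {
        Customer c = new Customer("Acme");
        c.addOrder("A-1");
        c.addOrder("A-2");
        System.out.println(c.toJson());
        // prints {"name":"Acme","orders":["A-1","A-2"]}
    }
}
```

Unmarshalling is the half worth delegating to a library such as org.json.me, since tokenizing JSON by hand is far more error-prone than emitting it.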