Handling Large JSON bodies in Salesforce

Is there a way to digest large JSON bodies like you can in, say, a Spring Boot project with @JsonIgnoreProperties(ignoreUnknown = true), where you do not have to have a DTO matching the JSON body in its entirety? I recently hit some limits with this 1:1 (DTO:JSON) approach, and I have not been successful in finding something that will let me ignore parts of these massive JSON bodies (7 MB and up).
Thank you all for your input.

You could use JSON.deserialize(jsonString, apexType). Unlike JSON.deserializeStrict(), this method simply ignores unknown attributes instead of throwing an exception.
You can test it in the Developer Console:
String jsonString = '{ "unknown1": "aaa", "unknown2": "bbb", "type": "xyz", "id": "12345" }';

Data data = (Data) JSON.deserialize(jsonString, Data.class); // No exception
System.debug(data);

try {
    Data strictData = (Data) JSON.deserializeStrict(jsonString, Data.class);
} catch (JSONException e) {
    System.debug('Exception thrown'); // You'll see this.
}

class Data {
    public String type;
    public String id;
}
Anyway, if you have to handle more than 7 MB of data, you should use an asynchronous transaction: the heap size limit is 6 MB for synchronous transactions and 12 MB for asynchronous ones. See the Governor Limits documentation.

Related

How do I PUT and GET a JSON object of ANY format in a Spring Boot REST API using MySQL?

I need to be able to PUT JSON data from Postman, with no fixed format, store it in a database (preferably in MySQL with the JSON datatype), and then send a GET request to get the same value back. The data can be uniquely identified by an id that I'll be sending as a path variable.
I cannot define an entity class as the JSON will have no fixed format. How do I proceed?
@PutMapping("/sample/{sampleNo}")
public ResponseEntity<Object> addSampleData(
        @ApiParam("Sample to PUT") @PathVariable Long sampleNo, @RequestBody String sampleBody) {
    if (sampleBody != null) {
        sampleService.save(new Sample(sampleNo, sampleBody));
        return new ResponseEntity<>(HttpStatus.OK);
    } else {
        return new ResponseEntity<>(HttpStatus.BAD_REQUEST);
    }
}
I'm not sure how to proceed with the "save" functionality so that the data is stored as a JSON object, because even though I've used the JSON datatype in MySQL, GET returns a string with escaped quotes (\"):
{
    "sampleNo": 1,
    "sampleData": "{\"sample\": \"test\", \"sampleNo\": \"20\"}"
}
Example PUT request body:
{
    "field1": "test",
    "field2": 20
}
Expected GET response body:
{
    "field1": "test",
    "field2": 20
}
P.S.: I don't need to process the data, so it doesn't matter how I store it; I just need to be able to GET it back in the same format (JSON) in which it arrived (in the PUT request).
sampleData is returned as a JSON String, so it is expected that it contains the escape sequence \":
This is a valid JSON String:
String json = "{\"sample\": \"test\", \"sampleNo\": \"20\"}";
This does not compile:
String invalidJson = "{"sample": "test", "sampleNo": "20"}";
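You can check for yourself that the escaped form is the same JSON by parsing the valid String json from above back into an object, e.g. with Jackson (a quick sketch; it assumes Jackson is on the classpath, as it usually is in a Spring Boot project):
import java.util.Map;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

// parse the escaped string back into a Map to prove it is valid JSON
ObjectMapper mapper = new ObjectMapper();
Map<String, Object> parsed = mapper.readValue(json, new TypeReference<Map<String, Object>>() {});
System.out.println(parsed.get("sample")); // prints: test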
Solution:
In order to get the expected response, you have to map your MySQL JSON column into an Object rather than a String.
There are several ways to achieve this; one solution is to map the JSON column into a Map<String, Object>, as sketched below.
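For illustration, here is a minimal sketch of such a mapping using a JPA AttributeConverter together with Jackson. The class and field names are made up for this example, not taken from the question, and it assumes JPA (javax.persistence, or jakarta.persistence on newer stacks) plus Jackson are available:
import java.io.IOException;
import java.util.Map;
import javax.persistence.AttributeConverter;
import javax.persistence.Converter;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

@Converter
public class JsonToMapConverter implements AttributeConverter<Map<String, Object>, String> {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    @Override
    public String convertToDatabaseColumn(Map<String, Object> attribute) {
        try {
            // write the map as raw JSON text into the JSON column
            return attribute == null ? null : MAPPER.writeValueAsString(attribute);
        } catch (IOException e) {
            throw new IllegalArgumentException("Cannot serialize map to JSON", e);
        }
    }

    @Override
    public Map<String, Object> convertToEntityAttribute(String dbData) {
        try {
            // parse the JSON column back into a Map, so the REST response
            // contains nested JSON instead of an escaped string
            return dbData == null ? null : MAPPER.readValue(dbData, new TypeReference<Map<String, Object>>() {});
        } catch (IOException e) {
            throw new IllegalArgumentException("Cannot deserialize JSON column", e);
        }
    }
}
The entity would then declare sampleData as a Map instead of a String:
@Convert(converter = JsonToMapConverter.class)
@Column(columnDefinition = "json")
private Map<String, Object> sampleData;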

Node-RED parse json

I am trying to pull out the value 533 from the following payload:
{
    "d": {
        "ItemValue_1": 533
    },
    "ts": "2021-01-20T10:59:41.958591"
}
This does not work:
var ItemValue_1 = msg.payload.ItemValue_1;
msg.payload = ItemValue_1;
return msg;
My result is unsuccessful.
I was able to solve it on my own; this works:
var sensorMeasurement = JSON.parse(msg.payload);
msg.payload = sensorMeasurement.d;
var msgout = { payload: msg.payload.ItemValue_1 };
return msgout;
The better way to do this is as follows:
Add a JSON node before the function node; this will turn a string payload into a JSON object (assuming the string actually represents a JSON object).
Then, if you are using a function node, use the following:
msg.payload = msg.payload.d.ItemValue_1;
return msg;
It is bad practice to create a new object as you did in your answer, because it throws away any metadata attached to the original message object that may be required later.
Rather than use a function node, it would also be cleaner to use a Change node in Move mode to shift msg.payload.d.ItemValue_1 to msg.payload.

Avro Json.ObjectWriter - "Not the Json schema" error

I'm writing a tool to convert data from a homegrown format to Avro, JSON and Parquet, using Avro 1.8.0. Conversion to Avro and Parquet is working okay, but JSON conversion throws the following error:
Exception in thread "main" java.lang.RuntimeException: Not the Json schema:
{"type":"record","name":"Torperf","namespace":"converTor.torperf",
"fields":[{"name":"descriptor_type","type":"string","
[... rest of the schema omitted for brevity]
Irritatingly, this is the schema that I passed in and that I indeed want the converter to use. I have no idea what Avro is complaining about.
This is the relevant snippet of my code:
// parse the schema file
Schema.Parser parser = new Schema.Parser();
Schema mySchema;
// tried two ways to load the schema:
// from a file
File schemaFile = new File("myJsonSchema.avsc");
mySchema = parser.parse(schemaFile);
// and the way Json.class loads its own schema
mySchema = parser.parse(Json.class.getResourceAsStream("myJsonSchema.avsc"));
// initialize the writer
Json.ObjectWriter jsonDatumWriter = new Json.ObjectWriter();
jsonDatumWriter.setSchema(mySchema);
OutputStream out = new FileOutputStream(new File("output.avro"));
Encoder encoder = EncoderFactory.get().jsonEncoder(mySchema, out);
// append a record created by way of a specific mapping
jsonDatumWriter.write(specificRecord, encoder);
I replaced myJsonSchema.avsc with the one returned in the exception, without success (and apart from whitespace and linefeeds they are the same). Initializing the jsonEncoder with org.apache.avro.data.Json.SCHEMA instead of mySchema didn't change anything either. Replacing the schema passed to Json.ObjectWriter with org.apache.avro.data.Json.SCHEMA leads to a NullPointerException at org.apache.avro.data.Json.write(Json.java:183) (which is a deprecated method).
From staring at org.apache.avro.data.Json.java it seems to me that Avro is checking my record schema against its own schema of a Json record (line 58) for equality (line 73).
58    SCHEMA = Schema.parse(Json.class.getResourceAsStream("/org/apache/avro/data/Json.avsc"));

72    public void setSchema(Schema schema) {
73        if (!Json.SCHEMA.equals(schema))
74            throw new RuntimeException("Not the Json schema: " + schema);
75    }
The referenced Json.avsc defines the field types of a record:
{"type": "record", "name": "Json", "namespace":"org.apache.avro.data",
"fields": [
{"name": "value",
"type": [
"long",
"double",
"string",
"boolean",
"null",
{"type": "array", "items": "Json"},
{"type": "map", "values": "Json"}
]
}
]
}
equals is implemented in org.apache.avro.Schema, line 346:
public boolean equals(Object o) {
    if (o == this) {
        return true;
    } else if (!(o instanceof Schema)) {
        return false;
    } else {
        Schema that = (Schema) o;
        return this.type != that.type ? false
            : this.equalCachedHash(that) && this.props.equals(that.props);
    }
}
I don't fully understand what's going on in the third check (especially equalCachedHash()), but all I can recognize are trivial equality checks, which would mean that only Avro's own Json schema can ever pass setSchema(), and that doesn't make sense to me.
Also, I can't find any examples or notes about the usage of Avro's Json.ObjectWriter on the InterWebs. I wonder if I should go with the deprecated Json.Writer instead, because there are at least a few code snippets online to learn and glean from.
The full source is available at https://github.com/tomlurge/converTor
Thanks,
Thomas
A little more debugging proved that passing org.apache.avro.data.Json.SCHEMA to Json.ObjectWriter is indeed the right thing to do. The object I get back, written to System.out, prints the JSON object that I expect. The NullPointerException, though, did not go away.
Probably I would not have had to call setSchema() on Json.ObjectWriter at all, since omitting the call altogether leads to the same NullPointerException.
I finally filed a bug with Avro, and it turned out that my code was handing an object of type "specific" to ObjectWriter, which it couldn't handle. It did return silently, though, and an error was thrown only at a later stage. That was fixed in Avro 1.8.1 - see https://issues.apache.org/jira/browse/AVRO-1807 for details.
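For anyone hitting the same wall: a common way to render a specific record as JSON text without Json.ObjectWriter at all is to pair a SpecificDatumWriter with a JsonEncoder built from the record's own schema, so no equality check against Avro's Json.avsc is involved. A minimal sketch, assuming Torperf is the class Avro generated from myJsonSchema.avsc (names taken from the question's schema):
import java.io.FileOutputStream;
import java.io.OutputStream;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.io.Encoder;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.specific.SpecificDatumWriter;

DatumWriter<Torperf> writer = new SpecificDatumWriter<>(Torperf.class);
OutputStream out = new FileOutputStream("output.json");
// the JSON encoder validates against the record's own schema
Encoder encoder = EncoderFactory.get().jsonEncoder(Torperf.getClassSchema(), out);
writer.write(specificRecord, encoder);
encoder.flush();
out.close();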

Getting SerializationException while saving data into isolated storage

I am working on a Windows Phone 8 app in which I need to call an API to get a response, which I then serialize and save into local settings.
While saving this data I am getting an exception of type 'System.Runtime.Serialization.SerializationException'.
My code is:
string someData = responseDataDict["SomeData"].ToString();
if (someData != null)
{
    Dictionary<string, Object> someDict = JsonConvert.DeserializeObject<Dictionary<string, object>>(someData);
    Datastore.SaveData = someDict;
}
// static property on the Datastore class
public static Dictionary<string, Object> SaveData
{
    get
    {
        if (localSettings.Contains("SaveData"))
        {
            return (Dictionary<string, Object>)localSettings["SaveData"];
        }
        else
        {
            return null;
        }
    }
    set
    {
        localSettings["SaveData"] = value;
        localSettings.Save();
    }
}
The response from the API is:
{
    "MESSAGE": "Some Message",
    "UI_MESSAGE": {
        "LEFT": "OK",
        "RIGHT": "CANCEL"
    }
}
I think the problem is in "UI_MESSAGE".
The exception is:
System.Runtime.Serialization.SerializationException: Type 'Newtonsoft.Json.Linq.JObject' with data contract name 'ArrayOfKeyValueOfstringJTokeneJCYCtcq:http://schemas.microsoft.com/2003/10/Serialization/Arrays' is not expected. Add any types not known statically to the list of known types - for example, by using the KnownTypeAttribute attribute or by adding them to the list of known types passed to DataContractSerializer.
Please help me resolve this issue. Thanks in advance.
You can't (easily) serialize the data because the serializer doesn't know how to serialize Newtonsoft.Json.Linq.JObject types.
We could try to figure out how to get the serializer working with this type, which may or may not take lots of time, or we could stop and think for a bit...
You grab the response data in the line
string someData = responseDataDict["SomeData"].ToString();
What is someData? It's:
{
    "MESSAGE": "Some Message",
    "UI_MESSAGE": {
        "LEFT": "OK",
        "RIGHT": "CANCEL"
    }
}
That's a JSON string. A serialized JavaScript object. It's already bloody serialized.
Save that in localSettings.

CSV Media Type Formatter for ASP.NET Web API

I tried to implement a CSV MediaTypeFormatter for my Web API as described here:
http://www.tugberkugurlu.com/archive/creating-custom-csvmediatypeformatter-in-asp-net-web-api-for-comma-separated-values-csv-format
(I don't want to paste in all the code from there)
But I can't get it to work using the Web API controller below.
I used Fiddler to call the Web API with: http://myhostname.com/api/csvexport?format=csv
public dynamic Get()
{
    var ret = new[] { "CarId", "Make", "Model", "Name" };
    return ret;
}
For "type" in the CsvFormatter i get a:
DeclaringMethod = 'type.DeclaringMethod' threw an exception of type 'System.InvalidOperationException'
with
Method may only be called on a Type for which Type.IsGenericParameter is true.
So I might not be getting the concept of the formatter right, and have a problem with the type?
You are getting this error because Tugberk's formatter only works for models which implement the generic IEnumerable<T> interface. That makes sense, since people generally want CSV-formatted data when they are getting a set of results. If you only want one data entity, why would you want it in CSV format?
Your method return type is dynamic, not IEnumerable<T>. You might be able to get it to work by doing something more like this:
public IEnumerable<string> Get()
{
    var ret = new[] { "CarId", "Make", "Model", "Name" };
    return ret;
}