Choose deep or shallow JSON serialization in Grails

Is there a way to easily specify whether to convert an object as JSON in a deep or shallow manner? I know you can configure the grails.converters.JSON utility in the Config.groovy file by specifying something like the following:
grails.converters.json.default.deep = true
but when I convert certain objects, I don't want to deep convert.
I also saw that somebody recommended using JSON.use("deep"), but I get the following error:
Error 2012-03-04 00:39:13,673 ["http-bio-8080"-exec-1] ERROR errors.GrailsExceptionResolver - IllegalAccessException occurred when processing request: [GET] /Quaffic/home/json
Class org.codehaus.groovy.grails.web.converters.marshaller.json.GenericJavaBeanMarshaller can not access a member of class org.joda.time.tz.DateTimeZoneBuilder$PrecalculatedZone with modifiers "public". Stacktrace follows:
Message: Class org.codehaus.groovy.grails.web.converters.marshaller.json.GenericJavaBeanMarshaller can not access a member of class org.joda.time.tz.DateTimeZoneBuilder$PrecalculatedZone with modifiers "public"
Line | Method
->> 198 | value in grails.converters.JSON
It seems like it could be a joda.time error, but this doesn't happen when I just use the plain Config.groovy technique. Kind of confusing...
Any help would be great!

My solution was to not rely on deep/shallow JSON generation. I created a map, inserted the elements I needed, and serialized that. Probably not the best practice, but it gets the job done.

Related

Typescript: undefined calculated property after deserializing in JSON

I'm new to Typescript and I encountered a JSON deserializing problem.
Consider this class:
class Product {
    public id!: number;

    public get calculatedProperty(): string {
        return "Test";
    }
}
As you can see calculatedProperty is a runtime calculated property.
Also, consider that I deserialize a JSON string into my object in this way:
var jsonData = '{ "id": 2 }';
let deserialized = JSON.parse(jsonData) as Product;
The problem comes now:
This call console.log(deserialized.id); correctly returns 2.
This call console.log(deserialized.calculatedProperty); returns undefined!
I really don't understand why. It seems that as Product doesn't really create a Product object, because if I directly invoke the constructor, new Product(), the calculated property exists.
What am I doing wrong with the JSON deserialization?
Thanks!
TypeScript's job is only to perform type checking during development and make sure we don't make careless mistakes. At the end of the day, all it does is compile the script and transform it into good old JavaScript. Therefore, TypeScript-specific syntax has no effect at runtime.
In other words, type assertions are erased and do not exist at runtime.
There are also several warnings in the documentation about this:
Like a type annotation, type assertions are removed by the compiler and won’t affect the runtime behavior of your code.
Reminder: Because type assertions are removed at compile-time, there is no runtime checking associated with a type assertion. There won’t be an exception or null generated if the type assertion is wrong.
Besides, the as keyword does not invoke a constructor. It merely provides type information (which is removed at compile time). The only way to create an instance and access its instance properties/methods is through the new keyword.
The JSON.parse method doesn't give you an instance of a class; it only produces a plain object.
To solve your issue, you could first parse the JSON into an object like this:
let deserializedObject = JSON.parse(jsonData) as Object;
and after that you could assign the object onto a new class instance like this:
let deserialized = Object.assign(new Product(), deserializedObject);
Note that I have not tested this yet, but it should work.
This is fine for simple objects, but not for objects with a complex hierarchy.
Look into class-transformer for more information. https://github.com/typestack/class-transformer

Serilog Azure EventHub Sink output template json

So, I am battling and trying to figure this out.
I have Serilog and using EventHub to log errors.
It took a while to find, but I needed the output serialized as JSON, so I used this:
logger = new LoggerConfiguration()
    .WriteTo.Sink(new AzureEventHubSink(eventHubClient, new JsonFormatter()))
    .CreateLogger();
Great. Now, when I write the exception:
logger.Error(ex, "An Error Occurred");
It writes it, BUT the exception is written to a single field (one big long string).
Is there a way to tell Serilog to write each property of the exception to its own field (think of it as a SQL table with fields)?
Could I change the outputTemplate while still using JsonFormatter? There doesn't seem to be an overload that accepts an output template.
I am using Stream Analytics to do some querying, and it is much better to have each exception property as its own field/column rather than one field with the entire JSON string in it, especially since I need to do cross joins with another data source.
Thank you.
It seems the only way is to inherit from JsonFormatter, override the WriteException method, and write each property in question:
WriteJsonProperty("ClassName", exception.GetType(), ref delim, output);
WriteJsonProperty("Message", exception.Message, ref delim, output);

What is the best way to ignore unwanted fields in a JSON payload from a PUT/PATCH using Golang?

I have a situation where people consuming our API will need to do a partial update of my resource. I understand that HTTP clearly specifies that this is a PATCH operation, even though people on our side are used to sending a PUT request for this, and that's how the legacy code is built.
For illustration, imagine the following simple struct:
type Person struct {
    Name    string
    Age     int
    Address string
}
On a POST request, I will provide a payload with all three values (Name, Age, Address) and validate them accordingly on my Golang backend. Simple.
On a PUT/PATCH request, though, we know that, for instance, a name never changes. So if I wanted to change the age, I would simply send a JSON payload containing the new age:
PUT /person/1 {age:30}
Now to my real question:
What is the best practice to prevent name from being modified, intentionally or unintentionally, when a consumer of our API sends a JSON payload containing the name field?
Example:
PUT /person/1 {name:"New Name", age:35}
Possible solutions I thought of, but don't actually like, are:
In my validator method, I would either forcibly remove the unwanted name field OR respond with an error message saying that name is not allowed.
Create a DTO object/struct that would be pretty much a trimmed-down version of my Person struct and then unmarshal my JSON payload into it, for instance:
type PersonPut struct {
    Age     int
    Address string
}
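Used in a handler, that option would look roughly like this (an untested sketch; the updatePerson name, its signature, and the copy step are just illustrative, and it assumes encoding/json and net/http plus the structs above):
// Rough sketch of the DTO option: decode into the DTO, then copy
// only the updatable fields onto the stored Person.
func updatePerson(w http.ResponseWriter, r *http.Request, person *Person) {
    var dto PersonPut
    if err := json.NewDecoder(r.Body).Decode(&dto); err != nil {
        http.Error(w, "invalid JSON", http.StatusBadRequest)
        return
    }
    // Name is absent from PersonPut, so it can never be overwritten.
    // (Pointer fields, e.g. Age *int, would distinguish "omitted"
    // from the zero value for true partial updates.)
    person.Age = dto.Age
    person.Address = dto.Address
    // ... persist person and write the response ...
}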
In my opinion this would add needless extra code and logic to abstract the problem; however, I don't see any other elegant solution.
I honestly don't like those two approaches and I would like to know if you guys faced the same problem and how you solved it.
Thanks!
The first solution you brought up is a good one. Some well-known frameworks implement similar logic.
As an example, the latest Rails versions come with a built-in solution to prevent users from adding extra data to the request and causing the server to update the wrong fields in the database. It is a kind of whitelist implemented by the ActionController::Parameters class.
Let's suppose we have a controller class as below. For the purpose of this explanation it contains two update actions, but you wouldn't see that in real code.
class PeopleController < ActionController::Base
  # 1st version - unsafe, it will raise an exception. Don't do it.
  def update
    person = current_account.people.find(params[:id])
    person.update!(params[:person])
    redirect_to person
  end

  # 2nd version - updates only permitted parameters
  def update
    person = current_account.people.find(params[:id])
    person.update!(person_params) # call to the person_params method
    redirect_to person
  end

  private

  def person_params
    params.require(:person).permit(:name, :age)
  end
end
Since the second version allows only permitted values, it will block a user who tampers with the payload and sends JSON containing a new password value:
{ name: "acme", age: 25, password: 'account-hacked' }
For more details, see Rails docs: Action Controller Overview and ActionController::Parameters
If the name cannot be written it is not valid to provide it for any update request. I would reject the request if the name was present. If I wanted to be more lenient, I might consider only rejecting the request if name is different from the current name.
I would not silently ignore a name which was different from the current name.
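As a rough Go sketch of that check (assuming the raw request body has already been read into body and w is the http.ResponseWriter; the field names are only illustrative):
// Look at the raw keys before binding and reject the request
// outright if a read-only field such as "name" is present.
var raw map[string]interface{}
if err := json.Unmarshal(body, &raw); err != nil {
    http.Error(w, "invalid JSON", http.StatusBadRequest)
    return
}
if _, present := raw["name"]; present {
    http.Error(w, `"name" cannot be updated`, http.StatusUnprocessableEntity)
    return
}
// ... continue binding and validating the remaining fields ...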
This can be solved by decoding the JSON body into a map[string]json.RawMessage first. The json.RawMessage type is useful for delaying the actual decoding. Afterwards, a whitelist can be applied on the map[string]json.RawMessage map, ignoring unwanted properties and only decoding the json.RawMessages of the properties we want to keep.
The process of decoding the whitelisted JSON body into a struct can be automated using the reflect package; an example implementation can be found here.
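I haven't reproduced the linked reflect-based implementation here, but purely as an illustration of the idea, a hand-rolled version (the applyUpdate name and the whitelist contents are mine) might look like this:
package main

import (
    "encoding/json"
    "fmt"
    "strings"
)

type Person struct {
    Name    string
    Age     int
    Address string
}

// Whitelist of JSON properties a PUT/PATCH may change (lowercased).
var updatable = map[string]bool{"age": true, "address": true}

func applyUpdate(p *Person, body []byte) error {
    // json.RawMessage delays decoding of each individual property.
    var fields map[string]json.RawMessage
    if err := json.Unmarshal(body, &fields); err != nil {
        return err
    }
    // Drop everything that is not whitelisted, then unmarshal the
    // filtered document onto the existing struct.
    for k := range fields {
        if !updatable[strings.ToLower(k)] {
            delete(fields, k)
        }
    }
    filtered, err := json.Marshal(fields)
    if err != nil {
        return err
    }
    return json.Unmarshal(filtered, p)
}

func main() {
    p := Person{Name: "Alice", Age: 30, Address: "Old Street"}
    if err := applyUpdate(&p, []byte(`{"name":"New Name","age":35}`)); err != nil {
        panic(err)
    }
    fmt.Printf("%+v\n", p) // Name stays "Alice", Age becomes 35
}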
I am not proficient in Golang, but I believe a good strategy would be to make your name field read-only.
For instance, in a strictly object-oriented language such as Java/.NET/C++ you can provide just a getter and no setter.
Maybe there is some accessor configuration for Golang just like Ruby has.
If it is read-only, it shouldn't be bothered by receiving a spare value; it should just ignore it. But again, I'm not sure whether Golang supports this.
I think the clean way is to put this logic inside the PATCH handler. There should be some logic that updates only the fields you want. It's easier if you unpack into a map[string]string and only iterate over the fields that you want to update, as sketched below. Alternatively, you could decode the JSON into a map, delete all the fields that you don't want updated, re-encode it as JSON, and then decode that into your struct.
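A minimal sketch of the first variant (I've used map[string]interface{} rather than map[string]string so the numeric age survives decoding; it assumes body holds the request body, person is the record being updated, and w is the http.ResponseWriter):
// Decode into a generic map and copy across only the fields this
// PATCH handler is willing to update; anything else is ignored.
var fields map[string]interface{}
if err := json.Unmarshal(body, &fields); err != nil {
    http.Error(w, "invalid JSON", http.StatusBadRequest)
    return
}
if v, ok := fields["age"].(float64); ok { // JSON numbers decode as float64
    person.Age = int(v)
}
if v, ok := fields["address"].(string); ok {
    person.Address = v
}
// "name" is never read, so it cannot be modified.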

Grails, create domain object from json-string with has-many relation

I'm trying to convert a Grails parameter map to a JSON string, and then back to a parameter map (for saving HTML form entries with constraint violations).
Everything is fine as long as there is no hasMany relationship in the parameter-map.
I'm using
fc.parameter = params as JSON
to save the params as JSON String.
Later I'm trying to rebuild the parameter map and create a new Domain-Object with it:
new Foo(JSON.parse(fc.parameter))
Everything is fine using only 1:1 relationships (states).
[states:2, listSize:50, name:TestFilter]
But when I try to rebuild a params-map with multi-select values (states)
[states:[1,2], listSize:50, name:TestFilter]
I'm getting this IllegalStateException:
Failed to convert property value of type org.codehaus.groovy.grails.web.json.JSONArray to required type java.util.Set for property states; nested exception is java.lang.IllegalStateException: Cannot convert value of type [java.lang.String] to required type [de.gotosec.approve.State] for property states[0]: no matching editors or conversion strategy found
I tried to use this, but without success:
JSON.use("deep") {
    new Foo(JSON.parse(fc.parameter))
}
You can use JsonSlurper instead of Grails' converters.JSON; it maps JSON objects to Groovy maps. I think this link might also help you.
Edit: Now, if the problem is binding the params map to your domain, you should try using bindData() method, like:
bindData(foo, params)
Note that this straightforward use is only if you're calling bindData inside a controller.
What seems to be happening in your case is that Grails is trying to bind a concrete List type (ArrayList in the case of JsonSlurper and JSONArray in the case of converters.JSON) into a Set property (which is the default data structure for one-to-many associations). I would have to take a look at your code to confirm that. But, as you did substitute states: [1,2] for a method of your app, try another test to confirm this hypothesis. Change:
states:[1,2]
for
states:[1,2] as Set
If this is really the problem and not even bindData() works, take a look at this for a harder way to make it work using object marshalling and converters.JSON. I don't know if it's practical for you to use it in your project, but it sure works nicely ;)

Parsing JSON objects of unknown type with AutoBean on GWT

My server returns a list of objects in JSON. They might be Cats or Dogs, for example.
When I know that they'll all be Cats, I can set the AutoBeanCodex to work easily. When I don't know what types they are, though... what should I do?
I could give all of my entities a type field, but then I'd have to parse each entity before passing it to the AutoBeanCodex, which borders on defeating the point. What other options do I have?
Just got to play with this the other day, and fought it for a few hours, trying @Category methods and others, until I found this: you can create a property of type Splittable, which represents the underlying transport type and has some encoding for booleans/Strings/Lists/Maps. In my case, I know at design time the enveloping type that goes over the wire, and based on some other property, another field can be any number of other autobeans.
You don't even need to know the type of the other bean at compile time; you could get values out using Splittable's methods, but if you're using autobeans anyway, it is nice to define the data that is wrapped.
interface Envelope {
    String getStatus();
    String getDataType();
    Splittable getData();
}
(Setters might be desired if you're sending data as well as receiving it - encoding a bean into a Splittable to send it in an envelope is even easier than decoding one.)
The JSON sent over the wire is decoded (probably using AutoBeanCodex) into the Envelope type, and after you've decided what type must be coming out of the getData() method, call something like this to get the nested object out
SpecificNestedBean bean = AutoBeanCodex.decode(factory,
        SpecificNestedBean.class,
        env.getData()).as();
The Envelope type and the nested types (in factory above) don't even need to be the same AutoBeanFactory type. This could allow you to abstract out the reading/writing of envelopes from the generic transport instance, and use a specific factory for each dataType string property to decode the data's model (and nested models).