Pass String with Json data to Map in Golang [duplicate]

This question already has answers here:
Golang parse JSON array into data structure
(3 answers)
Closed 5 years ago.
Currently I have JSON objects stored in my database as strings. I want to load them into a map so that I can look up any field, such as:
Mymap["Name"]
Mymap["Age"]
..
Let's say that my string would be something like:
'{"Name":["zero"],"Age":"10"}'
I don't know the structure of the data, so the JSON can have as many fields as needed and many levels of nesting (though I mostly care about getting at least the first level).

If you're dealing with a JSON object of arbitrary structure, you can unmarshal it into a map of interfaces:
map[string]interface{}
The encoding/json package will happily unmarshal the JSON object into it, nested or not.
This is very flexible, but it has an obvious downside: the types of the map's values are unknown, so to do anything useful with them you'll need a type assertion or a type switch.
v, ok := m["key"].(Type)
https://play.golang.org/p/wM0gkU1g5G
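A minimal, self-contained sketch of that pattern, using the example string from the question:
package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    // JSON string as stored in the database (from the question's example).
    raw := `{"Name":["zero"],"Age":"10"}`

    var m map[string]interface{}
    if err := json.Unmarshal([]byte(raw), &m); err != nil {
        panic(err)
    }

    // Nested values come back as interface{}, so a type assertion is needed
    // before using them as concrete types.
    if names, ok := m["Name"].([]interface{}); ok {
        fmt.Println("first name:", names[0]) // "zero"
    }
    if age, ok := m["Age"].(string); ok {
        fmt.Println("age:", age) // "10"
    }
}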

Related

Convert one line json into multiple lines to write to a file (Ruby) [duplicate]

This question already has answers here:
How to "pretty" format JSON output in Ruby on Rails
(20 answers)
Closed 3 years ago.
I've run into a situation where I'm retrieving a JSON string in Ruby, and I'm attempting to write it to a file structured like your usual JSON file, for example:
{
  "top": {
    "mid1": "bot1",
    "mid2": "bot2"
  }
}
However, the way the json string is structured is like so:
{"top":{"mid1":"bot1","mid2":"bot2"}}
I've seen other posts mention using JSON.parse to get what I want; however, that just makes it look like this:
{"top"=>{"mid1"=>"bot1", "mid2"=>"bot2"}}
Is there a way that I can just convert the json string into what it looks like in the first code block above?
The first representation is JSON. The decoded version is Ruby's internal representation. If you want to go back to JSON, JSON.dump(...) will reconvert it.
Keep in mind this is just presentation. The data is the same.
You may be interested in the pp method as that produces nice, structured debugging output.

Is there any way to extract JSON Schema from a given JSON in Go? [duplicate]

This question already exists:
How can I extract JSON Schema from a given JSON in Go? [closed]
Closed 3 years ago.
Is there a way to convert a JSON document to its schema in Go? I need to compare 2 JSON templates or schemas and cannot find any package or function to do this; can someone please help me with it?
You can have a look at the gjson library. It has functions to parse JSON and read values out of it, and you can use that functionality to compare JSON results.
I think you will need to unmarshal them recursively (if they contain nested JSON) into something like map[string]interface{}, and then loop through and compare the keys. There are some libraries mentioned in this answer https://stackoverflow.com/a/42153666 which could be used to unmarshal them safely.
For example, you can use Exists from the gabs library while iterating through the keys of the unmarshalled map to see if the same keys exist in the other map.
// From the gabs library:
// Exists checks whether a field exists within the hierarchy.
func (g *Container) Exists(hierarchy ...string) bool {
    return g.Search(hierarchy...) != nil
}
Edit: a version without libraries is here: https://play.golang.org/p/jmfFsLT0G1n, based on the test case of this code golf exercise: https://codegolf.stackexchange.com/questions/195476/extract-all-keys-from-an-object-json
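In the same spirit, a rough dependency-free sketch of that recursive key walk (the collectKeys helper and its naming are mine, not taken from the linked playground):
package main

import (
    "encoding/json"
    "fmt"
)

// collectKeys walks an unmarshalled JSON value and records every key path,
// e.g. "user.address.city". Arrays are descended into but do not add a path
// segment, since this "schema" only cares about object keys.
func collectKeys(v interface{}, prefix string, keys map[string]bool) {
    switch t := v.(type) {
    case map[string]interface{}:
        for k, child := range t {
            path := k
            if prefix != "" {
                path = prefix + "." + k
            }
            keys[path] = true
            collectKeys(child, path, keys)
        }
    case []interface{}:
        for _, child := range t {
            collectKeys(child, prefix, keys)
        }
    }
}

func main() {
    raw := `{"user":{"name":"bob","address":{"city":"x"}},"tags":["a","b"]}`

    var v interface{}
    if err := json.Unmarshal([]byte(raw), &v); err != nil {
        panic(err)
    }

    keys := map[string]bool{}
    collectKeys(v, "", keys)
    fmt.Println(keys)
}
Comparing two documents then amounts to building one key set per document and diffing the two sets.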
The json package in Go's standard library gives us all the functionality we need. For any JSON string, the standard way to parse it is:
import "encoding/json" //...
// ... myJsonString := `{"some":"json"}`
// `&myStoredVariable` is the address of the variable we want to store our // parsed data in
json.Unmarshal([]byte(myJsonString), &myStoredVariable)
//...

Mongodb/Mongo Query: How do I deserialize nested JSON stored in a string field? [duplicate]

This question already has answers here:
MongoDB: How to query over a json string?
(1 answer)
Using stored JavaScript functions in the Aggregation pipeline, MapReduce or runCommand
(1 answer)
Closed 3 years ago.
In the context of metabase:
I am applying the following one-liner query to extract a particular field from a collection of mongodb documents:
[{"$project":{"myName":"$field1.field2" }},
{"$match":{"_id":{"$eq":"blah"}}} ]
myName is a string field that is in fact a serialized JSON object.
How can I adapt the pipeline above, perhaps using JSON.parse(), to pull out fields nested further down the JSON hierarchy?
Part II: How would I add a final step to extract some of those deeply nested fields? The situation is further complicated because the document structure is not consistent between documents...
Thanks!

Excessive use of map[string]interface{} in go development?

The majority of my development experience has been with dynamically typed languages like PHP and JavaScript. I've been practicing with Golang for about a month now by re-creating some of my old PHP/JavaScript REST APIs in Golang. I feel like I'm not doing things the Golang way most of the time, or, more generally, that I'm not used to working with strongly typed languages. I feel like I'm making excessive use of map[string]interface{} and slices of them to box up data as it comes in from HTTP requests or when it gets shipped out as JSON HTTP output. So what I'd like to know is whether what I'm about to describe goes against the philosophy of Golang development, or whether I'm breaking the principles of developing with strongly typed languages.
Right now, about 90% of the program flow for the REST APIs I've rewritten in Golang can be described by these 5 steps.
STEP 1 - Receive Data
I receive HTTP form data from http.Request.ParseForm() as formvals := map[string][]string. Sometimes the form carries serialized JSON objects that need to be unmarshalled, like jsonUserInfo := json.Unmarshal(formvals["user_information"][0]) /* gives some complex json object */.
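A hedged sketch of that step with the error handling spelled out (handler and field names follow the description above; note that json.Unmarshal actually takes a byte slice and a destination pointer):
package main

import (
    "encoding/json"
    "net/http"
)

// handler illustrates step 1: form values arrive as a map[string][]string,
// and one field carries a serialized JSON object that is unmarshalled separately.
func handler(w http.ResponseWriter, r *http.Request) {
    if err := r.ParseForm(); err != nil {
        http.Error(w, err.Error(), http.StatusBadRequest)
        return
    }
    formvals := r.Form // url.Values, i.e. a map[string][]string

    var jsonUserInfo map[string]interface{}
    if err := json.Unmarshal([]byte(formvals.Get("user_information")), &jsonUserInfo); err != nil {
        http.Error(w, err.Error(), http.StatusBadRequest)
        return
    }
    _ = jsonUserInfo // used in the later steps
}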
STEP 2 - Validate Data
I do validation on formvals to make sure all the data values are what I expect before using them in SQL queries. I treat everything as a string, then use regex to determine whether the string format and business logic are valid (e.g. IsEmail, IsNumeric, IsFloat, IsCASLCompliant, IsEligibleForVoting, IsLibraryCardExpired, etc.). I've written my own regexes and custom functions for these types of validations.
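For illustration, a hedged sketch of one such validator; the actual patterns in the question's code aren't shown, so this regex is only a stand-in:
package main

import (
    "fmt"
    "regexp"
)

// emailRe is a deliberately simple pattern used only for this sketch;
// real-world email validation is usually stricter.
var emailRe = regexp.MustCompile(`^[^@\s]+@[^@\s]+\.[^@\s]+$`)

// IsEmail mirrors the kind of string-based check described in step 2.
func IsEmail(s string) bool {
    return emailRe.MatchString(s)
}

func main() {
    fmt.Println(IsEmail("bob@example.com")) // true
    fmt.Println(IsEmail("not-an-email"))    // false
}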
STEP 3 - Bind Data to SQL Queries
I use Golang's database/sql.DB to take my formvals and bind them to my Query and Exec functions like this: Query("SELECT * FROM tblUser WHERE user_id = ? AND user_birthday > ?", formvals["user_id"][0], jsonUserInfo["birthday"]). I never care about the data types I'm supplying as arguments to be bound, so they're all probably strings. I trust that the validation in the step immediately above has determined they are acceptable for SQL use.
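A short sketch of that binding as a standalone helper (the package and function names are mine, the table and column names come from the example above, and error handling is left to the caller):
package api

import "database/sql"

// queryUsers shows step 3: form values are bound as query parameters
// rather than concatenated into the SQL string.
func queryUsers(db *sql.DB, userID string, birthday interface{}) (*sql.Rows, error) {
    return db.Query(
        "SELECT * FROM tblUser WHERE user_id = ? AND user_birthday > ?",
        userID,
        birthday,
    )
}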
STEP 4 - Bind SQL results to []map[string]interface{}{}
I Scan() the results of my queries into a sqlResult := []map[string]interface{}{} because I don't care if the value types are nulls, strings, floats, ints or whatever. So the schema of an sqlResult might look like:
sqlResult =>
[0] {
"user_id":"1"
"user_name":"Bob Smith"
"age":"45"
"weight":"34.22"
},
[1] {
"user_id":"2"
"user_name":"Jane Do"
"age":nil
"weight":"22.22"
}
I wrote my own eager load function so that I can bind more information like so EagerLoad("tblAddress", "JOIN ON tblAddress.user_id",&sqlResult) which then populates sqlResult with more information of the type []map[string]interface{}{} such that it looks like this:
sqlResult =>
[0] {
"user_id":"1"
"user_name":"Bob Smith"
"age":"45"
"weight":"34.22"
"addresses"=>
[0] {
"type":"home"
"address1":"56 Front Street West"
"postal":"L3L3L3"
"lat":"34.3422242"
"lng":"34.5523422"
}
[1] {
"type":"work"
"address1":"5 Kennedy Avenue"
"postal":"L3L3L3"
"lat":"34.3422242"
"lng":"34.5523422"
}
},
[1] {
"user_id":"2"
"user_name":"Jane Do"
"age":nil
"weight":"22.22"
"addresses"=>
[0] {
"type":"home"
"address1":"56 Front Street West"
"postal":"L3L3L3"
"lat":"34.3422242"
"lng":"34.5523422"
}
}
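For readers less familiar with database/sql, here is a hedged sketch of the generic scanning this step describes (the function name is mine, and the EagerLoad helper itself is not reproduced):
package api

import "database/sql"

// scanToMaps is a sketch of step 4: each row is scanned into a
// map[string]interface{} keyed by column name, without knowing the
// table schema in advance.
func scanToMaps(rows *sql.Rows) ([]map[string]interface{}, error) {
    cols, err := rows.Columns()
    if err != nil {
        return nil, err
    }

    var out []map[string]interface{}
    for rows.Next() {
        // Scan needs pointers, so build one *interface{} per column.
        vals := make([]interface{}, len(cols))
        ptrs := make([]interface{}, len(cols))
        for i := range vals {
            ptrs[i] = &vals[i]
        }
        if err := rows.Scan(ptrs...); err != nil {
            return nil, err
        }

        row := make(map[string]interface{}, len(cols))
        for i, c := range cols {
            // Many drivers return []byte for text columns; convert for readability.
            if b, ok := vals[i].([]byte); ok {
                row[c] = string(b)
            } else {
                row[c] = vals[i]
            }
        }
        out = append(out, row)
    }
    return out, rows.Err()
}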
STEP 5 - JSON Marshal and send HTTP Response
Then I do an http.ResponseWriter.Write(json.Marshal(sqlResult)) and output the data for my REST API.
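Strictly speaking, json.Marshal returns a ([]byte, error) pair, so it can't be passed straight to Write; a corrected sketch of that step (function name is mine):
package api

import (
    "encoding/json"
    "net/http"
)

// writeJSON is a sketch of step 5: marshal the result, check the error,
// and write the body with a JSON content type.
func writeJSON(w http.ResponseWriter, sqlResult []map[string]interface{}) {
    body, err := json.Marshal(sqlResult)
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    w.Header().Set("Content-Type", "application/json")
    w.Write(body)
}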
Recently, I've been revisiting articles with code samples that use structs in places I would have used map[string]interface{}. For example, I wanted to refactor Step 2 with a more standard approach that other Golang developers would use, so I found this: https://godoc.org/gopkg.in/go-playground/validator.v9, except all its examples are with structs. I also noticed that most blogs that talk about database/sql scan their SQL results into typed variables or structs with typed properties, as opposed to my Step 4, which just puts everything into map[string]interface{}.
Hence, I started writing this question. I feel map[string]interface{} is so useful because the majority of the time I don't really care what the data is, and it gives me the freedom in Step 4 to construct any data schema on the fly before I dump it as a JSON HTTP response. I do all this with as little code verbosity as possible. But this means my code is not as ready to leverage Go's validation tools, and it doesn't seem to comply with the Golang community's way of doing things.
So my question is, what do other Golang developers do with regard to Step 2 and Step 4? Especially in Step 4: do Golang developers really encourage specifying the schema of the data through structs and strongly typed properties? Do they also specify structs with strongly typed properties along with every eager loading call they make? Doesn't that seem like so much more code verbosity?
It really depends on the requirements. As you have said, if you don't need to process the JSON that comes from the request or from the SQL results, then you can easily unmarshal it into interface{} and marshal the SQL results back out as JSON.
For Step 2
Golang has a library for validating the structs you unmarshal JSON into, driven by tags on the fields:
https://github.com/go-playground/validator
type Test struct {
    Field string `validate:"max=10,min=1"`
}
// max will be checked, then min
You can also check the godoc for the validator library. It is a very good implementation of validation for JSON values using struct tags.
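A small sketch of how that library is typically wired up (the struct and field names here are illustrative, not from the question):
package main

import (
    "fmt"

    "gopkg.in/go-playground/validator.v9"
)

// User carries validation rules in its struct tags.
type User struct {
    Email string `json:"email" validate:"required,email"`
    Age   int    `json:"age"   validate:"min=1,max=130"`
}

func main() {
    validate := validator.New()

    u := User{Email: "not-an-email", Age: 200}
    if err := validate.Struct(u); err != nil {
        // err describes every field that failed its tag.
        fmt.Println(err)
    }
}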
For STEP 4
Most of the time, we use structs if we know the format and data of our JSON, because they give us more control over the data types and other functionality. For example, if you want to drop a field from your JSON output because you don't need it, you should use a struct with the "-" json tag.
Now, you have said that you don't care whether the result coming from SQL is empty or not. But if you do care, it again comes down to using a struct: you can scan the result into a struct using the sql.Null* types (sql.NullString, sql.NullInt64, and so on). With that you can also provide the omitempty json tag if you want to omit the field when marshalling the data for a response.
Struct values encode as JSON objects. Each exported struct field
becomes a member of the object, using the field name as the object
key, unless the field is omitted for one of the reasons given below.
The encoding of each struct field can be customized by the format
string stored under the "json" key in the struct field's tag. The
format string gives the name of the field, possibly followed by a
comma-separated list of options. The name may be empty in order to
specify options without overriding the default field name.
The "omitempty" option specifies that the field should be omitted from
the encoding if the field has an empty value, defined as false, 0, a
nil pointer, a nil interface value, and any empty array, slice, map,
or string.
As a special case, if the field tag is "-", the field is always
omitted. Note that a field with name "-" can still be generated using
the tag "-,".
Example of json tags
// Field appears in JSON as key "myName".
Field int `json:"myName"`
// Field appears in JSON as key "myName" and
// the field is omitted from the object if its value is empty,
// as defined above.
Field int `json:"myName,omitempty"`
// Field appears in JSON as key "Field" (the default), but
// the field is skipped if empty.
// Note the leading comma.
Field int `json:",omitempty"`
// Field is ignored by this package.
Field int `json:"-"`
// Field appears in JSON as key "-".
Field int `json:"-,"`
As you can see from the information above, taken from the encoding/json documentation, structs provide a great deal of control over JSON. That's why Golang developers mostly use structs.
As for map[string]interface{}, you should use it when you don't know the structure of the JSON coming from the server or the types of its fields. Most Golang developers stick to structs wherever they can.
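To tie the two points together, a hedged sketch of a struct-based version of step 4's user row, with a nullable age handled via sql.NullInt64 and omitempty (column names come from the example output in the question; the package and function names are mine):
package api

import "database/sql"

// User models one row of the question's example result; age may be NULL.
type User struct {
    UserID   string  `json:"user_id"`
    UserName string  `json:"user_name"`
    Age      *int64  `json:"age,omitempty"` // nil when the column was NULL, so the key is omitted
    Weight   float64 `json:"weight"`
}

// scanUser fills a User from the current row of rows.
func scanUser(rows *sql.Rows) (User, error) {
    var (
        u   User
        age sql.NullInt64
    )
    if err := rows.Scan(&u.UserID, &u.UserName, &age, &u.Weight); err != nil {
        return u, err
    }
    if age.Valid {
        u.Age = &age.Int64
    }
    return u, nil
}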

How to create diff of two generic T:Codable structs in Swift?

Given: I have two structs of the same type, conforming to Codable Protocol.
The structs can be multi-level (nested properties, which of course also conform to Codable). The type is not known at the time of implementation, so I consider it generic, conforming to Codable.
One object is the "base" (say, received from the server); the second is actually a copy of the "base", but modified inside the application.
The intention is to send a request saving the new data, but sending only the "diff" of the two structs. So only the fields that are different should be present in the resulting JSON.
The straightforward way of getting JSON strings for both structs and manipulating them is understandable, but seems to be a last-chance approach...
I've tried the approach with Mirror and recursion, but so far have only managed to make it work for the first level; on the second level of nesting I lose the type of the nested property (whether struct or array) and cannot cast it correctly...
I wonder if it can be made somehow with custom encoder?
P.S.: the generic type should have all properties as Optionals, so should not provide any explicit initializers.
Instead of your "last-chance approach" -- matching JSON strings -- you could use JSONSerialization.jsonObject to convert the JSON data to Foundation objects and perform your comparison on that higher level of abstraction (if that's what you meant in your question in the first place, then sorry - nevermind).
Of course you'd pay an extra penalty of converting your Codable objects to data and then parsing that data into an object hierarchy.