Let's say I have this type:
type Foo struct {
	Bar string `json:"bar"`
}
and I want to unmarshal this json into it:
in := []byte(`{"bar":"aaa", "baz":123}`)
foo := &Foo{}
json.Unmarshal(in, foo)
will succeed just fine. I would like to at least know that there were fields that were skipped in the processing. Is there any good way to access that information?
playground snippet
As you're probably aware, you can unmarshal any valid JSON into a map[string]interface{}. Once you've unmarshaled into an instance of Foo, there is no metadata available to tell you which fields were skipped. You could, however, unmarshal into both types and then check the map for keys that don't correspond to fields on Foo.
in := []byte(`{"bar":"aaa", "baz":123}`)
foo := &Foo{}
json.Unmarshal(in, foo)

allFields := map[string]interface{}{}
json.Unmarshal(in, &allFields)

for k := range allFields {
	fmt.Println(k)
	// could also use reflect to get the field names from Foo and then
	// append any key that is in allFields but not on Foo to a slice
	// (see the reflection sketch below)
}
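To make that comment concrete, here is a minimal sketch of the reflection approach; the unknownKeys helper is made up for illustration and assumes a flat struct whose fields carry json tags:

package main

import (
	"encoding/json"
	"fmt"
	"reflect"
	"strings"
)

type Foo struct {
	Bar string `json:"bar"`
}

// unknownKeys reports the JSON keys that have no matching field on dst.
// dst is assumed to be a pointer to a flat struct with json tags.
func unknownKeys(in []byte, dst interface{}) ([]string, error) {
	var all map[string]interface{}
	if err := json.Unmarshal(in, &all); err != nil {
		return nil, err
	}

	known := map[string]bool{}
	t := reflect.TypeOf(dst).Elem()
	for i := 0; i < t.NumField(); i++ {
		name := strings.Split(t.Field(i).Tag.Get("json"), ",")[0]
		if name == "" {
			name = t.Field(i).Name
		}
		known[name] = true
	}

	var unknown []string
	for k := range all {
		if !known[k] {
			unknown = append(unknown, k)
		}
	}
	return unknown, nil
}

func main() {
	in := []byte(`{"bar":"aaa", "baz":123}`)
	keys, err := unknownKeys(in, &Foo{})
	if err != nil {
		panic(err)
	}
	fmt.Println(keys) // [baz]
}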
Related
I have a csv response that comes from an endpoint that I don't control and I'm failing to parse its response because it has quotes. It looks something like this:
name,id,quantity,"status (active, expired)"
John,14,4,active
Bob,12,7,expired
to parse this response I have created the following struct:
type UserInfo struct {
Name string `csv:"name"`
ID string `csv:"id"`
Quantity string `csv:"quantity"`
Status string `csv:"status (active, expired)"`
}
I have tried using
Status string `csv:""status (active, expired)""`
Status string `csv:'"status (active, expired)"'`
but neither seems to help; I just can't access the Status field when I use gocsv.Unmarshal.
var actualResult []UserInfo
err = gocsv.Unmarshal(in, &actualResult)
for _, elem := range actualResult {
fmt.Println(elem.Status)
}
And I get nothing as a response.
Here's an example: https://go.dev/play/p/lje1zNO9w6E
You don't need a third-party package like gocsv (unless you have a specific use case) when this can be done easily with Go's built-in encoding/csv.
You just have to skip the first line/record, which is the CSV header in your endpoint's response.
csvReader := csv.NewReader(strings.NewReader(csvString))
records, err := csvReader.ReadAll()
if err != nil {
panic(err)
}
var users []UserInfo
// Iterate over all records excluding first one i.e., header
for _, record := range records[1:] {
users = append(users, UserInfo{Name: record[0], ID: record[1], Quantity: record[2], Status: record[3]})
}
fmt.Printf("%v", users)
// Output: [{John 14 4 active} {Bob 12 7 expired}]
Here is a working example on the Go Playground based on your use case and sample string.
I simply don't think gocarina/gocsv can parse a header with a quoted comma. I don't see it spelled out anywhere in the documentation that it cannot, but I did some digging and there are clear examples of commas being used in the "CSV annotations", and it looks like the author only conceived of commas in the annotations being used for the purposes of the package/API, and not as part of the column name.
If we look at sample_structs_test.go from the package, we can see commas being used in some of the following ways:
in metadata directives, like "omitempty":
type Sample struct {
	Foo  string  `csv:"foo"`
	Bar  int     `csv:"BAR"`
	Baz  string  `csv:"Baz"`
	...
	Omit *string `csv:"Omit,omitempty"`
}
for declaring that a field in the struct can be populated from multiple, different headers:
type MultiTagSample struct {
	Foo string `csv:"Baz,foo"`
	Bar int    `csv:"BAR"`
}
You can see this in action, here.
FWIW, the official encoding/json package has the same limitation, and they note it (emphasis added):
The encoding of each struct field can be customized by the format string stored under the "json" key in the struct field's tag. The format string gives the name of the field, possibly followed by a comma-separated list of options. The name may be empty in order to specify options without overriding the default field name.
and
The key name will be used if it's a non-empty string consisting of only Unicode letters, digits, and ASCII punctuation except quotation marks, backslash, and comma.
So, you may not be able to get what you expect/want: sorry, this may just be a limitation of having the ability to annotate your structs. If you want, you could file a bug with gocarina/gocsv.
In the meantime, you can just modify the header as it's coming in. This example is pretty hacky, but it works: it replaces "status (active, expired)" with "status (active expired)" and uses the comma-less version to annotate the struct.
endpointReader := strings.NewReader(sCSV)
// Fix header
var bTmp bytes.Buffer
fixer := bufio.NewReader(endpointReader)
header, _ := fixer.ReadString('\n')
header = strings.Replace(header, "\"status (active, expired)\"", "status (active expired)", -1)
bTmp.Write([]byte(header))
// Read rest of CSV
bTmp.ReadFrom(fixer)
// Turn back into a reader
reader := bytes.NewReader(bTmp.Bytes())
var actualResult []UserInfo
...
I can run that and now get:
active
expired
I have a general enough function for going through a map[string] and getting all keys:
i := 0
keys := make([]string, len(input))
for k := range input {
keys[i] = k
i++
}
return keys
My problem is I have two different inputs I want to throw in here, a map[string]MyStruct and a map[string][][]float64. Whenever I've tried having the input to the func be map[string]interface{}, Go resists all my attempts to cast the map[string]MyStruct to a map[string]interface{}. Is there a way I can do this without needing two functions, one with map[string]MyStruct as input and one with map[string][][]float64? The contents of the maps don't matter at this point, because I'm just trying to get all their keys for use later in the code. This needs to be a function that's called; we're using Sonar, and it's set to refuse code duplication, so I can't have this code snippet duplicated.
Until the next Go version brings us generics, there are several ways to cope with it:
Duplicate the code.
Use code generation: design a template and have it filled in at build time (e.g. via go generate).
Use interface{} as the input type of the function and then use reflection (or a type switch) to work out which type was given to the function.
I'm pretty sure that in this case the general code will be more complicated than two separate functions.
func getKeys(input interface{}) []string {
	switch inp := input.(type) {
	case map[string]MyStruct:
		keys := make([]string, 0, len(inp))
		for k := range inp {
			keys = append(keys, k)
		}
		return keys
	case map[string][][]float64:
		...
	default:
		fmt.Printf("I don't know about type %T!\n", input)
	}
	return nil
}
You cannot use map[string]interface{} as the input type and pass a map[string]MyStruct or map[string][][]float64 to it; Go won't convert each value to the new type. You have to take the most general type, interface{}, and then assert or reflect on it.
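If the type switch gets unwieldy, option 3 above can also be done with the reflect package. This is a rough sketch, not production code: it assumes every input is a map keyed by strings and will panic otherwise.

package main

import (
	"fmt"
	"reflect"
)

// getKeys works for any map with string keys; the value type is irrelevant.
// It panics if input is not a map, so callers are assumed to pass maps only.
func getKeys(input interface{}) []string {
	v := reflect.ValueOf(input)
	keys := make([]string, 0, v.Len())
	for _, k := range v.MapKeys() {
		keys = append(keys, k.String())
	}
	return keys
}

func main() {
	a := map[string][][]float64{"x": nil, "y": nil}
	b := map[string]int{"one": 1}
	fmt.Println(getKeys(a)) // e.g. [x y] (map iteration order is not guaranteed)
	fmt.Println(getKeys(b)) // [one]
}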
I am connecting to an API which returns a rather large JSON payload. I need to add a key and value to the root object. Once I read the response body (from net/http) with ioutil.ReadAll, the JSON is a byte array.
My goal is to just simply add to the structure and move on. As an example, the following the pretty similar to what I am doing: https://tutorialedge.net/golang/consuming-restful-api-with-go/
So how can I simply add to a JSON structure another element (key: value)?
If all you want to do is add a key and value to the root object and produce new JSON, and you don't care about having the data in a structure, you can unmarshal into map[string]interface{}, add your value, and then marshal again:
var m map[string]interface{}
err := json.Unmarshal(data, &m)
m["new_key"] = newValue
newData, err := json.Marshal(m)
(I'm not checking for errors, but you should do that of course.) Take a look at https://golang.org/pkg/encoding/json/ for more information about how to deal with JSON in Go.
Here's how you can do it in an efficient way while preserving the order of keys in the original JSON. The idea is to use json.RawMessage
// original JSON
bytes := []byte(`{"name": "Adele", "age": 24}`)
// let's try to add the kv pair "country": "USA" to it
type Person struct {
Bio json.RawMessage `json:"bio"`
Country string `json:"country"`
}
p := Person{
Bio: json.RawMessage(bytes),
Country: "USA",
}
// ignoring error for brevity
modifiedBytes, _ := json.Marshal(p)
fmt.Println(string(modifiedBytes))
Output:
{"bio":{"name":"Adele","age":24},"country":"USA"}
You can see that the ordering of original JSON is preserved, which wouldn't have been the case if you marshalled the JSON to a map[string]interface{}. This is also more efficient when you're dealing with huge JSONs since there's no reflection involved.
Complete code - https://play.golang.org/p/3hAPVbrAo_w
Since you have byte data, you need to parse it and store the result in a variable that matches your JSON structure, using json.Unmarshal.
Then, to add a new key/value pair, you can define a new struct that embeds the original type and adds the new field:
// Unmarshal the byte data into a variable of your existing type
var variable type1
json.Unmarshal(data, &variable)
// To add a new key/value, define a new type that embeds the original
type type2 struct {
	type1
	key type
}
// and then combine them
variable2 := type2{variable, newValueToAdd}
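To make that sketch concrete, here is a hypothetical, runnable version of the embedding approach; the type and field names are invented for illustration:

package main

import (
	"encoding/json"
	"fmt"
)

// Original stands in for whatever struct matches your existing JSON.
type Original struct {
	Foo string `json:"foo"`
}

// WithExtra embeds Original and adds the new key.
type WithExtra struct {
	Original
	Baz string `json:"baz"`
}

func main() {
	data := []byte(`{"foo":"hello"}`)

	var orig Original
	if err := json.Unmarshal(data, &orig); err != nil {
		panic(err)
	}

	out, err := json.Marshal(WithExtra{Original: orig, Baz: "world"})
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out)) // {"foo":"hello","baz":"world"}
}

Because the embedded struct's fields are promoted, the new key ends up alongside the original ones at the root of the object.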
While deserializing & reserializing is the more "correct" approach, it may be overkill for just adding a value at the root, which can be done with simple string manipulation (really byte slice manipulation, but the semantics are similar and it's arguably easier):
data := []byte(`{"foo": 1, "bar": true}`)
ins := []byte(`, "baz": "hello"`) // Note the leading comma.
closingBraceIdx := bytes.LastIndexByte(data, '}')
data = append(data[:closingBraceIdx], ins...)
data = append(data, '}')
This is more error-prone, because it is unaware of JSON syntax entirely, but it's safe enough for most cases and for large JSON documents it is more efficient than a parse, insert, and reserialize.
Playground example: https://play.golang.org/p/h8kL4Zzp7rq
If you want to add the key/value pairs of existing JSON bytes to a new JSON object, you can use json.RawMessage.
type Res struct {
	Data    interface{}
	Message string
}
var row json.RawMessage
row = []byte(`{"Name":"xxx","Sex":1}`)
res := Res{Data: row, Message: "user"}
resBytes, _ := json.Marshal(res)
println(string(resBytes))
// prints: {"Data":{"Name":"xxx","Sex":1},"Message":"user"}
The SJSON package is another way to modify JSON values.
In this example:
json := `{
  "name": {"first":"James","last":"Bond"},
  "age": 40,
  "license": ["Diamond","Gold","Silver"]
}`
To replace "Diamond" with "Ultimate":
value, _ := sjson.Set(json, "license.0", "Ultimate")
fmt.Println(value)
{
  "name": {"first":"James","last":"Bond"},
  "age": 40,
  "license": ["Ultimate","Gold","Silver"]
}
Or to add the value "Ultimate" at the end:
value, _ := sjson.Set(json, "license.-1", "Ultimate")
fmt.Println(value)
{
  "name": {"first":"James","last":"Bond"},
  "age": 40,
  "license": ["Diamond","Gold","Silver","Ultimate"]
}
What is an idiomatic Go way to dump a struct into the provided CSV file? I am inside a func where my struct is passed as interface{}:
func decode_and_csv(my_response *http.Response, my_struct interface{})
Why interface{}? I'm reading data from JSON and there could be a few different structs returned, so I'm trying to write a generic enough function.
An example of my types:
type Location []struct {
	Name   string `json:"Name"`
	Region string `json:"Region"`
	Type   string `json:"Type"`
}
It would be a lot easier if you used a concrete type. You'll probably want to use the encoding/csv package; here is a relevant example: https://golang.org/pkg/encoding/csv/#example_Writer
As you can see, the Write method expects a []string, so to generate one you'll have to either 1) provide a helper method or 2) reflect over my_struct. Personally, I prefer the first method, but it depends on your needs. If you go route two, you can get all the fields on the struct and use them as the column headers, then iterate over the fields getting the value of each, appending the values to a []string inside the loop, and pass that slice to Write outside the loop.
For the first option, I would define a ToSlice method (or something like it) on each type, and then define an interface, call it CsvAble, that requires the ToSlice method. Change the parameter in your function to my_struct CsvAble instead of using the empty interface, and then you can just call ToSlice on my_struct and pass the return value into Write. You could have that return the column headers as well (meaning you would get back a [][]string and need to iterate over the outer dimension, passing each []string into Write), or you could require another method to satisfy the interface, say GetHeaders, which returns the column headers as a []string. In that case your code would look something like this:
w := csv.NewWriter(os.Stdout)
headers := my_struct.GetHeaders()
values := my_struct.ToSlice()
if err := w.Write(headers); err != nil {
//write failed do something
}
if err := w.Write(values); err != nil {
//write failed do something
}
If that doesn't make sense let me know and I can follow up with a code sample for either of the two approaches.
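For route two (reflection), a rough sketch might look like the following. It assumes a flat struct of exported fields rather than the slice-of-structs Location type from the question, and the structToRecord helper is made up:

package main

import (
	"encoding/csv"
	"fmt"
	"os"
	"reflect"
)

type Location struct {
	Name   string `json:"Name"`
	Region string `json:"Region"`
	Type   string `json:"Type"`
}

// structToRecord uses reflection to build the header row and the value row
// for any flat struct of exported fields.
func structToRecord(v interface{}) (headers, values []string) {
	val := reflect.ValueOf(v)
	t := val.Type()
	for i := 0; i < t.NumField(); i++ {
		headers = append(headers, t.Field(i).Name)
		values = append(values, fmt.Sprint(val.Field(i).Interface()))
	}
	return headers, values
}

func main() {
	loc := Location{Name: "Berlin", Region: "BE", Type: "city"}
	headers, values := structToRecord(loc)

	w := csv.NewWriter(os.Stdout)
	w.Write(headers)
	w.Write(values)
	w.Flush()
}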
My original problem is that I want to parse URL.Values into a generic type (map[interface{}]interface{}), edit/add some values, then convert it to a JSON string and put it in a PostgreSQL JSON column.
I tried this code to parse it but content seems to be null whereas err is false. request.URL.Query() prints a nice map object.
v := reflect.ValueOf(request.URL.Query())
i := v.Interface()
content, err := i.(map[interface{}]interface{})
// Do some operations
jsonString, _ := json.Marshal(content)
// Add to DB
Why is it null? Also am I thinking too generic?
content, err := i.(map[interface{}]interface{}) is not a cast, it's a type assertion. You're saying (asserting) that the interface is of type map[interface{}]interface{}; it isn't. It's of type map[string][]string, because that's what url.Values is. You get null as the value because the assertion fails, and the second return value (conventionally named ok rather than err) is false for the same reason.
Are you thinking too generically? Of course you are. I can't think of any reason why the collection's type needs to change. Append what you want to it and write it to your db; there's nothing preventing that, as far as I know.
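For illustration, here is a minimal sketch that keeps the url.Values as it is, edits it, and marshals it straight to JSON; the query string and the added key are made up:

package main

import (
	"encoding/json"
	"fmt"
	"net/url"
)

func main() {
	// Stand-in for request.URL.Query(), which is a url.Values (map[string][]string).
	values, err := url.ParseQuery("name=adele&tags=a&tags=b")
	if err != nil {
		panic(err)
	}

	// Edit or add whatever you need; no generic map type is required.
	values["source"] = []string{"api"}

	jsonString, err := json.Marshal(values)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(jsonString))
	// e.g. {"name":["adele"],"source":["api"],"tags":["a","b"]}
}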