Cast nested []interface{} to []map[string]interface{} - json

My goal is to generate from a JSON a map[string]interface{} structure where all nested []interface{} are cast into []map[string]interface{}. This is because we are using the module https://github.com/ivahaev/go-xlsx-templater to fill an xlsx from JSON, and all the data is expected to be in a map[string]interface{} where every nested []interface{} is a []map[string]interface{}.
Given a JSON input like the following:
{
    "totalAmount": 4,
    "subtotal": 4,
    "Vendors": [
        {
            "MethodOfTenders": [
                { "order": 1, "fees": 2 },
                { "order": 1, "fees": 2 }
            ],
            "subtotalFees": 4
        },
        {
            "MethodOfTenders": [
                { "order": 1, "fees": 2 },
                { "order": 1, "fees": 1 }
            ],
            "subtotalFees": 3
        }
    ]
}
When unmarshalling into a map[string]interface{}, I get the following structure:
map[string]interface{}:
    "totalAmount": 4 interface{}(float64)
    "subtotal":    4 interface{}(float64)
    "Vendors": interface{}([]interface{})
        [0]: interface{}(map[string]interface{})
            "MethodOfTenders": interface{}([]interface{})
                [0]: interface{}(map[string]interface{})
                    "order": 1 interface{}(float64)
                    "fees":  2 interface{}(float64)
                [1]: interface{}(map[string]interface{})
                    "order": 1 interface{}(float64)
                    "fees":  2 interface{}(float64)
            "subtotalFees": 4 interface{}(float64)
        [1]: interface{}(map[string]interface{})
            "MethodOfTenders": interface{}([]interface{})
                [0]: interface{}(map[string]interface{})
                    "order": 1 interface{}(float64)
                    "fees":  2 interface{}(float64)
                [1]: interface{}(map[string]interface{})
                    "order": 1 interface{}(float64)
                    "fees":  1 interface{}(float64)
            "subtotalFees": 3 interface{}(float64)
After some parsing, ranging over each []interface{} and creating a []map[string]interface{} to store each of the nested map[string]interface{} values, I got the desired result where all []interface{} are []map[string]interface{}:
map[string]interface{}:
    "totalAmount": 4 interface{}(float64)
    "subtotal":    4 interface{}(float64)
    "Vendors": interface{}([]map[string]interface{})
        [0]: interface{}(map[string]interface{})
            "MethodOfTenders": interface{}([]map[string]interface{})
                [0]: interface{}(map[string]interface{})
                    "order": 1 interface{}(float64)
                    "fees":  2 interface{}(float64)
                [1]: interface{}(map[string]interface{})
                    "order": 1 interface{}(float64)
                    "fees":  2 interface{}(float64)
            "subtotalFees": 4 interface{}(float64)
        [1]: interface{}(map[string]interface{})
            "MethodOfTenders": interface{}([]map[string]interface{})
                [0]: interface{}(map[string]interface{})
                    "order": 1 interface{}(float64)
                    "fees":  2 interface{}(float64)
                [1]: interface{}(map[string]interface{})
                    "order": 1 interface{}(float64)
                    "fees":  1 interface{}(float64)
            "subtotalFees": 3 interface{}(float64)
Is there any way to walk all the map[string]interface{} values recursively and change the []interface{} values to []map[string]interface{}?
EDIT:
Here is the repository with the full template and JSON, currently working with the reflection approach:
template and json repo

Update
So I've since noticed you added a link to a repo containing what you have so far. The template is odd (some of the values in there aren't part of your sample data), so I removed all fields except the ones that were actually in your sample JSON. As it turns out, the package you are using is quite rough around the edges: it does take care of type assertions in nested values, but not at the top level. I'd personally fork the package and fix that issue, but in the meantime, if you know which fields you want to iterate over, I got it all to work in about 5 minutes with this:
var jsonMap map[string]interface{}
data := sampleData()
err := json.Unmarshal(data, &jsonMap)
if err != nil {
    panic("Final log")
}

ctx := jsonMap
v := jsonMap["Vendors"]
vs := v.([]interface{})
vendors := make([]map[string]interface{}, 0, len(vs))
for _, v := range vs {
    m := v.(map[string]interface{})
    vendors = append(vendors, m)
}
jsonMap["Vendors"] = vendors

doc := xlst.New()
err = doc.ReadTemplate("export_support_template.xlsx")
if err != nil {
    fmt.Println("ERROR OPENING THE TEMPLATE: ", err)
    panic("error opening template")
}
err = doc.Render(ctx)
if err != nil {
    fmt.Println("ERROR RENDERING THE TEMPLATE: ", err)
    panic("error rendering template")
}
err = doc.Save("report.xlsx")
As you can see, I just unmarshalled the data. I then focused on the Vendors key, cast it to a slice (v.([]interface{})), created a new variable of type []map[string]interface{}, and copied over the data, casting the elements of the slice to maps in this simple loop:
for _, v := range vs {
    m := v.(map[string]interface{})
    vendors = append(vendors, m)
}
Then, I just reassigned the Vendors key in the original map (ctx["Vendors"] = vendors) and passed that to the template. Less than 10 lines of code needed, and everything worked like a charm. No need for reflection or any other magic, just straightforward type casts. Without this, the package you're using does complain about the use of range on an interface{} type (instead of first checking whether the key can be successfully cast to a []interface{}). I removed all other functions from your main file (except the sample data one), ran go build, and executed ./json_marshaller. It churned out a correctly populated xlsx file, no problem. Easy.
I'd still recommend you actually create a PR to the templater package so you don't have to handle this stuff yourself. It should be a fairly straightforward change, but for the time being: this approach works perfectly well.
Include your template
Looking at the repository of the go-xlsx-templater, I don't really see why you'd need anything other than this:
var data map[string]interface{}
_ = json.Unmarshal(input, &data)
The reason why this is perfectly fine is that go-xlsx-templater already performs the type conversion of all interface{} values within this map. It traverses the map, checks whether any of the values are of type []interface{}, and then iterates over that slice checking for map[string]interface{} values inside it. It's all in the source code. Going by their own documentation, it should work just fine if your template looks somewhat like this:
| Total: | {{ totalAmount }} |
| Subtotal: | {{ subtotal }} |
| | Vendor subtotal | fees | order |
| {{ range Vendors }} |
| | | {{ MethodOfTenders.fees }} | {{ MethodOfTenders.order }} |
| | {{ subtotalFees }} |
| {{ end }} |
You should end up with the spreadsheet being populated correctly:
| Total: | 4 | | |
| Subtotal: | 4 | | |
| | Vendor subtotal | fees | order |
| | | 2 | 1 |
| | | 2 | 1 |
| | 4 | | |
| | | 2 | 1 |
| | | 1 | 1 |
| | 3 | | |
Essentially, the data set you're showing is no different from the example the repo gives in its very own main README file: the context data there contains nested maps, too, along with a screenshot of a spreadsheet template showing how to use it. Putting it bluntly, I feel like your question is pretty much a case of RTFM...
Without your template, however, there's no way to be sure, so I'm leaving my initial answer in full below...
If I understand what you're asking correctly, you're looking for a way to use type assertions/casts to extract keys from a map of type map[string]interface{} and use the values of type []interface{} as []map[string]interface{}. In this particular example, you need the top level key Vendors to be accessible as []map[string]interface{} (as opposed to interface{} or []interface{}). If that's what you're looking for, you can do this quite easily:
data := map[string]interface{}{}
if err := json.Unmarshal(jsonBytes, &data); err != nil {
    // handle error
}
nested := map[string][]map[string]interface{}{} // subset of data that needs to be accessible as slices of maps
for k, v := range data {
    if s, ok := v.([]interface{}); ok {
        // this key is of type []interface{}, see if we can use it as a slice of maps
        if mapS, err := sliceAnyToMap(s); err == nil {
            nested[k] = mapS
        } // handle error if needed
    }
}
func sliceAnyToMap(s []any) ([]map[string]interface{}, error) {
    ret := make([]map[string]interface{}, 0, len(s))
    for _, v := range s {
        if m, ok := v.(map[string]interface{}); ok {
            ret = append(ret, m)
        } else {
            return nil, errors.New("slice contains non-map data")
        }
    }
    return ret, nil
}
With this, you'll end up with a variable called nested which will contain keys like Vendors, with the data accessible as a []map[string]interface{}. This can be passed on to your templater to populate the spreadsheet with the vendors data.
Demo here
Now, inside some of the Vendors entries there is data which is in turn a map, which still can't be accessed directly. Breaking this sliceAnyToMap logic up into a recursive helper would make it easy to convert all the data you need.
Now, having said all this, I do personally think this is very much an X-Y problem. The data you've shown here looks to be of a very well defined structure. Rather than faffing around with map[string]interface{} and the like, it would be a whole lot easier to read, write, and maintain if you were to use actual types. Based on what you've included in your question, a couple of types like this would be all it takes:
type Data struct {
    Total    float64  `json:"totalAmount"`
    Subtotal float64  `json:"subtotal"`
    Vendors  []Vendor `json:"Vendors"`
}

type Vendor struct {
    MethodOfTenders []MOT   `json:"MethodOfTenders"`
    SubtotalFees    float64 `json:"subtotalFees"`
}

type MOT struct {
    Fees  float64 `json:"fees"`
    Order float64 `json:"order"`
}
With these types, you can quickly, and easily parse the given input into a format that is much, much easier to use further down the line:
data := Data{}
if err := json.Unmarshal(input, &data); err != nil {
    // handle error
}
for _, vendor := range data.Vendors {
    fmt.Printf("Vendor subtotal fees: %f\n", vendor.SubtotalFees)
    for i, mot := range vendor.MethodOfTenders {
        fmt.Printf("MOT %d:\nOrder: %f\nFees: %f\n\n", i+1, mot.Order, mot.Fees)
    }
    fmt.Println("-----")
}
Demo here
The last thing I'm wondering is whether or not all numeric values should be float64. Sure, it makes sense for amounts and/or fees (assuming they're monetary values), but the Order field probably contains an order number, which I wouldn't expect to be a value like 0.123.
Finally, if you need to pass this data on to a templater of sorts which expects a map, then it's a fairly trivial thing to do using the aforementioned types all the same. There's a couple of ways to do this. The easy way being:
func (d Data) ToMap() (map[string]interface{}, error) {
    raw, err := json.Marshal(d)
    if err != nil {
        return nil, err
    }
    asMap := map[string]interface{}{}
    if err := json.Unmarshal(raw, &asMap); err != nil {
        return nil, err
    }
    return asMap, nil
}
JSON marshalling is quick to implement, but it's a tad inefficient, so if performance matters a lot, you could spend a few minutes writing some methods like:
func (d Data) ToMap() map[string]interface{} {
    vendors := make([]map[string]interface{}, 0, len(d.Vendors))
    for _, v := range d.Vendors {
        vendors = append(vendors, v.ToMap())
    }
    return map[string]interface{}{
        "totalAmount": d.Total,
        "subtotal":    d.Subtotal,
        "Vendors":     vendors,
    }
}
func (v Vendor) ToMap() map[string]interface{} {
    mot := make([]map[string]interface{}, 0, len(v.MethodOfTenders))
    for _, m := range v.MethodOfTenders {
        mot = append(mot, m.ToMap())
    }
    return map[string]interface{}{
        "MethodOfTenders": mot,
        "subtotalFees":    v.SubtotalFees,
    }
}
func (m MOT) ToMap() map[string]interface{} {
    return map[string]interface{}{
        "fees":  m.Fees,
        "order": m.Order,
    }
}
Then, turning your Data object into a map is as simple as:
dataMap := data.ToMap()

package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    s := `{
        "totalAmount": 4,
        "subtotal": 4,
        "Vendors": [
            {
                "MethodOfTenders": [
                    { "order": 1, "fees": 2 },
                    { "order": 1, "fees": 2 }
                ],
                "subtotalFees": 4
            },
            {
                "MethodOfTenders": [
                    { "order": 1, "fees": 2 },
                    { "order": 1, "fees": 1 }
                ],
                "subtotalFees": 3
            }
        ]
    }`
    var data map[string]interface{}
    if err := json.Unmarshal([]byte(s), &data); err != nil {
        fmt.Println(err)
    }
    fmt.Printf("%+v\n", data)
}
Playground
When unmarshalling from JSON, this generates the expected output required by go-xlsx-templater.
Note: I created a fix for the issue stated in the question (raised PR here). After the fix, no conversion is required.
After applying the fix, the remaining code (the sampleData() function is not included in the example):
package main

import (
    "encoding/json"
    "fmt"
    "log"
    "os"

    xlst "github.com/ivahaev/go-xlsx-templater"
)

func main() {
    wd, err := os.Getwd()
    if err != nil {
        log.Fatalf(err.Error())
    }
    path := wd
    doc := xlst.New()
    err = doc.ReadTemplate(path + "/export_support_template.xlsx")
    if err != nil {
        fmt.Println("ERROR OPENING THE TEMPLATE: ", err)
        panic("error opening template")
    }
    var ctx map[string]interface{}
    data := sampleData()
    err = json.Unmarshal(data, &ctx)
    if err != nil {
        log.Fatalf("error unmarshalling sample data: %v", err)
    }
    err = doc.Render(ctx)
    if err != nil {
        fmt.Println("ERROR RENDERING THE TEMPLATE: ", err)
        panic("error rendering template")
    }
    err = doc.Save(path + "/report.xlsx")
    if err != nil {
        fmt.Println("ERROR SAVING THE TEMPLATE: ", err)
        panic("error saving template")
    }
}

The trick is performed using reflection. Go down recursively and check the kind of values:
// "Casts" map values to the desired type recursively
func castMap(m map[string]any) map[string]any {
for k := range m {
switch reflect.ValueOf(m[k]).Kind() {
case reflect.Map:
mm, ok := m[k].(map[string]any)
if !ok {
panic(fmt.Errorf("Expected map[string]any, got %T", m[k]))
}
m[k] = castMap(mm)
case reflect.Slice, reflect.Array:
ma, ok := m[k].([]any)
if !ok {
panic(fmt.Errorf("Expected []any, got %T", m[k]))
}
m[k] = castArray(ma)
default:
// fmt.Printf("%s: %T, kind %v\n", k, m[k], reflect.ValueOf(m[k]).Kind())
continue
}
}
return m
}
// "Casts" slice elements to the desired types recursively
func castArray(a []any) []map[string]any {
res := []map[string]any{}
for i := range a {
switch reflect.ValueOf(a[i]).Kind() {
case reflect.Map:
am, ok := a[i].(map[string]any)
if !ok {
panic(fmt.Errorf("Expected map[string]any, got %T", a[i]))
}
am = castMap(am)
res = append(res, am)
default:
panic(fmt.Errorf("Expected map[string]any, got %T", a[i]))
}
}
return res
}
Full example with main is here: https://go.dev/play/p/MEQRe-f3dY1
It's output:
before: map[string]interface {}{"Vendors":[]interface {}{map[string]interface {}{"MethodOfTenders":[]interface {}{map[string]interface {}{"fees":2, "order":1}, map[string]interface {}{"fees":2, "order":1}}, "subtotalFees":4}, map[string]interface {}{"MethodOfTenders":[]interface {}{map[string]interface {}{"fees":2, "order":1}, map[string]interface {}{"fees":1, "order":1}}, "subtotalFees":3}}, "subtotal":4, "totalAmount":4}
after: map[string]interface {}{"Vendors":[]map[string]interface {}{map[string]interface {}{"MethodOfTenders":[]map[string]interface {}{map[string]interface {}{"fees":2, "order":1}, map[string]interface {}{"fees":2, "order":1}}, "subtotalFees":4}, map[string]interface {}{"MethodOfTenders":[]map[string]interface {}{map[string]interface {}{"fees":2, "order":1}, map[string]interface {}{"fees":1, "order":1}}, "subtotalFees":3}}, "subtotal":4, "totalAmount":4}
See? "Vendors" was []interface {}, became []map[string]interface {}
"Vendors[].MethodOfTenders" were []interface {}, became []map[string]interface {}
The functions panic if they see something unexpected. Feel free to modify them to return an error instead if needed.
UPDATE
Here is the same recursive algorithm using type assertions.
// "Casts" map values to the desired type recursively
func castMap(m map[string]any) map[string]any {
for k := range m {
mm, ok := m[k].(map[string]any)
if ok {
m[k] = castMap(mm)
continue
}
ma, ok := m[k].([]any)
if ok {
m[k] = castArray(ma)
continue
}
}
return m
}
// "Casts" slice elements to the desired types recursively
func castArray(a []any) []map[string]any {
res := []map[string]any{}
for i := range a {
am, ok := a[i].(map[string]any)
if ok {
am = castMap(am)
res = append(res, am)
} else {
panic(fmt.Errorf("Expected map[string]any, got %T", a[i]))
}
}
return res
}
Full code https://go.dev/play/p/IbOhQqpisie
BENCHMARK
One of the commenters made a claim that reflection is expensive. This is not the case here:
goos: windows
goarch: amd64
pkg: example.org/try/test
cpu: Intel(R) Core(TM) i7-8550U CPU @ 1.80GHz
BenchmarkCastReflect-8 382574 2926 ns/op 2664 B/op 29 allocs/op
BenchmarkCastType-8 418257 2934 ns/op 2664 B/op 29 allocs/op
The benchmark is here: https://go.dev/play/p/Ro8PeVQy8kA
The playground doesn't execute benchmarks; run it on a real CPU.

Related

Writing multiple columns in a csv file using golang

I'm trying to write a CSV file. I can have 1 to n columns. Currently my data is written correctly, except that it all ends up in the same column.
I would like to have something like this :
NAME|DESCRIPTION|PRODUCER
name1|desc1|false
name2|desc2|true
name3|desc3|false
Here is my code, a small piece of a switch:
case "companies":
var respToolCompanies entrepriseTool.CompaniesResponse
if jsonErr := json.Unmarshal(resByt, &respToolCompanies); jsonErr != nil {
log.Fatalf("unmarshal: %s", jsonErr)
}
for _, mapping := range mappings {
writeHeader(csvwriter, mapping)
for _, company := range respToolCompanies.Companies {
writeDataAccordinglyToFieldType(mapping, company, csvwriter)
}
csvwriter.Flush()
}
The writeDataAccordinglyToFieldType function:
func writeDataAccordinglyToFieldType(mapping ExportmappingsModel, entities interface{}, csvwriter *csv.Writer) {
    switch mapping.SourceColType.String {
    case "string":
        field := extractFieldValue(entities, mapping)
        writeFieldToBuffer(csvwriter, field.String())
    case "number":
        field := extractFieldValue(entities, mapping)
        valInt := field.Int()
        str := strconv.Itoa(int(valInt))
        writeFieldToBuffer(csvwriter, str)
    case "bool":
        field := extractFieldValue(entities, mapping)
        var boolVal string
        if field.Bool() {
            boolVal = "true"
        } else {
            boolVal = "false"
        }
        writeFieldToBuffer(csvwriter, boolVal)
    }
}
And where I write data:
func writeFieldToBuffer(csvwriter *csv.Writer, field string) {
    err := csvwriter.Write([]string{field})
    if err != nil {
        log.Println("Unable to write a line inside the file")
    }
}
csv.Write will write to different columns only when your string slice has multiple elements. Currently you are writing each field one by one, using a slice that only holds one value at a time.
I am not saying that you have to pass the delimiter. Rather, populate the string slice all at once so that csv.Write automatically iterates over the slice and writes each element in a new column. So change
err := csvwriter.Write([]string{field})
to something like this:
record := []string{"name1", "desc1", "false"}
if err := csvwriter.Write(record); err != nil {
    log.Fatalln("error writing record to file", err)
}
Or you can populate the whole thing using a two-dimensional slice and then WriteAll at the end:
records := [][]string{{"name1", "desc1", "false"}, {"name2", "desc2", "false"}}
if err := csvwriter.WriteAll(records); err != nil {
    log.Fatalln("error writing record to file", err)
}
First things first: Go's csv.Writer writes a record, and a record is a slice of strings. Looking at the example from the csv package documentation:
records := [][]string{
    {"first_name", "last_name", "username"},
    {"Rob", "Pike", "rob"},
    {"Ken", "Thompson", "ken"},
    {"Robert", "Griesemer", "gri"},
}

w := csv.NewWriter(os.Stdout)

for _, record := range records {
    if err := w.Write(record); err != nil {
        log.Fatalln("error writing record to csv:", err)
    }
}
we can see that we write to a CSV file row-by-row, not column-by-column. So any solution you come up with must include a complete row when you call writer.Write(...).
That said, going between a struct and the csv Writer can be difficult. Have you looked at Gocarina's gocsv package? It would reduce (what I imagine to be) your problem down to something like the following:
import (
    "encoding/json"
    "os"

    "github.com/gocarina/gocsv"
)

type company struct {
    Name        string `json:"name" csv:"Name"`
    Description string `json:"description" csv:"Description"`
    Producer    bool   `json:"producer" csv:"Producer"`
    Employees   int    `json:"employees" csv:"Employees"`
}

var blob = `[
    { "name": "Foo To You", "description": "Selling Foo since before the dinosaurs", "producer": false, "employees": 7 },
    { "name": "Bar Mfg.", "description": "Best makers of Bar, bar none", "producer": true, "employees": 12 }
]`

func main() {
    var companies []company
    json.Unmarshal([]byte(blob), &companies)
    gocsv.Marshal(companies, os.Stdout)
}
to produce this CSV (marked-up as a table):
| Name | Description | Producer | Employees |
|------------|----------------------------------------|----------|-----------|
| Foo To You | Selling Foo since before the dinosaurs | false | 7 |
| Bar Mfg. | Best makers of Bar, bar none | true | 12 |

handle unstructured JSON data using the standard Go unmarshaller

Some context, I'm designing a backend that will receive JSON post data, but the nature of the data is that it has fields that are unstructured. My general research tells me this is a static language vs unstructured data problem.
Normally, you can create a struct if the data is well known and just unmarshal into it. I have created custom unmarshaling functions for nested objects.
The issue now is that one of the fields could contain an object with an arbitrary number of keys. To provide some code context:
properties: {
    "k1": "v1",
    "k2": "v2",
    "k3": "v3",
    ...
}

type Device struct {
    id: string,
    name: string,
    status: int,
    properties: <what would i put here?>
}
So it's hard to code an explicit unmarshaling function for it. Should I use a type of map[string]string{}? How would it work if the values were not all strings? And what if that object itself had nested values/objects as well?
You can make the Properties field a map[string]interface{} so that it can accommodate different types of values. I created a small example for your scenario:
package main

import (
    "encoding/json"
    "fmt"
)

type Device struct {
    Id         string
    Name       string
    Status     int
    Properties map[string]interface{}
}

func main() {
    devObj := Device{}
    data := []byte(`{"Id":"101","Name":"Harold","Status":1,"properties":{"key1":"val1"}}`)
    if err := json.Unmarshal(data, &devObj); err != nil {
        panic(err)
    }
    fmt.Println(devObj)

    devObj2 := Device{}
    data2 := []byte(`{"Id":"102","Name":"Thanor","Status":1,"properties":{"k1":25,"k2":"someData"}}`)
    if err := json.Unmarshal(data2, &devObj2); err != nil {
        panic(err)
    }
    fmt.Println(devObj2)

    devObj3 := Device{}
    data3 := []byte(`{"Id":"101","Name":"GreyBeard","Status":1,"properties":{"k1":25,"k2":["data1","data2"]}}`)
    if err := json.Unmarshal(data3, &devObj3); err != nil {
        panic(err)
    }
    fmt.Println(devObj3)
}
Output:
{101 Harold 1 map[key1:val1]}
{102 Thanor 1 map[k1:25 k2:someData]}
{101 GreyBeard 1 map[k1:25 k2:[data1 data2]]}
I would use one of the popular Go JSON parsers that don't require parsing to a pre-defined struct. An added benefit, and the primary reason they were created, is that they are much faster than encoding/json because they don't use reflection, interface{} or certain other approaches.
Here are two:
https://github.com/buger/jsonparser - 4.1k GitHub stars
https://github.com/valyala/fastjson - 1.3k GitHub stars
Using github.com/buger/jsonparser, a property could be retrieved using the GetString function:
func GetString(data []byte, keys ...string) (val string, err error)
Here's a full example:
package main

import (
    "fmt"
    "strconv"

    "github.com/buger/jsonparser"
)

func main() {
    jsondata := []byte(`
    {
        "properties": {
            "k1": "v1",
            "k2": "v2",
            "k3": "v3"
        }
    }`)

    for i := 1; i > 0; i++ {
        key := "k" + strconv.Itoa(i)
        val, err := jsonparser.GetString(jsondata, "properties", key)
        if err == jsonparser.KeyPathNotFoundError {
            break
        } else if err != nil {
            panic(err)
        }
        fmt.Printf("found: key [%s] val [%s]\n", key, val)
    }
}
See it run on Go Playground: https://play.golang.org/p/ykAM4gac8zT

Proper json unmarshaling in Go with the empty interface

I'm currently learning golang and (probably as many others before me) I'm trying to properly understand the empty interface.
As an exercise, I'm reading a big json file produced by Postman and trying to access just one field (out of the many available).
Here is a simple representation of the json without the unnecessary fields I don't want to read (but that are still there):
{
    "results": [
        {
            "times": [1, 2, 3, 4]
        }
    ]
}
Since the JSON object is big, I opted out of unmarshaling it into a custom struct and decided to use the empty interface, interface{}.
After some time, I managed to get some working code, but I'm quite sure this isn't the correct way of doing it.
byteValue, _ := ioutil.ReadAll(jsonFile)

var result map[string]interface{}
err = json.Unmarshal(byteValue, &result)
if err != nil {
    log.Fatalln(err)
}

// ESPECIALLY UGLY
r := result["results"].([]interface{})
r1 := r[0].(map[string]interface{})
r2 := r1["times"].([]interface{})
times := make([]float64, len(r2))
for i := range r2 {
    times[i] = r2[i].(float64)
}
Is there a better way to navigate through my JSON object without having to instantiate new variables every time I move deeper into the object?
Even if the JSON is large, you only have to define the fields you actually care about
You only need to use JSON tags if the keys aren't valid Go identifiers (the keys are valid identifiers in this case); even then you can sometimes avoid them by using a map[string]something
Unless you need the sub structs for some function or whatnot, you don't need to define them
Unless you need to reuse the type, you don't even have to define that, you can just define the struct at declaration time
Example:
package main

import (
    "encoding/json"
    "fmt"
)

const s = `
{
    "results": [
        {
            "times": [1, 2, 3, 4]
        }
    ]
}
`

func main() {
    var t struct {
        Results []struct {
            Times []int
        }
    }
    json.Unmarshal([]byte(s), &t)
    fmt.Printf("%+v\n", t) // {Results:[{Times:[1 2 3 4]}]}
}
[...] trying to access just one field (out of the many available).
For this concrete use case I would use a library to query and access to a single value in a known path like:
https://github.com/jmespath/go-jmespath
On the other hand, if you're practicing how to access nested values in JSON, I would recommend you try writing a recursive function that follows a path in an unknown structure, the same way (but simpler) that go-jmespath does.
Ok, I challenged myself and spent an hour writing this. It works. Not sure about performance or bugs and it's really limited :)
https://play.golang.org/p/dlIsmG6Lk-p
package main

import (
    "encoding/json"
    "errors"
    "fmt"
    "strings"
)

func main() {
    // I just added a bit more data to the structure to be able to test different paths
    fileContent := []byte(`
    {
        "results": [
            {"times": [1, 2, 3, 4]},
            {"times2": [5, 6, 7, 8]},
            {"username": "rosadabril"},
            {"age": 42},
            {"location": [41.5933262, 1.8376757]}
        ],
        "more_results": {
            "nested_1": {
                "nested_2": {
                    "foo": "bar"
                }
            }
        }
    }`)

    var content map[string]interface{}
    if err := json.Unmarshal(fileContent, &content); err != nil {
        panic(err)
    }

    // some paths to test
    valuePaths := []string{
        "results.times",
        "results.times2",
        "results.username",
        "results.age",
        "results.doesnotexist",
        "more_results.nested_1.nested_2.foo",
    }

    for _, p := range valuePaths {
        breadcrumbs := strings.Split(p, ".")
        value, err := search(breadcrumbs, content)
        if err != nil {
            fmt.Printf("\nerror searching '%s': %s\n", p, err)
            continue
        }
        fmt.Printf("\nFOUND A VALUE IN: %s\n", p)
        fmt.Printf("Type: %T\nValue: %#v\n", value, value)
    }
}

// search is our fantastic recursive function! The idea is to search the structure
// in a very basic way; for complex querying use jmespath.
func search(breadcrumbs []string, content map[string]interface{}) (interface{}, error) {
    // we should never hit this point, but better safe than sorry: without this
    // check we could incur an out-of-range error below
    if len(breadcrumbs) == 0 {
        return nil, errors.New("ran out of breadcrumbs :'(")
    }

    // flag that indicates we are at the end of our trip and we should return
    // the value without further checks
    lastBreadcrumb := len(breadcrumbs) == 1

    // the current breadcrumb is always the first element
    currentBreadcrumb := breadcrumbs[0]

    if value, found := content[currentBreadcrumb]; found {
        if lastBreadcrumb {
            return value, nil
        }

        // if the value is a map[string]interface{}, go down the rabbit hole: recursion!
        if aMap, isAMap := value.(map[string]interface{}); isAMap {
            // we call ourselves, popping the first breadcrumb and passing the current map
            return search(breadcrumbs[1:], aMap)
        }

        // if it's an array of interfaces, the thing gets complicated :(
        if anArray, isArray := value.([]interface{}); isArray {
            for _, something := range anArray {
                if aMap, isAMap := something.(map[string]interface{}); isAMap && len(breadcrumbs) > 1 {
                    if v, err := search(breadcrumbs[1:], aMap); err == nil {
                        return v, nil
                    }
                }
            }
        }
    }
    return nil, errors.New("woops, nothing here")
}

How to check if a json matches a struct / struct fields

Is there an easy way to check if each field of myStruct was mapped by using json.Unmarshal(jsonData, &myStruct).
The only way I could imagine is to define each field of the struct as a pointer; otherwise you will always get back an initialized struct.
So every JSON string that is an object (even an empty one, {}) will return an initialized struct, and you cannot tell whether the JSON actually represented your struct.
The only solution I could think of is quite uncomfortable:
package main

import (
    "encoding/json"
    "fmt"
)

type Person struct {
    Name *string `json:"name"`
    Age  *int    `json:"age"`
    Male *bool   `json:"male"`
}

func main() {
    var p *Person
    err := json.Unmarshal([]byte("{}"), &p)
    // handle parse error
    if err != nil {
        return
    }
    // handle json did not match error
    if p.Name == nil || p.Age == nil || p.Male == nil {
        return
    }
    // now use the fields with dereferencing and hope you did not forget a nil check
    fmt.Println("Hello " + *p.Name)
}
Maybe one could use a library like govalidator with SetFieldsRequiredByDefault. But then you still have to execute the validation, and you are still left with all the pointer dereferencing for value retrieval and the risk of nil pointers.
What I would like is a function that returns my unmarshaled json as a struct or an error if the fields did not match. The only thing the golang json library offers is an option to fail on unknown fields but not to fail on missing fields.
Any idea?
Another way would be to implement your own json.Unmarshaler which uses reflection (similar to the default json unmarshaler):
There are a few points to consider:
if speed is of great importance to you, then you should write a benchmark to see how big the impact of the extra reflection is. I suspect it's negligible, but it can't hurt to write a small Go benchmark to get some numbers.
the stdlib will unmarshal all numbers in your json input into floats. So if you use reflection to set integer fields then you need to provide the corresponding conversion yourself (see TODO in example below)
the json.Decoder.DisallowUnknownFields function will not work as expected with your type. You need to implement this yourself (see example below)
if you decide to take this approach you will make your code more complex and thus harder to understand and maintain. Are you actually sure you must know if fields are omitted? Maybe you can refactor your fields to make good use of the zero values?
Here a fully executable test of this approach:
package sandbox

import (
    "encoding/json"
    "errors"
    "reflect"
    "strings"
    "testing"
)

type Person struct {
    Name string
    City string
}

func (p *Person) UnmarshalJSON(data []byte) error {
    var m map[string]interface{}
    err := json.Unmarshal(data, &m)
    if err != nil {
        return err
    }

    v := reflect.ValueOf(p).Elem()
    t := v.Type()

    var missing []string
    for i := 0; i < t.NumField(); i++ {
        field := t.Field(i)
        val, ok := m[field.Name]
        delete(m, field.Name)
        if !ok {
            missing = append(missing, field.Name)
            continue
        }
        switch field.Type.Kind() {
        // TODO: if the field is an integer you need to transform the val from float
        default:
            v.Field(i).Set(reflect.ValueOf(val))
        }
    }

    if len(missing) > 0 {
        return errors.New("missing fields: " + strings.Join(missing, ", "))
    }

    if len(m) > 0 {
        extra := make([]string, 0, len(m))
        for field := range m {
            extra = append(extra, field)
        }
        // TODO: consider sorting the output to get deterministic errors:
        // sort.Strings(extra)
        return errors.New("unknown fields: " + strings.Join(extra, ", "))
    }

    return nil
}
func TestJSONDecoder(t *testing.T) {
    cases := map[string]struct {
        in       string
        err      string
        expected Person
    }{
        "Empty object": {
            in:       `{}`,
            err:      "missing fields: Name, City",
            expected: Person{},
        },
        "Name missing": {
            in:       `{"City": "Berlin"}`,
            err:      "missing fields: Name",
            expected: Person{City: "Berlin"},
        },
        "Age missing": {
            in:       `{"Name": "Friedrich"}`,
            err:      "missing fields: City",
            expected: Person{Name: "Friedrich"},
        },
        "Unknown field": {
            in:       `{"Name": "Friedrich", "City": "Berlin", "Test": true}`,
            err:      "unknown fields: Test",
            expected: Person{Name: "Friedrich", City: "Berlin"},
        },
        "OK": {
            in:       `{"Name": "Friedrich", "City": "Berlin"}`,
            expected: Person{Name: "Friedrich", City: "Berlin"},
        },
    }

    for name, c := range cases {
        t.Run(name, func(t *testing.T) {
            var actual Person
            r := strings.NewReader(c.in)
            err := json.NewDecoder(r).Decode(&actual)
            switch {
            case err != nil && c.err == "":
                t.Errorf("Expected no error but got %v", err)
            case err == nil && c.err != "":
                t.Errorf("Did not return expected error %v", c.err)
            case err != nil && err.Error() != c.err:
                t.Errorf("Expected error %q but got %v", c.err, err)
            }
            if !reflect.DeepEqual(c.expected, actual) {
                t.Errorf("\nWant: %+v\nGot: %+v", c.expected, actual)
            }
        })
    }
}
You could compare *p with an empty struct, instead of comparing each field with nil:
// handle json did not match error
if *p == (Person{}) {
    return
}
Since Person{} initializes each field with its zero value, this comparison succeeds when every pointer field is nil; for non-pointer fields, strings would be "", ints 0, and so on.

Unmarshaling string encoded json ints with nulls uses previous value when null

I am trying to unmarshal json that contains ints encoded as strings. Using tags to specify that the field is encoded as a string works, but I am running into issues when the field is null. The main problem, it seems, is that the null is not encoded as a string so the parser ignores it and keeps going. The problem is that it jams in the previously decoded value for some reason.
Any ideas on how I can get this parsing correctly?
I have the following code:
package main

import (
    "encoding/json"
    "log"
)

type Product struct {
    Price int `json:",string,omitempty"`
}

func main() {
    data := `
    [
        {"price": "1"},
        {"price": null},
        {"price": "2"}
    ]
    `
    var products []Product
    if err := json.Unmarshal([]byte(data), &products); err != nil {
        log.Printf("%#v", err)
    }
    log.Printf("%#v", products)
}
Output:
[]main.Product{main.Product{Price:1}, main.Product{Price:1}, main.Product{Price:2}}
Code on go playground
Feels like a bug in the json package.
You can work around it with a custom Unmarshaler, like this, although it may be annoying if you've got a complex struct:
func (p *Product) UnmarshalJSON(b []byte) error {
    m := map[string]string{}
    err := json.Unmarshal(b, &m)
    if err != nil {
        return err
    }
    if priceStr, ok := m["price"]; ok {
        p.Price, _ = strconv.Atoi(priceStr)
    }
    return nil
}
http://play.golang.org/p/UKjfVqHCGi