Need to parse integers in JSON as integers, not floats - json

First of all let me explain the problem.
I have a stream of JSON records coming into my Golang app. It basically forwards these to a data store (InfluxDB). There are some integer values in the JSON, and also some float values. It is essential that these get forwarded to the data store with the original data type. If they don't, there will be type conflicts and the write operation will fail.
The Ruby JSON parser has no trouble doing this:
require 'json'
obj = { "a" => 123, "b" => 12.3 }
parsed = JSON.parse(obj.to_json)
print parsed["a"].class # => Integer
print parsed["b"].class # => Float
The encoding/json package in Go, however, does have trouble: when decoding into an interface{}, all numbers are parsed as float64:
package main
import "encoding/json"
import "fmt"
func main () {
str := "{\"a\":123,\"b\":12.3}"
var parsed map[string]interface{}
json.Unmarshal([]byte(str), &parsed)
for key, val := range parsed {
switch val.(type) {
case int:
fmt.Println("int type: ", key)
case float64:
fmt.Println("float type: ", key)
default:
fmt.Println("unknown type: ", key)
}
}
}
Which prints:
float type: a
float type: b
I need a way to parse ints as ints, and floats as floats, in the way the Ruby JSON parser does.
It is not feasible in this case to parse everything as strings and check whether or not there is a decimal. Certain values come in as strings such as "123" and I need to push those as strings.
I do not have structs for the parsed objects, nor is that an option. The golang app doesn't actually care about the schema and just forwards the input as it receives it.
I tried the approach outlined here: Other ways of verifying reflect.Type for int and float64 (using reflect) but it did not accurately identify the int:
t := reflect.TypeOf(parsed["a"])
k := t.Kind()
k == reflect.Int // false
k == reflect.Float64 // true

Reflection reports Float64 above because, by default, encoding/json stores every JSON number as a float64 when decoding into an interface{}. For example, to get Ruby-like JSON number types using the general Go mechanism for custom JSON values (json.RawMessage):
package main
import (
"encoding/json"
"fmt"
"strconv"
)
func main() {
str := `{"a":123,"b":12.3,"c":"123","d":"12.3","e":true}`
var raw map[string]json.RawMessage
err := json.Unmarshal([]byte(str), &raw)
if err != nil {
panic(err)
}
parsed := make(map[string]interface{}, len(raw))
for key, val := range raw {
s := string(val)
i, err := strconv.ParseInt(s, 10, 64)
if err == nil {
parsed[key] = i
continue
}
f, err := strconv.ParseFloat(s, 64)
if err == nil {
parsed[key] = f
continue
}
var v interface{}
err = json.Unmarshal(val, &v)
if err == nil {
parsed[key] = v
continue
}
parsed[key] = val
}
for key, val := range parsed {
fmt.Printf("%T: %v %v\n", val, key, val)
}
}
Playground: https://play.golang.org/p/VmG8IZV4CG_Y
Output:
int64: a 123
float64: b 12.3
string: c 123
string: d 12.3
bool: e true
Another example: Ruby-like JSON number types using the Go json.Number type (via Decoder.UseNumber):
package main
import (
"encoding/json"
"fmt"
"strings"
)
func main() {
str := `{"a":123,"b":12.3,"c":"123","d":"12.3","e":true}`
var parsed map[string]interface{}
d := json.NewDecoder(strings.NewReader(str))
d.UseNumber()
err := d.Decode(&parsed)
if err != nil {
panic(err)
}
for key, val := range parsed {
n, ok := val.(json.Number)
if !ok {
continue
}
if i, err := n.Int64(); err == nil {
parsed[key] = i
continue
}
if f, err := n.Float64(); err == nil {
parsed[key] = f
continue
}
}
for key, val := range parsed {
fmt.Printf("%T: %v %v\n", val, key, val)
}
}
Playground: https://play.golang.org/p/Hk_Wb0EM-aY
Output:
int64: a 123
float64: b 12.3
string: c 123
string: d 12.3
bool: e true
A working version of @ShudiptaSharma's suggestion.

There is a json.Number type in the encoding/json package with three methods: String() string, Float64() (float64, error), and Int64() (int64, error). We can use them to distinguish integer and float values.
So, I handle json integer parsing in this way:
package main
import "encoding/json"
import "fmt"
import "strings"
import (
"reflect"
)
func main () {
str := "{\"a\":123,\"b\":12.3}"
var parsed map[string]interface{}
d := json.NewDecoder(strings.NewReader(str))
d.UseNumber()
fmt.Println(d.Decode(&parsed))
for key, val := range parsed {
fmt.Println(reflect.TypeOf(val))
fmt.Printf("decoded to %#v\n", val)
switch val.(type) {
case json.Number:
if n, err := val.(json.Number).Int64(); err == nil {
fmt.Println("int64 type: ", key, n)
} else if f, err := val.(json.Number).Float64(); err == nil {
fmt.Println("float64 type: ", key, f)
} else {
fmt.Println("string type: ", key, val)
}
default:
fmt.Println("unknown type: ", key, val)
}
fmt.Println("===============")
}
}
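Since the OP's goal is to forward these values to InfluxDB with their original types, the distinction matters again when the values are serialized: InfluxDB's line protocol marks integer field values with a trailing i, floats are written bare, and strings are double-quoted. A rough, hypothetical sketch of rendering a value decoded with either approach above (the helper name is made up and escaping of special characters is omitted):
package main

import (
	"fmt"
	"strconv"
)

// formatField renders a decoded JSON value as an InfluxDB line-protocol
// field value: int64 gets the "i" suffix, float64 is written bare,
// strings are quoted. Escaping of special characters is omitted.
func formatField(v interface{}) string {
	switch x := v.(type) {
	case int64:
		return strconv.FormatInt(x, 10) + "i"
	case float64:
		return strconv.FormatFloat(x, 'f', -1, 64)
	case string:
		return strconv.Quote(x)
	case bool:
		return strconv.FormatBool(x)
	default:
		return fmt.Sprint(x)
	}
}

func main() {
	fmt.Println(formatField(int64(123))) // 123i
	fmt.Println(formatField(12.3))       // 12.3
	fmt.Println(formatField("123"))      // "123"
}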

Related

Unmarshaling from JSON key containing a single quote

I feel quite puzzled by this.
I need to load some data (coming from a French database) that is serialized in JSON and in which some keys have a single quote.
Here is a simplified version:
package main
import (
"encoding/json"
"fmt"
)
type Product struct {
Name string `json:"nom"`
Cost int64 `json:"prix d'achat"`
}
func main() {
var p Product
err := json.Unmarshal([]byte(`{"nom":"savon", "prix d'achat": 170}`), &p)
fmt.Printf("product cost: %d\nerror: %s\n", p.Cost, err)
}
// product cost: 0
// error: %!s(<nil>)
Unmarshaling produces no error; however, the "prix d'achat" value (p.Cost) is not parsed.
When I unmarshal into a map[string]any, the "prix d'achat" key is parsed as I would expect:
package main
import (
"encoding/json"
"fmt"
)
func main() {
blob := map[string]any{}
err := json.Unmarshal([]byte(`{"nom":"savon", "prix d'achat": 170}`), &blob)
fmt.Printf("blob: %f\nerror: %s\n", blob["prix d'achat"], err)
}
// blob: 170.000000
// error: %!s(<nil>)
I checked the json.Marshal documentation on struct tags and I cannot find any issue with the data I'm trying to process.
Am I missing something obvious here?
How can I parse a JSON key containing a single quote using struct tags?
Thanks a lot for any insight!
I didn't find anything in the documentation, but the JSON encoder considers the single quote a reserved character in tag names:
func isValidTag(s string) bool {
if s == "" {
return false
}
for _, c := range s {
switch {
case strings.ContainsRune("!#$%&()*+-./:;<=>?@[]^_{|}~ ", c):
// Backslash and quote chars are reserved, but
// otherwise any punctuation chars are allowed
// in a tag name.
case !unicode.IsLetter(c) && !unicode.IsDigit(c):
return false
}
}
return true
}
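In effect the invalid tag is silently dropped and the field behaves as if it had no json tag at all, so it only matches its own Go name (case-insensitively). A quick sketch to confirm that behavior (not a fix):
package main

import (
	"encoding/json"
	"fmt"
)

type Product struct {
	Name string `json:"nom"`
	Cost int64  `json:"prix d'achat"` // invalid tag: ignored, field matches "Cost" instead
}

func main() {
	var p Product
	// The key "Cost" (matched case-insensitively against the field name) is
	// picked up, while "prix d'achat" is not.
	err := json.Unmarshal([]byte(`{"nom":"savon", "Cost": 170}`), &p)
	fmt.Println(p.Cost, err) // 170 <nil>
}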
I think opening an issue is justified here. In the meantime, you're going to have to implement json.Unmarshaler and/or json.Marshaler. Here is a start:
func (p *Product) UnmarshalJSON(b []byte) error {
type product Product // prevent recursion
var _p product
if err := json.Unmarshal(b, &_p); err != nil {
return err
}
*p = Product(_p)
return unmarshalFieldsWithSingleQuotes(p, b)
}
func unmarshalFieldsWithSingleQuotes(dest interface{}, b []byte) error {
// Look through the JSON tags. If there is one containing single quotes,
// unmarshal b again, into a map this time. Then unmarshal the value
// at the map key corresponding to the tag, if any.
var m map[string]json.RawMessage
t := reflect.TypeOf(dest).Elem()
v := reflect.ValueOf(dest).Elem()
for i := 0; i < t.NumField(); i++ {
tag := t.Field(i).Tag.Get("json")
if !strings.Contains(tag, "'") {
continue
}
if m == nil {
if err := json.Unmarshal(b, &m); err != nil {
return err
}
}
if j, ok := m[tag]; ok {
if err := json.Unmarshal(j, v.Field(i).Addr().Interface()); err != nil {
return err
}
}
}
return nil
}
Try it on the playground: https://go.dev/play/p/aupACXorjOO
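A minimal usage sketch, assuming the Product type and the two functions above live in the same package (with encoding/json, fmt, reflect and strings imported):
func main() {
	var p Product
	err := json.Unmarshal([]byte(`{"nom":"savon", "prix d'achat": 170}`), &p)
	fmt.Printf("product cost: %d\nerror: %v\n", p.Cost, err)
	// product cost: 170
	// error: <nil>
}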

Golang Interface to struct conversion giving error

I have a JSON string of the following format:
{"add": [{"var": ["100"]}, "200"]}
Here the key 'add' is not constant across all the JSON documents; in some cases it can be 'minus', 'multiply', etc.
The value of that key is an array, in this case [{"var": ["100"]}, "200"]. This means that the 100 should be added to the existing value 200.
I am trying to parse this expression. Since the main key (in this case 'add') is not constant, I cannot unmarshal into a struct, so I decoded it into a map in the following way:
type mathExpVar struct {
valueVar []string `json:"var"`
}
var mathExpJson map[string][]interface{}
var input = "{\"add\": [{\"var\": [\"100\"]}, \"200\"]}"
err := json.Unmarshal([]byte(input), &mathExpJson)
for operator, values := range mathExpJson{
vals, ok := values[0].(mathExpVar) // here values[0] will be {"var": ["100"]}
if !ok{
return nil
}
}
Here 'ok' is always returning false. I am not sure why. There is no additional error message for me to check why this is failing. Could someone help me in resolving this?
Link to go playground for the same: https://go.dev/play/p/POfQmEoPbjD
A working example: https://go.dev/play/p/02YzI5cv8vV
The reason the original response from Burak Serdar was, in my opinion, not valid is that it does not take into account that you need to handle the rest of the parameters as well. If you look closely, the expression is not an array of strings; it is of varying type. This implementation handles the custom unmarshalling and stores all the extra parameters in the Extra field.
Also code:
package main
import (
"encoding/json"
"log"
"reflect"
)
const jsonPayload = "{\"add\": [{\"var\": [\"100\"]}, \"200\"]}"
func main() {
data := MathExpressions{}
err := json.Unmarshal([]byte(jsonPayload), &data)
if err != nil {
log.Println("Failed to unmarshal json, error:", err)
return
}
log.Println(data)
for operation, expression := range data {
log.Print("Op:", operation, "Exp:", expression)
}
log.Println("Finished..")
}
/**
The sub definition for a specific expression in the object
*/
type ExpressionDefinition struct {
Vars []string `json:"var"`
Extra []string
}
func (e *ExpressionDefinition) UnmarshalJSON(data []byte) error {
tokens := make([]interface{}, 0)
err := json.Unmarshal(data, &tokens)
if err != nil {
return err
}
for _, token := range tokens {
log.Println("Processing token:", token, "type:", reflect.TypeOf(token))
switch token.(type) {
case map[string]interface{}:
for _, v := range token.(map[string]interface{})["var"].([]interface{}) {
e.Vars = append(e.Vars, v.(string))
}
case string:
e.Extra = append(e.Extra, token.(string))
}
}
log.Println(tokens)
return nil
}
/**
The main expressions object which contains all the sub-expressions.
*/
type MathExpressions map[string]ExpressionDefinition
Here the entire structure of the parsed JSON value is stored in nested map[string]interface{} (JSON object) or []interface{} (JSON array) types.
In the line:
vals, ok := values[0].(mathExpVar)
values[0] is of type map[string]interface{}, which cannot be type-asserted to mathExpVar, a struct and therefore an entirely different type.
You need to type assert to map[string]interface{} first, then do this in each nested level as you go forward:
package main
import (
"encoding/json"
"fmt"
)
func main() {
type mathExpVar struct {
valueVar []string `json:"var"`
}
var mathExpJson map[string][]interface{}
var input = "{\"add\": [{\"var\": [\"100\"]}, \"200\"]}"
err := json.Unmarshal([]byte(input), &mathExpJson)
if err != nil {
fmt.Println("Error in unmarshalling")
}
for _, values := range mathExpJson {
var vals mathExpVar
valMap, ok := values[0].(map[string]interface{})
if ok {
varSlice, ok := valMap["var"].([]interface{})
if ok {
for _, v := range varSlice {
nv, ok := v.(string)
if ok {
vals.valueVar = append(vals.valueVar, nv)
} else {
fmt.Printf("%T\n", v)
}
}
} else {
fmt.Printf("%T\n", valMap["var"])
}
} else {
fmt.Printf("%T\n", values[0])
}
fmt.Printf("%+v\n", vals)
}
}
See: https://go.dev/play/p/Ot_9IZr4pwM
For more on interfaces and go reflection, check out: https://go.dev/blog/laws-of-reflection

Golang Converting JSON

map[key:2073933158088]
I need to grab the key out of this data structure as a string, but I can't seem to figure out how!
Help with this overly simple question very much appreciated.
The value above is encapsulated in the variable named data.
I have tried: data.key, data[key], data["key"], data[0] and none of these seem to be appropriate calls.
To define data I sent up a JSON packet to a queue on IronMQ. I then pulled the message from the queue and manipulated it like this:
payloadIndex := 0
for index, arg := range(os.Args) {
if arg == "-payload" {
payloadIndex = index + 1
}
}
if payloadIndex >= len(os.Args) {
panic("No payload value.")
}
payload := os.Args[payloadIndex]
var data interface{}
raw, err := ioutil.ReadFile(payload)
if err != nil {
panic(err.Error())
}
err = json.Unmarshal(raw, &data)
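For reference: when you unmarshal into a bare interface{}, the object arrives as a map[string]interface{} and the unquoted number as a float64, so direct access takes two type assertions plus a float-to-string conversion. A minimal, self-contained sketch (not from the original answers):
package main

import (
	"encoding/json"
	"fmt"
	"strconv"
)

func main() {
	raw := []byte(`{"key":2073933158088}`)

	var data interface{}
	if err := json.Unmarshal(raw, &data); err != nil {
		panic(err)
	}

	// data holds a map[string]interface{}; the unquoted number is a float64.
	m, ok := data.(map[string]interface{})
	if !ok {
		panic("not a JSON object")
	}
	f, ok := m["key"].(float64)
	if !ok {
		panic("key is not a number")
	}
	key := strconv.FormatFloat(f, 'f', -1, 64)
	fmt.Println(key) // 2073933158088
}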
Design your data type to match the JSON structure. This is how you can achieve it:
package main
import (
"fmt"
"encoding/json"
)
type Data struct {
Key string `json:"key"`
}
func main() {
data := new(Data)
text := `{ "key": "2073933158088" }`
raw := []byte(text)
err := json.Unmarshal(raw, data)
if err != nil {
panic(err.Error())
}
fmt.Println(data.Key)
}
Since the number in the JSON is unquoted, it is not a string, and unmarshaling it into a string field returns an error (which the previous example turns into a panic; playground: http://play.golang.org/p/i-NUwchJc1).
Here's a working alternative:
package main
import (
"fmt"
"encoding/json"
"strconv"
)
type Data struct {
Key string `json:"key"`
}
func (d *Data) UnmarshalJSON(content []byte) error {
var m map[string]interface{}
err := json.Unmarshal(content, &m)
if err != nil {
return err
}
d.Key = strconv.FormatFloat(m["key"].(float64), 'f', -1, 64)
return nil
}
func main() {
data := new(Data)
text := `{"key":2073933158088}`
raw := []byte(text)
err := json.Unmarshal(raw, data)
if err != nil {
panic(err.Error())
}
fmt.Println(data.Key)
}
You can see the result in the playground: http://play.golang.org/p/5hU3hdV3kK
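An alternative sketch, not from the original answers, that avoids the float64 round trip entirely (and ties back to the first question) is to declare the field as json.Number:
package main

import (
	"encoding/json"
	"fmt"
)

type Data struct {
	// json.Number keeps the digits exactly as they appear in the JSON.
	Key json.Number `json:"key"`
}

func main() {
	var d Data
	if err := json.Unmarshal([]byte(`{"key":2073933158088}`), &d); err != nil {
		panic(err)
	}
	fmt.Println(d.Key.String()) // 2073933158088
}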

Unmarshal CSV record into struct in Go

The problem is how to automatically deserialize/unmarshal a record from a CSV file into a Go struct.
For example, I have
type Test struct {
Name string
Surname string
Age int
}
And the CSV file contains the records:
John;Smith;42
Piter;Abel;50
Is there an easy way to unmarshal those records into a struct, other than using the "encoding/csv" package to read a record and then doing something like:
record, _ := reader.Read()
test := Test{record[0],record[1],atoi(record[2])}
There is gocarina/gocsv, which handles custom structs in the same way encoding/json does.
You can also write a custom marshaller and unmarshaller for specific types (a sketch follows the example below).
Example:
type Client struct {
Id string `csv:"client_id"` // .csv column headers
Name string `csv:"client_name"`
Age string `csv:"client_age"`
}
func main() {
in, err := os.Open("clients.csv")
if err != nil {
panic(err)
}
defer in.Close()
clients := []*Client{}
if err := gocsv.UnmarshalFile(in, &clients); err != nil {
panic(err)
}
for _, client := range clients {
fmt.Println("Hello, ", client.Name)
}
}
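A sketch of such a custom field type, based on the MarshalCSV/UnmarshalCSV method shapes that gocsv documents; the DateTime type and date format here are made up, so verify the interface against the gocsv version you use:
package main

import (
	"fmt"
	"time"
)

// DateTime wraps time.Time so gocsv can (un)marshal it with a custom format.
// gocsv looks for MarshalCSV/UnmarshalCSV methods on field types.
type DateTime struct {
	time.Time
}

// MarshalCSV renders the date as it should appear in the CSV column.
func (d *DateTime) MarshalCSV() (string, error) {
	return d.Time.Format("2006-01-02"), nil
}

// UnmarshalCSV parses the CSV column back into the wrapped time.Time.
func (d *DateTime) UnmarshalCSV(csv string) (err error) {
	d.Time, err = time.Parse("2006-01-02", csv)
	return err
}

func main() {
	var d DateTime
	if err := d.UnmarshalCSV("2024-05-01"); err != nil {
		panic(err)
	}
	s, _ := d.MarshalCSV()
	fmt.Println(s) // 2024-05-01
}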
It seems I'm done with automatic unmarshaling of CSV records into structs (limited to string and int fields). Hope this is useful.
Here is a link to playground: http://play.golang.org/p/kwc32A5mJf
func Unmarshal(reader *csv.Reader, v interface{}) error {
record, err := reader.Read()
if err != nil {
return err
}
s := reflect.ValueOf(v).Elem()
if s.NumField() != len(record) {
return &FieldMismatch{s.NumField(), len(record)}
}
for i := 0; i < s.NumField(); i++ {
f := s.Field(i)
switch f.Type().String() {
case "string":
f.SetString(record[i])
case "int":
ival, err := strconv.ParseInt(record[i], 10, 0)
if err != nil {
return err
}
f.SetInt(ival)
default:
return &UnsupportedType{f.Type().String()}
}
}
return nil
}
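The function above returns two custom error types that are only defined in the linked playground; definitions along these lines (a reconstruction, plus a strconv import) make the snippet compile:
type FieldMismatch struct {
	expected, found int
}

func (e *FieldMismatch) Error() string {
	return "CSV line fields mismatch. Expected " + strconv.Itoa(e.expected) +
		" found " + strconv.Itoa(e.found)
}

type UnsupportedType struct {
	Type string
}

func (e *UnsupportedType) Error() string {
	return "Unsupported type: " + e.Type
}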
I'll try to create a GitHub package if someone needs this implementation.
You could bake your own. Perhaps something like this:
package main
import (
"fmt"
"strconv"
"strings"
)
type Test struct {
Name string
Surname string
Age int
}
func (t Test) String() string {
return fmt.Sprintf("%s;%s;%d", t.Name, t.Surname, t.Age)
}
func (t *Test) Parse(in string) {
tmp := strings.Split(in, ";")
t.Name = tmp[0]
t.Surname = tmp[1]
t.Age, _ = strconv.Atoi(tmp[2])
}
func main() {
john := Test{"John", "Smith", 42}
fmt.Printf("john:%v\n", john)
johnString := john.String()
fmt.Printf("johnString:%s\n", johnString)
var rebornJohn Test
rebornJohn.Parse(johnString)
fmt.Printf("rebornJohn:%v\n", rebornJohn)
}
Using csvutil it is possible to give the column headers; see the example.
In your case, this could be:
package main
import (
"encoding/csv"
"fmt"
"io"
"os"
"github.com/jszwec/csvutil"
)
type Test struct {
Name string
Surname string
Age int
}
func main() {
csv_file, _ := os.Open("test.csv")
reader := csv.NewReader(csv_file)
reader.Comma = ';'
userHeader, _ := csvutil.Header(Test{}, "csv")
dec, _ := csvutil.NewDecoder(reader, userHeader...)
var users []Test
for {
var u Test
if err := dec.Decode(&u); err == io.EOF {
break
} else if err != nil {
panic(err) // don't silently append a zero value on a decode error
}
users = append(users, u)
}
fmt.Println(users)
}

Go- Copy all common fields between structs

I have a database that stores JSON, and a server that provides an external API whereby, through an HTTP POST, values in this database can be changed.
The keys the customer sees are different, but map 1:1 with the keys in the database (there are unexposed keys). For example:
This is in the database:
{ "bit_size": 8, "secret_key": false }
And this is presented to the client:
{ "num_bits": 8 }
The API can change with respect to field names, but the database always has consistent keys.
I have named the fields the same in both structs, with different tags for the JSON encoder:
type DB struct {
NumBits int `json:"bit_size"`
Secret bool `json:"secret_key"`
}
type User struct {
NumBits int `json:"num_bits"`
}
I'm using encoding/json to do the Marshal/Unmarshal.
Is reflect the right tool for this? Is there an easier way since all of the keys are the same? I was thinking some kind of memcpy (if I kept the user fields in the same order).
Couldn't struct embedding be useful here?
package main
import (
"fmt"
)
type DB struct {
User
Secret bool `json:"secret_key"`
}
type User struct {
NumBits int `json:"num_bits"`
}
func main() {
db := DB{User{10}, true}
fmt.Printf("Hello, DB: %+v\n", db)
fmt.Printf("Hello, DB.NumBits: %+v\n", db.NumBits)
fmt.Printf("Hello, User: %+v\n", db.User)
}
http://play.golang.org/p/9s4bii3tQ2
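With that layout, marshaling just the embedded part yields the client-facing JSON. A small sketch building on the structs above (the values are made up):
package main

import (
	"encoding/json"
	"fmt"
)

type User struct {
	NumBits int `json:"num_bits"`
}

type DB struct {
	User
	Secret bool `json:"secret_key"`
}

func main() {
	db := DB{User{8}, false}
	out, err := json.Marshal(db.User) // marshal only the client-facing fields
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out)) // {"num_bits":8}
}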
A code-only variant round-trips through encoding/gob (a fragment; DbVar stands for the source DB value):
buf := bytes.Buffer{}
err := gob.NewEncoder(&buf).Encode(&DbVar)
if err != nil {
return err
}
u := User{}
err = gob.NewDecoder(&buf).Decode(&u)
if err != nil {
return err
}
Here's a solution using reflection. You have to further develop it if you need more complex structures with embedded struct fields and such.
http://play.golang.org/p/iTaDgsdSaI
package main
import (
"encoding/json"
"fmt"
"reflect"
)
type M map[string]interface{} // just an alias
var Record = []byte(`{ "bit_size": 8, "secret_key": false }`)
type DB struct {
NumBits int `json:"bit_size"`
Secret bool `json:"secret_key"`
}
type User struct {
NumBits int `json:"num_bits"`
}
func main() {
d := new(DB)
e := json.Unmarshal(Record, d)
if e != nil {
panic(e)
}
m := mapFields(d)
fmt.Println("Mapped fields: ", m)
u := new(User)
o := applyMap(u, m)
fmt.Println("Applied map: ", o)
j, e := json.Marshal(o)
if e != nil {
panic(e)
}
fmt.Println("Output JSON: ", string(j))
}
func applyMap(u *User, m M) M {
t := reflect.TypeOf(u).Elem()
o := make(M)
for i := 0; i < t.NumField(); i++ {
f := t.FieldByIndex([]int{i})
// skip unexported fields
if f.PkgPath != "" {
continue
}
if x, ok := m[f.Name]; ok {
k := f.Tag.Get("json")
o[k] = x
}
}
return o
}
func mapFields(x *DB) M {
o := make(M)
v := reflect.ValueOf(x).Elem()
t := v.Type()
for i := 0; i < v.NumField(); i++ {
f := t.FieldByIndex([]int{i})
// skip unexported fields
if f.PkgPath != "" {
continue
}
o[f.Name] = v.FieldByIndex([]int{i}).Interface()
}
return o
}
Using struct tags, the following would sure be nice,
package main
import (
"fmt"
"log"
"hacked/json"
)
var dbj = `{ "bit_size": 8, "secret_key": false }`
type User struct {
NumBits int `json:"bit_size" api:"num_bits"`
}
func main() {
fmt.Println(dbj)
// unmarshal from full db record to User struct
var u User
if err := json.Unmarshal([]byte(dbj), &u); err != nil {
log.Fatal(err)
}
// remarshal User struct using api field names
api, err := json.MarshalTag(u, "api")
if err != nil {
log.Fatal(err)
}
fmt.Println(string(api))
}
Adding MarshalTag requires just a small patch to encode.go:
106c106,112
< e := &encodeState{}
---
> return MarshalTag(v, "json")
> }
>
> // MarshalTag is like Marshal but marshalls fields with
> // the specified tag key instead of the default "json".
> func MarshalTag(v interface{}, tag string) ([]byte, error) {
> e := &encodeState{tagKey: tag}
201a208
> tagKey string
328c335
< for _, ef := range encodeFields(v.Type()) {
---
> for _, ef := range encodeFields(v.Type(), e.tagKey) {
509c516
< func encodeFields(t reflect.Type) []encodeField {
---
> func encodeFields(t reflect.Type, tagKey string) []encodeField {
540c547
< tv := f.Tag.Get("json")
---
> tv := f.Tag.Get(tagKey)
The following function uses reflection to copy fields between two structs. A src field is copied to a dest field if they have the same field name.
// CopyCommonFields copies src fields into dest fields. A src field is copied
// to a dest field if they have the same field name.
// Dest and src must be pointers to structs.
func CopyCommonFields(dest, src interface{}) {
srcType := reflect.TypeOf(src).Elem()
destType := reflect.TypeOf(dest).Elem()
destFieldsMap := map[string]int{}
for i := 0; i < destType.NumField(); i++ {
destFieldsMap[destType.Field(i).Name] = i
}
for i := 0; i < srcType.NumField(); i++ {
if j, ok := destFieldsMap[srcType.Field(i).Name]; ok {
reflect.ValueOf(dest).Elem().Field(j).Set(
reflect.ValueOf(src).Elem().Field(i),
)
}
}
}
Usage:
func main() {
type T struct {
A string
B int
}
type U struct {
A string
}
src := T{
A: "foo",
B: 5,
}
dest := U{}
CopyCommonFields(&dest, &src)
fmt.Printf("%+v\n", dest)
// output: {A:foo}
}
You can convert structs if they have the same field names and types, effectively reassigning field tags (since Go 1.8, struct conversion ignores differing field tags):
package main
import "encoding/json"
type DB struct {
dbNumBits
Secret bool `json:"secret_key"`
}
type dbNumBits struct {
NumBits int `json:"bit_size"`
}
type User struct {
NumBits int `json:"num_bits"`
}
var Record = []byte(`{ "bit_size": 8, "secret_key": false }`)
func main() {
d := new(DB)
e := json.Unmarshal(Record, d)
if e != nil {
panic(e)
}
var u User = User(d.dbNumBits)
println(u.NumBits)
}
https://play.golang.org/p/uX-IIgL-rjc
Here's a solution without reflection, unsafe, or a function per struct. The example is a little convoluted, and maybe you wouldn't need to do it just like this, but the key is using a map[string]interface{} to get away from a struct with field tags. You might be able to use the idea in a similar solution.
package main
import (
"encoding/json"
"fmt"
"log"
)
// example full database record
var dbj = `{ "bit_size": 8, "secret_key": false }`
// User type has only the fields going to the API
type User struct {
// tag still specifies internal name, not API name
NumBits int `json:"bit_size"`
}
// mapping from internal field names to API field names.
// (you could have more than one mapping, or even construct this
// at run time)
var ApiField = map[string]string{
// internal: API
"bit_size": "num_bits",
// ...
}
func main() {
fmt.Println(dbj)
// select user fields from full db record by unmarshalling
var u User
if err := json.Unmarshal([]byte(dbj), &u); err != nil {
log.Fatal(err)
}
// remarshal from User struct back to json
exportable, err := json.Marshal(u)
if err != nil {
log.Fatal(err)
}
// unmarshal into a map this time, to shrug field tags.
type jmap map[string]interface{}
mInternal := jmap{}
if err := json.Unmarshal(exportable, &mInternal); err != nil {
log.Fatal(err)
}
// translate field names
mExportable := jmap{}
for internalField, v := range mInternal {
mExportable[ApiField[internalField]] = v
}
// marshal final result with API field names
if exportable, err = json.Marshal(mExportable); err != nil {
log.Fatal(err)
}
fmt.Println(string(exportable))
}
Output:
{ "bit_size": 8, "secret_key": false }
{"num_bits":8}
Edit: More explanation. As Tom notes in a comment, there's reflection going on behind the code. The goal here is to keep the code simple by using the available capabilities of the library. Package json currently offers two ways to work with data, struct tags and maps of [string]interface{}. The struct tags let you select fields, but force you to statically pick a single json field name. The maps let you pick field names at run time, but not which fields to Marshal. It would be nice if the json package let you do both at once, but it doesn't.
The answer here just shows the two techniques and how they can be composed in a solution to the example problem in the OP.
"Is reflect the right tool for this?" A better question might be, "Are struct tags the right tool for this?" and the answer might be no.
package main
import (
"encoding/json"
"fmt"
"log"
)
var dbj = `{ "bit_size": 8, "secret_key": false }`
// translation from internal field name to api field name
type apiTrans struct {
db, api string
}
var User = []apiTrans{
{db: "bit_size", api: "num_bits"},
}
func main() {
fmt.Println(dbj)
type jmap map[string]interface{}
// unmarshal full db record
mdb := jmap{}
if err := json.Unmarshal([]byte(dbj), &mdb); err != nil {
log.Fatal(err)
}
// build result
mres := jmap{}
for _, t := range User {
if v, ok := mdb[t.db]; ok {
mres[t.api] = v
}
}
// marshal result
exportable, err := json.Marshal(mres)
if err != nil {
log.Fatal(err)
}
fmt.Println(string(exportable))
}
An efficient way to achieve your goal is to use the gob package.
Here an example with the playground:
package main
import (
"bytes"
"encoding/gob"
"fmt"
)
type DB struct {
NumBits int
Secret bool
}
type User struct {
NumBits int
}
func main() {
db := DB{10, true}
user := User{}
buf := bytes.Buffer{}
err := gob.NewEncoder(&buf).Encode(&db)
if err != nil {
panic(err)
}
err = gob.NewDecoder(&buf).Decode(&user)
if err != nil {
panic(err)
}
fmt.Println(user)
}
This works because gob encodes exported fields by name and the decoder silently ignores fields that the destination struct does not have. Here is the official blog post: https://blog.golang.org/gob