I feel quite puzzled by this.
I need to load some data (coming from a French database) that is serialized in JSON and in which some keys have a single quote.
Here is a simplified version:
package main

import (
    "encoding/json"
    "fmt"
)

type Product struct {
    Name string `json:"nom"`
    Cost int64  `json:"prix d'achat"`
}

func main() {
    var p Product
    err := json.Unmarshal([]byte(`{"nom":"savon", "prix d'achat": 170}`), &p)
    fmt.Printf("product cost: %d\nerror: %s\n", p.Cost, err)
}
// product cost: 0
// error: %!s(<nil>)
Unmarshaling produces no error, yet "prix d'achat" (p.Cost) is not parsed correctly.
When I unmarshal into a map[string]any, the "prix d'achat" key is parsed as I would expect:
package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    blob := map[string]any{}
    err := json.Unmarshal([]byte(`{"nom":"savon", "prix d'achat": 170}`), &blob)
    fmt.Printf("blob: %f\nerror: %s\n", blob["prix d'achat"], err)
}
// blob: 170.000000
// error: %!s(<nil>)
I checked the json.Marshal documentation on struct tags and I cannot find any issue with the data I'm trying to process.
Am I missing something obvious here?
How can I parse a JSON key containing a single quote using struct tags?
Thanks a lot for any insight!
I didn't find anything in the documentation, but the JSON encoder treats the single quote as a reserved character in tag names.
func isValidTag(s string) bool {
    if s == "" {
        return false
    }
    for _, c := range s {
        switch {
        case strings.ContainsRune("!#$%&()*+-./:;<=>?@[]^_{|}~ ", c):
            // Backslash and quote chars are reserved, but
            // otherwise any punctuation chars are allowed
            // in a tag name.
        case !unicode.IsLetter(c) && !unicode.IsDigit(c):
            return false
        }
    }
    return true
}
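In practice, when isValidTag rejects a name, encoding/json silently drops the tag and falls back to matching on the Go field name (case-insensitively). A minimal self-contained sketch of that effect (my addition, not part of the original answer):

package main

import (
    "encoding/json"
    "fmt"
)

type Product struct {
    Name string `json:"nom"`
    Cost int64  `json:"prix d'achat"` // invalid tag name, silently ignored
}

func main() {
    var p Product
    // The quoted key is never matched, so Cost stays 0...
    fmt.Println(json.Unmarshal([]byte(`{"prix d'achat": 170}`), &p), p.Cost)
    // ...but the Go field name "Cost" is matched, case-insensitively.
    fmt.Println(json.Unmarshal([]byte(`{"cost": 170}`), &p), p.Cost)
}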
I think opening an issue is justified here. In the meantime, you're going to have to implement json.Unmarshaler and/or json.Marshaler. Here is a start:
func (p *Product) UnmarshalJSON(b []byte) error {
    type product Product // prevent recursion
    var _p product
    if err := json.Unmarshal(b, &_p); err != nil {
        return err
    }
    *p = Product(_p)
    return unmarshalFieldsWithSingleQuotes(p, b)
}

func unmarshalFieldsWithSingleQuotes(dest interface{}, b []byte) error {
    // Look through the JSON tags. If there is one containing single quotes,
    // unmarshal b again, into a map this time. Then unmarshal the value
    // at the map key corresponding to the tag, if any.
    var m map[string]json.RawMessage
    t := reflect.TypeOf(dest).Elem()
    v := reflect.ValueOf(dest).Elem()
    for i := 0; i < t.NumField(); i++ {
        tag := t.Field(i).Tag.Get("json")
        if !strings.Contains(tag, "'") {
            continue
        }
        if m == nil {
            if err := json.Unmarshal(b, &m); err != nil {
                return err
            }
        }
        if j, ok := m[tag]; ok {
            if err := json.Unmarshal(j, v.Field(i).Addr().Interface()); err != nil {
                return err
            }
        }
    }
    return nil
}
Try it on the playground: https://go.dev/play/p/aupACXorjOO
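The answer also mentions json.Marshaler; since Product only has two fields, a minimal hand-rolled sketch for the marshalling direction (my addition, not part of the original answer) could simply build the map by hand:

// MarshalJSON writes the quoted key explicitly, bypassing the ignored struct tag.
func (p Product) MarshalJSON() ([]byte, error) {
    return json.Marshal(map[string]interface{}{
        "nom":          p.Name,
        "prix d'achat": p.Cost,
    })
}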
Related
I have a JSON string in the following format:
{"add": [{"var": ["100"]}, "200"]}
Here the key 'add' is not constant across all the JSON documents; in some cases it can be 'minus', 'multiply', etc.
The value of that key is an array, in this case [{"var": ["100"]}, "200"], which means that 100 should be added to the existing value 200.
I am trying to parse this expression. Since the main key (in this case 'add') is not constant, I cannot unmarshal it into a struct directly, so I converted it to a JSON object in the following way:
type mathExpVar struct {
    valueVar []string `json:"var"`
}

var mathExpJson map[string][]interface{}
var input = "{\"add\": [{\"var\": [\"100\"]}, \"200\"]}"
err := json.Unmarshal([]byte(input), &mathExpJson)
for operator, values := range mathExpJson {
    vals, ok := values[0].(mathExpVar) // here values[0] will be {"var": ["100"]}
    if !ok {
        return nil
    }
}
Here 'ok' always returns false, and I am not sure why. There is no additional error message to tell me why this is failing. Could someone help me resolve this?
Link to go playground for the same: https://go.dev/play/p/POfQmEoPbjD
A Working example: https://go.dev/play/p/02YzI5cv8vV
In my opinion, the reason the original response from Burak Serdar was not valid is that it does not take into account that you need to handle the rest of the parameters as well. If you look closely, you can see that the expression is not an array of strings; it is of varying type. This implementation handles the custom unmarshalling and stores all the extra parameters in the Extra field.
The code:
package main

import (
    "encoding/json"
    "log"
    "reflect"
)

const jsonPayload = "{\"add\": [{\"var\": [\"100\"]}, \"200\"]}"

func main() {
    data := MathExpressions{}
    err := json.Unmarshal([]byte(jsonPayload), &data)
    if err != nil {
        log.Println("Failed to unmarshal json, error:", err)
        return
    }
    log.Println(data)
    for operation, expression := range data {
        log.Print("Op:", operation, "Exp:", expression)
    }
    log.Println("Finished..")
}

/**
The sub definition for a specific expression in the object
*/
type ExpressionDefinition struct {
    Vars  []string `json:"var"`
    Extra []string
}

func (e *ExpressionDefinition) UnmarshalJSON(data []byte) error {
    tokens := make([]interface{}, 0)
    err := json.Unmarshal(data, &tokens)
    if err != nil {
        return err
    }
    for _, token := range tokens {
        log.Println("Processing token:", token, "type:", reflect.TypeOf(token))
        switch token.(type) {
        case map[string]interface{}:
            for _, v := range token.(map[string]interface{})["var"].([]interface{}) {
                e.Vars = append(e.Vars, v.(string))
            }
        case string:
            e.Extra = append(e.Extra, token.(string))
        }
    }
    log.Println(tokens)
    return nil
}

/**
The main expressions object which contains all the sub-expressions.
*/
type MathExpressions map[string]ExpressionDefinition
Here the entire structure of the parsed JSON value is stored in nested map[string]interface{} (JSON object) or []interface{} (JSON array) types.
In the line:
vals, ok := values[0].(mathExpVar)
values[0] is of type map[string]interface{}, which cannot be asserted to mathExpVar, a struct and therefore an entirely different type.
You need to type assert to map[string]interface{} first, and then do the same at each nested level as you go:
package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    type mathExpVar struct {
        valueVar []string `json:"var"`
    }
    var mathExpJson map[string][]interface{}
    var input = "{\"add\": [{\"var\": [\"100\"]}, \"200\"]}"
    err := json.Unmarshal([]byte(input), &mathExpJson)
    if err != nil {
        fmt.Println("Error in unmarshalling")
    }
    for _, values := range mathExpJson {
        var vals mathExpVar
        valMap, ok := values[0].(map[string]interface{})
        if ok {
            varSlice, ok := valMap["var"].([]interface{})
            if ok {
                for _, v := range varSlice {
                    nv, ok := v.(string)
                    if ok {
                        vals.valueVar = append(vals.valueVar, nv)
                    } else {
                        fmt.Printf("%T\n", v)
                    }
                }
            } else {
                fmt.Printf("%T\n", valMap["var"])
            }
        } else {
            fmt.Printf("%T\n", values[0])
        }
        fmt.Printf("%+v\n", vals)
    }
}
See: https://go.dev/play/p/Ot_9IZr4pwM
For more on interfaces and go reflection, check out: https://go.dev/blog/laws-of-reflection
For the following JSON response {"table_contents":[{"id":100,"description":"text100"},{"id":101,"description":"text101"},{"id":1,"description":"text1"}]}
All you have to do is write the following code to parse it properly and read the fields from the struct:
package main

import (
    "encoding/json"
    "fmt"
)

type MyStruct1 struct {
    TableContents []struct {
        ID          int
        Description string
    } `json:"table_contents"`
}

func main() {
    result := []byte(`{"table_contents":[{"id":100,"description":"text100"},{"id":101,"description":"text101"},{"id":1,"description":"text1"}]}`)
    var container MyStruct1
    err := json.Unmarshal(result, &container)
    if err != nil {
        fmt.Println(" [0] Error message: " + err.Error())
        return
    }
    for i := range container.TableContents {
        fmt.Println(container.TableContents[i].Description)
    }
}
But how do you deal with the following JSON response? {"table_contents":[[{"id":100,"description":"text100"},{"id":101,"description":"text101"}],{"id":1,"description":"text1"}]} You can get either this response or the one above, so it is important to modify the struct to accept both.
I did something like this, with the help of the internet:
package main

import (
    "encoding/json"
    "fmt"
)

type MyStruct1 struct {
    TableContents []TableContentUnion `json:"table_contents"`
}

type TableContentClass struct {
    ID          int
    Description string
}

type TableContentUnion struct {
    TableContentClass      *TableContentClass
    TableContentClassArray []TableContentClass
}

func main() {
    result := []byte(`{"table_contents":[[{"id":100,"description":"text100"},{"id":101,"description":"text101"}],{"id":1,"description":"text1"}]}`)
    var container MyStruct1
    err := json.Unmarshal(result, &container)
    if err != nil {
        fmt.Println(" [0] Error message: " + err.Error())
        return
    }
    for i := range container.TableContents {
        fmt.Println(container.TableContents[i])
    }
}
but it does not go past the error message :(
[0] Error message: json: cannot unmarshal array into Go struct field MyStruct1.table_contents of type main.TableContentUnion
I have been struggling to come up with a solution for hours. If someone could help I would be happy. Thank you for reading, and let me know if you have questions.
Inside table_contents you have two type options (a JSON object or a list of JSON objects). What you can do is unmarshal into an interface and then run a type check on it when using it:
type MyStruct1 struct {
    TableContents []interface{} `json:"table_contents"`
}

...

for i := range container.TableContents {
    switch container.TableContents[i].(type) {
    case map[string]interface{}:
        fmt.Println("json object")
    case []interface{}:
        fmt.Println("list")
    }
}
From there you can use a library (e.g. https://github.com/mitchellh/mapstructure) to map the unmarshalled values onto your TableContentClass type. See the PoC playground here: https://play.golang.org/p/NhVUhQayeL_C
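A sketch of how that mapping could look (my addition, not taken from the linked playground; it relies on mapstructure's default behaviour of matching map keys to struct field names case-insensitively):

package main

import (
    "encoding/json"
    "fmt"

    "github.com/mitchellh/mapstructure"
)

type TableContentClass struct {
    ID          int
    Description string
}

func main() {
    result := []byte(`{"table_contents":[[{"id":100,"description":"text100"},{"id":101,"description":"text101"}],{"id":1,"description":"text1"}]}`)

    var container struct {
        TableContents []interface{} `json:"table_contents"`
    }
    if err := json.Unmarshal(result, &container); err != nil {
        fmt.Println(err)
        return
    }

    for _, content := range container.TableContents {
        switch t := content.(type) {
        case map[string]interface{}: // single json object
            var c TableContentClass
            if err := mapstructure.Decode(t, &c); err != nil {
                fmt.Println(err)
                continue
            }
            fmt.Println(c)
        case []interface{}: // list of json objects
            var cs []TableContentClass
            if err := mapstructure.Decode(t, &cs); err != nil {
                fmt.Println(err)
                continue
            }
            fmt.Println(cs)
        }
    }
}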
Custom UnmarshalJSON function
You can also create a custom UnmarshalJSON function on the object that has the two possibilities. In your case that would be TableContentUnion.
In the custom unmarshaller you can then decide how to unmarshal the content.
func (s *TableContentUnion) UnmarshalJSON(b []byte) error {
    // Note that we get `b` as bytes, so we can also manually check to see
    // if it is an array (starts with `[`) or an object (starts with `{`)
    var jsonObj interface{}
    if err := json.Unmarshal(b, &jsonObj); err != nil {
        return err
    }
    switch jsonObj.(type) {
    case map[string]interface{}:
        // Note: instead of using json.Unmarshal again, we could also cast the interface
        // and build the values as in the example above
        var tableContentClass TableContentClass
        if err := json.Unmarshal(b, &tableContentClass); err != nil {
            return err
        }
        s.TableContentClass = &tableContentClass
    case []interface{}:
        // Note: instead of using json.Unmarshal again, we could also cast the interface
        // and build the values as in the example above
        if err := json.Unmarshal(b, &s.TableContentClassArray); err != nil {
            return err
        }
    default:
        return errors.New("TableContentUnion.UnmarshalJSON: unknown content type")
    }
    return nil
}
The rest then works like your test code that was failing before. Here is the working Go Playground.
Unmarshal to map and manually build struct
You can always unmarshal JSON (with an object at the root) into a map[string]interface{}. Then you can iterate over it and further unmarshal each part after checking what type it is.
Working example:
func main() {
    result := []byte(`{"table_contents":[[{"id":100,"description":"text100"},{"id":101,"description":"text101"}],{"id":1,"description":"text1"}]}`)
    var jsonMap map[string]interface{}
    err := json.Unmarshal(result, &jsonMap)
    if err != nil {
        fmt.Println(" [0] Error message: " + err.Error())
        return
    }
    cts, ok := jsonMap["table_contents"].([]interface{})
    if !ok {
        // Note: nil or missing "table_contents" will also lead to this path.
        fmt.Println("table_contents is not a slice")
        return
    }
    var unions []TableContentUnion
    for _, content := range cts {
        var union TableContentUnion
        if contents, ok := content.([]interface{}); ok {
            for _, content := range contents {
                contCls := parseContentClass(content)
                if contCls == nil {
                    continue
                }
                union.TableContentClassArray = append(union.TableContentClassArray, *contCls)
            }
        } else {
            contCls := parseContentClass(content)
            union.TableContentClass = contCls
        }
        unions = append(unions, union)
    }
    container := MyStruct1{
        TableContents: unions,
    }
    for i := range container.TableContents {
        fmt.Println(container.TableContents[i])
    }
}

func parseContentClass(value interface{}) *TableContentClass {
    m, ok := value.(map[string]interface{})
    if !ok {
        return nil
    }
    return &TableContentClass{
        ID:          int(m["id"].(float64)),
        Description: m["description"].(string),
    }
}
This is most useful if the JSON has many variations. For cases like this it can also make sense to switch to a JSON package that works differently, such as https://github.com/tidwall/gjson, which fetches values based on their path.
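A small sketch of that path-based style (my addition, not part of the original answer):

package main

import (
    "fmt"

    "github.com/tidwall/gjson"
)

func main() {
    result := `{"table_contents":[[{"id":100,"description":"text100"},{"id":101,"description":"text101"}],{"id":1,"description":"text1"}]}`

    // Walk table_contents; each element is either an object or an array of objects.
    gjson.Get(result, "table_contents").ForEach(func(_, elem gjson.Result) bool {
        if elem.IsArray() {
            for _, obj := range elem.Array() {
                fmt.Println(obj.Get("id").Int(), obj.Get("description").String())
            }
        } else {
            fmt.Println(elem.Get("id").Int(), elem.Get("description").String())
        }
        return true // keep iterating
    })
}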
Use json.RawMessage to capture the varying parts of the JSON document. Unmarshal each raw message as appropriate.
func (ms *MyStruct1) UnmarshalJSON(data []byte) error {
    // Declare new type with same base type as MyStruct1.
    // This breaks recursion in call to json.Unmarshal below.
    type x MyStruct1
    v := struct {
        *x
        // Override TableContents field with raw message.
        TableContents []json.RawMessage `json:"table_contents"`
    }{
        // Unmarshal all but TableContents directly to the
        // receiver.
        x: (*x)(ms),
    }
    err := json.Unmarshal(data, &v)
    if err != nil {
        return err
    }
    // Unmarshal raw elements as appropriate.
    for _, tcData := range v.TableContents {
        if bytes.HasPrefix(tcData, []byte{'{'}) {
            var v TableContentClass
            if err := json.Unmarshal(tcData, &v); err != nil {
                return err
            }
            ms.TableContents = append(ms.TableContents, v)
        } else {
            var v []TableContentClass
            if err := json.Unmarshal(tcData, &v); err != nil {
                return err
            }
            ms.TableContents = append(ms.TableContents, v)
        }
    }
    return nil
}
Use it like this:
var container MyStruct1
err := json.Unmarshal(result, &container)
if err != nil {
    // handle error
}
Run it on the Go playground.
This approach does not add any outside dependencies. The function code does not need to be modified when fields are added to or removed from MyStruct1 or TableContentClass.
How to check whether a given string consists of multiple JSON values separated by spaces/newlines?
For example,
given: "test" 123 {"Name": "mike"} (3 JSON values concatenated with spaces)
return: true, since each item ("test", 123, and {"Name": "mike"}) is valid JSON.
In Go, I can write an O(N^2) function like:
// check given string is json or multiple json concatenated with space/newline
func validateJSON(str string) error {
    // only one json string
    if isJSON(str) {
        return nil
    }
    // multiple json string concatenate with spaces
    str = strings.TrimSpace(str)
    arr := []rune(str)
    start := 0
    end := 0
    for start < len(str) {
        for end < len(str) && !unicode.IsSpace(arr[end]) {
            end++
        }
        substr := str[start:end]
        if isJSON(substr) {
            for end < len(str) && unicode.IsSpace(arr[end]) {
                end++
            }
            start = end
        } else {
            if end == len(str) {
                return errors.New("error when parsing input: " + substr)
            }
            for end < len(str) && unicode.IsSpace(arr[end]) {
                end++
            }
        }
    }
    return nil
}

func isJSON(str string) bool {
    var js json.RawMessage
    return json.Unmarshal([]byte(str), &js) == nil
}
But this won't work for large input.
There are two options. The simplest, from a coding standpoint, is just to decode the JSON string normally. You can make this most efficient by decoding into an empty struct:
package main

import "encoding/json"

func main() {
    input := []byte(`{"a":"b", "c": 123}`)
    var x struct{}
    if err := json.Unmarshal(input, &x); err != nil {
        panic(err)
    }
    input = []byte(`{"a":"b", "c": 123}xxx`) // This one fails
    if err := json.Unmarshal(input, &x); err != nil {
        panic(err)
    }
}
(playground link)
This method has a few potential drawbacks:
It only works with a single JSON object. That is, a list of objects (as requested in the question) will fail without additional logic.
As pointed out by @icza in the comments, it only works with JSON objects, so bare arrays, numbers, or strings will fail. To accommodate those types, interface{} must be used (see the sketch after this list), which introduces the potential for some serious performance penalties.
The throw-away x value must still be allocated, and at least one reflection call is likely under the hood, which may introduce a noticeable performance penalty for some workloads.
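A sketch of the interface{} variant mentioned in the second drawback (my addition): it accepts any single JSON value, at the cost of allocating whatever that value decodes to.

package main

import "encoding/json"

func main() {
    // Bare strings, numbers, and arrays are accepted too, not just objects.
    for _, input := range []string{`"test"`, `123`, `{"Name": "mike"}`, `[1, 2, 3]`} {
        var v interface{}
        if err := json.Unmarshal([]byte(input), &v); err != nil {
            panic(err)
        }
    }
}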
Given these limitations, my recommendation is to use the second option: loop through the entire JSON input, ignoring the actual contents. This is made simple with the standard library json.Decoder:
package main

import (
    "bytes"
    "encoding/json"
    "io"
)

func main() {
    input := []byte(`{"a":"b", "c": 123}`)
    dec := json.NewDecoder(bytes.NewReader(input))
    for {
        _, err := dec.Token()
        if err == io.EOF {
            break // End of input, valid JSON
        }
        if err != nil {
            panic(err) // Invalid input
        }
    }
    input = []byte(`{"a":"b", "c": 123}xxx`) // This input fails
    dec = json.NewDecoder(bytes.NewReader(input))
    for {
        _, err := dec.Token()
        if err == io.EOF {
            break // End of input, valid JSON
        }
        if err != nil {
            panic(err) // Invalid input
        }
    }
}
(playground link)
As Volker mentioned in the comments, use a *json.Decoder to decode all JSON documents in your input successively:
package main

import (
    "encoding/json"
    "io"
    "log"
    "strings"
)

func main() {
    input := `"test" 123 {"Name": "mike"}`
    dec := json.NewDecoder(strings.NewReader(input))
    for {
        var x json.RawMessage
        switch err := dec.Decode(&x); err {
        case nil:
            // not done yet
        case io.EOF:
            return // success
        default:
            log.Fatal(err)
        }
    }
}
Try it on the playground: https://play.golang.org/p/1OKOii9mRHn
Try fastjson.Scanner:
// Requires github.com/valyala/fastjson
s := `"test" 123 {"Name": "mike"}`
var sc fastjson.Scanner
sc.Init(s)
// Iterate over a stream of json values
for sc.Next() {}
if sc.Error() != nil {
    fmt.Println("false")
} else {
    fmt.Println("ok")
}
Is it possible to generate an error if a field was not found while parsing a JSON input using Go?
I could not find it in documentation.
Is there any tag that specifies the field as required?
There is no tag in the encoding/json package that sets a field to "required". You will either have to write your own UnmarshalJSON() method, or do a post check for missing fields.
To check for missing fields, you will have to use pointers in order to distinguish between missing/null and zero values:
type JsonStruct struct {
    String *string
    Number *float64
}
Full working example:
package main

import (
    "encoding/json"
    "fmt"
)

type JsonStruct struct {
    String *string
    Number *float64
}

var rawJson = []byte(`{
    "string":"We do not provide a number"
}`)

func main() {
    var s *JsonStruct
    err := json.Unmarshal(rawJson, &s)
    if err != nil {
        panic(err)
    }
    if s.String == nil {
        panic("String is missing or null!")
    }
    if s.Number == nil {
        panic("Number is missing or null!")
    }
    fmt.Printf("String: %s Number: %f\n", *s.String, *s.Number)
}
Playground
You can also override the unmarshalling for a specific type (e.g. a required field buried a few JSON layers deep) without having to make the field a pointer. UnmarshalJSON is defined by the Unmarshaler interface.
type EnumItem struct {
    Named
    Value string
}

func (item *EnumItem) UnmarshalJSON(data []byte) (err error) {
    required := struct {
        Value *string `json:"value"`
    }{}
    all := struct {
        Named
        Value string `json:"value"`
    }{}
    err = json.Unmarshal(data, &required)
    if err != nil {
        return
    } else if required.Value == nil {
        err = fmt.Errorf("Required field for EnumItem missing")
    } else {
        err = json.Unmarshal(data, &all)
        item.Named = all.Named
        item.Value = all.Value
    }
    return
}
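A quick usage sketch (my addition; Named is not shown in the answer, so a minimal stand-in is assumed here):

// Hypothetical stand-in, since Named is not defined in the answer.
type Named struct {
    Name string `json:"name"`
}

func main() {
    var item EnumItem
    // "value" present: unmarshals normally.
    fmt.Println(json.Unmarshal([]byte(`{"name":"color","value":"red"}`), &item), item)
    // "value" missing: the custom unmarshaller reports the error.
    fmt.Println(json.Unmarshal([]byte(`{"name":"color"}`), &item))
}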
Here is another way: checking your own customized tag.
You can create a tag on your struct like:
type Profile struct {
    Name string `yourprojectname:"required"`
    Age  int
}
Use reflect to check whether the tag is assigned the required value:
func (p *Profile) Unmarshal(data []byte) error {
    err := json.Unmarshal(data, p)
    if err != nil {
        return err
    }
    fields := reflect.ValueOf(p).Elem()
    for i := 0; i < fields.NumField(); i++ {
        yourprojectTags := fields.Type().Field(i).Tag.Get("yourprojectname")
        if strings.Contains(yourprojectTags, "required") && fields.Field(i).IsZero() {
            return errors.New("required field is missing")
        }
    }
    return nil
}
And the test cases look like this:
func main() {
    profile1 := `{"Name":"foo", "Age":20}`
    profile2 := `{"Name":"", "Age":21}`
    var profile Profile
    err := profile.Unmarshal([]byte(profile1))
    if err != nil {
        log.Printf("profile1 unmarshal error: %s\n", err.Error())
        return
    }
    fmt.Printf("profile1 unmarshal: %v\n", profile)
    err = profile.Unmarshal([]byte(profile2))
    if err != nil {
        log.Printf("profile2 unmarshal error: %s\n", err.Error())
        return
    }
    fmt.Printf("profile2 unmarshal: %v\n", profile)
}
Result:
profile1 unmarshal: {foo 20}
2009/11/10 23:00:00 profile2 unmarshal error: required field is missing
You can go to Playground to have a look at the completed code
You can just implement the Unmarshaler interface to customize how your JSON gets unmarshalled.
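A minimal self-contained sketch of that (my addition; it redefines a plain Profile for illustration and treats a missing or null Name as an error):

package main

import (
    "encoding/json"
    "errors"
    "fmt"
)

type Profile struct {
    Name string
    Age  int
}

func (p *Profile) UnmarshalJSON(data []byte) error {
    // Decode into an auxiliary struct whose Name is a pointer, so a missing
    // or null "Name" can be distinguished from an empty string.
    var aux struct {
        Name *string
        Age  int
    }
    if err := json.Unmarshal(data, &aux); err != nil {
        return err
    }
    if aux.Name == nil {
        return errors.New("required field Name is missing")
    }
    p.Name, p.Age = *aux.Name, aux.Age
    return nil
}

func main() {
    var p Profile
    fmt.Println(json.Unmarshal([]byte(`{"Age":20}`), &p)) // reports the missing Name
}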
You can also make use of JSON schema validation:
package main

import (
    "encoding/json"
    "fmt"

    "github.com/alecthomas/jsonschema"
    "github.com/xeipuuv/gojsonschema"
)

type Bird struct {
    Species     string `json:"birdType"`
    Description string `json:"what it does" jsonschema:"required"`
}

func main() {
    var bird Bird
    sc := jsonschema.Reflect(&bird)
    b, _ := json.Marshal(sc)
    fmt.Println(string(b))
    loader := gojsonschema.NewStringLoader(string(b))
    documentLoader := gojsonschema.NewStringLoader(`{"birdType": "pigeon"}`)
    schema, err := gojsonschema.NewSchema(loader)
    if err != nil {
        panic("nop")
    }
    result, err := schema.Validate(documentLoader)
    if err != nil {
        panic("nop")
    }
    if result.Valid() {
        fmt.Printf("The document is valid\n")
    } else {
        fmt.Printf("The document is not valid. see errors :\n")
        for _, err := range result.Errors() {
            // Err implements the ResultError interface
            fmt.Printf("- %s\n", err)
        }
    }
}
Outputs
{"$schema":"http://json-schema.org/draft-04/schema#","$ref":"#/definitions/Bird","definitions":{"Bird":{"required":["birdType","what it does"],"properties":{"birdType":{"type":"string"},"what it does":{"type":"string"}},"additionalProperties":false,"type":"object"}}}
The document is not valid. see errors :
- (root): what it does is required
code example taken from Strict JSON parsing