Calling functions with an unknown number of arguments

I have the following struct:
// fninfo holds a function and its arity.
type fninfo struct {
    arity int // number of arguments that fn takes
    fn    any // fn must take string arguments and return one string
}
and a map of it:
var fnmap = map[string]fninfo{
    "not": fninfo{
        arity: 1,
        fn:    func(a string) string { ... }, // body omitted
    },
    "add": fninfo{
        arity: 2,
        fn:    func(a, b string) string { ... }, // body omitted
    },
    // more items...
}
What I want to do is, with any fninfo,
pop fninfo.arity number of strings from a stack (implementation omitted)
and call fninfo.fn with all the popped strings as arguments,
in a way similar to this:
info := fnmap[fnname]
args := make([]string, info.arity)
for i := 0; i < len(args); i++ {
    if args[i], err = stk.pop(); err != nil {
        // Handle error...
    }
}
info.fn(args...)
However, this is forbidden for two reasons:
info.fn cannot be called directly, because it is of type any (interface{});
info.fn cannot be called with a slice as its arguments, because it is not a variadic function.
Currently, all the functions I have take between 1 and 3 arguments, so as a workaround I am manually popping each item and checking each type with a type assertion:
info := fnmap[fnname]
if fn, ok := info.fn.(func(string) string); ok {
    a, err := stk.pop()
    if err != nil {
        // Handle error...
    }
    stk.push(fn(a))
} else if fn, ok := info.fn.(func(string, string) string); ok {
    a, err := stk.pop()
    if err != nil {
        // Handle error...
    }
    b, err := stk.pop()
    if err != nil {
        // Handle error...
    }
    stk.push(fn(a, b))
} else if fn, ok := info.fn.(func(string, string, string) string); ok {
    a, err := stk.pop()
    if err != nil {
        // Handle error...
    }
    b, err := stk.pop()
    if err != nil {
        // Handle error...
    }
    c, err := stk.pop()
    if err != nil {
        // Handle error...
    }
    stk.push(fn(a, b, c))
}
This is very long and repetitive, and it becomes unmaintainable should I add functions with 4, 5, or even more arguments.
How can I get around the two problems listed above and write something similar to my earlier example? Is this possible in Go without reflection, and if not, with it?
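For reference, a minimal sketch of the reflection route: reflect.Value.Call takes a []reflect.Value of arguments, so the dispatch can be arity-generic at the cost of compile-time type safety. The stack is replaced by a plain argument slice here, since its implementation is omitted in the question, and the function bodies are made up for illustration.

```go
package main

import (
    "fmt"
    "reflect"
)

// fninfo as in the question; arity could also be derived with
// reflect.TypeOf(fn).NumIn() instead of storing it separately.
type fninfo struct {
    arity int
    fn    any
}

var fnmap = map[string]fninfo{
    "not": {arity: 1, fn: func(a string) string { return "!" + a }},
    "add": {arity: 2, fn: func(a, b string) string { return a + "+" + b }},
}

// call invokes fnmap[fnname].fn with the first arity entries of args.
// It assumes len(args) >= arity and that fn really is a func of strings
// returning one string; Value.Call panics otherwise.
func call(fnname string, args []string) (string, error) {
    info, ok := fnmap[fnname]
    if !ok {
        return "", fmt.Errorf("unknown function %q", fnname)
    }
    in := make([]reflect.Value, info.arity)
    for i := 0; i < info.arity; i++ {
        in[i] = reflect.ValueOf(args[i])
    }
    out := reflect.ValueOf(info.fn).Call(in)
    return out[0].String(), nil
}

func main() {
    r, _ := call("add", []string{"1", "2"})
    fmt.Println(r) // 1+2
}
```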


Unmarshaling from JSON key containing a single quote

I feel quite puzzled by this.
I need to load some data (coming from a French database) that is serialized in JSON and in which some keys have a single quote.
Here is a simplified version:
package main

import (
    "encoding/json"
    "fmt"
)

type Product struct {
    Name string `json:"nom"`
    Cost int64  `json:"prix d'achat"`
}

func main() {
    var p Product
    err := json.Unmarshal([]byte(`{"nom":"savon", "prix d'achat": 170}`), &p)
    fmt.Printf("product cost: %d\nerror: %s\n", p.Cost, err)
}

// product cost: 0
// error: %!s(<nil>)
Unmarshaling produces no error; however, "prix d'achat" (p.Cost) is not parsed correctly.
When I unmarshal into a map[string]any, the "prix d'achat" key is parsed as I would expect:
package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    blob := map[string]any{}
    err := json.Unmarshal([]byte(`{"nom":"savon", "prix d'achat": 170}`), &blob)
    fmt.Printf("blob: %f\nerror: %s\n", blob["prix d'achat"], err)
}

// blob: 170.000000
// error: %!s(<nil>)
I checked the json.Marshal documentation on struct tags and I cannot find any issue with the data I'm trying to process.
Am I missing something obvious here?
How can I parse a JSON key containing a single quote using struct tags?
Thanks a lot for any insight!
I didn't find anything in the documentation, but the JSON encoder considers single quote to be a reserved character in tag names.
func isValidTag(s string) bool {
    if s == "" {
        return false
    }
    for _, c := range s {
        switch {
        case strings.ContainsRune("!#$%&()*+-./:;<=>?@[]^_{|}~ ", c):
            // Backslash and quote chars are reserved, but
            // otherwise any punctuation chars are allowed
            // in a tag name.
        case !unicode.IsLetter(c) && !unicode.IsDigit(c):
            return false
        }
    }
    return true
}
I think opening an issue is justified here. In the meantime, you're going to have to implement json.Unmarshaler and/or json.Marshaler. Here is a start:
func (p *Product) UnmarshalJSON(b []byte) error {
    type product Product // prevent recursion
    var _p product
    if err := json.Unmarshal(b, &_p); err != nil {
        return err
    }
    *p = Product(_p)
    return unmarshalFieldsWithSingleQuotes(p, b)
}

func unmarshalFieldsWithSingleQuotes(dest interface{}, b []byte) error {
    // Look through the JSON tags. If there is one containing single quotes,
    // unmarshal b again, into a map this time. Then unmarshal the value
    // at the map key corresponding to the tag, if any.
    var m map[string]json.RawMessage
    t := reflect.TypeOf(dest).Elem()
    v := reflect.ValueOf(dest).Elem()
    for i := 0; i < t.NumField(); i++ {
        tag := t.Field(i).Tag.Get("json")
        if !strings.Contains(tag, "'") {
            continue
        }
        if m == nil {
            if err := json.Unmarshal(b, &m); err != nil {
                return err
            }
        }
        if j, ok := m[tag]; ok {
            if err := json.Unmarshal(j, v.Field(i).Addr().Interface()); err != nil {
                return err
            }
        }
    }
    return nil
}
Try it on the playground: https://go.dev/play/p/aupACXorjOO

Golang Interface to struct conversion giving error

I have a JSON string of the following format:
{"add": [{"var": ["100"]}, "200"]}
Here the key 'add' is not constant across the JSONs; in some cases it can be 'minus', 'multiply', etc.
The value of that key is an array, in this case [{"var": ["100"]}, "200"]. This means that 100 should be added to the existing value 200.
I am trying to parse this expression. Since the main key (in this case 'add') is not constant, I cannot convert it into a struct, so I converted it into a JSON object the following way:
type mathExpVar struct {
    valueVar []string `json:"var"`
}

var mathExpJson map[string][]interface{}
var input = "{\"add\": [{\"var\": [\"100\"]}, \"200\"]}"
err := json.Unmarshal([]byte(input), &mathExpJson)
for operator, values := range mathExpJson {
    vals, ok := values[0].(mathExpVar) // here values[0] will be {"var": ["100"]}
    if !ok {
        return nil
    }
}
Here 'ok' is always returning false. I am not sure why. There is no additional error message for me to check why this is failing. Could someone help me in resolving this?
Link to go playground for the same: https://go.dev/play/p/POfQmEoPbjD
A Working example: https://go.dev/play/p/02YzI5cv8vV
The whole reason the original response from Burak Serdar was not valid, in my opinion, is that it does not take into account that you need to handle the rest of the params as well. If you look closely, you see that the expression is not an array of strings; it is of varying type. This implementation handles the custom unmarshalling and stores all the extra parameters in the Extra field.
Also code:
package main

import (
    "encoding/json"
    "log"
    "reflect"
)

const jsonPayload = "{\"add\": [{\"var\": [\"100\"]}, \"200\"]}"

func main() {
    data := MathExpressions{}
    err := json.Unmarshal([]byte(jsonPayload), &data)
    if err != nil {
        log.Println("Failed to unmarshal json, error:", err)
        return
    }
    log.Println(data)
    for operation, expression := range data {
        log.Print("Op:", operation, "Exp:", expression)
    }
    log.Println("Finished..")
}

// ExpressionDefinition is the sub-definition for a specific expression in the object.
type ExpressionDefinition struct {
    Vars  []string `json:"var"`
    Extra []string
}

func (e *ExpressionDefinition) UnmarshalJSON(data []byte) error {
    tokens := make([]interface{}, 0)
    err := json.Unmarshal(data, &tokens)
    if err != nil {
        return err
    }
    for _, token := range tokens {
        log.Println("Processing token:", token, "type:", reflect.TypeOf(token))
        switch t := token.(type) {
        case map[string]interface{}:
            for _, v := range t["var"].([]interface{}) {
                e.Vars = append(e.Vars, v.(string))
            }
        case string:
            e.Extra = append(e.Extra, t)
        }
    }
    log.Println(tokens)
    return nil
}

// MathExpressions is the main expressions object which contains all the sub-expressions.
type MathExpressions map[string]ExpressionDefinition
Here the entire structure of the parsed json value will be stored in nested map[string]interface{}(json object) or []interface{}(json array) types.
In the line:
vals, ok := values[0].(mathExpVar)
values[0] would be of type map[string]interface{}, which cannot be asserted to mathExpVar, which is a struct, an entirely different datatype.
You need to type-assert to map[string]interface{} first, and then do the same at each nested level as you go deeper:
package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    type mathExpVar struct {
        valueVar []string `json:"var"`
    }
    var mathExpJson map[string][]interface{}
    var input = "{\"add\": [{\"var\": [\"100\"]}, \"200\"]}"
    err := json.Unmarshal([]byte(input), &mathExpJson)
    if err != nil {
        fmt.Println("Error in unmarshalling")
    }
    for _, values := range mathExpJson {
        var vals mathExpVar
        valMap, ok := values[0].(map[string]interface{})
        if ok {
            varSlice, ok := valMap["var"].([]interface{})
            if ok {
                for _, v := range varSlice {
                    nv, ok := v.(string)
                    if ok {
                        vals.valueVar = append(vals.valueVar, nv)
                    } else {
                        fmt.Printf("%T\n", v)
                    }
                }
            } else {
                fmt.Printf("%T\n", valMap["var"])
            }
        } else {
            fmt.Printf("%T\n", values[0])
        }
        fmt.Printf("%+v\n", vals)
    }
}
See: https://go.dev/play/p/Ot_9IZr4pwM
For more on interfaces and go reflection, check out: https://go.dev/blog/laws-of-reflection

Go reading map from json stream

I need to parse a really long JSON file (more than a million items). I don't want to load it into memory, so I read it chunk by chunk. There's a good example with an array of items here. The problem is that I am dealing with a map, and when I call Decode I get not at beginning of value.
I can't figure out what should be changed.
const data = `{
"object1": {"name": "cattle","location": "kitchen"},
"object2": {"name": "table","location": "office"}
}`

type ReadObject struct {
    Name     string `json:"name"`
    Location string `json:"location"`
}

func ParseJSON() {
    dec := json.NewDecoder(strings.NewReader(data))
    tkn, err := dec.Token()
    if err != nil {
        log.Fatalf("failed to read opening token: %v", err)
    }
    fmt.Printf("opening token: %v\n", tkn)
    objects := make(map[string]*ReadObject)
    for dec.More() {
        var nextSymbol string
        if err := dec.Decode(&nextSymbol); err != nil {
            log.Fatalf("failed to parse next symbol: %v", err)
        }
        nextObject := &ReadObject{}
        if err := dec.Decode(&nextObject); err != nil {
            log.Fatalf("failed to parse next object")
        }
        objects[nextSymbol] = nextObject
    }
    tkn, err = dec.Token()
    if err != nil {
        log.Fatalf("failed to read closing token: %v", err)
    }
    fmt.Printf("closing token: %v\n", tkn)
    fmt.Printf("OBJECTS: \n%v\n", objects)
}
TL;DR: when you call the Token() method for the first time, you move the offset past the beginning (of a JSON value), and therefore you get the error.
You are working with this struct (link):
type Decoder struct {
    // other fields omitted for simplicity
    tokenState int
}
Pay attention for a tokenState field. This value could be one of (link):
const (
    tokenTopValue = iota
    tokenArrayStart
    tokenArrayValue
    tokenArrayComma
    tokenObjectStart
    tokenObjectKey
    tokenObjectColon
    tokenObjectValue
    tokenObjectComma
)
Let's get back to your code. You call the Token() method. It consumes the first JSON-valid token, {, and changes tokenState from tokenTopValue to tokenObjectStart (link). Now you are in an "inside-an-object" state.
If you try to call Decode() at this point you get the error (not at beginning of value), because the tokenState values that allow calling Decode() are tokenTopValue, tokenArrayStart, tokenArrayValue, and tokenObjectValue, i.e. a "full" value, not part of one (link).
To avoid this you can simply not call Token() at all and do something like this:
dec := json.NewDecoder(strings.NewReader(dataMapFromJson))
objects := make(map[string]*ReadObject)
if err := dec.Decode(&objects); err != nil {
    log.Fatalf("failed to parse next symbol: %v", err)
}
fmt.Printf("OBJECTS: \n%v\n", objects)
Or, if you want to read chunk by chunk, you can keep calling Token() until you reach a "full" value, and then call Decode() on that value (I guess this should work).
After consuming the initial { with your first call to dec.Token(), you must:
use dec.Token() to extract the next key
after extracting the key, you can call dec.Decode(&nextObject) to decode an entry
example code:
for dec.More() {
    key, err := dec.Token()
    if err != nil {
        // handle error
    }
    var val interface{}
    err = dec.Decode(&val)
    if err != nil {
        // handle error
    }
    fmt.Printf(" %s : %v\n", key, val)
}
https://play.golang.org/p/5r1d8MsNlKb

How do you modify this struct in Golang to accept two different results?

For the following JSON response {"table_contents":[{"id":100,"description":"text100"},{"id":101,"description":"text101"},{"id":1,"description":"text1"}]}
All you have to do is produce the following code, which executes properly and reads fields from the struct:
package main

import (
    "encoding/json"
    "fmt"
)

type MyStruct1 struct {
    TableContents []struct {
        ID          int
        Description string
    } `json:"table_contents"`
}

func main() {
    result := []byte(`{"table_contents":[{"id":100,"description":"text100"},{"id":101,"description":"text101"},{"id":1,"description":"text1"}]}`)
    var container MyStruct1
    err := json.Unmarshal(result, &container)
    if err != nil {
        fmt.Println(" [0] Error message: " + err.Error())
        return
    }
    for i := range container.TableContents {
        fmt.Println(container.TableContents[i].Description)
    }
}
But how do you deal with the following JSON response? {"table_contents":[[{"id":100,"description":"text100"},{"id":101,"description":"text101"}],{"id":1,"description":"text1"}]} You can get either this response or the one above, so it is important to modify the struct to accept both.
I did something like this, with the help of internet:
package main

import (
    "encoding/json"
    "fmt"
)

type MyStruct1 struct {
    TableContents []TableContentUnion `json:"table_contents"`
}

type TableContentClass struct {
    ID          int
    Description string
}

type TableContentUnion struct {
    TableContentClass      *TableContentClass
    TableContentClassArray []TableContentClass
}

func main() {
    result := []byte(`{"table_contents":[[{"id":100,"description":"text100"},{"id":101,"description":"text101"}],{"id":1,"description":"text1"}]}`)
    var container MyStruct1
    err := json.Unmarshal(result, &container)
    if err != nil {
        fmt.Println(" [0] Error message: " + err.Error())
        return
    }
    for i := range container.TableContents {
        fmt.Println(container.TableContents[i])
    }
}
but it does not get past this error message:
[0] Error message: json: cannot unmarshal array into Go struct field MyStruct1.table_contents of type main.TableContentUnion
I've been struggling to come up with a solution for hours. If someone could help I would be happy. Thank you for reading; let me know if you have questions.
Inside table_contents you have two possible types (a JSON object or a list of JSON objects). What you can do is unmarshal into an interface and then type-check it when using it:
type MyStruct1 struct {
    TableContents []interface{} `json:"table_contents"`
}

...

for i := range container.TableContents {
    switch container.TableContents[i].(type) {
    case map[string]interface{}:
        fmt.Println("json object")
    case []interface{}:
        fmt.Println("list")
    }
}
From there you can use some library (e.g. https://github.com/mitchellh/mapstructure) to map unmarshalled struct to your TableContentClass type. See PoC playground here: https://play.golang.org/p/NhVUhQayeL_C
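If you would rather not pull in a dependency like mapstructure, a common dependency-free stand-in is to round-trip the generic map through encoding/json again. This is a sketch of that idea, not the linked playground's code; remap is a hypothetical helper name.

```go
package main

import (
    "encoding/json"
    "fmt"
)

type TableContentClass struct {
    ID          int
    Description string
}

// remap converts a generic map (as produced by json.Unmarshal into an
// interface{}) into a typed struct by marshaling it back to JSON and
// unmarshaling it into the destination. Slower than mapstructure, but
// it needs nothing outside the standard library.
func remap(m map[string]interface{}, dst interface{}) error {
    b, err := json.Marshal(m)
    if err != nil {
        return err
    }
    return json.Unmarshal(b, dst)
}

func main() {
    // A map as it would come out of the type switch above.
    m := map[string]interface{}{"ID": 1, "Description": "text1"}
    var tc TableContentClass
    if err := remap(m, &tc); err != nil {
        panic(err)
    }
    fmt.Printf("%+v\n", tc) // {ID:1 Description:text1}
}
```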
Custom UnmarshalJSON function
You can also create a custom UnmarshalJSON function on the object that has the two possibilities. In your case that would be TableContentUnion.
In the custom unmarshaller you can then decide how to unmarshal the content.
func (s *TableContentUnion) UnmarshalJSON(b []byte) error {
    // Note that we get `b` as bytes, so we could also manually check
    // whether it is an array (starts with `[`) or an object (starts with `{`).
    var jsonObj interface{}
    if err := json.Unmarshal(b, &jsonObj); err != nil {
        return err
    }
    switch jsonObj.(type) {
    case map[string]interface{}:
        // Note: instead of using json.Unmarshal again, we could also cast the
        // interface and build the values as in the example above.
        var tableContentClass TableContentClass
        if err := json.Unmarshal(b, &tableContentClass); err != nil {
            return err
        }
        s.TableContentClass = &tableContentClass
    case []interface{}:
        // Note: instead of using json.Unmarshal again, we could also cast the
        // interface and build the values as in the example above.
        if err := json.Unmarshal(b, &s.TableContentClassArray); err != nil {
            return err
        }
    default:
        return errors.New("TableContentUnion.UnmarshalJSON: unknown content type")
    }
    return nil
}
The rest then works like your test code that was failing before. Here is the working Go Playground.
Unmarshal to map and manually build struct
You can always unmarshal a JSON document (with an object at the root) into a map[string]interface{}. Then you can iterate over it and further unmarshal the values after checking what type they are.
Working example:
func main() {
    result := []byte(`{"table_contents":[[{"id":100,"description":"text100"},{"id":101,"description":"text101"}],{"id":1,"description":"text1"}]}`)
    var jsonMap map[string]interface{}
    err := json.Unmarshal(result, &jsonMap)
    if err != nil {
        fmt.Println(" [0] Error message: " + err.Error())
        return
    }
    cts, ok := jsonMap["table_contents"].([]interface{})
    if !ok {
        // Note: a nil or missing "table_contents" will also lead to this path.
        fmt.Println("table_contents is not a slice")
        return
    }
    var unions []TableContentUnion
    for _, content := range cts {
        var union TableContentUnion
        if contents, ok := content.([]interface{}); ok {
            for _, content := range contents {
                contCls := parseContentClass(content)
                if contCls == nil {
                    continue
                }
                union.TableContentClassArray = append(union.TableContentClassArray, *contCls)
            }
        } else {
            contCls := parseContentClass(content)
            union.TableContentClass = contCls
        }
        unions = append(unions, union)
    }
    container := MyStruct1{
        TableContents: unions,
    }
    for i := range container.TableContents {
        fmt.Println(container.TableContents[i])
    }
}

func parseContentClass(value interface{}) *TableContentClass {
    m, ok := value.(map[string]interface{})
    if !ok {
        return nil
    }
    return &TableContentClass{
        ID:          int(m["id"].(float64)),
        Description: m["description"].(string),
    }
}
This is most useful if the JSON has too many variations. For cases like this it can also make sense to switch to a JSON package that works differently, such as https://github.com/tidwall/gjson, which gets values based on their path.
Use json.RawMessage to capture the varying parts of the JSON document. Unmarshal each raw message as appropriate.
func (ms *MyStruct1) UnmarshalJSON(data []byte) error {
    // Declare new type with same base type as MyStruct1.
    // This breaks recursion in the call to json.Unmarshal below.
    type x MyStruct1
    v := struct {
        *x
        // Override TableContents field with raw messages.
        TableContents []json.RawMessage `json:"table_contents"`
    }{
        // Unmarshal all but TableContents directly to the
        // receiver.
        x: (*x)(ms),
    }
    err := json.Unmarshal(data, &v)
    if err != nil {
        return err
    }
    // Unmarshal raw elements as appropriate.
    for _, tcData := range v.TableContents {
        if bytes.HasPrefix(tcData, []byte{'{'}) {
            var v TableContentClass
            if err := json.Unmarshal(tcData, &v); err != nil {
                return err
            }
            ms.TableContents = append(ms.TableContents, v)
        } else {
            var v []TableContentClass
            if err := json.Unmarshal(tcData, &v); err != nil {
                return err
            }
            ms.TableContents = append(ms.TableContents, v)
        }
    }
    return nil
}
Use it like this:
var container MyStruct1
err := json.Unmarshal(result, &container)
if err != nil {
    // handle error
}
Run it on the Go playground.
This approach does not add any outside dependencies. The function code does not need to be modified when fields are added to or removed from MyStruct1 or TableContentClass.

Appending to json file without writing entire file

I have a JSON file in which one attribute's value is an array, and I need to keep appending values to that array and writing the result to a file. Is there a way to avoid rewriting the existing data and only append the new values?
----- Moving next question to a different thread ---------------
What is the recommended way of writing big data sets to a file: incremental writes, or a full dump at the end of the process?
A general solution makes the most sense if the existing JSON is actually an array, or if it's an object whose last (or only) member is an array, as in your case. Otherwise, you're inserting instead of appending. You probably don't want to read the entire file either.
One approach is not much different from what you were thinking, but handles several details:
Read the end of the file to verify that it "ends with an array"
Retain that part
Position the file at that ending array bracket
Take the output from a standard encoder for an array of new data, dropping its opening bracket, and inserting a comma if necessary
The end of the new output replaces the original ending array bracket
Tack the rest of the tail back on
import (
    "bytes"
    "errors"
    "io"
    "io/ioutil"
    "os"
    "regexp"
    "unicode"
)

const (
    tailCheckLen = 16
)

var (
    arrayEndsObject = regexp.MustCompile("(\\[\\s*)?](\\s*}\\s*)$")
    justArray       = regexp.MustCompile("(\\[\\s*)?](\\s*)$")
)

type jsonAppender struct {
    f               *os.File
    strippedBracket bool
    needsComma      bool
    tail            []byte
}

// Note: pointer receivers are required here; with value receivers the
// updates to strippedBracket and needsComma would be lost between calls.
func (a *jsonAppender) Write(b []byte) (int, error) {
    trimmed := 0
    if !a.strippedBracket {
        t := bytes.TrimLeftFunc(b, unicode.IsSpace)
        if len(t) == 0 {
            return len(b), nil
        }
        if t[0] != '[' {
            return 0, errors.New("not appending array: " + string(t))
        }
        trimmed = len(b) - len(t) + 1
        b = t[1:]
        a.strippedBracket = true
    }
    if a.needsComma {
        a.needsComma = false
        n, err := a.f.Write([]byte(", "))
        if err != nil {
            return n, err
        }
    }
    n, err := a.f.Write(b)
    return trimmed + n, err
}

func (a *jsonAppender) Close() error {
    if _, err := a.f.Write(a.tail); err != nil {
        defer a.f.Close()
        return err
    }
    return a.f.Close()
}

func JSONArrayAppender(file string) (io.WriteCloser, error) {
    f, err := os.OpenFile(file, os.O_RDWR, 0664)
    if err != nil {
        return nil, err
    }
    pos, err := f.Seek(0, io.SeekEnd)
    if err != nil {
        return nil, err
    }
    if pos < tailCheckLen {
        pos = 0
    } else {
        pos -= tailCheckLen
    }
    _, err = f.Seek(pos, io.SeekStart)
    if err != nil {
        return nil, err
    }
    tail, err := ioutil.ReadAll(f)
    if err != nil {
        return nil, err
    }
    hasElements := false
    if len(tail) == 0 {
        _, err = f.Write([]byte("["))
        if err != nil {
            return nil, err
        }
    } else {
        var g [][]byte
        if g = arrayEndsObject.FindSubmatch(tail); g != nil {
        } else if g = justArray.FindSubmatch(tail); g != nil {
        } else {
            return nil, errors.New("does not end with array")
        }
        hasElements = len(g[1]) == 0
        _, err = f.Seek(-int64(len(g[2])+1), io.SeekEnd) // 1 for ]
        if err != nil {
            return nil, err
        }
        tail = g[2]
    }
    return &jsonAppender{f: f, needsComma: hasElements, tail: tail}, nil
}
Usage is then as in this test fragment:
a, err := JSONArrayAppender(f)
if err != nil {
    t.Fatal(err)
}
added := []struct {
    Name string `json:"name"`
}{
    {"Wonder Woman"},
}
if err = json.NewEncoder(a).Encode(added); err != nil {
    t.Fatal(err)
}
if err = a.Close(); err != nil {
    t.Fatal(err)
}
You can use whatever settings on the Encoder you want. The only hard-coded part is handling needsComma, but you can add an argument for that.
If your JSON array is simple, you can use something like the following code, in which I create the JSON array manually.
type item struct {
    Name string
}

func main() {
    fd, err := os.Create("hello.json")
    if err != nil {
        log.Fatal(err)
    }
    fd.Write([]byte{'['})
    for i := 0; i < 10; i++ {
        b, err := json.Marshal(item{
            "parham",
        })
        if err != nil {
            log.Fatal(err)
        }
        if i != 0 {
            fd.Write([]byte{','})
        }
        fd.Write(b)
    }
    fd.Write([]byte{']'})
}
If you want a valid array after each step, you can write ']' at the end of each iteration and then seek back over it at the start of the next iteration.