Wrapping json member fields to object - json

My objective is to add fields to a JSON file on user request.
Everything is great, but when displaying the fields with
fmt.Printf("%s: %s\n", content.Date, content.Description)
an error occurs:
invalid character '{' after top-level value
And that is because, after adding new fields, the file looks like this:
{"Date":"2017-03-20 10:46:48","Description":"new"}
{"Date":"2017-03-20 10:46:51","Description":"new .go"}
The biggest problem is with the writing to the file:
reminder := &Name{dateString[:19], text} // text - input string
newReminder, _ := json.Marshal(&reminder)
I don't really know how to do this properly.
My question is: how should I wrap all member fields into one object?
And what is the best way to iterate through member fields?
The code is available here: https://play.golang.org/p/NunV_B6sud

You should store the reminders in an array inside the JSON file, as mentioned by @Gerben Jacobs, and then, every time you want to add a new reminder to the array, you need to read the full contents of rem.json, append the new reminder in Go, truncate the file, and write the new slice into the file. Here's a quick implementation: https://play.golang.org/p/UKR91maQF2.
If you have lots of reminders and the process of reading, decoding, encoding, and writing the whole content becomes a pain, you could open the file, implement a way to truncate only the last ] from the file contents, and then write only , + the new reminder + ].
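If it helps, here's a minimal sketch of that read-append-rewrite cycle. It reuses the Name struct from the question, assumes the encoding/json, io/ioutil and time imports, and uses time.Now() as a stand-in for the question's dateString:
// addReminder reads the existing array (if any), appends one entry,
// and rewrites the whole file.
func addReminder(text string) error {
	var reminders []Name
	if data, err := ioutil.ReadFile("rem.json"); err == nil && len(data) > 0 {
		if err := json.Unmarshal(data, &reminders); err != nil {
			return err
		}
	}
	reminders = append(reminders, Name{time.Now().Format("2006-01-02 15:04:05"), text})
	out, err := json.Marshal(reminders)
	if err != nil {
		return err
	}
	// Truncate and rewrite in one step.
	return ioutil.WriteFile("rem.json", out, 0644)
}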

So after some research, people in the go-nuts group helped me and suggested using a streaming JSON parser that parses items individually.
So I needed to change my reminder listing function:
func listReminders() error {
	f, err := os.Open("rem.json")
	if err != nil {
		return err
	}
	defer f.Close()
	dec := json.NewDecoder(f)
	for {
		var content Name
		switch err := dec.Decode(&content); err {
		case nil:
			fmt.Printf("%#v\n", content)
		case io.EOF:
			return nil
		default:
			// Return the decode error, not the (nil) error from os.Open.
			return err
		}
	}
}
Now everything works the way I wanted.

Related

Golang POST request in JSON format from a csv file

so I am trying to POST a csv file in JSON format to a website in Golang. I have been successful in POSTing a single JSON payload. However, that is not all I want my program to do. Basically, I'm trying to create an account generator for a site, and I want to be able to generate multiple accounts at once. I feel the best way to do this is with a csv file.
I've tried using encoding/csv to read the csv file and then marshal it to JSON, and also ioutil.ReadFile(). However, the response from the site is 'first name is a mandatory field, last name is a mandatory field', etc. So this obviously means the csv data is not ending up in JSON format. I'll show my code using ioutil.ReadFile() below.
func main() {
	file, _ := ioutil.ReadFile("accounts.csv")
	jsonConv, _ := json.Marshal(file)
	client := http.Client{}
	req, err := http.NewRequest("POST", "websiteUrlHere", bytes.NewBuffer(jsonConv))
	req.Header.Add("cookie", `"really long cookie goes here"`)
	req.Header.Set("user-agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.138 Safari/537.36")
	req.Header.Set("content-type", "application/json")
	resp, err := client.Do(req)
	if err != nil {
		fmt.Print(err)
	}
	defer resp.Body.Close()
}
(^ This is just a snippet of the code).
I'm still pretty new to all of this, so please understand if the question lacks anything. I've also looked for similar questions, but all I find is the use of a struct. I feel that wouldn't be applicable here, as the goal is to create unlimited accounts at once.
Hope the above is sufficient. Thank you.
The issue with your code is actually that you're trying to convert a file to bytes with:
file, _ := ioutil.ReadFile("accounts.csv")
...and then you are AGAIN trying to convert that slice of bytes to JSON bytes with:
jsonConv, _ := json.Marshal(file)
Where the text contents of the file are stored as a slice of bytes in the variable file, and that slice of bytes is then marshaled again. Since encoding/json marshals a []byte as a base64-encoded JSON string, you are basically sending one long base64 string of the raw file contents...not good.
The normal flow here would be to take the file bytes and create Go struct(s) out of them. Once your Go objects are in place, THEN you would marshal to JSON. That converts the Go objects to a slice of bytes, AFTER they have been converted to JSON text form.
So what you are missing is the Go structure middle step. You should also keep in mind that converting a Go struct to JSON bytes with json.Marshal() will only include exported fields, and usually you should use struct tags to customize exactly how the fields will show up.
Your best bet is just to stick with JSON, forget about the CSV. In your own code example, you are taking a CSV and then trying to convert it to JSON...so, then, just use JSON.
If you want to send multiple accounts, just make your Go structure a slice, which will marshal into a JSON array, which is basically what you are trying to do. The end result will be a JSON array of accounts. Here's a simple example:
package main

import (
	"encoding/json"
	"fmt"
)

type Account struct {
	Username string `json:"username"`
	Email    string `json:"email"`
}

type AccountsRequest struct {
	Accounts []Account `json:"accounts"`
}

func main() {
	// gather account info
	acct1 := Account{Username: "tom", Email: "tom@example.com"}
	acct2 := Account{Username: "dick", Email: "dick@example.com"}
	acct3 := Account{Username: "harry", Email: "harry@example.com"}
	// create request
	acctsReq := AccountsRequest{Accounts: []Account{acct1, acct2, acct3}}
	// convert to JSON bytes/data
	//jsonData, _ := json.Marshal(acctsReq)
	// debug/output
	jsonDataPretty, _ := json.MarshalIndent(acctsReq, "", " ")
	fmt.Println(string(jsonDataPretty))
	// send request with data
	//...
}
Runnable here in playground.
The key is that the structs are set up and ready to go and the struct tags determine what the JSON field names will be (i.e. username & email for each account and accounts for the overall array of accounts).
Hope that helps. Drop a comment if you need more specific help.
You need to parse the CSV file first and convert it into the list that you want:
package main

import (
	"encoding/csv"
	"fmt"
	"log"
	"os"
)

func main() {
	file, err := os.Open("file.csv")
	if err != nil {
		log.Fatalf("failed opening file because: %s", err.Error())
	}
	defer file.Close()
	r := csv.NewReader(file)
	records, err := r.ReadAll()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(records)
}
The above code parses the file into a [][]string array. You will now need to iterate over that array and turn it into the JSON object that the page needs; then you can send it. You can read more about the csv package here: https://golang.org/pkg/encoding/csv/
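To sketch that iteration (the column layout and JSON field names here are assumptions for illustration, not taken from the question; assumes the encoding/json import):
// Account describes one CSV row; adjust the fields and tags to
// whatever the site actually expects.
type Account struct {
	FirstName string `json:"first_name"`
	LastName  string `json:"last_name"`
	Email     string `json:"email"`
}

// recordsToJSON assumes each record is [first name, last name, email].
func recordsToJSON(records [][]string) ([]byte, error) {
	accounts := make([]Account, 0, len(records))
	for _, rec := range records {
		if len(rec) < 3 {
			continue // skip malformed rows
		}
		accounts = append(accounts, Account{rec[0], rec[1], rec[2]})
	}
	return json.Marshal(accounts)
}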
A word of advice: never ignore errors; they might give you useful information.

How can we read a JSON file with the Go Programming Language?

I'm working on a translation project in my Angular app. I have already created all the different keys for it. Now I am trying to use the Go Programming Language to add some functionality around my translations, so I can work more quickly afterwards.
I am trying to code a function in Go that reads the user's input on the command line. I need to read the input file named there in order to know if there are missing keys inside; this user input must name a JSON file. I have a problem with this function: it blocks at functions.Check(err). To debug the function I displayed the different variables with fmt.Printf.
I call this function readInput() in my main function.
The readInput() function is the following :
// this function is used to read the user's input on the command line
func readInput() string {
	// we create a reader
	reader := bufio.NewReader(os.Stdin)
	// we read the user's input
	answer, err := reader.ReadString('\n')
	// we check if any errors have occurred while reading
	functions.Check(err)
	// we trim the "\n" from the answer to only keep the string input by the user
	answer = strings.Trim(answer, "\n")
	return answer
}
In my main function I call readInput() for a specific command I created. This command line is usefull to update a JSON file and add a missing key automatically.
My func main is:
func main() {
	if os.Args[1] == "update-json-from-json" {
		fmt.Printf("please enter the name of the json file that will be used to update the json file: ")
		jsonFile := readInput()
		fmt.Printf("please enter the ISO code of the locale for which you want to update the json file: ")
		// we read the user's input
		locale := readInput()
		// we launch the script
		scripts.AddMissingKeysToJsonFromJson(jsonFile, locale)
	}
}
The command line I use for this code is: go run mis-t.go update-json-from-json
Do you know what I'm missing in my code?
Presuming that the file contains dynamic and unknown keys and values that you cannot model in your application, you can do something like this:
func main() {
	if os.Args[1] == "update-json-from-json" {
		...
		jsonFile := readInput()
		// readInput returns the file *name*, so read the file's
		// contents before unmarshaling.
		data, err := ioutil.ReadFile(jsonFile)
		functions.Check(err)
		var jsonKeys interface{}
		err = json.Unmarshal(data, &jsonKeys)
		functions.Check(err)
		...
	}
}
to load the contents into the empty interface, and then use the Go reflection library (https://golang.org/pkg/reflect/) to iterate over the fields, find their names and values, and update them according to your needs.
The alternative is to Unmarshal into a map[string]string, but that won't cope very well with nested JSON, whereas this might (but I haven't tested it).
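For plain JSON trees you don't even need reflect; a type switch over the decoded value works. A minimal sketch (the walk function and its path format are mine; assumes the fmt import):
// walk prints every leaf key path and value in a decoded JSON tree.
func walk(prefix string, v interface{}) {
	switch val := v.(type) {
	case map[string]interface{}: // JSON object
		for k, child := range val {
			walk(prefix+"."+k, child)
		}
	case []interface{}: // JSON array
		for i, child := range val {
			walk(fmt.Sprintf("%s[%d]", prefix, i), child)
		}
	default: // string, float64, bool or nil
		fmt.Printf("%s = %v\n", prefix, val)
	}
}
Calling walk("", jsonKeys) after the Unmarshal above would list every key path, which you could then compare against the keys expected for the locale.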

Why does json.Encoder add an extra line?

json.Encoder seems to behave slightly differently than json.Marshal. Specifically, it adds a newline at the end of the encoded value. Any idea why that is? It looks like a bug to me.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
)

func main() {
	var v string
	v = "hello"
	buf := bytes.NewBuffer(nil)
	json.NewEncoder(buf).Encode(v)
	b, _ := json.Marshal(&v)
	fmt.Printf("%q, %q", buf.Bytes(), b)
}
This outputs
"\"hello\"\n", "\"hello\""
Try it in the Playground
Because Encoder.Encode explicitly adds a newline character. Here's the source code of that func, and its documentation comment even states that it adds a newline:
https://golang.org/src/encoding/json/stream.go?s=4272:4319
// Encode writes the JSON encoding of v to the stream,
// followed by a newline character.
//
// See the documentation for Marshal for details about the
// conversion of Go values to JSON.
func (enc *Encoder) Encode(v interface{}) error {
	if enc.err != nil {
		return enc.err
	}
	e := newEncodeState()
	err := e.marshal(v)
	if err != nil {
		return err
	}
	// Terminate each value with a newline.
	// This makes the output look a little nicer
	// when debugging, and some kind of space
	// is required if the encoded value was a number,
	// so that the reader knows there aren't more
	// digits coming.
	e.WriteByte('\n')
	if _, err = enc.w.Write(e.Bytes()); err != nil {
		enc.err = err
	}
	encodeStatePool.Put(e)
	return err
}
Now, why did the Go developers do it, beyond it "making the output look a little nicer"? One answer:
Streaming
The Go json Encoder is optimized for streaming (e.g. MB/GB/PB of JSON data). It is typical that, when streaming, you need a way to delimit where one value in the stream ends. In the case of Encoder.Encode(), that delimiter is a \n newline character. Sure, you can certainly write to a buffer. But you can also write to an io.Writer, which would stream the block of v.
This is opposed to the use of json.Marshal, which is generally discouraged if your input comes from an untrusted source of unknown size (e.g. an ajax POST to your web service - what if someone posts a 100MB json file?), because the whole result has to be held in memory. Also, json.Marshal produces one final, complete piece of JSON - e.g. you wouldn't expect to concatenate a few hundred Marshal results together. You'd use Encoder.Encode() for that, to build a large set and write to a buffer, stream, file, io.Writer, etc.
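As a small illustration of that streaming use (writing to os.Stdout here, but any io.Writer works):
package main

import (
	"encoding/json"
	"os"
)

func main() {
	enc := json.NewEncoder(os.Stdout)
	for _, v := range []interface{}{42, "hello", map[string]int{"n": 1}} {
		enc.Encode(v) // error handling omitted in this sketch
	}
	// Output: three newline-terminated documents:
	// 42
	// "hello"
	// {"n":1}
}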
Whenever I'm in doubt whether something is a bug, I look up the source - that's one of the advantages of Go: its source and compiler are just pure Go. Within [n]vim I use \gb to open the source definition in a browser with my .vimrc settings.
You can erase the newline by seeking backward in the stream:
f, _ := os.OpenFile(fname, ...)
encoder := json.NewEncoder(f)
encoder.Encode(v)
// seek back one byte from the current offset (whence 1 = io.SeekCurrent)
// so the next write overwrites the trailing '\n'
f.Seek(-1, 1)
f.WriteString("other data ...")
They should let the user control this behavior, for example with:
a build option to disable it
Encoder.SetEOF(eof string)
Encoder.SetIndent(prefix, indent, eof string)
The Encoder writes a stream of documents. The extra whitespace terminates a JSON document in the stream.
A terminator is required for stream readers. Consider a stream containing the JSON documents 1, 2, and 3. Without the extra whitespace, the data on the wire would be the byte sequence 123 - a single JSON document containing the number 123, not three documents.
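A minimal sketch that makes this concrete: with the newline terminators the decoder sees three documents; without them it would see the single number 123:
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"strings"
)

func main() {
	dec := json.NewDecoder(strings.NewReader("1\n2\n3\n"))
	for {
		var n int
		if err := dec.Decode(&n); err == io.EOF {
			break
		} else if err != nil {
			fmt.Println(err)
			return
		}
		fmt.Println(n) // prints 1, then 2, then 3
	}
}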

In Go templates, I can get Parse to work but cannot get ParseFiles to work in like manner. Why?

I have the following code:
t, err := template.New("template").Funcs(funcMap).Parse("Howdy {{ myfunc . }}")
In this form everything works fine. But if I do exactly the same thing with ParseFiles, placing the text above in template.html, it's a no go:
t, err := template.New("template").Funcs(funcMap).ParseFiles("template.html")
I was able to get ParseFiles to work in the following form, but cannot get Funcs to take effect:
t, err := template.ParseFiles("template.html")
t.Funcs(funcMap)
Of course, this last form is a direct call to a function instead of a call to the receiver method, so not the same thing.
Anyone have any ideas what's going on here? It's difficult to find much detail on templates out in the ether.
Did some digging and found this comment for template.ParseFiles in the source:
First template becomes return value if not already defined, and we use that one for subsequent New calls to associate all the templates together. Also, if this file has the same name as t, this file becomes the contents of t, so t, err := New(name).Funcs(xxx).ParseFiles(name) works. Otherwise we create a new template associated with t.
So the format should be as follows, given my example above:
t, err := template.New("template.html").Funcs(funcMap).ParseFiles("path/template.html")
.New("template.html") creates an empty template with the given name, .Funcs(funcMap) associates any custom functions we want to apply to our templates, and then .ParseFiles("path/template.html") parses one or more templates with an awareness of those functions and associates the contents with a template of that name.
Note that the base name of the first file MUST be the same as the name used in New. Any content parsed will be associated with either an empty preexisting template having the same base name of the first file in the series or with a new template having that base name.
So in my example above, one empty template named "template" was created and had a function map associated with it. Then a new template named "template.html" was created. THESE ARE NOT THE SAME! And since ParseFiles was called last, t ends up being the "template.html" template, without any functions attached.
What about the last example? Why didn't this work? template.ParseFiles calls the receiver method Parse, which in turn applies any previously registered functions:
trees, err := parse.Parse(t.name, text, t.leftDelim, t.rightDelim, t.parseFuncs, builtins)
This means that custom functions have to be registered prior to parsing. Adding the functions after parsing the templates doesn't have any effect, leading to nil pointer errors at runtime when attempting to call a custom function.
So I think that covers my original question. The design here seems a little clunky. It doesn't make sense that I can chain ParseFiles onto one template and end up with a different template from the one I am chaining from, just because they don't happen to share a name. That is counterintuitive and probably ought to be addressed in future releases to avoid confusion.
ParseFiles names the templates it creates after the base names of the parsed files, but template.New("error") creates a new template named "error". So after parsing you have several associated templates and need to select the one you want:
foo.go
package main

import (
	"log"
	"os"
	"strings"
	"text/template"
)

func main() {
	tmpl, err := template.New("error").Funcs(template.FuncMap{
		"trim": strings.TrimSpace,
	}).ParseFiles("foo.tmpl")
	if err != nil {
		log.Fatal(err)
	}
	tmpl = tmpl.Lookup("foo.tmpl")
	err = tmpl.Execute(os.Stdout, " string contains spaces both ")
	if err != nil {
		log.Fatal(err)
	}
}
foo.tmpl
{{. | trim}}
try this:
var templates = template.Must(template.New("").Funcs(fmap).ParseFiles("1.tmpl", "2.tmpl"))

How do I substitute bytes in a stream using Go standard library?

I have a io.Reader which I get from http.Request.Body that reads a JSON byte slice from a server.
I would like to stream this to json.NewDecoder. However, I would also like to intercept the JSON before it hits json.NewDecoder and substitute certain parts of it. For example, the JSON string contains empty hashes "{}" which I would like to remove, due to a bug in the server's JSON output.
I am currently achieving my goal using json.Unmarshal but not using the JSON streaming parser:
data, _ := ioutil.ReadAll(r.Body)
data = bytes.Replace(data, []byte("{}"), []byte(""), -1)
json.Unmarshal(data, [my struct])
How can I achieve the same thing as above but using json.NewDecoder so I can save the many times the above code has to parse through r.Body's data? Here's some code using a pseudo function ReplaceStream(r io.Reader, old, new []byte):
reader := ReplaceStream(r.Body, []byte("{}"), []byte(""))
dec := json.NewDecoder(reader)
dec.Decode([my struct])
I know ReplaceStream might be fairly trivial to make, but is there anything in the standard library to do this that I am unaware of?
My advice is to just treat that kind of message as a special case and avoid the extra parsing/substituting for all the other requests:
data, _ := ioutil.ReadAll(r.Body)
// FIXME: overcome bug #12312 of json server
if string(data) == `{"list": [{}]}` {
	return nil // the known-bad payload stands for an empty list
}
// Normal datastruct ..
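That said, if you do want the ReplaceStream helper the question hypothesizes, nothing in the standard library provides it directly. A sketch built on io.Pipe and bytes.Replace could look like this (assumptions: write errors are ignored for brevity, and new never combines with the following input to re-form old):
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"strings"
)

// replaceStream returns a reader that yields r's bytes with every
// occurrence of old replaced by new, including matches that span
// read-chunk boundaries.
func replaceStream(r io.Reader, old, new []byte) io.Reader {
	pr, pw := io.Pipe()
	go func() {
		keep := len(old) - 1 // bytes that may begin a split match
		var carry []byte
		buf := make([]byte, 4096)
		for {
			n, err := r.Read(buf)
			chunk := bytes.Replace(append(carry, buf[:n]...), old, new, -1)
			if err != nil {
				pw.Write(chunk) // flush the remainder
				if err == io.EOF {
					pw.Close()
				} else {
					pw.CloseWithError(err)
				}
				return
			}
			if len(chunk) > keep {
				// Hold back the tail: it could be the start of a match
				// completed by the next read.
				pw.Write(chunk[:len(chunk)-keep])
				carry = append(carry[:0], chunk[len(chunk)-keep:]...)
			} else {
				carry = append(carry[:0], chunk...)
			}
		}
	}()
	return pr
}

func main() {
	body := strings.NewReader(`{"list": [{}]}`)
	dec := json.NewDecoder(replaceStream(body, []byte("[{}]"), []byte("[]")))
	var v map[string]interface{}
	if err := dec.Decode(&v); err != nil {
		fmt.Println("decode:", err)
		return
	}
	fmt.Println(v) // map[list:[]]
}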