Changing the last character of a file - json

I want to continuously write JSON objects to a file. To be able to read it back, I need to wrap them in an array. I don't want to read the whole file just to append, so this is what I'm doing now:
comma := []byte(", ")
file, err := os.OpenFile(erp.TransactionsPath, os.O_WRONLY|os.O_APPEND|os.O_CREATE, 0666)
if err != nil {
    return err
}
transaction, err := json.Marshal(t)
if err != nil {
    return err
}
transaction = append(transaction, comma...)
file.Write(transaction)
But with this implementation I would need to add the enclosing [] brackets by hand (or via some script) before reading. How can I insert each new object before the closing bracket on every write?

You don't need to wrap the JSON objects in an array; you can just write them as-is. You may use json.Encoder to write them to the file, and json.Decoder to read them back. Encoder.Encode() and Decoder.Decode() encode and decode individual JSON values from a stream.
To prove it works, see this simple example:
const src = `{"id":"1"}{"id":"2"}{"id":"3"}`

dec := json.NewDecoder(strings.NewReader(src))
for {
    var m map[string]interface{}
    if err := dec.Decode(&m); err != nil {
        if err == io.EOF {
            break
        }
        panic(err)
    }
    fmt.Println("Read:", m)
}
It outputs (try it on the Go Playground):
Read: map[id:1]
Read: map[id:2]
Read: map[id:3]
When writing to / reading from a file, pass the os.File to json.NewEncoder() and json.NewDecoder().
Here's a complete demo which creates a temporary file, uses json.Encoder to write JSON objects into it, then reads them back with json.Decoder:
objs := []map[string]interface{}{
    {"id": "1"},
    {"id": "2"},
    {"id": "3"},
}

file, err := ioutil.TempFile("", "test.json")
if err != nil {
    panic(err)
}

// Writing to file:
enc := json.NewEncoder(file)
for _, obj := range objs {
    if err := enc.Encode(obj); err != nil {
        panic(err)
    }
}

// Debug: print the file's content
fmt.Println("File content:")
if data, err := ioutil.ReadFile(file.Name()); err != nil {
    panic(err)
} else {
    fmt.Println(string(data))
}

// Reading from file:
if _, err := file.Seek(0, io.SeekStart); err != nil {
    panic(err)
}
dec := json.NewDecoder(file)
for {
    var obj map[string]interface{}
    if err := dec.Decode(&obj); err != nil {
        if err == io.EOF {
            break
        }
        panic(err)
    }
    fmt.Println("Read:", obj)
}
It outputs (try it on the Go Playground):
File content:
{"id":"1"}
{"id":"2"}
{"id":"3"}
Read: map[id:1]
Read: map[id:2]
Read: map[id:3]
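Applied back to the original question, the append path could look like this minimal sketch (erp.TransactionsPath and t are the asker's identifiers, assumed to be in scope):

// Open the file for appending; each transaction becomes its own JSON value.
file, err := os.OpenFile(erp.TransactionsPath, os.O_WRONLY|os.O_APPEND|os.O_CREATE, 0666)
if err != nil {
    return err
}
defer file.Close()
// Encoder.Encode writes the value followed by a newline; no commas
// or surrounding [] are needed.
if err := json.NewEncoder(file).Encode(t); err != nil {
    return err
}
return nil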

Related

Exporting JSON into single file from loop function

I wrote some code which hits one public API and saves the JSON output to a file. But the data is being stored in the file line by line instead of as a single JSON array.
For example:
Current Output:
{"ip":"1.1.1.1", "Country":"US"}
{"ip":"8.8.8.8", "Country":"IN"}
Desired Output:
[
    {"ip":"1.1.1.1", "Country":"US"},
    {"ip":"8.8.8.8", "Country":"IN"}
]
I know this should be pretty simple and I am missing something.
My current code is below.
To read IPs from the input file and hit the API one by one for each IP:
func readIPfromFile(filename string, outFile string, timeout int) {
    data := jsonIn{}

    // open input file
    jsonFile, err := os.Open(filename)
    ...
    ...
    jsonData := bufio.NewScanner(jsonFile)
    for jsonData.Scan() {
        // unmarshal json data & check for errors
        if err := json.Unmarshal(jsonData.Bytes(), &data); err != nil {
            log.Fatal(err)
        }
        // hit the API and save to file
        url := fmt.Sprintf("http://ipinfo.io/%s", data.Host)
        GetGeoIP(url, outFile, timeout)
    }
}
To make the HTTP request with a custom request header and call the write-to-file function:
func GetGeoIP(url string, outFile string, timeout int) {
    geoClient := http.Client{
        Timeout: time.Second * time.Duration(timeout), // timeout is in seconds
    }
    req, err := http.NewRequest(http.MethodGet, url, nil)
    if err != nil {
        log.Fatal(err)
    }
    req.Header.Set("accept", "application/json")

    res, getErr := geoClient.Do(req)
    if getErr != nil {
        log.Fatal(getErr)
    }
    if res.Body != nil {
        defer res.Body.Close()
    }

    body, readErr := ioutil.ReadAll(res.Body)
    if readErr != nil {
        log.Fatal(readErr)
    }

    jsonout := jsonOut{}
    jsonErr := json.Unmarshal(body, &jsonout)
    if jsonErr != nil {
        log.Fatal(jsonErr)
    }

    file, _ := json.Marshal(jsonout)
    write2file(outFile, file)
}
To write data to the file:
func write2file(outFile string, file []byte) {
    f, err := os.OpenFile(outFile, os.O_APPEND|os.O_WRONLY|os.O_CREATE, 0600)
    if err != nil {
        log.Fatal(err)
    }
    defer f.Close()
    if _, err = f.WriteString(string(file)); err != nil {
        log.Fatal(err)
    }
    if _, err = f.WriteString("\n"); err != nil {
        log.Fatal(err)
    }
}
I know I can change f.WriteString("\n") to f.WriteString(",") to add the commas, but adding the [] brackets to the file is still a challenge for me.
First, please do not invent a new way of JSON marshaling; just use Go's built-in encoding/json or another library from GitHub.
Second, if you want to create a JSON string that represents an array of objects, you need to build the array of objects in Go and marshal it into a string (or more precisely, into a slice of bytes).
I created a simple example below, but please try to do it yourself if possible.
https://go.dev/play/p/RR_ok-fUTb_4
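For example, a minimal sketch of that approach (the jsonOut field names here are assumptions, not necessarily the asker's exact struct):

package main

import (
    "encoding/json"
    "fmt"
)

// jsonOut mirrors the fields we care about; the names are illustrative.
type jsonOut struct {
    IP      string `json:"ip"`
    Country string `json:"Country"`
}

func main() {
    // Collect every result in a slice first...
    results := []jsonOut{
        {IP: "1.1.1.1", Country: "US"},
        {IP: "8.8.8.8", Country: "IN"},
    }
    // ...then marshal the whole slice in one call: the surrounding []
    // and the commas come for free.
    out, err := json.MarshalIndent(results, "", "  ")
    if err != nil {
        panic(err)
    }
    fmt.Println(string(out))
}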

Process csv file from upload

I have a gin application that receives a POST request containing a CSV file, which I want to read without saving it. I'm stuck trying to read the file from the POST request, with the following error message:
cannot use file (variable of type *multipart.FileHeader) as io.Reader value in argument to csv.NewReader: missing method Read
file, err := c.FormFile("file")
if err != nil {
    errList["Invalid_body"] = "Unable to get request"
    c.JSON(http.StatusUnprocessableEntity, gin.H{
        "status": http.StatusUnprocessableEntity,
        "error":  errList,
    })
}

r := csv.NewReader(file) // <= Error message
records, err := r.ReadAll()
for _, record := range records {
    fmt.Println(record)
}
Is there a good example that I could use?
First, read the file and its header:
csvPartFile, csvHeader, openErr := r.FormFile("file")
if openErr != nil {
    // handle error
}
Then read the lines from the file:
csvLines, readErr := csv.NewReader(csvPartFile).ReadAll()
if readErr != nil {
    // handle error
}
Then you can go through the lines by looping over the records:
for _, line := range csvLines {
    fmt.Println(line)
}
As other answers have mentioned, you should Open() it first.
The latest version of gin.Context.FormFile(string) seems to return only two values.
This worked for me:
func(c *gin.Context) {
    file_ptr, err := c.FormFile("file")
    if err != nil {
        log.Println(err.Error())
        c.Status(http.StatusUnprocessableEntity)
        return
    }
    log.Println(file_ptr.Filename)

    file, err := file_ptr.Open()
    if err != nil {
        log.Println(err.Error())
        c.Status(http.StatusUnprocessableEntity)
        return
    }
    defer file.Close()

    records, err := csv.NewReader(file).ReadAll()
    if err != nil {
        log.Println(err.Error())
        c.Status(http.StatusUnprocessableEntity)
        return
    }
    for _, line := range records {
        fmt.Println(line)
    }
}
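For completeness, such an anonymous handler would be registered on a route; a minimal sketch (the /upload path and the port are assumptions):

router := gin.Default()
// Mount the CSV-reading handler from above on a POST route.
router.POST("/upload", func(c *gin.Context) {
    // ... handler body from the answer above ...
})
router.Run(":8080")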

how to stream object to gzipped json?

Currently, the way I convert an object to JSON and gzip it is:
jsonBytes, _ := json.Marshal(payload)

// gzip json
var body bytes.Buffer
g := gzip.NewWriter(&body)
g.Write(jsonBytes)
g.Close()
This results in a large intermediate byte slice, jsonBytes, whose only purpose is to then be converted into a gzipped buffer.
Is there any way to stream the marshalling of the payload object so it comes out gzipped in the first place?
Yes, you may use json.Encoder to stream the JSON output, and similarly json.Decoder to decode a streamed JSON input. They take any io.Writer and io.Reader to write the JSON result to / read from, including gzip.Writer and gzip.Reader.
For example:
var body bytes.Buffer
w := gzip.NewWriter(&body)
enc := json.NewEncoder(w)

payload := map[string]interface{}{
    "one": 1, "two": 2,
}

if err := enc.Encode(payload); err != nil {
    panic(err)
}
if err := w.Close(); err != nil {
    panic(err)
}
Note that the gzip.Writer must be closed to flush any pending compressed data. To verify that it works, this is how we can decode it:
r, err := gzip.NewReader(&body)
if err != nil {
    panic(err)
}
dec := json.NewDecoder(r)

payload = nil
if err := dec.Decode(&payload); err != nil {
    panic(err)
}

fmt.Println("Decoded:", payload)
Which will output (try it on the Go Playground):
Decoded: map[one:1 two:2]
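Since the encoder accepts any io.Writer, the same pattern streams straight to a file with no intermediate buffer; a minimal sketch (the output.json.gz file name is an assumption):

// Stream JSON through gzip directly into a file.
f, err := os.Create("output.json.gz")
if err != nil {
    panic(err)
}
defer f.Close()

zw := gzip.NewWriter(f)
if err := json.NewEncoder(zw).Encode(payload); err != nil {
    panic(err)
}
// Close the gzip writer to flush any buffered compressed data.
if err := zw.Close(); err != nil {
    panic(err)
}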

detect duplicate in JSON String Golang

I have a JSON string like:
"{\"a\": \"b\", \"a\":true,\"c\":[\"field_3 string 1\",\"field3 string2\"]}"
How can I detect the duplicate attributes in this JSON string using Go?
Use the json.Decoder to walk through the JSON. When an object is found, walk through keys and values checking for duplicate keys.
func check(d *json.Decoder, path []string, dup func(path []string) error) error {
    // Get the next token from the JSON.
    t, err := d.Token()
    if err != nil {
        return err
    }

    // Is it a delimiter?
    delim, ok := t.(json.Delim)

    // No, nothing more to check.
    if !ok {
        // scalar type, nothing to do
        return nil
    }

    switch delim {
    case '{':
        keys := make(map[string]bool)
        for d.More() {
            // Get the field key.
            t, err := d.Token()
            if err != nil {
                return err
            }
            key := t.(string)

            // Check for duplicates.
            if keys[key] {
                // Duplicate found. Call the application's dup function. The
                // function can record the duplicate or return an error to stop
                // the walk through the document.
                if err := dup(append(path, key)); err != nil {
                    return err
                }
            }
            keys[key] = true

            // Check the value.
            if err := check(d, append(path, key), dup); err != nil {
                return err
            }
        }
        // Consume the trailing }.
        if _, err := d.Token(); err != nil {
            return err
        }

    case '[':
        i := 0
        for d.More() {
            if err := check(d, append(path, strconv.Itoa(i)), dup); err != nil {
                return err
            }
            i++
        }
        // Consume the trailing ].
        if _, err := d.Token(); err != nil {
            return err
        }
    }

    return nil
}
Here's how to call it:
func printDup(path []string) error {
    fmt.Printf("Duplicate %s\n", strings.Join(path, "/"))
    return nil
}

...

data := `{"a": "b", "a":true,"c":["field_3 string 1","field3 string2"], "d": {"e": 1, "e": 2}}`

if err := check(json.NewDecoder(strings.NewReader(data)), nil, printDup); err != nil {
    log.Fatal(err)
}
The output is:
Duplicate a
Duplicate d/e
Run it on the Playground
Here's how to generate an error on the first duplicate key:
var ErrDuplicate = errors.New("duplicate")

func dupErr(path []string) error {
    return ErrDuplicate
}

...

data := `{"a": "b", "a":true,"c":["field_3 string 1","field3 string2"], "d": {"e": 1, "e": 2}}`

err := check(json.NewDecoder(strings.NewReader(data)), nil, dupErr)
if err == ErrDuplicate {
    fmt.Println("found a duplicate")
} else if err != nil {
    // some other error
    log.Fatal(err)
}
Another approach that would probably work well is to simply decode, re-encode, and then compare the length of the new JSON against the old:
https://play.golang.org/p/50P-x1fxCzp
package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    jsn := []byte("{\"a\": \"b\", \"a\":true,\"c\":[\"field_3 string 1\",\"field3 string2\"]}")

    var m map[string]interface{}
    err := json.Unmarshal(jsn, &m)
    if err != nil {
        panic(err)
    }

    l := len(jsn)
    jsn, err = json.Marshal(m)
    if err != nil {
        panic(err)
    }

    if l != len(jsn) {
        panic(fmt.Sprintf("%s: %d (%d)", "duplicate key", l, len(jsn)))
    }
}
The right way to do it would be to re-implement the json.Decode function and store a map of the keys found, but the above should work, especially if you first strip any spaces from the JSON using jsn = bytes.Replace(jsn, []byte(" "), []byte(""), -1) to guard against false positives.

Best way to parse problematic JSON files in Golang

I have some valid JSON files and some that are not valid (they lack the surrounding array brackets).
Currently I have a method for each case: one uses json.Unmarshal for the valid ones, and the other uses json.NewDecoder for the bracketless ones.
How can I merge them into one function that can handle both cases?
EDIT:
Here is the code for the two cases:
func getDrivers() []Drivers {
    raw, err := ioutil.ReadFile("/home/ubuntu/drivers.json")
    if err != nil {
        fmt.Println(err.Error())
        os.Exit(1)
    }
    var d []Drivers
    json.Unmarshal(raw, &d)
    return d
}
func getMetrics() []Metrics {
    file, err := os.Open("/home/ubuntu/metrics.json")
    if err != nil {
        fmt.Println("bad err!")
    }
    r := bufio.NewReader(file)
    dec := json.NewDecoder(r)
    // while the stream contains values
    var metrics []Metrics
    for dec.More() {
        var m Metrics
        err := dec.Decode(&m)
        if err != nil {
            log.Fatal(err)
        }
        metrics = append(metrics, m)
    }
    return metrics
}
Thank you
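One possible way to merge the two paths, as a minimal sketch (not from the original thread; it assumes the Metrics type above and needs the bufio, encoding/json, and io imports): peek at the first non-whitespace byte to decide whether the file is a bracketed array or a bare stream of objects.

func getMetricsUnified(r io.Reader) ([]Metrics, error) {
    br := bufio.NewReader(r)
    // Skip leading whitespace and peek at the first meaningful byte.
    var first byte
    for {
        b, err := br.ReadByte()
        if err != nil {
            return nil, err
        }
        if b == ' ' || b == '\t' || b == '\r' || b == '\n' {
            continue
        }
        first = b
        if err := br.UnreadByte(); err != nil {
            return nil, err
        }
        break
    }
    dec := json.NewDecoder(br)
    var metrics []Metrics
    if first == '[' {
        // Bracketed file: decode the whole array in one call.
        if err := dec.Decode(&metrics); err != nil {
            return nil, err
        }
        return metrics, nil
    }
    // Bracketless file: decode objects one by one until EOF.
    for {
        var m Metrics
        if err := dec.Decode(&m); err != nil {
            if err == io.EOF {
                break
            }
            return nil, err
        }
        metrics = append(metrics, m)
    }
    return metrics, nil
}

Both getDrivers and getMetrics could then open their file and call this one helper.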