Exporting JSON into a single file from a loop function

I wrote some code that hits a public API and saves the JSON output to a file, but the data is being stored line by line instead of as a single JSON array.
For example:
Current Output:
{"ip":"1.1.1.1", "Country":"US"}
{"ip":"8.8.8.8", "Country":"IN"}
Desired Output:
[
{"ip":"1.1.1.1", "Country":"US"},
{"ip":"8.8.8.8", "Country":"IN"}
]
I know this should be pretty simple and I am missing something.
My current code is:
To read IPs from a file and hit the API on each one:
func readIPfromFile(filename string, outFile string, timeout int) {
    data := jsonIn{}
    // open the input file
    jsonFile, err := os.Open(filename)
    ...
    ...
    jsonData := bufio.NewScanner(jsonFile)
    for jsonData.Scan() {
        // unmarshal the JSON line & check for errors
        if err := json.Unmarshal(jsonData.Bytes(), &data); err != nil {
            log.Fatal(err)
        }
        // look up the IP and append the result to the output file
        url := fmt.Sprintf("http://ipinfo.io/%s", data.Host)
        GetGeoIP(url, outFile, timeout)
    }
}
To make the HTTP request with a custom request header and call the write-to-file function:
func GetGeoIP(url string, outFile string, timeout int) {
    geoClient := http.Client{
        Timeout: time.Second * time.Duration(timeout), // timeout passed in by the caller
    }
    req, err := http.NewRequest(http.MethodGet, url, nil)
    if err != nil {
        log.Fatal(err)
    }
    req.Header.Set("accept", "application/json")
    res, getErr := geoClient.Do(req)
    if getErr != nil {
        log.Fatal(getErr)
    }
    if res.Body != nil {
        defer res.Body.Close()
    }
    body, readErr := ioutil.ReadAll(res.Body)
    if readErr != nil {
        log.Fatal(readErr)
    }
    jsonout := jsonOut{}
    jsonErr := json.Unmarshal(body, &jsonout)
    if jsonErr != nil {
        log.Fatal(jsonErr)
    }
    file, _ := json.Marshal(jsonout)
    write2file(outFile, file)
}
To write data to the file:
func write2file(outFile string, file []byte) {
    f, err := os.OpenFile(outFile, os.O_APPEND|os.O_WRONLY|os.O_CREATE, 0600)
    if err != nil {
        log.Fatal(err)
    }
    defer f.Close()
    if _, err = f.WriteString(string(file)); err != nil {
        log.Fatal(err)
    }
    if _, err = f.WriteString("\n"); err != nil {
        log.Fatal(err)
    }
}
I know I can change f.WriteString("\n") to f.WriteString(",") to add a comma, but adding the enclosing [ ] to the file is still a challenge for me.

First, please do not invent a new way of JSON marshaling; just use Go's built-in encoding/json or another library from GitHub.
Second, if you want to create a JSON string that represents an array of objects, you need to build the slice of objects in Go and marshal it into a string (or, more precisely, into a byte slice).
I created a simple example below, but please adapt it yourself if possible.
https://go.dev/play/p/RR_ok-fUTb_4
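A minimal sketch of that approach, assuming the question's jsonOut type and collecting every decoded response into a slice before writing (results, out, and the use of os.WriteFile are illustrative, not from the original code):

// Sketch only: gather decoded responses, then write one JSON array.
var results []jsonOut

// inside the loop over IPs, append instead of writing each response:
//     results = append(results, jsonout)

// after the loop, marshal the whole slice once and write it:
out, err := json.MarshalIndent(results, "", "  ")
if err != nil {
    log.Fatal(err)
}
if err := os.WriteFile(outFile, out, 0600); err != nil {
    log.Fatal(err)
}

This replaces the append-per-response write2file call, so the output file starts with [ and ends with ] as desired.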

Related

Process csv file from upload

I have a Gin application that receives a POST request containing a CSV file, which I want to read without saving it. I'm stuck trying to read it from the request and get the following error message: cannot use file (variable of type *multipart.FileHeader) as io.Reader value in argument to csv.NewReader: missing method Read
file, err := c.FormFile("file")
if err != nil {
    errList["Invalid_body"] = "Unable to get request"
    c.JSON(http.StatusUnprocessableEntity, gin.H{
        "status": http.StatusUnprocessableEntity,
        "error":  errList,
    })
}
r := csv.NewReader(file) // <= Error message
records, err := r.ReadAll()
for _, record := range records {
    fmt.Println(record)
}
Is there a good example that I could use?
First, read the file and header:
csvPartFile, csvHeader, openErr := r.FormFile("file")
if openErr != nil {
    // handle error
}
Then read the lines from the file:
csvLines, readErr := csv.NewReader(csvPartFile).ReadAll()
if readErr != nil {
    // handle error
}
You can then go through the lines by looping over the records:
for _, line := range csvLines {
    fmt.Println(line)
}
As other answers have mentioned, you should Open() it first.
The latest version of gin.Context.FormFile(string) seems to return only two values.
This worked for me:
func(c *gin.Context) {
    file_ptr, err := c.FormFile("file")
    if err != nil {
        log.Println(err.Error())
        c.Status(http.StatusUnprocessableEntity)
        return
    }
    log.Println(file_ptr.Filename)
    file, err := file_ptr.Open()
    if err != nil {
        log.Println(err.Error())
        c.Status(http.StatusUnprocessableEntity)
        return
    }
    defer file.Close()
    records, err := csv.NewReader(file).ReadAll()
    if err != nil {
        log.Println(err.Error())
        c.Status(http.StatusUnprocessableEntity)
        return
    }
    for _, line := range records {
        fmt.Println(line)
    }
}

Golang Read JSON from S3 into struct in memory

I have a JSON file in S3 that takes the format of the following struct:
type StockInfo []struct {
    Ticker         string `json:"ticker"`
    BoughtPrice    string `json:"boughtPrice"`
    NumberOfShares string `json:"numberOfShares"`
}
and I want to read it into a struct value from S3. In Python the code would look something like this:
import boto3
import json
s3 = boto3.client('s3', 'us-east-1')
obj = s3.get_object(Bucket=os.environ["BucketName"], Key=os.environ["Key"])
fileContents = obj['Body'].read().decode('utf-8')
json_content = json.loads(fileContents)
However I'm kinda stuck on how to make this happen in Go. I've gotten this far:
package main

import (
    "archive/tar"
    "bytes"
    "fmt"
    "log"
    "os"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/s3"
    "github.com/aws/aws-sdk-go/service/s3/s3manager"
    "github.com/joho/godotenv"
)

type StockInfo []struct {
    Ticker         string `json:"ticker"`
    BoughtPrice    string `json:"boughtPrice"`
    NumberOfShares string `json:"numberOfShares"`
}

func init() {
    // loads values from .env into the system
    if err := godotenv.Load(); err != nil {
        log.Print("No .env file found")
    }
    return
}

func main() {
    sess, err := session.NewSession(&aws.Config{
        Region: aws.String("us-east-1"),
    })
    if err != nil {
        panic(err)
    }
    s3Client := s3.New(sess)
    bucket := "ian-test-bucket-go-python"
    key := "StockInfo.json"
    requestInput := &s3.GetObjectInput{
        Bucket: aws.String(bucket),
        Key:    aws.String(key),
    }
    result, err := s3Client.GetObject(requestInput)
    if err != nil {
        fmt.Println(err)
    }
    fmt.Println(result)
}
which returns the body/object buffer, but I'm not sure how to read that into a string so I can unmarshal it into my struct. I found this code in a similar question:
requestInput := &s3.GetObjectInput{
    Bucket: aws.String(bucket),
    Key:    aws.String(key),
}
buf := new(aws.WriteAtBuffer)
numBytes, _ := *s3manager.Downloader.Download(buf, requestInput)
tr := tar.NewReader(bytes.NewReader(buf.Bytes()))
but I get the following errors:
not enough arguments in call to method expression s3manager.Downloader.Download
have (*aws.WriteAtBuffer, *s3.GetObjectInput)
want (s3manager.Downloader, io.WriterAt, *s3.GetObjectInput, ...func(*s3manager.Downloader))
multiple-value s3manager.Downloader.Download() in single-value context
Can anyone point me in the right direction? It's kind of frustrating how hard this seems compared to Python.
I was able to do it with the following code:
requestInput := &s3.GetObjectInput{
    Bucket: aws.String(bucket),
    Key:    aws.String(key),
}
result, err := s3Client.GetObject(requestInput)
if err != nil {
    fmt.Println(err)
}
defer result.Body.Close()
body1, err := ioutil.ReadAll(result.Body)
if err != nil {
    fmt.Println(err)
}
bodyString1 := fmt.Sprintf("%s", body1)
var s3data StockInfo
decoder := json.NewDecoder(strings.NewReader(bodyString1))
err = decoder.Decode(&s3data)
if err != nil {
    fmt.Println("twas an error")
}
fmt.Println(s3data)
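As a side note on the Downloader errors in the question: Download is a method on an s3manager.Downloader value, not something you call as s3manager.Downloader.Download. A minimal sketch of that route (assuming the same v1 SDK session, bucket, and key as above, with encoding/json imported) would be:

downloader := s3manager.NewDownloader(sess)
buf := aws.NewWriteAtBuffer([]byte{})
// Download writes the object into buf and returns the number of bytes written.
numBytes, err := downloader.Download(buf, &s3.GetObjectInput{
    Bucket: aws.String(bucket),
    Key:    aws.String(key),
})
if err != nil {
    panic(err)
}
fmt.Println("downloaded bytes:", numBytes)

var info StockInfo
if err := json.Unmarshal(buf.Bytes(), &info); err != nil {
    panic(err)
}
fmt.Println(info)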
Alternative solution using json.Unmarshal, based on aws-sdk-go-v2:
...
params := &s3.GetObjectInput{
    Bucket: aws.String(s3Record.S3.Bucket.Name),
    Key:    aws.String(s3Record.S3.Object.Key),
}
result, err := client.GetObject(context.TODO(), params)
if err != nil {
    panic(err)
}
defer result.Body.Close()
// capture all bytes from the object body
b, err := ioutil.ReadAll(result.Body)
if err != nil {
    panic(err)
}
var temp StockInfo
if err = json.Unmarshal(b, &temp); err != nil {
    panic(err)
}
fmt.Println("res: ", temp)

How to compare JSON with varying order?

I'm attempting to implement testing with golden files; however, the JSON my function generates varies in order while keeping the same values. I've implemented the comparison method used here:
How to compare two JSON requests?
But it's order dependent. And as stated here by brad:
JSON objects are unordered, just like Go maps. If
you're depending on the order that a specific implementation serializes your JSON
objects in, you have a bug.
I've written some sample code that simulates my predicament:
package main

import (
    "bufio"
    "encoding/json"
    "fmt"
    "io/ioutil"
    "math/rand"
    "os"
    "reflect"
    "time"
)

type example struct {
    Name     string
    Earnings float64
}

func main() {
    slice := GetSlice()
    gfile, err := ioutil.ReadFile("testdata/example.golden")
    if err != nil {
        fmt.Println(err)
        fmt.Println("Failed reading golden file")
    }
    testJSON, err := json.Marshal(slice)
    if err != nil {
        fmt.Println(err)
        fmt.Println("Error marshalling slice")
    }
    equal, err := JSONBytesEqual(gfile, testJSON)
    if err != nil {
        fmt.Println(err)
        fmt.Println("Error comparing JSON")
    }
    if !equal {
        fmt.Println("Results don't match JSON")
    } else {
        fmt.Println("Success!")
    }
}

func GetSlice() []example {
    t := []example{
        example{"Penny", 50.0},
        example{"Sheldon", 70.0},
        example{"Raj", 20.0},
        example{"Bernadette", 200.0},
        example{"Amy", 250.0},
        example{"Howard", 1.0}}
    rand.Seed(time.Now().UnixNano())
    rand.Shuffle(len(t), func(i, j int) { t[i], t[j] = t[j], t[i] })
    return t
}

func JSONBytesEqual(a, b []byte) (bool, error) {
    var j, j2 interface{}
    if err := json.Unmarshal(a, &j); err != nil {
        return false, err
    }
    if err := json.Unmarshal(b, &j2); err != nil {
        return false, err
    }
    return reflect.DeepEqual(j2, j), nil
}

func WriteTestSliceToFile(arr []example, filename string) {
    file, err := os.OpenFile(filename, os.O_APPEND|os.O_CREATE|os.O_WRONLY, 0644)
    if err != nil {
        fmt.Printf("failed creating file: %s\n", err)
    }
    datawriter := bufio.NewWriter(file)
    marshalledStruct, err := json.Marshal(arr)
    if err != nil {
        fmt.Println("Error marshalling json")
        fmt.Println(err)
    }
    _, err = datawriter.Write(marshalledStruct)
    if err != nil {
        fmt.Println("Error writing to file")
        fmt.Println(err)
    }
    datawriter.Flush()
    file.Close()
}
JSON arrays are ordered. The json.Marshal function preserves order when encoding a slice to a JSON array.
JSON objects are not ordered. The json.Marshal function writes object members in sorted key order as described in the documentation.
The bradfitz comment about JSON object ordering is not relevant to this question:
The application in the question is working with a JSON array, not a JSON object.
The package was updated to write object fields in sorted key order a couple of years after Brad's comment.
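A quick illustration of both points (a hedged sketch with arbitrary sample values):

// Slices keep their order; map keys are written in sorted order.
s, _ := json.Marshal([]int{3, 1, 2})
fmt.Println(string(s)) // [3,1,2]

m, _ := json.Marshal(map[string]int{"b": 2, "a": 1})
fmt.Println(string(m)) // {"a":1,"b":2}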
To compare slices while ignoring order, sort the two slices before comparing. This can be done before encoding to JSON or after decoding from JSON.
sort.Slice(slice, func(i, j int) bool {
    if slice[i].Name != slice[j].Name {
        return slice[i].Name < slice[j].Name
    }
    return slice[i].Earnings < slice[j].Earnings
})
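For the "after decoding from JSON" variant, a sketch that reuses the question's example type and the gfile/testJSON byte slices from its main function (names taken from the question, with sort imported) could be:

// Sketch: decode both JSON documents, sort them the same way, then compare.
sortExamples := func(s []example) {
    sort.Slice(s, func(i, j int) bool {
        if s[i].Name != s[j].Name {
            return s[i].Name < s[j].Name
        }
        return s[i].Earnings < s[j].Earnings
    })
}

var want, got []example
if err := json.Unmarshal(gfile, &want); err != nil {
    fmt.Println(err)
}
if err := json.Unmarshal(testJSON, &got); err != nil {
    fmt.Println(err)
}
sortExamples(want)
sortExamples(got)
fmt.Println("equal:", reflect.DeepEqual(want, got))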
For unit testing, you could use assert.JSONEq from Testify. If you need to do it programmatically, you could follow the code of the JSONEq function.
https://github.com/stretchr/testify/blob/master/assert/assertions.go#L1551
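A minimal sketch of the Testify route (the test name and sample JSON strings are illustrative):

package example_test

import (
    "testing"

    "github.com/stretchr/testify/assert"
)

func TestGoldenJSON(t *testing.T) {
    golden := `[{"Name":"Penny","Earnings":50}]`
    generated := `[{"Earnings":50,"Name":"Penny"}]`
    // JSONEq compares the decoded values, so key order inside objects does not
    // matter; note that element order inside JSON arrays still does.
    assert.JSONEq(t, golden, generated)
}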

Changing the last character of a file

I want to continuously write JSON objects to a file. To be able to read them back, I need to wrap them in an array. I don't want to read the whole file just to append to it. So what I'm doing now is:
comma := []byte(", ")
file, err := os.OpenFile(erp.TransactionsPath, os.O_WRONLY|os.O_APPEND|os.O_CREATE, 0666)
if err != nil {
    return err
}
transaction, err := json.Marshal(t)
if err != nil {
    return err
}
transaction = append(transaction, comma...)
file.Write(transaction)
But with this implementation I would need to add the enclosing [ ] brackets by hand (or via some script) before reading. How can I add an object before the closing bracket on each write?
You don't need to wrap the JSON objects in an array; you can just write them as-is. You may use json.Encoder to write them to the file, and you may use json.Decoder to read them. Encoder.Encode() and Decoder.Decode() encode and decode individual JSON values from a stream.
To prove it works, see this simple example:
const src = `{"id":"1"}{"id":"2"}{"id":"3"}`
dec := json.NewDecoder(strings.NewReader(src))
for {
    var m map[string]interface{}
    if err := dec.Decode(&m); err != nil {
        if err == io.EOF {
            break
        }
        panic(err)
    }
    fmt.Println("Read:", m)
}
It outputs (try it on the Go Playground):
Read: map[id:1]
Read: map[id:2]
Read: map[id:3]
When writing to / reading from a file, pass the os.File to json.NewEncoder() and json.NewDecoder().
Here's a complete demo which creates a temporary file, uses json.Encoder to write JSON objects into it, then reads them back with json.Decoder:
objs := []map[string]interface{}{
    map[string]interface{}{"id": "1"},
    map[string]interface{}{"id": "2"},
    map[string]interface{}{"id": "3"},
}

file, err := ioutil.TempFile("", "test.json")
if err != nil {
    panic(err)
}

// Writing to file:
enc := json.NewEncoder(file)
for _, obj := range objs {
    if err := enc.Encode(obj); err != nil {
        panic(err)
    }
}

// Debug: print file's content
fmt.Println("File content:")
if data, err := ioutil.ReadFile(file.Name()); err != nil {
    panic(err)
} else {
    fmt.Println(string(data))
}

// Reading from file:
if _, err := file.Seek(0, io.SeekStart); err != nil {
    panic(err)
}
dec := json.NewDecoder(file)
for {
    var obj map[string]interface{}
    if err := dec.Decode(&obj); err != nil {
        if err == io.EOF {
            break
        }
        panic(err)
    }
    fmt.Println("Read:", obj)
}
It outputs (try it on the Go Playground):
File content:
{"id":"1"}
{"id":"2"}
{"id":"3"}
Read: map[id:1]
Read: map[id:2]
Read: map[id:3]

Golang Encode/Decode base64 with json post doesn't work

I built a client and a server in Go; both use these functions to encrypt/decrypt:
func encrypt(text []byte) ([]byte, error) {
    block, err := aes.NewCipher(key)
    if err != nil {
        return nil, err
    }
    b := base64.StdEncoding.EncodeToString(text)
    ciphertext := make([]byte, aes.BlockSize+len(b))
    iv := ciphertext[:aes.BlockSize]
    if _, err := io.ReadFull(rand.Reader, iv); err != nil {
        return nil, err
    }
    cfb := cipher.NewCFBEncrypter(block, iv)
    cfb.XORKeyStream(ciphertext[aes.BlockSize:], []byte(b))
    return ciphertext, nil
}

func decrypt(text []byte) ([]byte, error) {
    block, err := aes.NewCipher(key)
    if err != nil {
        return nil, err
    }
    if len(text) < aes.BlockSize {
        return nil, errors.New("ciphertext too short")
    }
    iv := text[:aes.BlockSize]
    text = text[aes.BlockSize:]
    cfb := cipher.NewCFBDecrypter(block, iv)
    cfb.XORKeyStream(text, text)
    data, err := base64.StdEncoding.DecodeString(string(text))
    if err != nil {
        return nil, err
    }
    return data, nil
}
So I make a normal POST request:
url := "https://"+configuration.Server+configuration.Port+"/get"
// TODO maybe bugs rest here
ciphertext, err := encrypt([]byte(*getUrl))
if err != nil {
fmt.Println("Error: " + err.Error())
}
fmt.Println(string(ciphertext))
values := map[string]interface{}{"url": *getUrl, "urlCrypted": ciphertext}
jsonValue, _ := json.Marshal(values)
jsonStr := bytes.NewBuffer(jsonValue)
req, err := http.NewRequest("POST", url, jsonStr)
and the server code is as follows:
requestContent := getRequestContentFromRequest(req)
url := requestContent["url"].(string)
undecryptedUrl := requestContent["urlCrypted"].(string)
decryptedurl, err := decrypt([]byte(undecryptedUrl))
if err != nil {
    fmt.Println("Error: " + err.Error())
}
fmt.Println(decryptedurl)
where getRequestContentFromRequest is as follows:
func getRequestContentFromRequest(req *http.Request) map[string]interface{} {
    buf := new(bytes.Buffer)
    buf.ReadFrom(req.Body)
    data := buf.Bytes()
    var requestContent map[string]interface{}
    err := json.Unmarshal(data, &requestContent)
    if err != nil {
        fmt.Println(err)
    }
    return requestContent
}
Now to the problem.
If I encrypt my string in the client and decrypt it directly afterwards, everything is fine.
But when I send the encrypted string to the server and try to decrypt it with literally the same function as in the client, the decrypt function throws an error:
Error: illegal base64 data at input byte 0
I think the problem is the unmarshalling of the JSON.
Thanks for the help.
P.S.
Repos are
github.com/BelphegorPrime/goSafeClient and github.com/BelphegorPrime/goSafe
UPDATE
Example JSON
{"url":"facebook2.com","urlCrypted":"/}\ufffd\ufffd\ufffdgP\ufffdN뼞\ufffd\u0016\ufffd)\ufffd\ufffd\ufffdy\u001c\u000f\ufffd\ufffd\ufffdep\ufffd\rY\ufffd\ufffd$\ufffd\ufffd"}
UPDATE2
I made a playground here
The problem is that you encode in base64 twice: the first time in the encrypt function, and the second time during JSON marshalling, because byte slices are automatically encoded as base64 strings by the encoding/json marshaller.
The solution is to decode the base64 string before calling decrypt.
Example on the Go PlayGround
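A minimal sketch of that fix on the server side (reusing the question's getRequestContentFromRequest and decrypt helpers, with encoding/base64 imported):

requestContent := getRequestContentFromRequest(req)
// encoding/json marshalled the []byte ciphertext as a base64 string,
// so decode that string back into raw bytes before decrypting.
encoded := requestContent["urlCrypted"].(string)
raw, err := base64.StdEncoding.DecodeString(encoded)
if err != nil {
    fmt.Println("Error: " + err.Error())
}
decryptedURL, err := decrypt(raw)
if err != nil {
    fmt.Println("Error: " + err.Error())
}
fmt.Println(string(decryptedURL))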
EDIT
Working solution here