Create a new JSON file for every GET request

Background story: I'm retrieving details using a GET request, and I managed to get the program to parse the server's output into a JSON file titled "output.json".
Problem: every time I make a different request, the output overwrites the previous content of output.json. Is it possible to create a new JSON file for every request?
I am rather new to Go and any help will be very useful :)
Note: I am only showing the method that calls the API; if needed, I will show the rest of my code.
Here is my method used:
func render(endpoint string, w http.ResponseWriter, r *http.Request) {
    // check if request has cookie set
    if cookie, err := r.Cookie(COOKIE_NAME); err != nil {
        // else redirect to OAuth Authorization EP
        redirectToOAuth(w, r, endpoint)
    } else {
        session := cookie.Value
        accessToken := sessions[session]
        // pipe api endpoint
        ep := fmt.Sprintf("%s/%s", fidorConfig.FidorApiUrl, endpoint)
        if api_req, err := http.NewRequest("GET", ep, nil); err != nil {
            w.WriteHeader(500)
            w.Write([]byte(err.Error()))
        } else {
            api_req.Header.Set("Authorization", fmt.Sprintf("Bearer %s", accessToken))
            api_req.Header.Set("Accept", "application/vnd.fidor.de; version=1,text/json")
            client := &http.Client{}
            if api_resp, err := client.Do(api_req); err != nil {
                w.WriteHeader(500)
                w.Write([]byte(err.Error()))
            } else {
                if api_resp.StatusCode == 401 { // token probably expired
                    handleLogout(w, r, endpoint)
                    return
                }
                w.Header().Set("Content-Type", "application/json")
                w.WriteHeader(api_resp.StatusCode)
                defer api_resp.Body.Close()
                out, err := os.Create("output.json")
                if err != nil {
                    // panic?
                }
                defer out.Close()
                io.Copy(out, api_resp.Body)
            }
        }
    }
}

If you want to append the time to your filename (as @Florian suggested), you can do something like this when creating the file:
out, err := os.Create(fmt.Sprintf("output-%d.json", time.Now().Unix()))
// filename => output-1257894000.json
Here time.Now().Unix() returns the number of seconds elapsed since January 1, 1970 UTC (aka Unix time), so each request will create a different JSON file.
More info about time: https://golang.org/pkg/time/#Time.Unix

If you don't want your file to be overwritten, you can either give the file a different name (for example by appending the current time using time.Now()) or append each response to the file, so that output.json contains the output of every request.


Golang - Ristretto cache returning base64

I am encountering a problem while using a ristretto cache. I have a little API that should return a value stored in my ristretto cache as JSON.
The problem is that when I call my function, the return value is the JSON encoded in base64, and I just can't find a way to decode it.
Here is the code I have:
Part 1: the code for initializing my ristretto cache:
func InitCache() {
    var err error
    ristrettoCache, err = ristretto.NewCache(&ristretto.Config{
        NumCounters: 3000,
        MaxCost:     1e6,
        BufferItems: 64,
    })
    if err != nil {
        panic(err)
    }
}
Part 2: Putting my values in cache:
for _, t := range listTokensFromDB {
    b, err := json.Marshal(t)
    if err != nil {
        fmt.Println(err)
    }
    ristrettoCache.Set(t.Symbol, b, 1)
}
Part 3: getting the value from cache
func getTokenInfo(w http.ResponseWriter, r *http.Request) {
    vars := mux.Vars(r)
    key := vars["chain"] + vars["symbol"]
    value, found := ristrettoCache.Get(key)
    if !found {
        return
    }
    json.NewEncoder(w).Encode(value)
}
The result I get when I make a call to my API is:
"eyJTeW1ib2wiOiJic2NDUllQVE8iLCJBZGRyIjoiMHgyQmNBMUFlM0U1MjQ0NzMyM0IzRWE0NzA4QTNkMTg1ODRDYWY4NWE3IiwiTHBzIjpbeyJTeW1ib2xUb2tlbiI6IkZFRyIsIlRva2VuQWRkciI6IjB4YWNGQzk1NTg1RDgwQWI2MmY2N0ExNEM1NjZDMWI3YTQ5RmU5MTE2NyIsIkxwQWRkciI6IjB4NDU5ZTJlMjQ4NGNlMDU2MWRmNTJiYzFlNjkxMzkyNDA2M2JhZDM5MCJ9LHsiU3ltYm9sVG9rZW4iOiJmQk5CIiwiVG9rZW5BZGRyIjoiMHg4N2IxQWNjRTZhMTk1OEU1MjIyMzNBNzM3MzEzQzA4NjU1MWE1Yzc2IiwiTHBBZGRyIjoiMHg3OGM2NzkzZGMxMDY1OWZlN2U0YWJhMTQwMmI5M2Y2ODljOGY0YzI3In1dfQ=="
But I want the base64-decoded version...
If I change the value b to a string when I insert it into the cache, like so:
for _, t := range listTokensFromDB {
    b, err := json.Marshal(t)
    if err != nil {
        fmt.Println(err)
    }
    ristrettoCache.Set(t.Symbol, string(b), 1)
}
When I get the response, I get the stringified JSON, like this:
"{"Symbol":"bscCRYPTO","Addr":"0x2BcA1Ae3E52447323B..."
And I can't find a way to get the actual JSON out of this string :/
Does anyone know how I could get the real JSON?
Thank you in advance, and I wish you a good day!
As I said in my comments: in this line, value is most likely of type []byte (or []uint8, which is the same thing):
value, found := ristrettoCache.Get(key)
JSON-encoding a []byte will implicitly base64-encode the output, since JSON is text-based.
json.NewEncoder(w).Encode(value) // <- value is of type []byte
Inspecting the base64 you posted (https://play.golang.org/p/NAVS4qRfDM2) shows the underlying bytes are already JSON-encoded, so no extra json.Encode step is needed.
Just output the raw-bytes in your handler - and set the content-type to application/json:
func getTokenInfo(w http.ResponseWriter, r *http.Request) {
    vars := mux.Vars(r)
    key := vars["chain"] + vars["symbol"]
    value, found := ristrettoCache.Get(key)
    if !found {
        return
    }
    // json.NewEncoder(w).Encode(value) // not this
    w.Header().Set("Content-Type", "application/json")
    if bs, ok := value.([]byte); ok {
        if _, err := w.Write(bs); err != nil { // raw bytes (already JSON-encoded)
            log.Printf("write failed: %v", err)
        }
    } else {
        // unexpected type behind the interface{}
        http.Error(w, "unexpected cache value type", http.StatusInternalServerError)
    }
}

Parsing CSV file which includes a header block of "comments"

I'm trying to parse a CSV file hosted in a remote location, but annoyingly the file contains some readme comments atop the file, in the following format:
######
# some readme text,
# some more, comments
######
01-02-03,123,foo,http://example.com
04-05-06,789,baz,http://another.com
I'm attempting to use the following code to extract the URLs within the data, but it throws a "wrong number of fields" error because of the comments at the top; presumably it is trying to parse them as CSV content.
type myData struct {
    URL string `json:"url"`
}

func doWork() ([]myData, error) {
    rurl := "https://example.com/some.csv"
    out := make([]myData, 0)
    resp, err := http.Get(rurl)
    if err != nil {
        return []myData{}, err
    }
    defer resp.Body.Close()
    reader := csv.NewReader(resp.Body)
    reader.Comma = ','
    data, err := reader.ReadAll()
    if err != nil {
        return []myData{}, err
    }
    for _, row := range data {
        out = append(out, myData{URL: row[4]})
    }
    return out, nil
}

func main() {
    data, err := doWork()
    if err != nil {
        panic(err)
    }
    // do something with data
}
Is there a way to skip over the first N lines of the remote file, or to have it ignore lines which start with a #?
Oh, actually I just realised I can add this:
reader.Comment = '#' // ignores the line starting with '#'
Which works perfectly with my current code, but appreciate the other suggestions.
Well, the approach is simple: do not try to interpret the lines starting with '#' as part of the CSV stream; instead, consider the whole stream of data as a concatenation of two streams: the header and the actual CSV payload.
The easiest approach is probably to employ the fact that a bufio.Reader can read lines from its underlying stream and is itself an io.Reader, so you can make a csv.Reader read from it instead of from the source stream.
So, you could roll like this (not real code, untested):
import (
    "bufio"
    "encoding/csv"
    "io"
    "strings"
)

func parse(r io.Reader) ([]myData, error) {
    br := bufio.NewReader(r)
    var line string
    for {
        s, err := br.ReadString('\n')
        if err != nil {
            return nil, err
        }
        if len(s) == 0 || s[0] != '#' {
            line = s
            break
        }
    }
    // At this point the line variable contains the 1st line of the CSV stream.
    // Let's create a "multi reader" which reads first from that line
    // and then — from the rest of the CSV stream.
    cr := csv.NewReader(io.MultiReader(strings.NewReader(line), br))
    cr.Comma = ','
    data, err := cr.ReadAll()
    if err != nil {
        return nil, err
    }
    out := make([]myData, 0, len(data))
    for _, row := range data {
        out = append(out, myData{URL: row[4]})
    }
    return out, nil
}

Chrome DevTools Protocol - ContinueInterceptedRequest with gzip body in Golang

I have been working on a golang script that uses the chrome devtools protocol to:
1) Intercept a request
2) Grab the response body for the intercepted request
3) Make some modifications to the html document
4) Continue the intercepted request
The script works for HTML documents except when Content-Encoding is set to gzip. The step-by-step process looks like this:
1) Intercept Request
s.Debugger.CallbackEvent("Network.requestIntercepted", func(params godet.Params) {
    iid := params.String("interceptionId")
    rtype := params.String("resourceType")
    reason := responses[rtype]
    headers := getHeadersString(params["responseHeaders"])
    log.Println("[+] Request intercepted for", iid, rtype, params.Map("request")["url"])
    if reason != "" {
        log.Println("    abort with reason", reason)
    }
    // Alter HTML in request response
    if s.Options.AlterDocument && rtype == "Document" && iid != "" {
        res, err := s.Debugger.GetResponseBodyForInterception(iid)
        if err != nil {
            log.Println("[-] Unable to get intercepted response body!")
        }
        rawAlteredResponse, err := AlterDocument(res, headers)
        if err != nil {
            log.Println("[-] Unable to alter HTML")
        }
        if rawAlteredResponse != "" {
            log.Println("[+] Sending modified body")
            err := s.Debugger.ContinueInterceptedRequest(iid, godet.ErrorReason(reason), rawAlteredResponse, "", "", "", nil)
            if err != nil {
                fmt.Println("OH NOES AN ERROR!")
                log.Println(err)
            }
        }
    } else {
        s.Debugger.ContinueInterceptedRequest(iid, godet.ErrorReason(reason), "", "", "", "", nil)
    }
})
2) Alter the response body
Here I make small changes to the HTML markup in processHtml() (the code for that function is not relevant to this issue, so I will not post it here). I also grab the headers from the request and, when necessary, update the content-length and date before continuing the response. Then I gzip-compress the body by calling r := gZipCompress([]byte(alteredBody)), which returns a string. The string is then concatenated with the headers so I can craft the raw response.
func AlterDocument(debuggerResponse []byte, headers map[string]string) (string, error) {
    alteredBody, err := processHtml(debuggerResponse)
    if err != nil {
        return "", err
    }
    alteredHeader := ""
    for k, v := range headers {
        switch strings.ToLower(k) {
        case "content-length":
            v = strconv.Itoa(len(alteredBody))
            fmt.Println("Updating content-length to: " + strconv.Itoa(len(alteredBody)))
        case "date":
            v = fmt.Sprintf("%s", time.Now().Format(time.RFC3339))
        }
        alteredHeader += k + ": " + v + "\r\n"
    }
    r := gZipCompress([]byte(alteredBody))
    rawAlteredResponse :=
        base64.StdEncoding.EncodeToString([]byte("HTTP/1.1 200 OK" + "\r\n" + alteredHeader + "\r\n\r\n\r\n" + r))
    return rawAlteredResponse, nil
}
Note: I am now gzip compressing the body for all responses. The above is temporary while I figure out how to solve this issue.
The gzip compress function looks like this:
func gZipCompress(dataToWorkWith []byte) string {
    var b bytes.Buffer
    gz, err := gzip.NewWriterLevel(&b, 5)
    if err != nil {
        panic(err)
    }
    if _, err := gz.Write(dataToWorkWith); err != nil {
        panic(err)
    }
    if err := gz.Flush(); err != nil {
        panic(err)
    }
    if err := gz.Close(); err != nil {
        panic(err)
    }
    return b.String()
}
As seen in the first code snippet, the response body and headers are set here:
err := s.Debugger.ContinueInterceptedRequest(iid, godet.ErrorReason(reason), rawAlteredResponse, "", "", "", nil)
The result is a bunch of garbled characters in the browser. This works without the gzip functions for non gzipped requests. I have changed the compression level as well (without success). Am I processing the body in the wrong order (string > []byte > gzip > string > base64)? Should this be done in a different order to work? Any help would be immensely appreciated.
The response looks like this, which Chrome puts inside a <body></body> tag
����rܸ� ��_A��Q%GH��Kʔ��vU�˷c�v�}
I can also tell that it is compressing correctly as, when I remove headers, the request results in a .gz file download with all the correct .html when uncompressed. Additionally, the first few bytes in the object returned in gZipCompress tell me that it is gzipped correctly:
31 139 8
or
0x1f 0x8B 0x08
I ended up using a different library that handles larger responses better and more efficiently.
Now, it appears that the DevTools protocol returns the response body after decompression, but before rendering it in the browser, when you call Network.getResponseBodyForInterception. This is only an assumption, as I do not see code for that method in https://github.com/ChromeDevTools/devtools-protocol. The assumption is based on the fact that the response body obtained from Network.getResponseBodyForInterception is NOT compressed (though it may be base64-encoded). Furthermore, the method is marked as experimental, and the documentation does not mention anything about compressed responses. Based on that assumption, I will further assume that by the time we get the response from Network.getResponseBodyForInterception it is too late to compress the body ourselves. I can confirm that the libraries I am working with do not bother to compress or uncompress gzipped responses.
I am able to continue working with my code without a need to worry about gzip compressed responses, as I can alter the body without problems.
For reference, I am now using https://github.com/wirepair/gcd, as it is more robust and stable when intercepting larger responses.

How to output golang http json body to website page using html/javascript/jquery

I've got a Go website where I want to display scores from my UWP game, using SQLite's Mobile App Quickstart API via SwaggerUI. I am getting the scores with an HTTP GET request. The problem is that the scores are output to the Go console in JSON format, and I want to display them on the actual website. How could I call my Go function from the frontend to do this? The frontend is written in HTML/JavaScript/jQuery.
This is my Go function that does the HTTP request to SwaggerUI and outputs to the console:
func scoresPage(res http.ResponseWriter, req *http.Request) {
    // Connect to the SwaggerUI API to get scores from Azure for the UWP application
    req, err := http.NewRequest("GET", os.ExpandEnv("https://brainworksappservice.azurewebsites.net/tables/TodoItem?$select=score"), nil)
    if err != nil {
        log.Fatal(err)
    }
    // You have to specify these headers
    req.Header.Set("Accept", "application/json")
    // If you do not specify what version your API is, you cannot receive the JSON
    req.Header.Set("Zumo-Api-Version", "2.0.0")
    // Do the request
    resp, err := http.DefaultClient.Do(req)
    // Error if the request cannot be done
    if err != nil {
        log.Fatal(err)
    }
    // You need to close the body every time; if you don't, you could leak resources
    defer resp.Body.Close()
    // Read all of the information from the body
    body, err := ioutil.ReadAll(resp.Body)
    // Error if the info cannot be read
    if err != nil {
        log.Fatal(err)
    }
    // Write the JSON to the standard output (the console)
    _, err = os.Stdout.Write(body)
    // Error if the info cannot be output to the console
    if err != nil {
        log.Fatal(err)
    }
    http.ServeFile(res, req, "Scores.html")
}
This is the main Function which serves up the website and handles the scores page:
func main() {
    http.HandleFunc("/scores", scoresPage)
    // serve on port 8000 forever
    http.ListenAndServe(":8000", nil)
}
Assuming that you don't want to dump the JSON as-is onto your page but instead format it in some way with HTML and CSS, you could first decode the returned body into a slice of structs that mirrors the structure of your JSON. For example:
type Score struct {
    Id        string    `json:"id"`
    CreatedAt time.Time `json:"createdAt"`
    UpdatedAt time.Time `json:"updatedAt"`
    Version   string    `json:"version"`
    Deleted   bool      `json:"deleted"`
    Text      string    `json:"text"`
    Complete  bool      `json:"complete"`
    Score     string    `json:"score"`
}

scores := []*Score{}
if err := json.Unmarshal(body, &scores); err != nil {
    panic(err)
}
fmt.Println(scores[0])
https://play.golang.org/p/m_ySdZulqy
After you've decoded the JSON you can use Go's template package to loop over the scores and format them as you wish. While you should use the html/template package for rendering HTML, check out text/template for the documentation on how to actually program the templates; they share the same interface.
Here's a quick example: https://play.golang.org/p/EYfV-TzoA0
In that example I'm using the template package to parse a string (scoresPage) and output the result to stdout, but you can just as easily parse your Scores.html file with ParseFiles and return the output in the http response by passing the res http.ResponseWriter instead of os.Stdout as the first argument to template.Execute.

Decoding gZip json with Go

As a Go newbie it's difficult for me to pinpoint the problem area, but hopefully some facts will help.
I'm playing with an API which returns gzip as its Content-Encoding. I have written the following to decode the response into my response struct:
reader, err = gzip.NewReader(resp.Body)
defer reader.Close()
// print to standard out
//_, err = io.Copy(os.Stdout, reader)
//if err != nil {
// log.Fatal(err)
//}
// Decode the response into our tescoResponse struct
var response TescoResponse
err := json.NewDecoder(reader).Decode(&response)
I've removed the error handling for brevity, but the point of interest is that if I uncomment the print to stdout, I get the expected result. However, the decode doesn't give me what I expect. Any pointers? Is it that the struct has to map exactly to the response?
Here's the full example:
https://play.golang.org/p/4eCuXxXm3T
From the documentation:
DisableCompression, if true, prevents the Transport from requesting
compression with an "Accept-Encoding: gzip" request header when the
Request contains no existing Accept-Encoding value. If the Transport
requests gzip on its own and gets a gzipped response, it's
transparently decoded in the Response.Body. However, if the user
explicitly requested gzip it is not automatically uncompressed.
Proposed solution:
type gzreadCloser struct {
    *gzip.Reader
    io.Closer
}

func (gz gzreadCloser) Close() error {
    return gz.Closer.Close()
}

// then in your http call ....
if resp.Header.Get("Content-Encoding") == "gzip" {
    resp.Header.Del("Content-Length")
    zr, err := gzip.NewReader(resp.Body)
    if err != nil {
        return nil, err
    }
    resp.Body = gzreadCloser{zr, resp.Body}
}

// then you will be able to decode the json transparently
if err := json.NewDecoder(resp.Body).Decode(&response); err != nil {
    // handle the error
}
Adapted solution from your code: https://play.golang.org/p/Vt07y_xgak
As @icza mentioned in the comments, explicit decompression isn't required because the gzip reader automatically decompresses as you read from it. Perhaps try:
// Note: a single Read may return fewer bytes than the buffer holds;
// io.ReadFull (or reading until EOF) is more robust.
buf := make([]byte, length) // set length from the Content-Length header
n, err := reader.Read(buf)
if err != nil && err != io.EOF {
    log.Fatal(err)
}
if err := json.Unmarshal(buf[:n], &response); err != nil {
    log.Fatal(err)
}