Golang http request POST works once - json

I have a master and a slave. The master has an API endpoint, result, which accepts JSON, and the slave sends its results to it. My problem is on the slave side: the first time my code sends the JSON fine, but the second time the program hangs (waits forever) on resp, err := client.Do(req) while making the request to the master.
Slave code:
func main() {
    for {
        // some code, very long code
        sendResult(resFiles)
    }
}

func sendResult(rf common.ResultFiles) {
    jsonValue, err := json.Marshal(rf)
    req, err := http.NewRequest(methodPost, ResultAdress, bytes.NewBuffer(jsonValue))
    req.Header.Set("Content-Type", ContentType)
    client := &http.Client{}
    resp, err := client.Do(req)
    if err != nil {
        panic(err)
    }
    defer resp.Body.Close()
    fmt.Println("response Status:", resp.Status)
}
Master API handler:
func result(c echo.Context) error {
    rf := &ResultFiles{}
    err := c.Bind(rf)
    if err != nil {
        log.Fatal(err)
    }
    rfChannel <- *rf
    return c.JSON(http.StatusOK, nil)
}
My question: why does this happen? Is the problem in Go's standard http.Client, or is it a timeout issue? If I set a timeout, the code just crashes with a timeout error, which is what I expected.
Thank you!

You need to add a timeout to your http.Client. By default, http.Client has its timeout set to 0, which means no timeout at all, so if the server is not responding your application will just hang waiting for a response. This problem is described in detail in the article Don’t use Go’s default HTTP client (in production). Even though you create a custom client, you still need to specify a timeout.
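For example, a minimal sketch of the same client with an explicit timeout (the 10-second value is an illustrative choice, not taken from the question):
client := &http.Client{
    Timeout: 10 * time.Second, // illustrative value; tune for your workload
}
resp, err := client.Do(req)
if err != nil {
    // a hung or unreachable server now surfaces here as an error
    // instead of blocking forever
    panic(err)
}
defer resp.Body.Close()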

The problem turned out to be the channel: the slave sends its result to the master, and the master's handler writes it into a channel, but nothing was reading from that channel in a loop, so the second send blocked the handler and the slave never got a response. I added a loop that reads the data from the channel and now everything works.
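For reference, a minimal sketch of what the missing consumer loop on the master could look like (the goroutine and the processResult name are illustrative assumptions, not code from the question):
go func() {
    // drain rfChannel continuously so the handler's send never blocks
    for rf := range rfChannel {
        processResult(rf) // hypothetical function that handles each result
    }
}()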

Related

Stream data from API

I am trying to pull data on mails coming into an API from the email testing tool MailHog.
If I use a call to get a list of emails, e.g.
GET /api/v1/messages
I can load this data into a struct with no issues and print out values I need.
However, if I use a different endpoint that is essentially a stream of new emails coming in, I get different behavior: when I run my Go application I get no output whatsoever.
Do I need something like a while loop to constantly listen to the endpoint to get the output?
My end goal is to pull some information from emails as they come in and then pass them into a different function.
Here is me trying to access the streaming endpoint
https://github.com/mailhog/MailHog/blob/master/docs/APIv1.md
res, err := http.Get("http://localhost:8084/api/v1/events")
if err != nil {
    panic(err.Error())
}
body, err := ioutil.ReadAll(res.Body)
if err != nil {
    panic(err.Error())
}
var data Email
json.Unmarshal(body, &data)
fmt.Printf("Email: %v\n", data)
If I do a curl request to the MailHog service with the same endpoint, I do get output as mails come in. However, I can't seem to figure out why I get no output from my Go app. The app stays running; I just don't get any output.
I am new to Go, so apologies if this is a really simple question.
From ioutil.ReadAll documentation:
ReadAll reads from r until an error or EOF and returns the data it read.
When you use it to read the body of a regular endpoint, it works because the payload effectively has an EOF: the server uses the Content-Length header to say how many bytes the response body has, and once the client has read that many bytes it knows it has read the whole body and can stop.
Your "streaming" endpoint doesn't use Content-Length, though, because the body has an unknown size; it is supposed to write events as they come, so you can't use ReadAll in this case. Usually, in this case, you are supposed to read line by line, where each line represents an event. bufio.Scanner does exactly that:
res, err := http.Get("http://localhost:8084/api/v1/events")
if err != nil {
    panic(err.Error())
}
defer res.Body.Close()
scanner := bufio.NewScanner(res.Body)
for scanner.Scan() {
    event := scanner.Bytes()
    var data Email
    json.Unmarshal(event, &data)
    fmt.Printf("Email: %v\n", data)
}
if err := scanner.Err(); err != nil {
    panic(err.Error())
}
curl can process the response as you expect because it detects that the endpoint streams data and reacts accordingly. It may be helpful to add the response curl gets to the question.

Approach to send a large JSON payload to a web service

Consider a small Go application that reads a large JSON file (2GB+), unmarshals the JSON data into a struct, and POSTs the JSON data to a web service endpoint.
The web service receiving the payload has changed its functionality and now has a limit of 25MB per payload. What would be the best approach to overcome this issue using Go? I've thought of the following, however I'm not sure it is the best approach:
Creating a function to split the large JSON file into multiple smaller ones (up to 20MB each), and then iterating over the files, sending multiple smaller requests.
A function similar to the one currently being used to send the entire JSON payload:
func sendDataToService(data StructData) {
    payload, err := json.Marshal(data)
    if err != nil {
        log.Println("ERROR:", err)
    }
    request, err := http.NewRequest("POST", endpoint, bytes.NewBuffer(payload))
    if err != nil {
        log.Println("ERROR:", err)
    }
    client := &http.Client{}
    response, err := client.Do(request)
    log.Println("INFORMATIONAL:", request)
    if err != nil {
        log.Println("ERROR:", err)
    }
    defer response.Body.Close()
}
You can break the input into chunks and send each piece individually:
dec := json.NewDecoder(inputStream)
tok, err := dec.Token()
if err != nil {
    return err
}
if tok == json.Delim('[') {
    for {
        var obj json.RawMessage
        if err := dec.Decode(&obj); err != nil {
            return err
        }
        // Here, obj contains one element of the array. You can send this
        // to the server.
        if !dec.More() {
            break
        }
    }
}
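As an aside, a minimal sketch of what "send this to the server" could look like for each decoded element (the endpoint, the header, and the one-request-per-element batching are assumptions, not part of the original answer):
// sendChunk posts one decoded array element to the service as its own JSON payload.
func sendChunk(client *http.Client, endpoint string, obj json.RawMessage) error {
    req, err := http.NewRequest("POST", endpoint, bytes.NewReader(obj))
    if err != nil {
        return err
    }
    req.Header.Set("Content-Type", "application/json")
    resp, err := client.Do(req)
    if err != nil {
        return err
    }
    defer resp.Body.Close()
    if resp.StatusCode >= 300 {
        return fmt.Errorf("unexpected status: %s", resp.Status)
    }
    return nil
}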
As the server-side can process data progressively, I assume that the large JSON object can be split into smaller pieces. From this point, I can propose several options.
Use HTTP requests
Pros: Pretty simple to implement on the client-side.
Cons: Making hundreds of HTTP requests might be slow. You will also need to handle timeouts - this is additional complexity.
Use WebSocket messages
If the receiving side supports WebSockets, a step-by-step flow will look like this:
Split the input data into smaller pieces.
Connect to the WebSocket server.
Start sending messages with the smaller pieces till the end of the file.
Close connection to the server.
This solution might be more performant as you won't need to connect and disconnect from the server each time you send a message, as you'd do with HTTP.
However, both solutions suppose that you need to assemble all pieces on the server-side. For example, you would probably need to send along with the data a correlation ID to let the server know what file you are sending right now and a specific end-of-file message to let the server know when the file ends. In the case of the WebSocket server, you could assume that the entire file is sent during a single connection session if it is relevant.
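For illustration only, a rough sketch of that WebSocket flow using the gorilla/websocket package (the URL, the pre-split chunks variable, and the end-of-file marker are assumptions, not part of the original answer):
conn, _, err := websocket.DefaultDialer.Dial("ws://example.com/upload", nil)
if err != nil {
    log.Fatal(err)
}
defer conn.Close()
// chunks is assumed to hold the pre-split [][]byte pieces of the file
for _, chunk := range chunks {
    if err := conn.WriteMessage(websocket.BinaryMessage, chunk); err != nil {
        log.Fatal(err)
    }
}
// hypothetical end-of-file marker so the server knows the upload is complete
if err := conn.WriteMessage(websocket.TextMessage, []byte("EOF")); err != nil {
    log.Fatal(err)
}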

Unmarshalling JSON with Duplicate Fields

I'm still learning the Go language, but I've been trying to find some practical things to work on to get a better handle on it. Currently, I'm trying to build a simple program that goes to a YouTube channel and returns some information by taking the public JSON and unmarshalling it.
Thus far I've tried making a completely custom struct that only has a few fields in it, but that doesn't seem to pull in any values. I've also tried using tools like https://mholt.github.io/json-to-go/ and getting the "real" struct that way. The issue with that method is that there are numerous duplicates and I don't know enough to really assess how to tackle that.
This is an example JSON (I apologize for its size): https://pastebin.com/6u0b39tU
This is the struct that I get from the above tool: https://pastebin.com/3ZCu96st
The basic pattern of code I've tried is:
jsonFile, err := os.Open("test.json")
if err != nil {
    fmt.Println("Couldn't open file", err)
}
defer jsonFile.Close()
bytes, _ := ioutil.ReadAll(jsonFile)
var channel Autogenerated
json.Unmarshal(bytes, &Autogenerated)
if err != nil {
    fmt.Println("Failed to Unmarshal", err)
}
fmt.Println(channel.Fieldname)
Any feedback on the correct approach for how to handle something like this would be great. I get the feeling I'm just completely missing something.
In your code, you are not unmarshaling into the channel variable. Furthermore, you can optimize your code to not use ReadAll. Also, don't forget to check for errors (all errors).
Here is an improvement to your code.
jsonFile, err := os.Open("test.json")
if err != nil {
    log.Fatalf("could not open file: %v", err)
}
defer jsonFile.Close()

var channel Autogenerated
if err := json.NewDecoder(jsonFile).Decode(&channel); err != nil {
    log.Fatalf("failed to parse json: %v", err)
}
fmt.Println(channel.Fieldname)
Notice how a reference to channel is passed to Decode.

Apex "Process exited before completing request" with Golang

All the answers for something like this are in JavaScript, and I'm not sure whether they apply to Go.
I've done this
func main() {
    db, err := sql.Open("mysql", "db_details")
    err = db.Ping()
    if err != nil {
        fmt.Println("Failed to prepare connection to database")
        // log.Fatal("Error:", err.Error())
    }
    apex.HandleFunc(func(event json.RawMessage, ctx *apex.Context) (interface{}, error) {
        fmt.Println(ctx)
        return map[string]string{"hello": "world"}, nil
    })
}
So I'm trying to hit my Amazon RDS MySQL database using Go's sql driver.
I get this error
Error: function response: Response Id: <some_id> Process exited before completing request
From looking around, there are two likely causes: 1. I need Go's equivalent of context.Done, or 2. I need to raise the timeout.
As I'm using Apex, I raised the timeout to be 300s, which is the maximum. No luck there.
I then tried going through the Apex code to see if there was a Context.Done defined or used anywhere - there isn't.
How do I get around this?
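For what it's worth, a minimal sketch of the first idea (bounding the database call with a context timeout, using only the standard library; the 5-second value and placing the check inside the handler are assumptions):
ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
defer cancel()
// PingContext returns an error instead of blocking indefinitely if the
// database cannot be reached before the deadline expires.
if err := db.PingContext(ctx); err != nil {
    return nil, err
}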

Posting json data to remote server with http.Post

I am stumped on what seems like a very simple problem.
I am receiving a json object from a client. It looks like this:
{
"user": "test#example.com"
}
I need to simply pass this on to another part of the API as a POST request. This is what I've got so far:
// Decode incoming JSON
decoder := json.NewDecoder(r.Body)
var user UserInformation
err := decoder.Decode(&user)
if err != nil {
    log.Println(err)
}
jsonUser, _ := json.Marshal(user)
log.Println(string(jsonUser[:])) // Correct output
buffer := bytes.NewBuffer(jsonUser)
log.Println(string(buffer.Bytes()[:])) // Also correct
resp, err := http.Post("http://example.com/api/has_publisher", "application/json", buffer)
if err != nil {
    log.Println(err)
}
As I cannot test this program on a live system, I verified the resulting POST request with Wireshark, only to find that the content is missing and Content-Length is 0. For some reason, http.Post doesn't seem to read from the buffer.
Am i missing something here? I would greatly appreciate if someone could point me in the right direction.
Thanks!
This shouldn't be the root cause, but replace
buffer := bytes.NewBuffer(jsonUser)
with
buffer := bytes.NewReader(jsonUser)
It is more likely that your test setup is the root cause. I assume you are pointing to a non-existent endpoint. This would result in a failure (the TCP SYN fails) before the actual HTTP POST is sent.
Check if you can use mockable.io as an alternative to mock your backend.
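Since the question mentions not being able to test against a live system, here is a small sketch of verifying the outgoing body locally with net/http/httptest (the logging handler and the reuse of jsonUser are illustrative assumptions):
// Spin up a throwaway local server that logs what it receives.
ts := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
    body, _ := ioutil.ReadAll(r.Body)
    log.Printf("server saw Content-Length=%d body=%s", r.ContentLength, body)
}))
defer ts.Close()

resp, err := http.Post(ts.URL, "application/json", bytes.NewReader(jsonUser))
if err != nil {
    log.Println(err)
}
defer resp.Body.Close()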