Error when sending blob of binary data to dynamodb - json

I'm running into an issue when attempting to manage a DynamoDB instance using godynamo.
My code is meant to take a gob-encoded byte slice and put it into DynamoDB.
func (c *checkPointManager) CommitGraph(pop *Population) {
    var blob, err = pop.GobEncodeColorGraphs()
    fitness := pop.GetTotalFitness()
    if err != nil {
        log.Fatal(err)
    }
    put1 := put.NewPutItem()
    put1.TableName = "CheckPoint"
    put1.Item["fitnessScore"] = &attributevalue.AttributeValue{N: string(fitness)}
    put1.Item["population"] = &attributevalue.AttributeValue{N: string(1)}
    put1.Item["graph"] = &attributevalue.AttributeValue{B: string(blob)}
    body, code, err := put1.EndpointReq()
    if err != nil || code != http.StatusOK {
        log.Fatalf("put failed %d %v %s\n", code, err, body)
    }
    fmt.Printf("values checkpointed: %d\n %v\n %s\n", code, err, body)
}
Every time I run this code though, I get the following error.
can not be converted to a Blob: Base64 encoded length is expected a multiple of 4 bytes but found: 25
Does godynamo not handle making sure a binary array specifically converts to base64? Is there an easy way for me to handle this issue?

"Client applications must encode binary values in base64 format" according to the binary data type description of Amazon DynamoDB Data Types.
Your code could encode the value itself if you want; see Go's base64 package:
https://golang.org/pkg/encoding/base64
The godynamo library provides functions that will encode it for you, have a look at AttributeValue:
// InsertB_unencoded adds a new plain string to the B field.
// The argument is assumed to be plaintext and will be base64 encoded.
func (a *AttributeValue) InsertB_unencoded(k string) error {
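As a rough sketch of how the put from the question might use that helper (the InsertB_unencoded call follows the doc comment quoted above; the second option shows the manual alternative using the standard library's encoding/base64):

    // Option 1: let godynamo base64-encode the raw gob bytes for you.
    av := &attributevalue.AttributeValue{}
    if err := av.InsertB_unencoded(string(blob)); err != nil {
        log.Fatal(err)
    }
    put1.Item["graph"] = av

    // Option 2: encode explicitly and set B yourself.
    put1.Item["graph"] = &attributevalue.AttributeValue{B: base64.StdEncoding.EncodeToString(blob)}

Unrelated to the blob error, but note that the N attributes expect a numeric string, so if fitness is an int you likely want strconv.Itoa(fitness) rather than string(fitness), which converts the integer to a rune.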

Related

How to parse protocol buffer message and create json out of it?

Here is my minimal .proto file:
syntax = "proto3";
message getDhtParams {}
message DhtContents {
string dht_contents=1;
}
service MyApp {
rpc getDhtContent(getDhtParams) returns (DhtContents) {}
}
Two things to note related to the above proto file:
It is a minimal file. There is a lot more to it.
The server is already generated and running. The server is implemented in Python.
I am writing client in Go. And this is the fetching code I have come up with:
func fetchDht() (*pb.DhtContents, error) {
    // Set up a connection to the server.
    address := "localhost:9998"
    conn, err := grpc.Dial(address, grpc.WithTransportCredentials(insecure.NewCredentials()))
    if err != nil {
        log.Fatalf("did not connect: %v", err)
    }
    defer conn.Close()
    client := pb.NewMyAppClient(conn)

    ctx, cancel := context.WithTimeout(context.Background(), time.Second)
    defer cancel()
    r, err := client.GetDhtContent(ctx, &pb.GetDhtParams{})
    if err != nil {
        return nil, errors.New("could not get dht contents")
    }
    return r, nil
}
For the sake of simplicity, I have stripped down the output, but it looks something like this:
dht_contents:"{'node_ids': ['dgxydhlqoopevxv'], 'peer_addrs': [['192.168.1.154', '41457']], 'peer_meta': [{'peer_id': {'nodeID': 'dgxydhlqoopevxv', 'key': 'kdlvjdictuvgxspwkdizqryr', 'mid': 'isocvavbtzkxeigkkrubzkcx', 'public_key': 'uhapwxnfeqqmnojsaijghhic', '_address': 'xklqlebqngpkxb'}, 'ip_addrs': ['192.168.1.154', '41457'], 'services': [{'service_input': '', 'service_output': '', 'price': 0}], 'timestamp': 1661319968}]}"
A few things to note about this response:
It starts with dht_contents: which I know is a field of DhtContents message.
This could be an issue on the server side; in that case I will inform the service developer. But the enclosed JSON is not valid JSON, as it uses single quotes.
My questions:
What is an elegant way to deal with that dht_contents? There must be a protobuf/gRPC way. I aim to get the contents between the double quotes.
How do I convert the content to JSON? I have already created the struct to unmarshal.
It would be enough if I am also able to convert the response which is of type *pb.DhtContents to []byte, from there I can convert it to JSON.
The generated code should have a getter method that returns just the field value, without the dht_contents:" prefix and the trailing " you see in the printed output.
In your case, that method should be called GetDhtContents().
You can modify your fetchDht function to something like this:
func fetchDht() (string, error) {
    address := "localhost:9998"
    // ...
    if err != nil {
        return "", errors.New("could not get dht contents")
    }
    return r.GetDhtContents(), nil
}
From there on, you can work on making it valid JSON by replacing the single quotes with double quotes. Or it may be handled on the service end.
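As a rough illustration of that last step (a naive sketch: blindly swapping quote characters will break if any value contains an apostrophe, so fixing the serialization on the service end is the safer route):

    raw := strings.ReplaceAll(r.GetDhtContents(), "'", "\"")

    var contents map[string]interface{}
    if err := json.Unmarshal([]byte(raw), &contents); err != nil {
        // handle error: still not valid JSON after the substitution
    }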
The code generated from the proto file has getter methods on the result (the "r"), so use r.Get... to get the content, then convert the string to the type you want.
Suggestion:
change the proto field type to bytes
then json.Unmarshal(r.Get...(), &dst)
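A minimal sketch of that suggestion on the Go side (the destination struct is hypothetical and only covers part of the payload shown above; it still assumes the server emits valid, double-quoted JSON): with the field declared as bytes, the generated GetDhtContents() returns []byte that can be passed straight to json.Unmarshal.

    var dst struct {
        NodeIDs   []string   `json:"node_ids"`
        PeerAddrs [][]string `json:"peer_addrs"`
    }
    if err := json.Unmarshal(r.GetDhtContents(), &dst); err != nil {
        // handle error
    }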

In golang json.Unmarshal() works in playground/copy pasted JSON but not in actual code

I am writing a program in Golang that interfaces with a modified version of the barefoot mapmatching library which returns results in json via netcat.
In my actual code, json.Unmarshal only parses the response into the zero value of the struct. But if I print the JSON to the console (see the code snippet below) and copy-paste it into the Go Playground, it behaves as expected.
I am wondering if this is an encoding issue that is bypassed when I copy-paste from the console.
How do I get my code to process the same string as it is received from barefoot as when it is copy pasted from the console?
Here is the relevant code snippet (the structs are identical to the playground version):
body := io_func(conn, cmd)
var obvs []Json_out
json.Unmarshal([]byte(body), &obvs)
fmt.Println(body)
fmt.Println(obvs)
and io_func() if relevant (the response is two lines, with a message on the first and a json string on the second)
func io_func(conn net.Conn, cmd string) string {
    fmt.Fprintf(conn, cmd+"\n")
    r := bufio.NewReader(conn)
    header, _ := r.ReadString('\n')
    if header == "SUCCESS\n" {
        resp, _ := r.ReadString('\n')
        return resp
    } else {
        return ""
    }
}
Following Cerise Limón's advice to properly handle error messages, I determined that json.Unmarshal was parsing the osm_id value in the JSON as a number when the string came from io_func(), although it wasn't doing so when the string was passed in manually in the playground example. I don't understand why that is, but I would have picked it up with proper error handling.
I altered the barefoot code to return the osm_id explicitly in quotes since, although it is only ever composed of digits, I only use it as a string. It now works as expected. Equally, I could have changed the type in the struct and converted in Go as needed.
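For reference, a small sketch of that struct-side alternative (the field name is taken from the question; the rest of the struct is omitted): declaring the field as json.Number lets the decoder accept the bare number and still hand back the digits as a string via its String method.

    type Json_out struct {
        OsmID json.Number `json:"osm_id"` // accepts an unquoted number; OsmID.String() yields the digits
        // ... other fields as before
    }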
The io_func function creates and discards a bufio.Reader, along with any data the reader may have buffered. If the application calls io_func more than once, then the application may be discarding data read from the network. Fix this by creating a single bufio.Reader outside the function and passing that single reader to each invocation of io_func.
Always check and handle errors. The error returned from any of these functions may point you in the right direction for a fix.
func io_func(r *bufio.Reader, conn net.Conn, cmd string) (string, error) {
    fmt.Fprintf(conn, cmd+"\n")
    header, err := r.ReadString('\n')
    if err != nil {
        return "", err
    }
    if header == "SUCCESS\n" {
        return r.ReadString('\n')
    }
    return "", nil
}

...

r := bufio.NewReader(conn)
body, err := io_func(r, conn, cmd)
if err != nil {
    // handle error
}
var obvs []Json_out
err = json.Unmarshal([]byte(body), &obvs)
if err != nil {
    // handle error
}
fmt.Println(body)
fmt.Println(obvs)

// read next
body, err = io_func(r, conn, cmd)
if err != nil {
    // handle error
}
The application uses newline to terminate the JSON body, but newline is valid whitespace in JSON. If the peer includes a newline in the JSON, then the application will read a partial message.

Using go-jsonnet to return pure JSON

I am using Google's go-jsonnet library to evaluate some jsonnet files.
I have a function, like so, which renders a Jsonnet document:
// Takes a list of jsonnet files and imports each one and mixes them with "+"
func renderJsonnet(files []string, param string, prune bool) string {
    // empty slice
    jsonnetPaths := files[:0]

    // range through the files
    for _, s := range files {
        jsonnetPaths = append(jsonnetPaths, fmt.Sprintf("(import '%s')", s))
    }

    // Create a Jsonnet VM
    vm := jsonnet.MakeVM()

    // Join the slices into a jsonnet compat string
    jsonnetImport := strings.Join(jsonnetPaths, "+")

    if param != "" {
        jsonnetImport = "(" + jsonnetImport + ")" + param
    }

    if prune {
        // wrap in std.prune, to remove nulls, empty arrays and hashes
        jsonnetImport = "std.prune(" + jsonnetImport + ")"
    }

    // render the jsonnet
    out, err := vm.EvaluateSnippet("file", jsonnetImport)
    if err != nil {
        log.Panic("Error evaluating jsonnet snippet: ", err)
    }
    return out
}
This function currently returns a string, because the jsonnet EvaluateSnippet function returns a string.
What I now want to do is render the resulting JSON using the go-prettyjson library. However, because the JSON I'm piping in is a string, it's not rendering correctly.
So, some questions:
Can I convert the returned JSON string to a JSON type, without knowing beforehand what struct to unmarshal it into?
if not, can I render the json in a pretty manner some other way?
Is there an option, function or method I'm missing here to make this easier?
Can I convert the returned JSON string to a JSON type, without knowing beforehand what struct to unmarshal it into?
Yes. It's very easy:
var jsonOut interface{}
err := json.Unmarshal([]byte(out), &jsonOut)
if err != nil {
    log.Panic("Invalid json returned by jsonnet: ", err)
}
formatted, err := prettyjson.Marshal(jsonOut)
if err != nil {
    log.Panic("Failed to format jsonnet output: ", err)
}
More info here: https://blog.golang.org/json-and-go#TOC_5.
Is there an option, function or method I'm missing here to make this easier?
Yes. The go-prettyjson library has a Format function which does the unmarshalling for you:
formatted, err := prettyjson.Format([]byte(out))
if err != nil {
    log.Panic("Failed to format jsonnet output: ", err)
}
can I render the json in a pretty manner some other way?
Depends on your definition of pretty. Jsonnet normally outputs every field of an object and every array element on a separate line. This is usually considered pretty printing (as opposed to putting everything on the same line with minimal whitespace to save a few bytes). I suppose this is not good enough for you. You can write your own manifester in jsonnet which formats it to your liking (see std.manifestJson as an example).
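If "pretty" only means consistent indentation (without go-prettyjson's coloring), the standard library can also re-indent the string returned by EvaluateSnippet; a minimal sketch:

    var pretty bytes.Buffer
    if err := json.Indent(&pretty, []byte(out), "", "  "); err != nil {
        log.Panic("Invalid json returned by jsonnet: ", err)
    }
    fmt.Println(pretty.String())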

REST API for uploading file inside JSON

I am designing a REST API to upload a largish (100 MB) file together with some information. So it's natural to think of JSON encoding.
So something like this:
{
    file: content of the file or URL?
    name: string
    description: string
}
The name and description are easy to do with json but I'm not sure how the file content can be added to it.
Also I'm thinking I should use http PUT method. Is this correct?
Incidentally, golang is used to implement this API if it matters.
For a JSON encoding, use a []byte value to hold the file contents. The standard encoding/json package encodes []byte values as base64 strings.
Here's a sketch of how to implement the JSON encoding. Declare a type representing the payload:
type Upload struct {
    Name        string
    Description string
    Content     []byte
}
To encode the file to a request body:
v := Upload{Name: fileName, Description: description, Content: content}
var buf bytes.Buffer
if err := json.NewEncoder(&buf).Encode(v); err != nil {
    // handle error
}
req, err := http.NewRequest("PUT", url, &buf)
if err != nil {
    // handle error
}
resp, err := http.DefaultClient.Do(req)
To decode from the request body on the server:
var v Upload
if err := json.NewDecoder(req.Body).Decode(&v); err != nil {
    // handle error
}
Another option is to use the mime/multipart package. The multipart encoding will be more efficient than JSON encoding because no base64 or other text encoding of the file is required for multipart.
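For comparison, a minimal sketch of the multipart version of the client request (the part names "file", "name" and "description" are arbitrary choices for illustration):

    var buf bytes.Buffer
    w := multipart.NewWriter(&buf)

    fw, err := w.CreateFormFile("file", fileName)
    if err != nil {
        // handle error
    }
    if _, err := fw.Write(content); err != nil {
        // handle error
    }
    if err := w.WriteField("name", fileName); err != nil {
        // handle error
    }
    if err := w.WriteField("description", description); err != nil {
        // handle error
    }
    if err := w.Close(); err != nil {
        // handle error
    }

    req, err := http.NewRequest("PUT", url, &buf)
    if err != nil {
        // handle error
    }
    req.Header.Set("Content-Type", w.FormDataContentType())
    resp, err := http.DefaultClient.Do(req)

On the server, req.MultipartReader() (or ParseMultipartForm) can then stream the file part without a base64 round trip.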
To me, the most clear-cut way to do it would be to encode the file bytes somehow. base64 seems like a good choice, and golang has built-in support for it with "encoding/base64".

Decoding a request body in Go -- Why am I getting an EOF?

I'm using the Beego framework to build a web application, and I'm trying to hand it some JSON encoded data. Roughly, this is what I have:
import (
    "github.com/astaxie/beego"
)

type LoginController struct {
    beego.Controller
}

func (this *LoginController) Post() {
    request := this.Ctx.Request
    length := request.ContentLength
    p := make([]byte, length)
    bytesRead, err := this.Ctx.Request.Body.Read(p)
    if err == nil {
        //blah
    } else {
        //tell me the length, bytes read, and error
    }
}
Per this tutorial, the above Should Just Work (tm).
My problem is this: bytesRead, err := this.Ctx.Request.Body.Read(p) is returning 0 bytes read and the err.Error() is EOF.
The request.ContentLength, however, is a sane number of bytes (19 or more, depending on what data I type in).
I can't figure out why the request would appear to have some length, but would fail on Read. Any ideas?
If you are trying to read a JSON payload in Beego, you'll want to use
this.Ctx.Input.RequestBody
which is a []byte slice containing the sent payload. You can then pass it to a function like:
var datapoint Datapoint
json.Unmarshal(this.Ctx.Input.RequestBody, &datapoint)
Where datapoint is the struct you are attempting to unmarshal your data into.
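Putting that together, a minimal sketch of the handler (the Datapoint struct is hypothetical, the method names follow recent astaxie/beego releases, and depending on your Beego version you may also need copyrequestbody enabled in the app configuration for Ctx.Input.RequestBody to be populated):

    func (this *LoginController) Post() {
        var datapoint Datapoint
        if err := json.Unmarshal(this.Ctx.Input.RequestBody, &datapoint); err != nil {
            // reject the request instead of silently working with a zero value
            this.Ctx.Output.SetStatus(http.StatusBadRequest)
            return
        }
        // use datapoint, then respond
        this.Data["json"] = datapoint
        this.ServeJSON()
    }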