Insert a slice result JSON into MongoDB

I'm using the mgo driver for MongoDB, with the Gin framework.
type Users struct {
	User_id *string  `json:"id user" bson:"id user"`
	Images  []string `json:"images" bson:"images"`
}
I have this function, which tries to convert the slice into JSON; the slice here is UsersTotal:
func GetUsersApi(c *gin.Context) {
	UsersTotal, err := GetUsers()
	if err != nil {
		fmt.Println("error:", err)
	}
	c.JSON(http.StatusOK, gin.H{
		"Count Users": len(UsersTotal),
		"Users Found ": UsersTotal,
	})
	session, err := mgo.Dial(URL)
	if err == nil {
		fmt.Println("Connection to mongodb established ok!!")
		cc := session.DB("UsersDB").C("results")
		err22 := cc.Insert(&UsersTotal)
		if err22 != nil {
			fmt.Println("error insertion ", err22)
		}
	}
	session.Close()
}
Running it I get the following error:
error insertion Wrong type for documents[0]. Expected a object, got a array.

Inserting multiple documents is the same as inserting a single one because the Collection.Insert() method has a variadic parameter:
func (c *Collection) Insert(docs ...interface{}) error
One thing to note is that it expects interface{} values. A value of any type can be passed as an interface{}. Another thing to note is that only a slice of type []interface{} qualifies as a []interface{}; a user slice such as []Users does not. For details, see Type converting slices of interfaces in go.
So simply create a copy of your users slice where the copy has the type []interface{}, which you can pass directly to Collection.Insert():
docs := make([]interface{}, len(UsersTotal))
for i, u := range UsersTotal {
	docs[i] = u
}
err := cc.Insert(docs...)
// Handle error
Also, please do not connect to MongoDB in your handler. Do it once, on app startup, store the global connection / session, and clone / copy it when needed. For details, see mgo - query performance seems consistently slow (500-650ms); and too many open files in mgo go server.
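For illustration, here is a minimal sketch of that pattern, assuming the URL constant and GetUsers() from the question (error handling kept short):
var mongoSession *mgo.Session // dialed once at startup

func init() {
	var err error
	mongoSession, err = mgo.Dial(URL)
	if err != nil {
		log.Fatal("cannot connect to MongoDB: ", err)
	}
}

func GetUsersApi(c *gin.Context) {
	// Copy the global session for this request; close the copy when done.
	session := mongoSession.Copy()
	defer session.Close()

	UsersTotal, err := GetUsers()
	if err != nil {
		fmt.Println("error:", err)
	}

	// Convert to []interface{} so it fits Insert's variadic parameter.
	docs := make([]interface{}, len(UsersTotal))
	for i, u := range UsersTotal {
		docs[i] = u
	}
	if err := session.DB("UsersDB").C("results").Insert(docs...); err != nil {
		fmt.Println("error insertion ", err)
	}

	c.JSON(http.StatusOK, gin.H{
		"Count Users": len(UsersTotal),
		"Users Found ": UsersTotal,
	})
}
Session.Copy gives each request its own socket from the pool, which avoids both the per-request dial cost and the file-descriptor exhaustion described in the linked questions.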

Related

Is it possible to Unmarshal a JSON which has a varying field?

I am trying to get League of Legends champion information from the LoL static database. The link is given below.
Get info for specific hero
The problem is that I can only make requests by hero name, and all JSON responses differ from each other by only one field, which is a "main" field: the hero name itself, the single key under data.
My goal is to get all hero information by iterating over a slice of known hero names. You can check the code below. I need only a couple of fields, but that main key varies with every new request.
func GetHeroInfo(heroName string) *LolHeroInfo {
	getUrl := fmt.Sprintf("http://ddragon.leagueoflegends.com/cdn/12.2.1/data/en_US/champion/%s.json", heroName)
	fmt.Println(getUrl)
	resp, err := http.Get(getUrl)
	if err != nil {
		fmt.Println(err)
		return nil
	}
	defer resp.Body.Close()
	read, err := io.ReadAll(resp.Body)
	if err != nil {
		fmt.Println(err)
		return nil
	}
	heroGoverted := LolHeroInfo{}
	err = json.Unmarshal(read, &heroGoverted)
	if err != nil {
		fmt.Println("unmarshall failed:", err)
		return nil
	}
	return &heroGoverted
}
The struct type LolHeroInfo was generated from the response with mholt's JSON-to-Go tool.
You can check the JSON response for another hero, e.g. the JSON response for Annie.
Is there any way to make a struct field/tag agnostic to the JSON key? I believe this will be very hard, because the encoding/json package has to match each field against a particular key in the JSON, but maybe you have encountered this kind of problem before. Creating a separate struct for each hero is impossible, so I will drop this case if I can't find a solution.
Thanks in advance for help.
Since you know it's just a single unknown key, you could just decode into a map[string]LolHeroInfo for the Data field, then do
heroGoverted := LolHeroInfo{}
for _, v := range decoded.Data {
	heroGoverted = v
}
To solve the problem, I used @dave's solution.
I broke the main struct into two separate structs; this way the varying JSON field is eliminated:
type LolHeroInfo struct {
	Type    string                    `json:"type"`
	Format  string                    `json:"format"`
	Version string                    `json:"version"`
	Data    map[string]HeroInfoStruct `json:"data"`
}

heroInfo := lib.GetHeroInfo(lolHeroes[i])
for _, v := range heroInfo.Data { // unmarshalled into the first struct
	a := v              // data field; second struct
	fmt.Println(a.Lore) // now I can reach every field that I need
}
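Condensed into a standalone sketch (the HeroInfoStruct fields shown are a subset for illustration; the lore field is assumed from the Data Dragon response):
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

// Outer envelope: the varying hero name is absorbed by the map key.
type LolHeroInfo struct {
	Type    string                    `json:"type"`
	Format  string                    `json:"format"`
	Version string                    `json:"version"`
	Data    map[string]HeroInfoStruct `json:"data"`
}

// Inner struct: only the fields needed here (illustrative subset).
type HeroInfoStruct struct {
	Name string `json:"name"`
	Lore string `json:"lore"`
}

func main() {
	resp, err := http.Get("http://ddragon.leagueoflegends.com/cdn/12.2.1/data/en_US/champion/Annie.json")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	var hero LolHeroInfo
	if err := json.Unmarshal(body, &hero); err != nil {
		panic(err)
	}
	for name, v := range hero.Data { // single entry; the key is the hero name
		fmt.Println(name, "-", v.Lore)
	}
}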

Parsing data from the AWS API using Golang

I am using the Connect API to get all the contact flows from a particular instance and want to store them in a DynamoDB.
type contactFlow struct {
	Arn             string
	ContactFlowType string
	Id              string
	Name            string
}
func HandleRequest(ctx context.Context) (string, error) {
	var contactFlowDetails []contactFlow
	mySession := session.Must(session.NewSession())
	connectSession := connect.New(mySession)
	connectInstance := &connect.ListContactFlowsInput{
		InstanceId: aws.String("INSTANCE_ID"),
	}
	connectResult, connectError := connectSession.ListContactFlows(connectInstance)
	if connectError != nil {
		return "", connectError
	}
	connectResultFlow := connectResult.ContactFlowSummaryList
	connectFlowSummaryList := awsutil.Prettify(connectResultFlow)
	fmt.Println(connectFlowSummaryList)
	json.Unmarshal([]byte(connectFlowSummaryList), &contactFlowDetails)
	fmt.Println(contactFlowDetails)
	return "", nil
}
The API that I am trying to use is this: https://docs.aws.amazon.com/sdk-for-go/api/service/connect/#ListContactFlowsOutput
I do get the result when I print out connectFlowSummaryList in the CloudWatch Logs, but contactFlowDetails always prints as an empty array [].
Edit 1: I think I found the potential problem while doing this decoding. The result in the logs looks something like this:
[
  {
    Arn: "INSTANCE_ID",
    ContactFlowType: "AGENT_WHISPER",
    Id: "CONTACT_FLOW_ID",
    Name: "Default agent whisper"
  }
]
The keys of the result are not wrapped in double quotes. How could I go about decoding such a result?
Thanks!
That output is the problem: awsutil.Prettify produces a Go-style dump meant for logging, not valid JSON, which is why the keys are unquoted and json.Unmarshal can't decode it. If you need a JSON string, marshal connectResult.ContactFlowSummaryList with encoding/json instead of feeding Prettify's output to Unmarshal.
You can also skip awsutil.Prettify completely, to arrive at this:
connectResultFlow := connectResult.ContactFlowSummaryList
b, err := json.Marshal(connectResultFlow)
if err != nil {
	return "", err
}
if err := json.Unmarshal(b, &contactFlowDetails); err != nil {
	return "", err
}
fmt.Println(contactFlowDetails)
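If the intermediate JSON string isn't needed at all, the round trip can be skipped by copying fields straight off the SDK structs. A sketch, assuming the contactFlow struct from the question (ContactFlowSummary exposes Arn, ContactFlowType, Id and Name as *string):
for _, s := range connectResult.ContactFlowSummaryList {
	contactFlowDetails = append(contactFlowDetails, contactFlow{
		Arn:             aws.StringValue(s.Arn), // aws.StringValue safely dereferences a *string
		ContactFlowType: aws.StringValue(s.ContactFlowType),
		Id:              aws.StringValue(s.Id),
		Name:            aws.StringValue(s.Name),
	})
}
This avoids the marshal/unmarshal cost and keeps the nil-pointer handling in one place.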

Inserting JSON or a map from map[string]interface{} to MongoDB collection sets ints and floats as strings

I know the title seems generic and a duplicate, but I've tried many of the options from previous questions, and I can't use a struct here.
My system uses the messaging service NATS to send maps between a publisher and a subscriber. The subscriber takes the received map and inserts it as a document into a MongoDB collection.
The problem I have is that floats and ints are inserted as strings!
In my code, the recipe is a configuration file that sets the datatypes of the columns received in the map. Think of it as a series of keys like this:
String column: "string",
Int column: "int"
Here's the code that creates the map with the right datatypes
mapWithCorrectDataTypes := make(map[string]interface{})
for columnNameFromDataTypesInRecipe, datatypeForColumnInRecipe := range dataTypesFromRecipeForColumns {
	for natsMessageColumn, natsMessageColumnValue := range mapFromNATSMessage {
		// If the column in the NATS message is found in the recipe, format the data as dictated in the recipe
		if natsMessageColumn == columnNameFromDataTypesInRecipe {
			if datatypeForColumnInRecipe.(string) == "string" {
				mapWithCorrectDataTypes[columnNameFromDataTypesInRecipe] = natsMessageColumnValue.(string)
			}
			if datatypeForColumnInRecipe.(string) == "int" {
				convertedInt, err := strconv.Atoi(mapFromNATSMessage[columnNameFromDataTypesInRecipe].(string))
				if err != nil {
					fmt.Println("ERROR -->", err)
				}
				mapWithCorrectDataTypes[columnNameFromDataTypesInRecipe] = convertedInt
			}
			if datatypeForColumnInRecipe.(string) == "float64" {
				convertedFloat, err := strconv.ParseFloat(mapFromNATSMessage[columnNameFromDataTypesInRecipe].(string), 64)
				if err != nil {
					fmt.Println("ERROR -->", err)
				}
				mapWithCorrectDataTypes[columnNameFromDataTypesInRecipe] = convertedFloat
				fmt.Println("TYPE -->", reflect.TypeOf(mapWithCorrectDataTypes[columnNameFromDataTypesInRecipe]))
			}
		} else {
			// If the column is not found in the recipe, format it as a string
			mapWithCorrectDataTypes[natsMessageColumn] = natsMessageColumnValue.(string)
		}
	}
}
The print statement on the last line checks that the datatype for the float64 key in the map is correct, and it passes this test!
My question is this: if the datatypes are correctly set in the map, why are the floats and ints stored as strings when the map is inserted as a document in MongoDB?
What I have tried so far:
Marshalling and unmarshalling the map as an interface, then inserting the record:
jsonVersionOfMap, err := json.Marshal(mapWithCorrectDataTypes)
if err != nil {
	fmt.Println("ERROR -->", err)
}
var interfaceForJSON interface{}
json.Unmarshal(jsonVersionOfMap, &interfaceForJSON)
fmt.Println("JSON -->", interfaceForJSON)
err = mongoConnection.Insert(interfaceForJSON)
if err != nil {
	fmt.Println("Error inserting MongoDB documents", err)
}
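As a sanity check, a JSON round trip like the one above does preserve number types; if strings come out, strings went in. A minimal, standalone sketch:
m := map[string]interface{}{"name": "x", "count": 42, "value": 3.14}
b, _ := json.Marshal(m)
var out interface{}
json.Unmarshal(b, &out)
for k, v := range out.(map[string]interface{}) {
	fmt.Printf("%s: %v (%T)\n", k, v, v) // numbers come back as float64, never as string
}
So if the inserted document shows strings, the values in mapWithCorrectDataTypes were still strings when Insert ran, i.e. the recipe branch for those columns never matched.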
What am I missing here?
(A screenshot of the resulting document showed the numeric fields stored as strings.)
This may not be a proper fix, but I've resolved the issue I was having. I'm using a publisher and a subscriber via NATS. Previously I was creating a map with all the data, sending that out as a message, and having the subscriber take the map from the message and process the datatypes (on the subscriber side).
To fix the problem, I instead formatted the map's values on the publisher side: I moved the code that checks the datatypes over to the NATS publisher, away from the code that processes the incoming message. That way the values already have their real types when the message is encoded, and they arrive as JSON numbers rather than strings.
I understand this isn't an ideal solution, but if you're using NATS and find you're having the same issue, try this.
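A minimal sketch of that publisher-side conversion, assuming the nats.go client (the subject name and column values are illustrative):
package main

import (
	"encoding/json"
	"log"

	"github.com/nats-io/nats.go"
)

func main() {
	nc, err := nats.Connect(nats.DefaultURL)
	if err != nil {
		log.Fatal(err)
	}
	defer nc.Close()

	// Convert values to their real types before encoding, so they travel
	// as JSON numbers instead of strings.
	payload := map[string]interface{}{
		"name":  "sensor-1", // string column
		"count": 42,         // int column
		"value": 3.14,       // float64 column
	}
	data, err := json.Marshal(payload)
	if err != nil {
		log.Fatal(err)
	}
	if err := nc.Publish("results", data); err != nil {
		log.Fatal(err)
	}
}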

Golang parse complex json

I am new to Go and JSON and currently struggle to parse the JSON output from a system.
I've read a couple of blog posts on dynamic JSON in Go and also tried tools like json2GoStructs.
Parsing my JSON file with these tools just gave me a huge struct, which I found a bit messy. Also, I had no idea how to get the info I'm interested in.
So, here are my problems:
How do I get to the info I am interested in?
What is the best approach to parsing complex JSON?
I am only interested in the following 3 JSON fields:
Name
Guid
Task -> Property -> Name: Error
I'm thankful for every tip, code snippet or explanation!
This is what I have so far (mostly from a tutorial):
package main

import (
	"encoding/json"
	"fmt"
	"io/ioutil"
)

func checkErr(err error) {
	if err != nil {
		panic(err)
	}
}

func readFile(filePath string) []byte {
	data, err := ioutil.ReadFile(filePath)
	checkErr(err)
	return data
}

func main() {
	path := "/Users/andi/Documents/tmp/wfsJob.json"
	data := readFile(path)
	var f interface{}
	err := json.Unmarshal(data, &f)
	checkErr(err)
	m := f.(map[string]interface{})
	for k, v := range m {
		switch vv := v.(type) {
		case string:
			fmt.Println(k, "is string", vv)
		case float64: // encoding/json decodes JSON numbers into float64, not int
			fmt.Println(k, "is float64", vv)
		case []interface{}:
			fmt.Println(k, "is an array:")
			for i, u := range vv {
				fmt.Println(i, u)
			}
		default:
			fmt.Println(k, "is of a type I don't know how to handle")
		}
	}
}
I can offer you an easy way of using JSON in Golang: the gjson package. With this tool you don't need to parse the whole JSON file, and you can use it without a struct.
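A hedged sketch of what that looks like for the three fields from the question (the gjson paths are assumptions about the document shape and may need adjusting):
import (
	"fmt"

	"github.com/tidwall/gjson"
)

func printJobFields(data []byte) {
	if !gjson.ValidBytes(data) {
		fmt.Println("invalid JSON")
		return
	}
	// Paths are illustrative; adjust them to the real nesting of the job file.
	fmt.Println("Name:", gjson.GetBytes(data, "Name").String())
	fmt.Println("Guid:", gjson.GetBytes(data, "Guid").String())
	fmt.Println("Error:", gjson.GetBytes(data, "Task.Property.Name").String())
}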
Gjson is a great solution for fetching a few fields from a JSON string. But it may become slow when many (more than 2) fields must be fetched from distinct parts of the JSON, since it re-parses the JSON on each Get call. Additionally, it requires calling gjson.Valid to validate the incoming JSON, since the other methods assume the caller provides valid JSON.
There is an alternative package, fastjson. Like gjson, it is fast and has a nice API. Unlike gjson, it validates the input JSON and works faster when many unrelated fields must be obtained from the JSON. Here is sample code for obtaining the fields from the original question:
var p fastjson.Parser
v, err := p.ParseBytes(data)
if err != nil {
	log.Fatal(err)
}
fmt.Printf("Name: %s\n", v.GetStringBytes("Name"))
fmt.Printf("Guid: %s\n", v.GetStringBytes("Guid"))
fmt.Printf("Error: %s\n", v.GetStringBytes("Task", "Property", "Name"))

StructScan unknown struct slice [GO]

So I would like to fill any struct via the StructScan method, reading whatever data I get from the db into the corresponding struct that I feed to the test function.
This script doesn't give any compile error (if you implement the other stuff like a db connection and so on), but the StructScan method still returns an error telling me that it expects a slice of structs.
How do I create a slice of structs that I don't know the type of?
Thanks for any advice.
package main

import (
	"database/sql"

	"github.com/jmoiron/sqlx"
)

var db *sql.DB

type A struct {
	Name string `db:"name"`
}

type B struct {
	Name string `db:"name"`
}

func main() {
	testA := []A{}
	testB := []B{}
	test(testA, "StructA")
	test(testB, "StructB")
}

func test(dataStruct interface{}, name string) {
	rows, err := db.Query("SELECT * FROM table WHERE name = ?", name)
	if err != nil {
		panic(err)
	}
	for rows.Next() {
		err := sqlx.StructScan(rows, &dataStruct)
		if err != nil {
			panic(err)
		}
	}
}
Super late to the party, but I ran into this question while researching another issue. For others who stumble upon it: the problem is that you're passing a pointer to dataStruct into StructScan(). dataStruct is an interface, and pointers to interfaces are almost always an error in Go (in fact, the automatic dereferencing of interface pointers was removed a few versions back). You're also passing in your destination by value.
So you are passing a pointer to an interface that holds a copy of your destination slice, when what you want instead is to pass the interface directly, with that interface holding a pointer to your destination slice.
Instead of:
test(testA, "StructA")
test(testB, "StructB")
// ...
err := sqlx.StructScan(rows, &dataStruct)
Use:
test(&testA, "StructA")
test(&testB, "StructB")
// ...
err := sqlx.StructScan(rows, dataStruct)
If you have no idea what the destination struct type is, use sqlx.MapScan or sqlx.SliceScan. They don't map to a struct, but both return all the columns from the query result.
See http://jmoiron.github.io/sqlx/#altScanning
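Putting the fix together, a sketch of the corrected function (the query and types are from the question; note that sqlx.StructScan iterates the rows itself, so the rows.Next() loop goes away):
func test(dest interface{}, name string) {
	// dest must be a pointer to a slice of structs, e.g. &testA.
	rows, err := db.Query("SELECT * FROM table WHERE name = ?", name)
	if err != nil {
		panic(err)
	}
	defer rows.Close()
	// StructScan consumes all rows and appends each one to the slice.
	if err := sqlx.StructScan(rows, dest); err != nil {
		panic(err)
	}
}
Called as test(&testA, "StructA") and test(&testB, "StructB").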