Join range block in go template - json

I have a go template like this:
"environment": [
{{- range $k,$v := .env }}
{
"name": "{{ $k }}",
"value": "{{ $v }}"
},
{{- end }}
]
and I am getting the output below:
"environment": [
{
"name": "name",
"value": "test"
},
{
"name": "region",
"value": "us-east-1"
},
]
and I want to render it like below:
"environment": [
{
"name": "name",
"value": "bxbd"
},
{
"name": "region",
"value": "us-east-1"
}
]
I am not able to get rid of the last comma to make the output valid JSON.
Or is it possible to somehow pass the complete range block to some custom join function?

Here's an example of how to do it with templates, but I strongly recommend using the second approach if you want to generate JSON.
Sticking with templates
Since you're ranging over a map, you can't (simply) do it. With slices you could check the index variable (examples: Go template remove the last comma in range loop; and detect last item inside an array using range inside go-templates), but with maps you can't do that.
Knowing whether you're at the first (or last) iteration is state which you must maintain yourself. One way is to use custom functions or methods for this.
Here's an example implementation:
type Params struct {
Env map[string]string
Counter int
}
// IncMore increments the counter and reports whether more entries follow,
// i.e. whether a comma is still needed after the current element.
func (p *Params) IncMore() bool {
p.Counter++
return p.Counter < len(p.Env)
}
const src = `"environment": [
{{- range $k,$v := .Env }}
{
"name": "{{ $k }}",
"value": "{{ $v }}"
}{{if $.IncMore}},{{end}}
{{- end }}
]`
Testing it:
func main() {
t := template.Must(template.New("").Parse(src))
p := &Params{
Env: map[string]string{
"name": "test",
"region": "us-east-1",
},
}
err := t.Execute(os.Stdout, p)
if err != nil {
panic(err)
}
}
Output (try it on the Go Playground):
"environment": [
{
"name": "name",
"value": "test"
},
{
"name": "region",
"value": "us-east-1"
}
]
Use encoding/json to generate JSON
If your goal is to generate JSON, you should use the encoding/json package to generate a valid JSON document. The above template has no knowledge of JSON syntax and context, and the values of the map entries are not escaped when written to the output, so you may still end up with invalid JSON.
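If you do stick with templates despite this, one way to mitigate the escaping problem is to register a custom function that runs every key and value through json.Marshal. This is only a sketch, reusing the Params type from the first approach; the json function name is an arbitrary choice:

package main

import (
    "encoding/json"
    "os"
    "text/template"
)

// Params is the same helper type as in the template approach above.
type Params struct {
    Env     map[string]string
    Counter int
}

func (p *Params) IncMore() bool {
    p.Counter++
    return p.Counter < len(p.Env)
}

// Every key and value is piped through the json function, which emits
// a quoted, escaped JSON string literal, so no extra quotes are needed.
const srcEscaped = `"environment": [
{{- range $k,$v := .Env }}
{
"name": {{ $k | json }},
"value": {{ $v | json }}
}{{if $.IncMore}},{{end}}
{{- end }}
]`

func main() {
    funcs := template.FuncMap{
        "json": func(v interface{}) (string, error) {
            b, err := json.Marshal(v)
            return string(b), err
        },
    }
    t := template.Must(template.New("").Funcs(funcs).Parse(srcEscaped))
    p := &Params{Env: map[string]string{"name": `va"lue`, "region": "us-east-1"}}
    if err := t.Execute(os.Stdout, p); err != nil {
        panic(err)
    }
}

Even then, the encoding/json approach below is the safer option.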
Best would be to generate the JSON like this:
type Entry struct {
Name string `json:"name"`
Value string `json:"value"`
}
type Params struct {
Env []Entry `json:"environment"`
}
func main() {
enc := json.NewEncoder(os.Stdout)
enc.SetIndent("", " ") // Optional
p := &Params{
Env: []Entry{
{Name: "name", Value: "test"},
{Name: "region", Value: "us-east-1"},
},
}
err := enc.Encode(p)
if err != nil {
panic(err)
}
}
Output (try it on the Go Playground):
{
"environment": [
{
"name": "name",
"value": "test"
},
{
"name": "region",
"value": "us-east-1"
}
]
}
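If your data starts out as a map[string]string, as in the question, you can build the Entry slice from it first. A small sketch that assumes the Entry type above and imports the sort package; sorting the keys keeps the output order deterministic, since Go map iteration order is randomized:

// entriesFromMap converts the env map into the []Entry the encoder expects.
// Keys are sorted so repeated runs produce identical JSON.
func entriesFromMap(env map[string]string) []Entry {
    keys := make([]string, 0, len(env))
    for k := range env {
        keys = append(keys, k)
    }
    sort.Strings(keys)

    entries := make([]Entry, 0, len(env))
    for _, k := range keys {
        entries = append(entries, Entry{Name: k, Value: env[k]})
    }
    return entries
}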

Related

Protobuf custom options not showing in JSON made by protojson library

I'm trying to extract Protobuf custom options from a FileDescriptorSet generated by the protoc compiler. I'm unable to do so using protoreflect. So, I tried to do so using the protojson library.
PS: Importing the Go-generated code is not an option for my use case.
Here's the Protobuf message I'm testing with:
syntax = "proto3";
option go_package = "./protoze";
import "google/protobuf/descriptor.proto";
extend google.protobuf.FieldOptions {
string Meta = 50000;
}
extend google.protobuf.FileOptions {
string Food = 50001;
}
option (Food) = "cheese";
message X {
int64 num = 1;
}
message P {
string Fname = 1 [json_name = "FNAME"];
string Lname = 2 [json_name = "0123", (Meta) = "Yo"];
string Designation = 3;
repeated string Email = 4;
string UserID = 5;
string EmpID = 6;
repeated X z = 7;
}
// protoc --go_out=. filename.proto
Here's how far I got:
package main
import (
"fmt"
"io/ioutil"
"os/exec"
"google.golang.org/protobuf/encoding/protojson"
"google.golang.org/protobuf/proto"
"google.golang.org/protobuf/types/descriptorpb"
)
func main() {
exec.Command("protoc", "-oBinaryFile", "1.proto").Run()
Fset := descriptorpb.FileDescriptorSet{}
byts, _ := ioutil.ReadFile("File")
proto.Unmarshal(byts, &Fset)
byts, _ = protojson.Marshal(Fset.File[0])
fmt.Println(string(byts))
}
And here's the output JSON
{
"name": "1.proto",
"dependency": [
"google/protobuf/descriptor.proto"
],
"messageType": [
{
"name": "X",
"field": [
{
"name": "num",
"number": 1,
"label": "LABEL_OPTIONAL",
"type": "TYPE_INT64",
"jsonName": "num"
}
]
},
{
"name": "P",
"field": [
{
"name": "Fname",
"number": 1,
"label": "LABEL_OPTIONAL",
"type": "TYPE_STRING",
"jsonName": "FNAME"
},
{
"name": "Lname",
"number": 2,
"label": "LABEL_OPTIONAL",
"type": "TYPE_STRING",
"jsonName": "0123",
"options": {}
},
{
"name": "Designation",
"number": 3,
"label": "LABEL_OPTIONAL",
"type": "TYPE_STRING",
"jsonName": "Designation"
},
{
"name": "Email",
"number": 4,
"label": "LABEL_REPEATED",
"type": "TYPE_STRING",
"jsonName": "Email"
},
{
"name": "UserID",
"number": 5,
"label": "LABEL_OPTIONAL",
"type": "TYPE_STRING",
"jsonName": "UserID"
},
{
"name": "EmpID",
"number": 6,
"label": "LABEL_OPTIONAL",
"type": "TYPE_STRING",
"jsonName": "EmpID"
},
{
"name": "z",
"number": 7,
"label": "LABEL_REPEATED",
"type": "TYPE_MESSAGE",
"typeName": ".X",
"jsonName": "z"
}
]
}
],
"extension": [
{
"name": "Meta",
"number": 50000,
"label": "LABEL_OPTIONAL",
"type": "TYPE_STRING",
"extendee": ".google.protobuf.FieldOptions",
"jsonName": "Meta"
},
{
"name": "Food",
"number": 50001,
"label": "LABEL_OPTIONAL",
"type": "TYPE_STRING",
"extendee": ".google.protobuf.FileOptions",
"jsonName": "Food"
}
],
"options": {
"goPackage": "./protoze"
},
"syntax": "proto3"
}
So, data about my custom options showed up under "extension" in the output. But what I really wanted was the value of those custom options in "options" as well (which in my case is (Food) = "cheese", and I want cheese).
Can someone tell me how I can extract my custom options from the FileDescriptorSet using protoreflect or protojson?
I tried hard to extract them using protoreflect but failed!
Although not specifically an answer to how to get the custom options in a generated JSON, I believe I have an answer to what sounds like your underlying question: how to access the custom options without loading the generated Go code. This is thanks to dsnet's answer to my question on the golang issues board. Needless to say all the credit for this tricky solution goes to him. The punchline is to Marshal and then Unmarshal the options using a runtime-populated protoregistry.Types that actually knows about the custom options.
I made a complete demonstration of this approach working in this repo, and the key section (all the guts of which come from dsnet's example) is here:
func main() {
protogen.Options{
}.Run(func(gen *protogen.Plugin) error {
gen.SupportedFeatures = uint64(pluginpb.CodeGeneratorResponse_FEATURE_PROTO3_OPTIONAL)
// The type information for all extensions is in the source files,
// so we need to extract them into a dynamically created protoregistry.Types.
extTypes := new(protoregistry.Types)
for _, file := range gen.Files {
if err := registerAllExtensions(extTypes, file.Desc); err != nil {
panic(err)
}
}
// run through the files again, extracting and printing the Message options
for _, sourceFile := range gen.Files {
if !sourceFile.Generate {
continue
}
// setup output file
outputfile := gen.NewGeneratedFile("./out.txt", sourceFile.GoImportPath)
for _, message := range sourceFile.Messages {
outputfile.P(fmt.Sprintf("\nMessage %s:", message.Desc.Name()))
// The MessageOptions as provided by protoc does not know about
// dynamically created extensions, so they are left as unknown fields.
// We round-trip marshal and unmarshal the options with
// a dynamically created resolver that does know about extensions at runtime.
options := message.Desc.Options().(*descriptorpb.MessageOptions)
b, err := proto.Marshal(options)
if err != nil {
panic(err)
}
options.Reset()
err = proto.UnmarshalOptions{Resolver: extTypes}.Unmarshal(b, options)
if err != nil {
panic(err)
}
// Use protobuf reflection to iterate over all the extension fields,
// looking for the ones that we are interested in.
options.ProtoReflect().Range(func(fd protoreflect.FieldDescriptor, v protoreflect.Value) bool {
if !fd.IsExtension() {
return true
}
outputfile.P(fmt.Sprintf("Value of option %s is %s",fd.Name(), v.String()))
// Make use of fd and v based on their reflective properties.
return true
})
}
}
return nil
})
}
// Recursively register all extensions into the provided protoregistry.Types,
// starting with the protoreflect.FileDescriptor and recursing into its MessageDescriptors,
// their nested MessageDescriptors, and so on.
//
// This leverages the fact that both protoreflect.FileDescriptor and protoreflect.MessageDescriptor
// have identical Messages() and Extensions() methods, allowing a single function to handle the recursion.
func registerAllExtensions(extTypes *protoregistry.Types, descs interface {
Messages() protoreflect.MessageDescriptors
Extensions() protoreflect.ExtensionDescriptors
}) error {
mds := descs.Messages()
for i := 0; i < mds.Len(); i++ {
registerAllExtensions(extTypes, mds.Get(i))
}
xds := descs.Extensions()
for i := 0; i < xds.Len(); i++ {
if err := extTypes.RegisterExtension(dynamicpb.NewExtensionType(xds.Get(i))); err != nil {
return err
}
}
return nil
}
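The same round trip also works for the file-level options, which is where the (Food) = "cheese" value from the question lives. A sketch of that extra step, assuming it is placed inside the loop over gen.Files above (after the output file is created) and reuses the extTypes registry:

// Round-trip the FileOptions through the runtime-populated registry so that
// dynamically registered extensions such as (Food) become readable.
fileOpts := sourceFile.Desc.Options().(*descriptorpb.FileOptions)
fb, err := proto.Marshal(fileOpts)
if err != nil {
    panic(err)
}
fileOpts.Reset()
if err := (proto.UnmarshalOptions{Resolver: extTypes}).Unmarshal(fb, fileOpts); err != nil {
    panic(err)
}
fileOpts.ProtoReflect().Range(func(fd protoreflect.FieldDescriptor, v protoreflect.Value) bool {
    if fd.IsExtension() {
        outputfile.P(fmt.Sprintf("Value of file option %s is %s", fd.Name(), v.String()))
    }
    return true
})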

hcl, json, go: how can I iterate JSON?

Problem: I'm trying to iterate through JSON content and present the result as key/value pairs.
I've written some code that reads HCL files; these are then decoded with hcldec.Decode, and the result is converted to JSON. These HCL files define the source and target for the application like this:
source.hcl:
source json "namefile" {
attr firstName {
type = "varchar"
expr = "$.firstName"
length = "30"
}
attr lastName {
type = "varchar"
expr = "$.lastName"
length = "40"
}
attr gender {
type = "varchar"
expr = "$.gender"
length = "10"
}
attr age {
type = "varchar"
expr = "$.age"
length = "2"
}
}
target.hcl
target table {
cols firstName {
name=source.json.namefile.attr.firstName.expr
type=source.json.namefile.attr.firstName.type
length=source.json.namefile.attr.firstName.length
}
cols lastName {
name=source.json.namefile.attr.lastName.expr
type=source.json.namefile.attr.lastName.type
length=source.json.namefile.attr.lastName.length
}
}
The decoding is done like this:
tspec := hcldec.ObjectSpec{
"target": &hcldec.BlockMapSpec{
TypeName: "target",
LabelNames: []string{"table"},
Nested: hcldec.ObjectSpec{
"cols": &hcldec.BlockMapSpec{
TypeName: "cols",
LabelNames: []string{"name"},
Nested: &hcldec.ObjectSpec{
"name": &hcldec.AttrSpec{
Name: "name",
Type: cty.String, //cty.List(cty.String),
Required: false,
},
"type": &hcldec.AttrSpec{
Name: "type",
Type: cty.String, //cty.List(cty.String),
Required: false,
},
"length": &hcldec.AttrSpec{
Name: "length",
Type: cty.String, //cty.List(cty.String),
Required: false,
},
},
},
},
},
}
targ, _ := hcldec.Decode(body, tspec, &hcl.EvalContext{
Variables: map[string]cty.Value{
"source": val.GetAttr("source"),
},
Functions: nil,
})
j := decodeCtyToJson(targ, true)
log.Debugf("targ -j (spec): %s", string(j)) // debug info
where decodeCtyToJson returns a []byte like this:
func decodeCtyToJson(value cty.Value, pretty bool) []byte {
jsonified, err := ctyjson.Marshal(value, cty.DynamicPseudoType)
if err != nil {
log.Debugf("Error: %#v", err)
return nil
}
if pretty {
return jsonPretty.Pretty(jsonified)
}
return jsonified
}
Now, when I try to test-print the JSON content, I'm not getting what I'm looking for:
var result map[string]interface{}
json.Unmarshal(j, &result)
log.Debugf("result: %# v", result)
tgtfil := result["value"].(map[string]interface{})
log.Debugf("tgtfil: %v", tgtfil)
log.Debugf("len(tgtfil): %# v", len(tgtfil))
for key, value := range tgtfil {
log.Debugf("key: %# v", key)
log.Debugf("value: %# v", value)
}
I am trying to get key/value pairs, but I'm getting this (first the whole JSON pretty-printed as wanted, then my attempt to loop through it):
DEBU[0000] targ -j (spec): {
"value": {
"target": {
"table": {
"cols": {
"firstName": {
"length": "30",
"name": "$.firstName",
"type": "varchar"
},
"lastName": {
"length": "40",
"name": "$.lastName",
"type": "varchar"
}
}
}
}
},
"type": [
"object",
{
"target": [
"map",
[
"object",
{
"cols": [
"map",
[
"object",
{
"length": "string",
"name": "string",
"type": "string"
}
]
]
}
]
]
}
]
}
DEBU[0000] result: map[string]interface {}{"type":[]interface {}{"object", map[string]interface {}{"target":[]interface {}{"map", []interface {}{"object", map[string]interface {}{"cols":[]interface {}{"map", []interface {}{"object", map[string]interface {}{"length":"string", "name":"string", "type":"string"}}}}}}}}, "value":map[string]interface {}{"target":map[string]interface {}{"table":map[string]interface {}{"cols":map[string]interface {}{"firstName":map[string]interface {}{"length":"30", "name":"$.firstName", "type":"varchar"}, "lastName":map[string]interface {}{"length":"40", "name":"$.lastName", "type":"varchar"}}}}}}
DEBU[0000] tgtfil: map[target:map[table:map[cols:map[firstName:map[length:30 name:$.firstName type:varchar] lastName:map[length:40 name:$.lastName type:varchar]]]]]
DEBU[0000] len(tgtfil): 1
DEBU[0000] key: "target"
DEBU[0000] value: map[string]interface {}{"table":map[string]interface {}{"cols":map[string]interface {}{"firstName":map[string]interface {}{"length":"30", "name":"$.firstName", "type":"varchar"}, "lastName":map[string]interface {}{"length":"40", "name":"$.lastName", "type":"varchar"}}}}
Process finished with exit code 0
My aim here is to eventually iterate through all the attributes defined in target.hcl (length, name and type for each column in this case), then generate DDL code from this information and finally run the DDL in e.g. Presto.
But as of now I'm not able to isolate this information.
Any pointers on how to do this are appreciated.
Thanks,
/b
The solution for me was to create a struct for the target and not use the target spec.
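For illustration, here is a sketch of what that can look like when decoding the JSON produced above; the type and field names are assumptions based on the pretty-printed output, not the actual code used:

package main

import (
    "encoding/json"
    "fmt"
    "log"
)

// Types mirroring the "value" part of the decoded document.
type Col struct {
    Name   string `json:"name"`
    Type   string `json:"type"`
    Length string `json:"length"`
}

type Table struct {
    Cols map[string]Col `json:"cols"`
}

type Decoded struct {
    Value struct {
        Target map[string]Table `json:"target"`
    } `json:"value"`
}

func main() {
    // j stands in for the []byte returned by decodeCtyToJson; a trimmed sample here.
    j := []byte(`{"value":{"target":{"table":{"cols":{"firstName":{"length":"30","name":"$.firstName","type":"varchar"}}}}}}`)

    var d Decoded
    if err := json.Unmarshal(j, &d); err != nil {
        log.Fatal(err)
    }
    for tableName, table := range d.Value.Target {
        for colName, col := range table.Cols {
            fmt.Printf("%s.%s: name=%s type=%s length=%s\n",
                tableName, colName, col.Name, col.Type, col.Length)
        }
    }
}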

Unmarshalling complex JSON into struct and accessing data (slice of slice of slices)

I've been banging my head against this for a while now. I have a JSON file that must be in the following format, which I need to iterate through and apply if statements to in Go:
[
[
{
"configName": "customer"
},
{
"config": [
{
"emailSubject": "New customer added"
},
{
"text": "Hi test 2"
},
{
"text": "added 2"
}
]
}
]
[
{
"configName": "customerAndUser"
},
{
"config": [
{
"emailSubject": "New customer added"
},
{
"text": "Hi, test 1"
},
{
"text": "added 1"
}
]
}
]
]
And I want to put it into a struct, like this:
type Config [][]struct {
ConfigName string `json:"configName"`
Config []struct {
Text string `json:"text"`
EmailSubject string `json:"emailSubject"`
} `json:"config"`
}
I can unmarshal the data fine, like this:
configData, err := ioutil.ReadFile("testing-config.json")
if err != nil {
fmt.Println(err)
}
var configDataUnmarshalled Config
json.Unmarshal([]byte(configData), &configDataUnmarshalled)
And then the data prints, sort of okay, but here is where things get a little strange: the output contains blanks for the fields that are missing from each object. Here is a sample of what's printed for the unmarshalled data:
[[{customer []} { [{ New customer added} {hi test 2 } {added 2 }]}] [{customerAndUser []} { [{ New customer added} {hi test 1 } {added 1 }]}]]
But then I can't seem to use IF statements or loop over the elements in the config key!
The if statement is being ignored in the for loop (see the output below the code):
for _, configs := range configDataUnmarshalled {
for _, configurations := range configs {
fmt.Println("These are the top level elements in my struct: ", configurations.ConfigName)
if configurations.ConfigName == "customerAndUser" {
for _, config := range configurations.Config {
fmt.Println(config)
}
}
}
}
This is what is printed:
These are the top level elements in my struct: customer
These are the top level elements in my struct:
These are the top level elements in my struct: customerAndUser
These are the top level elements in my struct:
From the for loop you can see that I want to access the data when a config has a certain name, in this case "customerAndUser".
Here the if statement is being ignored completely.
I have two things I want to understand/solve:
How can I access the data once the if statement matches?
Why is the program printing blanks?
The desired output is to print the emailSubject and the two text values to the console for the config named customerAndUser.
What should be printed:
New customer added
hi test 1
added 1
Thanks for your help
The JSON config format is awkward: within each inner array, configName and config end up in two separate objects, so the struct element that has configName set has an empty config, and the one with config set has an empty configName. Unmarshalling would be straightforward if each item were a single object like this:
{
"configName": "customerAndUser",
"config": [
{
"emailSubject": "New customer added"
},
{
"text": "Hi, test 1"
},
{
"text": "added 1"
}
]
}
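With that shape, the matching struct becomes straightforward; a sketch, with field names taken from the JSON keys:

type FlatConfig struct {
    ConfigName string `json:"configName"`
    Config     []struct {
        EmailSubject string `json:"emailSubject,omitempty"`
        Text         string `json:"text,omitempty"`
    } `json:"config"`
}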
If you can't change the JSON config format, here is a solution:
endUser := false
for _, configs := range configDataUnmarshalled {
// reset the flag for each inner array
endUser = false
for _, configurations := range configs {
// remember that the "customerAndUser" marker object was seen
if configurations.ConfigName == "customerAndUser" {
endUser = true
continue
}
// only print config entries once the marker has been seen in this array
if !endUser || len(configurations.Config) == 0 {
continue
}
for _, config := range configurations.Config {
fmt.Println(config)
}
}
}

How to create custom json structure (ansible inventory)

I am obtaining data from the CloudStack API listVirtualMachines and trying to create a web service that will provide a dynamic Ansible inventory for certain hosts.
In my first attempts, I fetch all the data with a single request and then parse the output within a loop:
vms, _ := cs.Request(&cs.ListVirtualMachines{})
// use for the metadata _meta['hostvars']['IP']['key'] = val
hostvars := make(map[string]map[string]string)
// used per each host ['name']['hosts'] = [list]
hosts := make(map[string]map[string][]string)
for _, vm := range vms(*cs.ListVirtualMachinesResponse).VirtualMachine {
ip := vm.Nic[0].IPAddress.String()
if ip == "" {
continue
}
hostvars[ip] = map[string]string{
vm.DisplayName: vm.DisplayName,
"ansible_ssh_common_args": "-o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null",
}
if hosts[vm.DisplayName] == nil {
hosts[vm.DisplayName] = map[string][]string{}
}
hosts[vm.DisplayName]["hosts"] = append(hosts[vm.DisplayName]["hosts"], ip)
}
My desired output needs this JSON format:
{
"_meta": {
"hostvars": {
"172.16.0.3": {
"ansible_ssh_common_args": "-o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null",
"displayname": "server 1"
},
"172.16.0.4": {
"ansible_ssh_common_args": "-o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null",
"displayname": "server 2"
},
"172.16.0.5": {
"ansible_ssh_common_args": "-o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null",
"displayname": "server 3"
}
}
},
"group1": {
"hosts": [
"172.16.0.3",
"172.16.0.4",
"172.16.0.5"
]
},
"sandbox": {
"children": [
"server1",
"server2",
"server3"
]
},
"server1": {
"hosts": [
"172.16.0.3"
]
},
"server2": {
"hosts": [
"172.16.0.4"
]
},
"server3": {
"hosts": [
"172.16.0.5"
]
}
}
My first try was to create a huge map:
inventory := map[string]map[string]map[string]string{}
But that only covered the structure of the _meta key:
{
"_meta": {
"hostvars": {
"x.x.x.x": {
"key": "value"
},
"x.x.x.y": {
"key": "value"
}
}
}
}
Later I came up with a struct:
type Ansible struct {
Metadata Hostvars `json:"_meta"`
Hosts map[string]map[string][]string `json:",inline"` // want to remove the Hosts key
}
type Hostvars struct {
Hosts map[string]map[string]string `json:"hostvars"`
}
inventory := &Ansible{
Hostvars{hostvars},
hosts,
}
if err := json.NewEncoder(os.Stdout).Encode(inventory); err != nil {
log.Println(err)
}
The problem with this approach is that the returned JSON contains both keys, _meta and hosts:
{
"_meta": {
"hostvars": {
"x.x.x.x": {
"key": "value"
},
"x.x.x.y": {
"key": "value"
}
}
},
"hosts": { <--- want to remove this
"server1": {
"hosts": [
"172.16.0.3"
]
}
...
}
}
I would like to have just the _meta key and, at the same level (inline), a key for every hostname, something like:
{
"_meta": {...},
"server1": {
"hosts": []
},
"group1": {
"hosts": []
},
"sandbox": {
"children": []
}
}
My guess is that this could be solved with a map[string]interface{}, which might allow a dynamic JSON object with dynamic key names/structure, but I don't know exactly how; any help would be appreciated.
Your best bet is probably to have the Ansible type implement json.Marshaler to "flatten" the JSON document on the fly.
package main
import (
"encoding/json"
"fmt"
)
func main() {
a := &Ansible{
Metadata: Hostvars{
Hosts: map[string]map[string]string{
"172.16.0.3": map[string]string{
"displayname": "server 1",
},
},
},
Hosts: map[string]Group{
"group1": Group{Hosts: []string{"172.16.0.3", "172.16.0.4", "172.16.0.5"}},
"sandbox": Group{Children: []string{"server1", "server2", "server3"}},
"server1": Group{Hosts: []string{"172.16.0.3"}},
},
}
b, err := json.MarshalIndent(a, "", " ")
fmt.Println(string(b), err)
}
type Ansible struct {
Metadata Hostvars
Hosts map[string]Group
}
type Hostvars struct {
Hosts map[string]map[string]string `json:"hostvars"`
}
// I don't *think* Ansible supports anything else in host groups, so it makes
// sense to define it as a struct.
type Group struct {
Hosts []string `json:"hosts,omitempty"`
Children []string `json:"children,omitempty"`
}
// MarshalJSON implements json.Marshaler
func (a *Ansible) MarshalJSON() ([]byte, error) {
// Define a map that we will encode at the end of the function.
doc := make(map[string]interface{})
// Add all the groups.
for k, v := range a.Hosts {
doc[k] = v
}
// Add the _meta key.
doc["_meta"] = a.Metadata
return json.Marshal(doc)
}
Try it on the playground: https://play.golang.org/p/OKQHAG3vqIG
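Since the question mentions building a web service, the same encoding can back an HTTP handler. A minimal sketch; buildInventory is a hypothetical helper that assembles the *Ansible value from the CloudStack data, and the path and port are arbitrary choices:

// Sketch: expose the inventory as JSON over HTTP.
http.HandleFunc("/inventory", func(w http.ResponseWriter, r *http.Request) {
    w.Header().Set("Content-Type", "application/json")
    inv := buildInventory() // hypothetical: returns a *Ansible built from the CloudStack response
    if err := json.NewEncoder(w).Encode(inv); err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
    }
})
log.Fatal(http.ListenAndServe(":8080", nil))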

Create struct for complex JSON array in Golang

I have the following JSON array that I'm trying to convert to a struct.
[
{
"titel": "test 1",
"event": "some value",
"pair": "some value",
"condition": [
"or",
[
"contains",
"url",
"/"
]
],
"actions": [
[
"option1",
"12",
"1"
],
[
"option2",
"3",
"1"
]
]
}, {
"titel": "test 2",
"event": "some value",
"pair": "some value",
"condition": [
"or",
[
"contains",
"url",
"/"
]
],
"actions": [
[
"option1",
"12",
"1"
],
[
"option2",
"3",
"1"
]
]
}
]
This is the struct I've got so far:
type Trigger struct {
Event string `json:"event"`
Pair string `json:"pair"`
Actions [][]string `json:"actions"`
Condition []interface{} `json:"condition"`
}
type Triggers struct {
Collection []Trigger
}
However, this does not really cover the "Condition" part. Ideally I'd like to have a structure for that as well.
Assuming there can be only a single condition per item in the root array, you can try the struct below. This makes using Condition clearer.
https://play.golang.org/p/WxFhBjJmEN
type Trigger struct {
Event string `json:"event"`
Pair string `json:"pair"`
Actions [][]string `json:"actions"`
Condition Condition `json:"condition"`
}
type Condition []interface{}
func (c *Condition) Typ() string {
return (*c)[0].(string)
}
func (c *Condition) Val() []string {
xs := (*c)[1].([]interface{})
ys := make([]string, len(xs))
for i, x := range xs {
ys[i] = x.(string)
}
return ys
}
type Triggers struct {
Collection []Trigger
}
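For example, once the JSON array from the question is unmarshalled, the helper methods give typed access to each condition. A short usage sketch; data is assumed to hold the JSON shown at the top of the question:

var triggers []Trigger
if err := json.Unmarshal(data, &triggers); err != nil {
    log.Fatal(err)
}
for _, t := range triggers {
    fmt.Println(t.Condition.Typ()) // "or"
    fmt.Println(t.Condition.Val()) // [contains url /]
}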