How to rename field names in a nested array created from input JSON using a JOLT transformation, without changing anything else

I started with an input JSON like this.
{
"trackingNumber": "1ZEA83550362028861",
"localActivityDate": "20210324",
"localActivityTime": "183500",
"scheduledDeliveryDate": "20210324",
"actualDeliveryDate": "20210324",
"actualdeliveryTime": "183500",
"gmtActivityDate": "20210324",
"gmtActivityTime": "223500",
"activityStatus": {
"type": "G",
"code": "OR",
"description": "Origin Scan"
},
"activityLocation": {
"city": "RANDALLSTOWN,",
"stateProvince": "MD",
"postalCode": "21133",
"country": "US"
}
}
This is the JOLT transformation spec that I have written so far.
[
{
"operation": "modify-overwrite-beta",
"spec": {
"tsY": "=substring(#(1,localActivityDate),0,4)",
"tsM": "=substring(#(1,localActivityDate),4,6)",
"tsD": "=substring(#(1,localActivityDate),6,8)",
"tsH": "=substring(#(1,localActivityTime),0,2)",
"tsMi": "=substring(#(1,localActivityTime),2,4)",
"tsS": "=substring(#(1,localActivityTime),4,6)",
"timeStamp": "=concat(#(1,tsY),'-',#(1,tsM),'-',#(1,tsD),'T',#(1,tsH),':',#(1,tsMi),':',#(1,tsS),'Z')",
"aTY": "=substring(#(1,scheduledDeliveryDate),0,4)",
"aTM": "=substring(#(1,scheduledDeliveryDate),4,6)",
"aTD": "=substring(#(1,scheduledDeliveryDate),6,8)",
"appointmentTime": "=concat(#(1,aTY),'-',#(1,aTM),'-',#(1,aTD))",
"dTY": "=substring(#(1,actualDeliveryDate),0,4)",
"dTM": "=substring(#(1,actualDeliveryDate),4,6)",
"dTD": "=substring(#(1,actualDeliveryDate),6,8)",
"dTH": "=substring(#(1,actualdeliveryTime),0,2)",
"dTMi": "=substring(#(1,actualdeliveryTime),2,4)",
"dTS": "=substring(#(1,actualdeliveryTime),4,6)",
"deliveryTime": "=concat(#(1,dTY),'-',#(1,dTM),'-',#(1,dTD),'T',#(1,dTH),':',#(1,dTMi),':',#(1,dTS),'Z')"
}
},
{
"operation": "shift",
"spec": {
"*Number": "transformedPayload.&(0,1)Info",
"activityStatus": {
"*": "transformedPayload.events.&"
},
"activityLocation": {
"*": "transformedPayload.address.&"
},
"timeStamp": "transformedPayload.events[0].&",
"appointmentTime": "transformedPayload.events[1].&",
"deliveryTime": "transformedPayload.events[2].&",
"activityStatus": {
"type": "transformedPayload.events[0].type",
"code": "transformedPayload.events[0].statusCode",
"description": "transformedPayload.events[0].statusDescription"
},
"activityLocation": {
"city": "transformedPayload.address.city",
"stateProvince": "transformedPayload.address.state",
"postalCode": "transformedPayload.address.postalCode",
"country": "transformedPayload.address.country"
}
}
},
{
"operation": "modify-default-beta",
"spec": {
"metaData": {
"domain": "LTL",
"eventType": "statusUpdate",
"version": "v1"
},
"transformedPayload": {
"events": {
"[1]": {
"statusCode": "AB",
"statusDescription": "Delivery Scheduled"
},
"[2]": {
"statusCode": "D1",
"statusDescription": "Delivered"
}
}
}
}
}
]
The resultant JSON created by this transformation looks like this.
{
"transformedPayload" : {
"events" : [ {
"type" : "G",
"statusCode" : "OR",
"statusDescription" : "Origin Scan",
"timeStamp" : "2021-03-24T18:35:00Z"
}, {
"appointmentTime" : "2021-03-24",
"statusCode" : "AB",
"statusDescription" : "Delivery Scheduled"
}, {
"deliveryTime" : "2021-03-24T18:35:00Z",
"statusCode" : "D1",
"statusDescription" : "Delivered"
} ],
"address" : {
"city" : "RANDALLSTOWN,",
"state" : "MD",
"postalCode" : "21133",
"country" : "US"
},
"trackingInfo" : "1ZEA83550362028861"
},
"metaData" : {
"domain" : "LTL",
"eventType" : "statusUpdate",
"version" : "v1"
}
}
I just need a small tweak: the appointmentTime and deliveryTime fields at indexes [1] and [2] of the events array also need to be named "timestamp" (like the timeStamp field at index [0]), so that the final, correct output JSON looks like this.
{
"transformedPayload" : {
"events" : [ {
"type" : "G",
"statusCode" : "OR",
"statusDescription" : "Origin Scan",
"timeStamp" : "2021-03-24T18:35:00Z"
}, {
"timestamp" : "2021-03-24",
"statusCode" : "AB",
"statusDescription" : "Delivery Scheduled"
}, {
"timestamp" : "2021-03-24T18:35:00Z",
"statusCode" : "D1",
"statusDescription" : "Delivered"
} ],
"address" : {
"city" : "RANDALLSTOWN,",
"state" : "MD",
"postalCode" : "21133",
"country" : "US"
},
"trackingInfo" : "1ZEA83550362028861"
},
"metaData" : {
"domain" : "LTL",
"eventType" : "statusUpdate",
"version" : "v1"
}
}
I have tried renaming the field in the shift operation itself, but that did not work. I am completely new to JOLT transformations, so even this small change seems tricky. Any help is appreciated. Thanks

Just convert the related lines within the shift transformation spec from
"appointmentTime": "transformedPayload.events[1].&",
"deliveryTime": "transformedPayload.events[2].&",
to
"appointmentTime": "transformedPayload.events[1].timestamp",
"deliveryTime": "transformedPayload.events[2].timestamp",
i.e. use the literal name timestamp as the target instead of the & wildcard, which copies the matched key name (appointmentTime / deliveryTime) from the left-hand side.
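For context, this is how the shift operation can look with that change applied (condensed here so each activityStatus/activityLocation field appears only once, matching what the question's output shows taking effect):
{
  "operation": "shift",
  "spec": {
    "*Number": "transformedPayload.&(0,1)Info",
    "timeStamp": "transformedPayload.events[0].&",
    "appointmentTime": "transformedPayload.events[1].timestamp",
    "deliveryTime": "transformedPayload.events[2].timestamp",
    "activityStatus": {
      "type": "transformedPayload.events[0].type",
      "code": "transformedPayload.events[0].statusCode",
      "description": "transformedPayload.events[0].statusDescription"
    },
    "activityLocation": {
      "city": "transformedPayload.address.city",
      "stateProvince": "transformedPayload.address.state",
      "postalCode": "transformedPayload.address.postalCode",
      "country": "transformedPayload.address.country"
    }
  }
}
Together with the unchanged modify-overwrite-beta and modify-default-beta operations, this should produce the desired output with "timestamp" at indexes [1] and [2].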

Related

How to correct this jolt transformation

I have an input JSON like this.
{
"trackingNumber": "1ZEA83550362028861",
"localActivityDate": "20210324",
"localActivityTime": "183500",
"scheduledDeliveryDate": "20220525",
"actualDeliveryDate": "20220729",
"actualdeliveryTime": "183500",
"gmtActivityDate": "20210324",
"gmtActivityTime": "223500",
"activityStatus": {
"type": "G",
"code": "OR",
"description": "Origin Scan"
},
"activityLocation": {
"city": "RANDALLSTOWN,",
"stateProvince": "MD",
"postalCode": "21133",
"country": "US"
}
}
I have written a JOLT transformation for this JSON:
[
{
"operation": "shift",
"spec": {
"trackingNumber": "transformedPayload.trackingInfo",
"localActivityDate": "tmp_Date",
"localActivityTime": "tmp_Time",
"scheduledDeliveryDate": "tmp_App",
"actualDeliveryDate": "tmp_Del_Date",
"actualdeliveryTime": "tmp_Del_Time",
"activityStatus": {
"type": "transformedPayload.events.type",
"code": "transformedPayload.events.statusCode",
"description": "transformedPayload.events.statusDescription"
},
"activityLocation": {
"city": "transformedPayload.address.city",
"stateProvince": "transformedPayload.address.state",
"postalCode": "transformedPayload.address.postalCode",
"country": "transformedPayload.address.country"
}
}
},
{
"operation": "modify-default-beta",
"spec": {
"tmp_Year": "=substring(#(1,tmp_Date),0,4)",
"tmp_Month": "=substring(#(1,tmp_Date),4,6)",
"tmp_Day": "=substring(#(1,tmp_Date),6,8)",
"tmp_Hours": "=substring(#(1,tmp_Time),0,2)",
"tmp_Minutes": "=substring(#(1,tmp_Time),2,4)",
"tmp_Seconds": "=substring(#(1,tmp_Time),4,6)",
"timeStamp": "=concat(#(1,tmp_Year),'-',#(1,tmp_Month),'-',#(1,tmp_Day),'T',#(1,tmp_Hours),':',#(1,tmp_Minutes),':',#(1,tmp_Seconds),'Z')",
"tmp_App_Year": "=substring(#(1,tmp_App),0,4)",
"tmp_App_Month": "=substring(#(1,tmp_App),4,6)",
"tmp_App_Day": "=substring(#(1,tmp_App),6,8)",
"appointmentTime": "=concat(#(1,tmp_App_Year),'-',#(1,tmp_App_Month),'-',#(1,tmp_App_Day))",
"tmp__Del_Year": "=substring(#(1,tmp_Del_Date),0,4)",
"tmp_Del_Month": "=substring(#(1,tmp_Del_Date),4,6)",
"tmp_Del_Day": "=substring(#(1,tmp_Del_Date),6,8)",
"tmp_Del_Hours": "=substring(#(1,tmp_Del_Time),0,2)",
"tmp_Del_Minutes": "=substring(#(1,tmp_Del_Time),2,4)",
"tmp_Del_Seconds": "=substring(#(1,tmp_Del_Time),4,6)",
"deliveryTime": "=concat(#(1,tmp__Del_Year),'-',#(1,tmp_Del_Month),'-',#(1,tmp_Del_Day),'T',#(1,tmp_Del_Hours),':',#(1,tmp_Del_Minutes),':',#(1,tmp_Del_Seconds),'Z')"
}
},
{
"operation": "remove",
"spec": {
"tmp_*": ""
}
}
]
This transforms the data into this format.
{
"transformedPayload" : {
"trackingInfo" : "1ZEA83550362028861",
"events" : {
"type" : "G",
"statusCode" : "OR",
"statusDescription" : "Origin Scan"
},
"address" : {
"city" : "RANDALLSTOWN,",
"state" : "MD",
"postalCode" : "21133",
"country" : "US"
}
},
"timeStamp" : "2021-03-24T18:35:00Z",
"appointmentTime" : "2022-05-25",
"deliveryTime" : "2022-07-29T18:35:00Z"
}
What changes do I need to make in the transformation so that the timeStamp, appointmentTime and deliveryTime are also nested under transformedPayload, i.e. it looks like this (correct format)?
{
"transformedPayload" : {
"trackingInfo" : "1ZEA83550362028861",
"events" : {
"type" : "G",
"statusCode" : "OR",
"statusDescription" : "Origin Scan"
},
"address" : {
"city" : "RANDALLSTOWN,",
"state" : "MD",
"postalCode" : "21133",
"country" : "US"
},
"timeStamp" : "2021-03-24T18:35:00Z",
"appointmentTime" : "2022-05-25",
"deliveryTime" : "2022-07-29T18:35:00Z"
}
}
This is my first time doing a JOLT transformation, so I am confused about how to resolve this. Any help is appreciated.
You are already very close to the solution. I can offer the following spec, similar to yours, that produces the desired output:
[
{
"operation": "modify-overwrite-beta",
"spec": {
"tsY": "=substring(#(1,localActivityDate),0,4)",
"tsM": "=substring(#(1,localActivityDate),4,6)",
"tsD": "=substring(#(1,localActivityDate),6,8)",
"tsH": "=substring(#(1,localActivityTime),0,2)",
"tsMi": "=substring(#(1,localActivityTime),2,4)",
"tsS": "=substring(#(1,localActivityTime),4,6)",
"timeStamp": "=concat(#(1,tsY),'-',#(1,tsM),'-',#(1,tsD),'T',#(1,tsH),':',#(1,tsMi),':',#(1,tsS),'Z')",
"aTY": "=substring(#(1,scheduledDeliveryDate),0,4)",
"aTM": "=substring(#(1,scheduledDeliveryDate),4,6)",
"aTD": "=substring(#(1,scheduledDeliveryDate),6,8)",
"appointmentTime": "=concat(#(1,aTY),'-',#(1,aTM),'-',#(1,aTD))",
"dTY": "=substring(#(1,actualDeliveryDate),0,4)",
"dTM": "=substring(#(1,actualDeliveryDate),4,6)",
"dTD": "=substring(#(1,actualDeliveryDate),6,8)",
"dTH": "=substring(#(1,actualdeliveryTime),0,2)",
"dTMi": "=substring(#(1,actualdeliveryTime),2,4)",
"dTS": "=substring(#(1,actualdeliveryTime),4,6)",
"deliveryTime": "=concat(#(1,dTY),'-',#(1,dTM),'-',#(1,dTD),'T',#(1,dTH),':',#(1,dTMi),':',#(1,dTS),'Z')"
}
},
{
"operation": "shift",
"spec": {
"*Number": "&(0,1)Info",
"activityStatus": {
"*": "events.&"
},
"activityLocation": {
"*": "address.&"
},
"timeStamp": "&",
"appointmentTime": "&",
"deliveryTime": "&"
}
}
]
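The key difference from the output you were getting is that every target path on the right-hand side now begins with "transformedPayload.", which is what nests timeStamp, appointmentTime and deliveryTime (and the rest) under that wrapper. The "*Number" rule, for example, captures "tracking" from "trackingNumber" as &(0,1) and appends "Info", producing transformedPayload.trackingInfo.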

Jolt transforming to array

I am getting null in my output JSON. Please find my spec and details below. The input JSON can have n COMPINFO entries. Please suggest a fix.
My input.json is:
{
"valid": "true",
"message": "",
"data": {
"COMPINFO": [
{
"ORGID": "",
"SITEID": "BWDEMO",
"COMPID": "C2014",
"COMP_DESC": "Cherokee High School",
"ASSETTYPE": "MANUFACTURING",
"BUILDING": "Main",
"FLR_LEVEL": "Ground",
"ROOM_SPCE": "100"
},
{
"ORGID": "",
"SITEID": "BWDEMO",
"COMPID": "9001B",
"COMP_DESC": "Sludge Pump",
"ASSETTYPE": "FACILITY",
"BUILDING": "Main",
"FLR_LEVEL": "Production",
"ROOM_SPCE": "100"
}
]
}
}
My Spec.json is:
[
{
"operation": "shift",
"spec": {
"data": {
"COMPINFO": {
"*": {
"COMPID": "[&1].COMPID",
"ORGID": "[&1].ORGID",
"COMP_DESC": "[&1].DESCRIPTION",
"BUILDING": "[&1].LOCATIONS.[&1].Building",
"FLR_LEVEL": "[&1].LOCATIONS.[&1].Floor_Level",
"ROOM_SPCE": "[&1].LOCATIONS.[&1].ROOM_SPCE",
"SITEID": "[&1].SITEID",
"ASSETTYPE": "[&1].ASSETTYPE"
}
}
}
}
}
]
The expected output should be:
[
{
"COMPID" : "C2014",
"ORGID" : "",
"DESCRIPTION" : "Cherokee High School",
"LOCATIONS" : [ {
"Building" : "Main",
"Floor_Level" : "Ground",
"ROOM_SPCE" : "100"
} ],
"SITEID" : "BWDEMO",
"ASSETTYPE" : "MANUFACTURING"
}, {
"COMPID" : "9001B",
"ORGID" : "",
"DESCRIPTION" : "Sludge Pump",
"LOCATIONS" : [{
"Building" : "Main",
"Floor_Level" : "Production",
"ROOM_SPCE" : "100"
} ],
"SITEID" : "BWDEMO",
"ASSETTYPE" : "FACILITY"
}
]
but I am getting null (in the LOCATIONS array of the second element):
[
{
"COMPID" : "C2014",
"ORGID" : "",
"DESCRIPTION" : "Cherokee High School",
"LOCATIONS" : [ {
"Building" : "Main",
"Floor_Level" : "Ground",
"ROOM_SPCE" : "100"
} ],
"SITEID" : "BWDEMO",
"ASSETTYPE" : "MANUFACTURING"
}, {
"COMPID" : "9001B",
"ORGID" : "",
"DESCRIPTION" : "Sludge Pump",
"LOCATIONS" : [ null, {
"Building" : "Main",
"Floor_Level" : "Production",
"ROOM_SPCE" : "100"
} ],
"SITEID" : "BWDEMO",
"ASSETTYPE" : "FACILITY"
}
]
Can someone help? Thanks in advance.
It looks like there is only one "location" per input item, and you just want it to always be the first element of a LOCATIONS array. The leading null appears because the inner [&1] in paths like "[&1].LOCATIONS.[&1].Building" resolves to the COMPINFO index, so the second element (index 1) writes into LOCATIONS[1] and leaves LOCATIONS[0] empty.
If a fixed first slot is all you need, that is easy. If you instead want to "group" your data so that some LOCATIONS arrays hold multiple items, that is a harder transform.
Spec for the simple version:
[
{
"operation": "shift",
"spec": {
"data": {
"COMPINFO": {
"*": {
"COMPID": "[&1].COMPID",
"ORGID": "[&1].ORGID",
"COMP_DESC": "[&1].DESCRIPTION",
"BUILDING": "[&1].LOCATIONS[0].Building",
"FLR_LEVEL": "[&1].LOCATIONS[0].Floor_Level",
"ROOM_SPCE": "[&1].LOCATIONS[0].ROOM_SPCE",
"SITEID": "[&1].SITEID",
"ASSETTYPE": "[&1].ASSETTYPE"
}
}
}
}
}
]
There are examples around for the harder version.

Transform nested array to top level array JOLT

Is it possible to transform an input JSON
{
"root": {
"lang" : "fr-FR",
"ttp" : "ttp1",
"net" : "wifi",
"gps" : [
{"gpslon" : "1", "gpslat" : "4"},
{"gpslon" : "2", "gpslat" : "5"},
{"gpslon" : "3", "gpslat" : "6"}
]
}
}
to another JSON like this using a JOLT transformation?
[
{
"lang" : "fr-FR",
"ttp" : "ttp1",
"net" : "wifi",
"gpslon" : "1",
"gpslat" : "4"
},
{
"lang" : "fr-FR",
"ttp" : "ttp1",
"net" : "wifi",
"gpslon" : "2",
"gpslat" : "5"
},
{
"lang" : "fr-FR",
"ttp" : "ttp1",
"net" : "wifi",
"gpslon" : "3",
"gpslat" : "6"
}
]
In other words, I would like to copy the "header" data (lang, ttp, net) into each gps array item.
OK, I have found a solution; it may not be very efficient:
[
{
"operation": "shift",
"spec": {
"root": {
"gps": {
"*": {
"#": "[&1]",
"#(2,lang)": "[&1].lang",
"#(2,ttp)": "[&1].ttp",
"#(2,net)": "[&1].net"
}
}
}
}
}
]
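In case it helps: as I understand the shift wildcards, the bare "@" copies the whole current gps element (gpslon, gpslat) into the output array at index [&1], while "@(2,lang)" walks two levels up the input tree (past the array index and the "gps" key) and grabs root's lang value; the ttp and net lines work the same way. That is how the "header" values get duplicated into every array item.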

Regional Opsworks stack can't be found by Cloudformation

I have a CloudFormation template that modifies an OpsWorks stack by adding a few resources.
The OpsWorks stack is deployed in the eu-west-1 region, which is also its API endpoint region, and it is shown as "Regional" next to the stack name.
When I run the CloudFormation template (I pass the stack ID as a parameter) I get this error:
Unable to find stack with ID xxxxxxx
I guess CloudFormation can only see the OpsWorks resources that are in the us-east-1 region?
I tried changing the CloudFormation region and deploying the template, but the stack still cannot be found.
How can I make CloudFormation search for the stack in all regions?
Should I clone the OpsWorks stack and change its endpoint to the us-east-1 region?
What would be the best solution?
Template
{
"AWSTemplateFormatVersion": "2010-09-09",
"Description": "Add a layer to an existing stack",
"Mappings": {
"Region2Principal": {
"eu-west-1": {
"EC2Principal": "ec2.amazonaws.com",
"OpsWorksPrincipal": "opsworks.amazonaws.com"
}
}
},
"Parameters": {
"Environment" : {
"Description": "The Environnement variable ",
"Type": "String",
"Default": "dev",
"AllowedValues" : ["test", "prod"]
},
"InstanceType": {
"Type": "String",
"Default": "m4.large",
"AllowedValues" : ["t2.micro", "m1.small", "m1.large","m4.large","m4.xlarge","m4.2xlarge","m4.4xlarge","m4.10xlarge","m4.16xlarge","c4.large" , "c4.xlarge" ,"c4.2xlarge" , "c4.4xlarge","c4.8xlarge" , "c3.large" , "c3.xlarge", "c3.2xlarge", "c3.4xlarge" ,"c3.8xlarge"],
"ConstraintDescription": "must be a valid EC2 instance type"
},
"StackID": {
"Type": "String",
"Description": "ID of the existing opsworks stack to edit"
},
"vpcId": {
"Description": "VPC id of corresponding to the Environment",
"Type": "String"
},
"subnetIds" :{
"Description": "list of sunbnets in the chosen VPC",
"Type": "List<AWS::EC2::Subnet::Id>"
},
"ScriptSG":{
"Description": "script security group",
"Type" : "String"
},
"SG": {
"Description": " layer security group",
"Type": "String"
}
},
"Resources":{
"Layer": {
"Type": "AWS::OpsWorks::Layer",
"Properties": {
"AutoAssignElasticIps" : false,
"AutoAssignPublicIps" : true
}
},
"SInstance1": {
"Type": "AWS::OpsWorks::Instance",
"Properties": {
"Hostname": "S1",
"AutoScalingType": "timer",
"TimeBasedAutoScaling" : {
"Friday" : { "0" : "on", "6" : "on", "12" : "on", "18" : "on" },
"Monday" : { "0" : "on", "6" : "on", "12" : "on", "18" : "on" }
},
"RootDeviceType": "ebs",
"StackId": {"Ref": "StackID"},
"LayerIds": [{"Ref": "Layer"}],
"InstanceType": {"Ref" : "InstanceType"}
}
},
"Instance2": {
"Type": "AWS::OpsWorks::Instance",
"Properties": {
"Hostname": "S2",
"AutoScalingType": "timer",
"TimeBasedAutoScaling" : {
"Saturday": { "0" : "on", "6" : "on", "12" : "on", "18" : "on" },
"Sunday" : { "0" : "on", "6" : "on", "12" : "on", "18" : "on" },
"Thursday": { "0" : "on", "6" : "on", "12" : "on", "18" : "on" },
"Tuesday" : { "0" : "on", "6" : "on", "12" : "on", "18" : "on" },
"Wednesday":{ "0" : "on", "6" : "on", "12" : "on", "18" : "on" }
},
"RootDeviceType": "ebs",
"StackId": {"Ref": "StackID"},
"LayerIds": [{"Ref": "Layer"}],
"InstanceType": {"Ref" : "InstanceType"}
}
},
"ELB": {
"Type": "AWS::ElasticLoadBalancing::LoadBalancer",
"Properties": {
"ConnectionDrainingPolicy" : {
"Enabled" : true,
"Timeout" : 300
},
"ConnectionSettings" : {
"IdleTimeout" : 60
},
"CrossZone" : true,
"HealthCheck" : {
"HealthyThreshold" : "3",
"Interval" : "30",
"Target" : "HTTP:80/ping",
"Timeout" : "5",
"UnhealthyThreshold" : "2"
},
"LoadBalancerName": "loadBalancer",
"Listeners" : [{
"InstancePort" : "80",
"InstanceProtocol" : "HTTP",
"LoadBalancerPort" : "80",
"Protocol" : "HTTP"
}],
"Scheme" : "internal",
"SecurityGroups" : [{ "Ref" : "ELBSecurityGroup" }],
"Subnets" : { "Ref" : "subnetIds"}
}
},
"ELBAttach":{
"Type": "AWS::OpsWorks::ElasticLoadBalancerAttachment",
"Properties": {
"ElasticLoadBalancerName" : {"Ref" : "ELB"},
"LayerId" : {"Ref" : "Layer" }
}
}
}
}
It looks like you will need to move them to the same region.
Resources can be managed only in the region in which they are created. Resources that are created in one regional endpoint are not available, nor can they be cloned to, another regional endpoint.
http://docs.aws.amazon.com/general/latest/gr/rande.html#opsworks_region
https://aws.amazon.com/about-aws/whats-new/2016/08/aws-opsworks-adds-nine-regional-endpoints-and-asia-pacific-seoul-region-support/
The Layer resource is also missing the required StackId property.
{
"Type": "AWS::OpsWorks::Layer",
"Properties": {
"Attributes" : { String:String },
"AutoAssignElasticIps" : Boolean,
"AutoAssignPublicIps" : Boolean,
"CustomInstanceProfileArn" : String,
"CustomJson" : JSON object,
"CustomRecipes" : Recipes,
"CustomSecurityGroupIds" : [ String, ... ],
"EnableAutoHealing" : Boolean,
"InstallUpdatesOnBoot" : Boolean,
"LifecycleEventConfiguration" : LifeCycleEventConfiguration,
"LoadBasedAutoScaling" : LoadBasedAutoScaling,
"Name" : String,
"Packages" : [ String, ... ],
"Shortname" : String,
"StackId" : String,
"Type" : String,
"VolumeConfigurations" : [ VolumeConfiguration, ... ]
}
}
http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-opsworks-layer.html
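Per that doc page, a minimal Layer that references the stack from the StackID parameter could look like the sketch below; the Name, Shortname and Type values here are just placeholders to adjust for your stack:
"Layer": {
  "Type": "AWS::OpsWorks::Layer",
  "Properties": {
    "StackId": { "Ref": "StackID" },
    "Name": "app-layer",
    "Shortname": "app",
    "Type": "custom",
    "EnableAutoHealing": true,
    "AutoAssignElasticIps": false,
    "AutoAssignPublicIps": true
  }
}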

Cloudformation in create stack error: "ELB cannot be attached to multiple subnets in the same AZ"

I am trying to build infrastructure with a CloudFormation JSON template. I added two Subnets and SubnetRouteTableAssociations, one in each of the two availability zones that I need, but the creation process fails to create the load balancer with this error:
CREATE_FAILED AWS::ElasticLoadBalancing::LoadBalancer Rest ELB cannot
be attached to multiple subnets in the same AZ.
Here is the AZs parameter:
"AZs" : {
"Description" : "The list of AvailabilityZones.",
"Type" : "CommaDelimitedList",
"Default" : "us-east-1a,us-east-1c"
}
Here are the resources for the Subnets, the SubnetRouteTableAssociations in both availability zones, and the Rest ELB:
"PublicSubnet1a" : {
"Type" : "AWS::EC2::Subnet",
"Properties" : {
"VpcId" : { "Ref" : "VPC" },
"CidrBlock" : "10.0.0.0/24",
"AvailabilityZone": {
"Fn::Select": ["1", { "Ref": "AZs" }]
},
"Tags" : [
{"Key": "Name", "Value": {"Fn::Join": ["", ["Offering-", {"Ref": "Env"}, {"Ref": "EnvNum"}, "-VPC"]]}},
{"Key" : "Network", "Value" : "Public" }
]
}
},
"PublicSubnet1c" : {
"Type": "AWS::EC2::Subnet",
"Properties": {
"VpcId": { "Ref" : "VPC" },
"CidrBlock": "10.0.1.0/24",
"AvailabilityZone": {
"Fn::Select": ["1", { "Ref": "AZs" }]
},
"Tags" : [
{"Key": "Name", "Value": {"Fn::Join": ["", ["Offering-", {"Ref": "Env"}, {"Ref": "EnvNum"}, "-VPC"]]}},
{"Key" : "Network", "Value" : "Public" }
]
}
},
"PublicSubnet1aRouteTableAssociation" : {
"Type" : "AWS::EC2::SubnetRouteTableAssociation",
"Properties" : {
"SubnetId" : { "Ref" : "PublicSubnet1a" },
"RouteTableId" : { "Ref" : "PublicRouteTable" }
}
},
"PublicSubnet1cRouteTableAssociation" : {
"Type" : "AWS::EC2::SubnetRouteTableAssociation",
"Properties" : {
"SubnetId" : { "Ref" : "PublicSubnet1c" },
"RouteTableId" : { "Ref" : "PublicRouteTable" }
}
},
"RestELB" : {
"Type" : "AWS::ElasticLoadBalancing::LoadBalancer",
"DependsOn": "AttachGateway",
"Properties": {
"LoadBalancerName": {"Fn::Join": ["",["Rest-ELB-", {"Ref": "VPC"}]]},
"CrossZone" : "true",
"Subnets": [{ "Ref": "PublicSubnet1a" },{ "Ref": "PublicSubnet1c" }],
"Listeners" : [
{"LoadBalancerPort" : "80", "InstancePort" : "80","Protocol" : "HTTP"},
{"LoadBalancerPort" : "6060", "InstancePort" : "6060","Protocol" : "HTTP"}
],
"HealthCheck" : {
"Target" : "HTTP:80/",
"HealthyThreshold" : "3",
"UnhealthyThreshold" : "5",
"Interval" : "90",
"Timeout" : "60"
}
}
}
What am I doing wrong?
Thanks!
"PublicSubnet1a" : {
...
"AvailabilityZone": {
"Fn::Select": ["1", { "Ref": "AZs" }] // <---- selects index 1 from AZs list
},
...
"PublicSubnet1c" : {
...
"AvailabilityZone": {
"Fn::Select": ["1", { "Ref": "AZs" }] // <---- selects the same index 1 from AZs list
},
Both of your subnets select the same index from the AZs list (see the "Fn::Select" statements above). Change the select statement for PublicSubnet1a to:
"Fn::Select": ["0", { "Ref": "AZs" }]