JSON for a HashMap

I am new to JSON and am trying to create JSON that works for this HashMap:
HashMap<SomeEnum, HashMap<Integer, String>> agentNumbers;
So I created this JSON:
{
"agentNumbers": [
{
"Additional": [
{
"insuranceId": 111,
"agentNumber": "09090"
},
{
"insuranceId": 1111,
"agentNumber": "090900"
}
]
},
{
"Full": [
{
"insuranceId": 1112,
"agentNumber": "090901"
}
]
}
]
}
When I do gson.fromJson(...., it says:
com.google.gson.JsonSyntaxException: java.lang.IllegalStateException: Expected BEGIN_ARRAY but was BEGIN_OBJECT at line 1 column 20 path $.agentNumbers[0]
Please guide me on what I'm missing.
Thanks

I guess it'll work if you leave out the wrapping agentNumbers array. Gson serializes a Java Map as a JSON object keyed by the map's keys, not as an array, so both levels of your HashMap should be plain objects. Something like this:

{
"Additional": {
"112": "data2",
"113": "data3",
"114": "data4",
"115": "data5",
"111": "data1"
},
"Full": {
"112": "data2",
"113": "data3",
"114": "data4",
"115": "data5",
"111": "data1"
}
}
Try this.
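To deserialize that shape, Gson needs the full generic type, which is captured with a TypeToken. A minimal sketch, assuming the JSON above is in a String named json and that SomeEnum declares the constants Additional and Full:

import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;
import java.lang.reflect.Type;
import java.util.HashMap;

// Capture the full generic type; without it Gson cannot know the key/value types.
Type type = new TypeToken<HashMap<SomeEnum, HashMap<Integer, String>>>() {}.getType();
HashMap<SomeEnum, HashMap<Integer, String>> agentNumbers = new Gson().fromJson(json, type);

Gson parses the integer keys from their string form ("111") automatically; the enum keys must match the constant names exactly.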

Golang-Migrate not able to recognize dollar symbol in db.runCommand

While running db.runCommand, it does not recognize the dollar symbol and therefore treats $answer as a literal string rather than the actual field value.
[
{
"update": "userfeedback",
"updates": [
{
"q": {
"userId": 8426,
"questionIdentifier": "resumeLink"
},
"u": {
"$set": {
"answer": [
{
"resumeLink": "$answer",
"resumeId": "$UUID().hex().match(/^(.{8})(.{4})(.{4})(.{4})(.{12})$/).slice(1,6).join('-')",
"uploadSizeInByte": -1,
"source":"manual",
"dateUploaded": "$updatedAt"
}
]
}
}
}
]
}
]
Output: the dollar symbol is not recognized.
[
{
"resumeLink" : "$answer",
"resumeId" : "$UUID().hex().match(/^(.{8})(.{4})(.{4})(.{4})(.{12})$/).slice(1,6).join('-')",
"uploadSizeInByte" : -1,
"source" : "manual",
"dateUploaded" : "$updatedAt"
}
]
A similar query works when run through updateMany.
Update query:
db.getCollection('userfeedback').updateMany(
{userId:8426, questionIdentifier:"resumeLink"},
[{
"$set": {
answer: [{
"resumeLink": "$answer",
"resumeId": UUID().hex().match(/^(.{8})(.{4})(.{4})(.{4})(.{12})$/).slice(1,6).join('-'),
"uploadSizeInByte": -1,
"source":"manual",
"dateUploaded": "$updatedAt"
}]
}
}]
)
Result:
[
{
"resumeLink": "https://cdn.upgrad.com/resume/asasjyotiranjana11.docx",
"dateUploaded": "2051-04-26T14:30:00.000Z",
"uploadSizeInByte": 644234,
"resumeId": "7fa1478d-478f-4869-9c4b-7ca8c0b9434g",
"source": "hiration"
}
]
Can someone help me get the same result with runCommand? Thanks in advance.
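For what it's worth, the visible difference between the two versions is that updateMany wraps the update in [ ... ], making it an aggregation pipeline (MongoDB 4.2+), whereas the runCommand version passes u as a plain document, in which $-prefixed strings are stored literally. A sketch of the runCommand form with u written as a pipeline, based on that observation rather than a verified fix (the resumeId expression is omitted, since plain JSON passed through a migration cannot execute shell JavaScript like UUID()):

[
  {
    "update": "userfeedback",
    "updates": [
      {
        "q": { "userId": 8426, "questionIdentifier": "resumeLink" },
        "u": [
          {
            "$set": {
              "answer": [
                {
                  "resumeLink": "$answer",
                  "uploadSizeInByte": -1,
                  "source": "manual",
                  "dateUploaded": "$updatedAt"
                }
              ]
            }
          }
        ]
      }
    ]
  }
]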

MongoDB query on a triple-nested array of objects

I'm having some trouble writing a query to return a triply nested value from a document. The documents I'm using are structured like this:
{
"areaname": "name1",
"places": [
{
"placename": "place1",
"objects": [
{
"objname": "obj1",
"tags": [
"tag1",
"tag2"
]
},
{
"objname": "obj2",
"tags": [
"tag6",
"tag7"
]
}
]
},
{
"placename": "place2",
"objects": [
{
"objname": "obj45",
"tags": [
"tag46",
"tag34"
]
},
{
"objname": "obj77",
"tags": [
"tag56",
"tag11"
]
}
]
}
]
}
It is quite simple actually, but I can't find a solution for a simple query like:
"return the objname of the objects that contain tag1 inside their tags"
So for the given document, if I use "tag1" as a parameter, the query is expected to return "obj1".
It should give me the same result if I use "tag2" as a parameter.
Another example: using "tag56" it should return only "obj77".
Right now I have no problem returning the whole document using dot notation or a top-level field such as areaname:
db.users.find( {"places.objects.tags":"tag1"}, { areaname: 1, _id:0 } )
Is this even possible?
Keeping it simple:
[
{
"$match" : {
"places.objects.tags" : "tag1"
}
},
{
"$unwind" : "$places"
},
{
"$unwind" : "$places.objects"
},
{
"$match" : {
"places.objects.tags" : "tag1"
}
},
{
"$group" : {
"_id" : "$_id",
"obj_names" : {
"$push" : "$places.objects.objname"
}
}
}
]
You should add any other fields you want to keep to the $group stage. This can also be done without the double $unwind stage, but I chose this for readability.
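For completeness, a runnable form of the same pipeline against the collection used in the question's find example (users, an assumption carried over from there):

db.users.aggregate([
  { "$match": { "places.objects.tags": "tag1" } },
  { "$unwind": "$places" },
  { "$unwind": "$places.objects" },
  { "$match": { "places.objects.tags": "tag1" } },
  { "$group": { "_id": "$_id", "obj_names": { "$push": "$places.objects.objname" } } }
])

The second $match is what filters the flattened objects down to the ones actually carrying the tag; the first $match only narrows the candidate documents so the $unwind stages have less work to do.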

Put items using a JSON file in AWS DynamoDB using the AWS CLI

While putting the JSON below into DynamoDB using the AWS CLI with this command:
aws dynamodb put-item --table-name ScreenList --item file://tableName.json
I am getting a Parameter validation failed exception. I have gone rigorously through the AWS docs but failed to find an example of inserting a complicated JSON. Every bit of help is welcome.
The updated JSON:
{
"itemName": {
"S": "SCREEN_LIST"
},
"productName": {
"S": "P2P_MOBITEL"
},
"screenList": {
"L": [
{
"menu": {
"L": [
{
"M": {
"menuId": {
"N": "1"
},
"menuText": {
"S": "ENG_HEADING"
},
"menuType": {
"S": "Dynamic"
}
}
}
]
},
"M": {
"screenFooter": {
"S": "F_LANGUAGE_CHANGE"
},
"screenHeader": {
"S": "H_LANGUAGE_CHANGE"
},
"screenId": {
"S": "LANGUAGE_CHANGE"
},
"screenType": {
"S": ""
}
}
}
]
}
}
It seems that you are defining complex types incorrectly. According to the AWS documentation, you should define a list like this:
"L": ["Cookies", "Coffee", 3.14159]
and a map should be defined like this:
"M": {"Name": {"S": "Joe"}, "Age": {"N": "35"}}
which means that a menu map should be defined like this:
"menu": {
"L": [
{
"M": {
"menuId": {"N" :"1"},
"menuText": {"S" :"PACKS_SCREEN"},
"menuType": {"S" :"Dynamic"}
}
}
]
}
Notice the "M" and "L" attributes.
You should change the rest of your JSON in a similar fashion.
You can find the full JSON definition in the Options section of the put-item documentation.
UPDATE
Now your list definition is incorrect. You have:
"screenList":{
"L":[
{
"menu":{ ... },
"M":{ ... }
}
]
}
While it should be:
"screenList":{
"L":[
{
"M":{ ... }
},
{
"M":{ ... }
}
]
}
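Putting both corrections together, a sketch of how the whole item could look, assuming the menu list is meant to live inside each screen's map:

{
  "itemName": { "S": "SCREEN_LIST" },
  "productName": { "S": "P2P_MOBITEL" },
  "screenList": {
    "L": [
      {
        "M": {
          "menu": {
            "L": [
              {
                "M": {
                  "menuId": { "N": "1" },
                  "menuText": { "S": "ENG_HEADING" },
                  "menuType": { "S": "Dynamic" }
                }
              }
            ]
          },
          "screenFooter": { "S": "F_LANGUAGE_CHANGE" },
          "screenHeader": { "S": "H_LANGUAGE_CHANGE" },
          "screenId": { "S": "LANGUAGE_CHANGE" },
          "screenType": { "S": "" }
        }
      }
    ]
  }
}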

How to access DynamoDB's original JSON elements?

I am trying to test my Lambda manually with the following DynamoDB event input configured in tests.
Let's call this Json-1:
{
"Records": [
{
"eventID": "1",
"eventVersion": "1.0",
"dynamodb": {
"Keys": {
"Id": {
"N": "101"
}
},
"NewImage": {
"Message": {
"S": "New item!"
},
"Id": {
"N": "101"
}
},
"StreamViewType": "NEW_AND_OLD_IMAGES",
"SequenceNumber": "111",
"SizeBytes": 26
},
"awsRegion": "us-west-2",
"eventName": "INSERT",
"eventSourceARN": eventsourcearn,
"eventSource": "aws:dynamodb"
},
{
"eventID": "2",
"eventVersion": "1.0",
"dynamodb": {
"OldImage": {
"Message": {
"S": "New item!"
},
"Id": {
"N": "101"
}
},
"SequenceNumber": "222",
"Keys": {
"Id": {
"N": "101"
}
},
"SizeBytes": 59,
"NewImage": {
"Message": {
"S": "This item has changed"
},
"Id": {
"N": "101"
}
},
"StreamViewType": "NEW_AND_OLD_IMAGES"
},
"awsRegion": "us-west-2",
"eventName": "MODIFY",
"eventSourceARN": sourcearn,
"eventSource": "aws:dynamodb"
},
{
"eventID": "3",
"eventVersion": "1.0",
"dynamodb": {
"Keys": {
"Id": {
"N": "101"
}
},
"SizeBytes": 38,
"SequenceNumber": "333",
"OldImage": {
"Message": {
"S": "This item has changed"
},
"Id": {
"N": "101"
}
},
"StreamViewType": "NEW_AND_OLD_IMAGES"
},
"awsRegion": "us-west-2",
"eventName": "REMOVE",
"eventSourceARN": sourcearn,
"eventSource": "aws:dynamodb"
}
]
}
However, the JSON of the DynamoDB items looks like this.
Let's call this Json-2:
{
"id": {
"S": "RIGHT-aa465568-f4c8-4822-9c38-7563ae0cd37b-1131286033464633.jpg"
},
"lines": {
"L": [
{
"M": {
"points": {
"L": [
{
"L": [
{
"N": "0"
},
{
"N": "874.5625"
}
]
},
{
"L": [
{
"N": "1765.320601851852"
},
{
"N": "809.7800925925926"
}
]
},
{
"L": [
{
"N": "3264"
},
{
"N": "740.3703703703704"
}
]
}
]
},
"type": {
"S": "guard"
}
}
}
]
},
"modified": {
"N": "1483483932472"
},
"qastatus": {
"S": "reviewed"
}
}
Using the Lambda function below, I can connect to my table. My goal is to create a JSON which Elasticsearch will accept.
#Override
public Object handleRequest(DynamodbEvent dynamodbEvent, Context context) {
List<DynamodbEvent.DynamodbStreamRecord> dynamodbStreamRecordlist = dynamodbEvent.getRecords();
DynamoDB dynamoDB = new DynamoDB(new AmazonDynamoDBClient());
log.info("Whole event - "+dynamodbEvent.toString());
dynamodbStreamRecordlist.stream().forEach(dynamodbStreamRecord -> {
if(dynamodbStreamRecord.getEventSource().equalsIgnoreCase("aws:dynamodb")){
log.info("one record - "+dynamodbStreamRecord.getDynamodb().toString());
log.info(" getting N from new image "+dynamodbStreamRecord.getDynamodb().getNewImage().toString());
String tableName = getTableNameFromARN(dynamodbStreamRecord.getEventSourceARN());
log.info("Table name :"+tableName);
Map<String, AttributeValue> keys = dynamodbStreamRecord.getDynamodb().getKeys();
log.info(keys.toString());
AttributeValue attributeValue = keys.get("Id");
log.info("Value of N: "+attributeValue.getN());
Table table = dynamoDB.getTable(tableName);
}
});
return dynamodbEvent;
}
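As an aside, getTableNameFromARN is not shown in the question; a plausible sketch, assuming the usual stream ARN shape arn:aws:dynamodb:<region>:<account>:table/<TableName>/stream/<timestamp>:

private static String getTableNameFromARN(String arn) {
    // The table name sits between ":table/" and the next "/"
    return arn.split(":table/")[1].split("/")[0];
}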
The format of a JSON item that Elasticsearch expects, and what I want to map the test input JSON to, is this.
Let's call this Json-3:
{
"_index": "bar-guard",
"_type": "bar-guard_type",
"_id": "LEFT-b1939610-442f-4d8d-9991-3ca54685b206-1147042497459511.jpg",
"_score": 1,
"_source": {
"#SequenceNumber": "4901800000000019495704485",
"#timestamp": "2017-01-04T02:24:20.560358",
"lines": [{
"points": [[0, 1222.7129629629628],
[2242.8252314814818, 1254.702546296296],
[4000.0000000000005, 1276.028935185185]],
"type": "barr"
}],
"modified": 1483483934697,
"qastatus": "reviewed",
"id": "LEFT-b1939610-442f-4d8d-9991-3ca54685b206-1147042497459511.jpg"
}
}
So what I need is to read Json-1 and map it to Json-3.
However, Json-1 does not seem to be complete, i.e., it does not have the information that a DynamoDB JSON has, like the points and lines in Json-2.
So I was trying to get a connection to the original table and then read the additional lines and points information using the Id.
I am not sure if this is the right approach. Basically, I want to figure out a way to get the actual JSON that DynamoDB has, not the one with attribute types.
How can I get lines and points from Json-2 using Java? I know we have DocumentClient in JavaScript, but I am looking for something in Java.
Also, I came across a converter here, but it doesn't help me: https://github.com/aws/aws-sdk-js/blob/master/lib/dynamodb/converter.js
Is this something I should use DynamoDBMapper or the Scan Java Document API for?
http://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/dynamodbv2/datamodeling/DynamoDBMapper.html#marshallIntoObjects-java.lang.Class-java.util.List-com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig-
If yes, I am a little lost as to how to do that in the code below:
ScanRequest scanRequest = new ScanRequest().withTableName(tableName);
ScanResult result = dynamoDBClient.scan(scanRequest);
for(Map<String, AttributeValue> item : result.getItems()){
AttributeValue value = item.get("lines");
if(value != null){
List<AttributeValue> values = value.getL();
for(AttributeValue value2 : values){
//what next?
}
}
}
Ok, this seems to work for me.
ScanRequest scanRequest = new ScanRequest().withTableName(tableName);
ScanResult result = dynamoDBClient.scan(scanRequest);
for(Map<String, AttributeValue> item : result.getItems()){
AttributeValue value = item.get("lines"); // "lines" is a DynamoDB list (L)
if(value != null){
List<AttributeValue> values = value.getL();
for(AttributeValue value2 : values){
if(value2.getM() != null) // each element of "lines" is a map (M)
{
Map<String, AttributeValue> map = value2.getM();
AttributeValue points = map.get("points"); // "points" is a list of [x, y] pairs
List<AttributeValue> pointsvalues = points.getL();
if(!pointsvalues.isEmpty()){
for(AttributeValue valueOfPoint : pointsvalues){
List<AttributeValue> pointList = valueOfPoint.getL(); // one [x, y] pair
for(AttributeValue valueOfPoint2 : pointList){
// valueOfPoint2.getN() returns each coordinate as a string
}
}
}
}
}
}
}
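A shorter route, assuming a reasonably recent 1.x AWS SDK for Java, is to let the Document API strip the type wrappers: ItemUtils.toItem converts a typed attribute-value map into an Item whose toJSON() returns plain JSON, much closer to Json-3's _source. A sketch, not verified against this exact table:

import com.amazonaws.services.dynamodbv2.document.Item;
import com.amazonaws.services.dynamodbv2.document.ItemUtils;

for (Map<String, AttributeValue> attributeMap : result.getItems()) {
    // Turns {"lines": {"L": [...]}} style maps into a plain document
    Item item = ItemUtils.toItem(attributeMap);
    String plainJson = item.toJSON(); // JSON without the S/N/L/M wrappers
}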

Indexing JSON with a JSON value in Elasticsearch

I am trying to index a document in Elasticsearch. The JSON I have comes from the document being transformed from XML to JSON. It is valid JSON and looks like this:
{
"shortcasename": {
"_attributes": {
"party1": "People",
"party2": "Johnson"
},
"_children": [
"People",
{
"connector": {
"_attributes": {
"normval": "v"
},
"_children": [
" v. "
]
}
},
"Johnson"
]
}
}
Elasticsearch seems to have a problem with shortcasename._children. The error I get is:
{
"error": {
"root_cause": [
{
"type": "mapper_parsing_exception",
"reason": "failed to parse"
}
],
"type": "mapper_parsing_exception",
"reason": "failed to parse",
"caused_by": {
"type": "illegal_argument_exception",
"reason": "mapper [shortcasename._children] of different type, current_type [string], merged_type [ObjectMapper]"
}
},
"status": 400
}
Is there a way to get the JSON indexed the way it is?
The JSON you have has a conflict in the _children field.
The top-level _children field is an array containing a mix of objects ({"connector": ...}) and strings ("People", "Johnson"). Elasticsearch doesn't support mixed types within one field, which is why it complains that it cannot merge string and ObjectMapper.
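If the field only needs to be stored rather than searched, one option (an assumption about your requirements, not part of the original answer) is to map _children with enabled: false, which tells Elasticsearch to skip parsing the field's contents entirely, so the mixed array is accepted as-is. A sketch using a hypothetical index name cases and Elasticsearch 7+ mapping syntax (older versions need an extra mapping-type level):

PUT cases
{
  "mappings": {
    "properties": {
      "shortcasename": {
        "properties": {
          "_children": {
            "type": "object",
            "enabled": false
          }
        }
      }
    }
  }
}

Otherwise, the usual fix is to transform the array so every element has the same shape, e.g. wrapping the bare strings as objects before indexing.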