WSLITE: Building JSON REST request from Map causes StackOverflowError

I have been stuck on this problem for a while now and couldn't find any solution so far.
Basically I am doing nothing out of the ordinary.
I have a method that configures and sends a REST request using Wslite. It accepts a map, which is passed as the payload to the closure of the client's post method, like this:
def postSomething(Map payload) {
    RESTClient client = new RESTClient("http://somedomain.com/path/")
    return client.post(accept: ContentType.ANY,
                       headers: [callTreeId: uuid, jwt: token]) {
        json payload
    }
}
The passed map comes from another class which is responsible for some transformation and building the map containing the data I want to post.
The map is structured like this:
Map data =
[
    country: "String1",
    fulfillingStoreId: "String2",
    customerId: "String3",
    cardholderId: "String3",
    deliveryDate: "String4",
    deliveryAddressId: "String5",
    serviceType: "String6",
    paymentType: "String1",
    source: "String7",
    origin:
    [
        system: "String8",
        id: "String9"
    ],
    contactFirstName: "String10",
    contactLastName: "String11",
    contactPhoneNumber: "String12",
    items: [m_itemList] // a list that holds instances of some item objects
]
def List<Item> m_itemList = []

class Item {
    def property1 = ""
    def property2 = ""
    def property3 = ""
}
Using JsonOutput.prettyPrint(JsonOutput.toJson(data)) prints a nice JSON string representation to the console - everything looks as expected.
Now, passing the 'data' map to the post closure (as the payload) raises a java.lang.StackOverflowError.
Stack trace:
Exception in thread "main" java.lang.StackOverflowError
at java.util.HashMap.getNode(Unknown Source)
at java.util.HashMap.get(Unknown Source)
at java.lang.ClassLoader.getPackage(Unknown Source)
at java.lang.Package.getPackage(Unknown Source)
at java.lang.Class.getPackage(Unknown Source)
at wslite.json.JSONObject.wrap(JSONObject.java:1595)
at wslite.json.JSONArray.<init>(JSONArray.java:173)
at wslite.json.JSONObject.wrap(JSONObject.java:1590)
at wslite.json.JSONObject.populateMap(JSONObject.java:1012)
at wslite.json.JSONObject.<init>(JSONObject.java:292)
at wslite.json.JSONObject.wrap(JSONObject.java:1606)
at wslite.json.JSONObject.populateMap(JSONObject.java:1012)
at wslite.json.JSONObject.<init>(JSONObject.java:292)
at wslite.json.JSONObject.wrap(JSONObject.java:1606)
at wslite.json.JSONArray.<init>(JSONArray.java:173)
at wslite.json.JSONObject.wrap(JSONObject.java:1590)
at wslite.json.JSONObject.populateMap(JSONObject.java:1012)
at wslite.json.JSONObject.<init>(JSONObject.java:292)
at wslite.json.JSONObject.wrap(JSONObject.java:1606)
at wslite.json.JSONObject.populateMap(JSONObject.java:1012)
at wslite.json.JSONObject.<init>(JSONObject.java:292)
at wslite.json.JSONObject.wrap(JSONObject.java:1606)
at wslite.json.JSONArray.<init>(JSONArray.java:173)
at wslite.json.JSONObject.wrap(JSONObject.java:1590)
at wslite.json.JSONObject.populateMap(JSONObject.java:1012)
at wslite.json.JSONObject.<init>(JSONObject.java:292)
at wslite.json.JSONObject.wrap(JSONObject.java:1606)
at wslite.json.JSONObject.populateMap(JSONObject.java:1012)
at wslite.json.JSONObject.<init>(JSONObject.java:292)
at wslite.json.JSONObject.wrap(JSONObject.java:1606)
at wslite.json.JSONArray.<init>(JSONArray.java:173)
at wslite.json.JSONObject.wrap(JSONObject.java:1590)
at wslite.json.JSONObject.populateMap(JSONObject.java:1012)
at wslite.json.JSONObject.<init>(JSONObject.java:292)
at wslite.json.JSONObject.wrap(JSONObject.java:1606)
at wslite.json.JSONObject.populateMap(JSONObject.java:1012)
at wslite.json.JSONObject.<init>(JSONObject.java:292)
at wslite.json.JSONObject.wrap(JSONObject.java:1606)
at wslite.json.JSONArray.<init>(JSONArray.java:173)
at wslite.json.JSONObject.wrap(JSONObject.java:1590)
at wslite.json.JSONObject.populateMap(JSONObject.java:1012)
...
...
...
I understand that the content builder of the Wslite client accepts a map, and I have done this before with other (simpler) requests. So what might be the problem?
Thank you in advance for your contributions.
UPDATE / WORKAROUND SOLUTION:
So after some digging I figured to just re-slurp the built JSON with JsonSlurper before passing it to the content-building closure, since pretty-printing the map shows correct results. Voilà! No more StackOverflowError.
I now construct the map using JsonBuilder and parse the result (a String) with JsonSlurper, then finally pass this to WSLITE's content builder.
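The round-trip can be mimicked in Python to show why it works (a rough sketch, not WSLITE's API): serialising with a string fallback and re-parsing leaves only JSON-native dicts, lists and strings, which a content builder can walk without recursing into arbitrary objects.

```python
import json
import uuid

# Hypothetical payload whose leaves are not all JSON-native types
payload = {"id": uuid.uuid4(), "items": [{"name": "widget"}]}

# Round-trip: serialise with a string fallback for unknown types, then
# re-parse. The result contains only plain dicts, lists and strings.
plain = json.loads(json.dumps(payload, default=str))
```

After the round-trip, `plain["id"]` is an ordinary string rather than a UUID object.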

I just had the same issue. Turns out, my payload Map had values in it that were not Strings (they were UUIDs and Enums, which typically auto-toString() nicely). When I manually converted the values to Strings, the error went away.
payload.each { k, v -> payload[k] = v.toString() }
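Note that the one-liner above only converts top-level values. If non-String values can appear deeper in the map (e.g. inside origin or the items list), a recursive pass is needed. A minimal Python sketch of the same idea (names and types are illustrative, not from the original payload):

```python
from enum import Enum
from uuid import uuid4

def stringify_leaves(value):
    """Recursively convert every non-container leaf to a string,
    keeping the dict/list structure intact."""
    if isinstance(value, dict):
        return {k: stringify_leaves(v) for k, v in value.items()}
    if isinstance(value, list):
        return [stringify_leaves(v) for v in value]
    return str(value)

class Source(Enum):  # hypothetical enum-valued field
    WEB = "web"

payload = {"id": uuid4(), "source": Source.WEB, "items": [{"qty": 3}]}
clean = stringify_leaves(payload)  # every leaf is now a plain string
```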


kotlinx.serialization.MissingFieldException: Field 'X' is required for type with serial name but it was missing (error in Kotlin)

I am making a dictionary application and I have my own JSON dataset in the assets folder. The first time, I read from this dataset and save it to Room; after that, I read from Room. However, I am encountering this error.
E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.enestigli.dictionaryapp, PID: 1868
kotlinx.serialization.MissingFieldException: Field 'example' is required for type with serial name 'com.enestigli.dictionaryapp.data.locale.datamodel.Idioms', but it was missing
at kotlinx.serialization.internal.PluginExceptionsKt.throwMissingFieldException(PluginExceptions.kt:20)
at com.enestigli.dictionaryapp.data.locale.datamodel.Idioms.<init>(Idiom.kt:28)
at com.enestigli.dictionaryapp.data.locale.datamodel.Idioms$$serializer.deserialize(Idiom.kt:28)
at com.enestigli.dictionaryapp.data.locale.datamodel.Idioms$$serializer.deserialize(Idiom.kt:28)
at kotlinx.serialization.json.internal.PolymorphicKt.decodeSerializableValuePolymorphic(Polymorphic.kt:59)
at kotlinx.serialization.json.internal.StreamingJsonDecoder.decodeSerializableValue(StreamingJsonDecoder.kt:36)
at kotlinx.serialization.encoding.AbstractDecoder.decodeSerializableValue(AbstractDecoder.kt:43)
at kotlinx.serialization.encoding.AbstractDecoder.decodeSerializableElement(AbstractDecoder.kt:70)
at kotlinx.serialization.encoding.CompositeDecoder$DefaultImpls.decodeSerializableElement$default(Decoding.kt:535)
at kotlinx.serialization.internal.ListLikeSerializer.readElement(CollectionSerializers.kt:80)
at kotlinx.serialization.internal.AbstractCollectionSerializer.readElement$default(CollectionSerializers.kt:51)
at kotlinx.serialization.internal.AbstractCollectionSerializer.merge(CollectionSerializers.kt:36)
at kotlinx.serialization.internal.AbstractCollectionSerializer.deserialize(CollectionSerializers.kt:43)
at kotlinx.serialization.json.internal.PolymorphicKt.decodeSerializableValuePolymorphic(Polymorphic.kt:59)
at kotlinx.serialization.json.internal.StreamingJsonDecoder.decodeSerializableValue(StreamingJsonDecoder.kt:36)
at kotlinx.serialization.encoding.AbstractDecoder.decodeSerializableValue(AbstractDecoder.kt:43)
at kotlinx.serialization.encoding.AbstractDecoder.decodeSerializableElement(AbstractDecoder.kt:70)
at com.enestigli.dictionaryapp.data.locale.datamodel.IdiomItem$$serializer.deserialize(Idiom.kt:15)
at com.enestigli.dictionaryapp.data.locale.datamodel.IdiomItem$$serializer.deserialize(Idiom.kt:15)
at kotlinx.serialization.json.internal.PolymorphicKt.decodeSerializableValuePolymorphic(Polymorphic.kt:59)
at kotlinx.serialization.json.internal.StreamingJsonDecoder.decodeSerializableValue(StreamingJsonDecoder.kt:36)
at kotlinx.serialization.encoding.AbstractDecoder.decodeSerializableValue(AbstractDecoder.kt:43)
at kotlinx.serialization.encoding.AbstractDecoder.decodeSerializableElement(AbstractDecoder.kt:70)
at kotlinx.serialization.encoding.CompositeDecoder$DefaultImpls.decodeSerializableElement$default(Decoding.kt:535)
at kotlinx.serialization.internal.ListLikeSerializer.readElement(CollectionSerializers.kt:80)
at kotlinx.serialization.internal.AbstractCollectionSerializer.readElement$default(CollectionSerializers.kt:51)
at kotlinx.serialization.internal.AbstractCollectionSerializer.merge(CollectionSerializers.kt:36)
at kotlinx.serialization.internal.AbstractCollectionSerializer.deserialize(CollectionSerializers.kt:43)
at kotlinx.serialization.json.internal.PolymorphicKt.decodeSerializableValuePolymorphic(Polymorphic.kt:59)
at kotlinx.serialization.json.internal.StreamingJsonDecoder.decodeSerializableValue(StreamingJsonDecoder.kt:36)
at com.enestigli.dictionaryapp.data.locale.datamodel.IdiomDataModel$$serializer.deserialize-zhuhEdE(Idiom.kt:8)
at com.enestigli.dictionaryapp.data.locale.datamodel.IdiomDataModel$$serializer.deserialize(Idiom.kt:8)
at kotlinx.serialization.json.internal.PolymorphicKt.decodeSerializableValuePolymorphic(Polymorphic.kt:59)
at kotlinx.serialization.json.internal.StreamingJsonDecoder.decodeSerializableValue(StreamingJsonDecoder.kt:36)
at kotlinx.serialization.json.Json.decodeFromString(Json.kt:100)
at com.enestigli.dictionaryapp.data.locale.datasource.AssetDataSource.getIdioms-zhuhEdE(AssetsDataSource.kt:84)
E/AndroidRuntime: at com.enestigli.dictionaryapp.data.repository.IdiomsRepositoryImpl.initData(IdiomsRepositoryImpl.kt:21)
at com.enestigli.dictionaryapp.domain.use_case.idiom.insert.InsertIdiomsUseCase.initIdiomData(InsertIdiomsUseCase.kt:12)
at com.enestigli.dictionaryapp.presentation.SplashScreen.SplashScreenViewModel$insertAllDataToRoomDb$1.invokeSuspend(SplashScreenViewModel.kt:38)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
at android.os.Handler.handleCallback(Handler.java:942)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loopOnce(Looper.java:201)
at android.os.Looper.loop(Looper.java:288)
at android.app.ActivityThread.main(ActivityThread.java:7898)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:548)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:936)
Suppressed: kotlinx.coroutines.DiagnosticCoroutineContextException: [StandaloneCoroutine{Cancelling}#5aeb15c, Dispatchers.Main.immediate]
My JSON dataset looks like this:
[
    {
        "letter": "A",
        "idioms": [
            {
                "idiom": "above board",
                "meaning": "If something is above board, it's been done in a legal and honest way.",
                "examples": [
                    "I'm sure the deal was completely above board as I know James well and he'd never do anything illegal or corrupt.",
                    "The minister claimed all the appointments were above board and denied claims that some positions had been given to his friends."
                ]
            },
            {
                "idiom": "above the law",
                "meaning": "If someone is above the law, they are not subject to the laws of a society.",
                "examples": [
                    "Just because his father is a rich and powerful man, he seems to think he's above the law and he can do whatever he likes.",
                    "In a democracy, no-one is above the law - not even a president or a prime-minister."
                ]
            },
            {
                "idiom": "Achilles' heel",
                "meaning": "An Achilles' heel is a weakness that could result in failure.",
                "examples": [
                    "He's a good golfer, but his Achilles' heel is his putting and it's often made him lose matches.",
                    "The country's dependence on imported oil could prove to be its Achilles' heel if prices keep on rising."
                ]
            },
My Data Model
@JvmInline
@Serializable
value class IdiomDataModel(
    val allData: List<IdiomItem>
)

@Serializable
data class IdiomItem(
    val letter: String,
    val idioms: List<Idioms>
) {
    fun toEntity() = IdiomEntity(
        letter = letter,
        idioms = idioms
    )
}

@Serializable
data class Idioms(
    val idiom: String,
    val meaning: String,
    val example: List<String>
)
Entity
@Entity(tableName = "idioms")
data class IdiomEntity(
    @ColumnInfo(name = "letter") val letter: String,
    @ColumnInfo(name = "idioms") val idioms: List<Idioms>,
    @PrimaryKey(autoGenerate = true) val uid: Int? = null
)
I think the problem comes from val example: List<String> in the Idioms data class. Could it be that this does not exactly match the examples list in the JSON dataset?
In your Idioms data class, rename example to examples and it should work.
Old:
@Serializable
data class Idioms(
    val idiom: String,
    val meaning: String,
    val example: List<String>
)
New:
@Serializable
data class Idioms(
    val idiom: String,
    val meaning: String,
    val examples: List<String>
)
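The underlying rule is that strict deserialisers match JSON keys to declared property names exactly. A hypothetical Python sketch of the same behaviour (decode_idiom and REQUIRED_FIELDS are illustrative, not part of kotlinx.serialization):

```python
import json

# Hypothetical strict decoder mirroring how kotlinx.serialization matches
# JSON keys to property names: every declared field must be present under
# precisely that key, or deserialisation fails.
REQUIRED_FIELDS = ("idiom", "meaning", "examples")

def decode_idiom(raw):
    obj = json.loads(raw)
    for field in REQUIRED_FIELDS:
        if field not in obj:
            raise ValueError("Field '%s' is required but it was missing" % field)
    return obj

good = '{"idiom": "above board", "meaning": "done honestly", "examples": ["..."]}'
bad = '{"idiom": "above board", "meaning": "done honestly", "example": ["..."]}'

decoded = decode_idiom(good)  # keys match the declared names, so this succeeds
# decode_idiom(bad) fails: "example" is not the declared name "examples"
```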

Python Lambda actioning CSV file once but not the second time

I am experiencing a strange issue with my Python code. Its objective is the following:
Retrieve a .csv from S3
Convert that .csv into JSON (it's an array of objects)
Add a few key-value pairs to each object in the array, and change the original keys
Validate the JSON
Send the JSON to an /output S3 bucket
Load the JSON into Dynamo
Here's what the .csv looks like:
Prefix,Provider
ABCDE,Provider A
QWERT,Provider B
ASDFG,Provider C
ZXCVB,Provider D
POIUY,Provider E
And here's my python script:
import json
import boto3
import ast
import csv
import os
import datetime as dt
from datetime import datetime
import jsonschema
from jsonschema import validate

s3 = boto3.client('s3')
dynamodb = boto3.resource('dynamodb')

providerCodesSchema = {
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "providerCode": {"type": "string", "maxLength": 5},
            "providerName": {"type": "string"},
            "activeFrom": {"type": "string", "format": "date"},
            "activeTo": {"type": "string"},
            "apiActiveFrom": {"type": "string"},
            "apiActiveTo": {"type": "string"},
            "countThreshold": {"type": "string"}
        },
        "required": ["providerCode", "providerName"]
    }
}

datestamp = dt.datetime.now().strftime("%Y/%m/%d")
timestamp = dt.datetime.now().strftime("%s")
updateTime = dt.datetime.now().strftime("%Y/%m/%d/%H:%M:%S")
nowdatetime = dt.datetime.now()
yesterday = nowdatetime - dt.timedelta(days=1)
nintydaysfromnow = nowdatetime + dt.timedelta(days=90)

def lambda_handler(event, context):
    filename_json = "/tmp/file_{ts}.json".format(ts=timestamp)
    filename_csv = "/tmp/file_{ts}.csv".format(ts=timestamp)
    keyname_s3 = "newloader-ptv/output/{ds}/{ts}.json".format(ds=datestamp, ts=timestamp)
    json_data = []
    for record in event['Records']:
        bucket_name = record['s3']['bucket']['name']
        key_name = record['s3']['object']['key']
        s3_object = s3.get_object(Bucket=bucket_name, Key=key_name)
        data = s3_object['Body'].read()
        contents = data.decode('latin')
        with open(filename_csv, 'a', encoding='utf-8') as csv_data:
            csv_data.write(contents)
        with open(filename_csv, encoding='utf-8-sig') as csv_data:
            csv_reader = csv.DictReader(csv_data)
            for csv_row in csv_reader:
                json_data.append(csv_row)
    for elem in json_data:
        elem['providerCode'] = elem.pop('Prefix')
        elem['providerName'] = elem.pop('Provider')
    for element in json_data:
        element['activeFrom'] = yesterday.strftime("%Y-%m-%dT%H:%M:%S.00-00:00")
        element['activeTo'] = nintydaysfromnow.strftime("%Y-%m-%dT%H:%M:%S.00-00:00")
        element['apiActiveFrom'] = " "
        element['apiActiveTo'] = " "
        element['countThreshold'] = "3"
        element['updateDate'] = updateTime
    try:
        validate(instance=json_data, schema=providerCodesSchema)
    except jsonschema.exceptions.ValidationError as err:
        print(err)
        err = "Given JSON data is InValid"
        return None
    with open(filename_json, 'w', encoding='utf-8-sig') as json_file:
        json_file.write(json.dumps(json_data, default=str))
    with open(filename_json, 'r', encoding='utf-8-sig') as json_file_contents:
        response = s3.put_object(Bucket=bucket_name, Key=keyname_s3, Body=json_file_contents.read())
    for jsonElement in json_data:
        table = dynamodb.Table('privateProviders-loader')
        table.put_item(Item=jsonElement)
    print("finished enriching JSON")
    os.remove(filename_csv)
    os.remove(filename_json)
    return None
I'm new to Python, so please forgive any amateur mistakes in the code.
Here's my issue:
When I deploy the code and add a valid .csv into my S3 bucket, everything works.
When I then add an invalid .csv into my S3 bucket, it again works as intended: the import fails as the validation kicks in and tells me the problem.
However, when I add the valid .csv back into the S3 bucket, I get the same cloudwatch log as I did for the invalid .csv, and my Dynamo isn't updated, nor is an output JSON file sent to /output in S3.
With some troubleshooting I've noticed the following behaviour:
When I first deploy the code, the first .csv loads as expected (dynamo table updated + JSON file sent to S3 + cloudwatch logs documenting the process)
If I enter the same valid .csv into the S3 bucket, it gives me the same nice looking cloudwatch logs, but none of the other actions take place (Dynamo not updated etc)
If I add the invalid .csv, that seems to break the cycle, and I get a nice CloudWatch log showing the validation has kicked in. But if I then reload the valid .csv, which just previously resulted in good CloudWatch logs (but no actual outputs), I now get a repeat of the validation-error log.
In short, the first time the function is invoked, it seems to work, the second time it doesn't.
It seems as though the Python function is caching something or not closing out when finished. I've played about with the return command etc., but nothing I've tried works. I've sunk many hours into moving parts of the code around, thinking the structure or order of events was the problem, and the code above gives me the closest behaviour to what's expected, given that it seems to work completely the first (and only) time I load the .csv into S3.
Any help or general pointers would be massively appreciated.
Thanks
P.S. Here's an example of the CloudWatch log when validation kicks in and stops an invalid .csv from being processed. If I then add a valid .csv to S3, the function is triggered, but I get this same error, even though the file is actually good.
2021-06-29T22:12:27.709+01:00 'ABCDEE' is too long
2021-06-29T22:12:27.709+01:00 Failed validating 'maxLength' in schema['items']['properties']['providerCode']:
2021-06-29T22:12:27.709+01:00 {'maxLength': 5, 'type': 'string'}
2021-06-29T22:12:27.709+01:00 On instance[2]['providerCode']:
2021-06-29T22:12:27.709+01:00 'ABCDEE'
2021-06-29T22:12:27.710+01:00 END RequestId: 81dd6a2d-130b-4c8f-ad08-39307841adf9
2021-06-29T22:12:27.710+01:00 REPORT RequestId: 81dd6a2d-130b-4c8f-ad08-39307841adf9 Duration: 482.43 ms Billed Duration: 483
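One plausible explanation for the one-time behaviour, sketched in plain Python (a warm-start simulation, not AWS code): Lambda keeps the module alive between invocations, so anything computed at module level - such as a timestamp used to build /tmp file names - is frozen after the first call, and a handler that opens those files in append mode keeps growing the same file.

```python
import datetime as dt

# Simulation of a warm AWS Lambda container (illustrative, no AWS calls):
# module-level code runs once per container; the handler runs per invocation.
# A timestamp computed up here is frozen for the container's lifetime.
timestamp_at_import = str(int(dt.datetime.now().timestamp()))

invocations = []

def handler(event, context):
    # Values that must be fresh per request belong *inside* the handler.
    timestamp_now = str(int(dt.datetime.now().timestamp()))
    invocations.append(timestamp_now)
    return timestamp_at_import

# Two "warm" invocations still see the same import-time value:
first = handler({}, None)
second = handler({}, None)
```

Under this assumption, moving the timestamp/filename computation inside lambda_handler (or truncating the /tmp files with mode 'w') would make each invocation independent.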

Emit Python embedded object as native JSON in YAML document

I'm importing webservice tests from Excel and serialising them as YAML.
But, taking advantage of YAML being a superset of JSON, I'd like the request part of the test to be valid JSON, i.e. to have delimiters, quotes and commas.
This will allow us to cut and paste requests between the automated test suite and manual test tools (e.g. Postman.)
So here's how I'd like a test to look (simplified):
- properties:
      METHOD: GET
      TYPE: ADDRESS
      Request URL: /addresses
      testCaseId: TC2
  request:
      {
          "unitTypeCode": "",
          "unitNumber": "15",
          "levelTypeCode": "L",
          "roadNumber1": "810",
          "roadName": "HAY",
          "roadTypeCode": "ST",
          "localityName": "PERTH",
          "postcode": "6000",
          "stateTerritoryCode": "WA"
      }
In Python, my request object has a dict attribute called fields, which is the part of the object to be serialised as JSON. This is what I tried:
import json
import yaml

def request_presenter(dumper, request):
    json_string = json.dumps(request.fields, indent=8)
    return dumper.represent_str(json_string)

yaml.add_representer(Request, request_presenter)

test = Test(...)  # including embedded request object
serialised_test = yaml.dump(test)
I'm getting:
- properties:
      METHOD: GET
      TYPE: ADDRESS
      Request URL: /addresses
      testCaseId: TC2
  request: "{
        \"unitTypeCode\": \"\",\n
        \"unitNumber\": \"15\",\n
        \"levelTypeCode\": \"L\",\n
        \"roadNumber1\": \"810\",\n
        \"roadName\": \"HAY\",\n
        \"roadTypeCode\": \"ST\",\n
        \"localityName\": \"PERTH\",\n
        \"postcode\": \"6000\",\n
        \"stateTerritoryCode\": \"WA\"\n
    }"
...only worse because it's all on one line and has white space all over the place.
I tried using the | style for literal multi-line strings which helps with the line breaks and escaped quotes (it's more involved but this answer was helpful.) However, escaped or multiline, the result is still a string that will need to be parsed separately.
How can I stop PyYAML analysing the JSON block as a string, and make it just accept a block of text as part of the emitted YAML? I'm guessing it's something to do with overriding the emitter, but I could use some help. If possible I'd like to avoid post-processing the serialised test to achieve this.
Ok, so this was the solution I came up with. Generate the YAML with a placemarker ahead of time. The placemarker marks the place where the JSON should be inserted, and also defines the root-level indentation of the JSON block.
import os
import itertools
import json

def insert_json_in_yaml(pre_insert_yaml, key, obj_to_serialise):
    marker = '%s: null' % key
    marker_line = line_of_first_occurrence(pre_insert_yaml, marker)
    marker_indent = string_indent(marker_line)
    serialised = json.dumps(obj_to_serialise, indent=marker_indent + 4)
    key_with_json = '%s: %s' % (key, serialised)
    serialised_with_json = pre_insert_yaml.replace(marker, key_with_json)
    return serialised_with_json

def line_of_first_occurrence(basestring, substring):
    """
    return the line containing the first occurrence of substring
    """
    lineno = lineno_of_first_occurrence(basestring, substring)
    return basestring.split(os.linesep)[lineno]

def string_indent(s):
    """
    return indentation of a string (number of spaces before a nonspace)
    """
    spaces = ''.join(itertools.takewhile(lambda c: c == ' ', s))
    return len(spaces)

def lineno_of_first_occurrence(basestring, substring):
    """
    return line number of first occurrence of substring
    """
    return basestring[:basestring.index(substring)].count(os.linesep)
embedded_object = {
    "unitTypeCode": "",
    "unitNumber": "15",
    "levelTypeCode": "L",
    "roadNumber1": "810",
    "roadName": "HAY",
    "roadTypeCode": "ST",
    "localityName": "PERTH",
    "postcode": "6000",
    "stateTerritoryCode": "WA"
}

yaml_string = """
---
- properties:
      METHOD: GET
      TYPE: ADDRESS
      Request URL: /addresses
      testCaseId: TC2
  request: null
  after_request: another value
"""
>>> print(insert_json_in_yaml(yaml_string, 'request', embedded_object))
- properties:
      METHOD: GET
      TYPE: ADDRESS
      Request URL: /addresses
      testCaseId: TC2
  request: {
      "unitTypeCode": "",
      "unitNumber": "15",
      "levelTypeCode": "L",
      "roadNumber1": "810",
      "roadName": "HAY",
      "roadTypeCode": "ST",
      "localityName": "PERTH",
      "postcode": "6000",
      "stateTerritoryCode": "WA"
  }
  after_request: another value

Error in Dataweave transformation involving a JSON Payload in Mule 3.8.5

I am getting an error while using a DataWeave transformation on a JSON payload. The JSON payload is:
{
    "requestId": "13431#1638a2abfb8",
    "result": [
        {
            "batchId": 1028,
            "importId": "1028",
            "status": "Queued"
        }
    ],
    "success": true
}
The above payload is returned by a RESTful service, and I have converted it to an object using a byte-array-to-object transformer before applying the following DataWeave transformation:
%dw 1.0
%output application/json
---
batchexecution:
{
    batchid: payload.result[0].batchid,
    status: payload.result[0].status,
    success: payload.success
} when ((payload.result != null) and (sizeOf payload.result > 0))
otherwise
{
    batchid: 0,
    status: "Not queued",
    success: false
}
I am expecting only one record in the result array, and I have a check to see whether the array is null or its size is > 0, yet I get an error when I execute the transformation. Not sure what is wrong here.
I am expecting the following output from the transformation:
{
    "batchexecution": {
        "batchId": 1028,
        "status": "Queued",
        "success": true
    }
}
But instead I am getting the following error: "You cannot compare a value of type ::array."
Message : Exception while executing:
{"requestId":"64b3#1638e55058c","result":[{"batchId":1037,"importId":"1037","status":"Queued"}],"success":true}
^
You cannot compare a value of type ::array.
Payload : {"requestId":"64b3#1638e55058c","result":[{"batchId":1037,"importId":"1037","status":"Queued"}],"success":true}
Payload Type : java.lang.String
Element : /marketing-dbmkt-etl-marketoFlow/processors/8 # marketing-dbmkt-etl-marketo:marketing-dbmkt-etl-marketo.xml:69 (Transform Message)
Element XML : <dw:transform-message doc:name="Transform Message" metadata:id="90448cfd-5884-441a-a989-e32e4877ac24">
<dw:input-payload mimeType="application/json" doc:sample="sample_data\batchreturnObject.dwl"></dw:input-payload>
<dw:set-payload>%dw 1.0%output application/json---batchexecution:{batchid:payload.result[0].batchid,status: payload.result[0].status,success:payload.success} when ((payload.result != null) and (sizeOf payload.result > 0))otherwise{batchid: 0,status:"Not queued",success:false}</dw:set-payload>
</dw:transform-message>
--------------------------------------------------------------------------------
Root Exception stack trace:
com.mulesoft.weave.mule.exception.WeaveExecutionException: Exception while executing:
{"requestId":"64b3#1638e55058c","result":[{"batchId":1037,"importId":"1037","status":"Queued"}],"success":true}
^
You cannot compare a value of type ::array.
at com.mulesoft.weave.mule.exception.WeaveExecutionException$.apply(WeaveExecutionException.scala:10)
at com.mulesoft.weave.mule.WeaveMessageProcessor.execute(WeaveMessageProcessor.scala:121)
at com.mulesoft.weave.mule.WeaveMessageProcessor.process(WeaveMessageProcessor.scala:67)
at org.mule.execution.ExceptionToMessagingExceptionExecutionInterceptor.execute(ExceptionToMessagingExceptionExecutionInterceptor.java:27)
at org.mule.execution.MessageProcessorNotificationExecutionInterceptor.execute(MessageProcessorNotificationExecutionInterceptor.java:108)
at org.mule.execution.MessageProcessorExecutionTemplate.execute(MessageProcessorExecutionTemplate.java:44)
at org.mule.processor.BlockingProcessorExecutor.executeNext(BlockingProcessorExecutor.java:88)
at org.mule.processor.BlockingProcessorExecutor.execute(BlockingProcessorExecutor.java:59)
at org.mule.execution.ExceptionToMessagingExceptionExecutionInterceptor.execute(ExceptionToMessagingExceptionExecutionInterceptor.java:27)
at org.mule.execution.MessageProcessorExecutionTemplate.execute(MessageProcessorExecutionTemplate.java:44)
at org.mule.processor.BlockingProcessorExecutor.executeNext(BlockingProcessorExecutor.java:98)
at org.mule.processor.BlockingProcessorExecutor.execute(BlockingProcessorExecutor.java:59)
at org.mule.interceptor.AbstractEnvelopeInterceptor.processBlocking(AbstractEnvelopeInterceptor.java:58)
at org.mule.processor.AbstractRequestResponseMessageProcessor.process(AbstractRequestResponseMessageProcessor.java:47)
at org.mule.processor.AsyncInterceptingMessageProcessor.processNextTimed(AsyncInterceptingMessageProcessor.java:129)
at org.mule.processor.AsyncInterceptingMessageProcessor$AsyncMessageProcessorWorker$1.process(AsyncInterceptingMessageProcessor.java:213)
at org.mule.processor.AsyncInterceptingMessageProcessor$AsyncMessageProcessorWorker$1.process(AsyncInterceptingMessageProcessor.java:206)
at org.mule.execution.ExecuteCallbackInterceptor.execute(ExecuteCallbackInterceptor.java:16)
at org.mule.execution.CommitTransactionInterceptor.execute(CommitTransactionInterceptor.java:35)
at org.mule.execution.CommitTransactionInterceptor.execute(CommitTransactionInterceptor.java:22)
at org.mule.execution.HandleExceptionInterceptor.execute(HandleExceptionInterceptor.java:30)
at org.mule.execution.HandleExceptionInterceptor.execute(HandleExceptionInterceptor.java:14)
at org.mule.execution.BeginAndResolveTransactionInterceptor.execute(BeginAndResolveTransactionInterceptor.java:67)
at org.mule.execution.ResolvePreviousTransactionInterceptor.execute(ResolvePreviousTransactionInterceptor.java:44)
at org.mule.execution.SuspendXaTransactionInterceptor.execute(SuspendXaTransactionInterceptor.java:50)
at org.mule.execution.ValidateTransactionalStateInterceptor.execute(ValidateTransactionalStateInterceptor.java:40)
at org.mule.execution.IsolateCurrentTransactionInterceptor.execute(IsolateCurrentTransactionInterceptor.java:41)
at org.mule.execution.ExternalTransactionInterceptor.execute(ExternalTransactionInterceptor.java:48)
at org.mule.execution.RethrowExceptionInterceptor.execute(RethrowExceptionInterceptor.java:28)
at org.mule.execution.RethrowExceptionInterceptor.execute(RethrowExceptionInterceptor.java:13)
at org.mule.execution.TransactionalErrorHandlingExecutionTemplate.execute(TransactionalErrorHandlingExecutionTemplate.java:110)
at org.mule.execution.TransactionalErrorHandlingExecutionTemplate.execute(TransactionalErrorHandlingExecutionTemplate.java:30)
at org.mule.processor.AsyncInterceptingMessageProcessor$AsyncMessageProcessorWorker.doRun(AsyncInterceptingMessageProcessor.java:205)
at org.mule.work.AbstractMuleEventWork.run(AbstractMuleEventWork.java:53)
at org.mule.work.WorkerContext.run(WorkerContext.java:301)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
********************************************************************************
The problem here is not obvious, but I have come across the same issue before - it's related to the sizeOf function and the poor way that Mule applies precedence to some of its operators. When you say (sizeOf payload.result > 0), it first tries to resolve the payload.result > 0 expression - hence the error you're seeing (it's trying to compare an array to 0).
Please use ((sizeOf payload.result) > 0) instead (I always make a point of wrapping sizeOf in parentheses for this reason).
As a side note, you have batchid: payload.result[0].batchid - it should be batchId: payload.result[0].batchId (capitalisation in batchId).
Whenever you use a function like sizeOf in DataWeave, try to encapsulate it in round braces to avoid these kinds of errors.
@ghoshyTech in your case:
when ((payload.result != null) and ((sizeOf payload.result) > 0))

How do I pretty print a JsonSlurper.parse(url) result of type groovy.json.internal.LazyMap

I have a piece of code that calls the Google geocode API and returns a JSON result, like this:
def response = new JsonSlurper().parse(url.toURL())
However, the return type is actually groovy.json.internal.LazyMap.
When I try to pretty print that with the following
def res = JsonOutput.prettyPrint(response.toString())
I get an error like this:
Caught: groovy.json.JsonException: Lexing failed on line: 1, column: 2, while reading 'r', no possible valid JSON value or punctuation could be recognized.
groovy.json.JsonException: Lexing failed on line: 1, column: 2, while reading 'r', no possible valid JSON value or punctuation could be recognized.
at org.softwood.Geolocation.Geocoder.completeLatLong(Geocoder.groovy:29)
at org.softwood.Geolocation.Geocoder$completeLatLong.call(Unknown Source)
at org.softwood.Geolocation.TestGeoScript.run(TestGeoScript.groovy:13
The actual toString() on that lazy map gives the following - it doesn't put quotes around the string values, which is presumably why it won't parse correctly:
{results=[{address_components=[{long_name=South Close, short_name=South Cl, types=[route]}, {long_name=Ipswich, short_name=Ipswich, types=[locality, political]}, {long_name=Ipswich, short_name=Ipswich, types=[postal_town]}, {long_name=Suffolk, short_name=Suffolk, types=[administrative_area_level_2, political]}, {long_name=United Kingdom, short_name=GB, types=[country, political]}, {long_name=IP4 2TH, short_name=IP4 2TH, types=[postal_code]}], formatted_address=South Cl, Ipswich, Suffolk IP4 2TH, UK, geometry={bounds={northeast={lat=52.068566, lng=1.1667458}, southwest={lat=52.0672503, lng=1.1658643}}, location={lat=52.06789149999999, lng=1.1663008}, location_type=GEOMETRIC_CENTER, viewport={northeast={lat=52.0692571302915, lng=1.167654030291502}, southwest={lat=52.0665591697085, lng=1.164956069708498}}}, place_id=ChIJr3u-xXyf2UcRJF_b9Yp2_Ng, types=[route]}, {address_components=[{long_name=IP4 2TH, short_name=IP4 2TH, types=[postal_code]}, {long_name=South Close, short_name=South Cl, types=[route]}, {long_name=Ipswich, short_name=Ipswich, types=[locality, political]}, {long_name=Ipswich, short_name=Ipswich, types=[postal_town]}, {long_name=Suffolk, short_name=Suffolk, types=[administrative_area_level_2, political]}, {long_name=United Kingdom, short_name=GB, types=[country, political]}], formatted_address=South Cl, Ipswich, Suffolk IP4 2TH, UK, geometry={bounds={northeast={lat=52.068475, lng=1.1673588}, southwest={lat=52.0666643, lng=1.1643497}}, location={lat=52.0676263, lng=1.1658643}, location_type=APPROXIMATE, viewport={northeast={lat=52.0689186302915, lng=1.1673588}, southwest={lat=52.0662206697085, lng=1.1643497}}}, place_id=ChIJ7asZ3Xyf2UcRavs18W4IXUM, types=[postal_code]}], status=OK}
Query: given the returned result, how do you convert it into a form that prettyPrint will parse and render?
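A rough Python parallel of the problem (illustrative data, not the geocode response): the default string form of a map is not valid JSON, so a JSON pretty printer can't re-parse it; serialising the live object instead produces parseable text.

```python
import json

# Stand-in for a parsed API response (hypothetical values)
data = {"status": "OK", "results": [{"lat": 52.067}]}

# The map's default string form uses the language's own notation
# (Python: single quotes; Groovy: key=value), which is not valid JSON.
not_json = str(data)
try:
    json.loads(not_json)
    parses = True
except json.JSONDecodeError:
    parses = False  # the stringified map is rejected by the JSON parser

# Serialising the live object, rather than its string form, yields
# text a pretty printer can parse again.
proper = json.dumps(data)
reparsed = json.loads(proper)
```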
Using JsonBuilder seems like the easiest route to take here:
String json = new JsonBuilder(response).toPrettyString()
This should give you the pretty JSON you're after.
Also possible using JsonOutput:
import groovy.json.JsonOutput;
def json = JsonOutput.toJson([foo: 'bar', baz: [1]])
assert json == '{"foo":"bar","baz":[1]}'
def pretty = JsonOutput.prettyPrint(json)