Mule: enrich an incoming POJO - ESB

I want to add some extra information to an incoming POJO, and I have used a Message Enricher in Mule to do that; here is my full flow. I am using a sub-flow to get the payload, select some values from the DB, set those values on the same POJO, and return it. In the enricher target I am setting the payload, but I am getting an error like this: "An Expression Enricher for "payload" is not registered with Mule."
Here is my flow:
<enricher doc:name="Message Enricher">
<core:flow-ref name="flows1Flow1" doc:name="Flow Reference"/>
<enrich source="#[groovy:payload]" target="#[payload]"/>
</enricher>
<logger message="AFTER Enrich: #[payload]" level="INFO" doc:name="Logger"/>
<component class="com.enrich.AfterEnricher" doc:name="Java"/>
<sub-flow name="flows1Flow1" doc:name="flows1Flow1">
<component class="com.enrich.MessageEnrichPattern" doc:name="Java"/>
<jdbc:outbound-endpoint exchange-pattern="request-response" queryKey="selectData" connector-ref="jdbcConnector" doc:name="Database (JDBC)">
<jdbc:query key="selectData" value="SELECT Username, Password, ModuleId from Credentials where ModuleId=#[map-payload:moduleId]"/>
</jdbc:outbound-endpoint>
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
<component class="com.enrich.ReceiveMessageEnrichPattern" doc:name="Java"/>
</sub-flow>
Here ReceiveMessageEnrichPattern is returning:
Credential credential = new Credential();
credential.setUname(hashMap.get("USERNAME").toString());
credential.setPwd(hashMap.get("PPPP").toString());
credential.setMid(hashMap.get("MODULEID").toString());
return credential;
But in the after-enrich component I am getting the exception. Please help me: how can I enrich my incoming POJO with the extra information?

According to the docs, Mule currently only supports two targets for enrichment:
flow variables,
message headers.
To achieve your goal you need to:
store the enricher result (Credential object) in a flow variable,
use a custom transformer to copy the values from the Credential object found in the flow variable to the POJO payload in your main flow.
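A minimal sketch of that approach (Mule 3 syntax; the transformer class name is hypothetical, and you would implement the field copying yourself):

```xml
<!-- Sketch: enrich into a flow variable instead of the payload -->
<enricher target="#[flowVars.credential]" doc:name="Message Enricher">
    <flow-ref name="flows1Flow1" doc:name="Flow Reference"/>
</enricher>
<!-- Hypothetical transformer that copies uname/pwd/mid from
     flowVars.credential onto the incoming POJO payload -->
<custom-transformer class="com.enrich.CredentialToPojoTransformer" doc:name="Merge Credential"/>
```

Inside the transformer, the Credential can be read with message.getInvocationProperty("credential") and its fields set on the payload POJO.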

Related

Cannot read JSON request body parameters in WSO2 ESB as an API

I have created an API in WSO2 ESB (4.8.1) and I wanted to send a PUT request to that API with a request body. I have tried with the sample,
and I wanted to log a property value in the inSequence of the defined API.
This is my request body:
This is how I tried to log the location name:
But I'm getting an error like this:
(ERROR - SynapseJsonPath #stringValueOf. Error evaluating JSON Path . Returning empty result. Error>>> invalid path)
So how can I read these values?
To achieve your requirement, you should send the "Content-Type" HTTP header with the request, like below:
Content-Type: application/json
Then you can log the specific JSON element like below.
<log>
<property name="location" expression="json-eval($.coordinates.location[0].name)"></property>
</log>
Then the logged value appears in the console output.
Thanks.
If you want to get a single variable from the user in a JSON request, you can use this code.
Use this JSON:
{
"namee":"UsmanYaqooooooooooob"
}
API code:
<api xmlns="http://ws.apache.org/ns/synapse" name="Hello" context="/hello">
<resource methods="POST" uri-template="/post">
<inSequence>
<log level="custom">
<property name="===========inSequence" value="****"></property>
<property name="locationsssssssss" expression="json-eval($.namee)"></property>
</log>
<payloadFactory media-type="json">
<format>{"hello":"world"}</format>
<args></args>
</payloadFactory>
<property name="messageType" value="text/xml"></property>
<respond></respond>
</inSequence>
</resource>
</api>

Extract fields from JSON response in Mule

I have a JSON response like {"id":10,"name":"ABCD","deptId":0,"address":null}
I need to parse this JSON and extract the id to pass on to another service.
My Mule XML is as below:
<jersey:resources doc:name="REST">
<component class="com.employee.service.EmployeeService"/>
</jersey:resources>
<object-to-string-transformer doc:name="Object to String"/>
<logger message="Employee Response #[payload]" level="INFO" doc:name="Logger"/>
<set-payload value="#[payload]" doc:name="Set Payload" />
<json:object-to-json-transformer doc:name="Convert String to JSON" />
<logger message="JSON Response #[payload]" level="INFO" doc:name="Logger"/>
<json:json-to-object-transformer returnClass="java.util.Map" />
<expression-transformer expression="#[payload]" />
<collection-splitter />
When I run this I get the error
Object "java.util.LinkedHashMap" not of correct type. It must be of type "{interface java.lang.Iterable,interface java.util.Iterator,interface org.mule.routing.MessageSequence,interface java.util.Collection}" (java.lang.IllegalArgumentException). Message payload is of type: LinkedHashMap
How can I fix this error?
Thanks
I was able to get this done by writing a custom converter.
Remove your last four lines of code; log #[payload.id] or put it in a flow variable and access it there.
I believe the error you are getting is already on this part: <collection-splitter />. Have you debugged this already?
Not sure what the splitter is for, but you can simply do #[payload.id] to get the id once you have a HashMap-type payload.
The JSON module also supports JSONPath in expressions, such as:
#[json:/id]
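Putting those suggestions together, a trimmed version of the flow (a sketch only, Mule 3 syntax) could look like:

```xml
<jersey:resources doc:name="REST">
    <component class="com.employee.service.EmployeeService"/>
</jersey:resources>
<object-to-string-transformer doc:name="Object to String"/>
<json:json-to-object-transformer returnClass="java.util.Map" doc:name="JSON to Map"/>
<!-- Once the payload is a Map, the id is directly addressable -->
<logger message="Employee id: #[payload.id]" level="INFO" doc:name="Logger"/>
<set-variable variableName="employeeId" value="#[payload.id]" doc:name="Keep id for the next service"/>
```

The collection-splitter and the extra to-and-from JSON conversions are dropped; a splitter expects a collection, while this payload is a single Map.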

Insert a JSON object into a MySQL DB with Mule ESB

Good evening!
I'm trying to insert an entire JSON object into a MySQL table. I'm using a JSON-to-object transformer to convert the JSON into a HashMap. The JSON is this:
{
"content": {
"fill": "none",
"stroke": "#fff",
"path": [
["M", 422, 115],
["L", 472, 167.5]
],
"stroke-width": 4,
"stroke-linecap": "round",
"stroke-linejoin": "round",
"transform": [],
"type": "path",
"note": {
"id": 47,
"page":0,
"ref": 3,
"txt": "teste do serviço",
"addedAt": 1418133743604,
"addedBy": "valter.gomes"
}
}
}
I need to insert the "content" object, but when I try to access it with #[payload.content], it throws an exception:
Root Exception stack trace:
java.sql.SQLException: Incorrect string value: '\xAC\xED\x00\x05sr...' for column 'content' at row 1
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:996)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3887)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3823)
+ 3 more (set debug level logging or '-Dmule.verbose.exceptions=true' for everything)
We found what I think is a workaround. Before converting into a HashMap, I get the "content" object into a variable with #[json:content] and record it in the DB as #[flowVars.rawContent]. When I retrieve it from the DB, I convert the ResultSet into a String using the Object to String transformer.
But I'm not comfortable with this solution. Is this the right way to do it, or does another one exist? Maybe the right one.
Thanks a lot for your help.
When you receive the JSON you can transform it to a Map (by default json:json-to-object-transformer returns JsonData; for that reason I have specified the Map class). After that you can read "content" from the payload using #[payload.content].
I attached my flow:
<flow name="demoFlow1" doc:name="demoFlow1">
<http:inbound-endpoint exchange-pattern="request-response"
host="localhost" port="8081" path="demo" doc:name="HTTP" />
<scripting:component doc:name="Groovy">
<scripting:script engine="Groovy"><![CDATA[
Map<String, Object> map1 = new HashMap<String, Object>();
map1.put("fill","none");
map1.put("stroke","#fff");
Map<String, Object> map = new HashMap<String, Object>();
map.put("content", map1);
return map;]]></scripting:script>
</scripting:component>
<json:object-to-json-transformer doc:name="Object to JSON"/>
<logger level="INFO" message=">>1 #[payload]" doc:name="Logger" />
<json:json-to-object-transformer returnClass="java.util.Map" doc:name="JSON to Object"/>
<set-payload value="#[payload.content]" doc:name="Set Payload"/>
<json:object-to-json-transformer doc:name="Object to JSON"/>
<logger level="INFO" message=">>3 #[payload]" doc:name="Logger" />
</flow>
Eddú is right, but the example he gives is really too complex.
As he said, all you need is:
<json:json-to-object-transformer returnClass="java.util.Map" />
After that transformer, you can retrieve any field/sub-field in the Map. I suggest using message.payload instead of payload by the way, the latter has shown some odd behaviours in the past.
So use: #[message.payload.content]
Also, this will give you an object of type java.util.Map. I'm not sure how you're going to insert the object into the DB, but since you are not showing that part in your question, I imagine you'll figure it out...
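One hedged option for the insert itself (the connector name, query key, and table are illustrative) is to serialize the sub-map back to a JSON string, so a plain text column receives readable JSON instead of serialized Java bytes:

```xml
<json:json-to-object-transformer returnClass="java.util.Map" doc:name="JSON to Map"/>
<set-payload value="#[message.payload.content]" doc:name="Pick the content sub-map"/>
<!-- Back to a JSON string: avoids writing Java serialization bytes to the column -->
<json:object-to-json-transformer doc:name="Map to JSON string"/>
<jdbc:outbound-endpoint exchange-pattern="one-way" queryKey="insertContent" connector-ref="jdbcConnector" doc:name="Insert content">
    <jdbc:query key="insertContent" value="INSERT INTO annotations (content) VALUES (#[payload])"/>
</jdbc:outbound-endpoint>
```

Note that the '\xAC\xED\x00\x05' prefix in the original error is the Java serialization stream header, which suggests the Map was being serialized as a raw Java object rather than as text.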

Alfresco Workflow: How to set an association value when updating a "Task" via REST JSON?

My application interacts with the Alfresco workflow engine via REST.
There is a task with an association to an object of type cm:person; its value should be collected from the end user, to be used as the assignee of the next task. How can I set this value via REST?
I tried to send an HTTP PUT request (Content-Type: application/json) to the URL
http://localhost:8080/alfresco/service/api/task-instances/activiti$11102
and body request is:
{
"cio_employee": "workspace://SpacesStore/bcb9817f-5778-484b-be16-a388eb18b5ab"
}
where "workspace://SpacesStore/bcb9817f-5778-484b-be16-a388eb18b5ab" is the node reference of the admin person. But when I end the task (also via REST), Alfresco throws an error:
...
Caused by: org.activiti.engine.ActivitiException: Unknown property used in expression: ${cio_employee.properties.userName}
...
Caused by: org.activiti.engine.impl.javax.el.PropertyNotFoundException: Could not find property properties in class java.lang.String
Below is the task and its model definition:
//User Task:
<userTask id="assignHandler" name="Assign Employee" activiti:assignee="admin"
activiti:formKey="cio:assignEmployeeTask">
<documentation>Please, Assign employee to the next task</documentation>
<extensionElements>
<activiti:taskListener event="complete"
class="org.alfresco.repo.workflow.activiti.tasklistener.ScriptTaskListener">
<activiti:field name="script">
<activiti:string>
execution.setVariable('cio_employee',
task.getVariable('cio_employee'));
</activiti:string>
</activiti:field>
</activiti:taskListener>
</extensionElements>
</userTask>
///////////////////////////////////////////////////////////
//Model
...
<types>
...
<type name="cio:assignEmployeeTask">
<parent>bpm:workflowTask</parent>
<mandatory-aspects>
<aspect>cio:employee</aspect>
</mandatory-aspects>
</type>
...
</types>
...
<aspects>
<aspect name="cio:employee">
<associations>
<association name="cio:employee">
<source>
<mandatory>false</mandatory>
<many>false</many>
</source>
<target>
<class>cm:person</class>
<mandatory>true</mandatory>
<many>false</many>
</target>
</association>
</associations>
</aspect>
</aspects>
////////////////////////////////////////////////////////////////////////
After searching deeply, I found that you need to send a POST request to
http://localhost:8080/alfresco/s/api/task/[taskId]/formprocessor
with body:
{
"assoc_cio_employee_added": "workspace://SpacesStore/bcb9817f-5778-484b-be16-a388eb18b5ab"
}
and for removing an association value, use the key "assoc_cio_employee_removed".
https://wiki.alfresco.com/wiki/Forms_Developer_Guide
Hope it may help someone.
Alfresco Version 4.2.e

How to read huge CSV file in Mule

I'm using Mule Studio 3.4.0 Community Edition.
I have a big problem with parsing a large CSV file arriving on a File endpoint. The scenario is that I have 3 CSV files and I would like to put the files' content into a database.
But when I try to load a huge file (about 144 MB) I get an "OutOfMemory" exception. The solution I thought of was to divide/split the large CSV into smaller CSVs (I don't know if this is the best solution), or to try to find a way to process the CSV without throwing an exception.
<file:connector name="File" autoDelete="true" streaming="true" validateConnections="true" doc:name="File"/>
<flow name="CsvToFile" doc:name="CsvToFile">
<file:inbound-endpoint path="src/main/resources/inbox" moveToDirectory="src/main/resources/processed" responseTimeout="10000" doc:name="CSV" connector-ref="File">
<file:filename-wildcard-filter pattern="*.csv" caseSensitive="true"/>
</file:inbound-endpoint>
<component class="it.aizoon.grpBuyer.AddMessageProperty" doc:name="Add Message Property"/>
<choice doc:name="Choice">
<when expression="INVOCATION:nome_file=azienda" evaluator="header">
<jdbc-ee:csv-to-maps-transformer delimiter="," mappingFile="src/main/resources/companies-csv-format.xml" ignoreFirstRecord="true" doc:name="CSV2Azienda"/>
<jdbc-ee:outbound-endpoint exchange-pattern="one-way" queryKey="InsertAziende" queryTimeout="-1" connector-ref="jdbcConnector" doc:name="Database Azienda">
<jdbc-ee:query key="InsertAziende" value="INSERT INTO aw006_azienda VALUES (#[map-payload:AW006_ID], #[map-payload:AW006_ID_CLIENTE], #[map-payload:AW006_RAGIONE_SOCIALE])"/>
</jdbc-ee:outbound-endpoint>
</when>
<when expression="INVOCATION:nome_file=servizi" evaluator="header">
<jdbc-ee:csv-to-maps-transformer delimiter="," mappingFile="src/main/resources/services-csv-format.xml" ignoreFirstRecord="true" doc:name="CSV2Servizi"/>
<jdbc-ee:outbound-endpoint exchange-pattern="one-way" queryKey="InsertServizi" queryTimeout="-1" connector-ref="jdbcConnector" doc:name="Database Servizi">
<jdbc-ee:query key="InsertServizi" value="INSERT INTO ctrl_aemd_unb_servizi VALUES (#[map-payload:CTRL_ID_TIPO_OPERAZIONE], #[map-payload:CTRL_DESCRIZIONE], #[map-payload:CTRL_COD_SERVIZIO])"/>
</jdbc-ee:outbound-endpoint>
</when>
<when expression="INVOCATION:nome_file=richiesta" evaluator="header">
<jdbc-ee:csv-to-maps-transformer delimiter="," mappingFile="src/main/resources/requests-csv-format.xml" ignoreFirstRecord="true" doc:name="CSV2Richiesta"/>
<jdbc-ee:outbound-endpoint exchange-pattern="one-way" queryKey="InsertRichieste" queryTimeout="-1" connector-ref="jdbcConnector" doc:name="Database Richiesta">
<jdbc-ee:query key="InsertRichieste" value="INSERT INTO ctrl_aemd_unb_richiesta VALUES (#[map-payload:CTRL_ID_CONTROLLER], #[map-payload:CTRL_NUM_RICH_VENDITORE], #[map-payload:CTRL_VENDITORE], #[map-payload:CTRL_CANALE_VENDITORE], #[map-payload:CTRL_CODICE_SERVIZIO], #[map-payload:CTRL_STATO_AVANZ_SERVIZIO], #[map-payload:CTRL_DATA_INSERIMENTO])"/>
</jdbc-ee:outbound-endpoint>
</when>
</choice>
</flow>
Please, I do not know how to fix this problem.
Thanks in advance for any kind of help
As SteveS said, the csv-to-maps-transformer might try to load the entire file into memory before processing it. What you can try is to split the CSV file into smaller parts and send those parts to a VM endpoint to be processed individually.
First, create a component to achieve this first step:
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import org.mule.api.MuleEventContext;
import org.mule.api.client.MuleClient;
import org.mule.api.lifecycle.Callable;

public class CSVReader implements Callable {
    @Override
    public Object onCall(MuleEventContext eventContext) throws Exception {
        InputStream fileStream = (InputStream) eventContext.getMessage().getPayload();
        BufferedReader br = new BufferedReader(new InputStreamReader(fileStream));
        MuleClient muleClient = eventContext.getMuleContext().getClient();
        String line;
        // Dispatch each line to the VM endpoint so it is processed individually
        while ((line = br.readLine()) != null) {
            muleClient.dispatch("vm://in", line, null);
        }
        fileStream.close();
        return null;
    }
}
Then, split your main flow in two:
<file:connector name="File"
workDirectory="yourWorkDirPath" autoDelete="false" streaming="true"/>
<flow name="CsvToFile" doc:name="Split and dispatch">
<file:inbound-endpoint path="inboxPath"
moveToDirectory="processedPath" pollingFrequency="60000"
doc:name="CSV" connector-ref="File">
<file:filename-wildcard-filter pattern="*.csv"
caseSensitive="true" />
</file:inbound-endpoint>
<component class="it.aizoon.grpBuyer.AddMessageProperty" doc:name="Add Message Property" />
<component class="com.dgonza.CSVReader" doc:name="Split the file and dispatch every line to VM" />
</flow>
<flow name="storeInDatabase" doc:name="receive lines and store in database">
<vm:inbound-endpoint exchange-pattern="one-way"
path="in" doc:name="VM" />
<choice>
.
.
Your JDBC stuff
.
.
</choice>
</flow>
Maintain your current file-connector configuration to enable streaming. With this solution the csv data can be processed without the need to load the entire file to memory first.
HTH
I believe that the csv-to-maps-transformer is going to force the whole file into memory. Since you are dealing with one large file, personally, I would tend to just write a Java class to handle it. The File endpoint will pass a filestream to your custom transformer. You can then make a JDBC connection and pick off the information a row at a time without having to load the whole file. I have used OpenCSV to parse the CSV for me. So your java class would contain something like the following:
protected Object doTransform(Object src, String enc) throws TransformerException {
try {
//Make a JDBC connection here
//Now read and parse the CSV
FileReader csvFileData = (FileReader) src;
BufferedReader br = new BufferedReader(csvFileData);
CSVReader reader = new CSVReader(br);
//Read the CSV file and add the row to the appropriate List(s)
String[] nextLine;
while ((nextLine = reader.readNext()) != null) {
//Push your data into the database through your JDBC connection
}
//Close connection.
}catch (Exception e){
//Handle or log the exception here
}
return null;
}
}