I have 3 S3 buckets:
input-files
in-progress
processed-files
The "input-files" bucket contains CSV files (filename format: filename-timestamp). I want to take each input file from the bucket one at a time, move it to the "in-progress" bucket, and, when the workflow completes, move it to the "processed-files" bucket. On error, all file processing needs to stop.
In my flow I can get the content of the CSV file, but there is no reference to the file name, so I'm not sure how to implement the above: I can't specify which file needs to be moved.
How can I implement the processing steps outlined above?
XML flow:
<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation"
xmlns:spring="http://www.springframework.org/schema/beans" version="EE-3.8.1"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-current.xsd
http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd">
<flow name="CsvToMongo" >
<poll doc:name="Poll">
<s3:get-object-content config-ref="Amazon_S3__Configuration" bucketName="test-file-bucket" key="input-files/TestData.csv" mimeType="application/csv" doc:name="Amazon S3"/>
</poll>
<object-to-string-transformer encoding="UTF-8" mimeType="application/csv" doc:name="Object to String"/>
<logger message="#['**** Start writing CSV to database...']" level="INFO" doc:name="Logger: Start Process"/>
</flow>
</mule>
Software being used:
Anypoint Studio 6.2
Mule 3.8.1
Thanks
An approach that I used recently was to configure an Amazon Simple Queue Service (SQS) queue to receive S3 events from a bucket. (Configure the bucket to send events to the SQS queue).
Then in my Mule flow, my input source is an SQS poller.
The structure of the S3 event is well documented by AWS: it is a JSON string (convert it to a JSON object to use it) that contains all the information I needed to identify the actual file name.
It's working quite nicely.
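For illustration, a Mule 3 flow along these lines might look like the sketch below. The operation and attribute names (`sqs:receive-messages`, `s3:copy-object`, `s3:delete-object`, and the MEL path into the event JSON) are from memory, so check them against your connector versions:

```xml
<flow name="ProcessS3Events">
    <!-- Poll the SQS queue that the bucket publishes its events to -->
    <sqs:receive-messages config-ref="Amazon_SQS_Configuration" doc:name="SQS poller"/>

    <!-- The S3 event notification arrives as a JSON string; parse it -->
    <json:json-to-object-transformer returnClass="java.util.Map" doc:name="JSON to Map"/>

    <!-- Pull out the object key of the newly created file -->
    <set-variable variableName="s3Key" value="#[payload.Records[0].s3.object.key]"/>

    <!-- Fetch the file content using the key taken from the event -->
    <s3:get-object-content config-ref="Amazon_S3_Configuration"
        bucketName="input-files" key="#[flowVars.s3Key]"/>

    <!-- ...process the CSV... -->

    <!-- "Move" = copy to the next bucket, then delete the original -->
    <s3:copy-object config-ref="Amazon_S3_Configuration"
        sourceBucketName="input-files" sourceKey="#[flowVars.s3Key]"
        destinationBucketName="processed-files" destinationKey="#[flowVars.s3Key]"/>
    <s3:delete-object config-ref="Amazon_S3_Configuration"
        bucketName="input-files" key="#[flowVars.s3Key]"/>
</flow>
```

Since S3 has no native move operation, the copy-then-delete pair at the end is the usual way to move an object between buckets.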
I used Microsoft EWS api SyncFolderItems to get mail changes, but got ErrorInvalidSyncStateData after several successful api calls.
The SyncState request parameter I send is correct, because it is the response from the last successful call.
The error response looks the same as the one the documentation shows:
<?xml version="1.0" encoding="utf-8" ?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<soap:Header>
<t:ServerVersionInfo MajorVersion="8" MinorVersion="0"
MajorBuildNumber="628" MinorBuildNumber="0"
xmlns:t="http://schemas.microsoft.com/exchange/services/2006/types" />
</soap:Header>
<soap:Body>
<SyncFolderItemsResponse xmlns:m="http://schemas.microsoft.com/exchange/services/2006/messages"
xmlns:t="http://schemas.microsoft.com/exchange/services/2006/types"
xmlns="http://schemas.microsoft.com/exchange/services/2006/messages">
<m:ResponseMessages>
<m:SyncFolderItemsResponseMessage ResponseClass="Error">
<m:MessageText>Synchronization state data is corrupt or otherwise invalid.</m:MessageText>
<m:ResponseCode>ErrorInvalidSyncStateData</m:ResponseCode>
<m:DescriptiveLinkKey>0</m:DescriptiveLinkKey>
<m:SyncState />
<m:IncludesLastItemInRange>true</m:IncludesLastItemInRange>
</m:SyncFolderItemsResponseMessage>
</m:ResponseMessages>
</SyncFolderItemsResponse>
</soap:Body>
</soap:Envelope>
I use the above api to synchronize mails to my local storage. If I got ErrorInvalidSyncStateData, all I can do is delete all mails in my storage, and then re-synchronize mails (starting from empty SyncState).
I'm wondering if there is a better way to handle the error if someone has the experience using SyncFolderItems api.
Thank you.
There are a few checks that help you avoid this error:
Ensure that the sync state value you are sending matches the sync state value returned during a previous synchronization.
Ensure that you are not sending the sync state for the folder hierarchy when you attempt to sync items, and vice versa.
Ensure that you are sending the sync state for the correct root folder.
Ensure that the same root folder is specified in each request.
Ensure that the previous request did not specify a root folder of null while the current request includes a root folder of root. Null and root are not treated the same.
Microsoft documentation: https://learn.microsoft.com/en-us/exchange/client-developer/exchange-web-services/handling-synchronization-related-errors-in-ews-in-exchange
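To make the first two points concrete, a SyncFolderItems request carries the root folder and the SyncState from the previous item-sync response roughly as below (element names follow the EWS schema; the base64 SyncState value is a placeholder):

```xml
<m:SyncFolderItems xmlns:m="http://schemas.microsoft.com/exchange/services/2006/messages"
                   xmlns:t="http://schemas.microsoft.com/exchange/services/2006/types">
  <m:ItemShape>
    <t:BaseShape>IdOnly</t:BaseShape>
  </m:ItemShape>
  <!-- Must be the same root folder on every request -->
  <m:SyncFolderId>
    <t:DistinguishedFolderId Id="inbox"/>
  </m:SyncFolderId>
  <!-- Exactly the SyncState string returned by the previous *item* sync,
       never a SyncState from a folder-hierarchy sync -->
  <m:SyncState>H4sIA...placeholder...</m:SyncState>
  <m:MaxChangesReturned>100</m:MaxChangesReturned>
</m:SyncFolderItems>
```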
I am new to Twilio.
I am trying to make a call using CallResource.Create(to, from, url: _url);
Regarding the URL: I have an Azure account, and I have uploaded a custom XML file.
Below you can find my xml
<?xml version="1.0" encoding="UTF-8"?>
<Response>
<Say voice="alice">Thanks for trying our documentation. Enjoy!</Say>
<Play>http://demo.twilio.com/docs/classic.mp3</Play>
</Response>
For testing I am using the demo MP3 http://demo.twilio.com/docs/classic.mp3; in the future I am planning to upload a custom MP3 file to Azure file storage as well.
The end URL is
https://xxxx.file.core.windows.net/xxxxx/20180719112627.xml?sv=2017-11-09&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-07-31T18:27:28Z&st=2018-07-01T10:27:28Z&spr=https,http&sig=gNqLuAofhePeOzuyVFWHSb0TCydgIW3ShOrRRfFEZ7o%3D
Unfortunately, I got this exception:
"An attempt to retrieve content from https://xxxxxxx.file.core.windows.net/xxxxxxx/20180719112627.xml?sv=2016-05-31&sig=Pa4ery3QIruwYbNSJ1Nu7Y3EpKLjbd5mJXi46vnpoyU%3D&spr=https%2Chttp&se=2019-07-19T09%3A26%3A53Z&srt=sco&ss=bfqt&sp=raupwl returned the HTTP status code 405"
<?xml version="1.0" encoding="utf-8"?>
<Error>
<Code>UnsupportedHttpVerb</Code>
<Message>The resource doesn't support specified Http Verb.
RequestId:907409a8-d01a-0051-1c43-1f4bf1000000
Time:2018-07-19T09:30:41.8204847Z</Message>
</Error>
Regarding CORS for the file service: it is configured.
Is there a document/guide describing how to configure the Azure environment for integration with Twilio?
Many thanks for help.
Have you tried generating a SAS URL with Storage Explorer or the Azure Portal and using that instead?
I am writing a simple data service that exposes an operation as a resource in WSO2 EI 6.0.0. The dataservice is:
<data name="DataService" serviceNamespace="http://www.ds.com" transports="http https local">
<config id="MySQL">
<property name="carbon_datasource_name">MySQL</property>
</config>
<query id="cities" useConfig="MySQL">
<sql>SELECT ID, Name, CountryCode, District, Population from city</sql>
<result useColumnNumbers="true" outputType="json">{"Cities":{"City":[{"CityID": "$1","CityNAME":"$2","CountryCode": "$3","District": "$4","Population": "$5"}]}}</result>
</query>
<resource method="GET" path="cities">
<call-query href="cities"/>
</resource>
</data>
If I save this to a *.dbs file, or generate this data service using the WSO2 EI management console, it works properly, returning the data in JSON format when I set the "Accept: application/json" header as the documentation says.
If I generate a carbon (.car) file for this data service and deploy it in WSO2 EI, it fails to return the response in JSON format (it is always XML), and I also have problems with operations that take input parameters (no input params are recognized).
What I have found out is that WSO2 EI deploys the dbs in the path
$WSO2EIHOME\wso2\tmp\carbonapps\-1234\1491338652277CompositeApplication_1.0.0.car\DataService_1.0.0
However, if I upload the dbs or generate it through the management console wizard, it is placed in
$WSO2EIHOME\repository\deployment\server\dataservices
and it works. In fact, if I copy the "wrong" dbs from the location where it is placed when deploying the carbon file to $WSO2EIHOME\repository\deployment\server\dataservices, it works!
Any help on how to successfully deploy .car files containing .dbs files with JSON resources in WSO2 EI?
I have a simple Mule flow that streams a CSV file to a custom Java component. I need to be able to handle large files, so I don't want to use a transformer that reads the whole file into memory.
Currently, I get the following error: "Failed to delete file "C:\temp\input\inputCSV.csv" as part of the file move operation. The file was being deleted because AutoDelete was set on the file connector."
Changing the Mule XML config to autoDelete="false" and specifying a destination directory for the 'processed' file results in a similar error. Could someone tell me how to stream the file and postpone auto-deletion until the file has been read fully?
I'm calling .close() on my Mule payload stream when I'm done, but Mule seems to be completing the file deletion too early!
Here's the flow XML config...
<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns:file="http://www.mulesoft.org/schema/mule/file" xmlns="http://www.mulesoft.org/schema/mule/core"
xmlns:doc="http://www.mulesoft.org/schema/mule/documentation"
xmlns:spring="http://www.springframework.org/schema/beans" version="CE-3.5.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.mulesoft.org/schema/mule/file http://www.mulesoft.org/schema/mule/file/current/mule-file.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-current.xsd
http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd">
<spring:beans>
<spring:import resource="classpath*:/spring/config.xml" />
<spring:import resource="classpath*:/spring/extras/Rule-preprocessor-config.xml" />
</spring:beans>
<file:connector name="fileInput" streaming="true"
autoDelete="true"
moveToPattern="#[message.inboundProperties['originalFilename']]"
doc:name="File">
<!-- <service-overrides messageFactory="org.mule.transport.file.FileMuleMessageFactory" /> -->
</file:connector>
<flow name="stringflowFlow2x" doc:name="stringflowFlow2x">
<file:inbound-endpoint connector-ref="fileInput"
path="/temp/input" doc:name="inputCsv" responseTimeout="10000" fileAge="10000" />
<component class="com.benji.FileImportPreProcessor" doc:name="JavaPreProcessorLogic"/>
<logger message="Finished!" level="INFO" doc:name="Logger"/>
</flow>
</mule>
When in streaming mode, Mule wraps the stream in a ReceiverFileInputStream, which takes care of removing or moving the file when the stream is closed. And this is the point: you should not call close() on the input stream yourself; the stream closes itself whenever EOF is hit.
I understand this a little differently. See the considerations at https://docs.mulesoft.com/mule-user-guide/v/3.6/file-transport-reference :
If streaming is enabled, a ReceiverFileInputStream is used as the payload for each file that is processed. This input stream's close() method takes care of moving the file or deleting it. Streams are closed by transformers reading the input stream. If you process the stream in your own component implementation, make sure to properly close the stream after reading.
Therefore I don't think Mule handles this for you unless you use a transformer, which is highly likely in most flows. In my case, however, some initial validation failed before I had even touched the payload, so the process ended before the payload was transformed (and therefore the file stream was never read and closed).
You should use an object-to-byte-array-transformer right after the File inbound endpoint. It will take care of closing the stream after reading the input stream.
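Applied to the flow from the question, that would look like the sketch below. Note that this reads the whole file into memory, which works against the original goal of streaming large files, so it only fits files that comfortably fit in the heap:

```xml
<flow name="stringflowFlow2x" doc:name="stringflowFlow2x">
    <file:inbound-endpoint connector-ref="fileInput"
        path="/temp/input" doc:name="inputCsv" responseTimeout="10000" fileAge="10000"/>
    <!-- Reads the stream fully and closes it, which triggers the move/delete -->
    <object-to-byte-array-transformer doc:name="Object to Byte Array"/>
    <component class="com.benji.FileImportPreProcessor" doc:name="JavaPreProcessorLogic"/>
    <logger message="Finished!" level="INFO" doc:name="Logger"/>
</flow>
```

Alternatively, keep true streaming and make sure the custom component reads the stream to EOF and closes it on every path, including validation failures.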
I am creating some services using JAX-RS that need to take in arbitrarily complex objects as arguments, not just primitives like integers and strings. A discussion on the CXF mailing list says to just use a wrapper object as a single parameter in this case.
My concern is how to document the input format to the service? If creating a service that looks something like the following:
@POST
@Produces("application/json")
@Consumes("application/json")
@Path("oneParam")
public ComplexObject2 myServiceMethod(ComplexObject1 obj) {
    Foo f = obj.foo;
    Bar b = obj.bar;
    ...
}
the auto-generated WADL that CXF produces will only produce the following:
<resource path="/oneParam">
<method name="POST">
<request>
<representation mediaType="application/json"/>
</request>
<response>
<representation mediaType="application/json"/>
</response>
</method>
</resource>
This contains no information on what the request or response actually contains. Sergey on the CXF mailing list said it was possible to link a schema to the representation, but how am I supposed to do that? And how do I create the XSD?
(P.S. Using POST for idempotent resources might not be RESTful, but it's not important here as we are in essence doing RPC using Json. This is more or less a 1:1 clone of an existing SOAP based api.)
It is possible to link an XSD file into a WADL file and then to reference an XML element in the representation for requests and responses. However, as it is XML schema it doesn't apply to JSON representations.
To link an XSD into a WADL file, create a grammars element at the top of the file before the main resources element.
<grammars>
<include href="myapp.xsd"/>
</grammars>
Then add a reference to an XML element as follows (using a modified version of your example):
<resource path="/oneParam">
<method name="POST">
<request>
<representation mediaType="application/xml" element="myapp:oneParamRequest" />
</request>
<response>
<representation mediaType="application/xml" element="myapp:oneParamResponse" />
</response>
</method>
</resource>
The prefix myapp is defined in the XSD and can be used in the WADL file as well.
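How you create the XSD depends on your tooling (for example, generating it from JAXB-annotated classes with schemagen), but a hand-written minimal schema for the two elements referenced above could look like this; the `myapp` namespace URI and the element contents are placeholders:

```xml
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           xmlns:myapp="http://example.com/myapp"
           targetNamespace="http://example.com/myapp"
           elementFormDefault="qualified">
  <!-- Request wrapper carrying the two fields of ComplexObject1 -->
  <xs:element name="oneParamRequest">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="foo" type="xs:string"/>
        <xs:element name="bar" type="xs:string"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
  <!-- Response element; replace with the real shape of ComplexObject2 -->
  <xs:element name="oneParamResponse" type="xs:string"/>
</xs:schema>
```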
I don't know how to configure CXF to do this automatically. My experience with Jersey is similar: we use the generated WADL as a starting point for hand-editing later.