I want to expose data in a MySQL data source as a SOAP web service using Talend. How can I achieve that? Please advise.
The most powerful way to do so is probably to build a WSDL using Talend Open ESB Studio (Tutorial).
Alternatively, if you don't want to use another piece of software or you prefer a quick-and-dirty solution, you can achieve this with standard Talend Open Studio for Data Integration. You just need to draw a standard Talend job that redirects your MySQL data output to a tBufferOutput instance. Then you export your job as an Axis web service and finally deploy it in your application server (Tutorial). This way, your buffer data will be automatically exposed via web service. But it has a drawback: you cannot tune the response (i.e., add labels to fields or refactor the SOAP response structure at all).
Related
I am doing some R&D to check whether there are any open-source ESB frameworks that provide the following features:
1. An API to host a SOAP web service or consume a service by providing the required data. We are working on a tool to design web services for our product, and the metadata created by the tool would be input to the API. What we are looking for is that the generated metadata can be hosted as a web service instantly, without restarting the ESB container.
2. An API provided by the framework to report the number of executions performed for a service.
3. An ESB framework that supports versioning, both when hosting a service and when consuming one.
It would be very helpful if someone could point me toward such ESB frameworks.
Talend ESB provides the functionality you described.
It can consume a WSDL (the metadata created by your tool) to quickly create a web service. It is hot deployed into a Karaf container (i.e. you do not need to restart the ESB container).
You can use built-in or custom logging to count the executions performed.
It can be hooked into Git or SVN for versioning.
There is a free version you can download here.
Here is a basic tutorial on the product which demonstrates (a) ingesting a WSDL, (b) hot deployment, and (c) execution counts (though it does not go into how to log that).
I have a series of documents that I need to migrate into MarkLogic. The documents are available to me via RESTful services in JSON. What I want to know is whether there is any way, such as through MLCP or Query Console, to call those RESTful services and pull in the data; otherwise I have to create a small Java app, dump the files to a share, and then pick them up from MarkLogic.
mlcp is designed to source data from the file system or a MarkLogic database. Take a look at the Java Client API to perform ingestion from other sources. For example, you can fire up your favorite HTTP client in Java and add the results to a DocumentWriteSet. The write set acts like a buffer, allowing you to batch requests to MarkLogic for efficiency. You can then send that DocumentWriteSet to MarkLogic with one of the DocumentManager.write() methods. Take a look at the documentation for many more details or the "Bulk Writes" section of the getting started cookbook.
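For example, a minimal Java sketch along those lines might look like the following. The source URL, document URIs, and connection details are placeholders, and I'm assuming the 4.x-style DatabaseClientFactory.newClient(host, port, SecurityContext) signature; adjust for your client version.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import com.marklogic.client.DatabaseClient;
import com.marklogic.client.DatabaseClientFactory;
import com.marklogic.client.DatabaseClientFactory.DigestAuthContext;
import com.marklogic.client.document.DocumentWriteSet;
import com.marklogic.client.document.JSONDocumentManager;
import com.marklogic.client.io.Format;
import com.marklogic.client.io.StringHandle;

public class JsonIngest {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- adjust host, port, and credentials.
        DatabaseClient client = DatabaseClientFactory.newClient(
                "localhost", 8000, new DigestAuthContext("admin", "admin"));
        JSONDocumentManager docMgr = client.newJSONDocumentManager();
        DocumentWriteSet writeSet = docMgr.newWriteSet();

        HttpClient http = HttpClient.newHttpClient();
        // Hypothetical source service returning one JSON document per id.
        for (int id = 1; id <= 100; id++) {
            HttpRequest request = HttpRequest.newBuilder(
                    URI.create("https://example.com/api/documents/" + id)).build();
            String json = http.send(request, HttpResponse.BodyHandlers.ofString()).body();

            // Buffer each document in the write set under a URI of your choosing.
            writeSet.add("/migrated/doc-" + id + ".json",
                    new StringHandle(json).withFormat(Format.JSON));
        }

        // One round trip writes the whole batch to MarkLogic.
        docMgr.write(writeSet);
        client.release();
    }
}
```

In practice you would likely flush the write set every few hundred documents rather than buffering everything in one batch.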
I'm creating a multi-platform app, mostly a web interface, a mobile app, and a Windows application. The app will manage user task lists and sync them to the server, but also store them locally to process data faster.
My idea of architecture until now is:
Keep most of the processing on the client side, syncing with the server when needed.
Develop an API to provide and receive the data that will be saved on the server (basically just a JSON wrapper web service).
The data flow:
user authenticates -> requests updated JSON objects from the server -> populates client-side objects -> works with client-side objects -> sends a JSON object back to the server -> server updates the data.
Is this a good approach? I've never done this, can you guys give me some tips?
I think you are on the right track. The idea is to decouple the front end from the back end. The back end should expose a set of CRUD (Create, Read, Update, Delete) functions as RESTful JSON web services. All your different flavours of UI (mobile, web, Windows) can consume the same API.
For the web front end, I would recommend taking a look at AngularJS together with Bootstrap.
Regarding the back end, you could implement it as a simple Java web application with Jersey/JAX-RS, or alternatively you could check out Node.js + Express.
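As a rough illustration of such a CRUD back end, a JAX-RS resource for the task API could look like the sketch below. The class, the Task fields, and the in-memory map are hypothetical stand-ins for a real persistence layer, and Jersey would need a JSON provider (e.g. Jackson) on the classpath.

```java
import java.util.Collection;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

import javax.ws.rs.*;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

// Hypothetical CRUD resource exposing task lists as JSON.
@Path("/tasks")
@Produces(MediaType.APPLICATION_JSON)
@Consumes(MediaType.APPLICATION_JSON)
public class TaskResource {

    public static class Task {
        public Long id;
        public String title;
        public boolean done;
    }

    // In-memory store for the sketch; a real app would use a database.
    private static final Map<Long, Task> TASKS = new ConcurrentHashMap<>();
    private static final AtomicLong SEQ = new AtomicLong();

    @GET
    public Collection<Task> list() {                             // Read
        return TASKS.values();
    }

    @POST
    public Response create(Task task) {                          // Create
        task.id = SEQ.incrementAndGet();
        TASKS.put(task.id, task);
        return Response.status(Response.Status.CREATED).entity(task).build();
    }

    @PUT
    @Path("/{id}")
    public Task update(@PathParam("id") long id, Task task) {    // Update
        task.id = id;
        TASKS.put(id, task);
        return task;
    }

    @DELETE
    @Path("/{id}")
    public void delete(@PathParam("id") long id) {               // Delete
        TASKS.remove(id);
    }
}
```

All three clients (web, mobile, Windows) would then talk to the same /tasks endpoints and exchange the same JSON objects.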
Has anyone had much experience with data migration into and out of NetSuite? I have to export DB2 tables into MySQL, manipulate the data, and then export it in a CSV file. Then I need to take a CSV file of accounts and manipulate the data again so that accounts from our old system match up with the new one. Has anyone tried to do this in MySQL?
A couple of options:
Invest in a data transformation tool that connects to NetSuite and DB2 or MySQL. Look at Dell Boomi, IBM Cast Iron, etc. These tools allow you to connect to both systems, define the data to be extracted, perform data transformation functions and mappings and do all the inserts/updates or whatever you need to do.
For MySQL to NetSuite, PHP scripts can be written to access MySQL and NetSuite. On the NetSuite side, you can either use SOAP web services, or you can write custom REST APIs within NetSuite. SOAP is probably a bit slower than REST, but with REST you have to write the API yourself (server-side JavaScript; it's not hard, but there's a learning curve).
Hope this helps.
I'm an IBM i programmer; try CPYTOIMPF to create a pretty generic CSV file. It'll go to a stream file; if you have NetServer running you can map a network drive to the IFS directory, or you can use FTP to get the CSV file from the IFS to another machine on your network.
Try Adeptia's NetSuite integration tool to perform ETL. You can also try Pentaho ETL for this (as far as I know, Celigo's NetSuite connector is built upon Pentaho). Jitterbit also has an extension for NetSuite.
We primarily have two options to pump data into NetSuite:
i) SuiteTalk: NetSuite's SOAP-based web services. There are two flavours of SuiteTalk operations, synchronous and asynchronous.
Typical tools like Boomi/Mule/Jitterbit use synchronous SuiteTalk to pump data into NetSuite. They also have decent editors to help you do the mapping.
ii) RESTlets: NetSuite's REST-based endpoints can also be used, but you may have to write external brokers to communicate with them (a rough sketch follows below).
Depending on your needs, you can use either. In most cases you will be using SuiteTalk to bring data into NetSuite.
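As a sketch of what such an external broker might look like in Java: the RESTlet script/deploy IDs, account details, and payload below are hypothetical, and the legacy NLAuth header is shown only for brevity; token-based authentication is generally the better choice.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical external broker that pushes one record to a custom NetSuite RESTlet.
public class RestletBroker {
    public static void main(String[] args) throws Exception {
        // Hypothetical script/deploy IDs of a RESTlet you have written and deployed in NetSuite.
        String url = "https://rest.netsuite.com/app/site/hosting/restlet.nl?script=123&deploy=1";

        // Legacy NLAuth header shown for illustration; prefer token-based auth in real integrations.
        String auth = "NLAuth nlauth_account=1234567, nlauth_email=user@example.com,"
                + " nlauth_signature=secret, nlauth_role=3";

        // Hypothetical JSON payload your RESTlet knows how to turn into a record.
        String payload = "{\"recordType\":\"customer\",\"companyName\":\"Acme Corp\"}";

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Authorization", auth)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```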
Hope this helps ...
We just got done doing this. We used an iPaaS platform called Jitterbit (similar to Dell Boomi). It can connect to MySQL and to NetSuite, and you can do transformations in the tool. I have been really impressed with the platform overall so far.
There are different approaches; I like the following for processing a batch job:
To import data into NetSuite:
Export a CSV from the old system and place it in a NetSuite File Cabinet folder (use a RESTlet or web services for this).
Run a scheduled script to load the files in the folder and update the records.
Don't forget to handle errors. Ways to handle errors: send an email, create a custom record, log to a file, or write to the record.
Once the file has been processed, move it to another folder or delete it.
To export data out of NetSuite:
Gather the data and export it to a CSV (you can use a saved search or similar).
Place the CSV in a File Cabinet folder.
From an external server, call web services or a RESTlet to grab the new CSV files in the folder (a sketch of this external step follows the list).
Process file.
Handle errors.
Call web services or a RESTlet to move or delete the CSV file.
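A minimal sketch of that external-server side in Java, assuming a hypothetical RESTlet, query parameters, and folder ID (authentication headers are omitted here for brevity):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Path;

// Hypothetical external job: download a new CSV from the File Cabinet via a custom
// RESTlet, process it locally, then ask the RESTlet to move the file out of the way.
public class ExportFetcher {
    private static final String RESTLET =
            "https://rest.netsuite.com/app/site/hosting/restlet.nl?script=456&deploy=1";

    public static void main(String[] args) throws Exception {
        HttpClient http = HttpClient.newHttpClient();

        // 1. Download the next unprocessed CSV (auth headers omitted in this sketch).
        HttpRequest download = HttpRequest.newBuilder(
                URI.create(RESTLET + "&action=next")).build();
        http.send(download, HttpResponse.BodyHandlers.ofFile(Path.of("export.csv")));

        // 2. Process the local file here (parse rows, load into the target system, etc.).

        // 3. Tell the RESTlet to move the file so it is not picked up again.
        HttpRequest move = HttpRequest.newBuilder(URI.create(RESTLET))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "{\"action\":\"move\",\"file\":\"export.csv\",\"folderId\":987}"))
                .build();
        http.send(move, HttpResponse.BodyHandlers.ofString());
    }
}
```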
You can also use Pentaho Data Integration; it's free and the learning curve is not that steep. I took this course and was able to play around with the tool within a couple of hours.
I am developing an education app for kids.
The application is going to contain pictures, stories, and videos as well.
Including all of the above content in the app will surely bloat it, so I would like to store all the data on a server that my app will access.
I haven't used any remote databases (like MySQL or Oracle) with any other iOS app. In fact, I am a newbie at developing this kind of app. Can anyone point me to a sample?
Connecting directly to a remote MySQL database is really not recommended.
The security here is critical.
You should create a web service, and my advice is to make sure that access to that web service is restricted.
The web service can use your own "protocol" or any other well-known protocol like SOAP.
By your own, I mean JSON, CSV... or whatever.
Edit 1
The technology for your web service depends on many things.
If the system is small and the code needs to be updated very often, I would suggest doing it with PHP and some small(!) MVC framework like CI.
But if it's a large system that needs an ACL (access control list), I would probably choose Java with Spring...
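For the Java/Spring route, a minimal sketch of such a JSON endpoint might look like the following (the ContentApi class, Story record, and sample data are hypothetical, this assumes a recent Java/Spring Boot version, and real access restriction would come from something like Spring Security):

```java
import java.util.List;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.*;

// Hypothetical JSON endpoint the iOS app would call instead of talking to MySQL directly.
@SpringBootApplication
@RestController
@RequestMapping("/api/stories")
public class ContentApi {

    // Hypothetical shape of one piece of content returned to the app.
    public record Story(long id, String title, String mediaUrl) {}

    @GetMapping
    public List<Story> list() {
        // In a real app this would read from the database on the server side;
        // the device only ever sees JSON, never the database itself.
        return List.of(new Story(1, "The Lost Kite", "https://example.com/media/kite.mp4"));
    }

    public static void main(String[] args) {
        SpringApplication.run(ContentApi.class, args);
    }
}
```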
My suggestion: do not connect to or use the database directly from the user application. It can cause serious security problems, and your app would need to embed native SQL drivers to connect to the database.
So, create a web service that receives requests from the application and responds in XML, JSON, or some other format that is easy to parse. This will be much easier than embedding native database APIs in your apps.