How to integrate a file into an SSIS job?

I created an automated process for multiple data integrations using Foreach Loop containers. Now I want to integrate a new file without the process redoing all of the earlier steps.

Related

Azure DevOps release for an application that remains the same except for appsettings.json

I am creating an Azure DevOps build pipeline and release. This release has a staging environment that utilizes a deployment group with 3 servers; in production it can have 50+ servers. The application will be the same across all the servers except for the appsettings file, which will contain the DB connections and location/server-specific variables. I have looked into ways to manipulate this file on release per server, but all I have come across are ways to do variable substitutions in the release per environment, where you only need to switch values in a dev-to-staging-to-prod release. Is there a good way to manipulate this file per server in a deployment group rather than 50+ stages/tags, or a better way to set up my pipeline and release?
Is there a good way to manipulate this file per server in a deployment group rather than 50+ stages/tags
I'm afraid that, as far as I know, this isn't supported yet. However, if you host your app on an Azure website, Azure has a new feature that can achieve this goal.
If you host the app on your own servers, the better deployment approach in this scenario is "build once, deploy many": build the project once in the build pipeline, and configure the corresponding appsettings.json file for each specific stage.
To improve the maintainability of the release and simplify the configuration structure, you can make use of task groups and variable groups (keep using variable substitution in the release).
Encapsulate a sequence of reusable tasks into a task group; this template will then be used in every deployment group job. Note that you can make the reusable parameters part of the template. Just abstract the app settings and store them as variables in a corresponding variable group.
At that point, whenever you add a new server, you only need to save the corresponding app settings into a new variable group. In the release pipeline, you only need to add the task group and link the previously created variable group to the specified stage. Execute the release pipeline and everything will run as expected.
For later maintenance, you just need to modify the basic configuration of the deploy task once, and it is applied to all stages. When you need to change a server's app settings, you can do so by opening the corresponding variable group.
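For illustration, here is a minimal sketch of how the substitution side could look, assuming the deploy task's JSON variable substitution option is pointed at appsettings.json (the key names and values below are hypothetical):

    {
      "ConnectionStrings": {
        "DefaultConnection": "placeholder - overridden at release time"
      },
      "ServerSettings": {
        "Location": "placeholder"
      }
    }

Defining variables named ConnectionStrings.DefaultConnection and ServerSettings.Location in each variable group would then override those values at deploy time for whichever stage the group is linked to.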

Schedule function to pull CSV file to database

Sorry, I have not done any coding in Web API, so I don't have any sample to share here.
I just want to know how we can schedule a task on the server for a Web API to read a CSV file every 15 or 30 minutes.
From the client we are successfully able to push the CSV to my FTP server, where my application is located. But how do I retrieve the data from the CSV at a fixed interval?
WebAPI is the wrong tool for the job.
You can certainly push (or upload specifically) CSV data to a WebAPI method, and that method can do whatever you need to do with that data. But WebAPI doesn't run on a timer. It's not going to periodically look for data to process, you have to send it that data.
A web application is a request/response system. In order to do something, you have to send it a request.
For an application that periodically runs on a timer, consider a Windows Service or even simply a Console Application which is scheduled to run with some job scheduler (Windows Task Scheduler, cron, etc.). That would run periodically without the need for input or a user interface at all.
Now, if the code you need to use is in the WebAPI application already, that's ok. You can refactor that code into a separate Class Library in the same solution as the WebAPI application. Then add the new application (Windows Service or Console Application) to that solution and reference that same Class Library. Both applications can share the same supporting code. Indeed, the application layer should be as thin as possible over a set of common business logic that can be shared by multiple applications.
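For illustration, here is a minimal sketch of such a console script in Python, assuming the CSV files land in a local folder on the server (the folder paths and the row-processing logic are hypothetical placeholders):

    # poll_csv.py - process any new CSV files dropped in an incoming folder.
    import csv
    import shutil
    from pathlib import Path

    INCOMING = Path("C:/ftp/incoming")    # where the client pushes CSV files
    PROCESSED = Path("C:/ftp/processed")  # files are moved here after loading

    def load_row(row: dict) -> None:
        # Placeholder for the real database insert (e.g. via pyodbc).
        print(row)

    def main() -> None:
        PROCESSED.mkdir(parents=True, exist_ok=True)
        for path in sorted(INCOMING.glob("*.csv")):
            with path.open(newline="") as f:
                for row in csv.DictReader(f):
                    load_row(row)
            # Move the file so the next scheduled run doesn't reprocess it.
            shutil.move(str(path), str(PROCESSED / path.name))

    if __name__ == "__main__":
        main()

The script itself has no timer; Windows Task Scheduler (or a cron entry such as */15 * * * * python poll_csv.py) invokes it at the fixed interval.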

NetSuite Migrations

Has anyone had much experience with data migration into and out of NetSuite? I have to export DB2 tables into MySQL, manipulate the data, and then export it in a CSV file. Then I take a CSV file of accounts and manipulate the data again so accounts match up from our old system to the new one. Has anyone tried to do this in MySQL?
A couple of options:
Invest in a data transformation tool that connects to NetSuite and DB2 or MySQL. Look at Dell Boomi, IBM Cast Iron, etc. These tools allow you to connect to both systems, define the data to be extracted, perform data transformation functions and mappings and do all the inserts/updates or whatever you need to do.
For MySQL to NetSuite, PHP scripts can be written to access MySQL and NetSuite. On the NetSuite side, you can either use SOAP web services, or you can write custom REST APIs within NetSuite. SOAP is probably a bit slower than REST, but with REST you have to write the API yourself (server-side JavaScript; it's not hard, but there's a learning curve).
Hope this helps.
I'm an IBM i programmer; try CPYTOIMPF to create a pretty generic CSV file. It'll go to a stream file; if you have NetServer running you can map a network drive to the IFS directory, or you can use FTP to get the CSV file from the IFS to another machine in your network.
Try Adeptia's NetSuite integration tool to perform ETL. You can also try Pentaho ETL for this (as far as I know, Celigo's NetSuite connector is built upon Pentaho). Jitterbit also has an extension for NetSuite.
We primarily have two options to pump data into NetSuite:
i) SuiteTalk: SOAP-based web services. There are two versions of SuiteTalk, synchronous and asynchronous.
Typical tools like Boomi/Mule/Jitterbit use synchronous SuiteTalk to pump data into NetSuite. They also have decent editors to help you do the mapping.
ii) RESTlets: typical REST-based architectures from NetSuite can also be used, but you may have to write external brokers to communicate with them.
Depending on your needs you can use either, but in most cases you will be using SuiteTalk to bring data into NetSuite.
Hope this helps ...
We just got done doing this. We used an iPaaS platform called Jitterbit (similar to Dell Boomi). It can connect to MySQL and to NetSuite, and you can do transformations in the tool. I have been really impressed with the platform overall so far.
There are different approaches, I like the following to process a batch job:
To import data to Netsuite:
Export a CSV from the old system and place it in a NetSuite File Cabinet folder (use a RESTlet or web services for this).
Run a scheduled script to load the files in the folder and update the records.
Don't forget to handle errors. Ways to handle errors: send an email, create a custom record, log to a file, or write to the record.
Once the file has been processed move the file to another folder or delete it.
To export data out of NetSuite:
Gather the data and export it to a CSV (you can use a saved search or similar).
Place CSV in File Cabinet folder.
From an external server, call web services or a RESTlet to grab the new CSV files from the folder (see the sketch after this list).
Process file.
Handle errors.
Call web services or a RESTlet to move or delete the CSV file.
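As a rough illustration of the external half of that export flow, here is a minimal Python sketch; the RESTlet URL, the NLAuth header values, and the JSON shape the RESTlet returns are all hypothetical placeholders you would replace with your own script deployment:

    # fetch_exports.py - pull exported CSV files from NetSuite via a RESTlet.
    import requests

    RESTLET_URL = ("https://rest.netsuite.com/app/site/hosting/restlet.nl"
                   "?script=123&deploy=1")
    HEADERS = {
        "Authorization": "NLAuth nlauth_account=..., nlauth_email=..., "
                         "nlauth_signature=...",
        "Content-Type": "application/json",
    }

    def main() -> None:
        # Ask the RESTlet for the files waiting in the export folder.
        files = requests.get(RESTLET_URL, headers=HEADERS, timeout=60).json()
        for f in files:
            with open(f["name"], "w", newline="") as out:
                out.write(f["contents"])  # process the CSV as needed
            # Tell the RESTlet to move/delete the file so it isn't re-fetched.
            requests.post(RESTLET_URL, headers=HEADERS, timeout=60,
                          json={"action": "archive", "name": f["name"]})

    if __name__ == "__main__":
        main()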
You can also use Pentaho Data Integration; it's free and the learning curve is not that steep. I took this course and I was able to play around with the tool within a couple of hours.

SSIS Continue a complex dataflow from step 12

I have a complex task flow that, in general, creates data warehouse content from Prod on a local DW server and then copies that to an offsite location.
Lately we keep failing on the copy to the offsite location. We recently moved offices and are still working on getting those lines operating as expected. Until then, how do I run steps 12-20 and skip all of the earlier local steps?
TIA
It might make sense to break the steps out into separate packages and invoke them in separate steps, perhaps in an Agent job. In the immediate term, you can disable steps 1-11 and then run the package.

Hudson as passive server

Is it possible to use Hudson only as a passive server, i.e., not using it for building, but instead sending it build results generated by some other tool (perhaps in XML format) and using Hudson only to display the results?
It's very doable.
If it's running on the same machine, such as a cron job, check out http://wiki.hudson-ci.org/display/HUDSON/Monitoring+external+jobs.
If you need to pull data from somewhere else, use a shell script as a build target, and do whatever you need to stage the data locally (scp, etc.).
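As a concrete sketch of the external-jobs mechanism described on that wiki page, a result can be POSTed to Hudson as an XML payload; the Hudson URL and job name here are hypothetical, and the exact endpoint may vary by Hudson version:

    # post_result.py - push an externally generated build result into Hudson.
    import requests

    HUDSON = "http://hudson.example.com"
    JOB = "nightly-external-build"

    def post_result(log_text: str, exit_code: int, duration_ms: int) -> None:
        # The log is hex-encoded ("hexBinary") inside the XML payload.
        payload = ('<run><log encoding="hexBinary">%s</log>'
                   '<result>%d</result><duration>%d</duration></run>'
                   % (log_text.encode().hex(), exit_code, duration_ms))
        resp = requests.post("%s/job/%s/postBuildResult" % (HUDSON, JOB),
                             data=payload.encode(),
                             headers={"Content-Type": "application/xml"})
        resp.raise_for_status()

    if __name__ == "__main__":
        post_result("build ok\n", 0, 1000)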
It may very well be possible using periodic builds, the URL SCM plug-in to pull in the XML files, and the Plot plug-in for display, but more information is required before a more detailed answer can be provided.
What build tool are you currently using to generate build results?
A couple of my Hudson jobs are just summaries and display information. The 'jobs' need to run for data to be collected and saved. The runs could be based on dependent jobs or just scheduled nightly. Some examples:
One of our jobs just merges together the .SER files from Cobertura and generates the Cobertura reports for overall code coverage from all of our unit, integration, and different types of system tests (hint for others doing the same: Cobertura has little logic for unsynchronized .SER files; using them will yield some odd results. There are some tweaks that can be done to the merge code that reduce the problem).
Some of our builds write data to a database. We have a once a week task that pulls the data from the database and creates an HTML file with trend charts. The results are kept as part of the job.
It sounds to me like what you're describing is a plugin for Hudson. For example, the CCCC plugin:
http://wiki.hudson-ci.org/display/HUDSON/CCCC+Plugin
It takes the output, in XML form, from the CCCC analyzer app and displays it in pretty ways in the Hudson interface.
Taking the same concept, you could write a plugin that works with the XML output from whatever build tool you have in mind and display it in Hudson.