We have developed a simple web app that allows a user to upload an Excel file. Our system picks up the file and triggers a simple SSIS task that reads the incoming Excel file (6 columns) and pushes the data into the database.
Our concern is what will happen when multiple users try to upload files simultaneously. Can SSIS serve multiple requests at the same time (which it should)?
What is the upper limit, i.e., how many requests can it handle simultaneously?
What advice can you give us on configuring our server so that we can serve the maximum number of requests?
The same SSIS package can run more than once at the same time. The concerns you must have are the same as if you were running two INSERT statements at the same time: SSIS is only the vehicle you are using to insert the data, and what happens when the data reaches the SQL Server engine is independent of how it got there.
Just remember that SSIS uses bulk inserts when the destination is configured for fast load.
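As a sketch of the kind of contention to watch for (all object names below are hypothetical, not from the question): with the OLE DB Destination's fast-load TABLOCK option, concurrent packages writing to the same table serialize behind the table lock, so either drop TABLOCK or have each upload stage into its own table.

```sql
-- Hypothetical staging table for the six-column upload.
CREATE TABLE dbo.UploadStaging (
    Col1 nvarchar(100), Col2 nvarchar(100), Col3 nvarchar(100),
    Col4 nvarchar(100), Col5 nvarchar(100), Col6 nvarchar(100)
);

-- Roughly what fast load with the TABLOCK hint amounts to:
-- the table lock makes a second concurrent package's bulk insert
-- wait; without TABLOCK, concurrent inserts into the same table
-- can proceed, at some cost in single-stream speed.
INSERT INTO dbo.UploadStaging WITH (TABLOCK)
VALUES (N'a', N'b', N'c', N'd', N'e', N'f');
```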
I have an Access database that uses data pulled from an API; this data export is conducted hourly.
I would like the Access database to automatically run an update module (which imports the exported data and cleans it up) on the same schedule on which the API exports my data, so Access will be current at all times.
The database I designed is used company-wide. The .accde front end and the back end are both located on a Windows Server 2008 machine.
I just wanted suggestions on the best way to automate this without the need for human intervention, and preferably not from my local machine.
From rough research (accompanied by my own ignorance), I'm assuming I need to set up a Task Scheduler job on the server to launch Access and run the module on open if the FOS username is the local machine's user (administrator).
I highly welcome any feedback; as I said, I just taught myself Access these past two months and am horribly unknowledgeable on the implementation side.
How can I check how long it took to extract and load data in SSIS?
I am using the Attunity connector for extracting data from Oracle and then an OLE DB destination to load it into SQL Server. Using Attunity is not making any difference in speed: it took approximately 5 minutes to extract and load 5,600,000 rows from Oracle to SQL Server with the Attunity connector, which is the same as with the OLE DB source.
In the Progress tab, you can see the time taken for the data load to complete.
As others have pointed out, the package duration is shown in the GUI, and you can also write the start/end times to a logging table within the package to capture the execution time. If you're looking for more detail on specific components, enable logging, choose the PipelineComponentTime event, and look for that event on the tasks you want to monitor. The PrimeOutput step, which is when the data is sent downstream, will be of the most interest.
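A minimal sketch of the logging-table approach mentioned above (the table and column names are hypothetical, not from the original post): an Execute SQL Task at the start and end of the package writes to a history table, and duration falls out of a simple query.

```sql
-- Hypothetical execution log; ExecutionId would be mapped from an
-- SSIS variable (e.g. the server execution ID or a package GUID).
CREATE TABLE dbo.PackageExecutionLog (
    ExecutionId bigint       NOT NULL,
    PackageName sysname      NOT NULL,
    StartTime   datetime2(3) NOT NULL,
    EndTime     datetime2(3) NULL
);

-- Execute SQL Task at package start (parameters mapped from SSIS variables):
INSERT INTO dbo.PackageExecutionLog (ExecutionId, PackageName, StartTime)
VALUES (?, ?, SYSDATETIME());

-- Execute SQL Task at package end:
UPDATE dbo.PackageExecutionLog
SET EndTime = SYSDATETIME()
WHERE ExecutionId = ?;

-- Duration per run:
SELECT PackageName, DATEDIFF(second, StartTime, EndTime) AS DurationSec
FROM dbo.PackageExecutionLog;
```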
I have an SSIS package which downloads data from AS400 to SQL Server.
This is working fine but:
The pre-final task (Execute SQL task) is to delete the downloaded records from AS400.
The query is simple:
DELETE FROM (TABLE_NAME)
I am pretty sure that this task is being reached, because a Send Email task after it is working.
The issue is occurring on one server only. And I am unable to figure out why.
The entire setup is the same for all servers.
Try to track down the job log of the job servicing the request. If there are errors being passed along by SSIS, you'll see them there.
One issue I've seen, especially with shops that still call the system AS/400, is that the tables aren't being journaled, which means that when SSIS attempts to delete the rows under commitment control, the delete will fail with an error.
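One hedged workaround, if journaling the table isn't an option: DB2 for i supports an isolation clause on the statement itself, so the delete can be run without commitment control (the table name below is the placeholder from the question).

```sql
-- DB2 for i: WITH NC runs the statement with no commit
-- (outside commitment control), so it does not require
-- the target table to be journaled.
DELETE FROM TABLE_NAME WITH NC;
```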
We have many SSIS packages that move, import, and export large amounts of data. What is the best way to get alerts or notifications if the expected amount of data is not processed? And how can we get a daily report on how the different SSIS packages are functioning? Is there a way to write a custom component and simply plug it into our SSIS packages, instead of writing a custom component for each package?
For your first question: we use user variables in SSIS to log the number of rows processed by each step, along with the package name and execution id. You can then run reports on the history table, and if any of the executions show a large variance in the row counts processed, you can trigger an event.
For the second: yes. See here, or in the alternative, Google "custom ssis component tutorial".
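A sketch of the variance check described above (the table, column names, and the 50% threshold are all hypothetical): compare each run's row count against the package/step average and flag outliers.

```sql
-- Hypothetical row-count history populated by each package run.
CREATE TABLE dbo.RowCountHistory (
    PackageName   sysname      NOT NULL,
    ExecutionId   bigint       NOT NULL,
    StepName      sysname      NOT NULL,
    RowsProcessed int          NOT NULL,
    LoggedAt      datetime2(0) NOT NULL DEFAULT SYSDATETIME()
);

-- Flag executions that deviate more than 50% from the
-- historical average for that package/step combination.
WITH stats AS (
    SELECT PackageName, StepName,
           AVG(CAST(RowsProcessed AS float)) AS AvgRows
    FROM dbo.RowCountHistory
    GROUP BY PackageName, StepName
)
SELECT h.PackageName, h.ExecutionId, h.StepName,
       h.RowsProcessed, s.AvgRows
FROM dbo.RowCountHistory AS h
JOIN stats AS s
  ON s.PackageName = h.PackageName
 AND s.StepName    = h.StepName
WHERE ABS(h.RowsProcessed - s.AvgRows) > 0.5 * s.AvgRows;
```

A scheduled job could email the result of that query each morning to produce the daily report.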
I am new to SSIS. Is there any component to load data from MySQL to SQL Server using SSIS? Currently I am loading data using an ODBC connection, and it is really slow, around 30,000 rows/minute. Is there any way to make the load run faster?
Thanks in advance...
You can install the .NET Connector for MySQL: http://dev.mysql.com/downloads/connector/net/
Then you can create a Script Component configured as a source, import MySql.Data.MySqlClient, and query MySQL directly in C#. The data will then enter your data flow, and you can map it to a SQL Server destination the same as normal.
I find that when using the SSIS connection manager with the .NET providers I get malformed-SQL errors, but this way you write all the SQL yourself.
To improve performance, you can add a Conditional Split transformation to build some parallelism into the data flow that loads data from the ODBC source to the OLE DB destination.
For more information about speeding up SSIS bulk inserts into SQL Server, please see the following blog:
http://henkvandervalk.com/speeding-up-ssis-bulk-inserts-into-sql-server
In the Data Flow Task properties, you can also increase the buffer size (DefaultBufferSize) and the number of rows per buffer (DefaultBufferMaxRows), and raise the rows-per-batch and commit size on the destination.