SSIS package to process whole SSAS database - sql-server-2008

We are using SSIS packages to process Cubes in SQL Server 2008 R2 Analysis Services. Until now we have been using an Analysis Services Processing Task in the package and manually adding all Cube and Dimension objects to the processing queue in that task. This also means we have to adjust the package whenever we add Dimensions to a Cube or Cubes to the SSAS DB.
But now we need an SSIS package that will process the whole SSAS database selected, so that we can modify the Cube later on, possibly adding Dimensions, without having to modify the package as well.
In SQL Server Management Studio it is possible to right-click an SSAS database and select "Process...", but I could not find out how to do the same in the corresponding SSIS task.
Is there any way to process a whole SSAS database in an SSIS package?
Thanks in advance,
Christian

I'm confused as to why you can't use the SSIS Analysis Services Processing Task. I believe you have the option to select a database in the processing settings. You can then choose the entire database rather than choosing individual cubes or dimensions in that database. Just make sure the Type says Database.
I have also used the XMLA answer provided by @Meff and it works fine as well.

You could also use AMO; you would need to add a reference to Microsoft.AnalysisServices in an SSIS Script Task and provide the variable values. This approach doesn't tie you to the database Id, but it is slightly more complex:
// In the Script Task, add a reference to the AMO assembly and put
// 'using Microsoft.AnalysisServices;' at the top of ScriptMain.

// Read the SSAS connection string and database name from package variables.
string cubeConnectionString = Dts.Variables["User::CubeConnectionString"].Value.ToString();
string databaseName = Dts.Variables["User::DatabaseName"].Value.ToString();

// Connect to the server, look the database up by name and fully process it.
Server server = new Server();
server.Connect(cubeConnectionString);
Database database = server.Databases.FindByName(databaseName);
database.Process(ProcessType.ProcessFull);
server.Disconnect();

Dts.TaskResult = (int)ScriptResults.Success;

When you go to process the whole database, before you click 'OK' you should see a "Script" button in the top-left of the process window. This generates the processing XMLA in a new window.
Now take that processing XMLA and use it in an "Analysis Services Execute DDL" control-flow task.
Be careful with cube re-deployments, as you'll see that the XMLA references the database by its Id rather than its name.
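For illustration, the generated script typically looks something like the following (the DatabaseID value here is only a placeholder; copy the actual Id from the script SSMS produces for your database):

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <DatabaseID>MySsasDatabase</DatabaseID>
    </Object>
    <Type>ProcessFull</Type>
  </Process>
</Batch>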

Related

How to store SSIS Packages duration

I need to store the duration of my SSIS package executions in a SQL Server database. How can I calculate it?
If you are using SQL Server 2012 or later, you are strongly advised to run your packages from the SSIS Catalog (SSISDB). The main reason is its extensive built-in execution logging; the catalog.executions view contains the data you need for this task, including the start and end time of each execution.
If you are on SQL Server 2008, or you run packages outside the SSIS Catalog, then you have to build something yourself. You could parse the output of DTExec, which always reports the run duration.
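For illustration, a small C# sketch that pulls execution durations from catalog.executions might look like this (the connection string, and the use of System.Data.SqlClient from a standalone console program, are assumptions):

// Reads package execution durations from the SSIS Catalog (SQL Server 2012+ only).
using System;
using System.Data.SqlClient;

class PackageDurations
{
    static void Main()
    {
        // Placeholder connection string: point it at the instance hosting SSISDB.
        string connStr = "Data Source=localhost;Initial Catalog=SSISDB;Integrated Security=SSPI;";
        string sql = @"SELECT execution_id, package_name, start_time, end_time,
                              DATEDIFF(second, start_time, end_time) AS duration_seconds
                       FROM catalog.executions
                       WHERE end_time IS NOT NULL;";

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0} (execution {1}): {2} s",
                        reader["package_name"], reader["execution_id"], reader["duration_seconds"]);
                }
            }
        }
    }
}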

Use SSIS to write to cube systems other than SSAS

I've got a project with several different cube systems (in particular TM1, Infor PM 10, and SSAS). Is there a way to fill these cubes with SSIS? The SSIS-to-SSAS connection is certainly the easy one, but are there any approaches for writing into the other cube systems with SSIS? (Maybe an open-source interface?)
If not, what would be the best tools to use? At the moment I only know Cubeware Importer, but it is so slow that I definitely need a faster one.
You can use the Execute SQL Task to send processing commands to the non-SSAS OLAP servers. I'm no expert in TM1 or Infor PM, but whatever their equivalent of SSAS's XMLA is, it can be sent from an SSIS package.

Fast way to load data from MySQL to SQL Server using SSIS

I am new to SSIS. Is there any component to load data from MySQL to SQL Server using SSIS? Currently I am loading data via an ODBC connection and it is really slow, at around 30,000 rows per minute. Is there any way to make the load run faster?
Thanks in Advance...
You can install the .NET Connector for MySQL: http://dev.mysql.com/downloads/connector/net/
Then you can create a Script Component configured as a source in your Data Flow, import MySql.Data.MySqlClient, and query MySQL directly in C#. The data will then enter your Data Flow and you can map it to a SQL Server destination as normal.
I find that when using the SSIS connection manager with the .NET providers I get malformed SQL errors, but this way you write all the SQL yourself.
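As a rough sketch, the source Script Component might look something like this (the output name Output0, the Id and Name columns, the User::MySqlConnectionString variable and the source_table query are all placeholders you would adapt):

// Add 'using MySql.Data.MySqlClient;' at the top of the script file
// (requires the MySQL Connector/NET install linked above).
public override void CreateNewOutputRows()
{
    // Connection string comes from a read-only package variable (placeholder name).
    string connStr = Variables.MySqlConnectionString;

    using (var conn = new MySqlConnection(connStr))
    using (var cmd = new MySqlCommand("SELECT id, name FROM source_table", conn))
    {
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // Push each MySQL row into the data flow buffer.
                Output0Buffer.AddRow();
                Output0Buffer.Id = reader.GetInt32(0);
                Output0Buffer.Name = reader.GetString(1);
            }
        }
    }
}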
To improve performance, you can add a Conditional Split transformation to build some parallelism into the data flow that loads data directly from the ODBC source to the OLE DB destination.
For more information about speeding up SSIS Bulk Inserts into SQL Server, please see the following blog:
http://henkvandervalk.com/speeding-up-ssis-bulk-inserts-into-sql-server
In the Data Flow Task properties, increase the buffer size (DefaultBufferSize and DefaultBufferMaxRows) and the number of rows per commit on the destination.

Reading from SQL Server and passing it to a C# method

I need to read a column from a database table depending upon some parameter. If the database table has two columns, status and ID, then I have to read the ID if the status is true. Then I have to pass this ID to a C# method.
How can I achieve this in SSIS? Basically, my SSIS package should read the data from SQL Server and pass it to a C# method.
SSIS is an ETL Tool for moving and transforming large quantities of data. If you need to do a lot of C# work and you only have one record, or a few records, SSIS may not be the right tool for this purpose. You might do better writing an ASP.NET web application or a Windows application. These applications can also use SQL to get data for processing in C#.
If you are determined to do this in SSIS and C#, here are two possible approaches:
You could use an Execute SQL Task to perform your query and save the rowset into a variable. Then you would use a C# Script Task to do something with the contents of the variable.
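For example, a rough sketch of the Script Task code for this first approach, assuming the Execute SQL Task saved its Full result set into an Object variable named User::QueryResult (the variable, the column names and MyCustomMethod below are placeholders):

// Requires 'using System.Data;' and 'using System.Data.OleDb;' at the top of ScriptMain.
// Load the ADO recordset stored by the Execute SQL Task into a DataTable.
DataTable table = new DataTable();
OleDbDataAdapter adapter = new OleDbDataAdapter();
adapter.Fill(table, Dts.Variables["User::QueryResult"].Value);

foreach (DataRow row in table.Rows)
{
    // Read the ID only where the status flag is true, then hand it to the C# method.
    if (Convert.ToBoolean(row["status"]))
    {
        MyCustomMethod(Convert.ToInt32(row["ID"]));   // placeholder for your own method
    }
}

Dts.TaskResult = (int)ScriptResults.Success;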
You could create a Data Flow Task. The dataflow should have the structure Source -> Transformation -> Destination, and can include several transformation components.
You would use, for example, an OLE DB Source component to perform your query. Then you would use a C# Script Component to transform each record returned by the query. Finally, you would use an OLE DB Destination component to do something with the output for each record.

SSIS 2005/2008 - How do I allow multiple tasks to have a common target task?

In VB pseudocode, I'm trying to accomplish the following using SSIS (either 2008 or 2005)
If FileHasAlreadyBeenDownloaded = False Then
DownloadTheFileFromFTP
End If
ImportTheDownloadedFile
To do this in SSIS, I have a Script Task that checks for the file; if the file exists, it transfers directly to the Data Flow Task using conditional expressions. If the file doesn't exist, it transfers to the FTP Task, and the FTP Task transfers to the Data Flow Task.
It seems, though, that I can't have two tasks lead into one common shared task: no matter which path the execution takes, the Data Flow Task won't run. If I make a copy of the Data Flow Task and have each path go to its own copy, then everything works perfectly.
Is this a documented behavior of SSIS that I just haven't found? I looked through 31 pages of SSIS questions before posting, so hopefully this isn't a stupid question.
I also tried using expressions on the FTP Task to set "Disabled=@FileAlreadyDownloaded=True", but that works only in SSIS 2008 and didn't seem to work in SSIS 2005.
Thanks so much for any pointers on this!
It might be worth trying to put the Script Task and the FTP Task inside a Sequence Container, and then linking the container to the Data Flow Task on success.