How to consume output from Tabular Mode in .NET?

I have a local SQL Server instance in Tabular Mode that reads data from a regular local SQL Server database. I have created some calculated columns and fields.
Now I want to use these calculated columns and fields in my "Report Server Project"; how do I do this?
Thanks!

To query Tabular models from .NET, look into the Microsoft.AnalysisServices.AdomdClient namespace. The following is a simple example that sends a DAX query to the Tabular database and returns the results for one measure, filtered by a column from one of the dimensions. Calculated columns can be accessed the same way.
// Requires a reference to Microsoft.AnalysisServices.AdomdClient.dll
using Microsoft.AnalysisServices.AdomdClient;
using System.Windows.Forms;

// Connect to the Tabular instance and database
AdomdConnection conn = new AdomdConnection("Data Source=localhost;Catalog=YourTabularModel");
conn.Open();

// DAX query: one measure grouped by one column
string query = "EVALUATE SUMMARIZECOLUMNS( Employee[Employee Name], \"Employee Count\", "
    + "[Total Employee Count])";
AdomdCommand modelCmd = new AdomdCommand(query, conn);
AdomdDataReader dataRdr = modelCmd.ExecuteReader();
while (dataRdr.Read())
{
    MessageBox.Show(dataRdr[0].ToString() + " - " + dataRdr[1].ToString());
}
dataRdr.Close();
conn.Close();

Related

How do I check whether all the rows for a column are NULL in SSIS?

I'm working on an SSIS package where I have a text file with 5 columns. I need to check whether all the rows in the 5th column are NULL.
If all the rows in the 5th column are NULL, all the data should go to the invalid file table.
If any row in the 5th column has a non-NULL value, all the data should go to the valid table.
You will need to read the entire file before you can decide where to write it, so introduce a third table where you can stage the data first.
The next part is to build the logic that checks the staging table for all NULLs. The query below returns 0 if everything was NULL, and more than 0 if any record had a value:
SELECT COUNT(*) FROM dbo.StagingTable ST WHERE ST.Column5 IS NOT NULL
Once you feed the result into a variable, you can use precedence constraints to fire the [staging to active] data flow if the result was more than 0, or [staging to faulty] if the result was 0; the two constraint expressions are sketched below.
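For illustration, assuming the count from the query above lands in an SSIS variable named User::NotNullCount (a name chosen here for the sketch, not from the original), the two precedence constraint expressions would be:
@[User::NotNullCount] > 0 on the path to [staging to active]
@[User::NotNullCount] == 0 on the path to [staging to faulty]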
Personally, if I had to perform this task I would use a script task to do it all (a sketch follows below):
Load the file into a DataTable
Use LINQ on the 5th column to determine the destination, e.g. .Where(x => x[4] != DBNull.Value).Count()
Load to the destination via SqlBulkCopy
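Here is a minimal sketch of that approach. The delimiter, table names, and connection string are assumptions for illustration; adjust them to your package. Note that dt.AsEnumerable() needs a reference to System.Data.DataSetExtensions.
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Linq;

public static void LoadFileByColumn5(string filePath, string connStr)
{
    // Load the delimited file into a DataTable (5 unnamed string columns assumed)
    DataTable dt = new DataTable();
    for (int i = 1; i <= 5; i++) dt.Columns.Add("Column" + i, typeof(string));
    foreach (string line in File.ReadLines(filePath))
    {
        string[] fields = line.Split(',');
        DataRow row = dt.NewRow();
        for (int i = 0; i < 5; i++)
            row[i] = (i < fields.Length && fields[i] != "") ? (object)fields[i] : DBNull.Value;
        dt.Rows.Add(row);
    }

    // LINQ check: does any row have a value in the 5th column?
    bool anyValue = dt.AsEnumerable().Any(r => !r.IsNull(4));

    // Bulk copy everything to whichever table the check selected
    using (SqlBulkCopy bulk = new SqlBulkCopy(connStr))
    {
        bulk.DestinationTableName = anyValue ? "dbo.ValidTable" : "dbo.InvalidFileTable";
        bulk.WriteToServer(dt);
    }
}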
You can inspect the file with C# using an OleDbDataAdapter, then decide where to load it using SSIS Precedence Constraints. This example uses a CSV file without column names; if the columns do have names, add the replacement code noted in the comments below. You will also need the using statements listed.
Add an SSIS Boolean variable, IsColumnNull in the following example. Next add a C# Script Task with the IsColumnNull variable in the ReadWriteVariables field and (optionally) a variable holding the file path in the ReadOnlyVariables field.
Next, set Precedence Constraints to check for both the true condition (has all-null rows) and the false condition (has non-null records). Since IsColumnNull is a Boolean, the expression to check for all null rows is just the variable itself, @[User::IsColumnNull]; add ! for non-nulls, i.e. !@[User::IsColumnNull].
Connect the Data Flow Task for each destination table to the corresponding Precedence Constraint. For example, add the Data Flow Task with the "invalid file table" as the destination after the Precedence Constraint checking for a true value in the IsColumnNull variable.
Script Task Example:
using System;
using System.Data;
using System.IO;
using System.Data.OleDb;
using System.Linq;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;

// Body of the Script Task's Main() method
string fullFilePath = Dts.Variables["User::FilePath"].Value.ToString();
string fileName = Path.GetFileName(fullFilePath);
string filePath = Path.GetDirectoryName(fullFilePath);

// Jet 4.0 is 32-bit only; run the package with Run64BitRuntime = False
string connStr = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + filePath
    + ";Extended Properties=\"text;HDR=No;FMT=Delimited\";";

// Filter for NOT NULL on the given column so only non-nulls are returned
string sql = "SELECT F2 FROM " + fileName + " WHERE F2 IS NOT NULL";

// If the file has column names, replace "connStr" and "sql" as shown below
/*
string connStr = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + filePath
    + ";Extended Properties=\"text;HDR=Yes;FMT=Delimited\";";
string sql = "SELECT ID FROM " + fileName + " WHERE ID IS NOT NULL";
*/

using (OleDbDataAdapter oleAdpt = new OleDbDataAdapter(sql, connStr))
{
    DataTable dt = new DataTable();
    oleAdpt.Fill(dt);

    // If empty, every row in the column was NULL; set the SSIS variable accordingly
    Dts.Variables["User::IsColumnNull"].Value = (dt.Rows.Count < 1);
}

SQL parametric columns in ASP.NET

Why can't you use parameters in an SQL statement as the column name? I found this out after two hours of wondering what the problem could be. The only way that seemed possible was to build the query in a way that could be vulnerable to SQL injection (which for me isn't a problem, because the parameters are generated server-side).
This works:
string cmdgetValues = "SELECT " + column + " FROM user WHERE " + filterColumn + " = @filter";
MySqlCommand getValues = new MySqlCommand(cmdgetValues, connectionDB);
getValues.Parameters.AddWithValue("@filter", filterValue);
This doesn't work:
string cmdgetValues = "SELECT @column FROM user WHERE @filterColumn = @filter";
MySqlCommand getValues = new MySqlCommand(cmdgetValues, connectionDB);
getValues.Parameters.AddWithValue("@column", column);
getValues.Parameters.AddWithValue("@filterColumn", filterColumn);
getValues.Parameters.AddWithValue("@filter", filterValue);
Why is this? And is it intended?
Because column names are part of the query's structure, not its values. The database has to parse the statement and build a query plan before any parameter values are bound, so parameters can only stand in for values, never for identifiers like column or table names. Yes, this is intended; it is also what makes parameters injection-proof.
If you need to decide the columns at runtime, validate the names in code before concatenating them (a sketch follows below), or look into MySQL's prepared SQL statement syntax.
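A minimal sketch of the usual workaround: identifiers are checked against a whitelist in code, and only the value is parameterized. The table and column names here are illustrative, not from the original question.
using System;
using System.Collections.Generic;
using MySql.Data.MySqlClient;

public static class UserQueries
{
    // Whitelist of identifiers this code is allowed to interpolate (hypothetical names)
    static readonly HashSet<string> AllowedColumns =
        new HashSet<string> { "username", "email", "created_at" };

    public static object GetValue(MySqlConnection conn, string column,
                                  string filterColumn, object filterValue)
    {
        // Reject anything not on the whitelist before it reaches the SQL text
        if (!AllowedColumns.Contains(column) || !AllowedColumns.Contains(filterColumn))
            throw new ArgumentException("Column name is not on the whitelist.");

        // Identifiers are concatenated (now safe); the value stays a parameter
        string sql = "SELECT " + column + " FROM user WHERE " + filterColumn + " = @filter";
        using (var cmd = new MySqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@filter", filterValue);
            return cmd.ExecuteScalar();
        }
    }
}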

SSIS write DT_NTEXT into a UTF-8 CSV file

I need to write the result of a SQL query into a CSV file encoded in UTF-8 (I need this encoding because there are French letters). One of the columns is too large (more than 20000 characters), so I can't use DT_WSTR for it. The input type is DT_TEXT, so I use a Data Conversion to change it to DT_NTEXT. But when I then want to write it to the file, I get this error message:
Error 2 Validation error. The data type for "input column" is
DT_NTEXT, which is not supported with ANSI files. Use DT_TEXT instead
and convert the data to DT_NTEXT using the data conversion component
Is there a way I can write the data to my file?
Thank you
I've had this kind of issue too, at times. When working with data longer than 255 characters, SSIS sees it as BLOB data and will always handle it as such.
I converted this BLOB stream data to readable text with a script component (a sketch follows below); after that, other transformations were possible.
This was the case in the SSIS that shipped with SQL Server 2008, and I believe it hasn't changed since.
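As a rough sketch of that conversion: in a script component the DT_NTEXT value arrives as a BlobColumn, whose bytes are UTF-16. The column name MyLargeColumn is a placeholder, not from the original question.
// Inside the script component, override the generated row-processing method
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // Read the whole BLOB and decode it (DT_NTEXT bytes are UTF-16, hence Encoding.Unicode)
    byte[] blobBytes = Row.MyLargeColumn.GetBlobData(0, (int)Row.MyLargeColumn.Length);
    string text = System.Text.Encoding.Unicode.GetString(blobBytes);
    // ... assign "text" to an output column or write it to a file here
}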
I ended up doing just what Samyne says: I used a script.
First I modified my SQL SP; instead of having several columns, I put all the info in one single column, as follows:
Select Column1 + '^' + Column2 + '^' + Column3 ...
Then I used this code in a script
// Requires: using System.Data; using System.Data.OleDb;
// using System.IO; using System.Text;
string fileName = Dts.Variables["SLTemplateFilePath"].Value.ToString();

// FileMode.Truncate assumes the file already exists; use FileMode.Create otherwise
using (var stream = new FileStream(fileName, FileMode.Truncate))
{
    // StreamWriter with Encoding.UTF8 produces the UTF-8 output the question asks for
    using (var sw = new StreamWriter(stream, Encoding.UTF8))
    {
        // Shred the SSIS object variable (an ADO recordset) into a DataTable
        OleDbDataAdapter oleDA = new OleDbDataAdapter();
        DataTable dt = new DataTable();
        oleDA.Fill(dt, Dts.Variables["FileData"].Value);
        foreach (DataRow row in dt.Rows)
        {
            foreach (DataColumn column in dt.Columns)
            {
                sw.WriteLine(row[column]);
            }
        }
        sw.WriteLine();
    }
}
Putting all the info in one column is optional; I just wanted to avoid handling it in the script. This way, if my SP is changed, I don't need to modify the SSIS package.

SSIS Lookup multiple identical databases

I'm working on a project where I need to do lookups against a data warehouse server in Integration Services.
My problem is that I need to be able to change which database the lookup is performed against. The databases are identical in design.
I have solved this problem with a script component before, where for each row, if the database id has changed, the connection changes. Example below:
if (databaseNr != Row.DatabaseNr)
{
    try
    {
        // Switch the open connection to the catalog for this row's database id
        databaseNr = Row.DatabaseNr;
        currentCatalog = "db" + Row.DatabaseNr;
        connection.ChangeDatabase(currentCatalog);
    }
    catch (Exception e)
    {
        ComponentMetaData.FireWarning(0, ComponentMetaData.Name, e.Message, "", 0);
    }
}
// Three-part naming needs the schema, hence the [dbo] between catalog and table
string command = "SELECT Id, Name, Surname FROM [" + currentCatalog
    + "].[dbo].[TableName] WHERE Id = '" + Row.OrderID + "'";
But it would save me a lot of trouble if this were possible with the Lookup component.
So my question is: is it possible in any way to use column data to change which database the Lookup component queries?
Grateful for any help!
What you can do is:
Go to the control flow
Select your data flow task
Go to Properties and open the Expressions collection
Create an expression for the lookup's query; you can reuse a query prepared in a script task (see the sketch below).
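For illustration only: assuming the component is named Lookup and a package variable User::DatabaseNr holds the database id (both names are assumptions here), the expression on the data flow task's [Lookup].[SqlCommand] property might look like this:
"SELECT Id, Name, Surname FROM [db" + (DT_WSTR, 10)@[User::DatabaseNr] + "].[dbo].[TableName]"
Note that expressions are evaluated when the data flow starts, not per row, so this switches the database per execution rather than per incoming row.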

Problem using OleDb data types to write data to an Excel sheet

I am trying to insert some data, obtained from a MySQL database, into an Excel sheet using an OleDb data adapter. The data contains very long texts whose MySQL types are Varchar(1023), Text, LongText, etc. When passing these to the OleDb data adapter I tried oledb.VarWChar and oledb.LongVarWChar with size 5000 and so on, but I get the following exception when I run da.Update(...):
The field is too small to accept the amount of data you attempted to add. Try inserting or pasting less data
I am having trouble understanding which OleDb data types, with which sizes, I should use to map these long text values.
Could someone please help me with this?
Thanks.
I am doing something similar and ran into the same error with varchar(max) data types coming from SQL Server. It doesn't matter where the data is coming from, though. When you get the data from your database, you need to define the schema for the column data types and sizes. I do this by calling FillSchema on the data adapter that I am using:
// cmd is the command that pulls the source data
DataTable dt = new DataTable();
SqlDataAdapter da = new SqlDataAdapter(cmd);
da.Fill(dt);
// Copy the source column types and sizes into the DataTable's schema
da.FillSchema(dt, SchemaType.Source);
return dt;
You could also set the column properties individually, if you wanted.
Then I loop through each column in my DataTable and set up the columns for export to OleDb using ADOX.NET. You don't have to use ADOX.NET; the main concept here is to use the sizes that came from the original database.
// cat (ADOX.Catalog), adoxTable (ADOX.Table), and columnList (StringBuilder)
// are set up earlier, outside this loop
foreach (DataColumn col in dt.Columns)
{
    columnList.Append(dt.TableName + "." + col.ColumnName + ", ");
    ADOX.Column adoxCol = new Column();
    adoxCol.ParentCatalog = cat;
    adoxCol.Name = col.ColumnName;
    // Map the .NET type and size to an ADOX/OleDb type (see TranslateType below)
    adoxCol.Type = TranslateType(col.DataType, col.MaxLength);
    int size = col.MaxLength > 0 ? col.MaxLength : 0;
    if (col.AllowDBNull)
    {
        adoxCol.Attributes = ColumnAttributesEnum.adColNullable;
    }
    adoxTable.Columns.Append(adoxCol, adoxCol.Type, size);
}
Here is a snippet from my TranslateType method that determines whether to use LongVarWChar or VarWChar. These data types are the ADOX.NET versions of the OleDb data types. I believe anything over 4000 characters should use the LongVarWChar type, but I'm not sure about that. You didn't mention which version of Excel is your target, but I have this working with both Excel 2003 and Excel 2007.
case "System.String":
if (maxLength > 4000)
{
return DataTypeEnum.adLongVarWChar;
}
return DataTypeEnum.adVarWChar;
The LongVarWChar type can accommodate sizes up to 2 GB, so don't worry about making the size too big.