I'm working on a project where I need to do lookups on a data warehouse server in Integration Services.
My problem is that I need to be able to change which database the lookup is performed against. The databases are identical in design.
I have solved this problem with a Script Component before: for each row, if the database id has changed, the connection changes, as in the example below.
if (databaseNr != Row.DatabaseNr)
{
    try
    {
        // Switch the connection to the catalog for this row's database id
        databaseNr = Row.DatabaseNr;
        currentCatalog = "db" + Row.DatabaseNr;
        connection.ChangeDatabase(currentCatalog);
    }
    catch (Exception e)
    {
        ComponentMetaData.FireWarning(0, ComponentMetaData.Name, e.Message, "", 0);
    }
}
string command = "SELECT Id, Name, Surname FROM [" + currentCatalog + "].[TableName] WHERE Id = '" + Row.OrderID + "'";
But it would save me a lot of trouble if this were possible with the Lookup component.
So my question is: is there any way to use column data to change which database the Lookup component performs its lookup against?
Grateful for any help!
What you can do is:
Go to the Control Flow.
Select your Data Flow Task.
Go to Properties and locate the expressions exposed for the Lookup component.
Create an expression for the Lookup's query; you can reuse a query prepared in a Script Task. See the example after these steps.
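As a rough example, assuming a package variable @[User::DatabaseNr] holds the target database number (the variable name is only illustrative) and the table lives in the dbo schema, the expression attached to the Lookup's SqlCommand property (typically exposed on the Data Flow Task as [Lookup].[SqlCommand]) could look like:
"SELECT Id, Name, Surname FROM [db" + (DT_STR,10,1252) @[User::DatabaseNr] + "].[dbo].[TableName]"
Note that property expressions are evaluated when the task begins, not per incoming row, so this switches the database per execution rather than per row.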
I have a local SQL Server in Tabular Mode in which I read data from a local regular SQL Server Database. I have created some Calculated Columns and Fields.
Now I want to use these calculated columns and fields in my "Report Server Project"; how do I do this?
Thanks!
To query Tabular models from .NET look into using the AdomdClient namespace. The following is a simple example, which sends a DAX command to the Tabular database and returns the results for one measure filtered by the column of one of the dimensions. Calculated columns can be accessed as they typically would be using this method.
// Requires a reference to the ADOMD.NET client library (Microsoft.AnalysisServices.AdomdClient namespace)
AdomdConnection conn = new AdomdConnection("Data Source=localhost;Catalog=YourTabularModel");
conn.Open();

// DAX query returning one measure grouped by a dimension column
string query = "EVALUATE SUMMARIZECOLUMNS( Employee[Employee Name], \"Employee Count\", "
    + "[Total Employee Count])";

AdomdCommand modelCmd = new AdomdCommand(query, conn);
AdomdDataReader dataRdr = modelCmd.ExecuteReader();
while (dataRdr.Read())
{
    MessageBox.Show(dataRdr[0].ToString() + " - " + dataRdr[1].ToString());
}
dataRdr.Close();
conn.Close();
I'm currently using couchbase-lite inside my iOS and android application to sync down files from a database running CouchDB.
Every so often I remove files that are no longer needed, and I would like the same files to be removed from the mobile app as well, but a pull replication only pulls updates or new files and doesn't trigger a delete on the mobile app.
Is there any way to delete documents from the mobile app that are no longer on the server DB without doing a full purge on the mobile application, and then resyncing the whole database?
The Database class has a purgeDocument() method. This removes the target document from the local database; server copies of the target document are left unchanged. If the document is later updated on the server, it will return to the local client on the next replication.
http://docs.couchbase.com/mobile/2.1/couchbase-lite-swift/Classes/Database.html
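The linked page documents the Swift API; as a rough Java sketch of the same idea, assuming Couchbase Lite 2.x for Android where the Database class exposes a purge() method (docId here is a placeholder for an id you have obtained, for example from the query further down):
// Sketch only: remove a document from the local database without creating a tombstone,
// so nothing is pushed back to the server.
try {
    Document doc = database.getDocument(docId);
    if (doc != null) {
        database.purge(doc);   // local removal only; the server copy is untouched
    }
} catch (CouchbaseLiteException e) {
    e.printStackTrace();
}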
You can delete documents from the database by id as follows (docList here is the ResultSet returned by the query shown below):
try {
    for (Result result : docList) {
        String id = result.getString(0);
        Document doc = database.getDocument(id);
        database.delete(doc);   // deletes the document from the local database
    }
} catch (CouchbaseLiteException e) {
    e.printStackTrace();
}
To get the document ids, you can run a query like this:
Query query = QueryBuilder
    .select(SelectResult.expression(Meta.id), SelectResult.all())
    .from(DataSource.database(database));
ResultSet rs = query.execute();
for (Result result : rs) {
    Dictionary data = result.getDictionary("db_name");   // SelectResult.all() is keyed by the database name
    Log.e("CouchbaseLite ", "document: " + data);
    Log.e("CouchbaseLite ", "id: " + result.getString(0));
}
I have a folder which contains multiple Excel files that are replaced daily. I need a total row count that gives me the sum of row counts from each individual Excel file within the folder (i.e. if there are 3 files with 10 records each, I need a result count of 30). I then need to run this package daily to add an individual record to a log table that will provide me with the daily count of records in the folder. I've been trying Foreach Loop Containers and ADO Enumerators but cannot seem to achieve a solution.
There is a good solution that you can apply without needing any Script Task. All you need is a Foreach Loop Container, a Data Flow Task and an Execute SQL Task.
Define Variables:
V_FilesPath -> (String) holds the path where your files are located
V_FileName -> (String) is populated with each file name by the Foreach Loop Container
V_RowCount -> (Int) loop counter: the number of files processed so far
V_FileRowCount -> (Int) the number of rows in the current file
V_TotalRecords -> (Int) the running total of rows across all files
Define the Foreach Loop Container: enumerate the Excel files found in V_FilesPath and map the file name of each iteration to V_FileName.
Inside the Data Flow Task, map your Excel Source to a Row Count component and select the variable V_FileRowCount.
In the Execute SQL Task, change the result set to "Single Row".
Map the Result Set to the variables V_TotalRecords and V_RowCount.
In the Expressions page of the Execute SQL Task, choose a Property Expression for the SqlStatementSource property and enter the following expression:
" SELECT " + (DT_STR,10,1252)#[User::V_TotalRecords] + " + " + (DT_STR,10,1252) #[User::V_FileRowCount] + ", 1 + " +(DT_STR,10,1252) #[User::V_RowCount]
Once you've completed the aforementioned you're done!
If you wish to see the result, add a Script Task (just to display the results)
and paste the following code in place of the part that starts with public void Main:
public void Main()
{
    // MessageBox requires a reference to System.Windows.Forms
    try
    {
        string Variables = "Loop Counter: " + Dts.Variables["User::V_RowCount"].Value.ToString()
            + " Total Records in all files: " + Dts.Variables["User::V_TotalRecords"].Value.ToString();
        MessageBox.Show(Variables);
    }
    catch (Exception Ex)
    {
        MessageBox.Show(Ex.Message);
    }
}
You can use the Row Count component.
Create a Data Flow Task in the Control Flow. Then, inside Data Flow, use an Excel Source component connected to a Row Count component. Create an integer variable, double click the Row Count component and assign the variable to it.
If you configured your Excel Source correctly (with an Excel Connection Manager), the variable you created will hold the number of rows in the Excel file you're passing.
I'm trying to write a tool which will update a user's account credentials across several databases. Each of these databases supports a different version of the same application, so each database has the same table names. I'm only interested in one table called opususer in each of these databases.
I was able to create the first LINQ to SQL class, and using a checkbox list I create a collection of selected items and send it to a method which loops through and should update the user credentials. This works fine when I have one DataContext, but when I add another LINQ to SQL class and try to recreate the same thing against a different database, I receive several ambiguity errors and "The member is defined more than once" errors.
I don't understand this: the LINQ to SQL class points to a completely different database, and although the table name is the same, why should that matter when the DataContext should keep them separate?
I've tried using one DataContext and also adding an alias, but I'm not sure if this can be done. I'm new to ASP.NET...
if (DatabaseName == "clincomm_243x")
{
    using (UserAccountDataContext Data = new UserAccountDataContext()) // database clincomm_243x
    {
        string UserName = TextBoxUserName.Text.ToUpper();
        opususer CheckUser = Data.opususers
            .SingleOrDefault(opususer => opususer.username == UserName && opususer.active == true);
        if (CheckUser == null || TextBoxUserName.Text.Length == 0)
        {
            TextBoxResult.Visible = true;
            TextBoxResult.Text = "Username " + UserName + " does not exist. Please check and try again!";
        }
        else
        {
            TextBoxResult.Text = "User " + UserName + " has been found.";
            TextBoxResult.Visible = true;
            TextBoxResult.Text += "\nAttempting to update user account details.....\n";
            // Set the new values for the record returned
            CheckUser.password = "03ac674216f3e15c761ee1a5e255f067953623c8b388b4459e13f978d7c846f4";
            CheckUser.hashtype_code = "SHA-256";
            CheckUser.unsuccessfullogons = 0;
            CheckUser.active = true;
            DateTime newPasswordExpiryDate = DateTime.Now.Date.AddYears(10);
            CheckUser.passwordexpirationdate = newPasswordExpiryDate;
            Data.SubmitChanges();
            TextBoxResult.Text += "\nUser Account " + UserName + " has been successfully updated\n";
            TextBoxResult.Text += "\nPassword has now been set to 1234 and will not expire until " + newPasswordExpiryDate;
        }
    }
}
Even though you are using a new data context, the classes that represent your table(s) will reside in the same namespace - unless you specify a different namespace. Try giving unique namespaces to the "Entity Namespace" property for each of your DataContexts.
I'm building a REST WebService with JAX-RS and Tomcat to consume a MySQL Database.
I'm following this model:
#Path("/login")
public class Login {
String username;
String password;
// This method is called if POST is requested
#POST
#Produces(MediaType.APPLICATION_XML)
public String loginResponseXML(#FormParam("username") String user, #FormParam("password") String pass) {
//Connection to MySQL Database
try {
Class.forName("com.mysql.jdbc.Driver");
Connection con = DriverManager.getConnection("jdbc:mysql://localhost/sakila", "root","larcom");
Statement stmt = con.createStatement();
ResultSet rs = stmt.executeQuery("Select first_name, last_name From actor where first_name='" +
user + "' and last_name='" + pass + "'");
while (rs.next()){
System.out.println(rs.getString("first_name") + " " + rs.getString("last_name"));
username = rs.getString("first_name");
password = rs.getString("last_name");
}
}
catch (ClassNotFoundException e) {
e.printStackTrace();
} catch (SQLException e) {
e.printStackTrace();
}
if (user.equals(username) && pass.equals(password)) {
return ("<?xml version=\"1.0\"?>" + "<auth>200" + "</auth>"); //Success
//return "Success!";
} else {
return ("<?xml version=\"1.0\"?>" + "<auth>404" + "</auth>"); //Damn
//return "Damn!";
}
}
}
I call this method with:
HttpPost httppost = new HttpPost("http://192.168.15.245:8080/org.jersey.andre/rest/login");
Now, my question is:
If I want to query the DB for another table, do I have to create a new class like Login and make the JDBC connection again?
A new class and a new JDBC connection for each class that queries the DB? Are there performance issues?
Hope you can understand.
Thanks in advance.
A few tips are in order here. Please isolate the DB-based code into a "data layer", so to speak; only perform dispatching/business logic within your resource classes.
Now, if you are querying a different table, you WILL have a different query! You could either reuse the same connection (bad) or create a new one and fire the different query (or queries).
Now, whether each resource hits a different table or the same table with a different query depends on your choice of 'representation' for that resource. There is a reason an RDB schema has multiple tables, and it's quite common to have different queries involving multiple tables, or queries against mutually independent tables.
Performance issues: For 'fresh data' you ARE always going to hit the DB so to speak. If you want to optimize that either develop your own cache (extremely hard) or use approaches like memcached or ehcache to boost performance - before you decide to do that make sure you verify if it's worth it.
Are you going to be having about 1000 DB hits per second? You probably need some performance boosting/handling. Per day...maybe not. Per 2-3 days...YAGNI (You ain't gonna need it, so don't worry for now)
So, for every 'resource' that you design in your application (Login is NOT a resource; see the related post "Why is form based authentication NOT considered RESTful?"), choose the representation. It may involve different queries etc. for you to return json/xml/xhtml (whatever you choose). Each 'DB related call' should be isolated into its own 'data layer'. I suggest going with Spring JDBC to make your life easier; it'll take the burden of JDBC plumbing off your shoulders so you can focus on creating your DAOs (Data Access Objects, a pattern for data access classes; all DAOs logically belong in the data layer). A rough sketch follows.
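As a minimal sketch, assuming spring-jdbc is on the classpath and a DataSource (e.g. a connection pool) is configured elsewhere; the table and column names simply mirror the sakila example above, and the class name is hypothetical:
// Hypothetical data-layer class built on Spring's JdbcTemplate.
import java.util.List;
import javax.sql.DataSource;
import org.springframework.jdbc.core.JdbcTemplate;

public class UserDao {

    private final JdbcTemplate jdbcTemplate;

    public UserDao(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    // Parameterized query instead of string concatenation.
    public boolean credentialsMatch(String username, String password) {
        Integer count = jdbcTemplate.queryForObject(
                "SELECT COUNT(*) FROM actor WHERE first_name = ? AND last_name = ?",
                Integer.class, username, password);
        return count != null && count > 0;
    }

    // Querying another table is just another method on the data layer,
    // not another resource class with its own JDBC plumbing.
    public List<String> findFilmTitles() {
        return jdbcTemplate.queryForList("SELECT title FROM film", String.class);
    }
}
The resource class then only calls the DAO (for example userDao.credentialsMatch(user, pass)) and concentrates on building the XML/JSON response.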
Hope this helps