Create a new job with a different name from a backup of an existing SSIS job, with multiple database names in the SSIS config file

I would like to create a new job by copying an existing job and pointing the new job to another database. I tried to find the steps for this and did not find anything suitable.
One thing I know: this can be done using "Script Job as" -> "CREATE To" and then changing the name of the job.
However, are there any other steps that need to be taken care of? How can I point it to a different DB? Can I put two data sources in the connection string in my config [like we can add multiple emails, comma separated]?
something like this:
<InitialCatalog>Systest1,Systest2</InitialCatalog>
Do I need to change the SSIS config file?
Please point me to a tutorial or a few steps that may help. Thanks.

You will need to parameterize the connection manager in the package you are running and pass the connection string from the agent job. Then you can have two jobs that run the same package against different databases.
To parameterize the connection manager, right-click on it and select "Parameterize" from the pop-up menu.
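With the project deployment model, the handoff from the agent job can be sketched in T-SQL; note the folder, project, package, and connection manager names below are placeholders, not from the question:

```sql
-- Run the same package against a different database by overriding the
-- parameterized connection manager property at execution time.
DECLARE @exec_id BIGINT;

EXEC SSISDB.catalog.create_execution
     @folder_name   = N'MyFolder'
   , @project_name  = N'MyProject'
   , @package_name  = N'LoadData.dtsx'
   , @execution_id  = @exec_id OUTPUT;

-- object_type 30 = package parameter; CM.<name>.<property> is the naming
-- convention SSIS uses for a parameterized connection manager property.
EXEC SSISDB.catalog.set_execution_parameter_value
     @execution_id    = @exec_id
   , @object_type     = 30
   , @parameter_name  = N'CM.SourceDB.InitialCatalog'
   , @parameter_value = N'Systest2';

EXEC SSISDB.catalog.start_execution @exec_id;
```

The second job would be identical except for the @parameter_value. To answer the question's other idea: no, a connection manager points at exactly one database, so a comma-separated InitialCatalog like Systest1,Systest2 will not work.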

Related

SSIS Project Destination Assistant "unable to use" Existing Connection Managers

Visual Studio 2019, SSIS project.
Whenever I drop a new Destination Assistant into a Data Flow I am required to create a new Connection Manager to our SQL Server instance.
Presently, there are 4 identical data connections (identical except for the index appended to the name) and none of them show up in the list of "Select connection managers." TBH this is driving me nuts. It is the same behavior for both package- and project-scoped connection managers.
Any ideas? Everything I see everywhere just shows the Database connection managers listed as expected - but they don't show up for me.
Don't use the assistants. Not to be flippant, but I don't find that they add any value. I know whether I'm pushing/pulling from a flat file vs. OLE DB, so why do I want to make clicks here versus just dragging the thing I want onto my palette?
Whenever I start SSIS work on a new machine, I remove the suggested favorites of source/destination assistants and add in what I use day-in and day-out.

Is there any way to create an Excel file and save it or email it?

Is there any way using SSIS (or any SQL Server feature) to automatically run a stored procedure, save the output as an Excel file (or even a flat file), and then have the newly created file sent to people via email?
Sorry, I'm a complete newbie to SSIS.
In broad strokes, you'll have an SSIS package with two tasks and three connection managers.
The first task is a Data Flow Task. Much as the name implies, the data is going to flow here - in your case, from SQL Server to Excel.
In the Data Flow Task, add an OLE DB Source to the data flow. It will ask what connection manager to use, and you'll create a new one pointed at your source system. Change the source from the Table Selector to a Query and then reference your stored procedure: EXECUTE dbo.ExportDaily
Hopefully, the procedure is nothing more than select col1, col2, colN from table where colDate = cast(getdate() as date). Otherwise, you might run into challenges with the component determining the source metadata. Metadata is the name of the game in an SSIS data flow. If you have trouble, the resolution is version dependent: pre-2012 you'd have a null-operation SELECT as your starting point; 2012+ you use WITH RESULT SETS to describe the output shape.
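For reference, the 2012+ shape hint might look like this; the column names and types here are guesses for illustration, not from the question:

```sql
-- Tell the data flow what shape the procedure's result set has,
-- so the component can build its metadata without guessing.
EXECUTE dbo.ExportDaily
WITH RESULT SETS
(
    (
        col1    INT
      , col2    NVARCHAR(50)
      , colN    DECIMAL(18, 2)
    )
);
```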
With our source settled, we need to land that data somewhere, and you've indicated Excel. Drag an Excel Destination onto the canvas; again, this is going to need a connection manager, so let it create one after you define where the data should land. Where you land the data is important. On your machine, C:\user\pmaj\Documents is a valid path, but when this runs on a server as ServerUser1... not so much. I follow a pattern of C:\ssisdata\SubjectArea with Input, Output & Archive folders.
Click into the Columns tab; there's nothing to do here, as it auto-mapped source columns to the destination. Sort the target column names by clicking on the header. A good practice is to scroll through the listing and look for anything that is unmapped.
Run the package and confirm that we have a new file generated and that it has data. Close Excel and run it again. It should have clobbered the file we made. If it errors (and you don't have your "finger" on the file by having it open in Excel), then you need to find the setting in the Excel destination that says to overwrite the existing file.
You've now solved the exporting data to Excel task. Now you want to share your newfound wisdom with someone else and you want to use email to do so.
There are two ways of sending email. The most common will be the Send Mail Task. You'll need to establish a connection to your SMTP server, and I find this tends to be more difficult in the cloud-based world, especially with authentication and this thing running as an unattended job.
At this point, I'm assuming you've got a valid SMTP connection manager established. The Send Mail Task is straightforward: define who is receiving the email, the subject, the body, and then add your attachment.
An alternative to the Send Mail Task is to use an Execute SQL Task. The DBAs likely already have sp_send_dbmail configured on your server, as they want the server to alert them when bad things happen. Sending your files through that process is easier, as someone else has already solved the hard problems of SMTP connections, permissions, etc.
EXECUTE msdb.dbo.sp_send_dbmail
    @profile_name = 'TheDbasToldMeWhatThisIs'
  , @recipients = '[email protected];[email protected]'
  , @subject = 'Daily excel'
  , @body = 'Read this and do something'
  , @file_attachments = 'C:\ssisdata\daily\daily.xlsx';
Besides using an existing and maintained mechanism for mailing the files, the Execute SQL Task is easily parameterized with the ? placeholder. So, if you need to change the profile as the package is deployed through dev/uat/prod, you can create SSIS Variables and Parameters, map values into the procedure's parameters, and configure those values post-deployment.
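A parameterized version of the call above, as it might look in the Execute SQL Task's SQLStatement (the ? markers map by ordinal to SSIS variables on the Parameter Mapping tab):

```sql
-- ? placeholders are filled from the task's Parameter Mapping tab,
-- ordinals 0, 1, 2 in order of appearance.
EXECUTE msdb.dbo.sp_send_dbmail
    @profile_name     = ?
  , @recipients       = ?
  , @subject          = 'Daily excel'
  , @body             = 'Read this and do something'
  , @file_attachments = ?;
```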

Set up a Datasource for Tabular Models in SQL Server

I want to go through this tutorial. Unfortunately, the author has already set up the data sources and does not explain how to set them up. First I installed a separate SSAS instance in my SQL Server 2014. Then I tried to add an .mdf file via "Attach" but got the error "AdventureWorks.detach_log could not be found in the folder". So, according to this SO solution, I tried this command:
CREATE DATABASE YAFnet ON (FILENAME = N'C:\sql_data\YAFnet.mdf')
FOR ATTACH_REBUILD_LOG;
within my SSAS instance's query editor, but it looks like the query is not valid there, since that query window expects MDX.
Can anyone help me get a data source (AdventureWorks DW) for my tabular model so I can follow the tutorial?
I would download the tabular backup from the AW samples and restore the .abf file. (As an aside, CREATE DATABASE ... FOR ATTACH is T-SQL; it must be run against the relational Database Engine instance, not the SSAS instance.)
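The restore can be issued from an XMLA query window connected to the tabular instance; something like the following, where the file path and database name are placeholders you'd adjust:

```xml
<Restore xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <File>C:\backups\AdventureWorks Tabular Model SQL 2012.abf</File>
  <DatabaseName>AdventureWorksDW</DatabaseName>
  <AllowOverwrite>true</AllowOverwrite>
</Restore>
```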

Migration from PostgreSQL to MySQL

Hello there. I'm really a newbie to SQL and I have a problem creating a dump file properly. I have an assignment and I really have no idea what's wrong; can anyone help me migrate it? According to phpMyAdmin, the error is somewhere in this part.
CREATE TABLE DEPT (
DEPTNO NUMERIC(2) NOT NULL,
DNAME CHAR(14),
LOC CHAR(13),
CONSTRAINT DEPT_PRIMARY_KEY PRIMARY KEY (DEPTNO));
Why not use an ETL tool? You don't have to worry about dumps or stuff like that. You just need to know the connection credentials and that's it. I personally use Pentaho (it's open source).
Download Pentaho ETL from http://kettle.pentaho.org/
Unzip and run Pentaho (using the batch file spoon.bat)
Create a new Job:
Create a DB connection for the source database (PostgreSQL) - using the menu: Tools→Wizard→Create Database Connection (F3)
Create a DB connection for the destination database (MySQL) - using the technique described above.
Run the wizard: Tools → Wizard → Copy Tables (Ctrl-F10).
Select the source (left dialog panel) and the destination (right dialog panel). Click Finish.
The job will be generated - run the job.
That's it! If you need any help, let me know.
I've tried to open the tools via pgAdmin III, but it seems to be frozen. I rebooted my PC and still have no access to the "tools".

Local Load Testing: The load test results database could not be opened

I am creating some load tests using VS2012. I can run the tests using a StorageType of "None", but when I change this to a StorageType of "Database" I get the dreaded error:
The load test results database could not be opened. Check that the
load test results database specified by the connect string for your
test controller (or local machine) specifies a database that contains
the load test schema and that is currently available. For more
information, see the Visual Studio help topic 'About the Load Test
Results Store'. The connection error was: An error occurred while
attempting to create the load test results repository schema: To
create the database 'LoadTest2010' your user account must have the
either the SQL Server 'sysadmin' role or both the 'serveradmin' and
'dbcreator' roles
I have created a database on a non-local copy of SQL Server called LoadTest. When I test the connection from the SQL Tracing Connect String dialog, I get a success.
I have created a SQL user that has the Server Roles of dbcreator, public, serveradmin and sysadmin. The user has a User Mapping to the LoadTest2010 database that was created from the loadtestresultsrepository.sql in the VS2012 IDE directory. On the database the user has the Database role memberships db_accessadmin, db_datareader, db_datawriter, db_owner.
Under Owned Schemas I ticked db_datareader, db_datawriter, db_owner and db_securityadmin; however, these have now changed to a blue square instead of a tick when displayed.
So what's going on? Is Visual studio trying to create the database or is something else the issue?
I am not using TestControllers or TestAgents I am simply using a local run.
The answer was simple. I was setting up the connection string in the "SQL Tracing Connect String" instead of clicking the little "Manage Test Controller" icon at the top of my load test window and setting up the connection string from there.
Now I'm off to remove some of those superfluous permissions I created on that SQL user :)
Edit:
The SQL connection string is NOT stored in the loadtest files. The setting seems to be PC-specific, so I had to change it on the build server - in one loadtest file (address.loadtest) as shown - and then all the other load tests adopted the same connection string.
I am using Visual Studio 2013 and had this error as well, but for a different reason. It's not entirely clear when setting up a load test for the first time that it will attempt to save the results to a database by default. I didn't realize it was trying to save results to a database until I got the error on my first run attempt. Furthermore, in the load test wizard there is no screen to configure the database settings or, for that matter, create the database schema.
I found this article on MSDN which helped me solve the problem.
How to: Create a Load Test Results Repository Using SQL
https://msdn.microsoft.com/en-us/library/ms182600%28v=vs.120%29.aspx?f=255&MSPPError=-2147217396
Basically, it explains that you first need to run a script to create the load test repository schema. Once this is in place on your SQL instance (it could be anywhere you like), you can point your load test to this database and save your results there.
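For a default VS2013 install, running the script can be as simple as the following; the server name and install path here are assumptions you'd adjust to your environment:

```shell
sqlcmd -S localhost -E -i "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\loadtestresultsrepository.sql"
```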
For me, after I had set up the database connections, the test results were still not being written to the database.
I had forgotten to change the storage type in the properties section of the run settings.
The property is called 'Storage Type': change it from None to Database.