Integration Services Catalog and Linked Server - SSIS

I have SSIS packages in the SSIS catalog on SQL Server Serv1. While executing, the packages establish a connection to another SQL Server, Serv2, and they fail while acquiring that connection.
I know a little about linked servers: to run a query from one server against another, the latter has to be a linked server of the former. The above scenario looks the same, but I didn't find any information related to it. Do I have to add Serv2 as a linked server on Serv1?

No. A linked server is used when you need to access a database on a remote server from inside a SQL query run on the local server.
SSIS packages are built to access some database server, possibly a remote one, fetch data from it, transform it, and store the results somewhere - in files, a database, etc. Accessing a remote SQL database is a normal scenario.
Moreover, using a linked server in an SSIS package is considered bad practice. You move control of database access from the SSIS catalog to the database server, and in case of any problems it is more difficult to trace and investigate.
In your case - SSIS packages in the SSIS catalog - check in the package execution log which connection string is used to connect to the remote server. Is it integrated authentication? Does the account under which the SSIS package is executed have access to the database? If using SQL authentication - are the login and password valid?
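For reference, the execution log mentioned above can be queried directly from the SSISDB catalog views; a sketch, where the execution ID is a placeholder:

```sql
-- Recent catalog executions, the account they ran under, and their status.
SELECT execution_id, folder_name, project_name, package_name,
       executed_as_name,          -- the account acquiring the connection
       status                     -- 4 = failed, 7 = succeeded
FROM   SSISDB.catalog.executions
ORDER BY execution_id DESC;

-- Error and warning messages of one execution, including the full
-- "Failed to acquire connection ..." text.
SELECT message_time, message
FROM   SSISDB.catalog.event_messages
WHERE  operation_id = 12345              -- execution_id from the query above
  AND  message_type IN (110, 120, 130);  -- warning, error, task failed
```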

Related

How to connect to a remote MySQL from an Azure SQL server/database

Sorry if the title is not so clear; I am probably not finding what I need because I do not know how to search for it.
I have a few MySQL servers on separate online servers (from different WordPress sites) and I want to load some of the data in those databases/tables into a SQL database located on Azure.
Inside the Azure portal itself I do not see where to establish external connections, neither at the server level nor at the database level.
I downloaded and installed Microsoft SQL Server Management Studio and connected to the server. I can see my database and the master one, Security with logins, and the Integration Services Catalog - nothing else.
I was looking for something like:
https://www.jetbrains.com/help/go/db-tutorial-connecting-to-ms-sql-server.html#step-3-connect-to-microsoft-sql-server-with-datagrip
but found nothing like it.
Maybe something like this:
https://www.devart.com/odbc/mysql/docs/microsoft_sql_server_manager_s.htm
but there is no Server Objects option available in my SSMS.
Can this be done?
Note: the Azure database is on the Basic tier for now, in case that is a limitation.
Some choices:
In SQL Server Management Studio, create a linked server pointing to each MySQL instance. You found the instructions for that: https://www.devart.com/odbc/mysql/docs/microsoft_sql_server_manager_s.htm But it probably will not work against Azure SQL Database; you don't have access to the underlying Windows OS to install things like the MySQL ODBC driver, which you would need. (You could ask Azure tech support whether they can help.)
In each MySQL instance, try creating a federated table connection to the appropriate table in SQL Server. That cross-vendor federation only works in MariaDB, however; MySQL's federation only goes MySQL <--> MySQL.
Write yourself a purpose-built extract/transform/load (ETL) program and arrange to run it every so often. Program it to connect to all the servers involved, retrieve the data that needs to be transferred from your MySQL servers, and update/insert that data on the SQL Server.
(edit) You may be able to use command-line SQL client programs. mysqldump, with its --compatible option, may generate usable INSERT statements in a file. You may then be able to use sqlcmd to run those INSERTs on your Azure server. It will take some hacking, and may require sed(1) or awk(1) to make the MySQL output compatible with SQL Server.
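The mysqldump route can be sketched roughly as follows; the host names, credentials and table names are placeholders, and the sed rewrite only covers the simple backtick-to-bracket case:

```shell
# Hypothetical sketch of the mysqldump -> sqlcmd route; host names,
# credentials and table names are placeholders.
#
# Step 1 (needs network access to the MySQL server) - dump one table as
# plain, one-row-per-line INSERT statements:
#
#   mysqldump --compatible=ansi --no-create-info --skip-extended-insert \
#       -h mysql.example.com -u wpuser -p wordpress wp_posts > wp_posts.sql
#
# For illustration here, fake a one-row dump so the rewrite step runs:
printf 'INSERT INTO `wp_posts` (`ID`,`post_title`) VALUES (1,%s);\n' "'hello'" > wp_posts.sql

# Step 2 - MySQL quotes identifiers with backticks, SQL Server expects
# brackets; a crude sed pass covers the common case:
sed 's/`\([^`]*\)`/[\1]/g' wp_posts.sql > wp_posts.mssql.sql
cat wp_posts.mssql.sql

# Step 3 - replay the rewritten INSERTs against the Azure SQL database:
#
#   sqlcmd -S yourserver.database.windows.net -d yourdb -U youruser -P '...' \
#       -i wp_posts.mssql.sql
```

Real dumps will also need attention to data types, escaping, and identity columns, which is where the hacking comes in.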
I believe the third option is the most robust one for production use.

SQL Server Linked server and DTSX

The architecture we are using is the following:
a.- 2 SQL Server servers.
b.- 1 SQL Server that hosts the Integration Services catalog (dtsx packages) and the Jobs.
I need to execute, from one of the SQL servers, a stored procedure that runs one of the DTSX packages that is on the other server. I can perform this action through a linked server.
But:
1.- If I execute via a linked server, on which server is the execution carried out? Will the performance of my SQL servers decrease?
2.- Is there another way to run the dtsx packages and jobs that does not involve a linked server?
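On the first question: a package started through the catalog always executes on the server that hosts the SSIS catalog (SSISDB), regardless of which server initiates the call, so that is where the load lands. A hedged sketch of starting a catalog package through a linked server - the linked server name and the folder/project/package names below are placeholders, and the linked server must have the RPC Out option enabled because remote procedures are called with an OUTPUT parameter:

```sql
DECLARE @execution_id BIGINT;

-- [SSISHOST] is a hypothetical linked server pointing at the server
-- that hosts the SSIS catalog.
EXEC [SSISHOST].[SSISDB].[catalog].[create_execution]
     @folder_name     = N'MyFolder',
     @project_name    = N'MyProject',
     @package_name    = N'MyPackage.dtsx',
     @use32bitruntime = 0,
     @reference_id    = NULL,
     @execution_id    = @execution_id OUTPUT;

EXEC [SSISHOST].[SSISDB].[catalog].[start_execution] @execution_id;
```

On the second question: one alternative that avoids a linked server is to wrap the package in a SQL Server Agent job on the catalog server and start or schedule that job there.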

SSIS package not running from SQL server catalog

I have created an SSIS project and deployed it to SQL Server 2014. When I run the packages from SQL Server Data Tools they run fine and perform all operations, but when I run them from the catalog procedures ([SSISDB].[catalog].[create_execution]) the run shows as successful, yet I can't see any data in my staging tables. I have used configuration tables to configure the connections and file paths.
Any ideas?
Please check the user privileges (file system, etc.) - if you call the procedures with a different user than the one you use to execute the package from within Data Tools, this might be the reason.
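One more thing worth checking: create_execution/start_execution start the package asynchronously, so the procedure call reports success as soon as the execution is queued, even if the package later fails. A sketch (folder, project and package names are placeholders) that forces a synchronous run so failures surface to the caller:

```sql
DECLARE @execution_id BIGINT;

EXEC SSISDB.catalog.create_execution
     @folder_name     = N'MyFolder',
     @project_name    = N'MyProject',
     @package_name    = N'MyPackage.dtsx',
     @use32bitruntime = 0,
     @reference_id    = NULL,
     @execution_id    = @execution_id OUTPUT;

-- Make start_execution wait for the package to finish instead of
-- returning immediately; a failed package then raises an error here.
EXEC SSISDB.catalog.set_execution_parameter_value
     @execution_id,
     @object_type     = 50,              -- 50 = execution-level parameter
     @parameter_name  = N'SYNCHRONIZED',
     @parameter_value = 1;

EXEC SSISDB.catalog.start_execution @execution_id;
```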

How to mimic SSIS with production connection string on development machine

Scenario :
SSIS in production uses the production connection string and SQL Server instance name.
Once I open the same SSIS package on my machine, the connections break, since we do not have access to prod.
I can obviously change the connections to use my local dev instance, but that would mean that whenever I deploy to production I would have to ask my network guys to open the SSIS package and change the connections back from my local to production - and that for EACH SQL task.
Is there a way for me to mimic the production connections on my dev machine, so that I would not have to touch the prod connection strings and could still do my debugging/modifications and upload the package back to production?
You need to make use of SSIS package configurations, which allow you to store all the configurable values in one of the following options:
XML configuration file
Environment variable
Registry entry
Parent package variable
SQL Server table
Read more about package configurations on MSDN; the documentation link below refers to SSIS 2012.
Package Configurations
I prefer the SQL Server table because it gives you the flexibility to easily update the values using T-SQL queries. When you develop the package, you select the connection managers and variables that you think might require different values across environments. Once you create the package configuration, you can move the package and simply modify the values in the package configuration XML file or SQL Server table without having to touch the package (.dtsx).
Refer to the answer to the SO question below for detailed steps on how to make use of package configurations:
Run SSIS Package with 2 Different Configurations
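For reference, when the SQL Server table option is chosen, SSIS generates a configuration table with the following default shape; the filter name, connection string and package path below are placeholders:

```sql
-- Default table SSIS creates for SQL Server package configurations.
CREATE TABLE [dbo].[SSIS Configurations] (
    ConfigurationFilter NVARCHAR(255) NOT NULL,  -- groups related settings
    ConfiguredValue     NVARCHAR(255) NULL,      -- value injected at run time
    PackagePath         NVARCHAR(255) NOT NULL,  -- property being configured
    ConfiguredValueType NVARCHAR(20)  NOT NULL   -- e.g. String, Int32
);

-- Repoint a connection manager per environment without touching the .dtsx:
UPDATE [dbo].[SSIS Configurations]
SET    ConfiguredValue = N'Data Source=DEVSQL01;Initial Catalog=Staging;Integrated Security=SSPI;'
WHERE  ConfigurationFilter = N'EnvironmentConfig'
  AND  PackagePath = N'\Package.Connections[Staging].Properties[ConnectionString]';
```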

Where is actual work done - app server or DB server?

I am executing SSIS packages from an application server that is different than the database server. This package will be reading and writing files. Is it better to have the potentially large input files on the DB server or the application server where the package is executed? I'm not clear on where the actual "work" takes place when I execute a package on a server other than the database server.
The Integration Services server is where the actual execution happens. Generally that means "the DB server", but it doesn't always have to be the same server that houses the DB you are reading from or writing to.
"This package will be reading and writing files."
Isn't that one of the specialties of a database server? ;)