SSIS - Export multiple SQL Server tables to multiple text files

I have to move data between two SQL Server DBs. My task is to export the data as text (.dat) files, move the files, and import them into the destination. I have to migrate over 200 tables.
This is what I tried:
1) I used an Execute SQL Task to fetch my table names.
2) Used a Foreach Loop to loop through the table names in the collection.
3) Used a Script Task inside the Foreach Loop to build the text file destination path.
4) Called a DFT (Data Flow Task) with the table name in a variable for the OLE DB source and the path in a variable for the flat file destination.
The first table extracts fine, but the second table bombs with a synchronization error. I have seen this in numerous posts but could not find one that matches my scenario, hence posting here.
Even if I get the package to work with multiple DFTs, the second table from the second DFT does not export columns, because the flat file connection manager still remembers the first table's columns. Is there a way to get it to forget the columns?
Any thoughts on how I can export multiple tables to multiple text files using one DFT with dynamic source and destination variables?
Thanks, I appreciate your help.

Unfortunately, only the Bulk Insert Task lets us use format files effectively to map the columns between source and destination. The Bulk Insert Task uses the BULK INSERT T-SQL command to import the data, and the executing user needs the bulkadmin server role.
Most companies will not grant the bulkadmin role, for security reasons.
Hence, using a Script Task to construct BCP statements is a good and simple option for the export.
You do not need to construct a .bat file, as the script itself can execute DOS commands, which run under the .NET security account.
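A minimal sketch of such a Script Task (C#) follows; the package variable name User::BcpCommand and the use of cmd.exe /C are illustrative assumptions, not part of the original answer:

// Sketch of ScriptMain.Main() for an SSIS Script Task.
// Assumes a read-only package variable User::BcpCommand holding one full
// BCP command line; add "using System.Diagnostics;" at the top of the script.
public void Main()
{
    string bcpCommand = (string)Dts.Variables["User::BcpCommand"].Value;

    ProcessStartInfo psi = new ProcessStartInfo("cmd.exe", "/C " + bcpCommand);
    psi.UseShellExecute = false;        // required to redirect output
    psi.RedirectStandardOutput = true;  // capture BCP's console output
    psi.CreateNoWindow = true;

    using (Process proc = Process.Start(psi))
    {
        string output = proc.StandardOutput.ReadToEnd();
        proc.WaitForExit();

        // Log BCP's output and fail the task on a non-zero exit code.
        bool fireAgain = true;
        Dts.Events.FireInformation(0, "BCP", output, string.Empty, 0, ref fireAgain);
        Dts.TaskResult = (proc.ExitCode == 0) ? (int)ScriptResults.Success
                                              : (int)ScriptResults.Failure;
    }
}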

I figured out a way to do this. I thought I would share it in case anybody is stuck in the same situation.
So, in summary, I needed to export and import data via files. I also wanted to use a format file if at all possible for various reasons.
What I did was:
1) Construct a DFT which gets me the list of table names from the DB that I need to export. I used an OLE DB source and a Recordset destination, and stored the table names inside an object variable (a query sketch is below).
A DFT is not really necessary; you can do it any other way. Also, in our application, we store the table names in a table.
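For example, the source query could look something like this (a sketch only; in practice you would filter it down to the tables you actually need):

-- Return the schema-qualified names of the tables to export
SELECT s.name + '.' + t.name AS TableName
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
ORDER BY s.name, t.name;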
2) Add a Foreach Loop Container with a Foreach ADO Enumerator which takes my object variable from the previous step as the collection.
3) Parse the variable one table at a time and construct BCP statements like the ones below inside a Script Task. Create variables as necessary; each BCP statement is stored in a variable.
I loop through the tables and construct one BCP statement per table, like this:
BCP "DBNAME.DBO.TABLENAME1" out "PATH\FILENAME1.dat" -S SERVERNAME -T -t"|" -r$\n -f "PATH\FILENAME1.fmt"
BCP "DBNAME.DBO.TABLENAME2" out "PATH\FILENAME2.dat" -S SERVERNAME -T -t"|" -r$\n -f "PATH\FILENAME2.fmt"
(-T uses a trusted connection; -t and -r set the field and row terminators; -f points to the format file.)
The statements are put inside a .bat file. This is also done inside the Script Task, roughly as sketched below.
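A rough C# sketch of that construction step (all table, server, and path names here are illustrative assumptions; in the actual package the Foreach Loop adds one line per iteration):

// Requires: using System.Collections.Generic; using System.IO;
// Build one BCP "out" line per table and write them all to a .bat file.
static void WriteExportBat(List<string> tableNames)
{
    List<string> lines = new List<string>();
    foreach (string table in tableNames)
    {
        lines.Add(string.Format(
            "BCP \"MYDB.DBO.{0}\" out \"C:\\export\\{0}.dat\" " +
            "-S MYSERVER -T -t\"|\" -r$\\n -f \"C:\\export\\{0}.fmt\"",
            table));
    }
    File.WriteAllLines(@"C:\export\export.bat", lines.ToArray());
}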
4) An Execute Process Task will next execute the .bat file. I had to do this because I do not have the option to use the 'master..xp_cmdshell' command or the 'BULK INSERT' command in my company. If I had the option to execute cmdshell, I could have run the command directly from the package.
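For reference, the Execute Process Task settings might look something like this (the paths are assumptions):

Executable:       C:\Windows\System32\cmd.exe
Arguments:        /C "C:\export\export.bat"
WorkingDirectory: C:\export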
5) Again, add a Foreach Loop Container with a Foreach ADO Enumerator which takes my object variable from step 1 as the collection.
6) Parse the variable one table at a time and construct BCP statements like the ones below inside a Script Task. Create variables as necessary; each BCP statement is stored in a variable.
I loop through the tables and construct one BCP statement per table, like this:
BCP "DBNAME.DBO.TABLENAME1" in "PATH\FILENAME1.dat" -S SERVERNAME -T -t"|" -r$\n -b10000 -f "PATH\FILENAME1.fmt"
BCP "DBNAME.DBO.TABLENAME2" in "PATH\FILENAME2.dat" -S SERVERNAME -T -t"|" -r$\n -b10000 -f "PATH\FILENAME2.fmt"
The statements are put inside a .bat file. This is also done inside the Script Task.
The -b10000 was put in so I can import in batches of 10,000 rows. Without it, many of my large tables could not be copied due to insufficient space in tempdb.
7) Run the .bat file, again via an Execute Process Task, to import the files.
I am not sure if this is the best solution, but I thought I would share what satisfied my requirement. If my answer is not clear, I would be happy to explain if you have any questions. We can also optimize this solution. The same can be done purely via VB scripts, but you have to write some code to do that.
I also created a package configuration file where I can change the DB name, server name, and the data and format file locations dynamically.
Thanks.

Related

What is the correct syntax for the SOURCE command in SQL

In codeAnywhere I'm trying to run pre-written script files to create a table. When using codeAnywhere one must first import the code file into the shell, as I have done. However, I have been unable to use the SOURCE command to run these files. I have currently attempted this syntax:
USE exams SOURCE students.txt;
What is the correct syntax here? Do I need to name the database in the syntax?
Are there other commands which run text files containing code?
EDIT: I tried using this syntax, to the following result:
ERROR: Failed to open file 'exams(question5.txt)', error: 2
Put the commands on separate lines, without semi-colons for the shell commands, and if this doesn't work, then prefix with \ as well (I don't need to on my setup, but it's in the docs):
USE exams
SOURCE students.txt
https://dev.mysql.com/doc/mysql-shell-excerpt/5.7/en/mysql-shell-commands.html
On the shell you can use the following command to execute the queries from a text file:
mysql db_name < text_file
Hint: if the USE command (with the correct database name) is specified in the text file, you don't need to specify the database on the command line. The SOURCE command belongs to the mysql client prompt, not to the operating-system shell, so from the shell you need < instead.
You can find more information about executing queries from text files here:
https://dev.mysql.com/doc/refman/5.7/en/mysql-batch-commands.html
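For example, a complete invocation might look like this (a sketch, assuming the database is named exams and students.txt is in the current directory):

mysql -u myuser -p exams < students.txt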

mysql queries before insert operation by syslog-ng

I am using syslog-ng to parse some logs that I am receiving via a csv-parser. However, I want to achieve insert operations that are a bit more complex than the conventional insert using the "destination" option in syslog-ng. Currently, the MySQL destination in my syslog-ng conf file looks like this:
destination d_sql_test {
    sql(
        type(mysql)
        host('<host>')
        username('<user>')
        password('<pass>')
        database('<db_name>')
        table('test')
        columns('col1')
        values('${val1}')
    );
};
However, this simply inserts the contents of val1 into the column col1. I want to be able to specify my insert "logic", as shown in the example in this question.
I am unsure as to where to actually do this, or whether it is even supported by syslog-ng.
I think you can do this if you can somehow make the decision within syslog-ng.
You could try an in-list() filter to check whether the username is already listed in a file. If it is not, you can send the log to the MySQL destination, and also to another destination (possibly a program() destination) that updates the file containing the list of users and reloads syslog-ng so the filter picks up the change.
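A rough configuration sketch of that idea (the ${username} macro, the list file path, and the source/parser/destination names are all assumptions):

# Match only messages whose username is NOT yet in the list file
filter f_new_user {
    not in-list("/etc/syslog-ng/known_users.list", value("username"));
};

log {
    source(s_net);
    parser(p_csv);
    filter(f_new_user);
    destination(d_sql_test);          # the MySQL destination above
    destination(d_update_user_list);  # e.g. a program() destination that
                                      # appends the user and reloads syslog-ng
};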
You could also write a syslog-ng template function in Python that implements the logic and, for example, sets a macro to 1 in the message if it should be sent to the database. Then you can use a filter on this macro in your log path with the MySQL destination.
Or you can write a separate destination that does the work in Python: Writing syslog-ng destinations in Python.
Also, you might want to post this question on the syslog-ng mailing list, where the developers will notice it more easily.

How to process table data in a NAnt script?

I want to process my MySQL table data inside a NAnt script. My table contains some image file paths, and using those paths I also need to do an SVN checkout. How can I do that? Is it possible to fetch all MySQL table rows in a NAnt script and process them in a loop?
Use the sql task from NAntContrib and output the results to a file:
http://nantcontrib.sourceforge.net/release/0.91/help/tasks/sql.html
Hint: here is a sample MySQL OLE DB connection string that can be used:
Provider=MySQLProv;Data Source=mydb;User Id=myUsername;Password=myPassword;
Use the foreach task to loop over each line in the file:
http://nant.sourceforge.net/release/0.91/help/tasks/foreach.html
Use the svn-checkout task from NAntContrib to check the files out of SVN:
http://nantcontrib.sourceforge.net/release/0.91/help/tasks/svn-checkout.html
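Putting the three together, a build file might be sketched like this (the connection string, query, and repository layout are assumptions; check the linked task references for the exact attribute names):

<project name="process-images" default="checkout-images">
  <target name="checkout-images">
    <!-- NAntContrib sql task: run the query and write the results to a file -->
    <sql connstring="Provider=MySQLProv;Data Source=mydb;User Id=myUsername;Password=myPassword;"
         delimiter=";" delimstyle="Normal" print="true" output="imagepaths.txt">
      SELECT image_path FROM images;
    </sql>
    <!-- NAnt foreach task: iterate over each line of the result file -->
    <foreach item="Line" in="imagepaths.txt" property="imagepath">
      <!-- NAntContrib svn-checkout task: check the path out of the repository -->
      <svn-checkout uri="svn://myserver/repo/${imagepath}"
                    destination="working/${imagepath}" />
    </foreach>
  </target>
</project>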

import database dump to mysql using visual foxpro

I used Leafe's stru2mysql.prg and vfp2mysql_upload.prg to create a .sql dump file from DBFs. I connect to the MySQL database from VFP using ODBC. I know how to upload the SQL dump file, but I need to automate the whole process, i.e. after creating the dump file, my Visual FoxPro program should upload the dump file without a third party (automatically). I thought of using the SOURCE command, but that needs to be run at the mysql prompt. The assumption here is that my end users don't know how to import (which most of them don't). Please advise on how I can automate the import of the SQL file into the MySQL database. Thank you.
I think what you are looking for are the various SQL* functions in FoxPro. See the VFP help or MSDN on the SQLCONNECT (or SQLSTRINGCONNECT), SQLEXEC, and SQLDISCONNECT functions to get you started. Microsoft provided good examples of each in the documentation.
You may also want to use FILETOSTR to get the output from Leafe's programs into a string for the SQLEXEC function.
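A minimal sketch of that approach (the DSN, credentials, and file path are assumptions; a multi-statement dump may also need the MySQL ODBC multi-statement option enabled, or to be executed one statement at a time):

* Upload a .sql dump created by Leafe's programs via SQL pass-through
LOCAL lnHandle, lcSql
lnHandle = SQLSTRINGCONNECT("DSN=MyMySqlDsn;UID=myUser;PWD=myPass;DATABASE=mydb")
IF lnHandle > 0
    lcSql = FILETOSTR("c:\dumps\mydata.sql")   && whole dump as one string
    IF SQLEXEC(lnHandle, lcSql) < 0
        =MESSAGEBOX("Upload failed", "System Message")
    ENDIF
    =SQLDISCONNECT(lnHandle)
ELSE
    =MESSAGEBOX("Unable to connect to the database", "System Message")
ENDIF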
Here are the steps I use to take data from a Visual FoxPro database and upload it to a MySQL database. These are all put into a custom method on a form, which is fired by a command button. For example, the method would be 'uploadnewdata', and I pass parameters for whichever data tables I need.
1) Connect to the server - I use MySQL ODBC.
2) Validate the user (this uses a SQLEXEC to pull the matching record from a users table):
IF m.WorkingDatabase <> -1
    nRetVal = SQLEXEC(m.WorkingDatabase, "SELECT * FROM users", "csrUsersOnServer")
    SELECT csrUsersOnServer
    SELECT userid FROM csrUsersOnServer ;
        WHERE ALLTRIM(UPPER(userid)) = ALLTRIM(UPPER(lcRanchUser)) ;
        AND ALLTRIM(UPPER(lcPassWord)) = ALLTRIM(UPPER(lchPassWord)) ;
        INTO CURSOR ValidUsers
    IF _TALLY >= 1
    ELSE
        =MESSAGEBOX("Your Premise ID Does Not Match Any Records On The Server", "System Message")
        RETURN 0
    ENDIF
ELSE
    =MESSAGEBOX("Unable To Connect To Your Database", "System Message")
    RETURN 0
ENDIF
3) Once that is successful, I create my base cursor (this is the one I'm sending from).
4) I then loop through that cursor, creating variables for the values in the fields.
5) Then, using SQLEXEC and INSERT INTO, I insert each record (a sketch follows below).
6) Once the program is finished processing the cursor, it generates a message box with the 'finished' message, and control returns to the form.
All the user has to do is select the starting table and enter their login information.
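A rough sketch of steps 4 and 5 (the cursor, table, and field names are made up for illustration):

* Loop through the base cursor and insert each record via SQL pass-through
SELECT csrBaseData
SCAN
    lcName  = ALLTRIM(csrBaseData.name)
    lnValue = csrBaseData.value
    * ?m.varname passes the variables as pass-through parameters
    IF SQLEXEC(m.WorkingDatabase, ;
        "INSERT INTO mytable (name, value) VALUES (?m.lcName, ?m.lnValue)") < 0
        =MESSAGEBOX("Insert failed", "System Message")
        EXIT
    ENDIF
ENDSCAN
=MESSAGEBOX("Finished", "System Message")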

Robocopy, Multi-Line Execute Process SSIS Task, or Output Batch File Results to SSIS

I need to robocopy files from one location to another in an SSIS package. Since the folder is on another domain, I need to impersonate another account before I run the robocopy.exe command. I found I can execute a "net use" command to impersonate the necessary user account and then execute the robocopy command immediately afterwards. I don't see any way to do this directly in an Execute Process Task, so I use an Execute Process Task to run a batch file that has these two commands as separate lines. The downside of this approach is that I cannot read the results of the Execute Process Task. So this leads me to three questions:
Is there a way to execute a multi-line command in a single Execute Process task?
Is there a way to execute robocopy.exe while impersonating another account in one line?
Is there a way to write the results of a batch file back to either a variable in SSIS or to the SSIS database log?
If there is a positive answer for any of the above three questions, then I may be able to work out a way to add job success or failure rules based on the results of the robocopy command.
This can easily be achieved if you have enabled the extended stored procedure xp_cmdshell (see Books Online for "Surface Area Configuration").
In the following sample I have built a .cmd file containing all my ROBOCOPY options, and I simply execute this command file using xp_cmdshell, grabbing the output into a table variable (it can be a persistent table instead).
Just add an Execute T-SQL Statement Task to your SSIS package with the following statement:
/** START **/
DECLARE @cmdfile nvarchar(255) = N'C:\myFolder\myCommandFile.cmd'

DECLARE @logtable table (
     [RowId]   integer IDENTITY(1,1) NOT NULL
    ,[Message] nvarchar(1024) NULL
)

INSERT INTO @logtable ([Message])
EXEC xp_cmdshell @cmdfile

SELECT *
FROM @logtable
WHERE [Message] IS NOT NULL
/** END **/
Depending on the logging options set for the ROBOCOPY command, you can show progress, headers, a report, and more; see the ROBOCOPY documentation for those options. Also try the /TEE switch to send ROBOCOPY's output to the console as well as the log file, so xp_cmdshell can capture it.
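For illustration, myCommandFile.cmd could contain something like this (the paths and switches are assumptions):

@echo off
rem Copy .dat files from the remote share to the local landing folder,
rem logging to a file while /TEE keeps the output visible to xp_cmdshell
robocopy "\\otherserver\share\data" "D:\landing\data" *.dat /Z /S /NP /TEE /LOG:D:\logs\robocopy.log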
One way I can think of to do this is to execute a batch file (so you can execute multiple line items) with RUNAS (to impersonate another user account).
You can capture the output of the batch file in a log file and read the contents of that into SSIS using a Script Task.
I am not sure how to write to the SSIS log and am interested to see what other developers have to say about that.
I found two roundabout methods for getting data from a batch file execution into the database. One method is to add logging to the actual batch file, as sketched below. I haven't tried this yet, but it seems conceptually possible. The other option I have tried, and which has worked, is to run the batch file as a SQL Server Agent job step with flat-file logging. After I did this, I created a small package that scrapes the log file for specific messages. I would still prefer a better solution, but this works well enough for now.
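A sketch of such a batch file, combining the "net use" impersonation from the question with file logging (the server, account, and paths are assumptions):

@echo off
rem Authenticate against the other domain, copy with logging, then clean up
net use \\otherdomainserver\share MyP@ssw0rd /user:OTHERDOMAIN\svc_copy
robocopy \\otherdomainserver\share D:\landing *.dat /Z /S /LOG+:D:\logs\copy.log
net use \\otherdomainserver\share /delete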
Is there a way to execute a multi-line command in a single Execute Process task?
Yes, use "Cmd.exe" as the executable.
Create a string variable with the following expression (sample):
"/C \"start robocopy c:\\temp\\v10.2a c:\\temp\\v1 *.WRI /Z /S && start robocopy c:\\temp\\v10.2a c:\\temp\\v1 *.dll /Z /S\""
Then map it to the Arguments parameter via an expression on the task. That way you can execute those two (or more) robocopy commands in parallel.