Generating a .csv file by running a .sql script with osql.exe - sql-server-2008

I have a script file in a folder. I want to use T-SQL to run this script without opening it in SSMS, as I will be using SSIS to run it and generate the CSV.
I have used OSQL to achieve this via an Execute Process Task, but it is not giving me the exact results I am looking for: it is generating extra spaces in the CSV. A 17 MB file that I exported with the right-click "save as CSV" option turned out to be 1 GB when I generated it using OSQL. The OSQL arguments I used are:
-E -S CCCMSDSQL20,1819 -d SHARE_SCRUB -q "SET NOCOUNT ON" -i "C:\Scripts\.SQL" -o "C:\Scripts\.csv" -n -h-1 -s"," -w 700
I would prefer a T-SQL script that can achieve the same thing, so that I can use it in an Execute SQL Task and get the results.
EDIT:
I am working on generating multiple CSVs from multiple .sql files in a folder. I am using a Foreach Loop Container and an Execute Process Task as of now. I want to change the Execute Process Task to an Execute SQL Task and use this script to generate the multiple files.
If it is possible to write T-SQL code that does the looping and CSV generation in one script, I can use it without the package.
Thanks for your time and help.

Your parameters for osql are what I would expect; I just put together a small demo doing something similar and it works without problems.
I would examine your actual sql script and its results inside SSMS to make certain it isn't returning more than you want.
Also, it may not be what you want since you intend to use this with an Execute SQL Task, but I'll point out that bcp is specifically designed for copying data in and out of SQL Server, and that you can use pure T-SQL with OPENDATASOURCE to produce a CSV file without relying on -o to dump the OSQL results to a file.
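For example, a minimal sketch of driving bcp from T-SQL (so it could sit in an Execute SQL Task), assuming xp_cmdshell is enabled; dbo.MyTable, the column list, and the output path are placeholders, not the poster's actual query:
-- bcp writes the result set straight to the file, so there is no -o redirection and no fixed-width padding
EXEC master..xp_cmdshell 'bcp "SELECT col1, col2 FROM SHARE_SCRUB.dbo.MyTable" queryout "C:\Scripts\MyTable.csv" -c -t, -S CCCMSDSQL20,1819 -T';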

Related

What custom Groovy code can be written to programmatically upload the summary report to a MySQL database in JMeter?

I want the results to be uploaded after the test. How do I configure the Summary Report listener to write to a CSV file and then, after the test completes, upload it to a MySQL database? What custom code can be used to upload it programmatically?
When you run JMeter in command-line non-GUI mode, like:
jmeter -n -t testplan.jmx -l result.jtl
you get a .jtl results file.
In order to get the "Summary Report" from the .jtl results file, you can use the JMeter Plugins Command Line Tool, like:
JMeterPluginsCMD.bat --generate-csv summary.csv --input-jtl result.jtl --plugin-type SynthesisReport
At this stage it will be more convenient to use the MySQL LOAD DATA statement, as described in Import CSV File Into MySQL Table.
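A minimal sketch of such a statement, assuming a pre-created table; jmeter_summary and its columns are illustrative here, not the exact SynthesisReport column set:
LOAD DATA LOCAL INFILE 'summary.csv'
INTO TABLE jmeter_summary
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(label, samples, average_ms, min_ms, max_ms, error_pct, throughput);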
If you still want to use Groovy, take a look at the Creating/Inserting data chapter of the Working with a relational database page of the Groovy documentation.

Executing binary SQL file using SQLCMD from WiX

I'm trying to install an SQL script (SSDT) using SQLCMD, as this script contains too many SSDT definitions and cannot be run by the WiX SQL extension.
I want my SQL script file to be a binary (as I don't want it to remain on the target machine).
How can I set the SQLCMD command to use the binary script (with -i)?
P.S. I tried this blog:
http://neilsleightholm.blogspot.co.il/2008/08/executing-sqlcmd-from-wix.html
but that code doesn't show the link between the binary SQL file and the SQLCMD command.
Can someone help me with the correct code?
This is the code I used, which did not work for me:
<Binary Id="CreateSchema.sql" SourceFile="..\SQL\CreateSchema.sql" />
<CustomAction Id="sqlcmd.cmd"
Property="sqlcmd"
Value=""sqlcmd.exe" -S [DATABASE_SERVER]
-i "[#CreateSchema.sql]" -v var=SYSTEM_USER -o [INSTALLDIR]installSql.log" />
<CustomAction Id="sqlcmd"
BinaryKey="WixCA"
DllEntry="CAQuietExec"
Return="check"
Execute="deferred"
Impersonate="yes" />
<InstallExecuteSequence>
<Custom Action="sqlcmd.cmd" After="InstallFiles">NOT Installed</Custom>
<Custom Action="sqlcmd" After="sqlcmd.cmd">NOT Installed</Custom>
</InstallExecuteSequence>
The log file showed that the -i parameter did not have any file name value:
MSI (s) (4C:6C) [09:58:15:610]: Executing op: CustomActionSchedule(Action=sqlcmd,ActionType=1025,Source=BinaryData,Target=CAQuietExec,CustomActionData="sqlcmd.exe" -S (local) -i "" -v var=SYSTEM_USER -o C:\installSql.log)
That's not how <Binary> works. The [#FileID] syntax is used to dynamically resolve, at run time, the full installed path of a component's file.
Binary entries are typically used as temporarily extracted files for custom actions or, in this case, SQL files, among other things.
Consider looking into the SQL extension in WiX. As a minimal example, take a look at this code.
Add the sql namespace xmlns:sql="http://schemas.microsoft.com/wix/SqlExtension"
<Binary Id="CreateSchema" SourceFile="..\SQL\CreateSchema.sql" />
<sql:SqlDatabase Id="MyDB" Database="[DATABASE]" Server="[DATABASE_SERVER]" />
And in a component you can add
<sql:SqlScript Id="CreateSchemaScript" BinaryKey="CreateSchema" ExecuteOnInstall="yes" Sequence="1" SqlDb="MyDB"/>
Here is a link to the SQL Schema definition with all the available elements. I haven't done much with the SQL Extension so you may need to do some reading to get a better idea of what you will need to do to accomplish creating your DB on install.
As I mentioned, I wanted to use both SQLCMD (since my SQL script is in SSDT format) and a binary file (so the file will be deleted at the end of the install).
After looking for answers, I understood that I cannot use the WiX [#filekey] syntax, as a binary file will not be extracted unless there is a custom action that explicitly uses it.
So in the end I understood that the best way is to extract the binary file myself.
The steps I used, in one single custom action, are:
Extract the binary SQL script from the MSI Binary table.
Save the file locally.
Run SQLCMD with -i and the new file path (the one I saved to).
Delete the SQL file.
I encountered some issues worth mentioning: if you save the file to INSTALLDIR, that directory may not exist at the run time of the custom action, so consider saving it to the temp folder or creating the directory beforehand.

Executing a batch file to create a test file with xp_cmdshell - different results

I'm trying to execute a batch file in SQL Server 2008 Express using xp_cmdshell. If I use it to execute a batch file that contains the following command:
echo > C:\development\test\itworks.txt
a file called "itworks.txt" is created and inside it text says "ECHO is on".
But if I run a batch file that contains the command:
CD. >test1.txt
it doesn't work (no error, just nothing created)
and neither does:
type NUL > test2.txt
although both those batch files do create the file if double-clicked/run from a command prompt. I thought it might be a permissions error (I hadn't tried the echo command at that point), so I changed the file permissions so that NTAuthority (which is what the SQL Server service runs as) had full control over the folder, but it still didn't work. Nothing in the event logs. I'm a novice at DOS commands, so I don't really understand the different commands. Does anyone have any idea what might be going on?
If you test the commands directly at a command prompt, they work. The cd statement produces a text file containing the name of the current directory; the type statement produces a zero-byte file, but it does indeed produce a file.
Most likely, xp_cmdshell is executing in a folder where the account it's running under has no write privileges, and you're not specifying another location for the file to be written. (The echo statement that works specifies a folder location for the text file, while your other two don't.)
Change your batch file to:
cd > C:\development\test\test1.txt
or
type NUL > C:\development\test\test1.txt
If the echo statement works there when run via xp_cmdshell, you know that folder is writable by the NTAuthority account.
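For reference, a minimal sketch of invoking such a batch file from T-SQL, assuming xp_cmdshell has been enabled; runtest.bat is a hypothetical name:
EXEC master..xp_cmdshell 'C:\development\test\runtest.bat';
-- or run the command directly, skipping the batch file entirely:
EXEC master..xp_cmdshell 'type NUL > C:\development\test\test2.txt';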

Batch files and MySQL: pass a block of commands instead of another .bat file

I see that I can pass a batch file to mysql in order to run a sequence of commands. But can I put those commands in the same batch file as the one that initiates the mysql app?
I.e. can I pass a block of batch commands to mysql instead of passing a batch file, so that it might look something like this:
mysql < [list of commands, not a .bat filename]
You can also pipe commands into MySQL if you don't want/have them in a file:
echo " ...some SQL... " | mysql
The term "Batch file" in the mySQL manual doesn't refer to DOS .BAT files, but to a file with many mySQL commands.
mysql < list.sql
will do exactly what you need.
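As a minimal sketch, such a file (list.sql here, with an illustrative database and table) is nothing more than plain SQL statements:
USE mydb;
CREATE TABLE IF NOT EXISTS users (id INT AUTO_INCREMENT PRIMARY KEY, name VARCHAR(50));
INSERT INTO users (name) VALUES ('alice'), ('bob');
SELECT COUNT(*) FROM users;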

Import Multiple .sql dump files into mysql database from shell

I have a directory with a bunch of .sql files that are mysqldumps of each database on my server.
e.g.
database1-2011-01-15.sql
database2-2011-01-15.sql
...
There are quite a lot of them actually.
I need to create a shell script, or probably a single line, that will import each database.
I'm running on a Linux Debian machine.
I'm thinking there is some way to pipe the results of an ls into some find command or something...
Any help and education is much appreciated.
EDIT
So ultimately I want to automatically import one file at a time into the database.
E.g. if I did it manually on one it would be:
mysql -u root -ppassword < database1-2011-01-15.sql
cat *.sql | mysql? Do you need them in any specific order?
If you have too many to handle this way, then try something like:
find . -name '*.sql' | awk '{ print "source",$0 }' | mysql --batch
This also gets around some problems with passing script input through a pipeline, though you shouldn't have any problems with pipeline processing under Linux. The nice thing about this approach is that the mysql utility reads each file in itself instead of having it read from stdin.
A one-liner to read in all the .sql files and import them:
for SQL in *.sql; do DB=${SQL/\.sql/}; echo importing $DB; mysql $DB < $SQL; done
The only trick is the bash substring replacement to strip out the .sql to get the database name.
There is a superb little script at http://kedar.nitty-witty.com/blog/mydumpsplitter-extract-tables-from-mysql-dump-shell-script which will take a huge mysqldump file and split it into a single file for each table. Then you can run this very simple script to load the database from those files:
for i in *.sql
do
echo "file=$i"
mysql -u admin_privileged_user --password=whatever your_database_here < $i
done
mydumpsplitter even works on .gz files, but it is much, much slower than gunzipping first, then running it on the uncompressed file.
I say huge, but I guess everything is relative. It took about 6-8 minutes to split a 2000-table, 200MB dump file for me.
I don't remember the syntax of mysqldump, but it will be something like this:
find . -name '*.sql'|xargs mysql ...
I created a script some time ago to do precisely this, which I called (completely uncreatively) "myload". It loads SQL files into MySQL.
Here it is on GitHub
It's simple and straightforward: it allows you to specify MySQL connection parameters, and will decompress gzipped SQL files on the fly. It assumes you have one file per database, and that the base of the filename is the desired database name.
So:
myload foo.sql bar.sql.gz
will create databases called "foo" and "bar" (if they don't already exist), and import the SQL file into each.
For the other side of the process, I wrote this script (mydumpall) which creates the corresponding sql (or sql.gz) files for each database (or some subset specified either by name or regex).