I want to batch import multiple *.CSV files to a SQLite3 database using a batch command line.
I have a batch file (loader.bat) which calls loader.sql to import the CSV file into test.db (using sqlite3.exe):
sqlite3.exe test.db ".read loader.sql"
loader.sql is an SQLite script that imports data.csv into a table (tb_data):
.import data.csv tb_data
This works for importing a single file (data.csv). I want to import all the *.csv files (e.g. data123.csv, data456.csv, data789.csv) in a folder into the same table (tb_data).
I am thinking of using a for loop in the batch script to iterate through the files.
for %%a in (*.csv) do (
echo %%a
sqlite3.exe test.db ".read loader.sql"
)
How do I pass the parameters from the batch script to the sqlite script (loader.sql)?
Just combine all your CSV files into one first and use your original command to load that:
copy *.csv combined.csv
Note that copy simply concatenates the files, so if each CSV has a header row you will need to strip the extra headers first.
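Alternatively, sqlite3 accepts dot-commands as a command-line argument, so the batch loop from the question can pass each file directly with something like sqlite3.exe test.db ".import %%a tb_data". The same idea, generating the loader script on the fly, sketched in POSIX shell (test.db and tb_data are the names from the question; the sample files stand in for data123.csv etc.):

```shell
# Scratch directory with a couple of sample CSVs.
cd "$(mktemp -d)"
printf 'id,val\n1,a\n' > data123.csv
printf 'id,val\n2,b\n' > data456.csv

# Build one .import command per CSV and collect them into a single script.
for f in *.csv; do
  printf '.import %s tb_data\n' "$f"
done > loader_all.sql

cat loader_all.sql
# Then one sqlite3 call runs the whole batch (uncomment if sqlite3 is installed):
# sqlite3 test.db ".read loader_all.sql"
```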
Is it possible to import many .sql files using the source command in mysql command line interface?
If you have files like
1.sql
2.sql
...
n.sql
Create a file example.sql that includes
source 1.sql
source 2.sql
....
source n.sql
Connect to mysql and source example.sql
source example.sql
Sourcing multiple files in one command is not possible, but you can list the files to source in a single file and source that one file to automate the process.
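The steps above can be scripted so example.sql doesn't have to be maintained by hand; a shell sketch (1.sql and 2.sql stand in for the numbered scripts from the question):

```shell
# Scratch directory with stand-ins for the numbered scripts 1.sql ... n.sql.
cd "$(mktemp -d)"
printf 'SELECT 1;\n' > 1.sql
printf 'SELECT 2;\n' > 2.sql

# Emit one "source" line per numbered script; the [0-9]*.sql pattern keeps
# the generated example.sql itself out of the list.
for f in [0-9]*.sql; do
  echo "source $f"
done > example.sql

cat example.sql
# mysql -u user -p dbname < example.sql   # or, inside mysql: source example.sql
```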
I am not very technical so please bear with me while I try to explain the problem. We have a daily scheduled job that kicks off a SAS program on a Unix server. The program analyzes data and spits out a CSV file which gets stored on the server and then emailed to me. I then manually download the file from email and upload it to a utility to process the file. I am looking for a way to bypass the manual steps and instead have the SAS program upload the CSV file directly to a folder on the server for the file processing utility.
So given that the csv file is stored on the server that SAS is connected to (/user/folder/file.csv) how can I transport this file to another folder (/inbox) on another server (server2.test.com)?
Here is what I have so far, but I think I am missing something because the file isn't making it to the folder:
proc import file="/user/folder/file.csv"
dbms=csv out=file1 replace;
run;
filename outdir ftp "/inbox" DIR
host="server2.test.com"
user="username" pass="pwd";
data _null_;
file outdir(file1);
put file1;
run;
Thanks!
Mike
You are close. I would define one filename for reading the CSV file and one for writing to the FTP server.
filename in "/user/folder/file.csv";
filename outdir ftp "/inbox" DIR
host="server2.test.com"
user="username" pass="pwd"
;
Then copy the file.
data _null_;
infile in;
file outdir("file.csv");
input;
put _infile_;
run;
I am new to SQL Server and SSIS. I want to schedule the loading of a .csv file into SQL Server and run the load at a specific time daily. Please help. Thanks in advance.
You can do this by first creating the table(s) in SQL Server with the appropriate data types, then importing the data and writing a script to update the table. A .bat file is required to run your scripts. The following is an example command for a .bat file: sqlcmd -S ServerName -U UserName -P Password -i "C:\folder\update.sql" -o "C:\folder\output.txt"
You will need to write the .csv file to a text file and change the extension to .sql
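To cover the scheduling half of the question, the .bat file can be registered with the Windows Task Scheduler from the command line. A sketch, assuming the batch file lives at C:\folder\load.bat and a 2:00 AM daily run (the task name is made up):

```
schtasks /Create /SC DAILY /ST 02:00 /TN "DailyCsvLoad" /TR "C:\folder\load.bat"
```

A SQL Server Agent job is the other common way to run a load on a daily schedule, if Agent is available on your edition.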
You can use the "Bulk Insert Command"
Example:
BULK INSERT DatabaseName.dbo.TableName
FROM 'C:\sqlscript\fileName.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
Hope this helps
I have an ImportCommand class which reads a file and imports the data from that file to a database. The command itself works fine.
However, I need to run the same command several times with different files.
My .bat file:
#echo off
cd c:\xampp\htdocs\mysite\protected\
yiic import c:\sourcefiles\users_1.csv
yiic import c:\sourcefiles\users_2.csv
yiic import c:\sourcefiles\users_3.csv
The first command runs then the script stops and files users_2.csv and users_3.csv are not processed.
After struggling with this for a while, I found this answer: How to run multiple .BAT files within a .BAT file
So the .bat file should be:
#echo off
cd c:\xampp\htdocs\mysite\protected\
call yiic import c:\sourcefiles\users_1.csv
call yiic import c:\sourcefiles\users_2.csv
call yiic import c:\sourcefiles\users_3.csv
Use the CALL command. Without CALL, control is transferred to the other batch file and does not return.
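If the list of files grows, the repeated call lines can be collapsed into a wildcard loop. A sketch using the paths from the question:

```
@echo off
cd c:\xampp\htdocs\mysite\protected\
for %%f in (c:\sourcefiles\users_*.csv) do call yiic import "%%f"
```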
Try this
#echo off
cd c:\xampp\htdocs\mysite\protected\
yiic import c:\sourcefiles\users_1.csv && yiic import c:\sourcefiles\users_2.csv && yiic import c:\sourcefiles\users_3.csv
This will execute the desired commands one by one. Each command only runs if the previous one succeeded.
I have the following file dumped daily into one of our online directories:
dat-part2-489359-43535-toward.txt
The numbers change each day randomly.
I have the following code to try and LOAD the file:
mysql_query("LOAD DATA LOCAL INFILE 'dat-part2-%-toward.txt'
REPLACE INTO TABLE my_table
FIELDS TERMINATED BY ',' ENCLOSED BY ''
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES") or die(mysql_error());
And of course no luck. What's the best way to do this?
Assuming this is a scheduled job, why not check the directory for the most recent file that matches your filename template? Store the name of that file in a variable and then substitute the variable into your query. Check out glob()
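The same most-recent-file idea sketched in shell (the file names are stand-ins built from the question's template; in PHP you would get the candidates from glob() and pick the newest by filemtime()):

```shell
# Scratch directory with two stand-in dumps; only the numbers in the middle
# of the name change from day to day, per the question.
cd "$(mktemp -d)"
printf 'header\nrow\n' > dat-part2-111-222-toward.txt
printf 'header\nrow\n' > dat-part2-333-444-toward.txt
touch -d '2024-01-01' dat-part2-111-222-toward.txt
touch -d '2024-01-02' dat-part2-333-444-toward.txt

# Newest file matching the template: sort by modification time, take the first.
latest=$(ls -t dat-part2-*-toward.txt | head -n 1)
echo "$latest"
# $latest can then be substituted into the LOAD DATA LOCAL INFILE statement.
```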
I would do it via a shell script or windows cmd file. This assumes you created the file with mysqldump or some program that creates a valid sql script.
You can run dir /B in a Windows command prompt to get a directory listing. So do a dir /B > input.txt, then use Windows scripting to read the input file and, for each line, pipe the file into the mysql client.
@echo off
set infile=%1
for /f "usebackq tokens=* delims=" %%i in ("%infile%") do (
  mysql -u userName --password=pwd < "%%i"
)
It's been a long time since I wrote any Windows scripts, but that should give you an idea of an approach.