MySQL Errorcode 2 in Perl script - mysql

I have created an ETL tool in Perl. There are three database servers with which the ETL tool communicates: dbserver1 (OLTP server, Windows box), dbserver2 (staging server, Linux box), and dbserver3 (OLAP server, Linux box). My ETL script is on dbserver2.
The script reads the data from dbserver1 and brings it into dbserver2, performs some transformations, and then puts the data into dbserver3. To achieve this the script creates some OUTFILE data on dbserver2. So there are two types of OUTFILE queries:
an OUTFILE query that runs on dbserver1 and creates a .data file on dbserver2, and
an OUTFILE query that runs on dbserver2 and creates a .data file on dbserver2.
The second query works fine, as it creates a file on the same server. But the first type of query gives me the following error:
DBD::mysql::st execute failed: Can't create/write to file '\home\dbserver2\dumpfile.2011-11-04-03:02.data' (Errcode: 2) at stagingtransform.pl line 223, <> line 8.
I guess this is related to user permissions. If I am not wrong, MySQL on dbserver2 has permission to read/write on dbserver2, but MySQL on dbserver1 does not.
Could it be because dbserver1 is a Windows box and dbserver2 is a Linux box?
How can I resolve this?
FYI: the file name format is dumpfile.yyyy-mm-dd-hh:mm.data, and I have also set up the AppArmor profile for MySQL on dbserver2.

The problem is that SELECT ... INTO OUTFILE always writes to the local file system of the server that executes it. The query running on dbserver1 tries to create '\home\dbserver2\...' on dbserver1 itself and fails with Errcode: 2 (No such file or directory), so you need a different approach.
One very easy method is to use mysqldump (on dbserver2) to connect to dbserver1 and pipe the output into a mysql client, which loads the SQL into dbserver2.
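As a sketch, run on dbserver2, with placeholder host, database, and table names (credentials are assumed to come from an option file such as ~/.my.cnf to keep the pipe clean):

```sh
# On dbserver2: dump the table from dbserver1 and load it straight into dbserver2
mysqldump --host=dbserver1 sourcedb sometable | mysql --host=localhost stagingdb
```

Since both programs run on dbserver2, nothing needs to be written to disk on the Windows box at all.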
On the other hand, if you want to use DBI:
my $source_sql = q{SELECT ...};                      # query against the source server
my $target_sql = q{INSERT ... VALUES (?, ?, ...)};   # one placeholder per selected column
my $source = $source_dbh->prepare($source_sql);
my $target = $target_dbh->prepare($target_sql);
$source->execute;
# execute_array() fetches each row from $source and binds it to the INSERT
my $qty = $target->execute_array({ ArrayTupleFetch => $source });
For large transfers of data, the mysqldump approach is faster.

Load data infile does nothing

I'm kinda new to SQL. I am trying to add data to an already created table. The CSV, which is 132,645 lines long, looks like this:
iso_code;continent;location;date;population;total_cases;new_cases;new_cases_smoothed;total_deaths;new_deaths;new_deaths_smoothed;total_cases_per_million;new_cases_per_million;new_cases_smoothed_per_million;total_deaths_per_million;new_deaths_per_million;new_deaths_smoothed_per_million;reproduction_rate;icu_patients;icu_patients_per_million;hosp_patients;hosp_patients_per_million;weekly_icu_admissions;weekly_icu_admissions_per_million;weekly_hosp_admissions;weekly_hosp_admissions_per_million
AFG;Asia;Afghanistan;24/02/2020;398354280;50;50;NULL;NULL;NULL;NULL;126;126;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;
AFG;Asia;Afghanistan;25/02/2020;398354280;50;0;NULL;NULL;NULL;NULL;126;0;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;
AFG;Asia;Afghanistan;26/02/2020;398354280;50;0;NULL;NULL;NULL;NULL;126;0;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;
AFG;Asia;Afghanistan;27/02/2020;398354280;50;0;NULL;NULL;NULL;NULL;126;0;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;
AFG;Asia;Afghanistan;28/02/2020;398354280;50;0;NULL;NULL;NULL;NULL;126;0;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;NULL;
And my sql query is this:
LOAD DATA LOCAL INFILE 'C:\\Users\\Usuario\\Desktop\\CovidDeaths.csv' INTO TABLE coviddeaths
columns terminated by ';'
LINES TERMINATED BY '\r\n';
I had already uploaded some data to the coviddeaths table using the import wizard, but the upload was really slow, so I tried to finish the import using the LOAD DATA INFILE query.
To see how many lines had already been uploaded I used
select count(*) from coviddeaths;
which returned:
count(*)
11312
When I try to run the LOAD DATA LOCAL INFILE query, nothing happens. The mouse pointer spins for a second as if it were trying to run, but it doesn't add any rows. It doesn't throw any of the errors I have read about, such as Error Code: 1148: The used command is not allowed with this MySQL version. I also tried placing the csv file in the Uploads folder that my secure-file-priv option points to in my.ini:
# Secure File Priv.
secure-file-priv="C:/ProgramData/MySQL/MySQL Server 8.0/Uploads"
It just does nothing. Any ideas?
Thanks in advance!
The problem comes from the fact that both the client and the server must enable an option before local data files can be imported. If you use the command-line client, you need to start it like this:
mysql --local-infile -u root -p
In addition, the MySQL server needs to be configured with the local_infile option enabled.
secure-file-priv is not needed here, because that option controls importing and exporting files on the server host, not files sent from the client.
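As a sketch, the server side can be checked and enabled from any client session (setting the global variable requires the appropriate admin privilege, and it resets on restart unless also put in my.ini):

```sql
-- Check whether the server currently allows LOAD DATA LOCAL INFILE
SHOW GLOBAL VARIABLES LIKE 'local_infile';
-- Enable it for all sessions
SET GLOBAL local_infile = 1;
```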
See also ERROR 1148: The used command is not allowed with this MySQL version, or my answer to "'mysql 8.0' local infile fails. I tried and displayed the settings. Could it be permission issues?"
The error message is unusually obscure; MySQL error messages are usually more accurate and informative. I logged a bug in 2019 requesting an improved error message, and they did fix it in MySQL 8.0.

importing csv to db2

I am trying to import a .csv file with 2 columns (and just 3 rows, just for testing) using the query:
IMPORT FROM "C:\db2\dtest.csv" OF DEL INSERT INTO TEST_DATA.DTEST (CAR, NICKNAME)
I am getting this error:
SQL0104N An unexpected token "IMPORT FROM "C:\db2\dtest.csv"
OF DEL" was found following "BEGIN-OF-STATEMENT". Expected tokens may
include: "". SQLSTATE=42601
If I'm just being stupid here please do tell me :)
IMPORT is not SQL; it is a Db2 command. That means you must submit it from the Db2 command window (CLP), from a script, or from a stored procedure. You cannot submit it directly via SQL unless you use the ADMIN_CMD stored procedure; see the documentation for details and examples. Use ADMIN_CMD if you normally interact with databases via a GUI tool.
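For example, a sketch of the asker's IMPORT wrapped in ADMIN_CMD (note that the file path is then resolved on the database server, not on the client machine):

```sql
-- Run the command-line IMPORT through SQL via the ADMIN_CMD procedure
CALL SYSPROC.ADMIN_CMD(
  'IMPORT FROM C:\db2\dtest.csv OF DEL INSERT INTO TEST_DATA.DTEST (CAR, NICKNAME)'
);
```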
Using the command line is best ONLY if you will regularly use batch commands or write scripts in a scripting language, and you are competent with scripting and command lines. This method also has prerequisites, especially if the database is on a different hostname or a different Db2 instance than the one you are working with (i.e. the database is remote). Remote databases need to be catalogued via the db2 catalog tcpip node ... and db2 catalog database ... commands.
Additionally, you must first connect to the database via db2 connect to .... You have to do this regardless of whether the database is local or remote; see the docs for details. For local databases you just use db2 connect to dbname, where dbname is your database name.
Using the ADMIN_CMD stored procedure is often easier for new users who are not familiar with command-line tools.
To use the Db2 command window (CLP), you can either use it interactively or from your operating-system shell. On Windows, use db2cwadmin.bat to open such a window (for batch or command usage), or run db2.exe to enter interactive mode. In interactive mode you can then run your IMPORT command.
This happened to me when I was importing data from a CSV file into a SQL database. In my case it was caused by an incorrectly formatted CSV file.
When extracting data from a CSV file, have a look at the preview of the data, which will give an idea of where the data is wrong.
It may be caused by stray characters such as commas or quotes.

MySQL into output file error

I am very new to SQL, but I have to extract some fields of a table stored in a sql file.
I have installed MySQL, created the database, and sourced the file. Now I want to execute a sql request to read all the elements of the table, extract the interesting fields, and write them into a csv file:
SELECT * INTO OUTFILE '/home/cr/database/Dump2/program_info.csv' FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\\' LINES TERMINATED BY '\n' FROM program_info;
When running the command I have the following error message:
ERROR 1290 (HY000): The MySQL server is running with the --secure-file-priv option so it cannot execute this statement
Does anyone know how to solve it?
I have struggled with this message all afternoon but could not fix it.
I am working on Ubuntu Linux and my config file is /etc/my.cfg:
[mysqld]
read-only = 0
secure-file-priv = ""
Thank you for your help
MySQL is actually working as intended. The MySQL server was started with the --secure-file-priv option, which restricts you from saving output into arbitrary directories. You need to check the startup parameters.
To do so, run the following command in the MySQL shell:
SHOW VARIABLES LIKE 'secure_file_priv';
The output is the location that the MySQL server currently permits. You should be able to save your output to this location.
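For example, on Ubuntu the permitted directory is commonly /var/lib/mysql-files/ (verify this against your own SHOW VARIABLES output); the original query can then be pointed there:

```sql
-- Write the export into the directory that secure_file_priv allows
SELECT * INTO OUTFILE '/var/lib/mysql-files/program_info.csv'
  FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\\'
  LINES TERMINATED BY '\n'
FROM program_info;
```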
To disable the restriction, you need to edit the configuration file that declares it and restart the MySQL server after changing the option.
MySQL could be reading the my.cnf (or my.ini) configuration file from a variety of locations; see Using Option Files for more information.

how to export data from table to excel on SQL Server 2008 with T-SQL?

I want to export data from a table in SQL Server 2008 to an Excel file on Windows 7 with T-SQL.
Searching over the internet, many people try the following:
Insert into openrowset ('Microsoft.Jet.OLEDB.4.0', 'Excel 8.0;Database=c:\MyExcel.xls;','SELECT * FROM [Sheet1$]')
select * FROM mytab
I tried it too, but it is not working.
I also tried the following:
sqlcmd -S myServer -d myDB -E -Q "select * from Tab" -o "MyData.csv" -h-1 -s","
It finishes okay with no error, but no file is created. I am also not sure whether this can be run from T-SQL.
Is there a better solution for this case?
Use the export wizard to create an SSIS package: select [First row is column name], an Excel destination, and your source table or query.
In the last step, save the SSIS package as a file.
Then use sp_configure to enable xp_cmdshell,
and use a T-SQL script to exec ('dtexec /file: ')
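A sketch of those last two steps; the package path here is a hypothetical placeholder for wherever you saved the SSIS package:

```sql
-- Enable xp_cmdshell (requires sysadmin), then run the saved package
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;
EXEC xp_cmdshell 'dtexec /file:"C:\packages\ExportToExcel.dtsx"';
```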
bcp (Bulk Copy Program)
MSDN: http://msdn.microsoft.com/en-us/library/ms162802.aspx
Example usage:
bcp AdventureWorks2012.Sales.Currency out "Currency Types.dat" -T -c
This is a command-line utility, so you will need to use the command-shell functionality if you have to do this from within SQL Server. I am curious whether the sqlcmd you were using was potentially trying to use bcp under the hood. Be aware that when creating files directly from SQL you will have permission issues for certain locations, as the SQL Server service account may not have access. Above all, remember that the machine submitting the T-SQL is not where the file is created: a local path is relative to the server running the script, NOT to the user running the script over TCP/IP. I think the default location for file creation is the root 'data' directory of the SQL Server install, something like:
C:\Program Files\Microsoft SQL Server\(version current is: 110)\Data

how to get mysql OUTFILE on to client machine when querying remote DB server

How do I get the outfile onto the client system when querying a remote database server, when a statement like "select * from sometable into outfile 'c:/somefile.txt'" is executed? Is there another command that makes this happen? Could someone please give the full command or list of steps to make this possible? Thanks in advance.
The SELECT ... INTO OUTFILE statement is intended primarily to let you very quickly dump a table to a text file on the server machine. If you want to create the resulting file on some other host than the server host, you normally cannot use SELECT ... INTO OUTFILE since there is no way to write a path to the file relative to the server host's file system.
However, if the MySQL client software is installed on the remote machine, you can instead use a client command such as mysql -e "SELECT ..." > file_name to generate the file on the client host.
Source: http://dev.mysql.com/doc/refman/5.0/en/select.html
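As a sketch of that client-side approach, with placeholder host, user, and table names (the --batch option gives tab-separated output suitable for redirection):

```sh
# Run the query from the client machine; the file is written locally, not on the server
mysql --host=dbserver.example.com --user=me --password --batch \
      --execute "SELECT * FROM mydb.sometable" > sometable.tsv
```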