#1148 - The used command is not allowed with this MariaDB version - mysql

I have to import some data from a CSV file into a table of a database on my Aruba server.
I use the following query:
LOAD DATA LOCAL INFILE 'test.csv' INTO TABLE dailycoppergg
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
(
ddmmyy,
lmedollton,
changedolleuro,
euroton,
lmesterton,
delnotiz,
girm,
sgm
)
I tested this query on another Aruba server and it worked correctly, but here I get the following error:
#1148 - Il comando utilizzato non e` supportato in questa versione di MariaDB (in English: "The used command is not supported in this version of MariaDB")
How can I modify my query to import the CSV data into the dailycoppergg table? Can you help me, please? Thanks!

The query is fine, but the MySQL client (mysql) disables LOCAL INFILE by default; you need to run it as mysql --local-infile ..., and then the same query should work.
The error message is a legacy one and is confusing.
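If you also have admin access to the server, local data loading can be checked and switched on there as well. A minimal sketch (standard MySQL/MariaDB variable names; on shared hosting such as Aruba you may not have the rights to change it):
-- see whether the server currently accepts LOAD DATA LOCAL INFILE
SHOW GLOBAL VARIABLES LIKE 'local_infile';
-- enable it (needs admin/SUPER privileges)
SET GLOBAL local_infile = 1;
-- then connect with a client that also allows it, for example:
-- mysql --local-infile=1 -u youruser -p yourdatabase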

Since you're using phpMyAdmin, I highly recommend you just use the Import tab instead of manually entering the import query in the SQL tab. phpMyAdmin can easily import CSV files and I don't see any advantage to entering the query manually.

In MySQL Workbench, add the line below in the Advanced tab of the connection settings, test the connection, and close.
OPT_LOCAL_INFILE=1

Related

SQL Import Wizard errors on importing a psv file

I am trying to import a psv (pipe-delimited csv) file into a Microsoft SQL Server 2008 R2 Express database table.
There are only two fields in the psv; each field has more than 1000 characters.
In the import wizard I used the following settings (screenshots omitted): I double-checked the column mapping and set the Ignore option for failures/truncation. As usual, I get the following error:
Error 0xc02020a1: Data Flow Task 1: Data conversion failed. The data
conversion for column "Comm" returned status value 4 and status text
"Text was truncated or one or more characters had no match in the
target code page.". (SQL Server Import and Export Wizard)
UPDATE:
So, following Marc's suggestion, though extremely reluctant, I spent about 3 hours to finally get SQL Server 2014 installed on my computer, hoping to import the psv. As expected, the error shows up again.
I really cannot understand why a company like Microsoft did not do thorough QA on its products?!
After being tortured by Microsoft for the whole morning, I finally got the task done. For future readers, you can follow the steps below to import a CSV/PSV data source into your SQL Server:
Import the CSV/PSV into an Access database. Note that it must be saved as the .mdb type (yes, the format from the 20th century); you might want to read my story here: how to import psv data into Microsoft Access
In your SQL Server (mine is 2014), start the Import Wizard and select the data source type (Access) and the file. Why do you have to use the .mdb type of Access database? Because you will see there is no option in the SQL Server 2014 wizard for the .accdb type.
DO NOT forget to select the right destination (yes, even though you started the wizard by right-clicking the destination database and choosing Import); you want to select the last option, SQL Server Native Client 11.0. That will bring up the SQL Server 2014 instance and the database.
Now the import can be completed as expected.
Thanks to the great design logic in this wizard (2014? No, essentially unchanged compared to 2008), such a humble expectation and requirement cost me 4-5 hours to complete.
Alternatively, you can use BULK INSERT to import any flat file:
-- drop and recreate the target table
if (object_id('dbo.usecase1') is not null)
    drop table dbo.usecase1
go
create table dbo.usecase1
(
    Descr nvarchar(2000) null,
    Comm nvarchar(2000) null
)
go
-- load the flat file
bulk insert dbo.usecase1
from 'C:\tmp\usecase0.psv'
with (
    FIELDTERMINATOR = '|',  -- the source is pipe-delimited (.psv)
    ROWTERMINATOR = '\n'
)
go
BULK INSERT (Transact-SQL)
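If the "no match in the target code page" part of the original error comes from non-ASCII characters, BULK INSERT can also be told to read the file as Unicode. A hedged variation of the statement above, assuming the .psv was saved as Unicode (UTF-16) text:
bulk insert dbo.usecase1
from 'C:\tmp\usecase0.psv'
with (
    DATAFILETYPE = 'widechar',  -- treat the source file as Unicode
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\n'
)
go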

Execute MySQL Query batch file

Hi everyone, here is my problem.
I'm trying to run this SQL query:
select *columns* from table *left joins* where *conditions* and column >= date_sub(current_date(),INTERVAL 1 YEAR) order by column INTO OUTFILE 'C:\path\file.csv' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n';
Through this file.bat code:
echo select... | mysql -u user -p******* -D database
But the output file is not created. Instead, a weird file named "date_sub(current_date()" with no extension is created; it contains information about mysql (commands, version, and other details), which tells me this part of the query is causing the conflict: >= date_sub(current_date(),INTERVAL 1 YEAR)
I've tried several other tests like:
Adding double quotes to the sql query, but that gives a mysql syntax error.
Using another way to execute the MySQL query through the batch file: mysql -u user -p****** -D database select..., but this gives the same output as before.
If I remove the condition that gives me trouble, then the file.csv is successfully created.
Is there any other way to work around this problem?
Thanks in advance for the answers.
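One common workaround (a sketch; export.sql is an illustrative file name) is to keep the query out of cmd's parser entirely, since cmd treats the > in >=, as well as commas and parentheses, as special characters on an echo line. Save the statement into export.sql:
select *columns* from table *left joins*
where *conditions*
  and column >= date_sub(current_date(), INTERVAL 1 YEAR)
order by column
INTO OUTFILE 'C:\path\file.csv'
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n';
and then call it from the batch file with mysql -u user -p******* -D database < export.sql, so the query text never passes through the shell.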

Creating External Table takes long time

I have a table called tableB with 28 million records in Netezza, and I want to export it to a text file so that I can load the text file into the MySQL server. When I run the command below, the SQL client hangs. I am using SQuirreL SQL.
CREATE EXTERNAL TABLE '/Users/blah/A.txt'
USING(DELIM '\t' REMOTESOURCE 'JDBC')
AS
SELECT * FROM tableB;
I am not sure if this is supposed to be the case.
Well, I'm not sure if you are running SQuirreL on a Windows machine, but if you are, you need to use backslashes in the path, and you might need to escape them as well. Below is an example I use in SQuirreL running on a Windows 7 laptop:
CREATE EXTERNAL TABLE 'C:\\Users\\ValuedCustomer\\customer dim dump.csv'
USING ( DELIMITER ',' Y2BASE 2000 ENCODING 'internal' REMOTESOURCE 'JDBC' ESCAPECHAR '\' ) AS
SELECT CUSTOMER_FIRST_NAME, CUSTOMER_LASTNAME, CUSTOMER_ADDRESS, CUSTOMER_CITY, CUSTOMER_STATE
FROM DIM_CUSTOMER
You can find a little more info here on my blog
http://nztips.com/2012/07/returning-and-saving-large-result-sets-locally/
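Applied to the original statement, that would look something like the sketch below (assuming SQuirreL really is running on Windows; the local path is illustrative):
CREATE EXTERNAL TABLE 'C:\\Users\\blah\\A.txt'
USING (DELIM '\t' REMOTESOURCE 'JDBC')
AS
SELECT * FROM tableB;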

Problems with simple import of csv into a MySQL database (load data infile doesn't work)

Relatively new to Ruby, running: Ruby 1.9.2 and MySQL 5.5.19
I have a csv file full of data that I'd like to add to an existing table on a mysql database. The csv file does not have headers. Yes, I'm aware that there are multiple questions on this topic already. Here's how it breaks down:
Answer #1: Use LOAD DATA INFILE
Unfortunately, LOAD DATA INFILE gives me the following error: "Can't get stat of 'filename.csv' (Errcode: 2)"
This appears to be some kind of permissions issue. I've tried this both directly at the mysql prompt (as root), and through a Ruby script. I've tried various chmod options on the csv file, as well as moving the csv file to various directories where other users have said it works for them. No luck. Regardless, most people at this point recommend...
Answer #2: Use LOAD DATA local INFILE
Unfortunately this also returns an error. Apparently local infile is a MySQL option that is turned off by default because it's a security risk. I've tried turning it on, but still get nothing but errors, such as:
ERROR 1148 (42000): The used command is not allowed with this MySQL version
and also
undefined method `execute' for # (NoMethodError)
Answer #3: I can find various answers involving Rails, which don't fit the situation. This isn't for a web application (although a web app might access it later); I'm just trying to add the data to the database right now as a one-time thing to do some data analysis.
The Ruby file should be incredibly simple:
require 'rubygems'
require 'csv' (or fastercsv?)
require 'mysql'
db = mysql.connect('localhost','root','','databasename')
CSV.foreach('filename.csv') do |row|
?????
db.execute("INSERT INTO tablename ?????")
end
P.S. Much thanks in advance. Please no answers that point to using LOAD DATA INFILE or LOAD DATA LOCAL INFILE. Already wasted enough hours trying to get that to work...
Regarding MySQL:
LOAD DATA INFILE '/complete/path/csvdata.csv' INTO TABLE mytable(column1,column2,...);
Regarding Ruby:
require 'csv'
require 'mysql'

# connect with the mysql gem (Mysql class, not lowercase mysql)
db = Mysql.real_connect('localhost', 'root', 'password', 'database')

# Ruby 1.9's CSV has no CSV::Reader; CSV.foreach iterates the rows directly
CSV.foreach('filename.csv') do |row|
  # quote and escape every field before building the VALUES list
  values = row.map { |v| "'#{db.escape_string(v.to_s)}'" }.join(',')
  db.query("insert into tablename(column, column ...) values(#{values})")
end

db.close
This assumes the CSV file contains ALL the columns required.

mysql load data infile permission issue

I am using the following query to load data into MySQL:
LOAD DATA INFILE '/var/www/vhosts/httpdocs/xml/insert_feeds.csv'
INTO TABLE `deals_items`
FIELDS TERMINATED BY '###!##'
OPTIONALLY ENCLOSED BY ''
ESCAPED BY ''
LINES TERMINATED BY '###%##'
IGNORE 1 LINES
(#id,dealID,shopid,categoryid,title,permalink,url,startDate,endDate,description,extract,price,previous_price,discount,purchases,image,#location,#lat,#lng,locationText,type,settings,active)
SET id = '', lat=#lat, lng=#lng, locationText=#location, location = GeomFromText(CONCAT(POINT(#lat,#lng)))
Now, this used to work just fine, but ever since MySQL was upgraded from 5.0 to 5.1 it stopped working. It now works only if I add LOCAL to the statement.
The error I'm getting is: Can't get stat of '/var/www/vhosts/httpdocs/xml/insert_feeds.csv'
The user was granted full permissions to test it, and the file was given 0777 permissions.
It won't work from any client (mysql, mysqli, PDO, or the console). Adding LOCAL solves all the problems, but there are a lot of queries and I cannot go and change them all. Furthermore, as I understand from the manual, there are security issues with LOCAL.
The exact MySQL server version is 5.1.58-1~dotdeb.0 and the client version is 5.0.51a.
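For the non-LOCAL form the file is opened by the mysqld process itself, and Errcode: 2 is the operating-system error "No such file or directory", i.e. the server cannot find the file at that path. A hedged diagnostic sketch for narrowing that down from the SQL side:
-- confirm which machine the server is actually running on
SELECT @@hostname, @@version;
-- if secure_file_priv is set, server-side LOAD DATA is restricted to that directory
SHOW VARIABLES LIKE 'secure_file_priv';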