I'm trying to insert records into MySQL from a GeneXus procedure using the SQL [!&Query!] command. The command works to create the table, but not to load the data.
&Query = 'CREATE TABLE tmpent (registro varchar(500));'
SQL [!&Query!]
Commit
&Query = "LOAD DATA LOCAL INFILE 'C:/Program Files/Apache Software Foundation/Tomcat 9.0/txtfiles/Maeent.txt' INTO TABLE tmpent LINES TERMINATED BY '\r\n' (registro);"
SQL [!&Query!]
Commit
Am I doing something wrong?
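No error text is shown, but since the CREATE works and only the LOCAL load fails, one thing worth checking (an assumption on my part; a GeneXus Java model on Tomcat normally connects through MySQL Connector/J) is whether LOAD DATA LOCAL is allowed on the connection at all. Connector/J controls this with the allowLoadLocalInfile connection property, so a hypothetical JDBC URL (host and database are placeholders) would look like:
jdbc:mysql://localhost/mydb?allowLoadLocalInfile=true
The server must also have its local_infile variable enabled; see the notes on enabling local infile further down.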
Related
I use parameterized queries for normal inserts/updates, for security.
How do I do that for queries like this:
LOAD DATA INFILE '/filepath' INTO TABLE mytable
In my case, the path to the file would be different every time (for different requests). Is it fine to proceed like this (since I am not taking any data from outside; the file comes from the server itself):
path = /filepath
"LOAD DATA INFILE" + path + "INTO TABLE mytable"
Since LOAD DATA is not listed in SQL Syntax Allowed in Prepared Statements, you can't prepare something like
LOAD DATA INFILE ? INTO TABLE mytable
But SET is listed. So a workaround could be to prepare and execute
SET @filepath = ?
And then execute
LOAD DATA INFILE @filepath INTO TABLE mytable
Update
In Python with MySQLdb the following query should work
LOAD DATA INFILE %s INTO TABLE mytable
since no prepared statement is used.
To answer your "is it fine to proceed like this" question: your example code will fail because the resulting query will be missing quotes around the filename. If you changed it to the following, it could run, but it is still a bad idea IMO:
path = "/filepath"
sql = "LOAD DATA INFILE '" + path + "' INTO TABLE mytable" # note the single quotes
While you may not be accepting outside input today, code has a way of sticking around and getting reused/copied, so you should use the API in a way that will escape your parameters:
sql = "LOAD DATA INFILE %s INTO TABLE mytable"
cursor.execute(sql, (path,))
And don't forget to commit if autocommit is not enabled.
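At the SQL level, the commit pattern looks like this (a minimal sketch reusing the table and path from the question; only needed when autocommit is off):
SET autocommit = 0;
LOAD DATA INFILE '/filepath' INTO TABLE mytable;
COMMIT;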
I have to import some data from a CSV file into a database table on my Aruba server.
I use the following query:
LOAD DATA LOCAL INFILE 'test.csv' INTO TABLE dailycoppergg
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
(
ddmmyy,
lmedollton,
changedolleuro,
euroton,
lmesterton,
delnotiz,
girm,
sgm
)
I tested this query on another Aruba server and it worked correctly, but here I get the following error:
#1148 - Il comando utilizzato non è supportato in questa versione di MariaDB ("The used command is not allowed with this MariaDB version")
How can I modify my query to import the CSV data into the dailycoppergg table? Can you help me, please? Thanks!
The query is fine, but the MySQL client (mysql) disables local infile by default. You need to run it as mysql --local-infile ..., and then the same query should work.
The error message is a legacy one, and it's confusing.
Since you're using phpMyAdmin, I highly recommend you just use the Import tab instead of manually entering the import query in the SQL tab. phpMyAdmin can easily import CSV files and I don't see any advantage to entering the query manually.
In MySQL Workbench, add the line below in the Advanced tab of the connection settings, check it with Test Connection, and close.
OPT_LOCAL_INFILE=1
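These are all client-side fixes; the server must allow local loads as well. If you have admin rights there, you can check and enable the standard local_infile system variable (same name in MySQL and MariaDB; changing it globally requires admin privileges):
SHOW GLOBAL VARIABLES LIKE 'local_infile';
SET GLOBAL local_infile = 1;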
I have a LOAD DATA MySQL query I'm trying to run and need help fixing one thing.
Here is the query I'm running:
$query = "LOAD DATA LOCAL INFILE '$file_app'
INTO TABLE tbl_user_tmp
LINES STARTING BY '{'
TERMINATED BY '/>'
(@name)
set
name=SUBSTRING_INDEX(@name,'\"',-2),
activity=SUBSTRING_INDEX(SUBSTRING_INDEX(@name,'}',1),'/',1),
class=SUBSTRING_INDEX(SUBSTRING_INDEX(@name,'}',1),'/',-1),
user = '$user'" ;
Here is an example of the data from the file I'm trying to load:
<item component="ComponentInfo{com.apps.aaa.roadside/com.apps.aaa.roadside.Splash}" drawable="aaa_roadside1" />
With the above query I get the following:
name=aaa_roadside1"
activity=com.apps.aaa.roadside
class=com.apps.aaa.roadside.Splash
Everything is correct except name. I need to remove that last ".
I thought the query below would work, but it does not. Any ideas?
This was working before, when I had TERMINATED BY set to '\" />'.
However, this will not account for entries that might not have that space at the end, like this:
<item component="ComponentInfo{com.apps.aaa.roadside/com.apps.aaa.roadside.Splash}" drawable="aaa_roadside1"/>
I need to account for both.
$query = "LOAD DATA LOCAL INFILE '$file_app'
INTO TABLE tbl_user_tmp
LINES STARTING BY '{'
TERMINATED BY '/>'
(@name)
set
name=SUBSTRING(SUBSTRING_INDEX(@name,'\"',-2),-1),
activity=SUBSTRING_INDEX(SUBSTRING_INDEX(@name,'}',1),'/',1),
class=SUBSTRING_INDEX(SUBSTRING_INDEX(@name,'}',1),'/',-1),
user = '$user'" ;
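For what it's worth, a variant that should handle both endings is to nest SUBSTRING_INDEX so the value is cut at the quote itself, which makes any trailing space irrelevant (a sketch based only on the two sample lines above, not tested beyond them):
name=SUBSTRING_INDEX(SUBSTRING_INDEX(@name,'\"',-2),'\"',1),
The inner call keeps everything after the second-to-last quote, and the outer call then drops the trailing quote and anything after it.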
I'm having an issue with an app I'm working on.
The app allows a user to upload a CSV file, which the app processes to create records in a number of tables. To improve performance for one of the tables, it produces a new CSV file so it can use MySQL's LOAD DATA INFILE functionality.
Instead, it seems to be increasing the time it takes to process.
I'm pushing all processing into the background using Sidekiq. It seems to be creating the CSV without any problems, but when I execute the load data query it just sits there, and I have no idea what it's doing.
My processing function does the following:
CSV.open(output_path, 'w+', { force_quotes: true }) do |writer|
  writer << headers
  while rows.count > 0
    ....
    data_sets.each do |ds|
      writer << [UUIDTools::UUID.random_create, resp, row[set], ds.id, now, now]
      set += 1
    end
    resp += 1
  end
end
sql = "LOAD DATA LOCAL INFILE '#{output_path}'
INTO TABLE data_set_responses
FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
(id, response_number, response, data_set_id, created_at, updated_at)"
con = ActiveRecord::Base.connection
con.execute("SET autocommit = 0;")          # commit once at the end instead of per statement
con.execute("SET unique_checks = 0;")       # relax unique-index checks during the bulk load
con.execute("SET foreign_key_checks = 0;")  # skip FK validation during the bulk load
con.execute("LOCK TABLES data_set_responses WRITE;")
con.execute(sql)                            # the LOAD DATA LOCAL INFILE built above
con.execute("UNLOCK TABLES;")
con.execute("COMMIT;")
con.execute("SET autocommit = 1;")          # restore the normal settings
con.execute("SET unique_checks = 1;")
con.execute("SET foreign_key_checks = 1;")
As of right now, my Sidekiq process has been running for 22 minutes and still hasn't finished. It should be inserting around 700k rows, which shouldn't be taking anywhere near this long!
The table I'm inserting into has a binary field for its primary key (a UUID), so I don't know if that's slowing it down?
Any ideas?
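A general diagnostic for a load that "just sits there" (not specific to this app): the server's process list shows what state the LOAD DATA thread is in, which at least distinguishes a statement that is slowly inserting from one that is waiting on a lock:
SHOW FULL PROCESSLIST;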
I ended up changing my data structure to one that didn't require the vast number of rows that this structure did. I've got it down to a matter of seconds :)
I am using the following query to load data into MySQL:
LOAD DATA INFILE '/var/www/vhosts/httpdocs/xml/insert_feeds.csv'
INTO TABLE `deals_items`
FIELDS TERMINATED BY '###!##'
OPTIONALLY ENCLOSED BY ''
ESCAPED BY ''
LINES TERMINATED BY '###%##'
IGNORE 1 LINES
(@id,dealID,shopid,categoryid,title,permalink,url,startDate,endDate,description,extract,price,previous_price,discount,purchases,image,@location,@lat,@lng,locationText,type,settings,active)
SET id = '', lat=@lat, lng=@lng, locationText=@location, location = GeomFromText(CONCAT(POINT(@lat,@lng)))
Now, this used to work just fine, but ever since MySQL was upgraded from 5.0 to 5.1, it stopped working. It now works only if I add LOCAL to the statement.
The error I'm getting is: Can't get stat of '/var/www/vhosts/httpdocs/xml/insert_feeds.csv'
The user was granted full permissions to test it, and the file was given 0777 permissions.
It won't work from any client (mysql, mysqli, PDO, or the console). Adding LOCAL solves all the problems, but there are a lot of queries and I cannot go through and change them all. Furthermore, as I understand from the manual, there are security issues with the LOCAL option.
The exact MySQL server version is 5.1.58-1~dotdeb.0 and the client version is 5.0.51a.
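Since "Can't get stat" means the server process itself cannot reach the file, one quick check, assuming mysqld runs as the usual mysql user, is to stat the file as that user on the database host; 0777 on the file is not enough if a parent directory blocks traversal, and on Debian an AppArmor profile can also confine mysqld:
sudo -u mysql stat /var/www/vhosts/httpdocs/xml/insert_feeds.csv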