I am testing Google Cloud SQL and I have to test the instructions LOAD DATA INFILE and SELECT...INTO OUTFILE.
I understood that the instruction LOAD DATA LOCAL INFILE could be used instead of LOAD DATA INFILE, but how can it be used in Cloud SQL?
Can the instruction SELECT...INTO OUTFILE be used as is?
Thank you Renzo for your answer, but when I try to execute SELECT...INTO OUTFILE while connected to the Cloud SQL instance, I get the following error message:
ERROR 1045 (28000) at line 1: Access denied for user 'root'@'%'
I tried to use LOAD DATA LOCAL INFILE from my computer while connected to the Cloud SQL MySQL instance, and it worked: I used a file located on my computer to import CSV data into a table successfully. But I am wondering if I can use a file stored in a bucket to do the same...
I will try to explain what I need to do:
We are migrating our website to GCP App Engine/Cloud SQL. We make extensive use of both statements: LOAD DATA INFILE picks up files from an "ftp" folder and loads them into the databases, and SELECT...INTO OUTFILE exports data to CSV files in a folder of our website. So my concern is being able to keep the same process on App Engine/Cloud SQL.
The difference is that on GCP the website instance and the database instance are separate. Should we use GCP buckets to replace our folders? Or should we create folders on the App Engine/Cloud SQL instances instead? What is the best solution in your opinion?
Thanks in advance
According to the page about Importing data into Cloud SQL,
you can also use the LOAD DATA LOCAL INFILE statement in the mysql client, which loads a local file to the database
You can follow the sample steps in this link to start the mysql client, connecting to your Cloud SQL instance first through Cloud Shell. After connecting, you will be able to execute the LOAD DATA LOCAL INFILE import statement.
The same steps apply to exporting, and the SELECT...INTO OUTFILE syntax can be used as is. Together with other alternatives, this is also covered in Exporting data from Cloud SQL:
Exporting in CSV format is equivalent to running the following SQL statement:
SELECT <query> INTO OUTFILE ... CHARACTER SET 'utf8mb4'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
ESCAPED BY '\\' LINES TERMINATED BY '\n'
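For local testing of that export format, Python's csv module can be configured to write approximately the same dialect (comma-delimited, optionally double-quoted, backslash-escaped, \n line endings). This is just a minimal sketch of the CSV dialect itself, not tied to Cloud SQL:

```python
import csv
import io

# Approximate the Cloud SQL CSV export dialect: comma-separated fields,
# optionally enclosed by double quotes, escaped by backslash, \n line endings.
buf = io.StringIO()
writer = csv.writer(
    buf,
    delimiter=",",
    quotechar='"',
    quoting=csv.QUOTE_MINIMAL,  # quote only fields that need it
    doublequote=False,          # escape embedded quotes instead of doubling them
    escapechar="\\",
    lineterminator="\n",
)
writer.writerow(["1", "plain field"])
writer.writerow(["2", 'contains "quotes", and commas'])
print(buf.getvalue())
# 1,plain field
# 2,"contains \"quotes\", and commas"
```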
Once you have executed the operations, it is also advisable to check their status, especially for long-running operations or when an error has occurred.
I need to create a local MySQL installation to develop a Shiny app dashboard for analytics (in R; my SQL knowledge is basic, limited to simple bulky queries that I then tune in R).
The deployed version will run on an MS SQL Server, so I want to set up the basic tables I will use (available in CSV format, a small extract of the real thing) to try to optimize my SQL query. I initially used PostgreSQL but discovered a few syntax differences, so I opted to start over with MySQL, which to my knowledge should match in query syntax.
I downloaded the MySQL community Windows installer and installed the
full version to my machine, with all default settings.
I successfully created the server (all default settings) and named it in MySQL Workbench.
I then successfully created one test table with all the correct column names etc.
CREATE TABLE servername.test_table( col1 text,
col2 int,
col3 int,
col4 text,
...
col40 text);
Using the code below, I tried uploading the CSV to servername.test_table:
LOAD DATA INFILE 'D:\Dropbox\R-Projects\SQL\SAMPLE DATASETS\DATA1.csv'
INTO TABLE servername.test_table
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
The problem arises when I try to upload the CSV to the desired table; I get:
Error Code: 1290. The MySQL server is running with the --secure-file-priv option so it cannot execute this statement
After some research, rather than meddling with access restrictions (which I have no experience with, nor with fully setting up DBs), the simplest possible solution seemed to be keeping the files in the directory that secure_file_priv allows.
I ran the SQL command:
SHOW VARIABLES LIKE "secure_file_priv";
Got the directory C:\ProgramData\MySQL\MySQL Server 8.0\Uploads, saved a copy of the CSV file there, and then ran the following to try to upload it:
LOAD DATA LOCAL INFILE 'C:\ProgramData\MySQL\MySQL Server 8.0\Uploads\DATA1.csv'
INTO TABLE servername.test_table
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
After the above, a new error appeared:
Error Code: 2068. LOAD DATA LOCAL INFILE file request rejected due to restrictions on access.
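(A separate gotcha worth noting here: backslashes inside a single-quoted MySQL string literal act as escape characters, so a Windows path written as above can be silently mangled; MySQL also accepts forward slashes as path separators on Windows. A tiny sketch of the normalization, with a made-up helper name:)

```python
def mysql_path_literal(path: str) -> str:
    """Make a Windows path safe to embed in a single-quoted MySQL literal.

    Backslash starts an escape sequence in MySQL string literals, while
    forward slashes also work as path separators on Windows, so swap them.
    """
    return path.replace("\\", "/")

print(mysql_path_literal(r"C:\ProgramData\MySQL\MySQL Server 8.0\Uploads\DATA1.csv"))
# C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/DATA1.csv
```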
Back to research; I began to tinker with settings and the MySQL command prompt.
I checked the server setting for local_infile by running SHOW GLOBAL VARIABLES LIKE 'local_infile'; as a query in MySQL Workbench. The result was Value: ON, which to my understanding should have allowed the second attempt using LOAD DATA LOCAL INFILE to work. It did not.
The link below suggested adding the following to the my.ini file:
ERROR 2068 (HY000): LOAD DATA LOCAL INFILE file request rejected due to restrictions on access
[client]
loose-local-infile=1
[mysqld]
local_infile=1
I did, and I also restarted the MySQL80 service to make sure the change took effect... then tried uploading the CSV again, to no avail.
Could someone please help me upload this CSV so I can focus on my actual work? I've been at this for 2 days and it seems to be a common issue, yet I haven't found a "dumbed-down" step-by-step troubleshooting guide like the one I'm asking for.
If I may request, please try to provide a step-by-step approach to troubleshooting this. My MySQL server management skills are almost nil; I'm used to having access to the DB and simply querying it.
Sorry for the long post, but I wanted to be as detailed as possible to show I've tried most of the solutions I've found with no success.
(not sure if this is pertinent: using a Threadripper 3960x workstation with 128GB ram, Windows 10 Pro, MySQL was fully installed in the same drive as Windows 10 Pro is)
Thank you for your time.
The quickest solution was to just deploy an MS SQL server on Azure and use the Azure Data Studio extension called SQL Server Import. For someone who just wants a server for testing and has to import CSV files that use enclosing quote marks, the deployment was quite fast compared with troubleshooting my local installation.
As the MySQL documentation says LOAD DATA INFILE can be 20 times faster than bulk inserts (INSERT ... VALUES), I am trying to replace INSERT ... VALUES with LOAD DATA INFILE in my application.
When I try LOAD DATA INFILE with ADO.NET, it ends in an error. It seems there is no way to load a file from the local machine when MySQL is not running there.
But that is not a problem for mysqlimport, as the documentation also says:
You can also load data files by using the mysqlimport utility; it
operates by sending a LOAD DATA INFILE statement to the server. The
--local option causes mysqlimport to read data files from the client host.
I am curious what protocol mysqlimport uses when it imports data into MySQL on a remote machine.
Does it parse the data file and construct SQL like INSERT ... VALUES, or does it have a special protocol to send the data file directly to MySQL?
I have a WordPress website and have created a plugin to import a CSV into a table. The database is in RDS. Here is the SQL I used:
LOAD DATA LOCAL INFILE 'my.csv' INTO TABLE tablename CHARACTER SET UTF8 FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\r\n' IGNORE 1 LINES
(
ID,
Name,
Address
)
When I run this SQL in SQLyog against the same database (the one in RDS), it works perfectly. Note that the CSV file used is in a folder on my Windows machine and is given as an absolute path. However, when I run this SQL from the plugin on the Linux server (where the website is hosted), WordPress gives a message saying LOAD DATA LOCAL INFILE is not supported. I have another website where this works fine; it is also hosted on AWS like this one and has the same configuration (database in RDS, MySQL version 5.5+ on both servers).
Am I missing anything here? Any help will be appreciated.
Thanks in advance.
The Amazon RDS hosted service doesn't support loading from files on the server, hence the error; you can't load the CSV that way.
Here is an approach:
Convert your CSV data into INSERT INTO table (...) VALUES (...) statements.
Load your data using a command like the one below:
mysql -h <Host> -u <username> -p<Password> < Your_file.sql
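The conversion step can be sketched in Python; the table and column names are just the ones from the question, and every value is emitted as a quoted string (adjust for numeric columns as needed):

```python
import csv
import io

def csv_to_inserts(csv_text: str, table: str) -> list:
    """Turn CSV text (header row first) into INSERT statements.

    All values are emitted as single-quoted strings with embedded
    quotes doubled, the standard SQL escaping.
    """
    rows = list(csv.reader(io.StringIO(csv_text)))
    cols = ", ".join(rows[0])
    stmts = []
    for row in rows[1:]:
        vals = ", ".join("'" + v.replace("'", "''") + "'" for v in row)
        stmts.append(f"INSERT INTO {table} ({cols}) VALUES ({vals});")
    return stmts

for stmt in csv_to_inserts("ID,Name\n1,Ann\n2,O'Hara\n", "tablename"):
    print(stmt)
# INSERT INTO tablename (ID, Name) VALUES ('1', 'Ann');
# INSERT INTO tablename (ID, Name) VALUES ('2', 'O''Hara');
```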
I'm trying to load data from a CSV into MySQL, but I'm getting error code 29 (file not found). I'm using Mac OS X, but when I run the following query
LOAD DATA INFILE '/workspace/SQL_Test/src/values.csv'
INTO TABLE queryid_vs_column
COLUMNS TERMINATED BY ','
MySQL tries to look in 'C:/workspace/SQL_Test/src/values.csv'. I haven't found anyone else with similar issues; has anyone encountered something like this? I'm not sure why MySQL thinks I'm running a Windows machine.
Thanks.
If you don't use the LOCAL modifier, it accesses the file on the server. Change your query to:
LOAD DATA LOCAL INFILE '/workspace/SQL_Test/src/values.csv'
INTO TABLE queryid_vs_column
COLUMNS TERMINATED BY ','
I have a csv file that I will be regularly updating through a batch script that calls cygwin+ bash script. I would like to automate the upload of the csv file into a MySQL database such that a table in my database would be updated with the csv file at regular intervals. The database is currently running on a Windows Server 2003 machine and administered with phpMyAdmin.
I have looked online and found ways to achieve part of this, but I am confused about where the code presented in those sources should go and how it would be called. For instance, Import CSV file directly into MySQL seems to show how to upload a CSV file to a MySQL database from the SQL command line once, but not repeatedly; the latter is what I need.
I would prefer the solution to involved bash scripting (as opposed to batch and php) if possible (i.e. I would prefer a solution that I could integrate with the bash scripts that update the csv file).
Thank you
You can execute a MySQL script from the command line by doing something like:
mysql -uUsername -pPassword database_name < infile.sql
You could invoke that from the command line, and in infile.sql you could have code like:
LOAD DATA INFILE 'filename.csv' INTO TABLE table_name
FIELDS TERMINATED BY ','
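If you would rather drive this from a script than type the command each time, the invocation can be wrapped in Python; the user, database, and file names below are placeholders:

```python
import subprocess

def build_mysql_cmd(user: str, database: str) -> list:
    # A bare "-p" makes the mysql client prompt for the password,
    # which avoids putting it on the command line.
    return ["mysql", f"-u{user}", "-p", database]

def run_sql_file(user: str, database: str, sql_path: str) -> None:
    """Feed a .sql file (e.g. one containing a LOAD DATA statement) to mysql."""
    with open(sql_path) as f:
        subprocess.run(build_mysql_cmd(user, database), stdin=f, check=True)
```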
You can use a here document:
# some bash script stuff
mysql ... <<EOF
SQL COMMANDS
GO HERE
EOF
# more bash script stuff
You can use Quartz to create a cron job for periodically updating your database. With the help of CronMaker (http://www.cronmaker.com/), you can choose when and how often your database gets updated.
This is a sample SQL Script to import data into your MySQL database:
LOAD DATA INFILE 'c:/.../filename.csv'
INTO TABLE discounts
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
Run the above script from your cron job using your preferred language.