Importing Microsoft Access into a Ruby on Rails MySQL Database - mysql

I need to import data from a Microsoft Access database into a MySQL database programmatically through Ruby. The process needs to be as easy as possible, as the goal is for a non-technical user to obtain the Access database, upload it to the Rails application, and ingest the data into the new database.
Is there a way to do this on a Mac or Linux environment completely through Ruby?

1) Export the Access file as a CSV.
2) Use LOAD DATA INFILE: http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Example:
mysql> LOAD DATA LOCAL INFILE 'myfile.csv'
-> INTO TABLE mytable
-> FIELDS TERMINATED BY ',' ENCLOSED BY ''
-> LINES TERMINATED BY '\r\n';
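If the whole flow has to run from Ruby on a Mac or Linux box, one option is to shell out to mdb-export from mdbtools for step 1 and use the mysql2 gem for step 2. The sketch below is only that, a sketch: the table name, file paths, and connection settings are assumptions, it presumes mdbtools is installed and the MySQL table already exists, and local_infile has to be allowed on both client and server.

require "mysql2"

# Assumed inputs: adjust to your environment.
access_file = "/tmp/upload.mdb"     # the uploaded Access database
table       = "customers"          # table present in both Access and MySQL
csv_path    = "/tmp/customers.csv"

# 1) Export one Access table to CSV with mdbtools (mdb-export writes a header row by default).
system("mdb-export", access_file, table, out: csv_path) or raise "mdb-export failed"

# 2) Load the CSV into MySQL; local_infile must be enabled when creating the client.
client = Mysql2::Client.new(
  host: "localhost", username: "rails", password: "secret",
  database: "myapp_development", local_infile: true
)

client.query(<<~SQL)
  LOAD DATA LOCAL INFILE '#{csv_path}'
  INTO TABLE #{table}
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\\n'
  IGNORE 1 LINES
SQL

In a Rails app you would wire the uploaded file's path into access_file from the controller; the loading part stays the same.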

Related

Use LOAD DATA INFILE and SELECT INTO OUTFILE in Cloud SQL

I am testing Google Cloud SQL and I have to test the instructions LOAD DATA INFILE and SELECT...INTO OUTFILE.
I understood that the instruction LOAD DATA LOCAL INFILE could be used instead of LOAD DATA INFILE, but how can it be used in Cloud SQL?
Can the instruction SELECT...INTO OUTFILE be used as is?
Thank you Renzo for your answer, but when I try to execute SELECT ... INTO OUTFILE while connected to the Cloud SQL instance, I get the following error message:
ERROR 1045 (28000) at line 1: Access denied for user 'root'@'%'
I tried to use LOAD DATA LOCAL INFILE from my computer connected to the Cloud SQL instance and it worked; I used a file located on my computer to import CSV data into a table successfully. But I am wondering if I can use a file present in a bucket to do the same...
Let me explain what I need to do:
We are migrating our Website to GCP App Engine/Cloud SQL. We extensively use LOAD DATA INFILE, which picks up files from an "ftp" folder and loads them into the databases, as well as SELECT ... INTO OUTFILE, which exports data to CSV files in a folder of our Website. So my concern is being able to use the same process on App Engine/Cloud SQL.
The difference is that the Website instance and the database instance are separate on GCP. Should we use buckets on GCP to replace our folders, or should we create folders on the App Engine / Cloud SQL instance instead? What is the best solution in your view?
Thanks in advance
According to the page about Importing data into Cloud SQL,
you can also use the LOAD DATA LOCAL INFILE statement in the mysql client, which loads a local file to the database
You can follow the sample steps in this link to start the mysql client by first connecting to your Cloud SQL instance through Cloud Shell. After connecting, you will be able to execute the import statement LOAD DATA LOCAL INFILE.
The same steps can be applied for exporting, and the syntax SELECT...INTO OUTFILE can be used as is. Together with other alternatives, this can also be found under Exporting data from Cloud SQL:
Exporting in CSV format is equivalent to running the following SQL statement:
SELECT <query> INTO OUTFILE ... CHARACTER SET 'utf8mb4'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
ESCAPED BY '\\' LINES TERMINATED BY '\n'
Once you have executed the operations, it is also advisable to check their status, especially if they are long-running or an error has occurred.
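If SELECT ... INTO OUTFILE turns out to be blocked for your user (as the Access denied error in the question suggests), a client-side workaround is to run a plain SELECT and write the CSV yourself. A rough sketch with the mysql2 and csv gems; the host, credentials, and query are placeholders, and in practice you might connect through the Cloud SQL proxy:

require "mysql2"
require "csv"

# Placeholder connection details for the Cloud SQL instance.
client = Mysql2::Client.new(
  host: "127.0.0.1", username: "root", password: "secret", database: "mydb"
)

result = client.query("SELECT id, name, email FROM users")

CSV.open("users_export.csv", "w") do |csv|
  csv << result.fields                                   # header row with the column names
  result.each { |row| csv << row.values_at(*result.fields) }
end

The resulting file lives on the machine running the Ruby code, so it can then be copied to a bucket or wherever the Website expects it.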

Why is the number of rows inserted into MySQL lower when using LOAD DATA LOCAL INFILE from Ruby?

I am trying to insert data from a text file into MySQL using the LOAD DATA LOCAL INFILE command.
When I run this command in the MySQL window, it reports the total number of records inserted.
LOAD DATA LOCAL INFILE 'filepath' INTO TABLE tablename FIELDS TERMINATED BY '\t';
However, when I pass the same command through Ruby, I see that fewer rows are inserted.
connect.query("LOAD DATA LOCAL INFILE 'filepath' INTO TABLE tablename FIELDS TERMINATED BY '\\t';")
I have verified this by printing the query above, and it is the same.
I am using MySQL Workbench 6.3, version 6.3.10 build 12092614.
If 'filepath' is a variable, it will not be expanded inside the string literal. Try this:
connect.query("LOAD DATA LOCAL INFILE '#{filepath}' INTO TABLE tablename FIELDS TERMINATED BY '\\t';")
If 'filepath' is a file path literal, try using an absolute path from the root. That is, if MySQL is running locally.
If this query is being submitted to a remote MySQL server, then there is no hope of it opening a local file, and you may need to rethink your import strategy completely.
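Two other things are worth checking in the Ruby case, sketched below with assumed connection settings: the mysql2 client has to be created with local_infile: true, and MySQL silently skips or truncates rows it cannot parse, which affected_rows and SHOW WARNINGS make visible.

require "mysql2"

# Assumed connection settings; the important part is local_infile: true.
connect = Mysql2::Client.new(
  host: "localhost", username: "user", password: "secret",
  database: "mydb", local_infile: true
)

filepath = File.expand_path("data.txt")   # absolute path avoids working-directory surprises

connect.query(
  "LOAD DATA LOCAL INFILE '#{filepath}' INTO TABLE tablename FIELDS TERMINATED BY '\\t'"
)

puts "rows inserted: #{connect.affected_rows}"

# Rows that were skipped, truncated, or had bad values show up here.
connect.query("SHOW WARNINGS").each { |w| puts w.inspect }

If the warnings point at malformed lines, adding LINES TERMINATED BY '\r\n' for a Windows-generated file is another thing worth trying.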

LOAD DATA LOCAL INFILE not working in RDS

I have a WordPress website and have created a plugin to import a CSV into a table. The database is in RDS. Here is the SQL I have used:
LOAD DATA LOCAL INFILE 'my.csv' INTO TABLE tablename CHARACTER SET UTF8 FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\r\n' IGNORE 1 LINES
(
ID,
Name,
Address
)
When I run this SQL in SQLyog against the same database (the database in RDS), it works perfectly. Please note that the CSV file used is in a folder on my Windows machine and is given as an absolute path. However, when I run this SQL from the plugin on the Linux server (where the website is hosted), WordPress gives a message saying LOAD DATA LOCAL INFILE is not supported. I have another website where this works fine; it is also hosted on AWS like this one and has the same configuration, with the database in RDS and MySQL version 5.5+ on both servers.
Am I missing anything here? Any help will be appreciated.
Thanks in advance.
The Amazon RDS hosted service doesn't support this kind of load from files, hence the error.
So you can't load the CSV this way.
Here is an approach:
Convert your CSV data into INSERT INTO tablename (...) VALUES (...) statements.
Load the data using a command like the one below.
mysql -h <Host> -u <username> -p<Password> < Your_file.sql
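A small Ruby sketch of the CSV-to-INSERT conversion this answer describes; the file name, table name, and column list are only examples, and the generated file is what you would feed to the mysql command above:

require "csv"

# Example inputs; adjust to your CSV and table.
csv_file = "my.csv"
table    = "tablename"
columns  = %w[ID Name Address]

File.open("Your_file.sql", "w") do |sql|
  CSV.foreach(csv_file, headers: true) do |row|
    values = columns.map { |c| "'#{row[c].to_s.gsub("'", "''")}'" }.join(", ")
    sql.puts "INSERT INTO #{table} (#{columns.join(', ')}) VALUES (#{values});"
  end
end

This sidesteps file loading entirely; everything travels to RDS as ordinary SQL statements through the mysql client.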

Load data into a table on a remote server from the local machine

I have a table in a database on a remote server. I want to populate that table from my local machine. I have a CSV file on the local machine whose data needs to be uploaded into the table on the remote server.
Please suggest how I can do this through mysqlimport and from where I should execute it.
Thanks in advance.
Use LOAD DATA to solve this problem. See the example below:
LOAD DATA LOCAL INFILE 'filename'
INTO TABLE `tablename`
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n' (`column_name`)
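Since the question specifically asks about mysqlimport, the same import can be driven from the local machine with mysqlimport --local, which is essentially a command-line wrapper around LOAD DATA LOCAL INFILE. A rough sketch, wrapped in a Ruby system call; the host, credentials, database name, and the assumption that the CSV has a header line are all placeholders. Note that mysqlimport derives the table name from the file name, so the CSV must be named after the target table.

# mysqlimport --local sends the file from this machine to the remote server.
csv = "/data/tablename.csv"   # file name (minus extension) must match the table name

ok = system(
  "mysqlimport", "--local",
  "--host=remote.example.com", "--user=dbuser", "--password=secret",
  "--fields-terminated-by=,", "--lines-terminated-by=\r\n",
  "--ignore-lines=1",          # only if the CSV has a header row
  "mydatabase", csv
)
raise "mysqlimport failed" unless ok

You run this on the local machine that holds the CSV; because of --local, the file is read locally and streamed to the remote MySQL server.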

Can LOAD DATA INFILE take an FTP file location?

Is there any way to run the
LOAD DATA INFILE
command when the file exists in an FTP location? For example, something like this:
LOAD DATA INFILE 'USER%40WEBSITE.COM:PASSWORD#ftp.WEBSITE.COM/FILE.csv'
REPLACE INTO TABLE mytable FIELDS TERMINATED BY '|' LINES TERMINATED BY '\n' IGNORE 1 LINES
No, MySQL file-related functions can only access files on the database server or the client, but not other machines (unless you've set up a network file system mount, to make the remote machine appear to be local).
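If the file really does live on an FTP server, the practical pattern is to download it first and then run LOAD DATA LOCAL INFILE against the local copy. A minimal Ruby sketch using net/ftp from the standard library and the mysql2 gem; the host names, credentials, and table are placeholders taken loosely from the question:

require "net/ftp"
require "mysql2"

local_copy = "/tmp/FILE.csv"

# 1) Fetch the file from the FTP server.
Net::FTP.open("ftp.WEBSITE.COM", "USER@WEBSITE.COM", "PASSWORD") do |ftp|
  ftp.getbinaryfile("FILE.csv", local_copy)
end

# 2) Load the local copy; local_infile must be enabled on the client.
client = Mysql2::Client.new(
  host: "db.example.com", username: "dbuser", password: "secret",
  database: "mydb", local_infile: true
)

client.query(<<~SQL)
  LOAD DATA LOCAL INFILE '#{local_copy}'
  REPLACE INTO TABLE mytable
  FIELDS TERMINATED BY '|'
  LINES TERMINATED BY '\\n'
  IGNORE 1 LINES
SQL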