How to LOAD DATA INFILE in MySQL Workbench for AWS RDS? - mysql

I am trying to import a CSV file into my AWS RDS database through MySQL Workbench using LOAD DATA INFILE. I was able to do this for a local database I created, using the following commands in cmd:
CREATE DATABASE db;
USE db;
CREATE TABLE data (id INT, name VARCHAR(255), description TEXT);  -- column types shown here are illustrative
LOAD DATA LOCAL INFILE
"C:/Users/bbrown/Projects/Database/NSF/nsf_04_17_2019.csv" INTO TABLE data
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id, name, description);
This creates the database with the table containing all the data from the CSV.
For the AWS RDS instance, a database called innodb was created. How can I import the same CSV file into that database? (I do not want to use the "Table Data Import Wizard" because it takes too long to import the CSV file.)

Use the following command:
mysql -u {username} -p {database} < {your_db_sql_file}
(If you have a .sql.zip archive, extract it first.)
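If the target is the AWS RDS instance rather than a local server, the same command just needs the endpoint as the host (a sketch; the endpoint and file names are placeholders):
mysql -h your-instance.xxxxxxxx.us-east-1.rds.amazonaws.com -P 3306 -u {username} -p {database} < {your_db_sql_file}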

The solution is to log into mysql from cmd like this:
mysql -h (aws endpoint) -P 3306 -u username -p
Then it will prompt you for your password (keep in mind, the username and password are the ones you created on AWS). I hope this helps anyone else who also struggled with this.
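For the CSV itself, the same LOAD DATA LOCAL INFILE statement from the local setup works once you are connected to RDS, provided local loading is allowed on both ends. A minimal sketch, assuming the endpoint is a placeholder, that the table already exists in the target database, and that the RDS parameter group permits local_infile:
mysql -h your-instance.xxxxxxxx.us-east-1.rds.amazonaws.com -P 3306 -u username -p --local-infile=1
mysql> USE db;
mysql> LOAD DATA LOCAL INFILE 'C:/Users/bbrown/Projects/Database/NSF/nsf_04_17_2019.csv'
    -> INTO TABLE data
    -> FIELDS TERMINATED BY ','
    -> LINES TERMINATED BY '\n'
    -> IGNORE 1 LINES
    -> (id, name, description);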

Related

How to migrate data from an Informix database to MySQL

We have to migrate data from an Informix database to a MySQL database.
How can we achieve this? What steps or commands do I have to use? If you have any document or reference link, that would be very helpful.
Step 1: Log in to Informix and run the UNLOAD command. It will create a backup file named "file1" in the export directory under the current location.
UNLOAD to 'export/file1' SELECT * FROM db1.table1
Step 2: Create table1 in the MySQL database, with columns matching the unloaded data (see the sketch after these steps).
Step 3: Log in to the MySQL database and set the global variable:
mysql> SET GLOBAL local_infile=1;
mysql> quit
Step 4: Log in to the MySQL database using the command below:
mysql --local-infile=1 -u root -p
Step 5: Load the data into the MySQL database using the LOAD DATA command:
LOAD DATA LOCAL INFILE '/export/file1'
INTO TABLE table1
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n';
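For Step 2, the MySQL table has to mirror the columns of the Informix table that was unloaded. A minimal sketch with purely illustrative column names and types (replace them with the real schema of db1.table1):
CREATE TABLE table1 (
  id INT,
  name VARCHAR(100),
  created DATE
);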

Loading local data on mysql community server/workbench

I can't seem to get Community Server (or Workbench, for that matter) to run the LOAD DATA LOCAL INFILE command for a CSV; it always says the command can't be used because it is not allowed with this version.
PS: I tried setting the local_infile global variable to 1 in the client command prompt, but it still gives the same issue. Also, I can't seem to pass the parameters directly at startup, since the client opens on its own (I don't have a plain mysql command on Windows, only Community Server).
Why do you want to use the LOAD DATA LOCAL INFILE command in Workbench? It already provides the Table Data Import Wizard.
Also, for this particular error, I tried the following and it worked for me:
(Before connecting to the MySQL database, type the following.)
mysql -u username -p --local-infile databasename
Enter Password:
Now, try using the database and importing the table.
MySQL> use databasename;
MySQL> load data local infile 'filename/filelocation.csv' into table tablename
-> fields terminated by ','
-> lines terminated by '\r\n'
-> ignore 1 lines
-> (col1, col2, col3...);
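If the error still appears, it is usually because local loading has to be enabled on both the server and the client. A sketch of the server-side check (run with a privileged account):
mysql> SHOW GLOBAL VARIABLES LIKE 'local_infile';
mysql> SET GLOBAL local_infile = 1;
On the Workbench side, the rough equivalent of the client's --local-infile flag is adding OPT_LOCAL_INFILE=1 in the connection's Advanced settings ("Others" field); the exact option name may vary by Workbench version.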

Import *.ods file in MySQL without phpMyAdmin

I'm trying to import a *.ods file using mysqlimport or 'load data' instead of the phpMyAdmin interface, because I need to automate the process.
This is my first attempt:
mysqlimport --ignore-lines=1 -u root -p DATABASE /home/luca/Scrivania/lettura.ods
mysqlimport can't load the file because it contains two sheets; I need both, and I can't modify the file structure.
With phpMyAdmin I'm able to upload the file correctly: the contents of the two sheets correctly fill the two tables. I know that when importing an *.ods file like this, phpMyAdmin uses the sheet name as the table name, but this is not the behavior of mysqlimport, which uses the file name instead of the sheet name.
So I've tried this:
mysql -u root -p -e "load data local infile '/home/luca/Scrivania/lettura.ods' into table TABLE_NAME" DATABASE
It returns no error, but the data in the table is totally inconsistent.
Any suggestions?
Thank You
MySQL doesn't know how to read spreadsheets. Your best bet is to use your spreadsheet program to export a CSV, then load that new CSV into the database. Specify comma-delimited loading in your query and try it:
LOAD DATA LOCAL INFILE '/home/luca/Scrivania/lettura.csv'
INTO TABLE TABLE_NAME
FIELDS TERMINATED BY ','
ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n'
STARTING BY '';
The documentation for file loading contains tons of possible options, if you need them.
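If the goal is automation, the CSV export can be scripted as well instead of being done by hand in the spreadsheet program. For example, LibreOffice can convert the file headlessly (a sketch; note that this typically exports only one sheet per run, so each sheet may need to be exported separately):
libreoffice --headless --convert-to csv /home/luca/Scrivania/lettura.ods --outdir /home/luca/Scrivania/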

Restoring mysql dump text file to database

I am working on analysing big data using OpenCL. I got the data as a MySQL dump text file (dump.txt). I want to restore the data into a MySQL database; then I can connect to the database and do the rest of the parallel-programming work to analyse the big data.
I went through some tutorials and questions related to this and found a related question (link here), but it only covers restoring a .sql dump file. What can we do for dump text files?
A sample excerpt from my dump text file is here:
"94723605719","435035","2013-06-01 23:51:36","2013-06-01","2013","6","1","7","22
","23","51","36","1","202","-1002409728","1005823215","50000.000",\N,"1613003749
10","50026.000","226","0","0","94723605719","34399725","0","0","0",\N
"94722616878","435014","2013-06-01 23:51:52","2013-06-01","2013","6","1","7","22
","23","51","52","1","202","-1002099361","1005511506","50000.000",\N,"1613002394
31","50127.000","157","0","0","94722616878","34438596","0","0","0",\N
"94726556777","435022","2013-06-01 23:51:52","2013-06-01","2013","6","1","7","22
","23","51","52","1","202","-1002496570","1005910182","50000.000",\N,"1614002967
42","61046.000","226","0","0","94726556777","34399744","0","0","0",\N
Any help is appreciated.
Thanks in advance.
If you have a 'real' dump, then you can use:
mysql -u root -p -h localhost < your_dump.sql
That said, your example listed in the question seems to hint that you just have the values in a CSV file, in which case you need to first create the table and then load the data into it:
LOAD DATA INFILE 'your_file.txt'
INTO TABLE your_table
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"';
See MySQL manual for LOAD DATA INFILE for more details.
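One detail worth noting for this particular file: the unquoted \N values in the sample are MySQL's marker for NULL, so LOAD DATA stores them as SQL NULL without any extra options. The same statement with the line terminator spelled out (a sketch; the table and its columns still have to match your data):
LOAD DATA INFILE 'your_file.txt'
INTO TABLE your_table
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';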
I usually migrate MySQL databases, and the best way for me is like this:
Export database (backup):
mysqldump -u root -p [database_name] > [dump_filename].sql
Import Database:
mysql -u root -p [database_name] < [dump_filename].sql
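A compressed variant of the same round trip, piping through gzip so large dumps stay small (a sketch; names are placeholders as above):
mysqldump -u root -p [database_name] | gzip > [dump_filename].sql.gz
gunzip < [dump_filename].sql.gz | mysql -u root -p [database_name]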
If the database you want to restore doesn't already exist, you need to create it first.
On the command-line, if you're in the same directory that contains the dumped file, use these commands (with appropriate substitutions):
C:\> mysql -u root -p
mysql> create database mydb;
mysql> use mydb;
mysql> source db_backup.dump;
https://dev.mysql.com/doc/mysql-backup-excerpt/5.6/en/using-mysqldump.html
http://meta.wikimedia.org/wiki/Database_dump_and_restore_examples

How many ways of importing data into mysql

I have a page on my website, with 54 input boxes, which users use to submit properties.
I don't want this information to go directly into my database, because it would make it heavy if there are 200 records per day.
What I want is to collect the data from users and confirm it; after confirmation I should be able to import it.
May I know how many ways there are for importing data into MySQL?
How many ways of importing data into mysql:
It should be as simple as...
LOAD DATA INFILE '/tmp/mydata.txt' INTO TABLE PerformanceReport;
By default LOAD DATA INFILE expects tab-delimited fields, one row per line, so it should take the file in just fine.
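For reference, those defaults written out explicitly are equivalent to the statement below, and each clause can be overridden when the file is comma-separated instead (a sketch based on the documented LOAD DATA defaults):
LOAD DATA INFILE '/tmp/mydata.txt' INTO TABLE PerformanceReport
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n' STARTING BY '';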
IMPORT
1. Make sure the database you need has already been created. If it has not, please first create the database:
How do I create a database?
CAUTION:
If you import a backup file to a database that already has content, it will replace the existing content.
Use FTP to upload your SQL file to your server. You can upload it to
your default FTP directory. Or, see Step 1 in the "Export"
instructions above for another suggestion. Alternatively, you can use
scp to upload your file via SSH.
Log into your server via SSH.
Use the command cd to navigate into the directory where you uploaded
your backup file in Step 1. If you uploaded the backup to your data
directory, go here (replace 00000 with your site number):
cd /home/00000/data/
Import the database by executing the following command:
`mysql -h internal-db.s00000.gridserver.com -u username -p dbname < dbname.sql`
OR:
`mysql -h internal-db.s00000.gridserver.com -u username -p dbname -e 'source dbname.sql'`
Once you execute this command, you will be prompted for your
database password. Type it in and hit enter. Your database will now
import. It may take a few minutes if you have a large database. When
the import is done, you will be returned to the command prompt.
NOTE:
Variables are the same as in Step 3 from the Export section above.
Please check Step 3 in the "Export" section to make sure you are
correctly replacing the example code with your own information.
dbname.sql is the actual name of your SQL file.
If you have a gzipped backup of your database, you can use this line instead:
`gunzip < dbname.gz | mysql -h internal-db.s00000.gridserver.com -u username -p dbname`
You can enter in your own username, database name, and backup file
name, as before. dbname.gz is the name of your gzipped backup file.
Use "unzip" instead of "gunzip" for zipped files.
Remove the SQL file from your web-accessible directory, if you
uploaded it to a public folder. Otherwise, anyone can download it
from the web.
If you get an error that looks like this:
Got Error: 1045: Access denied for user 'db00000@internal-db.s00000.gridserver.com' (using password: YES) when trying to connect
You have entered an incorrect password. Please retype it carefully,
or reset your password via the AccountCenter Control Panel. See
Database users on the Grid for instructions.
If you get an SQL error during the import, you can force it to finish by adding "-f" to the command, which stands for "force." For example:
`mysql -f -h internal-db.s00000.gridserver.com -u username -p dbname -e 'source dbname.sql'`
This can help you finish an import if you have a few corrupt tables,
but need to get the database as a whole imported before you do
anything else.
http://dev.mysql.com/doc/refman/5.0/en/load-data.html
https://dev.mysql.com/doc/refman/5.0/en/loading-tables.html
https://www.mysql.com/why-mysql/windows/excel/import/
http://www.itworld.com/it-management/359857/3-ways-import-and-export-mysql-database
$file = '/pathtocsviportdatabase/csv/importtabledata.csv';
$import = "LOAD DATA LOCAL INFILE '".$file."' INTO TABLE `imports`
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
(col1, col2, col3);";  // list the target columns in the CSV's order
mysql_query($import) or die(mysql_error()); // assumes an existing connection opened with the legacy mysql_* API