I am trying to import a CSV file into my AWS RDS database through MySQL Workbench using LOAD DATA INFILE. I was able to do this for a local db I created. For that local database I used the following statements in cmd:
CREATE DATABASE db;
USE db;
CREATE TABLE data (id INT, name VARCHAR(255), description TEXT); -- types are illustrative
LOAD DATA LOCAL INFILE
"C:/Users/bbrown/Projects/Database/NSF/nsf_04_17_2019.csv" INTO TABLE data
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id, name, description);
This creates the database with the table containing all the data from the CSV.
For the AWS RDS instance, a database called innodb was created. How can I import the same CSV file into that database? (I do not want to use the "Table Data Import Wizard" because it takes too long to import the CSV file.)
Use the following command:
mysql -u {username} -p {database} < {your_db_sql_file or .sql.zip file}
The solution is to log into MySQL from cmd like this:
mysql -h (aws endpoint) -P 3306 -u username -p
Then it will prompt you for your password (keep in mind, the username and password are the ones you created on AWS). I hope this helps anyone else who also struggled with this.
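To spell out the rest of the flow: once connected, the same LOAD DATA LOCAL INFILE statement from the question works, provided local_infile is enabled (on RDS this is set through the DB parameter group). A sketch, with a made-up endpoint:
mysql -h your-instance.abc123.us-east-1.rds.amazonaws.com -P 3306 -u username -p --local-infile=1
mysql> USE db;
mysql> LOAD DATA LOCAL INFILE 'C:/Users/bbrown/Projects/Database/NSF/nsf_04_17_2019.csv'
    -> INTO TABLE data
    -> FIELDS TERMINATED BY ','
    -> LINES TERMINATED BY '\n'
    -> IGNORE 1 LINES
    -> (id, name, description);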
I can't seem to use Community Server (or Workbench, for that matter) to run the LOAD DATA LOCAL INFILE command for a CSV. It always says the command is not allowed with this version.
PS: I tried setting the local_infile global variable to 1 in the client cmd prompt, but it still gives the same issue. Also, I can't pass the parameters directly at startup, since the client opens on its own. (I don't have a plain mysql command on Windows, only Community Server.)
Why do you want to use the LOAD DATA LOCAL INFILE command in Workbench? It already provides the Table Data Import Wizard.
Also, for that particular error, the following worked for me
(before connecting to the MySQL database, type the following):
mysql -u username -p --local-infile databasename
Enter Password:
Now, try using the database and importing the table.
MySQL> use databasename;
MySQL> load data local infile 'filename/filelocation.csv' into table tablename
-> fields terminated by ','
-> lines terminated by '\r\n'
-> ignore 1 lines
-> (col1, col2, col3...);
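If the "not allowed with this version" error persists even with --local-infile, the variable usually also has to be enabled on the server side. A sketch (needs a privileged account; on managed services such as RDS it is set through the parameter group instead):
mysql> SET GLOBAL local_infile = 1;
mysql> SHOW GLOBAL VARIABLES LIKE 'local_infile';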
I am analysing big data using OpenCL. I got the data as a MySQL dump text file (dump.txt). I want to restore the data into a MySQL database; then I can connect to the database and do the parallel-programming work to analyse the big data.
I went through some tutorials and questions related to this, and found a related question here: link here. But there they describe how to restore a .sql dump file. What can we do for dump text files?
An excerpt of my dump text file is here:
"94723605719","435035","2013-06-01 23:51:36","2013-06-01","2013","6","1","7","22
","23","51","36","1","202","-1002409728","1005823215","50000.000",\N,"1613003749
10","50026.000","226","0","0","94723605719","34399725","0","0","0",\N
"94722616878","435014","2013-06-01 23:51:52","2013-06-01","2013","6","1","7","22
","23","51","52","1","202","-1002099361","1005511506","50000.000",\N,"1613002394
31","50127.000","157","0","0","94722616878","34438596","0","0","0",\N
"94726556777","435022","2013-06-01 23:51:52","2013-06-01","2013","6","1","7","22
","23","51","52","1","202","-1002496570","1005910182","50000.000",\N,"1614002967
42","61046.000","226","0","0","94726556777","34399744","0","0","0",\N
Any help is appreciated.
Thanks in advance.
If you have a 'real' dump, then you can use:
mysql -u root -p -h localhost < your_dump.sql
That said, your example listed in the question seems to hint that you just have the values in a CSV file, in which case you need to first create the table and then load the data into it:
LOAD DATA INFILE 'your_file.txt'
INTO TABLE your_table
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"';
See the MySQL manual for LOAD DATA INFILE for more details.
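For completeness, a rough sketch of the create-then-load step. The table and column names are made up, and the real table needs one column, with a suitable type, per field in the file (the excerpt above has far more than three):
CREATE TABLE your_table (
  msisdn VARCHAR(20),      -- e.g. "94723605719"
  cell_id INT,             -- e.g. "435035"
  event_time DATETIME      -- e.g. "2013-06-01 23:51:36"
  -- ...one column per remaining field in the file...
);
LOAD DATA INFILE '/tmp/dump.txt'
INTO TABLE your_table
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';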
I often migrate MySQL databases, and the best way for me is like this:
Export database (backup):
mysqldump -u root -p [database_name] > [dump_filename].sql
Import Database:
mysql -u root -p [database_name] < [dump_filename].sql
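For large databases it is common to compress on the fly; a variant of the same commands (gzip assumed to be installed):
mysqldump -u root -p [database_name] | gzip > [dump_filename].sql.gz
gunzip < [dump_filename].sql.gz | mysql -u root -p [database_name]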
If the database you want to restore doesn't already exist, you need to create it first.
On the command-line, if you're in the same directory that contains the dumped file, use these commands (with appropriate substitutions):
C:\> mysql -u root -p
mysql> create database mydb;
mysql> use mydb;
mysql> source db_backup.dump;
https://dev.mysql.com/doc/mysql-backup-excerpt/5.6/en/using-mysqldump.html
http://meta.wikimedia.org/wiki/Database_dump_and_restore_examples
I have a text file which I want to load into a MySQL database on Ubuntu Server 12.04 LTS. I entered the data into the file trey.txt and moved the file to the /tmp directory. When I connect to the db and enter the command
LOAD DATA
INFILE '/tmp/trey.txt'
into table arp_table
columns terminated by '|';
the output is
ERROR 13 (HY000): Can't get stat of '/tmp/trey.txt' (Errcode: 2)
How should I modify this to load the data? And can I run this from the command line as a cron job?
Put your data file in the root folder; after that, run the command
$ sudo mysql -u root -p <database name>
mysql> LOAD DATA LOCAL INFILE '/path/trey.txt' INTO TABLE pet;
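The LOCAL keyword makes the mysql client read the file itself, which sidesteps the "Can't get stat" error that occurs when the server process cannot see the file. For the cron part of the question, a sketch (the schedule, paths, and the use of a ~/.my.cnf file for credentials are assumptions):
# crontab entry: reload the table every night at 2am; credentials come from ~/.my.cnf
0 2 * * * mysql --local-infile=1 your_db -e "LOAD DATA LOCAL INFILE '/tmp/trey.txt' INTO TABLE arp_table COLUMNS TERMINATED BY '|';"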
I have a huge MySQL dump file generated from phpMyAdmin (150,000 lines of CREATE TABLE statements, INSERTs, ...).
The MySQL query tool fails to open such a big file.
As a result, I can't insert all the records.
Is there a way to do this?
Thanks
John
The solution for restoring a large MySQL database from an SQL dump file is to use the unix/linux shell.
To restore a MySQL database from a dump file, just type the command below:
mysql -u #username# -p #database# < #dump_file#
Of course, you need to replace #username# with your database username and #database# with your target database, and replace #dump_file# with your dump file's name (e.g. dump.sql). Once you enter the command, the linux/unix shell will prompt you for your database user password; just key it in and you are done.
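For example, with hypothetical values filled in:
mysql -u john -p mydb < dump.sql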
borrowed from: http://www.techiecorner.com/31/how-to-restore-mysql-database-from-sql-dump-file/
Split the file into parts: first the CREATE statements, then the INSERTs.
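One rough way to do the split with a standard shell tool (a sketch; it assumes each INSERT starts at the beginning of a line, which is how phpMyAdmin writes its dumps):
# xx00 gets everything before the first INSERT (the CREATEs);
# xx01, xx02, ... get one INSERT block each
csplit -z dump.sql '/^INSERT INTO/' '{*}'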