Import *.ods file into MySQL without phpMyAdmin

I'm trying to import a *.ods file using mysqlimport or 'load data' instead of phpMyAdmin interface because I need to automate the process.
This is my first attempt:
mysqlimport --ignore-lines=1 -u root -p DATABASE /home/luca/Scrivania/lettura.ods
mysqlimport can't import the file because it contains two sheets; I need both, and I can't modify the file structure.
With phpMyAdmin I'm able to upload the file correctly: the contents of the two sheets correctly populate the two tables. I know that when importing an *.ods file like this, phpMyAdmin uses each sheet name as the table name, but that is not the behavior of mysqlimport, which uses the file name instead.
So I've tried this:
mysql -u root -p -e "load data local infile '/home/luca/Scrivania/lettura.ods' into table TABLE_NAME" DATABASE
It returns no error, but the data in the table is totally inconsistent.
Any suggestions?
Thank You

MySQL doesn't know how to read spreadsheets. Your best bet is to use your spreadsheet program to export a CSV, then load that new CSV into the database. Specify comma-delimited loading in your query and try it:
LOAD DATA LOCAL INFILE '/home/luca/Scrivania/lettura.csv'
INTO TABLE TABLE_NAME
FIELDS TERMINATED BY ','
    ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n'
    STARTING BY ''
The documentation for file loading contains tons of possible options, if you need them.
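Since an .ods file is just a zip archive containing a content.xml, the CSV export step can itself be automated without opening a spreadsheet program. The sketch below uses only the Python standard library; it is deliberately simplified (it ignores ODS repeated-cell compression, for example), so treat it as a starting point rather than a full ODS reader. It writes one CSV per sheet, named after the sheet, which matches the phpMyAdmin sheet-name-as-table-name convention:

```python
import csv
import zipfile
import xml.etree.ElementTree as ET

# ODS table namespace from the OpenDocument spec
TABLE_NS = "urn:oasis:names:tc:opendocument:xmlns:table:1.0"

def ods_sheets_to_csv(ods_path):
    """Extract every sheet of an .ods file to <sheet-name>.csv; return the file names."""
    written = []
    with zipfile.ZipFile(ods_path) as z:
        root = ET.fromstring(z.read("content.xml"))
    for table in root.iter(f"{{{TABLE_NS}}}table"):
        name = table.get(f"{{{TABLE_NS}}}name")
        out_path = f"{name}.csv"
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            for row in table.iter(f"{{{TABLE_NS}}}table-row"):
                cells = [
                    "".join(cell.itertext())  # all text inside the cell
                    for cell in row.iter(f"{{{TABLE_NS}}}table-cell")
                ]
                if any(cells):  # skip fully empty rows
                    writer.writerow(cells)
        written.append(out_path)
    return written
```

Each resulting CSV can then be fed to mysqlimport (one file per table) or to a LOAD DATA statement like the one above.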

Related

MySQL loop for use with LOAD DATA LOCAL INFILE

I have a set of raw data files named using a pattern like a-1.txt, a-2.txt, etc. I am using the LOAD DATA LOCAL INFILE command in a MySQL script to load them into the database. That command cannot be run in a stored procedure. I'd like to avoid copy/pasting the statement 20 times in the script and would much rather use a LOOP, but LOOP cannot be used outside of a stored procedure.
What's the best way to handle this? How do I get the MySQL script to do this?
Assuming you are using Bash, you can run the following to generate the SQL file.
rm testfile.sql
ls *.txt | xargs -I inputfile echo "LOAD DATA LOCAL INFILE 'inputfile' INTO TABLE mytable FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\r\n';" >> testfile.sql
Then to run it, you can add another line.
mysql -h localhost -u root -pXXXXXXX mydatabase < testfile.sql
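If you'd rather not depend on the shell, the same script generation can be done in a few lines of Python (the pattern and table name below are placeholders to adapt):

```python
import glob

def build_load_script(pattern, table):
    """Emit one LOAD DATA LOCAL INFILE statement per file matching pattern."""
    statements = []
    for path in sorted(glob.glob(pattern)):
        statements.append(
            f"LOAD DATA LOCAL INFILE '{path}' INTO TABLE {table} "
            "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' "
            "LINES TERMINATED BY '\\r\\n';"
        )
    return "\n".join(statements)

# Example: regenerate the SQL script for a-1.txt, a-2.txt, ...
# open("testfile.sql", "w").write(build_load_script("a-*.txt", "mytable"))
```

The generated testfile.sql is then piped to mysql exactly as shown above.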

How to load data infile mysql workbench for AWS RDS?

I am trying to import a CSV file into my AWS RDS database through mysql workbench by using Load Data Infile. I was able to load the data infile for a local db I created. For that local database I used the following format in cmd:
CREATE DATABASE db;
USE db;
CREATE TABLE data(id INT, name VARCHAR(255), description TEXT);
LOAD DATA LOCAL INFILE
"C:/Users/bbrown/Projects/Database/NSF/nsf_04_17_2019.csv" INTO TABLE data
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id, name, description);
This creates the database with the table containing all the data from the csv.
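Before pointing the statement at the real export, it can help to generate a tiny CSV in the same shape (header row plus data, matching the assumed three-column table above) and confirm the IGNORE 1 LINES import behaves as expected:

```python
import csv

def write_sample_csv(path):
    """Create a tiny CSV matching CREATE TABLE data(id, name, description)."""
    rows = [
        ("id", "name", "description"),   # header row -> skipped by IGNORE 1 LINES
        (1, "alpha", "first test row"),
        (2, "beta", "second test row"),
    ]
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(rows)

write_sample_csv("sample.csv")
```

Loading sample.csv with the statement above should insert exactly two rows.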
For the AWS RDS instance it created a database called innodb. How can I import the same CSV file into that database? (I do not want to use the "Table Data Import Wizard" because it takes too long to import the CSV file.)
Use the following command:
mysql -u {username} -p {database} < {your_db_sql_file or .sql.zip file}
The solution is by logging into mysql in cmd like this:
mysql -h (aws endpoint) -P 3306 -u username -p
Then it will prompt you for your password (keep in mind, the username and password are the ones you created on AWS). I hope this helps anyone else who also struggled with this.

How to insert a CSV into Mysql with Xampp shell

I am trying to import an SQL file into the database. I am running the following command:
# mysql - u root -p idecon123 < "d:\IdeOffline\answersnew.sql"
But when I run this, it just shows the help manual.
Is there a mistake in my command?
Thanks
Go to http://localhost/ after starting Apache and MySQL. On the left side you will see a link for phpMyAdmin. Click that, then upload your .sql file through the browser. See http://wiki.phpmyadmin.net/pma/import for how to import files.
edit: for big files, set an uploadDir per http://docs.phpmyadmin.net/en/latest/config.html . Copy the file into that directory and you'll be able to import it directly.
You can do that with the mysqlimport command from the command prompt.
Here is the syntax:
mysqlimport -u root -pabc#123 --ignore-lines=1 --fields-terminated-by="," --lines-terminated-by="\n" --local upam_dev C:\Users\ukrishnan\Desktop\Generated-Players\PlayerProfile.dataSet-1.csv
--ignore-lines - skips lines at the start of the file, e.g. a title/header row in your CSV
--fields-terminated-by - specifies how fields are terminated, e.g. with , or the | symbol
The file names must correspond to the tables into which their data will be imported. If a file name contains one or more dots, the portion before the first dot is taken as the table name.
If you have data in multiple files, you can name them like Profile.File-1.csv, Profile.File-2.csv, so the table name Profile is used when you import these files.
This is much faster than importing through phpMyAdmin. It took me around 3-4 seconds to import a file with 100K records.
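That naming rule is easy to misread, so here is the mapping spelled out as a one-line sketch (an illustration of the documented rule, not mysqlimport's actual source):

```python
import os

def mysqlimport_table_name(path):
    """mysqlimport derives the target table from the file name:
    everything in the base name before the first dot."""
    return os.path.basename(path).split(".")[0]

# Profile.File-1.csv and Profile.File-2.csv both load into table `Profile`
```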

How many ways of importing data into mysql

I have a page on my website used for inserting properties; the form has 54 fields.
I don't want this information to go directly into my database, because at around 200 records per day it would get heavy.
I want to collect the data from users and confirm it; only after confirmation should it be imported.
How many ways are there to import data into MySQL?
How many ways of importing data into mysql:
It should be as simple as...
LOAD DATA INFILE '/tmp/mydata.txt' INTO TABLE PerformanceReport;
By default LOAD DATA INFILE expects tab-delimited input, one row per line, so it should take the file in just fine.
IMPORT
1. Make sure the database you need has already been created. If it has not, please first create the database: How do I create a database?
CAUTION: If you import a backup file to a database that already has content, it will replace the existing content.
2. Use FTP to upload your SQL file to your server. You can upload it to your default FTP directory. Or, see Step 1 in the "Export" instructions above for another suggestion. Alternately, you can use scp to upload your file via SSH.
3. Log into your server via SSH.
4. Use the command cd to navigate into the directory where you uploaded your backup file in Step 2. If you uploaded the backup to your data directory, go here (replace 00000 with your site number):
cd /home/00000/data/
5. Import the database by executing the following command:
`mysql -h internal-db.s00000.gridserver.com -u username -p dbname < dbname.sql`
OR:
`mysql -h internal-db.s00000.gridserver.com -u username -p dbname -e 'source dbname.sql'`
Once you execute this command, you will be prompted for your database password. Type it in and hit enter. Your database will now import. It may take a few minutes if you have a large database. When the import is done, you will be returned to the command prompt.
NOTE: Variables are the same as in Step 3 from the Export section above. Please check Step 3 in the "Export" section to make sure you are correctly replacing the example code with your own information. dbname.sql is the actual name of your SQL file.
If you have a gzipped backup of your database, you can use this line instead:
`gunzip < dbname.gz | mysql -h internal-db.s00000.gridserver.com -u username -p dbname`
You can enter your own username, database name, and backup file name, as before. dbname.gz is the name of your gzipped backup file. Use "unzip" instead of "gunzip" for zipped files.
6. Remove the SQL file from your web-accessible directory, if you uploaded it to a public folder. Otherwise, anyone can download it from the web.
If you get an error that looks like this:
Got Error: 1045: Access denied for user 'db00000@internal-db.s00000.gridserver.com' (using password: YES) when trying to connect
You have entered an incorrect password. Please retype it carefully,
or reset your password via the AccountCenter Control Panel. See
Database users on the Grid for instructions.
If you get an SQL error during the import, you can force it to finish by adding "-f" to the command, which stands for "force." For example:
`mysql -f -h internal-db.s00000.gridserver.com -u username -p dbname -e 'source dbname.sql'`
This can help you finish an import if you have a few corrupt tables,
but need to get the database as a whole imported before you do
anything else.
http://dev.mysql.com/doc/refman/5.0/en/load-data.html
https://dev.mysql.com/doc/refman/5.0/en/loading-tables.html
https://www.mysql.com/why-mysql/windows/excel/import/
http://www.itworld.com/it-management/359857/3-ways-import-and-export-mysql-database
$file = '/pathtocsviportdatabase/csv/importtabledata.csv';
$import = "LOAD DATA LOCAL INFILE '" . $file . "' INTO TABLE `imports`
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\n'
    (col1, col2, col3);";
mysql_query($import) or die(mysql_error());
Replace col1, col2, col3 with the full list of table columns, separated by commas. (Note that the old mysql_* functions are deprecated; in current PHP use mysqli or PDO instead.)

How do I store a MySQL query result into a local CSV file?

How do I store a MySQL query result into a local CSV file? I don't have access to the remote machine.
You're going to have to use the command line execution '-e' flag.
$> /usr/local/mysql/bin/mysql -u user -p -h remote.example.com -e "select t1.a,t1.b from db_schema.table1 t1 limit 10;" > hello.txt
This will generate a local hello.txt file in your current working directory containing the query output (tab-separated by default).
Use the CONCAT() function of MySQL:
mysql -u <username> -p -h <hostname> -e "select concat(userid,',',surname,',',firstname,',',midname) as user from dbname.tablename;" > user.csv
You can delete the first line, which contains the column header "user" (or pass --skip-column-names to mysql to suppress it).
We can use the command-line '-e' flag and a simple Python script to convert the result to CSV format.
Create a Python file (let's call it tab2csv) and put the following code in it:
#!/usr/bin/env python
import csv
import sys

tab_in = csv.reader(sys.stdin, dialect=csv.excel_tab)
comma_out = csv.writer(sys.stdout, dialect=csv.excel)
for row in tab_in:
    comma_out.writerow(row)
Run the following command after filling in your MySQL credentials:
mysql -u orch -p -h database_ip -e "select * from database_name.table_name limit 10;" | python tab2csv > outfile.csv
Result will be stored in outfile.csv.
I haven't had a chance to test it against content with difficult characters yet, but the fantastic mycli may be a solution for many.
Command line
mycli --csv -e "select * from table;" mysql://user@host:port/db > file.csv
Interactive mode:
\n \T csv ; \o ~/file.csv ; select * from table1; \P
\n - disables the pager that requires pressing space to display each page
\T csv ; - sets the output format to csv
\o <filename> ; - appends the next output to a file
<query> ; - the query whose result you want to capture
\P - turns the pager back on
I faced this problem too and spent some time reading about solutions: importing into Excel, importing into Access, saving as a text file...
I think the best solution on Windows is the following:
- use INSERT ... SELECT to create a "result" table. The ideal scenario would be to create the result table's fields automatically, but this is not possible in MySQL
- create an ODBC connection to the database
- use Access or Excel to extract the data and then save or process it the way you want
For Unix/Linux, I think the best solution might be the -e option tmarthal mentioned before, processing the output through something like awk to get a proper format (CSV, XML, whatever).
Run a MySQL query that writes its result straight to a file, like this:
SELECT order_id,product_name,qty FROM orders INTO OUTFILE '/tmp/orders.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n'
Note that INTO OUTFILE creates the CSV in the /tmp folder of the database server, not of the application host, and the MySQL user needs the FILE privilege and write access there. You can then add logic to fetch the file and send it through headers.
You can try doing :
SELECT a,b,c
FROM table_name
INTO OUTFILE '/tmp/file.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
We use the INTO OUTFILE clause here to store the output in a CSV file. We enclose the fields in double quotes to handle field values that contain commas; fields are separated by commas and individual lines by newlines.
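If INTO OUTFILE isn't an option (it writes on the server's filesystem, not your own), the same quoting behavior can be reproduced client-side. The Python csv options below mirror the FIELDS/LINES clauses above, as a sketch of post-processing fetched rows yourself:

```python
import csv
import io

def rows_to_csv(rows):
    """Serialize rows the way the OUTFILE clause above does:
    comma-separated, every field enclosed in double quotes, \n line endings."""
    buf = io.StringIO()
    writer = csv.writer(
        buf,
        delimiter=",",          # FIELDS TERMINATED BY ','
        quotechar='"',          # ENCLOSED BY '"'
        quoting=csv.QUOTE_ALL,  # enclose every field
        lineterminator="\n",    # LINES TERMINATED BY '\n'
    )
    writer.writerows(rows)
    return buf.getvalue()

# A comma inside a field stays intact because every field is quoted:
# rows_to_csv([["a", "b,c"]]) -> '"a","b,c"\n'
```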