Hey, I have a large database where customers request data that is specific to them. They usually send me the requests in a text or CSV file. I was wondering if there is a way to get SQL to read that file and put its contents into a SQL query, so that I don't have to open the file and copy and paste everything into a query by hand.
Steve already answered it; let me just add a few words.
You cannot use a CSV, text, Excel, or any other file format directly in a query for DML/DDL. You can use a file directly only for export/import.
No. MySQL is not designed to do this.
You need an intermediate script that can interpret the files and generate the queries you require.
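As a minimal sketch of such a script (in Python, assuming the mysql-connector-python driver and hypothetical file, table, and column names), you could read the IDs from the customer's file and feed them into a parameterized query:
import csv
import mysql.connector  # assumed driver; any DB-API connector works similarly

# Read the customer's request file (hypothetical layout: one ID per row).
with open('/tmp/request.csv', newline='') as f:
    ids = [row[0] for row in csv.reader(f)]

conn = mysql.connector.connect(user='me', password='secret', database='mydb')
cur = conn.cursor()

# One placeholder per ID, so the values are passed as parameters rather than pasted in.
placeholders = ', '.join(['%s'] * len(ids))
cur.execute(f"SELECT * FROM orders WHERE customer_id IN ({placeholders})", ids)
for row in cur.fetchall():
    print(row)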
Yes, there is a way to do it: you can import the CSV file into your database and then join it with any query you want.
You can load the CSV file with an SQL statement such as:
LOAD DATA INFILE "/tmp/test.csv"
INTO TABLE test
COLUMNS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
You can use other ways to import the data, see: How to import CSV file to MySQL table.
I tried this SQL solution on Ubuntu 14.04 with MySQL 5.6. For this to work you have to put the test.csv file in the /tmp directory and run chmod 755 test.csv on it; otherwise MySQL gives "Permission denied" errors. More about this issue: LOAD DATA INFILE Error Code : 13
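If the permission problems persist, one variant worth knowing about is LOAD DATA LOCAL INFILE: with LOCAL, the client program reads the file and sends it to the server, so the MySQL server process itself never needs filesystem access to it (this assumes local_infile is enabled on both client and server):
LOAD DATA LOCAL INFILE '/tmp/test.csv'
INTO TABLE test
COLUMNS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;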
I am having a hard time loading my data into MySQL from a text file. I have been trying to choose the correct delimiters, but my file contains a column name with each value.
The data is structured like this
{"id":"15","name":"greg","age":"32"}
{"id":"16","name":"jim","age":"42"}
The SQL statement I am currently working on looks something like this:
LOAD DATA LOCAL INFILE '/xxx.txt' INTO TABLE t1 FIELDS
TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\r\n'(id, name, age);
The results are being stored like this:
{"id":"16", "name","greg"}
I need to do away with the column names and store only the values. Any tips?
Writing a script, as suggested by @Shadow, will be easier to get your head around, but you can check the section on JSON on this page: how to import json-text-xml and csv data into mysql
or
Import JSON to MySQL made easy with the MySQL Shell, if you are using MySQL Shell 8.0.13 (GA)
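As a rough sketch of the script route (in Python, with the mysql-connector-python driver assumed), you could parse each line as JSON and insert only the values, discarding the keys:
import json
import mysql.connector  # assumed driver

conn = mysql.connector.connect(user='me', password='secret', database='mydb')
cur = conn.cursor()

# Each line of the file is one JSON object, e.g. {"id":"15","name":"greg","age":"32"}.
with open('/xxx.txt') as f:
    for line in f:
        rec = json.loads(line)
        cur.execute(
            'INSERT INTO t1 (id, name, age) VALUES (%s, %s, %s)',
            (rec['id'], rec['name'], rec['age']),
        )
conn.commit()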
I currently use MySQL Workbench and I want to move to DBeaver, as it is an all-in-one tool for multiple databases.
However, I use the following statement to import a CSV from an Amazon S3 bucket:
LOAD DATA FROM S3 's3-eu-west-2://csv-files/OCN04.txt'
INTO TABLE OCN
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
;
This statement does not work in DBeaver.
LOAD DATA INFILE works, but I am guessing the file path is the issue here: DBeaver probably only reads files from a specific directory. To find that specific/default directory, try running the above code with simply the name of your file (i.e. 'OCN04.txt'). The error message will show the file path. Save your txt file in that directory and try running your code again.
The above method works for me.
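If guessing the path doesn't pan out, it may also be the MySQL server's secure_file_priv setting, which restricts LOAD DATA INFILE (without LOCAL) to a single directory; you can check it with:
SHOW VARIABLES LIKE 'secure_file_priv';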
On DBeaver you can import data from a file into a table with the GUI: right-click on the table, select the "Import Data" menu item, and select the file to import.
I have an assignment to write queries in Neo4j, but the database provided is SAKILA.SQL.
How can I load it into Neo4j?
I've tried to find an answer to this, but had no luck!
Perhaps you can share your SQL?
Easiest would be to insert it into a relational database, dump the table contents as CSV and import the data into Neo4j using LOAD CSV. See: http://neo4j.com/developer/guide-importing-data-and-etl/
See: http://neo4j.com/docs/stable/query-load-csv.html
For details on Cypher see: http://neo4j.com/developer/cypher/
So you need to import (i.e. run all those insert statements) into MySQL first and then export into CSV files that Neo4j can use.
In the example Michael posted we used PostgreSQL's COPY command to export CSV files. In MySQL you have a slightly different command, as described over here.
You'd have something like:
SELECT * from customer
INTO OUTFILE '/tmp/customers.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
And then in Neo4j you'd have a query like this:
LOAD CSV WITH HEADERS FROM 'file:/tmp/customers.csv' AS line
MERGE (c:Customer {id: line.id})
ON CREATE SET c.name = line.name
And so on.
You can then do a similar thing to extract your other tables and use the MERGE command to create appropriate relationships between the different nodes.
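For example, a sketch along those lines for a rental export (assuming a rentals.csv with customer_id and film_id columns, which is roughly how Sakila's rental data looks):
LOAD CSV WITH HEADERS FROM 'file:/tmp/rentals.csv' AS line
MERGE (c:Customer {id: line.customer_id})
MERGE (f:Film {id: line.film_id})
MERGE (c)-[:RENTED]->(f)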
If you share the full MySQL import script, we can show you how to do a more complete translation.
I am working with a large database, 1.5 GB in size, with hundreds of tables and fields. I need to convert all tables into CSV files. phpMyAdmin does not do this easily; it times out.
I would rather use a shell / mysql command or a script to get the data out and into CSV.
Note:
I am looking to export ALL tables of the database in one shot. I cannot write an export command for every single table individually.
You can use mysqldump:
The mysqldump command can also generate output in CSV, other delimited text, or XML format.
In particular, look at the following arguments:
--tab=path
--fields-[optionally-]enclosed-by
--fields-escaped-by
--fields-terminated-by
--lines-terminated-by
--no-create-info
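Put together, a run might look like this (a sketch assuming a database named mydb and a /tmp/dump directory writable by the MySQL server, since --tab makes the server write the data files itself):
mysqldump -u username -p --tab=/tmp/dump \
  --fields-terminated-by=',' --fields-optionally-enclosed-by='"' \
  --lines-terminated-by='\n' mydb
This writes one .txt data file per table into /tmp/dump, plus a .sql file with each table's definition unless you add --no-create-info.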
You will need to do this table by table, see below.
SELECT *
INTO OUTFILE '/tmp/products.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
FROM products
Note that the directory must be writable by the MySQL database server. If it's not, you'll get an error message like this:
#1 - Can't create/write to file '/tmp/products.csv' (Errcode: 13)
Also note that it will not overwrite the file if it already exists, instead showing this error message:
#1086 - File '/tmp/products.csv' already exists
Source: http://www.electrictoolbox.com/mysql-export-data-csv/
Information about the software: sql2csv
Download link (exe): http://www.convert-in.com/demos/sql2csv.exe
This is the best option I found for Windows. With the software you can connect to local and remote DB servers and select a schema. In one shot you can extract all tables' data into valid CSV files.
I have an Excel file that I need to get into CSV. I export it fine, but when I go to import it into a MySQL DB via phpMyAdmin I get an "Invalid field count in CSV input on line 1." error.
The problem seems to be that the fields are not enclosed in double quotes. I just migrated to MS Excel 2007 and am not sure how to manipulate the CSV save options so that there are double quotes around the fields and my DB doesn't throw a conniption when I try to import.
Any suggestions? I'm fairly new at going from Excel to CSV but have gotten it to work previously.
Thanks
This worked for me after exporting from Excel as CSV and setting the various options:
load data infile '/tmp/tc_t.csv'
into table new_test_categories
fields terminated by ','
enclosed by '"'
lines terminated by '\n'
ignore 1 lines
(id,category_name,type_id,home_collection,seo_tags,status_id);
I ran this at the mysql prompt.
There should be an MS-DOS format of CSV in your export drop-down. Pick that one.
There should be an option in the Save As advanced properties or something, but if not, you could always change the delimiter character to : or ; or | and then write a quick Perl script to convert it to a quote-and-comma file.
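A quick script along those lines (sketched in Python rather than Perl, with hypothetical file names) could read the semicolon-delimited export and rewrite it with quotes and commas:
import csv

# Read a ';'-delimited file and rewrite it comma-separated with every field quoted.
with open('export.csv', newline='') as src, open('quoted.csv', 'w', newline='') as dst:
    reader = csv.reader(src, delimiter=';')
    writer = csv.writer(dst, delimiter=',', quoting=csv.QUOTE_ALL)
    for row in reader:
        writer.writerow(row)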
Or you could just try a tab-separated-values file instead; I think phpMyAdmin will read TSVs as well.