I have an assignment to write queries in Neo4j, but the database provided is SAKILA.SQL.
How can I load it into Neo4j?
I've tried to find an answer for this, but had no luck!
Perhaps you can share your SQL?
The easiest approach would be to insert it into a relational database, dump the table contents as CSV, and import the data into Neo4j using LOAD CSV. See: http://neo4j.com/developer/guide-importing-data-and-etl/
See: http://neo4j.com/docs/stable/query-load-csv.html
For details on Cypher see: http://neo4j.com/developer/cypher/
So you need to import the dump into MySQL first (i.e. run all those INSERT statements) and then export CSV files that Neo4j can use.
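For that first step, a typical command-line invocation looks like this (assuming a local MySQL server; create the sakila database first if the dump file does not do so itself):

mysql -u root -p sakila < SAKILA.SQL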
In the example Michael posted we used PostgreSQL's COPY command to export CSV files. MySQL has a slightly different command, as described over here.
You'd have something like:
SELECT * from customer
INTO OUTFILE '/tmp/customers.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
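One caveat: SELECT ... INTO OUTFILE does not write a header row, while the Neo4j query below uses WITH HEADERS. A common workaround is to prepend the header line with a UNION ALL; a sketch against sakila's customer table (the combined name column is an assumption carried over from the Cypher example below):

SELECT 'id', 'name'
UNION ALL
SELECT customer_id, CONCAT(first_name, ' ', last_name)
FROM customer
INTO OUTFILE '/tmp/customers.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';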
And then in Neo4j you'd have a query like this:
LOAD CSV WITH HEADERS FROM 'file:/tmp/customers.csv' AS line
MERGE (c:Customer {id: line.id})
ON CREATE SET c.name = line.name
And so on.
You can then do a similar thing to extract your other tables and use the MERGE command to create appropriate relationships between the different nodes.
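For example, after exporting and loading the rental table the same way, you could connect rentals to their customers. A hypothetical sketch (rentals.csv and its rental_id/customer_id headers are assumptions about how you export that table):

LOAD CSV WITH HEADERS FROM 'file:/tmp/rentals.csv' AS line
MATCH (c:Customer {id: line.customer_id})
MERGE (r:Rental {id: line.rental_id})
MERGE (c)-[:RENTED]->(r)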
If you share the full MySQL import script, we can show you how to do a more complete translation.
Related
I am having a hard time loading my data into MySQL from a text file. I have been trying to choose the correct delimiters, but my file repeats the column name with each value.
The data is structured like this
{"id":"15","name":"greg","age":"32"}
{"id":"16","name":"jim","age":"42"}
The SQL statement I am currently working with looks something like this:
LOAD DATA LOCAL INFILE '/xxx.txt' INTO TABLE t1 FIELDS
TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\r\n'(id, name, age);
The results are being stored like this:
{"id":"16", "name","greg"}
I need to do away with the column names and store only the values.
Any tips?
Writing a script as suggested by @Shadow will be easier to wrap your head around, but you can also check the section on JSON on this page: how to import json-text-xml and csv data into mysql.
Or use Import JSON to MySQL made easy with the MySQL Shell if you are using MySQL Shell 8.0.13 (GA).
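If you are on MySQL 5.7.8 or later, you can also do it in a single LOAD DATA statement: read each whole line into a user variable and pull the values out with the JSON functions. A sketch, assuming one JSON object per line as in your sample:

LOAD DATA LOCAL INFILE '/xxx.txt' INTO TABLE t1
LINES TERMINATED BY '\r\n'
(@line) -- the default field terminator is tab, so the whole JSON line lands in @line
SET id   = JSON_UNQUOTE(JSON_EXTRACT(@line, '$.id')),
    name = JSON_UNQUOTE(JSON_EXTRACT(@line, '$.name')),
    age  = JSON_UNQUOTE(JSON_EXTRACT(@line, '$.age'));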
Hey, I have a large database where customers request data that is specific to them. They usually send me the requests in a text or CSV file. I was wondering if there is a way to get SQL to read that file and use its contents in a SQL query, so I don't have to open the file and copy and paste everything into a query.
Steve already answered it; let me just add a few words.
You cannot use a CSV, text, Excel, or any other file format directly in a query for DML/DDL; a file can only be used directly for export/import.
No. MySQL is not designed to do this.
You need an intermediate script that can interpret the files and generate the queries you require.
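A minimal sketch of such a script in Ruby (the file name, column name, and target table are all assumptions):

require "csv"

# turn each id in a customer-supplied CSV into one query
ids = CSV.read("request.csv", headers: true).map { |row| row["customer_id"] }
puts "SELECT * FROM orders WHERE customer_id IN (#{ids.map(&:to_i).join(', ')});"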
Yes, there is a way to do it: you can import the CSV file into your database and then join it with any query you want.
You can load the CSV file with a SQL statement such as:
LOAD DATA INFILE "/tmp/test.csv"
INTO TABLE test
COLUMNS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
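Once the rows are in the test table you can join them against your existing data. A hypothetical example, assuming the CSV carries a customer_id column and you have a customers table:

SELECT c.*
FROM customers AS c
JOIN test AS t ON t.customer_id = c.id;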
You can use other ways to import the data, see: How to import CSV file to MySQL table.
I tried this SQL solution on Ubuntu 14.04 with MySQL 5.6. For it to work you will have to put the test.csv file in the /tmp directory and run chmod 755 test.csv; otherwise MySQL gives "Permission denied" errors. More about this issue: LOAD DATA INFILE Error Code : 13
How can I load 10,000 rows from a test.xls file into a MySQL table?
When I use the query below, it shows an error.
LOAD DATA INFILE 'd:/test.xls' INTO TABLE karmaasolutions.tbl_candidatedetail (candidate_firstname,candidate_lastname);
My primary key is candidateid. I have added rows starting from candidateid 61 because candidates up to 60 are already in the table.
Please suggest a solution.
Export your Excel spreadsheet to CSV format.
Import the CSV file into MySQL using a command similar to the one you are currently trying. Note that LOAD DATA expects tab-separated fields by default, so tell it about the commas:
LOAD DATA INFILE 'd:/test.csv'
INTO TABLE karmaasolutions.tbl_candidatedetail
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(candidate_firstname, candidate_lastname);
Importing data from Excel (or any other program that can produce a text file) is very simple using the LOAD DATA command from the MySQL command prompt.
1. Save your Excel data as a CSV file (in Excel 2007 use Save As).
2. Check the saved file with a text editor such as Notepad to see what it actually looks like, i.e. which delimiter was used etc.
3. Start the MySQL command-line client (I'm lazy, so I usually do this from the MySQL Query Browser via Tools > MySQL Command Line Client to avoid having to enter the username and password).
4. Enter this command:
LOAD DATA LOCAL INFILE 'C:/temp/yourfile.csv'
INTO TABLE database.table
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(field1, field2);
Done! Very quick and simple once you know it :)
Some notes from my own import (they may not apply to you if you run a different language version, MySQL version, Excel version etc.):
TERMINATED BY: this is why I included step 2. I thought a CSV would default to comma-separated, but at least in my case semicolon was the default.
ENCLOSED BY: my data was not enclosed by anything, so I left this as the empty string ''.
LINES TERMINATED BY: at first I tried with only '\n' but had to add the '\r' to get rid of a carriage return character being imported into the database.
Also make sure that if you do not import into the primary key field/column, it has auto increment on; otherwise only the first row will be imported.
Original Author reference
I am working with a large database, 1.5 GB in size, with hundreds of tables and fields. I need to convert all of its tables into CSV files. phpMyAdmin does not do this easily and times out.
I would rather use a shell/MySQL command or a script to get the data out into CSV.
Note:
I am looking to export ALL tables of the database in one shot. I cannot produce an export command for every single table individually.
You can use mysqldump:
The mysqldump command can also generate output in CSV, other delimited text, or XML format.
In particular, look at the following arguments:
--tab=path
--fields-[optionally-]enclosed-by
--fields-escaped-by
--fields-terminated-by
--lines-terminated-by
--no-create-info
You will need to do this table by table; see below.
SELECT *
INTO OUTFILE '/tmp/products.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
FROM products
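Since this must be done table by table, you can let MySQL generate one such statement per table from information_schema. A sketch, with your_database as a placeholder and assuming the default sql_mode (double quotes delimiting strings); run the statements it prints to export each table:

SELECT CONCAT(
  "SELECT * INTO OUTFILE '/tmp/", table_name, ".csv' ",
  "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' ESCAPED BY '\\\\' ",
  "LINES TERMINATED BY '\\n' FROM ", table_name, ";"
) AS export_stmt
FROM information_schema.tables
WHERE table_schema = 'your_database';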
Note that the directory must be writable by the MySQL database server. If it's not, you'll get an error message like this:
#1 - Can't create/write to file '/tmp/products.csv' (Errcode: 13)
Also note that it will not overwrite the file if it already exists, instead showing this error message:
#1086 - File '/tmp/products.csv' already exists
Source: http://www.electrictoolbox.com/mysql-export-data-csv/
Information about the software: sql2csv
Download link (exe): http://www.convert-in.com/demos/sql2csv.exe
This is the best option I found for Windows. With this software you can connect to local and remote DB servers and select a schema; in one shot it extracts all table data into valid CSV files.
I have a CSV file that I want to read with Ruby and create Ruby objects to insert into a MySQL database with Active Record. What's the best way to do this? I see two clear options: FasterCSV & the Ruby core CSV. Which is better? Is there a better option that I'm missing?
EDIT: Gareth says to use FasterCSV, so what's the best way to read a CSV file using FasterCSV? Looking at the documentation, I see methods called parse, foreach, read, open... It says that foreach "is intended as the primary interface for reading CSV files." So, I guess I should use that one?
Ruby 1.9 adopted FasterCSV as its core CSV processor, so I would say it's definitely better to go with FasterCSV, even if you're still using Ruby 1.8.
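To answer the EDIT: yes, foreach is the one you want for reading a file row by row. A sketch, assuming a customers.csv with name and email headers and an ActiveRecord Customer model:

require "csv"  # on Ruby 1.9 this is FasterCSV under the hood

CSV.foreach("customers.csv", headers: true) do |row|
  # each row acts like a hash keyed by the header names
  Customer.create!(name: row["name"], email: row["email"])
end

On Ruby 1.8 the same code works with require "fastercsv" and FasterCSV.foreach.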
If you have a lot of records to import you might want to use MySQL's loader. It's going to be extremely fast.
LOAD DATA INFILE can be used to read files obtained from external sources. For example, many programs can export data in comma-separated values (CSV) format, such that lines have fields separated by commas and enclosed within double quotation marks, with an initial line of column names. If the lines in such a file are terminated by carriage return/newline pairs, the statement shown here illustrates the field- and line-handling options you would use to load the file:
LOAD DATA INFILE 'data.txt' INTO TABLE tbl_name
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
If the input values are not necessarily enclosed within quotation marks, use OPTIONALLY before the ENCLOSED BY keywords.
Use that to pull everything into a temporary table, then use ActiveRecord to run queries against it: delete the records you don't want, copy the rest from the temp table to your production one, then drop or truncate the temp table. Alternatively, use ActiveRecord to search the temporary table and copy the records to production, then drop or truncate the temp table. You might even be able to do a table-to-table copy inside MySQL, or append one table to another.
It's going to be tough to beat the speed of the dedicated loader, and using the database's query mechanism to process records in bulk. The step of turning a record in the CSV file into an object, then using the ORM to write it to the database adds a lot of extra overhead, so unless you have some super difficult validations requiring Ruby's logic, you'll be faster going straight to the database.
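A hypothetical sketch of that flow, driving the loader from ActiveRecord (the table and column names are assumptions, and LOCAL INFILE must be enabled on both client and server):

# load the raw CSV into a temp table via MySQL's loader
ActiveRecord::Base.connection.execute(<<'SQL')
LOAD DATA LOCAL INFILE '/tmp/data.csv' INTO TABLE temp_records
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
SQL

# copy only the rows you want into production, then clean up
ActiveRecord::Base.connection.execute(
  "INSERT INTO records SELECT * FROM temp_records WHERE age IS NOT NULL"
)
ActiveRecord::Base.connection.execute("TRUNCATE temp_records")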
EDIT: Here's a simple CSV header to DB column mapper example:
require "csv"
data = <<EOT
header1, header2, header 3
1, 2, 3
2, 2, 3
3, 2, 3
EOT
header_to_table_columns = {
'header1' => 'col1',
'header2' => 'col2',
'header 3' => 'col3'
}
arr_of_arrs = CSV.parse(data)
headers = arr_of_arrs.shift.map{ |i| i.strip }
db_cols = header_to_table_columns.values_at(*headers)
arr_of_arrs.each do |ary|
# insert into the database using an ORM or by creating insert statements
end