How to use LOAD DATA INFILE to insert into multiple tables? - mysql

I use a Python program which inserts many new entries into the database;
these new entries are spread across multiple tables.
I'm using LOAD DATA INFILE to load the file, but this solution only works for a single table, and I don't want to repeat it for every table.
I found http://forge.mysql.com/worklog/task.php?id=875 but I'm not quite
sure whether it has already been implemented or not.

I am doing exactly what you are trying to do, as follows:
Step 1: Create a temp table (holding all the fields of the import file)
Step 2: LOAD DATA LOCAL INFILE -> into the temp table
Step 3: INSERT INTO Table1 ( fieldlist ) SELECT fieldlist FROM TempTable ... include JOINs, WHERE clauses, and ON DUPLICATE KEY UPDATE as necessary
Step 4: Repeat step 3 with the second table's insert query, and so on.
Using this method I am currently importing each of my 22MB data files and parsing them out to multiple tables (6 tables, including 2 audit/changes tables).
Without knowing your table structure and data file structure it is difficult to give a more detailed explanation, but I hope this helps get you started.
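A minimal sketch of the steps above; every table, column, and file name here is a hypothetical placeholder for your own schema:

```sql
-- Step 1: staging table mirroring the layout of the import file.
CREATE TEMPORARY TABLE TempTable (
  customer_name VARCHAR(100),
  order_total   DECIMAL(10,2)
);

-- Step 2: bulk-load the whole file into the staging table.
LOAD DATA LOCAL INFILE '/path/to/import.csv'
INTO TABLE TempTable
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
IGNORE 1 LINES;

-- Step 3: distribute the staged rows into the first real table.
-- (Assumes Customers.name has a unique key so duplicates collapse.)
INSERT INTO Customers (name)
SELECT DISTINCT customer_name FROM TempTable
ON DUPLICATE KEY UPDATE name = VALUES(name);

-- Step 4: repeat for the next target table, joining back as needed.
INSERT INTO Orders (customer_id, total)
SELECT c.id, t.order_total
FROM TempTable t
JOIN Customers c ON c.name = t.customer_name;
```

Because the staging table is TEMPORARY, it is dropped automatically when the session ends.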

Using LOAD DATA LOCAL INFILE to insert new data across multiple tables isn't yet supported (as of v5.1).

I don't think LOAD DATA can do that, but why not duplicate the table after importing?
See
Duplicating table in MYSQL without copying one row at a time
Or, if you can go outside MySQL, Easiest way to copy a MySQL database?

Related

Importing MySQL database to Neo4j

I have a mysql database on a remote server which I am trying to migrate into Neo4j database. For this I dumped the individual tables into csv files and am now planning to use the LOAD CSV functionality to create graphs from the tables.
How does loading each table preserve the relationship between tables?
In other words, how can I generate a graph for the entire database and not just a single table?
Load each table as a CSV
Create indexes on your relationship field (Neo4j only does single property indexes)
Use MATCH() to locate related records between the tables
Use MERGE(a)-[:RELATIONSHIP]->(b) to create the relationship between the tables.
If you run this "all at once", it will create one large transaction, which won't run to completion and will most likely crash with a heap error. To get around that issue, load the CSVs first, then create the relationships in batches of 10K-100K transaction blocks.
One way to accomplish that goal is:
MATCH (a:LabelA)
MATCH (b:LabelB {id: a.id}) WHERE NOT (a)-[:RELATIONSHIP]->(b)
WITH a, b LIMIT 50000
MERGE (a)-[:RELATIONSHIP]->(b)
What this does is find :LabelB records that don't have a relationship with the :LabelA records and then creates that relationship for the first 50,000 records it finds. Running this repeatedly will eventually create all the relationships you want.

How to replace a column simultaneously with LOAD DATA INFILE in MySQL

Suppose we have a table with a DECIMAL column containing values such as 128.98, 283.98, 21.20.
I want to import some CSV files into this table. However, in the columns of these files, I have values like 235,69 and 23,23, with a comma instead of a decimal point.
I know I can REPLACE that column afterwards, but is there some way of doing it during or before LOAD DATA INFILE?
I do not believe you can simultaneously replace that column and load the data. It looks like you will have to do multiple steps to get the results you want.
1. Load the data into a raw table first using the LOAD DATA INFILE command. This table can be identical to the main table; you can use the CREATE TABLE ... LIKE command to create it.
2. Process the data (i.e. change the comma to a point where applicable) in the raw table.
3. Select the data from the raw table and insert it into the main table, either with row-by-row processing or a bulk insert.
This can all be done in a stored procedure (SP) or by a 3rd-party script written in Python, PHP, etc.
If you want to know more about SPs in MySQL, here is a useful link.
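A minimal sketch of those steps, assuming a main table `prices` with a DECIMAL column `amount`; the table name, column name, and the ';' field delimiter (common in CSVs that use decimal commas) are all placeholders:

```sql
-- 1. Staging table with the same shape as the main table,
--    but the decimal column widened to text so the comma survives the load.
CREATE TABLE prices_raw LIKE prices;
ALTER TABLE prices_raw MODIFY amount VARCHAR(20);

-- Load the CSV as-is into the staging table.
LOAD DATA INFILE '/path/to/prices.csv'
INTO TABLE prices_raw
FIELDS TERMINATED BY ';' LINES TERMINATED BY '\n';

-- 2. Swap the decimal comma for a point.
UPDATE prices_raw SET amount = REPLACE(amount, ',', '.');

-- 3. Bulk-insert into the main table; MySQL casts the string back to DECIMAL.
INSERT INTO prices SELECT * FROM prices_raw;
DROP TABLE prices_raw;
```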

Way to create and load multiple tables of same schema

I have 200 tab-delimited files which I want to load into a MySQL database. Is there any way to automate the CREATE TABLE command for creating the schema for all 200 tables, and to load those 200 tables automatically?
The thing is, I would otherwise have to run the CREATE TABLE query and the table load 200 times each, so is there any way to automate it?
The CREATE TABLE command creates one table. You can run 200 CREATE TABLE statements in one SQL script, but the schema would still have to be written out for each table.
The only way you could automate multiple CREATE TABLE statements is if all your tables were exactly the same, in which case you could use a FOR LOOP to run the CREATE TABLE SQL as many times as you want. The thing is, if you have more than one table that is exactly the same, you have other problems. So the answer is no.
There is various software that can import your tab-delimited files and create the tables for you, but you will still have to run the import 200 times.
On the plus side, you only have to import them once. At that point you can easily export all the tables to a single SQL file, leaving you with a single import of your tables for the future.
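Even without automatic schema inference, all 200 statements can live in one script that you run once; each file contributes a pair like this (file name, table name, and columns are placeholders):

```sql
-- Repeat this pair once per file, e.g. emitted by a small generator script.
CREATE TABLE measurements_001 (
  recorded_at DATETIME,
  sensor_id   INT,
  reading     DECIMAL(8,3)
);

LOAD DATA LOCAL INFILE '/data/measurements_001.tsv'
INTO TABLE measurements_001
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n';
```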

java and mysql load data infile misunderstanding

Thanks for viewing this. I need a little bit of help for this project that I am working on with MySql.
For part of the project I need to load a few things into a MySql database which I have up and running.
The info that I need, for each column in the table Documentation, is stored into text files on my hard drive.
For example, one column in the documentation table is "ports" so I have a ports.txt file on my computer with a bunch of port numbers and so on.
I tried to run this MySQL script through phpMyAdmin:
LOAD DATA INFILE 'C:\\ports.txt' INTO TABLE `Documentation` (`ports`);
It ran successfully, so I went on to the other load I needed, which was
LOAD DATA INFILE 'C:\\vlan.txt' INTO TABLE `Documentation` (`vlans`)
This also completed successfully, but it added all the rows in the vlans column AFTER the last entry in the ports column.
Why did this happen? Is there anything I can do to fix this? Thanks
Why did this happen?
LOAD DATA inserts new rows into the specified table; it doesn't update existing rows.
Is there anything I can do to fix this?
It's important to understand that MySQL doesn't guarantee that tables will be kept in any particular order. So, after your first LOAD, the order in which the data were inserted may be lost & forgotten - therefore, one would typically relate such data prior to importing it (e.g. as columns of the same record within a single CSV file).
You could LOAD your data into temporary tables that each have an AUTO_INCREMENT column and hope that such auto-incremented identifiers remain aligned between the two tables (MySQL makes absolutely no guarantee of this, but in your case you should find that each record is numbered sequentially from 1); once there, you could perform a query along the following lines:
INSERT INTO Documentation (ports, vlans) SELECT port, vlan FROM t_Ports JOIN t_Vlan USING (id);
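A sketch of that approach; the `t_Ports`/`t_Vlan` staging tables and their `port`/`vlan` columns are assumptions for illustration, and, as noted above, the line-number alignment via AUTO_INCREMENT is not something MySQL guarantees:

```sql
-- Staging tables: the AUTO_INCREMENT id numbers each loaded line 1, 2, 3, ...
CREATE TABLE t_Ports (id INT AUTO_INCREMENT PRIMARY KEY, port VARCHAR(50));
CREATE TABLE t_Vlan  (id INT AUTO_INCREMENT PRIMARY KEY, vlan VARCHAR(50));

LOAD DATA INFILE 'C:\\ports.txt' INTO TABLE t_Ports (port);
LOAD DATA INFILE 'C:\\vlan.txt'  INTO TABLE t_Vlan  (vlan);

-- Pair line 1 with line 1, line 2 with line 2, and so on.
INSERT INTO Documentation (ports, vlans)
SELECT port, vlan FROM t_Ports JOIN t_Vlan USING (id);
```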

Can I import tab-separated files into MySQL without creating database tables first?

As the title says: I've got a bunch of tab-separated text files containing data.
I know that if I use 'CREATE TABLE' statements to set up all the tables manually, I can then import them into the waiting tables, using 'load data' or 'mysqlimport'.
But is there any way in MySQL to create tables automatically based on the tab files? Seems like there ought to be. (I know that MySQL might have to guess the data type of each column, but you could specify that in the first row of the tab files.)
No, there isn't. You need to CREATE a TABLE first in any case.
Automatically creating tables and guessing field types is not part of the DBMS's job. That is a task best left to an external tool or application (which then generates the necessary CREATE statements).
If you're willing to type the data types in the first row, why not type a proper CREATE TABLE statement instead?
Then you can export the Excel data as a .txt file and use
LOAD DATA INFILE 'path/file.txt' INTO TABLE your_table;
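For a tab-separated file, the matching CREATE TABLE plus explicit LOAD DATA options might look like this (the column names and types are placeholders for whatever your file actually contains):

```sql
CREATE TABLE your_table (
  name  VARCHAR(100),
  qty   INT,
  price DECIMAL(10,2)
);

-- Tab-delimited fields are the LOAD DATA default, but stating the
-- terminators makes the intent explicit.
LOAD DATA INFILE 'path/file.txt'
INTO TABLE your_table
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n';
```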