Way to create and load multiple tables of the same schema - MySQL

I have 200 tab-delimited files that I want to load into a MySQL database. Is there any way to automate the CREATE TABLE command for creating the schema for all 200 tables, and to load those 200 tables automatically?
The thing is, I would otherwise have to run the CREATE TABLE query and the load 200 times each, so is there any way to automate it?

The CREATE TABLE command creates one table. You can run 200 CREATE TABLE statements in one SQL script, but the schema would have to be written out for each table.
The only way you could automate multiple CREATE TABLEs is if all your tables were exactly the same, in which case you could use a loop to run the CREATE TABLE SQL as many times as you want (see the sketch below). The thing is, if you have more than one table that is exactly the same, you have other problems. So in general the answer is no.
There are various tools that can import your tab-delimited files and create the tables for you, but you would still have to run the import 200 times.
On the plus side, you only have to import them once. At that point you can easily export all the tables to a single SQL file, leaving you with a single import of your tables for the future.
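If the 200 tables genuinely share one schema, the loop approach can be sketched as a stored procedure that builds each CREATE TABLE through a prepared statement (the table prefix and column list below are placeholders, not taken from the question). LOAD DATA cannot run inside a prepared statement, so the 200 loads are easiest to script client-side, for example with mysqlimport, which targets the table named after each file:

-- Hypothetical sketch: create data_1 ... data_200 with an identical schema.
DELIMITER //
CREATE PROCEDURE create_tables()
BEGIN
  DECLARE i INT DEFAULT 1;
  WHILE i <= 200 DO
    -- The column list here is a placeholder; substitute the real schema.
    SET @ddl = CONCAT('CREATE TABLE data_', i, ' (col1 INT, col2 VARCHAR(255))');
    PREPARE stmt FROM @ddl;
    EXECUTE stmt;
    DEALLOCATE PREPARE stmt;
    SET i = i + 1;
  END WHILE;
END //
DELIMITER ;

CALL create_tables();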

Related

Fastest/cleanest way to update a database (large MySQL tables)

I have a website fed from large MySQL tables (more than 50k rows in some tables). Let's call one of them "MotherTable". Every night I update the site with a new CSV file (produced locally) that has to replace MotherTable's data.
The way I currently do this (I am not an expert, as you can see) is:
- First, I TRUNCATE the MotherTable table.
- Second, I import the CSV file into the empty table, with columns separated by "/" and skipping 1 line.
Since the CSV file is not small, there are a few seconds (or even a minute) during which MotherTable is empty, so web users running SELECTs on this table find nothing.
Obviously, I don't like that. Is there any procedure to update MotherTable in a way that users notice nothing? If not, what would be the quickest way to update the table with the new CSV file?
Thank you!
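One common way to avoid that empty window (a sketch, not from the original thread; the file path is a placeholder) is to load the CSV into a staging copy of the table and then swap the two with RENAME TABLE, which is atomic, so readers always see either the old data or the new data:

CREATE TABLE MotherTable_new LIKE MotherTable;

LOAD DATA INFILE '/path/to/nightly.csv'
INTO TABLE MotherTable_new
FIELDS TERMINATED BY '/'
IGNORE 1 LINES;

-- The swap is atomic: no SELECT ever sees an empty table.
RENAME TABLE MotherTable TO MotherTable_old,
             MotherTable_new TO MotherTable;

DROP TABLE MotherTable_old;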

Importing a MySQL database into Neo4j

I have a MySQL database on a remote server which I am trying to migrate to a Neo4j database. For this I dumped the individual tables into CSV files and am now planning to use the LOAD CSV functionality to create graphs from the tables.
How does loading each table preserve the relationships between tables?
In other words, how can I generate a graph for the entire database and not just a single table?
- Load each table as a CSV.
- Create indexes on your relationship field (Neo4j only does single-property indexes).
- Use MATCH() to locate related records between the tables.
- Use MERGE(a)-[:RELATIONSHIP]->(b) to create the relationship between the tables.
Don't run this "all at once": it will create one large transaction, will not run to completion, and will most likely crash with a heap error. Getting around that issue requires loading the CSVs first, then creating the relationships in batches of 10K-100K transaction blocks.
One way to accomplish that goal is:
MATCH (a:LabelA)
MATCH (b:LabelB {id: a.id}) WHERE NOT (a)-[:RELATIONSHIP]->(b)
WITH a, b LIMIT 50000
MERGE (a)-[:RELATIONSHIP]->(b)
What this does is find :LabelB records that don't yet have a relationship with the matching :LabelA records, and then creates that relationship for the first 50,000 pairs it finds. Running this repeatedly will eventually create all the relationships you want.

Infer table structure from file in MySQL

Another posting said there is a way to infer the table columns from a data file using phpMyAdmin. I haven't found documentation on this; can you point me to it? Does it only use the header row, or does it also sample the data to infer the data types?
I'm trying to create several tables in MySQL from data files that have roughly 100 columns each, so I don't want to write the SQL DDL to create the tables by hand.
Thanks!

How to use LOAD DATA INFILE to insert into multiple tables?

I use a Python program which inserts many new entries into the database; these new entries are spread across multiple tables.
I'm using LOAD DATA INFILE to load the file, but this solution only works for one table, and I don't feel like doing this multiple times.
I found http://forge.mysql.com/worklog/task.php?id=875 but I'm not quite sure whether it's already implemented or not.
I am doing exactly what you are trying to do, as follows:
Step 1: Create a temp table holding all the fields of the import file.
Step 2: LOAD DATA LOCAL INFILE into the temp table.
Step 3: INSERT INTO Table1 (fieldlist) SELECT matching-fieldlist FROM TempTable ... including JOINs, WHERE clauses, and ON DUPLICATE KEY UPDATE as necessary.
Step 4: Repeat step 3 with the second table's insert query, and so on.
Using this method I am currently importing each of my 22MB data files and parsing them out to multiple tables (6 tables, including 2 audit/changes tables).
Without knowing your table structure and data file structure it is difficult to give a more detailed explanation, but I hope this helps get you started; a concrete sketch follows below.
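As a sketch of steps 1-4 (all table, column, and file names below are invented for illustration, and the ON DUPLICATE KEY UPDATE assumes a UNIQUE key on customers.name):

-- Step 1: a temp table mirroring the import file's fields.
CREATE TEMPORARY TABLE staging (
  customer_name VARCHAR(100),
  customer_email VARCHAR(100),
  order_amount DECIMAL(10,2)
);

-- Step 2: bulk-load the file into the temp table.
LOAD DATA LOCAL INFILE '/path/to/import.txt'
INTO TABLE staging
FIELDS TERMINATED BY '\t';

-- Step 3: fan the rows out to the first real table.
INSERT INTO customers (name, email)
SELECT DISTINCT customer_name, customer_email FROM staging
ON DUPLICATE KEY UPDATE email = VALUES(email);

-- Step 4: repeat for the next table, joining back as needed.
INSERT INTO orders (customer_id, amount)
SELECT c.id, s.order_amount
FROM staging s
JOIN customers c ON c.name = s.customer_name;

DROP TEMPORARY TABLE staging;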
Using LOAD DATA from a local file to insert new data across multiple tables isn't yet supported (as of v5.1).
I don't think LOAD DATA can do that, but why not duplicate the table after importing?
See
Duplicating table in MySQL without copying one row at a time
Or, if you can go outside MySQL, Easiest way to copy a MySQL database?
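For reference, the duplication described in that first link boils down to two statements (table names are placeholders):

CREATE TABLE new_table LIKE source_table;          -- copies column definitions and indexes
INSERT INTO new_table SELECT * FROM source_table;  -- copies the rows in bulk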

Can I import tab-separated files into MySQL without creating database tables first?

As the title says: I've got a bunch of tab-separated text files containing data.
I know that if I use CREATE TABLE statements to set up all the tables manually, I can then import the files into the waiting tables using LOAD DATA or mysqlimport.
But is there any way in MySQL to create tables automatically based on the tab files? Seems like there ought to be. (I know that MySQL might have to guess the data type of each column, but you could specify that in the first row of the tab files.)
No, there isn't. You need to CREATE a TABLE first in any case.
Automatically creating tables and guessing field types is not part of the DBMS's job. That is a task best left to an external tool or application (which then generates the necessary CREATE statements).
If you're willing to type the data types into the first row, why not write a proper CREATE TABLE statement instead?
Then you can export the Excel data as a txt file and use
LOAD DATA INFILE 'path/file.txt' INTO TABLE your_table;
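For example, a minimal hand-written schema for a three-column tab file might look like the following (the column names and types are placeholder guesses); since LOAD DATA's default field terminator is already a tab, the statement above then works unchanged:

CREATE TABLE your_table (
  id      INT,
  name    VARCHAR(100),
  created DATE
);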