Importing tons of CSV data into new MySQL tables in one database

I've got a problem, and after some hours of research I'm about ready to give up.
Is there a way to import lots of CSV files into one MySQL database, creating a new table named after each CSV file?
Example: if I import data1.csv into the db, the table should be named data1 and contain all the data from data1.csv.
Thanks for your suggestions and answers.

There is no built-in tool, command, or query to accomplish what you want within MySQL alone.
You will need two parts:
1st, of course, your MySQL DB where the tables will be created.
2nd, some third-party program that can interact with your DB, e.g. Java, JavaScript, Python, or even a Unix shell script.
The following is a pseudo example of what will be needed.
What this program has to do is relatively simple.
It requires a couple of inputs:
Database IP, username, password (these can be parameters passed into your program or, for simplicity while testing, hard-coded directly into it).
The next input is your file name, e.g. data1.csv.
From these inputs the program harvests the table name 'data1' from the file name and reads the first row of data1.csv to name each column.
Once the program has collected this info, it can connect to the DB and run a statement such as CREATE TABLE TableName (ColumnName1 VARCHAR(255), ColumnName2 VARCHAR(255), etc.).
Finally it can run a MySQL command to import the *.csv file into the newly created table, e.g.
LOAD DATA LOCAL INFILE 'C:/Stuff/csvFiles/Data1.csv'
INTO TABLE `SchemaName`.`Data1`
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
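A rough sketch of that helper program in Python, using only the standard library. The schema name and file paths are placeholders; the generated statements would be executed through whatever MySQL driver you use (e.g. mysql-connector-python), which is left out here:

```python
import csv
import os

def build_statements(csv_path, schema="SchemaName"):
    """Derive the table name from the file name and the column names
    from the header row, then return (CREATE TABLE, LOAD DATA) SQL."""
    table = os.path.splitext(os.path.basename(csv_path))[0]
    with open(csv_path, newline="") as f:
        header = next(csv.reader(f))  # first row names the columns
    cols = ", ".join(f"`{c}` VARCHAR(255)" for c in header)
    create = f"CREATE TABLE `{schema}`.`{table}` ({cols});"
    load = (
        f"LOAD DATA LOCAL INFILE '{csv_path}' "
        f"INTO TABLE `{schema}`.`{table}` "
        "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' "
        "LINES TERMINATED BY '\\r\\n' IGNORE 1 LINES;"
    )
    return create, load
```

Run the two statements in order through your connection for each CSV file; LOCAL lets the client send the file to the server.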
Hope this helps clear up your options and approach a little.

Related

How to export your database after you make some queries in MySQL Workbench?

I'm new to MySQL Workbench, so here is the situation.
I have two tables, 'student' and 'classes', with student ID as the connecting field (a one-to-many relationship: a student can have multiple classes). I wrote some queries that join the two tables, and I want to export the result of my queries rather than the two tables themselves (which I got from the data export wizard).
I've tried to export to a CSV file using the following query but ran into error 1290:
SELECT teacher.student.U_id, teacher.student.F_name, teacher.student.L_name,
teacher.classes.days, teacher.classes.mor, teacher.classes.aft
FROM teacher.student, teacher.classes
WHERE teacher.student.U_id = teacher.classes.U_id
INTO OUTFILE 'C:/Users/Eddie Vu/Downloads/result.csv'
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\r\n';
I expect the output to be stored in a CSV file.
Please help, and thank you in advance.
Your command, if it worked, would create the file on the server machine. Error 1290 means the server is running with the --secure-file-priv option, which restricts (or forbids) where INTO OUTFILE may write; even without that restriction, the target directory must exist and be writable by MySQL.
But I suppose you want to export to a file on the client machine.
Note the "Export/Import" section in the little toolbar above the result grid, and the little button with a disk in front of a grid.
If you click that button, a dialog opens that allows you to save the current result in the grid to a file.
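If you'd rather script the export than click the Workbench button, a client-side approach avoids the server-side restriction behind error 1290 entirely: fetch the rows over the connection and write the CSV yourself. A minimal sketch (the rows and header here are illustrative; with a real MySQL driver they would come from a cursor, as the comment shows):

```python
import csv

def rows_to_csv(rows, header, out_path):
    """Write query results to a CSV file on the *client* machine,
    using the same ';' delimiter and '"' quoting as the query above."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f, delimiter=";", quoting=csv.QUOTE_ALL)
        writer.writerow(header)
        writer.writerows(rows)

# With a real driver (an assumption, not shown here) it would look like:
#   cur.execute("SELECT ... FROM teacher.student JOIN teacher.classes ...")
#   rows_to_csv(cur.fetchall(), [d[0] for d in cur.description], "result.csv")
```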

How to generate a MySQL database tables script using an MVC application with Entity Framework

As part of automation we want to create a MySQL DB script from an existing DB. We can't use EF Code First migrations because there are more than 1000 tables, so a timeout error occurs. The alternative is to generate the script from the existing DB using an MVC application.
If you can export your table data to .csv (comma-separated values) files, you can do an easy migration with LOAD DATA INFILE. My experience is not db-to-db, just Excel to a MySQL db: I exported the Excel sheets as somedata.csv files, created the table I needed, and used this script to import into the MySQL database. I think you can use it.
LOAD DATA INFILE 'somedata.csv' INTO TABLE some_table
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
So, first you need to create your tables' structure again.
Second, you need to export your data to CSV files.
Third, you can import the data by using LOAD DATA INFILE.
It is not the most elegant way, but it can be fast enough: a table with more than a million records imports in about a minute. See the
LOAD DATA INFILE documentation here.

How can I turn a CSV file into a web-based SQL database table, without having any database already setup?

Currently, I have a CSV file with data in it. I want to turn it into a SQL table, so I can run SQL queries on it. I want the table to be within a web-based database that others in my organization can also access. What's the easiest way to go from CSV file to this end result? I would appreciate insight on setting up the database and table, giving others access, and getting the data inside. Preferably PostgreSQL, but MySQL is fine too.
How to create the table depends on the number of columns you have. If you have only a few, do it manually:
CREATE TABLE <table name> (<column name> <column type, e.g. INT or VARCHAR(100)>, <etc.>);
If you have many columns, you can open the CSV file in Excel and get 'SQL Converter for Excel', which will build a CREATE statement for you using your column headings (and auto-detect column types too).
Loading data from a csv is also pretty straightforward:
LOAD DATA INFILE <filepath, e.g. 'C:/Users/<username>/Desktop/test.csv'>
INTO TABLE <table name>
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS; -- only include this line if the CSV has a header row
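If you'd rather not rely on an Excel add-in, generating the CREATE statement from the header row with naive type detection is easy to script. A sketch in Python, under the assumption that guessing only between INT and VARCHAR(100) is good enough (table and file names are placeholders):

```python
import csv

def infer_create_table(csv_path, table):
    """Build a CREATE TABLE statement from a CSV header, guessing
    INT for all-integer columns and VARCHAR(100) otherwise."""
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)   # column names from the first row
        data = list(reader)     # remaining rows, used for type detection
    cols = []
    for i, name in enumerate(header):
        values = [row[i] for row in data]
        is_int = bool(values) and all(v.lstrip("-").isdigit() for v in values)
        cols.append(f"`{name}` {'INT' if is_int else 'VARCHAR(100)'}")
    return f"CREATE TABLE `{table}` ({', '.join(cols)});"
```

Run the returned statement first, then the LOAD DATA INFILE above to fill the table.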
As for a web-based solution: https://cloud.google.com/products/cloud-sql/
That's a relatively open-ended question. A couple of noteworthy pointers off the top of my head:
MySQL allows you to store your data in different formats, one of them being CSV. That's a very straightforward solution if you're happy with it and don't mind a few limitations (see http://dev.mysql.com/doc/refman/5.0/en/csv-storage-engine.html).
Otherwise you can import your data into a table with a full-featured engine (see other answer(s) for details).
If you're happy with PostgreSQL and are looking for a fully web-based solution, have a look at Heroku.
There are a great many ways to make your data available through web services without accessing the back-end data store directly. Have a look at REST and SOAP for instance.
HTH

Insert Magento details to external database

I need to insert data from Magento's place-order form into an external database. Please give details about how I can achieve it.
Currently, when we click Place Order the data is inserted into the table sales_flat_order; I need to save it into an external DB as well.
As I am new to Magento, please don't mind if this is a simple thing.
When you say external DB, does that mean another database on the same box? Or a remote database on another box? Will the table remain the same, or are all the fields and additional information different?
Approaches:
API: http://www.magentocommerce.com/api/rest/Resources/Orders/sales_orders.html
If it's a remote box, you can use the REST API to pull the orders (once the API is active, the role is created, and the user is assigned and connected) and push the returned information to the new box programmatically.
Dataflow:
You can set up a dataflow for exporting the order information, pull in the CSV/XML, parse it, and upload the needed parts to the new DB.
Dataflow Extension:
Same as above, but instead of doing all the programming yourself, you can install an extension like http://www.wyomind.com/orders-export-tool-magento.html and have it FTP the information to a remote server so you can check/parse the file into the new DB as needed.
Can you reveal a bit more about the environment, the amount of data/orders, etc?
Thanks.
--- Update:
Per your response, it sounds less of a Magento question here and more of a MySQL question.
In this case, you can do something as simple as "replicating" or copying the table data over to your other local DB.
If you're not working with too many orders, the following may meet your needs as a one-time deal. If you're dealing with a substantial number of orders, the approach may need to be expanded.
##Direct Copy:
#using stage_magento to represent your other DB
#assuming this is done with a user that has correct permissions on both databases.
#create the table
CREATE TABLE stage_magento.sales_flat_order LIKE production_magento.sales_flat_order;
#copy the data
INSERT INTO stage_magento.sales_flat_order SELECT * FROM production_magento.sales_flat_order;
#####################
## Option 2, export to file system, import to new db
##Indirect, Export from DB/Table
SELECT * FROM production_magento.sales_flat_order INTO OUTFILE '/tmp/sales_flat_order.csv' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '\\' LINES TERMINATED BY '\n' ;
##Import into New DB/Table
LOAD DATA INFILE '/tmp/sales_flat_order.csv' INTO TABLE stage_magento.sales_flat_order FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '\\' LINES TERMINATED BY '\n' ;

Replace contents of MySQL table with contents of csv file on a remote server

I'm a newbie here trying to import some data into my WordPress database (MySQL), and I wonder if any of you SQL experts out there can help.
Database type: MySQL
Table name: wp_loans
I would like to completely replace the data in table wp_loans with the contents of file xyz.csv located on a remote server, for example https://www.mystagingserver.com/xyz.csv
All existing data in the table should be replaced with the contents of the CSV file.
The 1st row of the CSV file is the table headings so can be ignored.
I'd also like to automate the script to run daily at say 01:00 in the morning if possible.
UPDATE
Here is the SQL I'm using to try and replace the table contents:
LOAD DATA INFILE 'https://www.mystagingserver.com/xyz.csv'
REPLACE
INTO TABLE wp_loans
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
IGNORE 1 LINES
I would recommend a cron job to automate the process. (BCP, bulk copy, is a SQL Server tool; since you are using MySQL, use LOAD DATA INFILE instead - https://mariadb.com/kb/en/load-data-infile/.) Note that LOAD DATA INFILE cannot read directly from an HTTPS URL: you need to download the file to the machine first, then load it.
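Putting the pieces together, here is one possible sketch of the daily job in Python: download the CSV, then wipe and reload wp_loans. The URL and table name are from the question; the local path is a placeholder, and the actual execution of the SQL through a MySQL driver is an assumption left as a comment:

```python
import urllib.request

CSV_URL = "https://www.mystagingserver.com/xyz.csv"
LOCAL_PATH = "/tmp/xyz.csv"  # placeholder path on the machine running the job

def build_refresh_sql(local_path):
    """SQL to empty wp_loans and reload it from the downloaded file,
    skipping the header row (the first row holds column headings)."""
    return [
        "TRUNCATE TABLE wp_loans;",
        f"LOAD DATA LOCAL INFILE '{local_path}' "
        "INTO TABLE wp_loans "
        "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' "
        "IGNORE 1 LINES;",
    ]

def refresh():
    # LOAD DATA INFILE cannot read a URL, so fetch the file first
    urllib.request.urlretrieve(CSV_URL, LOCAL_PATH)
    for stmt in build_refresh_sql(LOCAL_PATH):
        pass  # execute stmt through your MySQL connection here
```

To run it daily at 01:00, a crontab entry such as `0 1 * * * /usr/bin/python3 /path/to/refresh_wp_loans.py` would do.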