I have a Django application which is deployed on GAE. I have the same models on the prod server and the dev server; however, the content of the two databases is different.
I'd like to run some tests on that data without touching the actual data in the cloud. Is there any way I can pull the data from my Cloud SQL instance into my local MySQL DB?
Assuming you can start fresh in development (empty tables), you can keep your auto_increment primary keys and foreign key constraints in place there.
Perform
SELECT * INTO OUTFILE '/full/path/to/fileParentXXX.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM ParentXXX;
(The same concept applies to the other tables.) Grab those exported CSV (comma-separated values) text files and bring them back over the wire to the development server.
Perform LOAD DATA INFILE on development, loading parent tables first, then the child tables whose foreign key constraints depend on them. The auto_increment values should remain happy in development.
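For illustration, a minimal sketch of that import step in Python using the mysql-connector-python package (the child table name, file paths, and connection settings below are placeholders for the example, not anything from your setup):
# Minimal sketch: load the exported files into the development database,
# parent tables before child tables, so foreign key checks stay satisfied.
# Host, credentials, table names, and file paths are placeholders.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="dev_user", password="dev_pass", database="dev_db"
)
cursor = conn.cursor()

# Order matters: parents first, then the children that reference them.
tables_in_load_order = ["ParentXXX", "ChildYYY"]

for table in tables_in_load_order:
    cursor.execute(f"""
        LOAD DATA INFILE '/full/path/to/file{table}.txt'
        INTO TABLE {table}
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
        LINES TERMINATED BY '\\n'
    """)

conn.commit()
cursor.close()
conn.close()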
The MySQL manual page for LOAD DATA is linked here.
I got a problem, and after some hours of research I just want to die.
Is there a way to import lots of CSV files into one MySQL database, creating a new table named after each CSV file?
Example: if I import data1.csv into the DB, the table should be named data1 and contain all the data from data1.csv.
Thanks for your suggestions and answers.
There is no built-in tool/method/command/query to accomplish what you desire within MySQL alone.
What will be required has two parts:
1st: of course, your MySQL DB, where the table will be created.
2nd: some third-party program that can interact with your DB, e.g. Java, JavaScript, Python, or even Unix shell scripting.
Following is a pseudo-example of what will be needed.
What this program will have to do is relatively simple.
It will require a couple of inputs:
Database IP, username, and password (these can be parameters passed into your program or, for simplicity of testing, hard-coded directly into the program).
The next input will be your file name, e.g. data1.csv.
Using these inputs, the program will harvest the 'data1' name as well as the first row of data1.csv to name each column.
Once the program collects this info, it can connect to the DB and run a MySQL statement such as CREATE TABLE TableName (ColumnName1 VARCHAR(255), ColumnName2 VARCHAR(255), etc.).
Finally, it can run a MySQL command to import the *.csv file into the newly created table, e.g.:
LOAD DATA LOCAL INFILE 'C:/Stuff/csvFiles/Data1.csv'
INTO TABLE `SchemaName`.`Data1`
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;  -- skip the header row that was used to name the columns
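Putting those pieces together, here is a rough sketch of such a program in Python (one of the languages mentioned above). It assumes the mysql-connector-python package and that LOCAL INFILE is enabled on both the client and the server; the connection details, schema name, and file path are hard-coded placeholders:
# Pseudo-example made concrete: create a table named after the CSV file,
# use its header row for the column names, then bulk-load the file.
# Host, user, password, schema, and the CSV path are placeholders.
import csv
import os
import mysql.connector

csv_path = "C:/Stuff/csvFiles/Data1.csv"
table_name = os.path.splitext(os.path.basename(csv_path))[0]   # -> "Data1"

# Read the first row of the file to get the column names.
with open(csv_path, newline="") as f:
    header = next(csv.reader(f))

conn = mysql.connector.connect(
    host="127.0.0.1", user="someuser", password="somepass",
    database="SchemaName", allow_local_infile=True,
)
cursor = conn.cursor()

# Build and run the CREATE TABLE statement (every column as VARCHAR(255)).
columns_sql = ", ".join(f"`{col}` VARCHAR(255)" for col in header)
cursor.execute(f"CREATE TABLE `{table_name}` ({columns_sql})")

# Bulk-load the file, skipping the header row that named the columns.
cursor.execute(f"""
    LOAD DATA LOCAL INFILE '{csv_path}'
    INTO TABLE `{table_name}`
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\\r\\n'
    IGNORE 1 LINES
""")

conn.commit()
cursor.close()
conn.close()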
Hope this helps clear up your options and approach a little.
As part of automation, we want to create a MySQL DB script from an existing DB. We can't use EF Code First migration because there are more than 1000 tables, so a timeout error occurs. The alternative is to generate the script from the existing DB using an MVC application.
If you can export your tables' data to .csv (comma-separated values) files, you can do an easy migration with LOAD DATA INFILE. My experience is not from DB to DB, just from Excel to a MySQL DB: I exported Excel sheets as somedata.csv files, then created the table I needed and used this script to import into the MySQL database. I think you can use it too.
LOAD DATA INFILE 'somedata.csv' INTO TABLE some_table
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
So, first you need to create your tables' structure again.
Second, you need to export your data to csv files.
Third, you can import the specific data by using LOAD DATA INFILE.
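If you have to repeat those three steps for 1000+ tables, a small script can drive them. Below is a rough sketch in Python with the mysql-connector-python package, shown for a single table; the host names, credentials, and file paths are placeholders, and you would still need to copy each exported file from the source host to the target host (e.g. with scp) between steps 2 and 3:
# Rough sketch: recreate a table's structure on the target server and
# reload its data from a CSV export. All connection settings, paths,
# and the table list are placeholders for illustration.
import mysql.connector

source = mysql.connector.connect(host="source-host", user="user",
                                 password="pass", database="source_db")
target = mysql.connector.connect(host="target-host", user="user",
                                 password="pass", database="target_db")
src_cur, tgt_cur = source.cursor(), target.cursor()

tables = ["some_table"]          # extend with the full list of tables

for table in tables:
    # 1. Recreate the table structure on the target.
    src_cur.execute(f"SHOW CREATE TABLE `{table}`")
    create_statement = src_cur.fetchone()[1]
    tgt_cur.execute(create_statement)

    # 2. Export the data to a CSV file on the source server.
    src_cur.execute(f"""
        SELECT * INTO OUTFILE '/tmp/{table}.csv'
        FIELDS TERMINATED BY ',' ENCLOSED BY '"'
        LINES TERMINATED BY '\\r\\n'
        FROM `{table}`
    """)

    # ... copy /tmp/<table>.csv from the source host to the target host ...

    # 3. Import the CSV file on the target server.
    tgt_cur.execute(f"""
        LOAD DATA INFILE '/tmp/{table}.csv' INTO TABLE `{table}`
        FIELDS TERMINATED BY ',' ENCLOSED BY '"'
        LINES TERMINATED BY '\\r\\n'
    """)

target.commit()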
It is not the greatest approach overall, but it can be fast enough: a table with more than 1 million records will be imported in about a minute. You can see the
LOAD DATA INFILE documentation here.
I have a table on the development box with exactly the same format as one on the production server. The data on the development box needs to be appended/inserted into the production table, but I don't have permission to create a table on production.
I was thinking about doing something like INSERT INTO production_table SELECT * FROM develop_table; however, since I cannot create a new table develop_table on production, this is impossible.
I am using Sequel Pro, and I don't know whether there is a way to export my development table to a file (CSV/SQL), so that I can run some command on my client side to load that file into production without overwriting the production table.
Assuming your production table has primary / unique key(s), you can export the data on your development server as a .csv file and load it into your production server with load data, specifying whether you want to replace or ignore the duplicated rows.
Example:
In your development server you must export the data to a .csv file. You can use select into... to do that:
select *
into outfile '/home/user_dev/your_table.csv'
fields terminated by ',' optionally enclosed by '"'
lines terminated by '\n'
from your_table;
In your production server, copy the your_table.csv file and load it using load data...:
load data infile '/home/user_prod/your_table.csv'
replace -- This will replace any rows with duplicated primary | unique key values.
-- If you don't want to replace the rows, use "ignore" instead of "replace"
into table your_table
fields terminated by ',' optionally enclosed by '"'
lines terminated by '\n';
Read the reference manual (links provided above) for additional information.
I need to insert data from the Magento place-order form into an external database. Please give details about how I can achieve this.
Currently, when we click on place order, it inserts into the table sales_flat_order; I need to save it into an external DB.
As I am new to Magento, please don't mind if this is a simple thing.
When you say external DB, does that mean another database on the same box? Or a remote database on another box? Will the table remain the same, or are all the fields and additional information different?
Approaches:
API: http://www.magentocommerce.com/api/rest/Resources/Orders/sales_orders.html
If it's a remote box, you can use the REST API to pull the orders (once the API is active, the role is created, and the user is assigned and connected) and push the returned information to the new box programmatically; see the sketch after this list.
Dataflow:
You can set up a dataflow for exporting the order information, pull in the CSV/XML, parse it, and upload the needed parts to the new DB.
Dataflow Extension:
Same as above, but instead of doing all the programming yourself, you can install an extension like http://www.wyomind.com/orders-export-tool-magento.html and have it FTP the information to a remote server so you can check/parse the file into the new DB as needed.
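For the API approach, here is a very rough sketch in Python of pulling orders over REST and copying a few fields into an external MySQL database. It assumes an OAuth consumer and access token have already been created in the Magento admin, and that the target table already exists; the endpoint path, response shape, field names, credentials, and table/columns are placeholders that may differ for your version and setup:
# Very rough sketch: pull orders from the Magento REST API and copy a few
# fields into an external MySQL database. Credentials, the endpoint path,
# the response shape, and the target table/columns are all placeholders.
from requests_oauthlib import OAuth1Session
import mysql.connector

magento = OAuth1Session(
    client_key="consumer_key",
    client_secret="consumer_secret",
    resource_owner_key="access_token",
    resource_owner_secret="access_token_secret",
)
response = magento.get("http://your-magento-host/api/rest/orders",
                       headers={"Accept": "application/json"})
orders = response.json()

external = mysql.connector.connect(host="external-db-host", user="user",
                                   password="pass", database="external_db")
cursor = external.cursor()

# The REST response may be keyed by order id; handle either shape.
order_list = orders.values() if isinstance(orders, dict) else orders

for order in order_list:
    cursor.execute(
        "INSERT INTO orders_copy (entity_id, grand_total, status) "
        "VALUES (%s, %s, %s)",
        (order.get("entity_id"), order.get("grand_total"), order.get("status")),
    )

external.commit()
cursor.close()
external.close()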
Can you reveal a bit more about the environment, the amount of data/orders, etc?
Thanks.
--- Update:
Per your response, it sounds less like a Magento question and more like a MySQL question.
In this case, you can do something as simple as "replicating" or copying over the table data to your other local db.
If you're not working with too many orders, the following may meet your needs as a one-time deal. If you're dealing with a substantial number of orders, the approach may need to be expanded upon.
##Direct Copy:
#using stage_magento to represent your other DB
#assuming this is done with a user that has correct permissions on both databases.
#create the table
CREATE TABLE stage_magento.sales_flat_order LIKE production_magento.sales_flat_order;
#copy the data
INSERT stage_magento.sales_flat_order SELECT * FROM production_magento.sales_flat_order;
#####################
## Option 2, export to file system, import to new db
##Indirect, Export from DB/Table
SELECT * FROM production_magento.sales_flat_order INTO OUTFILE '/tmp/sales_flat_order.csv' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '\\' LINES TERMINATED BY '\n' ;
##Import into New DB/Table
LOAD DATA INFILE '/tmp/sales_flat_order.csv' INTO TABLE stage_magento.sales_flat_order FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '\\' LINES TERMINATED BY '\n' ;
I have a central server and several (around 50) remote servers. I want to transfer some log data from each of the servers to the central server every night. They all run Linux, and the logs are stored in MySQL. I have SSH access to all servers.
What is the best (easiest, safest, most reliable...) practice of transferring the data from remote servers to the central server?
Thanks.
Depending on your needs and the time you want to put into this: I have been using this script for a long time to back up databases.
It's a low-cost strategy that is tried and tested, very flexible and quite reliable.
You can export the new rows to a CSV file, like this:
SELECT id, name, email INTO OUTFILE '/tmp/result.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
FROM users WHERE timestamp > lastExport;
Then transfer it via scp and import it with mysqlimport.
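For example, a minimal sketch of that transfer-and-import step in Python (host names, credentials, database, and paths are placeholders; note that mysqlimport derives the target table name from the file name, so the local copy is named after the users table here):
# Minimal sketch: copy the exported CSV from one remote server and import it
# into the central database with mysqlimport. Hosts, credentials, database
# name, and paths are placeholders for the example.
import subprocess

remote_host = "remote1.example.com"
local_file = "/tmp/users.csv"   # mysqlimport loads it into the `users` table

# Pull the exported file from the remote server.
subprocess.run(["scp", f"loguser@{remote_host}:/tmp/result.csv", local_file],
               check=True)

# Import it into the central database, matching the export options used above.
subprocess.run([
    "mysqlimport", "--local",
    "--fields-terminated-by=,",
    '--fields-optionally-enclosed-by="',
    "--lines-terminated-by=\\n",
    "--user=central_user", "--password=central_pass",
    "central_db", local_file,
], check=True)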
If the database uses InnoDB, you should import the referenced (parent) tables first.
In general it is easiest to dump the data with mysqldump and load it back in on all the duplicate servers. You can use some of the many mysqldump options to control things such as locking, MVCC snapshots, which tables are included, and so on.
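For instance, a small sketch of that mysqldump round trip driven from Python (host names, credentials, and the database name are placeholders; --single-transaction gives a consistent InnoDB snapshot without locking the tables):
# Small sketch: dump a database from one server and load it into another.
# Host names, credentials, and the database name are placeholders.
import subprocess

dump_file = "/tmp/appdb.sql"

# Dump with a consistent InnoDB snapshot, without locking the tables.
with open(dump_file, "w") as out:
    subprocess.run([
        "mysqldump",
        "--host=remote-db.example.com",
        "--user=backup_user", "--password=secret",
        "--single-transaction",
        "appdb",
    ], stdout=out, check=True)

# Load the dump into the duplicate server (the appdb database must exist there).
with open(dump_file) as dump:
    subprocess.run([
        "mysql",
        "--host=duplicate-db.example.com",
        "--user=backup_user", "--password=secret",
        "appdb",
    ], stdin=dump, check=True)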
CSV is more difficult than mysqldump because you need to make sure both sides agree on how fields are terminated, how values are escaped, etc.