Exporting data from one schema to another in MySQL Workbench - mysql

Is there a way to export the tables and data from one schema to another? The Manage Import/Export option asks me to select a server to connect to, but the list comes up blank. I'm currently connected to a server that my school has rented specifically for this class, so I don't have any admin rights.

You can create a dump via Data Export in MySQL Workbench and import it right after the export into a new schema. MySQL Workbench allows you to override the target schema when importing a dump.

If you run into trouble importing your data into the new schema, such as ending up with no data in it, a workaround might be needed. I exported a schema from MySQL Workbench to a .sql file in order to import it later into a different schema, and the problem was that the exported .sql file kept the previous schema name.
So if you find this at the beginning of the exported .sql file:
CREATE DATABASE IF NOT EXISTS `old_schema` /*!40100 DEFAULT CHARACTER SET latin1 */;
USE `old_schema`;
Replace it with this:
CREATE DATABASE IF NOT EXISTS `new_schema` /*!40100 DEFAULT CHARACTER SET latin1 */;
USE `new_schema`;
That will do the trick. In some situations your .sql file might be a few hundred MB, so you will have to wait a little for it to open in your editor. The code should be at the beginning of the file, though, so it is easy to find.
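If the file is too large to open comfortably in an editor, the same replacement can be done from the shell; a minimal sketch, assuming GNU sed and a dump file named dump.sql (the file name is a placeholder):
# rewrite the schema name in place; -i.bak keeps a backup copy as dump.sql.bak
sed -i.bak 's/`old_schema`/`new_schema`/g' dump.sql
As with any global replace, check first that the old schema name does not also appear inside your actual data.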
I hope it helps!

In 6.0 and up, it looks like the dump writes out individual tables into a directory named after the dump. All of the schema and table names default to the schema you exported from (as you've noted). To facilitate an import into a new schema, simply run the following in your dump directory:
find . -type f -exec sed -i 's/your_export_schema/your_different_schema_name/g' {} \;
Be careful though: you'll bone yourself if the data in your export happens to contain the old schema name.

I noticed that the question was about Workbench, but be aware that phpMyAdmin has this ability directly, under the database Operations tab.

Related

db.opt file isn't created when using "create database"

My problem is that when I use MySQL to create a database, the documentation says it should create a db.opt file, but it doesn't. How do I fix that? I'm using the mysql-8.0.32-winx64 portable version.
I have tried enabling the option innodb_file_per_table=ON in the my.cnf file.
I have also tried starting the server with the "--skip-opt" argument, and adding it to my.cnf as well.
It seems that no one has faced this problem before, and yeah, I know I don't really need that file; it is just for my homework, where I have to use that file to show that my databases have the right CHARACTER SET and COLLATE.
For all of this I am using the command-line interface.
MySQL 8.0 is working as designed. The db.opt file and other metadata files are no longer used, because this version of MySQL implements metadata in a different way.
https://dev.mysql.com/doc/refman/8.0/en/data-dictionary-file-removal.html
The metadata files listed below are removed from MySQL. Unless otherwise noted, data previously stored in metadata files is now stored in data dictionary tables.
...
db.opt files: Database configuration files. These files, one per database directory, contained database default character set attributes.
I have no idea what homework you would be doing that requires direct access to db.opt. That file is normally only used by internal code of MySQL 5. I've been using MySQL since 2001, but I've never had any need to read that file directly.
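If the homework just needs to show each database's default character set and collation, MySQL 8.0 stores that in the data dictionary, so you can query it instead of reading db.opt; a minimal sketch from the command line (credentials are placeholders):
# list every database's default character set and collation
mysql -u root -p -e "SELECT SCHEMA_NAME, DEFAULT_CHARACTER_SET_NAME, DEFAULT_COLLATION_NAME FROM INFORMATION_SCHEMA.SCHEMATA;"
SHOW CREATE DATABASE your_db; prints the same information for a single database.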

How to import the data from a data dump (SqLite3) into django?

I have a data dump file (.sql) containing certain data that I would like to import into my database (currently SQLite3, but I'm thinking about changing to MySQL) in order to use it in my test website.
Also, I need to know whether it's possible to add the models automatically too. I presume they need to be added manually, but if there is any way to solve this, please suggest it.
There is a way to generate the Django models automatically, given that you have an existing database.
However, this is a shortcut; you may then fine-tune your models and add them to your apps as needed.
Structuring your models across apps might force you to use the model's db_table Meta option.
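A minimal sketch of that shortcut, assuming it refers to Django's inspectdb management command (which introspects an existing database and prints model definitions):
# settings.py must already point DATABASES at the existing database
python manage.py inspectdb > models.py
The generated models are marked managed = False by default, which is one reason to fine-tune them before adding them to your apps.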
If at some point you would like to switch databases (SQLite3 -> MySQL), you can export (dump) your current data to JSON, then import (load) it into the new database (after creating the database tables with the migrate command). To do this you can use the Django management commands, sketched below:
Dump data
Load data
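A minimal sketch of that dump/load cycle, assuming a standard manage.py project and a fixture file named data.json (the file name is a placeholder):
# with SQLite still configured as the default database
python manage.py dumpdata > data.json
# after switching DATABASES in settings.py to MySQL
python manage.py migrate
python manage.py loaddata data.json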
I found an alternative answer after researching a bit.
Since I have a PostgreSQL data dump file with the '.sql' extension, I was able to run a single command that imported the whole dump into my local PostgreSQL database. I'm using pgAdmin 4 as my database management tool; psql was installed along with pgAdmin 4, and I added it to my command prompt's PATH so it was accessible.
In order to import the data dump, I used the command below:
psql -U <username> -d <database_name> < <file.sql>
The '<' after database_name is necessary, so be sure to include it.
Here <username> is the username of the configured account, <database_name> is the database to which the data dump should be added, and <file.sql> is the file containing the data dump.

mysqldump.exe - how to have database renamed in the dump file

I'm writing a simple utility for merging a few database dump files into a single one. I have a temporary database (let's name it 'db_temporary') and have to export it into a dump, but in the dump file it should be named 'db_final'. Can I do this using 'mysqldump.exe'? This seems like a trivial task, but I can't find any clue in the 'mysqldump' documentation here: https://dev.mysql.com/doc/refman/4.1/en/mysqldump.html
Big thanks for any help.
The dump does not contain the database name; it only contains the tables and data from the database you've dumped. So yes, you can import the resulting SQL code into any database you wish: just create a database and import the SQL code into it.
At least if you use it in this form:
mysqldump.exe -u USERNAME -p database > database_dump.sql
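For the db_temporary -> db_final case from the question, that workflow might look like this (USERNAME is a placeholder):
# dump only tables and data; without --databases, no CREATE DATABASE or USE statements are written
mysqldump.exe -u USERNAME -p db_temporary > db_temporary_dump.sql
# create the target database under its new name, then import the dump into it
mysql.exe -u USERNAME -p -e "CREATE DATABASE db_final;"
mysql.exe -u USERNAME -p db_final < db_temporary_dump.sql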

Possible to patch a mysql sql file without using phpmyadmin?

Say I have a CMS, shopping cart, or some other DB-based web script, and let's say it has a database with around 50 tables.
If I have a .sql file (call it patch.sql) that has a few ALTER commands and some UPDATE commands, I can go to phpMyAdmin, import the patch.sql file, and it will "apply" the changes to my db.
But let's say I export my db to a mydb.sql file first.
Is there a way to "apply" the changes from "patch.sql" to "mydb.sql" without using a database?
I figured there would be some command line like
mysql.exe merge patch.sql mydb.sql
or something, but I didn't see anything like that.
Is phpMyAdmin with a database the only way?
If your goal is to be able to later reimport mydb.sql and get the patched version of the database, then no, there are no tools to do that. Based on your description, patch.sql may have altered the structure of the database as well as the data itself. What you should do is apply the patch to your database, then export the db to a new .sql file.
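That round trip is just two commands with the stock clients; a minimal sketch, assuming the mysql and mysqldump command-line tools and placeholder credentials:
# apply the patch to the live database
mysql -u USERNAME -p mydb < patch.sql
# export the now-patched database to a fresh dump
mysqldump -u USERNAME -p mydb > mydb_patched.sql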

How should I import my legacy MYSQL db into my new Django project which will use Postgres?

I currently have a site that uses a MySQL 4.x schema with PHP for the backend. I'm currently in Django-land and I like Postgres so far (meaning I'm a noob to both).
What would be the easiest way to import the data in my tables (the dump file for the entire DB is around 200 MB) into my new project? I currently don't have anything set up on my new site, and I'm about to install Postgres on my server.
One thing to note: the MySQL database was incorrectly set up, so the data is stored as LATIN1 and I need to convert it to real Cyrillic/UTF-8 using iconv before doing any importing. That was a pain to figure out, and it is the next step.
Fixtures are your best bet: dump the data from MySQL, then load it into Postgres.
It's some work, but possible:
1. Make a PostgreSQL-compatible dump:
mysqldump --compatible=postgresql --databases db1 > dump.sql
2. Create a PostgreSQL database using UTF8.
3. Tell the database you are about to import latin1 content:
SET CLIENT_ENCODING TO 'LATIN1';
4. Import the dump and do some tests.
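Steps 2 and 4 might look like this on the PostgreSQL side; a minimal sketch, assuming the standard createdb and psql clients (user and database names are placeholders):
# create a UTF8-encoded target database
createdb -U postgres -E UTF8 db1
# import the dump, telling the client the file is latin1
PGCLIENTENCODING=LATIN1 psql -U postgres -d db1 -f dump.sql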
Good luck!
I'm going to throw out something crazy. Run syncdb to create your tables in the Postgres db. Then use the Storm ORM (storm.canonical.com) and its multiple-database support to read from one database and save to another. I'd be really intrigued to hear whether that worked.
NOTE: You may be able to do this with the multiple-database support in Django 1.2. It is probably possible in SQLAlchemy as well.
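If you try the Django 1.2 route, a minimal sketch, assuming settings.py defines a second database alias named 'legacy' pointing at the MySQL database (the alias name is hypothetical):
# dump everything from the legacy MySQL database
python manage.py dumpdata --database=legacy > legacy.json
# load it into the default (Postgres) database after creating its tables
python manage.py loaddata legacy.json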