Export masked data with mysqldump from a MariaDB database

Some colleagues asked me to extract certain data from a MariaDB database with mysqldump and to mask some of it because it is sensitive; so basically I should obtain a "masked" dump.
Is it possible to do this kind of thing?
I didn't find any option to add to the command, or any tool, that does it.
I just found this tool https://mariadb.com/resources/blog/sensitive-data-masking-with-mariadb-maxscale/ but at first glance it looks like I would be forced to mask data FOR EVERYONE, and I can't do that because it is a production db.
Any suggestion is absolutely welcome. :)
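One hedged workaround (not from the thread; the database, table, and column names below are made up): restore the dump into a throwaway staging database, overwrite the sensitive columns there with plain UPDATE statements, and dump the staging copy, so production data is never modified.

# Hypothetical sketch: mask on a staging copy, never on the production db
mysqldump -u dumpuser -p production_db customers > raw.sql   # dump the real table(s)
mysql -u dumpuser -p staging_db < raw.sql                    # staging_db must already exist
mysql -u dumpuser -p staging_db -e "
  UPDATE customers
     SET email = CONCAT('user', id, '@example.invalid'),  -- fake but unique addresses
         phone = NULL;                                     -- drop the value entirely
"
mysqldump -u dumpuser -p staging_db customers > masked.sql   # the dump to hand over

Nothing in this sketch touches the production schema or data.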

Related

Is there a good way to perform SQL dump of MySQL database in DataGrip?

I'm trying to use JetBrains DataGrip as my primary DB tool. However, I still find myself using SequelPro for SQL Dump. Here is why:
On a database level, I couldn't find any SQL dump functionality. The only option seems to be "copy DDL", which copies the schema but not the content.
On a table level, sure, I can export data as SQL Inserts. But then it seems the only way to do so is to export it from each table separately, which is unacceptable. Another downside is that, when exporting data as INSERTs, it creates a separate INSERT statement for each row.
I tried to look for plugins, but couldn't find any. DataGrip users, if you came up with any solutions, please let me know. Sequel Pro works like a charm, but I really would love to use one database client at the end of the day.
PS. SSHing to a server and running mysqldump is not an option for me, for various security reasons.
In 2016.2 there is some functionality for this (shown in a screenshot in the original answer).
2016.3 will be integrated with mysqldump.

Exported MySQL database without procedures by mistake

I've exported a database via SSH and I didn't add the --routines option to export the routines.
Now I don't have any access to this database, and I have only one .sql file. Is there any way to restore and find the routines through the PHP code or the database structure?
No, sorry, in this case I think you're out of luck. Looking at the database structure, you won't be able to figure out what a routine might have done. Likewise, looking at the PHP code is probably not going to help. If you know what the routines did (for instance, manipulate data on insert, maintenance by deleting some rows, or some such) you can work through recreating it, but that's basically reverse engineering it based on what breaks when you try to run your application.
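For future dumps (general mysqldump usage, not part of the answer above; user and database names are placeholders), the routines can be included explicitly:

# --routines adds stored procedures and functions; --triggers and --events cover the rest
mysqldump --routines --triggers --events -u username -p mydatabase > dump_with_routines.sql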

Save SQL code used for creating a database

I am using the Windows terminal to create a simple database. I was wondering, is the code used saved anywhere or do I have to save it myself? And how? I need to save the code I used for creating the database, that's why I'm asking.
If you are talking about SQL Server, you can script out the database you created - just right click on the database in Management Studio, and script away!
Yes, you should save your work. Most tools don't save your indentation; they often format the SQL in their own way - sometimes as
CREATE TABLE user#host.dbname.table AS ...
so it works to reconstruct your database, but isn't very readable. The worst thing I ever saw was what MS Access did to my input in the SQL window (but that was 15 years ago).
In MySQL you can use SHOW CREATE TABLE xxx to see the definition for your table(s).
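For example (the database and table names below are placeholders):

# Prints the exact CREATE TABLE statement needed to recreate the table
mysql -u username -p -e "SHOW CREATE TABLE mydatabase.mytable\G"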
Using mysqldump can help you create an SQL file which you can later run to create a DB identical to yours.
It has many useful options you can read about here.
For your case it seems you need the schema only, without the data - see a how-to here. Basically all you need is the command:
mysqldump --no-data -u Username -pPassword mydatabase
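For instance (names are placeholders), the schema-only dump can be written to a file and replayed later to recreate an empty copy of the structure:

# Dump only the table definitions, no rows
mysqldump --no-data -u username -p mydatabase > mydatabase_schema.sql
# Recreate the structure elsewhere (the target database must already exist)
mysql -u username -p mydatabase_copy < mydatabase_schema.sql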

Migration from MySQL to PostgreSQL with auto-increments - how?

I'm considering a MySQL to PostgreSQL migration for my web application, but I'm having a really hard time converting my existing MySQL database to PostgreSQL.
I tried:
mysqldump with --compatible=postgresql
migration wizard from EnterpriseDB
Postgresql Data Wizard from EMS
DBConvert from DMSoft
and NONE of the above programs do a good job converting my database!
I saw some Perl and Python scripts for converting MySQL to PostgreSQL, but I can't figure out how to use them... (I installed ActivePerl and don't understand what I'm supposed to do next to run such a script!)
I use Auto Increment fields (as a primary key) all the time, and these are just ignored... I understand that PostgreSQL does auto-increments in another way (with sequences), but it can't be THAT hard for MIGRATION software to implement that, or is it?
Did anybody have better luck converting a MySQL database with auto-increments as primary keys?
I know this is probably not the answer you are looking for, but: I don't believe in "automated" migration tools.
Take your existing SQL Scripts that create your database schema, do a search and replace for the necessary data types (autonumber maps to serial which does all the sequence handling automagically for you), remove all the "engine=" stuff and then run the new script against Postgres.
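A hedged illustration of that search-and-replace, on a made-up table:

# MySQL original:
#   CREATE TABLE users (
#       id   INT AUTO_INCREMENT PRIMARY KEY,
#       name VARCHAR(100)
#   ) ENGINE=InnoDB DEFAULT CHARSET=utf8;
#
# PostgreSQL version: AUTO_INCREMENT becomes SERIAL (which creates and owns the
# backing sequence), and the ENGINE=/CHARSET= clauses are simply dropped.
psql -d mydatabase <<'SQL'
CREATE TABLE users (
    id   SERIAL PRIMARY KEY,
    name VARCHAR(100)
);
SQL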
Dump the old database into flat files and import them into the target.
I have done this several times with sample databases that were intended for MySQL and it really doesn't take that long.
Probably just as long as trying all the different "automated" tools.
Why not use an ETL tool? You don't have to worry about dumps or stuff like that.
I have migrated between PostgreSQL and MySQL and have had no problems with the auto-increment fields.
You just need to know the connection credentials and that's it. I personally use Pentaho (it's open source).
Download Pentaho ETL from http://kettle.pentaho.org/
Unzip and run Pentaho (using .bat file spoon.bat)
Create a new Job:
Create a DB connection for the source database (MySQL) using the menu: Tools → Wizard → Create Database Connection (F3). Create a DB connection for the destination database (PostgreSQL) using the technique described above.
Run the Wizard: Tools → Wizard → Copy Tables (Ctrl-F10).
Select the source (left dialog panel) and the destination (right dialog panel). Click Finish.
The job will be generated - run the job.
If you need any help let me know.
Even when you are familiar with all the "PostgreSQL gotchas", doing every step by hand may take a lot of time, especially when your db is "big".
Try some other scripts/tools.
I know this is an old question, but I just ran into the same problem migrating from MySQL to Postgres. After trying out several migration tools, the very best one I could find, which will migrate your database structure as cleanly as possible, was pgloader (https://github.com/dimitri/pgloader/). It will take care of changing AUTO_INCREMENT columns to Postgres sequences, no problem, and it's super fast.
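For reference, the simplest pgloader invocation is a single command with source and target connection strings (the credentials and database names below are placeholders; pgloader also accepts a load-command file for finer control):

# Migrates schema and data in one go; AUTO_INCREMENT columns end up backed by Postgres sequences
pgloader mysql://user:password@localhost/sourcedb postgresql://user:password@localhost/targetdb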

Copy data from PostgreSQL to MySQL

I have run into a problem where I need to copy only data from a PostgreSQL database to a MySQL database. I already have the MySQL database with empty tables. Using pgAdmin I took a backup (data only, without the database schema). I tried using the psql tool, but it keeps giving a segmentation fault which I couldn't fix at the moment. I am using Ubuntu. Any simple help with a guide to copy the data will be highly appreciated.
Use Postgres COPY and MySQL LOAD DATA INFILE.
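A minimal sketch of that round trip, assuming the table already exists on both sides with matching columns (database, table, and file names are placeholders; \copy is the client-side form of COPY):

# Export one table from PostgreSQL to CSV
psql -d sourcedb -c "\copy mytable TO 'mytable.csv' CSV"
# Load the CSV into the matching MySQL table (local_infile must be allowed on the server)
mysql --local-infile=1 -u username -p targetdb -e "
  LOAD DATA LOCAL INFILE 'mytable.csv'
  INTO TABLE mytable
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
  LINES TERMINATED BY '\n';
"

One caveat: the two tools use different NULL conventions (COPY's CSV output writes NULLs as empty fields by default, while LOAD DATA expects \N), so NULLable columns may need the COPY NULL option adjusted.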
psql will crash with an out-of-memory error if you try to display a few million rows, because it fetches all the data up front to determine the column widths for a prettier display. If you intend to use psql to fetch lots of data, disable this behaviour.
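One way to do that (a hedged sketch; FETCH_COUNT is a standard psql variable, the table name is a placeholder) is to set it inside psql before running the big query:

-- fetch SELECT results in batches through a cursor instead of buffering them all
\set FETCH_COUNT 10000
SELECT * FROM big_table;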
You could try:
http://www.lightbox.ca/pg2mysql.php
It looks like you might be trying to load data into MySQL with the Postgres client tool. Use the MySQL client tools to manipulate data in the MySQL server:
mysql client programs
How can you move data into MySQL if you have errors reading from PSQL?
Fix the error, then ask this question.