How to encrypt the whole structure of a table in phpMyAdmin? - mysql

I am facing a problem regarding encryption of the structure of a table in phpMyAdmin. I want the structure of the table to appear in encrypted form when I export the table from phpMyAdmin; that is, the fields of the table that appear in the .sql file should be encrypted so that no one can read any information about the table's fields. I have searched a lot on Google and Yahoo but could not find any way to do this. Any help would be a great favour. Thanks in advance.
Here is an example: I have a table named student with fields like name, cnic, and dob. When I export the whole table into an .sql file from phpMyAdmin, the field names (name, cnic, dob) should appear encrypted, so that the user is not able to read them. That is the task I want to accomplish.

I believe you can encrypt table data while you are saving it in MySQL, so that MySQL doesn't even know it's encrypted.
Result: whenever you export the table data from phpMyAdmin, no one can read the resulting .sql or .csv file without decrypting it.
When you need to display the data, you can decrypt it back to its actual form.
Note: if you transform your data with a one-way hash like MD5, you cannot get its actual form back (it cannot be decrypted).
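For example, here is a minimal sketch using MySQL's built-in AES_ENCRYPT()/AES_DECRYPT() functions; the table, column, and key below are hypothetical:
-- Hypothetical table storing a sensitive value as an encrypted BLOB.
CREATE TABLE student_secure (
    id INT AUTO_INCREMENT PRIMARY KEY,
    cnic_enc VARBINARY(255) NOT NULL
);

-- Encrypt on the way in ('my_secret_key' is a placeholder key).
INSERT INTO student_secure (cnic_enc)
VALUES (AES_ENCRYPT('12345-6789012-3', 'my_secret_key'));

-- Decrypt on the way out; an exported dump contains only the ciphertext.
SELECT CAST(AES_DECRYPT(cnic_enc, 'my_secret_key') AS CHAR) AS cnic
FROM student_secure;
Note that this protects the data values; the column names themselves will still appear in plain text in the dump.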

Related

How to import a single CSV file with more than one table into a MySQL database

I've just found that I can import a CSV file into a MySQL table. I tried it in phpMyAdmin, but I also found that when importing a CSV file, its columns need to match those of the table you are importing into. This means one CSV file maps to exactly one table in the database; correct me if I'm wrong, though.
The problem is that the employee table I'm inserting data into is related to other tables. There's also the rolemap table: whenever an employee is inserted into the employee table, the rolemap table also gets a new row for that employee (it only stores the employee_id generated by the employee table, plus the user's role, i.e. whether they are an admin or not).
The question is, can I achieve this logic by importing a CSV file in phpMyAdmin or any database manager? I'm thinking that maybe some formatting needs to be done in the CSV file in order to import it into different tables in the database. Or is this not possible, and do I need to parse the CSV file in a backend and handle how to insert it into each respective table?
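One common workaround is to load the CSV into a staging table first and then fan the rows out to employee and rolemap with plain SQL. A hedged sketch, assuming the CSV has a name and a role column and that name can identify each newly inserted employee; the staging table, columns, and file path are placeholders:
-- Staging table matching the CSV's columns.
CREATE TABLE employee_staging (
    name VARCHAR(100),
    role VARCHAR(50)
);

-- Load the CSV into the staging table, skipping the header row.
LOAD DATA LOCAL INFILE '/path/to/employees.csv'
INTO TABLE employee_staging
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
IGNORE 1 LINES;

-- Insert the employees first...
INSERT INTO employee (name)
SELECT name FROM employee_staging;

-- ...then add a rolemap row per new employee, joining back on name
-- to pick up each generated employee_id.
INSERT INTO rolemap (employee_id, role)
SELECT e.employee_id, s.role
FROM employee_staging AS s
JOIN employee AS e ON e.name = s.name;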

Drupal user data storage location

Where does Drupal 7 store user data? I believe our client has been inserting bad data (incorrect characters) copied from MS Word into their Drupal install (for their user list).
How do I export/look at the users to see if any bad characters are in them?
I'm taking a look at the structure of the database in MySQL and I'm not seeing any clear table with the data in it. Is it contained in just one table, or in multiple tables?
User data is stored in multiple tables, one of which is the users table. Any fields that have been added to the user entity will be in the field_data_field_user... tables.
If you are using phpMyAdmin or a similar interface, you will only be able to see a portion of Drupal's tables at once. Use the filter to search for "users" and the users table will appear in the list. There is only one such table if no custom user fields have been added.
You can export the table as a CSV and open it in a spreadsheet application such as Excel, though Excel is not always friendly with UTF-8 if it's not set up to display it. Or you can save the file as .sql and view it in a database viewer such as the SQLite Manager add-on for Firefox.
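If you'd rather check directly in SQL, here is a hedged sketch: converting a column to ASCII replaces anything outside plain ASCII with '?', so comparing the converted and original values flags rows containing characters that could have been pasted in from Word (smart quotes and the like). The uid, name, and mail columns are standard in Drupal 7's users table:
-- List users whose name or mail contains non-ASCII characters.
SELECT uid, name, mail
FROM users
WHERE name <> CONVERT(name USING ASCII)
   OR mail <> CONVERT(mail USING ASCII);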

Viewing the BLOB data type in MySQL

I have downloaded a MySQL table in text format from one of our collaborators. I have loaded it into a table in a MySQL database on my machine successfully; the table was created using their SQL file. Some of the fields have the BLOB data type, and I am unable to view them in MySQL. When I opened the same downloaded text file as CSV, I could see values like BC, ABD, and BDS in the BLOB fields. I do not understand why I am unable to view these fields in MySQL. Does anyone have ideas?
It is expected that you cannot see BLOB data directly when you view the table data in MySQL. However, when you click the edit link of a particular row you might see the data, though I'm not sure about this. If you are using a server-side script, you will definitely see the data without any hassle using a simple SELECT query, like:
SELECT column_name FROM table_name; -- regardless of the column's data type
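If the BLOB actually holds text, you can also cast it so the client shows it as readable characters; the table and column names here are placeholders:
SELECT CAST(blob_column AS CHAR) AS readable_value
FROM my_table;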

Appending rows in a database using Toad and Excel

Friends, I am using Toad for MySQL and have a huge database ready and validated.
Now I have an Excel file which contains data entries for a particular table. I am also successfully able to import data into the DB using the import wizard, mapping the first-row header to the column names, etc.
But now I have appended a few data entries to the file which I wish to insert into the database. However, the old values also get selected and hence cause a primary key violation, as those entries already exist. There is a truncate-table option, but I don't wish to use it, as there may be many files from which I have inserted data.
I tried my level best but didn't find any solution, at least in Toad for MySQL. Please tell me what to do; the solution may be simple, but I need it urgently.
One option would be to not append records to that Excel file, but to create a new Excel file containing only the new records.
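Alternatively, a hedged sketch: import the whole sheet into a staging table, then copy over only the rows whose primary key is not already in the target. Table and column names are placeholders:
INSERT INTO target_table (id, name, value)
SELECT s.id, s.name, s.value
FROM staging_table AS s
WHERE NOT EXISTS (
    SELECT 1 FROM target_table AS t WHERE t.id = s.id
);
An INSERT IGNORE into the target would achieve much the same by silently skipping rows that would violate the primary key.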

How to dump a database from MySQL with sensitive data removed or corrupted?

I am using MySQL. Some of the tables contain sensitive data like user names, email addresses, etc. I want to dump the data, but with these columns removed or modified to some fake data. Is there any way to do this easily?
I'm using this approach:
1. Copy the contents of the sensitive tables to a temporary table.
2. Clear or encrypt the sensitive columns.
3. Provide --ignore-table arguments to mysqldump.exe to leave the original tables out.
This preserves foreign key constraints, and you can keep the columns that are not sensitive.
The first two steps are contained in a stored procedure that I call before doing the dump. It looks something like this:
BEGIN
    TRUNCATE TABLE person_anonymous;
    INSERT INTO person_anonymous SELECT * FROM person;
    UPDATE person_anonymous
       SET Title = NULL,
           Initials = MID(MD5(Initials), 1, 10),
           Midname = MD5(Midname),
           Lastname = MD5(Lastname),
           Comment = MD5(Comment);
END
As you can see, I'm not clearing the contents of the fields; instead, I keep a hash. That way you can still see which rows have the same value, and between exports you can see whether something changed, without anyone being able to read the actual values.
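For the third step, the dump invocation might look like this (the user, database, and table names are placeholders):
mysqldump -u backup_user -p my_database --ignore-table=my_database.person > dump.sql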
There is a tool called Jailer that is typically used to export a subset of a database. We use this at work to create a smaller test database from a production backup, with all sensitive data obfuscated.
The GUI is a bit crude, but Jailer is the best alternative I have found so far.
You can simply deselect the sensitive tables or columns and get a full copy of the rest. Jailer also supports obfuscating data during export; you could, for instance, MD5-hash all user names or change all email addresses to user@example.org.
There is a tutorial to get you started.
ProxySQL is another approach. Here is an article explaining how to obfuscate data with ProxySQL:
https://proxysql.com/blog/obfuscate-data-from-mysqldump