Where does Drupal 7 store user data? I believe our client has been inserting bad data (incorrect characters) copied from MS Word into their Drupal install (for their user list).
How do I export/look at the users to see if any bad characters are in them?
I'm taking a look at the structure of the database in MySQL and I'm not seeing any clear table with the data in it. Is it contained in only one, or multiple tables?
User data is stored in multiple tables, one of which is the users table. Any fields that have been added to the user entity will be in a field_data_field_user... table.
If you are using phpMyAdmin or a similar interface, you will only see a portion of Drupal's tables at once. Use the filter to search for "users" and the users table will appear in the list. If no custom user fields were added, there is only that one table.
You can export the table as a CSV and open it in a spreadsheet application such as Excel, though Excel is not always friendly with UTF-8 unless it's set up to display it. Or you can save the file as .sql and view it in a database viewer, such as the SQLite Manager add-on in Firefox.
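If you'd rather check directly in MySQL, here is a sketch of a query that flags rows containing non-ASCII characters (the usual residue of pasting from MS Word, e.g. smart quotes). The uid, name, and mail columns exist in Drupal 7's core users table; extend the WHERE clause to any custom fields your client edits.

```sql
-- CONVERT(... USING ASCII) replaces characters outside ASCII with '?',
-- so any row where the converted value differs contains suspect bytes.
SELECT uid, name, mail
FROM users
WHERE name <> CONVERT(name USING ASCII)
   OR mail <> CONVERT(mail USING ASCII);
```

Rows returned by this query are the ones worth inspecting for bad characters.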
Related
I have a WordPress database consisting of many tables. I am looking for a certain entry, 20.22.31.44. What is the best way to return the table and column where this entry exists?
You can dump your database (with mysqldump) into a file and search through its contents.
You may also use a database management tool (e.g. MySQL Workbench or phpMyAdmin) to perform that global search for you. See related.
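If you can query the server, a dump-free alternative is to generate one SELECT per text-like column via information_schema and run the generated statements. The schema name wordpress below is a placeholder; substitute your own.

```sql
-- Emits one probe query per searchable column; each generated query
-- returns a 'table.column' label if the value is found there.
SELECT CONCAT('SELECT ''', table_name, '.', column_name,
              ''' AS location FROM `', table_name,
              '` WHERE `', column_name, '` = ''20.22.31.44'';') AS probe
FROM information_schema.columns
WHERE table_schema = 'wordpress'
  AND data_type IN ('char', 'varchar', 'tinytext',
                    'text', 'mediumtext', 'longtext');
```

Copy the output rows back into your client and execute them; any probe that returns a row names the table and column you are after.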
Intro
I've searched all around for information about this problem, but I haven't really found a good source, so I'm sorry if it seems basic to you; for me it is rather intriguing, because I'm having a hard time guessing which keywords to use on Google to retrieve the right information.
Problem Description:
As a matter of fact, I have two issues that I don't know how to deal with in a MySQL instance installed on a laptop in a Windows environment:
I have a MySQL database with 50 tables, of which 15 or 20 contain original data. The other tables I generated from the original data tables, in order to create tables that would let me analyze the data in Power BI. The original data tables are fed by dumps from an ERP database.
My issue is the following:
How would one automate the process of receiving cumulative txt/csv files (via pen drive or any other transfer mechanism), storing those files in a folder, and then updating the existing tables with the new information? Is there any reference on best practices for such a scenario?
How can I keep my database in good shape through the successive data integrations? I mean, how can I make my database scalable and responsive?
Can you point me to some sources that would help me with this?
So far I have imported data into the tables in two steps:
1st - I created the table structure with the help of the Workbench import wizard (I had to do it this way because the tables have a lot of fields, literally dozens of them, and those fields need to be in the database). I also added primary keys and indexes to those tables;
2nd - I managed to load the data from the files into those tables, using the LOAD DATA INFILE command.
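For reference, a typical LOAD DATA INFILE call for a semicolon-delimited CSV looks like the sketch below. The path, table name, delimiters, and line endings are placeholders; adjust them to match your ERP dumps.

```sql
-- Load one CSV dump into an existing table, skipping the header row.
-- IGNORE 1 LINES assumes the file starts with column names.
LOAD DATA INFILE 'C:/dumps/erp_export.csv'
INTO TABLE erp_raw
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
```

Because the statement is plain SQL, it can be put in a script and re-run each time a new cumulative file arrives, which is the first building block of the automation asked about above.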
Some of the fields in the tables created with the import wizard were given the data type TEXT, which is not necessary in this scenario. I would like to convert those fields to NVARCHAR(255) or similar. However, there are a lot of fields to alter across multiple tables at this point, and I was wondering if I can write a query to generate all the ALTER TABLE statements I need.
So my issue here is: is it safe to alter the data type of multiple fields across multiple tables (in this case, changing fields with data type TEXT to NVARCHAR(255))? What is the best way to do this? Can you point me to some sources or best practices for this, please?
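A query of the kind described is possible: information_schema can generate the ALTER TABLE statements for you. The schema name mydb below is a placeholder. Two cautions: MODIFY replaces the whole column definition, so restate NOT NULL or DEFAULT clauses where needed, and any TEXT value longer than 255 characters would be truncated, so check maximum lengths first.

```sql
-- Generate one ALTER TABLE per TEXT column in the schema.
-- Review the emitted statements before executing them.
SELECT CONCAT('ALTER TABLE `', table_name,
              '` MODIFY `', column_name, '` VARCHAR(255);') AS stmt
FROM information_schema.columns
WHERE table_schema = 'mydb'
  AND data_type = 'text';
```

A preliminary `SELECT MAX(CHAR_LENGTH(col)) FROM tbl;` on each affected column is a cheap safety check before running the generated ALTERs.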
Thank you, in advance, for your help.
Cheers
You need a scripting language, not a UI. See the mysql command-line tool, the shell of your OS, etc.
1. DROP DATABASE and re-CREATE it.
2. LOAD DATA.
3. Massage the data to get the columns cleaner than what the load provided.
4. Sic the BI tool on the data.
If you want to discuss step 3, we need details about what transformations are needed between steps 2 and 4, including the format or schema at each of those steps.
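The four steps above can be sketched as one SQL script that the mysql command-line client runs non-interactively, e.g. `mysql -u user -p < reload.sql`. All names and paths below are placeholders.

```sql
-- Step 1: start from a clean slate.
DROP DATABASE IF EXISTS analytics;
CREATE DATABASE analytics;
USE analytics;

-- Recreate the table structures from a saved DDL file.
SOURCE schema.sql;

-- Step 2: load the latest dump.
LOAD DATA INFILE '/data/incoming/erp_dump.csv'
INTO TABLE erp_raw
FIELDS TERMINATED BY ';' IGNORE 1 LINES;

-- Step 3: massage the data (example placeholder transformation).
UPDATE erp_raw SET customer_name = TRIM(customer_name);
```

Scheduling that one command with the Windows Task Scheduler turns the whole reload into an unattended job.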
I am attempting to "sync" data from a read-only ODBC MySQL server to Access 2016. I need to move the data into Access so that I can more easily manipulate and create better customized reports.
I have linked the data tables between Access and MySQL; however, I cannot get the data in these tables to automatically refresh. I must go into Access and hit "Refresh All".
What I'm looking to do is update all of my open tables in Access once nightly so that each morning the data used to build these reports is new. Currently if I leave these tables all evening, when I get in the next morning I must hit "Refresh-All" for Access to go retrieve the most recent data.
Any ideas?
The data in linked tables is automatically refreshed by Access when you attempt to read it. You can trigger that by displaying a datasheet view of the table, or by opening a form where the linked table is the data source. Beware: we have had problems where tables with lots of records, used as the source for drop-down lists, ended up locking the database.
Access only does this properly (and at speed) if either the underlying table has a unique clustered index, or, after linking the tables, you create an index in Access.
If you want to create a copy that you can manipulate (such as write to) while the underlying tables are read-only, then you will have to create matching unlinked tables and execute some form of copy SQL at appropriate points in your application.
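The copy SQL can be as simple as the Access-SQL sketch below, run from a startup macro or a scheduled task that opens the database each night. LocalOrders and LinkedOrders are placeholder names for the local unlinked table and the linked read-only table.

```sql
-- Clear yesterday's local copy (Access delete-query syntax).
DELETE * FROM LocalOrders;

-- Pull a fresh snapshot from the linked MySQL table.
INSERT INTO LocalOrders
SELECT * FROM LinkedOrders;
```

Both statements assume the two tables have identical column layouts; if they differ, list the columns explicitly in the INSERT and SELECT.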
I am facing a problem regarding encryption of a table's structure in phpMyAdmin. I want to see the structure of a table in encrypted form when I export it from phpMyAdmin; that is, the table's fields as they appear in the .sql file should be encrypted, so that no one can read information about the fields of the table. I have searched a lot on Google and Yahoo but I can't find any solution for this. It would be a great favour. Thanks in advance.
Here is an example:
Basically, I have a table named student with fields like name, cnic, and dob. When I export the whole table into a .sql file from phpMyAdmin, I want the field values (name, cnic, dob) to appear encrypted, so that users are not able to read them. That's all I want to do.
I believe you can encrypt the table data while saving it in MySQL, so that MySQL doesn't even know it's encrypted.
Result: whenever you export the table data from phpMyAdmin, no one can read the .sql or .csv file without decrypting it.
When displaying the data, you can decrypt it back to its actual form.
Note: if you hash your data with a one-way method like MD5, you can't get back its actual form (it cannot be decrypted).
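One way to do the encrypt-on-save / decrypt-on-display approach is with MySQL's built-in AES functions, sketched below. The key string is a placeholder and should be managed outside the database; the encrypted columns must be binary types (VARBINARY or BLOB), not VARCHAR.

```sql
-- Store encrypted values; exported dumps will show only binary data.
INSERT INTO student (name, cnic, dob)
VALUES (AES_ENCRYPT('Ali', 'placeholder-key'),
        AES_ENCRYPT('12345-6789012-3', 'placeholder-key'),
        AES_ENCRYPT('2000-01-01', 'placeholder-key'));

-- Decrypt only when the application displays the data.
SELECT CAST(AES_DECRYPT(name, 'placeholder-key') AS CHAR) AS name
FROM student;
```

Note this encrypts the stored values; the column names themselves still appear in plain text in an export, since MySQL needs them to define the table.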
I have created a data model in Access Database.
Tables that are composite parts of the model are loaded with data. Some of the data needs to be loaded manually.
Now I would like to link couple of tables together and give the user the option to insert the missing data in the tables. (I am linking the tables together so that the user doesn't have to work with raw keys, but with the "actual" information that he knows.)
I've never worked with an Access DB before, and therefore I would like to ask you to please instruct and help me on how to accomplish my goal.
The Access form wizards are pretty well put together. You can easily set up a form that will allow users to insert data into the table(s).