What is the difference between a View and an EER Diagram? - mysql

I am quite confused by the two. At first glance, both might seem the same. May I know how a View and an EER Diagram are each applied?

A view is a stored query accessible as a virtual table composed of the result set of that query. A view is a perspective on the data from one or more tables. It does not actually store any data itself, but gives you access to it just like a table in the database; the only caveat is that you may not always be allowed to insert data into it.
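For illustration, a minimal sketch (the orders table and its columns here are made up):

-- A view over a hypothetical orders table; the view stores no data,
-- the SELECT runs against the base table whenever the view is queried.
CREATE VIEW v_open_orders AS
SELECT order_id, customer_id, order_date
FROM orders
WHERE status = 'OPEN';

-- Queried exactly like a table:
SELECT * FROM v_open_orders WHERE order_date >= '2024-01-01';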
EER/ER, on the other hand, is a data modelling technique for the database. It does not act on the data but on the structure of the tables in the database. Generally it gives a snapshot of the tables and their relationships, which is helpful for understanding how data flows through the database.

Related

Operations between linked tables and native tables

I have three identical tables: one in MySQL, one linked to it in Access via ODBC, and a native table in the same Access database.
When I update the table in MySQL, the linked table in Access updates, and vice versa. But I would like to know if it is possible for the linked table to update the native table (and vice versa)?
[Screenshots: Access table and MySQL table]
It really depends on how the local Access table is being updated. If it is ALWAYS updated, say by a few forms, then you could add an After Update event to those few forms and put in code to update the MySQL table.
Another approach (again, assuming you only/always update the local tables) is to add a table trigger to the local table. In this table code event, you can actually have it call some VBA code, and that VBA code could then update/insert into the linked MySQL table. Once again, the two tables will then automatically remain in sync.
The other possibility would be to add a time + date stamp column to the tables (both on the MySQL side and on the Access side). You could then write some VBA code to sync up the tables. Such code is not too hard, but in a multi-user setting this can become quite a challenge, since while you are syncing the data, other users might also update the MySQL tables and thus your sync routines might well miss some updates. Database sync software and this subject can fill a few books the size of medical texts; it is a VERY complex subject.
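For the time + date stamp approach, a minimal sketch of the MySQL side (the table and column names are placeholders; a matching column would also be added to the Access copy):

ALTER TABLE customers
  ADD COLUMN last_modified TIMESTAMP NOT NULL
  DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP;

-- A sync routine can then pick up only the rows changed since its last run:
SELECT * FROM customers WHERE last_modified > '2024-01-01 00:00:00';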
However, why not just always use linked tables to MySQL and be done with any requirement to sync data? Access makes a great client to SQL Server or MySQL. If you eliminate the local tables, then you eliminate the need to sync your data.

MySQL: Automate Data Ingestion from regular txt/csv files to a Database

Intro
I've searched all around about this problem, but I didn't really find a source of knowledge on it, so I'm sorry if this problem seems basic to you; for me it is rather intriguing, because I'm having a hard time guessing which keywords to use on Google to retrieve the proper info.
Problem Description:
As a matter of fact, I have two issues that I don't know how to deal with in a MySQL instance installed on a laptop in a Windows environment:
I have a DB in MySQL with 50 tables, of which 15 or 20 are tables with original data. The other tables are ones that I generated from the original data tables, in order to create tables that would allow me to analyze the data in Power BI. The original data tables are fed by dumps from an ERP database.
My issue is the following:
How would one automate the process of receiving cumulative txt/csv files (via pen drive or any other transfer mechanism), storing those files in a folder and then updating the existing tables with the new information? Is there any reference of best practices for dealing with such a scenario?
How can I maintain the good shape of my database through the successive data integrations? I mean, how can I make my database scalable and responsive?
Can you point me to some sources that would help me with this?
At the moment I have imported the data into the tables in 2 steps:
1st - I created the table structure with the help of the Workbench import wizard (I had to do it this way because the tables have a lot of fields - dozens of them, literally - and those fields need to be in the database). I also added primary keys and indexes to those tables;
2nd - I managed to load the data from the files into those tables, using the LOAD DATA INFILE command.
Some of the fields of the tables created with the import wizard were given the data type TEXT, which is not necessary in this scenario. I would like to convert those fields to the data type NVARCHAR(255) or something similar. However, there are a lot of fields to alter, across multiple tables, and I was wondering if I can write a query that does the job of generating all the ALTER TABLE statements I need.
So my issue here is: is it safe to alter the data type of multiple fields across multiple tables (in this case I would like to change fields with data type TEXT to NVARCHAR(255))? What is the best way to do this? Can you point me to some sources or best practices for this, please?
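For reference, one hedged sketch of generating such statements from information_schema (the schema name 'my_db' and the target type VARCHAR(255) are assumptions to adjust):

SELECT CONCAT('ALTER TABLE `', TABLE_NAME, '` MODIFY `', COLUMN_NAME,
              '` VARCHAR(255);') AS alter_stmt
FROM information_schema.COLUMNS
WHERE TABLE_SCHEMA = 'my_db'      -- placeholder schema name
  AND DATA_TYPE = 'text';
-- Review the generated statements before running them: converting TEXT to a
-- shorter type truncates values longer than 255 characters (check
-- MAX(CHAR_LENGTH(col)) per column first), and MODIFY redefines the column,
-- so nullability and defaults should be restated if they matter.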
Thank you, in advance, for your help.
Cheers
You need a scripting language, not a UI. See the mysql command-line tool, the shell of your OS, etc. The cycle is:
1. DROP DATABASE and re-CREATE it.
2. LOAD DATA.
3. Massage the data to get the columns cleaner than what the load provided.
4. Sic the BI tool on the data.
If you want to discuss Step 3, we need details about what transformations are needed between step 2 and step 4. That includes providing the format or schema for steps 2 and 4.
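As a rough sketch of steps 1 and 2 in a single .sql file that the mysql command-line tool can run (the database, table, column, file path and delimiters below are all placeholders):

-- reload.sql : rebuild the staging database and load the newest dump
DROP DATABASE IF EXISTS staging;
CREATE DATABASE staging;
USE staging;

-- Recreate the table structure (placeholder columns)
CREATE TABLE original_data (
  id      INT,
  label   VARCHAR(255),
  amount  DECIMAL(12,2)
);

-- Step 2: bulk load the latest file
LOAD DATA INFILE '/path/to/latest_dump.csv'
INTO TABLE original_data
FIELDS TERMINATED BY ';' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

Run it with something like mysql -u youruser -p < reload.sql; step 3 then becomes further ALTER/UPDATE statements appended to the same script.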

Many tables, one shared structure

For example, I have 10 databases and they are all the same; only the data is different.
When I make an update to a table in one database (for example adding a new column or changing/renaming one), it should affect the corresponding tables in all the databases.
Can I make some view to make this easier, or is there a way to do it automatically?
Thanks.
You can create a trigger that observes changes to the database structure.
See MySQL triggers

Is there a simple way to directly convert Web-based MySQL DB into ER Diagram?

I have a domain for which I got a website created by someone. They used MySQL as the DB but did not do a good job of documenting it, and now it is a total mess where I do not know how the DB tables connect to each other (there are still only 73 tables in 2 DBs).
I have the option of downloading all the table schemas as SQL statements, Excel, etc., but I wish to figure out a way to create an ER diagram from the DB tables on the site itself, so I can replicate it later.
Is there a way to get the ER diagram from the tables on the site itself? If not, what is the best way to convert the SQL schema into an ER diagram?
You should look at MySQL Workbench.
I have used Squirrel SQL Client to view the relationships between tables visually. You can select any table and view "tables that depend on that table" or view "all related tables".
Also look at the tutorial here

Setting up a master database to control the structure of other databases

I have a case where several databases run on the same server. There is one database for each client (company1, company2, etc.). The structure of each of these databases should be identical, with the same tables etc., but the data contained in each DB will be different.
What I want to do is keep a master db that will contain no data, but manage the structure of all the other databases, meaning if I add, remove or alter any tables in the master db the changes will also be mirrored out to the other databases.
Example: If a table named Table1 is created in the master DB, the other databases (company1, company2 etc) will also get a table1.
Currently it is done by a script that monitors the database logs for changes made to the master database and runs the same queries on each of the other databases. I looked into database replication, but from what I understand this would also bring along the data from the master database, which is not an option in this case.
Can I use some kind of logic against database schemas to do it?
So basically, what I'm asking here is:
How do I make this sync happen in the best possible way? Should I use a script monitoring the logs for changes or some other method?
How do I avoid existing data getting corrupted if a table is altered? (data getting removed if a table is dropped is okay)
Is syncing from a master database considered a good way to do what I wish (having an easily maintainable structure across several databases)?
How will making updates like this affect the performance of the databases?
Hope my question was clear and that this is not a duplicate of some other thread. If more information and/or a better explanation of my problem is needed, let me know :)
You can get the list of tables for a given schema using:
select TABLE_NAME from information_schema.tables where TABLE_SCHEMA='<master schema name>';
Use this list in a script or stored procedure along the lines of (a runnable version is sketched at the end of this answer):
create database if not exists <name>;
use <name>;
for each ( table_name in list )
    create table if not exists <name>.table_name like <master_schema>.table_name;
Now that I'm thinking about it, you might be able to put a trigger on the 'information_schema.tables' table that would call the 'create/maintain' script: look for inserts and react accordingly.
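Here is a runnable sketch of that loop as a stored procedure, assuming the target database already exists; the schema names in the usage line ('master_db', 'company1') are placeholders:

DELIMITER //
CREATE PROCEDURE clone_structure(IN master_schema VARCHAR(64), IN target_schema VARCHAR(64))
BEGIN
  DECLARE done INT DEFAULT 0;
  DECLARE tbl VARCHAR(64);
  -- All base tables of the master schema
  DECLARE cur CURSOR FOR
    SELECT TABLE_NAME FROM information_schema.tables
    WHERE TABLE_SCHEMA = master_schema AND TABLE_TYPE = 'BASE TABLE';
  DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;

  OPEN cur;
  read_loop: LOOP
    FETCH cur INTO tbl;
    IF done THEN
      LEAVE read_loop;
    END IF;
    -- CREATE TABLE ... LIKE cannot take variables directly,
    -- so build and execute the statement dynamically.
    SET @ddl = CONCAT('CREATE TABLE IF NOT EXISTS `', target_schema, '`.`', tbl,
                      '` LIKE `', master_schema, '`.`', tbl, '`');
    PREPARE stmt FROM @ddl;
    EXECUTE stmt;
    DEALLOCATE PREPARE stmt;
  END LOOP;
  CLOSE cur;
END //
DELIMITER ;

-- Usage (placeholder names):
-- CALL clone_structure('master_db', 'company1');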