MySQL Workbench -- Create Model with Queries

I'm new to using Workbench (obviously) and have hit a wall already. How do I create a Model using queries?
All the tutorials I've found require me to enter the fields one by one and select each data type. I just want to use standard CREATE TABLE etc. queries to build the model quickly.
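For example, I'd like to be able to build the model straight from a plain DDL script like this (a generic illustration; the table is made up):

CREATE TABLE customers (
    id INT UNSIGNED NOT NULL AUTO_INCREMENT,
    name VARCHAR(100) NOT NULL,
    created_at DATETIME NOT NULL,
    PRIMARY KEY (id)
);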
Thank you for your help!

Related

Getting SHOW CREATE TABLE from a view, as if it were a table

I have access to a remote database, and I would like to dump the schema and data of several views onto my local machine and load this into my local database as tables in a quick and easy way.
I lack the user privileges to run CREATE TABLE AS (SELECT * FROM target_view); otherwise, this would be trivial to solve. In other words, I want to retrieve and recreate the "composite" schema of target_view as if it were a table.
I do not want the output of SHOW CREATE VIEW, as this only shows a complex SELECT statement with joins to various tables on the remote server that I have limited ability to access. A further problem I'm seeing in MySQL 8.x is that running SHOW CREATE TABLE on the view simply acts as an alias of SHOW CREATE VIEW (which is reasonable).
Frustratingly, I can run DESCRIBE and see the schema of these views as if they were tables. I really just need to convert this information into a CREATE TABLE statement without actually being able to run CREATE TABLE.
In case it isn't obvious, the key is to avoid manually reconstructing these views' tabular schemas (as they may change in the future). I also want to avoid the fallback of reverse engineering a generic table of 20-30 generic VARCHAR or TEXT columns from a CSV dump.
I don't know of any way to display the metadata of a result set in CREATE TABLE syntax.
Given your circumstances, what I would do is first create the base tables and the view on your local MySQL instance; then you can use the CREATE TABLE ... AS SELECT syntax to produce a concrete table matching the metadata of the view's result set.
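A minimal sketch of that approach, assuming the base tables and target_view have been recreated locally (target_view_snapshot is a placeholder name):

CREATE TABLE target_view_snapshot AS SELECT * FROM target_view;
SHOW CREATE TABLE target_view_snapshot;  -- now prints a real table definition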

MySQL query match to check if a query has been updated

I am trying to match two MySQL queries (for now, the target is CREATE VIEW) to analyze whether executing them would have the same effect on the database.
The queries come from different sources, so their syntax is inconsistent.
To further simplify the question, let me add more details:
Let's say there is an already existing view in the database.
This view was created using a CREATE VIEW ... SQL statement.
There is a possibility that the CREATE VIEW ... statement gets updated; hence, to reflect the changes in the database, this statement is currently executed at the time of migration.
But I want to avoid this situation: if the CREATE VIEW ... statement would result in the same structure as the existing view in the database, I want to avoid executing it.
To generate the CREATE VIEW from the database I am using SHOW CREATE VIEW ... and comparing this with the query originally used to create the view (a sketch appears after the list below).
The primary restriction is that I need to make this decision only at the time of migration and cannot presume any prior conclusions (say, from git diff or commit history).
I have already done some searching for a solution to this:
Found no direct solution for the problem (like a SQL engine to which I can feed both queries and learn whether the results would be the same).
Decided to parse the queries, and to achieve that ended up looking into ANTLR (also used by MySQL Workbench).
ANTLR's approach looks promising, but it will require extensive rule-based parsing and creating a query-match program from scratch.
I realized that just parsing the queries is not enough; I would have to create my own POJOs to store the tokens produced by the lexer and then compare the queries based on some rules.
Even predefined POJOs, if I could find them, would allow me to quickly create a solution for this problem.
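For reference, a sketch of how the existing view's definition can be pulled for the comparison mentioned above (mydb and my_view are placeholder names; the information_schema variant is an added alternative, not from the original post):

SHOW CREATE VIEW mydb.my_view;

-- MySQL also exposes the expanded, normalized definition here,
-- which can be easier to diff than the original source text:
SELECT VIEW_DEFINITION
FROM information_schema.VIEWS
WHERE TABLE_SCHEMA = 'mydb'
  AND TABLE_NAME = 'my_view';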

MySQL: Automate Data Ingestion from regular txt/csv files to a Database

Intro
I've searched all around about this problem, but I didn't really find a source of knowledge about it, so I'm sorry if this problem seems basic to you; for me it is rather intriguing, given that I'm having a hard time guessing what keywords to use on Google in order to retrieve the proper info.
Problem Description:
As a matter of fact, I have two issues that I don't know how to deal with in a MySQL instance installed on a laptop in a Windows environment:
I have a MySQL database with 50 tables, of which 15 or 20 are tables with original data. The other tables are ones I generated from the original data tables in order to create tables that would let me analyze the data in Power BI. The original data tables are fed by dumps from an ERP database.
My issues are the following:
How would one automate the process of receiving cumulative txt/csv files (via pen drive or any other transfer mechanism), storing those files in a folder, and then updating the existing tables with the new information? Is there any reference of best practices for dealing with such a scenario?
How can I keep my database in good shape through the successive data integrations? I mean, how can I make my database scalable and responsive?
Can you point me some sources that would help me with this?
At the moment, I have imported data into the tables in 2 steps:
1st - I created the table structure with the help of the Workbench import wizard (I had to do it this way because the tables have a lot of fields - dozens of them, literally - and those fields need to be in the database). I also added primary keys and indexes to those tables;
2nd - I managed to load the data from the files into those tables using the LOAD DATA INFILE command.
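For reference, a minimal sketch of that load step (file path, table name, and delimiters are illustrative placeholders; LOCAL requires local_infile to be enabled):

LOAD DATA LOCAL INFILE 'C:/imports/orders.csv'
INTO TABLE orders
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;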
Some of the fields in the tables created with the import wizard were given the data type TEXT, which is not necessary in this scenario. I would like to convert those fields to NVARCHAR(255) or something similar. However, at this point there are a lot of fields to alter across multiple tables, and I was wondering if I can write a query that does the job of generating all the ALTER TABLE statements I need (a sketch of such a query follows below).
So my issue here is: is it safe to alter the data type of multiple fields across multiple tables (in this case, changing fields of data type TEXT to NVARCHAR(255))? What is the best way to do this? Can you point me to some sources or best practices for this, please?
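A hedged sketch of such a generator, assuming the schema is named mydb (a placeholder). It only emits the statements for review before you run them; note that MODIFY without further clauses resets NOT NULL and DEFAULT attributes:

SELECT CONCAT('ALTER TABLE `', TABLE_NAME,
              '` MODIFY `', COLUMN_NAME, '` VARCHAR(255);') AS alter_stmt
FROM information_schema.COLUMNS
WHERE TABLE_SCHEMA = 'mydb'
  AND DATA_TYPE = 'text';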
Thank you, in advance, for your help.
Cheers
You need a scripting language, not a UI. See the mysql command-line tool, the shell of your OS, etc.
1. DROP DATABASE and re-CREATE it.
2. LOAD DATA.
3. Massage the data to get the columns cleaner than what the load provided.
4. Sic the BI tool on the data.
If you want to discuss Step 3, we need details about what transformations are needed between step 2 and step 4. That includes providing the format or schema for steps 2 and 4.
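A minimal sketch of steps 1 and 2 as a script fed to the mysql command-line client (e.g. mysql < reload.sql); the database name, schema file, and CSV path are placeholders:

DROP DATABASE IF EXISTS staging;
CREATE DATABASE staging;
USE staging;
-- recreate the table structures from a saved DDL script:
SOURCE C:/imports/schema.sql
LOAD DATA LOCAL INFILE 'C:/imports/orders.csv'
INTO TABLE orders
FIELDS TERMINATED BY ',' IGNORE 1 LINES;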

Laravel 4 - copy a table from one database to another

I am using Laravel 4 to build a site that uses a large number of MySQL databases, where each database has multiple tables. The structure and organization of the databases is not within my control.
I need to be able to replicate a table from one database in another database at run time (the source database, the destination database, and the specific table within the source database all depend on choices made by the user).
Duplicating a table is easy to do with MySQL:
CREATE TABLE database2.new_table LIKE database1.original_table;
INSERT INTO database2.new_table SELECT * FROM database1.original_table;
but I cannot figure out how to do it with Laravel.
I can easily access each database by creating separate connections ('mysql1' and 'mysql2'), but I can't figure out how to construct a statement that uses both. The following doesn't work:
$success = DB::connection('mysql2')->statement('CREATE TABLE new_table LIKE database1.original_table');
because I am trying to access database1 directly without using the 'mysql1' connection, and Laravel generates an error saying that database1.original_table doesn't exist.
I feel like the solution should be obvious but don't have enough experience with Laravel to figure it out. Any guidance would be greatly appreciated.
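One hedged note, not a full Laravel answer: the SQL itself is not the obstacle. If the MySQL user behind a single connection (say, 'mysql2') is also granted privileges on database1, the fully qualified statements run over that one connection, since qualified names cross databases within a connection:

CREATE TABLE database2.new_table LIKE database1.original_table;
INSERT INTO database2.new_table SELECT * FROM database1.original_table;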

Importing records from PostgreSQL to MySQL

Was wondering if anyone had any insight or recommended tools for exporting the records from a PostgreSQL database and importing them into a MySQL database. I believe the table structure is 100% identical.
Thoughts? Thanks!
The command
pg_dump --data-only --column-inserts <database_name>
will generate SQL-standard-compliant INSERT statements with all column names listed and one VALUES clause per INSERT. This is the most portable way of moving data from PostgreSQL to any other SQL database.
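For illustration, the emitted statements take this shape (hypothetical table and values):

INSERT INTO customers (id, name, created_at) VALUES (1, 'Alice', '2020-01-01 12:00:00');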
Check out SquirrelSQL; it can pump data from one database brand into another via the DBCopy plugin. When the table structures are really identical, it works quite well.
There is a Ruby app called Taps that will do it. I've used it before with great success:
http://adam.heroku.com/past/2009/2/11/taps_for_easy_database_transfers/