How can I import initial table data to a .mwb file? I know that there is an inserts tab for each table, but I would like to import around 200 records and I don't want to do this by hand.
It is not possible with the current version of MySQL Workbench. Essentially, there is no way to model data - you can only upload it to the server, not into the model. The only option at the moment is to edit records one by one, which isn't practical. Even if you reverse engineer a table filled with data, the Inserts tab of the EER model will be blank. You'll note that right-clicking on a row of the Inserts tab gives a number of greyed-out options, including "load from file". I suspect the team simply hasn't had time to implement them yet. Anyway, there is a simple workaround if you know phpMyAdmin, which seems to handle CSV files well; MySQL Workbench itself I have not gotten to work with CSV files at all.
Solution:
Draw your DB model in MySQL Workbench, structure only. Place all your data in associated CSV files - I use Excel and save as CSV - very easy. Once you have your data modeled in Excel and the structure modeled in Workbench, forward engineer the DB, then use some other tool or technique to upload your Excel modeled data.
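For that last upload step, one option is MySQL's LOAD DATA INFILE once the forward-engineered schema exists on the server. A minimal sketch, assuming a hypothetical customers table and a customers.csv saved from Excel with a header row:

    -- load the Excel-exported CSV into the forward-engineered table
    -- (table name, file path and column list are placeholders - adjust to your model)
    LOAD DATA LOCAL INFILE '/path/to/customers.csv'
    INTO TABLE customers
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\r\n'
    IGNORE 1 LINES
    (id, name, email);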
Not the greatest solution, but bug them to provide data modeling and maybe we'll be lucky in the next version.
Currently this does not seem to be possible. I too was hoping to be able to reverse engineer from the INSERT statements in a script file, but 1. it didn't work :P and 2. the documentation actually explicitly states that these will be ignored:
http://download.oracle.com/docs/cd/E19078-01/mysql/mysql-workbench/wb-data-modeling.html#wb-reverse-engineering
7.7.9.1. Reverse Engineering Using a Create Script
Reverse engineering using a create script is done by using the File, Import, Reverse Engineer MySQL Create Script ... menu options. Doing this opens a file open dialog box with the default file type set to an SQL script file, a file with the extension sql.
You can create a data definition (DDL) script by executing the mysqldump db_name --no-data > script_file.sql command. Using the --no-data option ensures that the script contains DDL statements only. However, if you are working with a script that also contains DML statements you need not remove them; they will be ignored.
It seems that the lesson is that we ought to handle such resources (that are too large to be manually inserted) through some other medium, such as a versioned sql file. :(
Related
I inherited the maintenance of a small web forum. Near as I can tell, it is powered by a MySQL database on the backend (the frontend is all PHP).
I need to extract some of the data (which also involves searching for the data I need to extract), but I don't want to touch the production database. I exported a database backup, which produced a several-hundred-megabyte .sql file.
What's the best way to mine these data? I can see several options:
grep through the .sql script in text mode, trying to extract the relevant data
Load it up in sqlite3 (I tried doing this, but it barfed on some of the statements in the script and didn't produce any tables. I have no database experience whatsoever though, so I haven't ruled it out as a dead end just yet).
Install MySQL on my home box, create a database, and execute the .sql script to recreate the data. Then just attach some database explorer tool.
Find some (Linux) app which can understand the .sql file natively (seems unlikely after a bit of Googling).
Any pointers to which of these options (or one I haven't thought of yet) would be the most productive?
I would say any option might work, but for data mining you definitely want to load it into a new database so you can start querying and building reports on the data. I would load it up on your home box; no need to have it remote.
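For option 3, a minimal sketch from the mysql command-line client, assuming the dump is named forum_backup.sql (a placeholder) and MySQL is already installed on the home box:

    -- create a scratch database and replay the dump into it
    CREATE DATABASE forum_copy;
    USE forum_copy;
    -- SOURCE is a mysql client command; the path is a placeholder
    SOURCE /path/to/forum_backup.sql
    SHOW TABLES;  -- sanity check before you start querying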
If I were to want to create a PHP function that does the same thing as the Export tab in phpMyAdmin, how could I do it? I don't know if there is a MySQL function that does this or if phpMyAdmin just builds the export file (in SQL that is) manually. Without shell access. Just using PHP.
I tried the documentation for mysqldump, but that seemed to require using the shell. I'm not quite sure what that even is -- maybe my question is: how do you use shell?
My silly idea is to allow non-technical users to build a site on one server (say a localhost) using MySQL, then export the site, database and all, to another server (e.g. a remote server).
I think I'm pretty clear on the Import process.
You can check the phpMyAdmin source code (an advantage of open-source software). Check the export.php script and the supporting functions in the libraries/export/sql.php script file.
In summary, what phpMyAdmin does is:
get a list of the tables in the given database (SHOW TABLES FROM...),
get the create query for each table (SHOW CREATE TABLE...),
parse it and extract column definitions from it,
get all data (SELECT * FROM...)
build a query according to column data.
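A rough sketch of that statement sequence, with my_database and my_table as placeholder names; the looping, escaping, and writing of the dump file would happen in your PHP code around these queries:

    -- 1. list the tables in the database being exported
    SHOW TABLES FROM my_database;

    -- 2. for each table, get its definition (phpMyAdmin parses this output)
    SHOW CREATE TABLE my_database.my_table;

    -- 3. pull every row; each row becomes (part of) one INSERT in the dump
    SELECT * FROM my_database.my_table;

    -- 4. the generated dump then roughly contains, per table:
    --    CREATE TABLE `my_table` (...);
    --    INSERT INTO `my_table` (`col1`, `col2`) VALUES ('v1', 'v2'), ...;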
I've written similar code for my own apps (for backup purposes, when the GPL license of phpMyAdmin doesn't allow me to use it), however I use DESCRIBE to get the column definitions. I think they parse the SHOW CREATE TABLE output instead because it contains more information than the DESCRIBE output.
Generating SQL statements this way requires a bit of care with escaping, but it allows for some flexibility, as you can convert types, filter or sanitize data, etc. It is also a lot slower than a tool like mysqldump, and you should take care not to consume all available memory (write soon, write often, don't keep everything in memory).
If you are going to implement a migration process (from server to server), it may be easier to do it with some shell scripting and a direct call to mysqldump, unless you have to do everything in PHP.
I've built an EER model in MySQL Workbench that I forward engineer to create the database. The forward engineering works perfectly, and the database is created from the diagram as expected.
Apart from tables, there are also some Stored Procedures (aka Routines) that I've included in the model. These routines are designed to only be run once, as soon as the database has been set up. They automatically insert necessary data into the tables.
My question is, how can I get the forward engineering process to automatically call/execute one of these routines once the tables have been created.
At the moment, I have to forward engineer the database, and then manually call the stored procedures?
In your EER diagram in Workbench, right-click on a table and select Edit Table. This will open a pane at the bottom with a number of tabs: Table, Columns, Indexes, Foreign Keys, etc. There is also a tab called Inserts, which allows you to insert records into the model database.
When you click on the Inserts tab it will show a grid. Add the records you want to insert in this grid, and make sure you commit these records.
Now, when you forward engineer the database, there is an option on the very first screen to generate INSERT statements. Tick that option, forward engineer, and the data you want inserted will be included in the generated script. Save the script so you can run it over and over without going into MySQL Workbench.
I have not found options to update, delete or do other data manipulation in the workbench but I think this is what you are looking for.
NOTE: You won't be able to import the records directly via Workbench; there is no option for that. You can save the records to a file, but to import/create them you would need to add them one at a time from the modeller. You could, however, make a backup of a MySQL database that already contains the records, and then copy the INSERT statements from that backup script into your setup script.
Steps would be:
Create database.
Import the files with the setup/config records into the newly created database
Backup database
Open the backup file, then copy and paste the INSERT statements you are looking for into the setup script created by MySQL Workbench
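The end result is a single setup script, roughly like this (the table and values here are just examples):

    -- generated by MySQL Workbench forward engineering
    CREATE TABLE config (
      id INT NOT NULL,
      name VARCHAR(45) NULL,
      PRIMARY KEY (id));

    -- pasted in from the backup of the already-populated database
    INSERT INTO config (id, name) VALUES (1, 'default_language');
    INSERT INTO config (id, name) VALUES (2, 'default_currency');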
UPDATE:
I did some experimenting: when you get to the "review the SQL script to be executed" step of the forward engineering, you can also edit the script so that the stored procedures (as you mentioned) are called at the end. Once done, save the script to file and test.
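In other words, at the review step you can append something like this to the end of the generated script before saving it (the procedure names here are just examples):

    -- ...CREATE TABLE statements and generated INSERTs from forward engineering above...

    -- added by hand at the review step: run the one-off setup routines
    CALL populate_lookup_tables();
    CALL insert_default_settings();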
Hope that helps!
I have a client who got a zipped file containing all the data from the database of the SaaS app they were using. Now, we have a similar app, but our column names are different (obviously) and in some cases we have fewer columns. So, I want to upload all this data to my database, but I am not sure how to do it.
I run phpmyadmin on the servers.
Extract the file on your desktop.
Login to your phpMyAdmin account.
Click the Import tab.
Select the file to import, file format, etc. and click Go.
Browse through the structure of the imported database to the columns of interest. For each column, click the pencil icon to edit the column (i.e. rename it), or click the X icon to delete it.
To merge data sets: after importing the tables, you would need to run your own query in the SQL tab to combine them.
Those are two different tasks in one question.
phpMyAdmin is able to import ZIP files directly - you don't need to extract them on your local machine. Also be aware of maximum upload sizes and maximum script execution times when importing huge database dumps.
Mapping an existing database to another structure involves a lot of manual work, like renaming tables and columns and copying data from one table to another. I would suggest you import the old/original database into some "working copy" database and keep your new database separate. That way you can use MySQL features (INSERT INTO new_db.YX … SELECT XY_a FROM old_db.XY) to copy the data where it should go.
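A sketch of that copy step, with made-up database, table, and column names:

    -- copy and remap data from the imported working copy into the new schema
    INSERT INTO new_db.customers (customer_name, customer_email)
    SELECT old_name, old_email
    FROM old_db.clients
    WHERE old_email IS NOT NULL;  -- filter/sanitize while copying if needed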
Well, first you need to take a look at the data files and see how the columns/tables differ. After you sort that out, you can go about figuring out how to insert the data. If the files are large and there are quite a few of them, I wouldn't use phpMyAdmin. I'd SSH into the box and use the command-line client, or set the DB up for remote access and use a local copy of the client.
If you're lucky, you won't have to do any processing on the data and you can just map values from the old columns to the new columns as part of your LOAD DATA INFILE statement. Whatever you do, you'll want to test all this on a dummy db first before you go running it in a live environment.
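A hedged sketch of that kind of column mapping, with hypothetical file, table, and column names; columns from the old file that you don't need can be read into throw-away user variables:

    LOAD DATA LOCAL INFILE '/path/to/old_export.csv'
    INTO TABLE customers
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    IGNORE 1 LINES
    (@old_id, name, @unused_col, email)  -- positions follow the old file's columns
    SET legacy_id = @old_id;             -- remap an old value into a new column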
I'm using a Firebird database and I need to load an Excel file into a database table. I need a tool that does this well. I tried some I found on Google, but all of them have some bugs.
Since the Excel data is not created by me, it would be good if the tool could scan the file, discover what kind of data is inside, and suggest a table to be created in the database.
Also, it would be nice if I could compare the file against the data that is already in the database table and pick which data to load and which not.
Tools that load CSV files are also fine, I can "Save as" CSV from Excel before loading.
Well, if you can use CSV, then I guess XMLWizard is the right tool for you. It can load a CSV file and compare it with the database data, and you can select the changes you wish to make to the table.
Don't let the name fool you: it does work with XML, but it also works very well with CSV files. It can also estimate the column datatypes and offer a CREATE TABLE statement for your file.
Have you tried FSQL?
It's freeware, very similar to Firebird's standard ISQL, but with some extra features, like importing data from CSV files.
I've used it with DBF files and it worked fine.
There is also the EMS Data Import tool for Firebird and InterBase:
http://www.sqlmanager.net/en/products/ibfb/dataimport
It's not free, but it accepts a wide variety of formats, including CSV and Excel.
EDIT
Another similar payware tool is Firebird Data Wizard http://www.sqlmaestro.com/products/firebird/datawizard/
There are some online tools which can help you generate DDL/DML scripts from a CSV header/sample dump file; check out http://www.convertcsv.com/csv-to-sql.htm
You can then use sql-workbench's Data Pumper or its WbImport tool from the command line.
Orbada also has a GUI which supports importing CSV files.
The free edition of DBeaver also supports importing CSV out of the box.
BULK INSERT
Another way is to build formulas in Excel, in new cells, holding the data you want to export. The formulas format each value as a string padded to the length of the corresponding field in Firebird. You can then copy all these cells from Excel and paste them into a text editor, which makes it possible to use the BULK INSERT strategy in Firebird.
See more details at http://www.firebirdfaq.org/faq209/
The problem is if you have BLOB or NULL data to import, so check whether you have those kinds of values and whether this approach works for you.
If you have the data formatted in a text file, BULK INSERT will be a quick way.
Hint: You can also disable the triggers and indexes associated with your table to accelerate the BULK INSERT, and re-enable them afterwards.
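For that hint, a sketch in Firebird SQL (trigger and index names are placeholders):

    -- deactivate triggers and indexes on the target table before the bulk load
    ALTER TRIGGER trg_customers_bi INACTIVE;
    ALTER INDEX idx_customers_name INACTIVE;

    -- ...run the bulk insert here...

    -- reactivate them afterwards
    ALTER TRIGGER trg_customers_bi ACTIVE;
    ALTER INDEX idx_customers_name ACTIVE;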
Roberto Novakosky
I load the Excel file into a Lazarus spreadsheet and then export it to the Firebird DB. Everything is fine; the only problem is that fpspreadsheet will treat a string field containing only numbers as a number field. I can check the titles in the first row to see whether the Excel file is valid or not.
As far as I can see, all replies so far focus on tools that essentially read the Excel (or CSV) file and use SQL INSERTs to insert the records into the Firebird database. While this works, I have always found this approach painfully slow.
That's why I created a tool that reads an Excel file and writes one file in a (text) format suitable for a Firebird external table (including support for UTF8 char columns) and one DDL file to create the external table in Firebird.
I then use regular SQL to select from the external table, cast as needed, and insert into whatever normal Firebird table I want. The performance with this approach is orders of magnitude faster than SQL inserts from a client app in my experience.
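A rough sketch of that approach in Firebird SQL; the file path, table, and column widths are made up, and the real widths have to match the fixed-width text file the tool writes (external tables also require ExternalFileAccess to be enabled in the server configuration):

    -- DDL generated by the tool: an external table over the fixed-width text file
    CREATE TABLE ext_customers EXTERNAL FILE 'C:\data\customers.txt' (
      name  CHAR(50),
      email CHAR(80),
      eol   CHAR(2)   -- record terminator (CR/LF)
    );

    -- then regular SQL: cast/trim as needed and insert into the normal table
    INSERT INTO customers (name, email)
    SELECT TRIM(name), TRIM(email)
    FROM ext_customers;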
I would be willing to publish the tool. It's written in C#. Let me know if there's any interest.