I have a supervisor who has a very annoying habit of editing our MySQL database in MS Excel. He does this by exporting the tables as a CSV, opening them in Excel, editing them, saving as a CSV, and re-importing. But there are some incompatibilities between PHPMyAdmin/MySQL and Excel, and so far this has led to two major system crashes (once because he tried to import an entire database from a CSV, which obviously makes no sense since CSVs don't delineate tables; and once because Excel added two extra rows to the CSV with incompatible data).
Since he refuses to listen when I tell him to stop doing this, I'd like to disable CSV import on his copy of PHPMyAdmin. We still need to be able to import SQL files, though. Is there any way to specifically disable imports of CSV files? In the PHPMyAdmin settings, I see that you can set default values, but how can I just disable that format altogether?
You could go into the phpMyAdmin source code and remove CSV from the list of import formats. It looks like commenting out line 121 of config.values.php (where the CSV import format is registered) will sort you out. Let me know if you need further assistance.
After searching the internet for a bit, I'm pretty sure this hasn't been answered directly, so I'm asking here.
I am currently creating a Runescape (Laugh at me all you want ;P) Skilling Calculator for a School Programming Project, and am creating databases for XP values with phpMyAdmin, using information that is already on the web.
Instead of having to manually type out approximately 6000 different entries, each with 3 columns, I would rather copy and paste them, saving time and reducing the chance of errors. For example, I want to copy and paste all the information from here:
http://www.tip.it/runescape/pages/view/divination_calc.htm onto phpMyAdmin in bulk; not one entry at a time. I was wondering if this was possible in any way.
I would suggest copying and pasting the HTML table into Excel, tidying up the columns to match your database, saving as a CSV, and importing using phpMyAdmin's import function.
Here's an article I found on importing a CSV into PHPMyAdmin: importing a CSV into phpmyadmin
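If you'd rather skip the Excel step entirely, a short script can turn the HTML table into a CSV directly. Here's a minimal sketch using only the Python standard library; the table below is a made-up stand-in for the actual tip.it page, so you'd feed in the real page's HTML instead:

```python
import csv
import io
from html.parser import HTMLParser

class TableToCSV(HTMLParser):
    """Collect the text of each <td>/<th> cell, one row per <tr>."""
    def __init__(self):
        super().__init__()
        self.rows, self.row, self.in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self.row = []
        elif tag in ("td", "th"):
            self.in_cell = True
            self.cell = []

    def handle_endtag(self, tag):
        if tag == "tr" and self.row:
            self.rows.append(self.row)
        elif tag in ("td", "th"):
            self.in_cell = False
            self.row.append("".join(self.cell).strip())

    def handle_data(self, data):
        if self.in_cell:
            self.cell.append(data)

# Stand-in for the real page's HTML table:
html = """<table>
<tr><th>Level</th><th>Item</th><th>XP</th></tr>
<tr><td>1</td><td>Pale energy</td><td>1</td></tr>
<tr><td>10</td><td>Flickering energy</td><td>2</td></tr>
</table>"""

parser = TableToCSV()
parser.feed(html)

out = io.StringIO()
csv.writer(out).writerows(parser.rows)
print(out.getvalue())
```

The resulting CSV can then go straight into phpMyAdmin's import dialog.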
So, I've got a MySQL database consisting of a bunch of tables that I want to give to my uncle.
Problem is, he doesn't know much about computers, so I can't just hook him up with the database.
Instead, I would like to extract all the data from the database into a more readable format, e.g. Excel spreadsheets.
I've tried mysqldump, but that just gives me a *.sql file which doesn't help much.
Any ideas?
If you only have the command line, this answer explains how to dump into a tab-delimited file and this answer explains how to dump into a comma-delimited file. You can then import the tab-delimited file into Excel.
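If Excel doesn't open the tab-delimited dump cleanly on his machine, converting it to a comma-delimited .csv first takes only a few lines. A minimal sketch (the file names are placeholders):

```python
import csv

def tab_to_csv(src_path, dst_path):
    """Convert a tab-delimited dump into a comma-delimited CSV
    that Excel opens without going through an import wizard."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.reader(src, delimiter="\t")
        csv.writer(dst).writerows(reader)

# Tiny made-up dump for demonstration:
with open("dump.tsv", "w") as f:
    f.write("id\tname\tprice\n1\tWidget\t9.99\n")

tab_to_csv("dump.tsv", "dump.csv")
print(open("dump.csv").read())
```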
Alternatively, you can use phpMyAdmin to export to *.csv if you have PHP.
phpMyAdmin allows you to export to many different formats - why don't you access your database through that?
I know this might seem like a simple question. I've been on it for about a week now. I'll be the first to say I already solved this problem with MS Access, but my heart tells me there is an open source solution and a developer waiting for a donation. And I can't stand to use MS products if at all possible (sorry Bill).
The process I need to achieve:
Import a pipe delimited .txt file into an application that I can export from. NOTE: Millions of records.
Export from this format to .sql.
I should be able to open the resulting file via phpMyAdmin.
I've tried (with no success):
MySQL Workbench
Open Office Base
Glob
phpMyAdmin (direct import) - but even after setting the PHP and MySQL .ini limits to unlimited, the large files will not complete the import; the process dies after several million records.
MySQL Console direct queries
I'm getting burned out. I hope I am just missing something in one of the available open source tools or plugins. SOS.
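For files this size, MySQL's own LOAD DATA INFILE (with FIELDS TERMINATED BY '|') is usually the fastest route, since the server streams the file directly. If you specifically need a .sql file you can open via phpMyAdmin or pipe to the mysql client, a short script that batches the rows into multi-row INSERTs also works without loading everything into memory. A minimal sketch; the table and column names are placeholders, and the quoting is a naive assumption (it handles quotes and backslashes, not every MySQL edge case):

```python
import csv

def pipe_to_sql(src_path, dst_path, table, batch_size=1000):
    """Stream a pipe-delimited file into batched multi-row INSERT statements."""
    def quote(v):
        # Escape backslashes and single quotes for MySQL string literals.
        return "'" + v.replace("\\", "\\\\").replace("'", "''") + "'"

    with open(src_path, newline="") as src, open(dst_path, "w") as dst:
        reader = csv.reader(src, delimiter="|")
        header = next(reader)            # assume first line holds column names
        cols = ", ".join(header)
        batch = []
        for row in reader:
            batch.append("(" + ", ".join(quote(v) for v in row) + ")")
            if len(batch) >= batch_size:
                dst.write(f"INSERT INTO {table} ({cols}) VALUES\n"
                          + ",\n".join(batch) + ";\n")
                batch = []
        if batch:
            dst.write(f"INSERT INTO {table} ({cols}) VALUES\n"
                      + ",\n".join(batch) + ";\n")

# Tiny demo file:
with open("data.txt", "w") as f:
    f.write("id|name\n1|Ann\n2|O'Brien\n")

pipe_to_sql("data.txt", "out.sql", "people")
print(open("out.sql").read())
```

Feeding the output to `mysql dbname < out.sql` on the command line avoids phpMyAdmin's upload and memory limits entirely.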
How can I import initial table data to a .mwb file? I know that there is an inserts tab for each table, but I would like to import around 200 records and I don't want to do this by hand.
It is not possible with the modern version of MySQL Workbench. There is, essentially, no way to model data - you can only upload it to the server (not the model). The only way currently is to edit rows one by one, which isn't practical. Even if you reverse engineer a table filled with data, the Inserts tab of the EER model will be blank. You'll note that right-clicking on a row of the Inserts tab gives a number of greyed-out options, including "load from file". I suspect the team didn't have time to implement them. Anyway, there is a simple workaround if you know phpMyAdmin, which seems to handle CSV files well (unlike MySQL Workbench, which I have not gotten to work with CSV files at all).
Solution:
Draw your DB model in MySQL Workbench, structure only. Place all your data in associated CSV files - I use Excel and save as CSV, which is very easy. Once you have your data modeled in Excel and the structure modeled in Workbench, forward engineer the DB, then use some other tool or technique to upload your Excel-modeled data.
Not the greatest solution, but bug them to provide data modeling and maybe we'll be lucky in the next version.
Currently this seems not to be possible. I too was hoping to be able to reverse engineer from the INSERT statements in a script file, but 1. it didn't work :P and 2. the documentation actually explicitly states that these will be ignored:
http://download.oracle.com/docs/cd/E19078-01/mysql/mysql-workbench/wb-data-modeling.html#wb-reverse-engineering
7.7.9.1. Reverse Engineering Using a Create Script
Reverse engineering using a create script is done by using the File, Import, Reverse Engineer MySQL Create Script ... menu options. Doing this opens a file open dialog box with the default file type set to an SQL script file, a file with the extension sql.
You can create a data definition (DDL) script by executing the mysqldump db_name --no-data > script_file.sql command. Using the --no-data option ensures that the script contains DDL statements only. However, if you are working with a script that also contains DML statements you need not remove them; they will be ignored.
It seems that the lesson is that we ought to handle such resources (that are too large to be manually inserted) through some other medium, such as a versioned sql file. :(
I'm using Firebird database and I need to load Excel file into a database table. I need a tool that does this well. I tried some I found on Google, but all of them have some bugs.
Since the Excel data is not created by me, it would be good if the tool could scan the file, discover what kind of data is inside, and suggest a table to be created in the database.
Also, it would be nice if I could compare the file against the data that is already in the database table, and pick which data to load and which not.
Tools that load CSV files are also fine, I can "Save as" CSV from Excel before loading.
Well, if you can use CSV, then I guess XMLWizard is the right tool for you. It can load a CSV file and compare it with the database data, and you can select the changes you wish to make to the table.
Don't let the name fool you; it does work with XML, but it also works very well with CSV files. It can also estimate the column datatypes and offer a CREATE TABLE statement for your file.
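The datatype-guessing part is also easy to approximate yourself if a tool lets you down. A minimal sketch that scans a CSV sample and proposes a CREATE TABLE; the type mapping here is a naive assumption for illustration, not what XMLWizard actually does:

```python
import csv
import io

def guess_type(values):
    """Naive type inference: INTEGER, then DOUBLE PRECISION, else VARCHAR."""
    try:
        for v in values:
            int(v)
        return "INTEGER"
    except ValueError:
        pass
    try:
        for v in values:
            float(v)
        return "DOUBLE PRECISION"
    except ValueError:
        pass
    # Fall back to a VARCHAR sized to the widest sample value.
    return f"VARCHAR({max(len(v) for v in values)})"

def create_table_from_csv(text, table):
    rows = list(csv.reader(io.StringIO(text)))
    header, data = rows[0], rows[1:]
    cols = [
        f"  {name} {guess_type([r[i] for r in data])}"
        for i, name in enumerate(header)
    ]
    return f"CREATE TABLE {table} (\n" + ",\n".join(cols) + "\n);"

sample = "id,name,price\n1,Widget,9.99\n2,Gadget,12\n"
print(create_table_from_csv(sample, "items"))
```

On a real file you'd want to sample more rows and treat empty strings as NULLs, but this gets a usable starting DDL.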
Have you tried FSQL?
It's freeware, very similar to Firebird's standard ISQL, but with some extra features, like importing data from CSV files.
I've used it with DBF files and it worked fine.
There is also the EMS Data Import tool for Firebird and InterBase:
http://www.sqlmanager.net/en/products/ibfb/dataimport
It's not free, but it accepts a big variety of formats, including CSV and Excel.
EDIT
Another similar payware tool is Firebird Data Wizard http://www.sqlmaestro.com/products/firebird/datawizard/
There are some online tools which can help you generate DDL/DML scripts from a CSV header or sample dump file; check out: http://www.convertcsv.com/csv-to-sql.htm
You can then use SQL Workbench/J's Data Pumper or its WbImport tool from the command line.
Orbada has a GUI which also supports importing CSV files.
DBeaver Free edition also supports importing CSV out of the box.
BULK INSERT
Another way is to build formulas in Excel, in new cells next to the data you want to export. The formulas format each value as a fixed-length string, padded to the length of the corresponding field in your Firebird table. You can then copy all those cells from Excel and paste them into a text editor, which makes it possible to use the BULK INSERT strategy in Firebird.
See more details in http://www.firebirdfaq.org/faq209/
The problem is BLOB or NULL data: check whether your data contains such values before deciding whether this route works for you.
If your data is already formatted in a text file, BULK INSERT is a quick way.
Hint: you can also disable the triggers and indexes associated with your table to speed up the BULK INSERT, and re-enable them afterwards.
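The padding formulas described above can also be done in a script instead of Excel. A minimal sketch that pads each field to the fixed width of its CHAR column so the file can be read as a Firebird external table; the table name, column widths, and the trailing newline column are assumptions for the example (external-table records are fixed-length, so a CHAR(1) end-of-line column is a common trick to keep the file line-oriented):

```python
# Fixed-width records for a Firebird external table declared roughly as:
#   CREATE TABLE ext_items EXTERNAL FILE 'items.dat'
#     (id CHAR(6), name CHAR(20), price CHAR(10), eol CHAR(1));
# Each record is the concatenated fixed-width fields plus a newline
# filling the eol column.
widths = [6, 20, 10]
rows = [
    ("1", "Widget", "9.99"),
    ("2", "Gadget", "12.50"),
]

with open("items.dat", "w") as f:
    for row in rows:
        f.write("".join(v.ljust(w) for v, w in zip(row, widths)) + "\n")

print(open("items.dat").read())
```

From there, `INSERT INTO items SELECT ... FROM ext_items` (with casts as needed) does the bulk load inside the server.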
Roberto Novakosky
I load the Excel file into a Lazarus spreadsheet (fpspreadsheet) and then export to the Firebird DB. Everything is fine; the only problem is that fpspreadsheet will treat a string field containing only numbers as a number field. I can check the titles in the first row to see whether the Excel file is valid or not.
As far as I can see, all replies so far focus on tools that essentially read the Excel (or CSV) file and use SQL inserts to put the records into the Firebird database. While this works, I have always found this approach painfully slow.
That's why I created a tool that reads an Excel file and writes one file that has a (text) format suitable for Firebird external table (including support for UTF8 char columns) and one DDL file to create the external table in Firebird.
I then use regular SQL to select from the external table, cast as needed, and insert into whatever normal Firebird table I want. The performance with this approach is orders of magnitude faster than SQL inserts from a client app in my experience.
I would be willing to publish the tool. It's written in C#. Let me know if there's any interest.