Import SQL file as replace with phpMyAdmin - MySQL

How can I import an SQL file (yes, SQL, not CSV) with phpMyAdmin so that it replaces or updates existing data while importing?
I did not find an option for that. I also created a temporary database and imported the SQL file in question there (it contains only INSERT lines, only data, no structure), then went to the export page hoping to find a suitable option like INSERT ... SELECT ... ON DUPLICATE KEY UPDATE, but did not find one or anything else that would help in this situation.
So how can I achieve that? If not with phpMyAdmin, is there a program that transforms an "insert" SQL file into an "update on duplicate" one, or even from "insert" to "delete", after which I could re-import the original file?
How I came to this, in case it helps with the above or someone has a better solution for the earlier steps:
I have a semi-large (1 GB) DB dump file to import, which I divided into multiple smaller files to get it imported. One of them is the structure dump and the rest are the data. When I was still trying to get the large file through, adjusting the timeout settings via .htaccess or phpMyAdmin's import options did not help; I always hit the timeout anyway. Since those did not work, I found a program by Janos Rusiczki (https://rusiczki.net/2007/01/24/sql-dump-file-splitter/) to split the SQL file into smaller ones (good program, thanks Janos!). It also separates the structure from the data.
However, after 8 successful imports I hit the timeout again, after phpMyAdmin had already imported part of the file. That is how I ended up in the current situation. I know I could always delete everything and start over with even smaller partial files, but I am sure there is a better way to do this. There has to be a way to replace the data on import, or some other approach like the ones described above.
Thanks for any help! :)

Cribbing from INSERT ... ON DUPLICATE KEY (do nothing), you can use a regular expression to turn every INSERT in your SQL file into an INSERT IGNORE, and the import will then pass over all the entries that have already been imported.
Note that this will also silently skip other insert errors, but errors other than the duplicates left over from the interrupted import don't seem likely in this context.
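For example, a simple find-and-replace over the dump turns each statement from the first form below into the second (the table and columns here are made-up placeholders):
INSERT INTO `mytable` (`id`, `name`) VALUES (1, 'foo');
INSERT IGNORE INTO `mytable` (`id`, `name`) VALUES (1, 'foo');
With IGNORE, a row whose primary or unique key already exists is silently skipped instead of aborting the whole import.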

Related

Erasing records from text file after importing it to MySQL database

I know how to import a text file into a MySQL database by using the command
LOAD DATA LOCAL INFILE '/home/admin/Desktop/data.txt' INTO TABLE data
The above command will write the records of the file "data.txt" into the MySQL database table. My question is that I want to erase the records from the .txt file once they are stored in the database.
For example: if there are 10 records and at the current point in time 4 of them have been written into the database table, I require that these 4 records get erased from the data.txt file simultaneously. (In a way, the text file acts as a "queue".) How can I accomplish this? Can this be done in Java, or should a scripting language be used?
Automating this is not too difficult, but it is also not trivial. You'll need something (a program, a script, ...) that can:
Read the records from the original file,
Check if they were inserted and, if not, copy them to another file,
Rename or delete the original file, and rename the new file to replace the original one.
There might be better ways of achieving what you want to do, but that's not something I can comment on without knowing your goal.

phpMyAdmin: put the most recent database on the server

Is there any better way than going through the tables one by one, adding the missing columns, whenever some fields/tables need to be added because of the most recent changes in the app?
For example, I work on localhost, and when I finish a new version of my app I upload all the files to my server via FTP; sometimes I have also made changes to my local database in the meantime, which means I need to update the database on my server as well.
Is there any better way to add/edit the columns/tables without touching the existing data? Some columns also get deleted, etc.
Hopefully you've thought your database design through so that making changes to the structure is a rare occurrence. If you're making regular changes to the number of columns or adding tables, it's likely a sign that you haven't normalized your database structure sufficiently.
Anyway, I'd script it as an SQL file that you deploy (which you can then run through phpMyAdmin, the command line, or any other means you prefer for executing SQL queries). This has the added advantage of being something you can easily apply across your development and production databases, send to customers, and, if you wish, store in version control so you know exactly when you made each change to the database.
This way, you'll end up with an SQL file that has a couple of statements like
ALTER TABLE `foo` ADD `new` INT NOT NULL;
or something similar.
As for how you'd make the file, probably the easiest way is to grab the generated SQL statement from phpMyAdmin after modifying the table -- the SQL code used to make the change is shown near the top of the next page. Copy and paste that into a new text file to create your SQL file. You may wish to add, as the first line,
use `baz`;
using your database name instead of "baz". That way you don't have to specify on import which database the changes are meant for.
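Putting the pieces together, a deployable migration file might look something like this (reusing the placeholder names from above; the DROP line is just another hypothetical change):
use `baz`;
ALTER TABLE `foo` ADD `new` INT NOT NULL;
ALTER TABLE `foo` DROP COLUMN `old`;
Running the same file against both your development and production databases keeps their structures in step.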
Hope this helps.

Import some database entries through PHPMyAdmin with overwrite

I exported a couple of entries through phpMyAdmin from a database stored on my local MySQL server, and I'd like to replace only those entries in my destination database hosted online. Unfortunately, when I try to do so, phpMyAdmin says that those entries already exist and therefore it can't overwrite them.
It would take me a lot of time to search for those entries manually among the rest of the posts and delete them one at a time, so I was wondering if there's any workaround to overwrite those entries on import.
Thanks in advance!
A great option is to handle this on your initial export from phpMyAdmin locally. When exporting from phpMyAdmin:
Export method: Custom
Format: SQL
Format-specific options - choose "data" (instead of "structure" or "structure and data")
In Data creation options - Function to use when dumping data: Switch "Insert" to "Update" <-- This is the ticket!
Click Go!
Import into your production database. (Always back up your production database beforehand, just in case.)
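With "Update" selected, the exported file should contain UPDATE statements keyed on the primary key instead of INSERTs, along these lines (table and column names invented for illustration):
UPDATE `posts` SET `title` = 'Hello', `body` = 'Lorem ipsum' WHERE `id` = 42;
Importing that file then overwrites the matching rows instead of failing on duplicate keys.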
I know this is an old post, but it actually helped me find a solution built into phpMyAdmin. Hope it helps someone else!
This is a quick and dirty way to do it. Others may have a better solution:
It sounds like you're trying to run INSERT queries, and phpMyAdmin is telling you they already exist. If you use UPDATE queries, you could update the info.
I would copy the queries you have there into a text editor, preferably one that can handle find and replace, like Notepad++ or Gedit, and then do some replacements to change the queries from INSERT to UPDATE.
See: http://dev.mysql.com/doc/refman/5.0/en/update.html
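As a rough before/after of that find-and-replace (placeholder table and values):
INSERT INTO `posts` (`id`, `title`) VALUES (42, 'Hello');
UPDATE `posts` SET `title` = 'Hello' WHERE `id` = 42;
Note that the UPDATE form needs a WHERE clause on the key column, so this takes a bit more than one mechanical replace.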
OR, you could just delete them, then run your INSERT queries.
You might be able to use some logic with find and replace to make a DELETE query that gets rid of them first.
http://dev.mysql.com/doc/refman/5.0/en/delete.html
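For example (again with placeholder names), each dump line such as
INSERT INTO `posts` (`id`, `title`) VALUES (42, 'Hello');
could have a matching statement generated in front of it:
DELETE FROM `posts` WHERE `id` = 42;
after which the original INSERT runs cleanly.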
Check out INSERT ... ON DUPLICATE KEY UPDATE. You can either add that syntax to your locally stored entries, or import into a temporary database and then run an INSERT ... SELECT ... ON DUPLICATE KEY UPDATE. If you could post a schema, it would help us guide you better.
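A minimal sketch of the temporary-database route, assuming the live and temporary databases are called livedb and tempdb and the table has these hypothetical columns:
INSERT INTO livedb.posts (id, title, body)
SELECT id, title, body FROM tempdb.posts
ON DUPLICATE KEY UPDATE title = VALUES(title), body = VALUES(body);
Rows with new keys are inserted; rows whose key already exists get their columns overwritten with the imported values.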

How to update the MySQL database when a large csv file gets modified

Initially, I created a database called "sample" and loaded the data from a massive CSV file.
Whenever I have small changes in the .csv file (some data added/deleted/modified), I have to apply them to the database too. Re-importing the entire (large) .csv file every time is not efficient.
Is there an efficient way to get just the modified data from the .csv file into the database?
There's no simple way of doing this.
One plausible way would be to store the old version of the CSV somewhere, run a diff program between the old and new versions, and then use the resulting output to determine what has been added, changed, or removed, and update the database accordingly.
This is, however, a bit unreliable, slow, and would take some effort to implement. If you can, it would probably be better to adapt the source of the CSV file to update the database directly.
Since you also want to delete entries that no longer exist in the csv file, you will have to load the complete csv file every time (truncating the table first) in order to get a 1:1 copy.
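In SQL terms, that reload amounts to something like this (the path, table, and CSV format options are placeholders; IGNORE 1 LINES assumes a header row):
TRUNCATE TABLE sample_data;
LOAD DATA LOCAL INFILE '/path/to/data.csv'
INTO TABLE sample_data
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;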
For a more convenient synchronization you will probably have to use some scripting language (PHP, Python, etc.).
Sorry, that's all I know...
In my experience it's almost impossible to treat a data file that changes regularly as your "master data set": unless you can somehow generate a diff file that shows where the master data changed, you will always be forced to run through the entire csv file, query the database for the corresponding record, and then either do nothing (if identical), insert (if new), or update (if modified). In many cases it will even be faster to just drop the table and reload the entire thing, but that can lead to serious operational problems.
Therefore, if it's at all possible for you, I'd treat the database as the master data and generate the csv file from it.
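If the database is the master, the csv can be regenerated on demand with something like this (path and table are placeholders; note that OUTFILE is written by the MySQL server itself and the target file must not already exist):
SELECT * FROM sample_data
INTO OUTFILE '/tmp/data_export.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';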

Appending rows in a database using Toad and Excel

Friends, I am using Toad for MySQL and have a huge database ready and validated.
Now I have an Excel file which contains data entries for a particular table, and I am able to successfully import the data into the DB using the import wizard, mapping the header row to the column names, etc.
But now I have appended a few new data entries to it which I wish to insert into the database. However, the old values also get selected and hence cause a primary key violation, as those entries already exist. There is a truncate table option, but I don't wish to use it, as there may be many files from which I have inserted the data.
I tried my level best but didn't find any solution, at least in Toad for MySQL. Please tell me what to do! The solution may be simple, but I need it SOS.
An option may be to not append records to that Excel file, but to create a new Excel file with only the new records.