Appending rows in a database using Toad and Excel - MySQL

Friends, I am using Toad for MySQL and have a huge database ready and validated.
Now I have an Excel file which contains data entries for a particular table, and I am able to import that data into the database using the import wizard, mapping the first-row header to the column names, etc.
But now I have appended a few new entries to that file which I wish to insert into the database. However, the old values also get selected and cause a primary key violation, since those entries already exist! There is a truncate-table option, but I don't want to use it, as I have inserted data from many other files as well.
I tried my best but didn't find a solution, at least not in Toad for MySQL. Please tell me what to do! The solution may be simple, but I need it urgently.

One option would be to not append records to that Excel file, but instead to create a new Excel file containing only the new records.
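Alternatively, if you want to keep re-importing the full file, a common workaround is to import it into a staging table and merge only the new rows from there. A minimal sketch, assuming a hypothetical target table mytable whose primary key causes the violation:

CREATE TABLE mytable_staging LIKE mytable;                  -- same structure, no data
-- import the whole Excel file into mytable_staging with the import wizard, then:
INSERT IGNORE INTO mytable SELECT * FROM mytable_staging;   -- rows whose primary key already exists are skipped
DROP TABLE mytable_staging;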

Related

MySQL Table Import always missing 1 row

Every week I import a table of our clients into MySQL, and every week a random client is missing from that list. The same client might have been imported without problems the week before and then the next week imported perfectly fine again, yet MySQL will always omit one client from the import, which makes my job very frustrating when trying to find who it was and fix it manually.
Has anyone run into similar issues in the past, or have any idea why they might be happening?
I am using the MySQL table import wizard. Normally I don't have issues, but this specific client table export, which I get from another platform, has this issue every single week.
I would import from the Excel file into a TEMPORARY table (i.e. not the final table), but one with the same structure. Then, once that is done and all names and columns have been parsed into it, you can LEFT JOIN from the LIVE table to the TEMP table where the entry exists in LIVE but not in TEMP; this would help find the missing import. Then, since the data is already in a valid table structure, you can pull all new values from the temp table into production, including adding any rows that did not yet exist.
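A minimal sketch of that comparison and merge, assuming a live table named clients, a staging copy clients_temp, and a hypothetical key column client_id:

CREATE TABLE clients_temp LIKE clients;      -- same structure as the live table
-- import this week's Excel export into clients_temp, then:
SELECT l.*
FROM clients AS l
LEFT JOIN clients_temp AS t ON t.client_id = l.client_id
WHERE t.client_id IS NULL;                   -- rows in LIVE that did not arrive in this week's import
INSERT INTO clients
SELECT t.*
FROM clients_temp AS t
LEFT JOIN clients AS l ON l.client_id = t.client_id
WHERE l.client_id IS NULL;                   -- pull in rows that are new this week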
If you are exporting to Excel using the Export wizard, it creates a sheet with the name of the table, but it also creates a named range with the same name. The named range covers only the region that contained data at export time, so when you add new data, this named range is not extended. Then, when you later import the data, you probably chose the name without the $ character at the end. The name without the $ refers to the named range, which probably doesn't contain your newly inserted data. To load all the data, choose the name with the $ character, or extend the named range to cover all the data in the sheet.

import sql file as replace with phpmyadmin

How can I import an SQL file (yes, SQL, not CSV) with phpMyAdmin so that it replaces or updates the data while importing?
I did not find an option for that. I also created another temporary database where I imported the SQL file in question (it contains only INSERT lines, only data, no structure), and then went to the export screen to look for a suitable option like INSERT ... SELECT ... ON DUPLICATE KEY UPDATE, but did not find one, or anything else that would help in this situation.
So how can I achieve that? If not with phpMyAdmin, is there a program that transforms an "insert" SQL file into "update on duplicate", or even from "insert" into "delete", after which I could re-import with the original file?
How I came to this, in case it helps with the above or someone has better solutions to the previous steps:
I have a semi-large (1 GB) DB file to import, which I divided into multiple smaller files to get it imported, one of them being the structure dump and the rest the data. While still trying to get the large file through, adjusting timeout settings via .htaccess or the phpMyAdmin import options did not help; I always got the timeout anyway. Since those did not work, I used a program by Janos Rusiczki (https://rusiczki.net/2007/01/24/sql-dump-file-splitter/) to split the SQL file into smaller ones (good program, thanks Janos!). It also separated the structure from the data.
However, after 8 successful imports I got a timeout again, after phpMyAdmin had already imported part of the file. Thus I ended up in the current situation. I know I can always delete everything and start over with even smaller partial files, but I am sure there is a better way to do this. There has to be a way to replace the rows on import, or some other approach as described above.
Thanks for any help! :)
Cribbing from INSERT ... ON DUPLICATE KEY (do nothing), you can use a regular expression to turn every INSERT in your SQL file into an INSERT IGNORE; it will then pass over all the entries that have already been imported.
Note that this will also suppress other errors, but errors other than timeouts don't seem likely in this context.
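For illustration, with a hypothetical table and values, a line in the dump would be rewritten like this:

-- before the search-and-replace: INSERT INTO clients (id, name) VALUES (1, 'Alice');
INSERT IGNORE INTO clients (id, name) VALUES (1, 'Alice');   -- duplicate-key rows are skipped instead of aborting the import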

Erasing records from a text file after importing them into a MySQL database

I know how to import a text file into a MySQL database by using the command
LOAD DATA LOCAL INFILE '/home/admin/Desktop/data.txt' INTO TABLE data
The above command will write the records of the file "data.txt" into the MySQL database table. My question is that I want to erase the records from the .txt file once they are stored in the database.
For example: if there are 10 records and at the current point in time 4 of them have been written into the database table, I want those 4 records to be erased from data.txt at the same time. (In a way, the text file acts as a "queue".) How can I accomplish this? Can Java code be written for this, or should a scripting language be used?
Automating this is not too difficult, but it is also not trivial. You'll need something (a program, a script, ...) that can
Read the records from the original file,
Check if they were inserted and, if they were not, copy them to another file,
Rename or delete the original file, and rename the new file to replace the original one.
There might be better ways of achieving what you want to do, but that's not something I can comment on without knowing your goal.

java and mysql load data infile misunderstanding

Thanks for viewing this. I need a little bit of help with this project that I am working on with MySQL.
For part of the project I need to load a few things into a MySQL database which I have up and running.
The info that I need, for each column in the table Documentation, is stored in text files on my hard drive.
For example, one column in the Documentation table is "ports", so I have a ports.txt file on my computer with a bunch of port numbers, and so on.
I tried to run this MySQL statement through phpMyAdmin:
LOAD DATA INFILE 'C:\\ports.txt' INTO TABLE `Documentation` (`ports`)
It ran successfully, so I went on to the other LOAD DATA I needed:
LOAD DATA INFILE 'C:\\vlan.txt' INTO TABLE `Documentation` (`vlans`)
This also completed successfully, but it added all the rows for the vlan column AFTER the last entry in the ports column.
Why did this happen? Is there anything I can do to fix this? Thanks
Why did this happen?
LOAD DATA inserts new rows into the specified table; it doesn't update existing rows.
Is there anything I can do to fix this?
It's important to understand that MySQL doesn't guarantee that tables will be kept in any particular order. So, after your first LOAD, the order in which the data were inserted may be lost & forgotten - therefore, one would typically relate such data prior to importing it (e.g. as columns of the same record within a single CSV file).
You could LOAD your data into temporary tables that each have an AUTO_INCREMENT column and hope that such auto-incremented identifiers remain aligned between the two tables (MySQL makes absolutely no guarantee of this, but in your case you should find that each record is numbered sequentially from 1); once there, you could perform a query along the following lines:
INSERT INTO Documentation (ports, vlans) SELECT port, vlan FROM t_Ports JOIN t_Vlan USING (id);
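For context, a minimal sketch of the whole sequence (the temporary table definitions, column names, and types are assumptions):

CREATE TEMPORARY TABLE t_Ports (id INT AUTO_INCREMENT PRIMARY KEY, port VARCHAR(64));
CREATE TEMPORARY TABLE t_Vlan (id INT AUTO_INCREMENT PRIMARY KEY, vlan VARCHAR(64));
LOAD DATA INFILE 'C:\\ports.txt' INTO TABLE t_Ports (port);   -- ids are assigned 1, 2, 3, ... in file order
LOAD DATA INFILE 'C:\\vlan.txt' INTO TABLE t_Vlan (vlan);
-- then run the INSERT ... SELECT above to pair the rows by their position in the files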

Can I import tab-separated files into MySQL without creating database tables first?

As the title says: I've got a bunch of tab-separated text files containing data.
I know that if I use 'CREATE TABLE' statements to set up all the tables manually, I can then import them into the waiting tables, using 'load data' or 'mysqlimport'.
But is there any way in MySQL to create tables automatically based on the tab files? Seems like there ought to be. (I know that MySQL might have to guess the data type of each column, but you could specify that in the first row of the tab files.)
No, there isn't. You need to CREATE a TABLE first in any case.
Automatically creating tables and guessing field types is not part of the DBMS's job. That is a task best left to an external tool or application (That then creates the necessary CREATE statements).
If you're willing to type the data types in the first row, why not type a proper CREATE TABLE statement?
Then you can export the Excel data as a .txt file and use
LOAD DATA INFILE 'path/file.txt' INTO TABLE your_table;
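For example, a minimal sketch for a tab-separated file whose first line holds the column headers (the column names and types below are assumptions):

CREATE TABLE your_table (
  id   INT,
  name VARCHAR(255),
  qty  INT
);
LOAD DATA INFILE 'path/file.txt' INTO TABLE your_table
  FIELDS TERMINATED BY '\t'
  LINES TERMINATED BY '\n'
  IGNORE 1 LINES;   -- skip the header row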