Duplicate entry 'groser' for key 'PRIMARY' - mysql

I get the following error after attempting to import a Drupal site database dump on my local server.
The import stops at the search_total table. I looked in search_total for the word "groser"; it is there, but only once, not duplicated.
On the other hand, I also found the word "großer", which leads me to believe that MySQL treats the "ß" as an "s", so the two words collide on the primary key.
The table is search_total, MyISAM, collation utf8_general_ci.
Any ideas on how to deal with this issue?
Thanks!

That is a known issue when exporting from an older MySQL version (e.g. 5.0) into a newer one (e.g. 5.1): collation changes can make previously distinct values compare as equal. If you don't really care about having the exact same data in your search table (which should be irrelevant if you're setting up a local development environment), you can ignore these duplicates with the mysql client's -f (force) flag.
See http://linuxadminzone.com/ignore-mysql-error-while-executing-bulk-statements/
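A minimal sketch of what that looks like on the command line, with placeholder user, database, and file names (adjust to your setup):
mysql -f -u drupal_user -p drupal_db < dump.sql
The -f / --force option makes the mysql client report each error and keep going instead of aborting, so the rows that raise the duplicate-key error are simply skipped.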

Related

Postgresql Encoding Issue Using tsearch With Thai Text After Conversion from MariaDB using Pgloader

I am trying to convert a MySQL utf8mb4 database which contains both Thai and English to PostgreSQL. This appears to go well until I try to add tsearch. Let me outline the steps taken.
- Install this Thai parser: https://github.com/zdk/pg-search-thai
- Restore a copy of production locally from a dump file into MariaDB.
- Fix some enum values that trip up PostgreSQL because they are missing; MariaDB is happy with them :(
- Convert some polygons to text format, as pgloader does not deal with them gracefully.
- Run pgloader against a fresh PostgreSQL database, testdb:
pgloader mysql://$MYSQL_USER:$MYSQL_PASSWORD@localhost/$MYSQL_DB postgresql://$PG_USER:$PG_PASSWORD@localhost/testdb
This appears to work: the site (a Laravel app) functions, although with some bugs to fix due to differences between MariaDB and PostgreSQL constraint behavior. However, when I try to create text vectors for tsearch, I run into encoding issues. This is where I need advice.
-- trying to create minimal case, dumping Thai names into a temporary table
CREATE EXTENSION thai_parser;
CREATE TEXT SEARCH CONFIGURATION thai_unstemmed (PARSER = thai_parser);
ALTER TEXT SEARCH CONFIGURATION thai_unstemmed ADD MAPPING FOR a WITH simple;
-- to test the parser is working, which it is
SELECT to_tsvector('thai_unstemmed', 'ข้าวเหนียวส้มตำไก่ย่าง ต้มยำกุ้ง in thailand');
-- to recreate my error I did this
CREATE TABLE vendor_names AS SELECT id,name from vendors_i18n;
ALTER TABLE vendor_names ADD COLUMN tsv_name_th tsvector;
-- this fails
UPDATE vendor_names SET tsv_name_th=to_tsvector('thai_unstemmed', coalesce(name, ''));
The error I get is ERROR: invalid byte sequence for encoding "UTF8": 0x80
If I dump that table and restore into a new Postgresql database I do not get the encoding error.
Questions:
What is the correct encoding to use for UTF8mb4 to Postgresql for pgloader?
Is there any way, other than the above, to check whether the data is valid UTF-8?
Is the problem in the Thai parser tool?
Any suggestions as to how to solve this would be appreciated.
Cheers,
Gordon
PS I'm an experienced developer but not an experienced DBA.
Have you tried importing the dataset manually, row by row, to see which rows import successfully and which ones fail? If some imports succeed but others fail, it would seem to be a data integrity problem.
If none of the records import successfully, it's obviously an encoding problem.
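Building on that, one way to narrow it down without re-importing anything is to run the failing UPDATE from the question in id ranges; a minimal sketch (the range boundaries are arbitrary):
-- run the failing UPDATE in slices; the slice that raises the 0x80 error contains the problem row(s)
UPDATE vendor_names
SET tsv_name_th = to_tsvector('thai_unstemmed', coalesce(name, ''))
WHERE id BETWEEN 1 AND 1000;
-- then halve the failing range until the culprits are isolated, and inspect them
SELECT id, name FROM vendor_names WHERE id BETWEEN 1 AND 1000;
A few rounds of this binary search usually pin down the handful of rows whose text the parser cannot handle.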

Importing SQL database to live server, tables missing

Okay, so first of all, I am SO sorry if this is an ignorant and stupid question. I have absolutely no knowledge of databases. I have only used them when creating and uploading Wordpress sites, and it works if everything goes without any errors.
So here is my problem:
I've created a Wordpress website on a local server. I've done the usual: exported the database and tried to import it on the live server, but there seems to be an error.
I get the #1064 error.
1064 - You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near '?' at line 59
Half of the tables do not import on the live server. Here's what I've already tried:
- Exporting the tables with the "Enclose export in a transaction" and "Disable foreign key checks" options checked.
- Exporting in two files, with and without the mentioned options checked. This way I got more tables, and the wp_options table got full instead of empty, but still only 15 tables instead of 23.
- I checked to see if the "TYPE" syntax is "ENGINE" and it seems fine to my unknowing eyes.
I am totally ignorant where databases are concerned. I don't know what to check anymore. My guess is that the live server uses MariaDB and it is somehow not compatible with the SQL I'm trying to upload.
I tried to look at line 59, but there is no "?" there, at least not where I'm looking. It might be that I'm looking in the wrong place, the blond that I am.
Here's the code around line 59 when the exported file is opened in an editor.
--
-- Table structure for table `wp_gg_folders`
--
CREATE TABLE `wp_gg_folders` (
`id` int(11) NOT NULL,
`title` varchar(255) NOT NULL,
`date` datetime NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
I looked at the MariaDB documentation, and the suggested way to deal with any incompatibility would be to update my MySQL. I have no idea how to do that.
Oh and btw. I already have a working website on this server but with an older version of Wordpress, so if it's an old SQL version, why is it working there?
This turned out to be long... sorry. And thanks in advance!
Edit: I discovered there is a problem with the export itself. In the exported file, the last third of the tables is always missing. I have no idea why. Can I somehow get the tables/whole database manually, and not through phpMyAdmin?
Okay guys and girls, I've found the solution.
The problem was not any single error; it was in exporting the database at the very beginning, which is why importing showed different kinds of errors each time I tried it.
The export query would abort partway through, so only part of the tables were exported. By increasing the maximal length of the created query in phpMyAdmin's export settings, I solved the problem.
Instead of 50000, I entered 1047551.
Here's the article that helped me, with screenshots:
https://wpengine.com/support/exporting-database/
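If phpMyAdmin keeps truncating the export, a command-line dump sidesteps that limit entirely. A rough sketch with placeholder credentials and database names, assuming you have shell access to both servers:
mysqldump -u db_user -p local_wordpress > wordpress.sql
mysql -u db_user -p live_wordpress < wordpress.sql
mysqldump writes every table to the file in one pass, so there is no "maximal query length" setting to trip over.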

#1062 - Duplicate entry '_site_transient_timeout_theme_roots' for key 'option_name'

I'm migrating a site from Site5 hosting to InMotion Hosting and the import of the MySQL file (via phpMyAdmin) is not working.
The import failed, returning this error message:
#1062 - Duplicate entry '_site_transient_timeout_theme_roots' for key 'option_name'
What I've been able to garner from various forums with my pleas for assistance is this:
every table has a key?
so I guess I should hunt for some type of duplicate in the option_name table?
Am lost
Please help
I have seen this type of problem many times.
My employer's company has a DB Hosting client that uses Drupal.
There is a particular table called search_index that holds words.
The character set used for the table differs from the character set used to record the data coming in from a web browser session. When I mysqldumped this database and reloaded it into another MySQL instance, I got error 1062 as well. I was screaming, "How in the world could a mysqldump fail on a reload?"
Since the table's data was being collected on an ongoing basis, the client gave me permission to truncate the table and then mysqldump the database. Needless to say, the reload of the mysqldump was successful.
Given the error message, the only way this could be happening to you is if the table being imported (wp_options, which holds the row '_site_transient_timeout_theme_roots') has a unique index on option_name and that column's character data is incompatible with the table's character set. For example, the table may have been created with UTF-8 and you are loading Latin-1 characters or some freakish Unicode into option_name. You won't have any issue with the data being stored; it's when you mysqldump the table and reload it that the character set weirdness rears its ugly head.
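Following the same truncate-before-dump idea: in WordPress, the transient rows in wp_options are cache entries that are regenerated automatically, so they can be removed from the source database before exporting. A hedged sketch, assuming the standard wp_ table prefix:
DELETE FROM wp_options
WHERE option_name LIKE '\_transient\_%'
   OR option_name LIKE '\_site\_transient\_%';
The backslashes keep the underscores literal in the LIKE patterns. With the transients gone, the dump should no longer contain the row that collides on option_name.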
Make sure you exported only the database you want to import and not more than one database.
I got this error when I accidentally exported the whole DB server rather than the single DB, then tried to import multiple DBs.
If you have caching plugins such as Autoptimize, WP Fastest Cache and the like, make sure you clear all caches BEFORE you export the database.
If anybody sees this post: the solution is to align all charset and collation types between the file (when saving the file, set the charset, or open it with TextPad and change the charset) and the database, then put this line at the beginning of the file:
SET NAMES utf8mb4;
That worked fine for me.
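An equivalent fix on the export side, sketched with placeholder names, is to dump and reload with an explicit character set so the file and the target database agree from the start:
mysqldump --default-character-set=utf8mb4 -u db_user -p source_db > site.sql
mysql --default-character-set=utf8mb4 -u db_user -p target_db < site.sql
Both clients accept --default-character-set, which controls the same connection character set that SET NAMES changes inside the file.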

MySQL 5.1 database export fails while importing into MySQL 5.0

I am having a particularly weird problem putting the data into my production server. My test server runs MySQL 5.1.41. I export the database (tried both via mysqldump and PHPMyAdmin) and then try to import into my production server that runs MySQL 5.0.92.
In one of the tables, I get the error "#1062 - Duplicate entry '1' for key 1". That table has a PRIMARY key and a UNIQUE composite key. When I look at the phpMyAdmin error output, I do not see any duplicates.
I already tried:
- exporting with the option "add AUTO INCREMENT"
- checking whether the collations are the same. They are. Besides, the keys in question are numerics.
So if anybody knows what could be causing the error, and how to fix it - I would appreciate it.
Try removing the index from the table in the dump and then, once all the data has been imported, add it back manually.
If everything works fine in the source database, the only thing that I can think of is that the target database doesn't set the appropriate column to AUTO_INCREMENT.
Have you tried creating that table manually in the target database and setting the column in question to AUTO_INCREMENT before importing the source dump?
The solution was to remove the AUTO_INCREMENT attribute from the columns before importing. I still do not understand why, though. The indexes could actually stay. Afterwards the AUTO_INCREMENT could easily be added back. A mixture of what Martin & Silver Light proposed, but I guess none of the answers were exactly right.
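A minimal sketch of that workaround, with a hypothetical table and column name since the question does not show the schema:
-- before the import: strip AUTO_INCREMENT from the key column (edit the dump's CREATE TABLE, or pre-create the table and run this)
ALTER TABLE my_table MODIFY id INT NOT NULL;
-- ... load the dump ...
-- after the import: put AUTO_INCREMENT back (the column is still covered by the primary key, so this is allowed)
ALTER TABLE my_table MODIFY id INT NOT NULL AUTO_INCREMENT;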

CSV import problem; on different servers import throws error or succeeds

We recently migrated a site from Superb.net to MediaTemple. Part of the upkeep of this site is a 60000+ record export (in 3 CSVs) from Raiser's Edge which I import into mySQL.
The tables retained the same schema before and after the move. This week when I went to do my import, I found that each of the CSVs caused an invalid field count error thousands of rows into the data; in one case the error occurs more than 12000 rows in.
I examined three rows for each error: the row before the reported line number, the row after, and the row itself. They look fine. They have no quoted values, no bad characters, nothing I can see wrong, and the correct number of fields. There are no quoted values in the whole file (verified in the UltraEdit text editor) and no commas in the data.
I tried the import using mysqlimport and then a LOAD DATA INFILE query, found both to be disallowed, and contacted MediaTemple, who said that, sorry, neither of those is available to me. I could upgrade MySQL on our dedicated virtual server, but then any problems are not theirs. MediaTemple says this is a version issue with MySQL.
So on a whim I took the CSV and tried the import on the old server and lo and behold it rolled in fine. I don't know the mySQL version on the old server (Superb) but they run phpMyAdmin 2.11.8.1.
MediaTemple is running mysql Ver 14.12 Distrib 5.0.45, for redhat-linux-gnu (i686) using readline 5.0 and phpMyAdmin 2.8.2.4.
Does this ring any bells? Make any sense to anyone? Any advice?
This may not fall in the category 'answer', but here's my 2 cents.
To be honest, I don't recognize your problem and I can't think of any sensible explanation.
But wouldn't the time it took to troubleshoot and type up this post be better spent writing some sort of shell/perl/php script that does the import for you, along the lines of the sketch below? Of course, it wouldn't be blazing fast, but hey, we're talking 60K records here; that shouldn't take more than a couple of minutes tops.
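A rough sketch of such a script in shell, assuming a three-column CSV and made-up table and column names (the question confirms there are no embedded commas or quotes, so naive splitting is workable):
#!/bin/sh
# turn each CSV row into an INSERT statement, then load them all in one mysql session
while IFS=',' read -r donor_id donor_name amount; do
  echo "INSERT INTO donations (donor_id, donor_name, amount) VALUES ('$donor_id', '$donor_name', '$amount');"
done < export.csv > import.sql
mysql -u "$DB_USER" -p "$DB_NAME" < import.sql
Generating a plain .sql file first keeps it to a single database connection, and the file can be inspected before it is run.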
In the end I felt pretty dim for not having realized I should just install the current version of phpMyAdmin. It was clearly a version bug, as the upgrade solved the import problem with no further contortions.