My Doctrine 1.2 is integrated into CodeIgniter as a hook, and I know that my character set is utf8 with collation utf8_unicode_ci.
I have two YAML files: one for creating the DB and its tables, and one to load some test data. My data can contain French accents (çéïë...). In my schema.yml I have correctly specified the collation and character set:
options:
  type: INNODB
  charset: utf8
  collate: utf8_unicode_ci
I double-checked the settings in phpMyAdmin; everything is correct.
When I run my Doctrine script from the command line to load my fixtures into one of my tables, all the French accents are replaced by junk!
Am I missing a setting or configuration, or is there a bug in Doctrine?
You should have the Doctrine connection in your /config/database.php:
// Load the Doctrine connection
$doctrine = Doctrine_Manager::connection($db['default']['dsn'], $db['default']['database']);
To fix the problem with the encoding, you have to add this line right after it:
$doctrine->exec('set names utf8');
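Putting the two together, the hook ends up as the sketch below. This is a minimal sketch: the setCharset() variant is an assumption about your Doctrine 1.2 build (on MySQL it issues the same SET NAMES statement), so either line on its own should do.

// file: /config/database.php — Doctrine 1.2 bootstrap forcing UTF-8
$doctrine = Doctrine_Manager::connection($db['default']['dsn'], $db['default']['database']);

// Force the connection character set so fixture loading speaks UTF-8:
$doctrine->exec('set names utf8');
// or, equivalently (assumption: setCharset() exists on your connection class):
// $doctrine->setCharset('utf8');

Also make sure the command-line script that loads the fixtures runs through this same bootstrap; if it opens its own connection, the SET NAMES never happens and the accents are mangled on the way in.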
I'm using Symfony 4 with a MySQL database (5.5.57-0ubuntu0.14.04.1) that doesn't support the JSON type.
When I map a field to "json" using a Doctrine yml config file, the migration file generated by doctrine:migrations:diff uses the JSON MySQL type instead of LONGTEXT, and it ends up with an error if I try to run the migration.
And every time I make a migration diff, I have to manually change JSON to LONGTEXT for the latest changes and remove the JSON changes regenerated from older ones.
It's easy to screw up if in one of these migrations you forget to remove the "ALTER TABLE CHANGE somefield JSON".
My solution, at the moment, is to add the following to the doctrine.yml config file:
doctrine:
    dbal:
        mapping_types:
            longtext: json
It makes some weird, but easy to track and remove, changes when I run doctrine:migrations:diff -v, like changing to LONGTEXT a couple of fields that were already LONGTEXT. I don't know why, and I'll have to fix this eventually.
But at least I could stop Doctrine from trying to change my LONGTEXT columns, which are mapped as the Doctrine json type, to the MySQL JSON data type.
Has anyone come across this problem?
Thanks!
I was wrong to use the "mapping_types" configuration that way; it makes no sense. The solution is even easier, and I didn't realize the following parameter exists:
server_version
Just add the server version to the dbal configuration like this:
doctrine:
    dbal:
        driver: 'pdo_mysql'
        server_version: '5.5.57'
        charset: utf8mb4
        default_table_options:
            charset: utf8mb4
            collate: utf8mb4_unicode_ci
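Declaring the version matters because Doctrine DBAL picks its SQL platform from it: MySQL only gained a native JSON type in the 5.7 series, so with server_version: '5.5.57' doctrine:migrations:diff falls back to LONGTEXT with a (DC2Type:json) column comment instead of emitting JSON. If the connection is configured through DATABASE_URL instead, the same hint can ride on the URL as a query parameter (credentials and database name below are placeholders):

# .env
DATABASE_URL="mysql://db_user:db_password@127.0.0.1:3306/db_name?serverVersion=5.5.57"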
I got the same problem, and the solution for me was to go into the generated migration (e.g. Version20190306110143.php) and change the data type JSON into LONGTEXT.
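For example, a generated statement like the commented line below can be rewritten by hand into the LONGTEXT form. This is just an illustrative excerpt (table and column names are placeholders, not from the original post); the (DC2Type:json) comment is what lets Doctrine map the column back to its json type:

// Excerpt from the generated migration class (e.g. Version20190306110143.php):
public function up(Schema $schema) : void
{
    // generated by doctrine:migrations:diff — fails on MySQL 5.5:
    // $this->addSql('ALTER TABLE category CHANGE settings settings JSON NOT NULL');

    // hand-edited replacement that MySQL 5.5 accepts:
    $this->addSql("ALTER TABLE category CHANGE settings settings LONGTEXT NOT NULL COMMENT '(DC2Type:json)'");
}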
Actually, this worked for me, but frankly I don't know whether it will cause problems over time. If it does, I'll post them here.
thanks
I have a Rails 3.2 project using MySQL 5.5.34 with utf8 encoding. I have now found that with utf8 encoding MySQL cannot save the Unicode characters that represent emoji.
So is it OK for me to convert the whole database to the utf8mb4 encoding, which I found on the web can hold 4-byte Unicode, including emoji?
Is all the information I currently have in the database covered by the utf8mb4 encoding? Will I face data loss if I do that?
Does Rails provide any way to do that?
Thanks a lot for helping.
Actually, you just need to migrate the column you want to encode to utf8mb4.
execute("ALTER TABLE yourtablename MODIFY yourcolumnname TEXT CHARACTER SET utf8mb4 COLLATE utf8mb4_bin;")
If you plan to migrate the data itself, be aware that some of it may already be broken: the common utf8 stores at most 3 bytes per character while utf8mb4 stores up to 4, so any emoji inserted while the columns were still utf8 were likely mangled on the way in, and the conversion cannot recover them.
Furthermore, Rails 3.2 has an encoding issue within ActiveSupport's JSON encoding. If you plan to work with JSON and emoji, you will need to add a patch like the following (based on the Rails 4 solution at https://github.com/rails/rails/blob/4-0-stable/activesupport/lib/active_support/json/encoding.rb), or simply upgrade to Rails 4:
module ActiveSupport
  module JSON
    module Encoding
      class << self
        def escape(string)
          if string.respond_to?(:force_encoding)
            string = string.encode(::Encoding::UTF_8, :undef => :replace).force_encoding(::Encoding::BINARY)
          end
          json = string.gsub(escape_regex) { |s| ESCAPED_CHARS[s] }
          json = %("#{json}")
          json.force_encoding(::Encoding::UTF_8) if json.respond_to?(:force_encoding)
          json
        end
      end
    end
  end
end
I'm trying to seed categories in my database. I set the title as unique. However, in French it's spelled catégorie, with an accented é.
I try to seed Catégorie 1, Catégorie 2, Catégorie 3. I get an error when I run php artisan db:seed because it reads them as Cat?gorie 1, Cat?gorie 2, Cat?gorie 3, and Cat is repeated (therefore not unique).
I've set my DB to utf8_general_ci and made the change in config/database.php.
What did I miss here?
Although you do not specify it, I'm assuming you are using MySQL and that "Categorie 1" is the content of some field in one of the seeded tables. Things to remember if you want to use Laravel with utf8 MySQL data:
Laravel 4 Config
//file: app/config/database.php
'charset' => 'utf8',
'collation' => 'utf8_unicode_ci',
Seeder files encoding
Obviously, all your files should be saved with UTF-8 encoding.
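For instance, a minimal Laravel 4 seeder saved as UTF-8 might look like this (class, table, and column names are placeholders, not taken from your post):

// file: app/database/seeds/CategoriesTableSeeder.php
class CategoriesTableSeeder extends Seeder {

    public function run()
    {
        // The accented literals only survive if this file is saved as UTF-8:
        DB::table('categories')->insert(array(
            array('title' => 'Catégorie 1'),
            array('title' => 'Catégorie 2'),
            array('title' => 'Catégorie 3'),
        ));
    }

}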
Data collation
the database collation is set to the utf8 you set in the above-mentioned config file
the tables' collation is the same utf8
the fields are also utf8 where needed
Database connection
the connection itself should be utf8.
Regarding the last remark: of course, Laravel already deals with that while establishing the MySQL connection, by executing "set names '$charset' collate '$collation'". If for some reason you are still facing problems, try enforcing the utf8 charset and collation in the MySQL server config; that should ultimately solve your issues:
//file /etc/mysql/my.cnf
[mysqld]
...
collation-server = utf8_unicode_ci
init-connect='SET NAMES utf8'
character-set-server = utf8
...
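To verify what the connection actually negotiated, you can dump the character-set variables from within Laravel (a diagnostic sketch using the DB facade; run it from a route, a command, or artisan tinker):

// Diagnostic sketch: show the character sets the connection negotiated
$vars = DB::select("SHOW VARIABLES LIKE 'character_set%'");
foreach ($vars as $var) {
    echo $var->Variable_name . ' = ' . $var->Value . PHP_EOL;
}
// character_set_client, character_set_connection and character_set_results
// should all read utf8 if SET NAMES ran as expected.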
Hints & Analysis
"Cat?gorie 1" or "Catégory 1" are processed as normal strings being inserted in a DB Table row collumn.
But Some of the modern query strings parameters are represented by a like
query("SELECT something FROM somewhere WHERE condition IN (?)", array('this', 'that'))
Here your query compiler may get lost if stuff is not well formatted.
Another point: if you are using the Artisan command line in DOS (on Windows, for example), output messages, whether they report success or errors, will not display correctly in your console, because the DOS console does not support UTF-8.
Solution
1. The seeder errors may be right; check your database directly to see whether the records really exist.
2. If the records don't exist, please attach your migration & seed classes.
I've set up a MySQL DB with utf8_unicode_ci collation, and all of its tables and columns have the same collation.
My Doctrine config has SET NAMES utf8 as a connection option, and my HTML files use the utf8 charset.
The text saved on those tables contain accented characters (á,è,etc).
The problem is that when I save content to the DB, it is saved with strange characters, as when ISO text is saved into a UTF-8 table (e.g.: NotÃcias).
The only workaround I've found is to utf8_decode before saving and utf8_encode before printing.
That means that, for some reason, something in between is mixing up UTF-8 and ISO.
What might it be?
Thanks.
EDIT:
I've set it up to encode before saving and decode before printing, and it prints correctly, but in the DB my chars change to:
XPTÓ -> XPTÃ“
This makes searching the DB for "XPTÓ" impossible...
I would print bin2hex($string); at each step of the original workflow (i.e. without the encode/decode steps); see the sketch after the list below.
Go through each of:
the raw $_POST data
the values you get after form validation
the values that get put in your bound Entity
the values you'd get from the db if you query it directly using PDO (get this from $em->getConnection())
the values that get populated into your Entity on reload (can do this via $em->detach($entity); $entity = $em->find('Entity', $id);)
You're looking for the point at which the output changes; focus your search there.
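A minimal sketch of such a probe (entity and column names are placeholders, and fetchColumn() assumes Doctrine DBAL 2.x):

// Probe the raw bytes at both ends of the chain:
echo bin2hex($_POST['title']), PHP_EOL;   // bytes as they arrive from the form

$conn  = $em->getConnection();
$value = $conn->fetchColumn('SELECT title FROM article WHERE id = ?', array($id));
echo bin2hex($value), PHP_EOL;            // bytes as MySQL returns them

// "XPTÓ" stored correctly as UTF-8 dumps as 585054c393; if the second dump
// shows extra bytes where c393 should be, the value was re-encoded on the way in.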
I would also double-check:
On the db: SHOW CREATE TABLE `table` shows CHARSET=utf8 for the whole table (and nothing different for the individual columns)
That the tool you use to view your database values (Navicat, phpMyAdmin) has the correct encoding set.
I am developing an app that uses a MySQL database. The database contains certain characters which cannot be encoded for the client side, and those values come through as null.
That is, a string containing a special character is represented as null on the client side.
I found that the default charset for the db was latin1; I changed it to utf8, including all tables and the individual columns of those tables. I have also set the charset to utf8 in my PDO constructor:
$db = new PDO("mysql:dbname=$dbname;host=$dbhost;charset=utf8", $dbuser, $dbpass);
I also configured the response headers to use the UTF-8 charset. But the characters are still not encoded; I am still getting a null string wherever a special character is present.
I tried changing the my.ini configuration by setting the default charset there, but that gives me an error in my connection file at the PDO constructor.
It's urgent for me to fix this! Can someone help?