UTF-8 characters saved as ? on Linux MySQL - mysql

I have a Spring app with Tomcat and MySQL that runs with no problems on Windows, but when it runs on a Linux server, special UTF-8 characters (like the Spanish Ñ) are saved in the MySQL table as the ? symbol.
If I change the character directly in the database, the app displays it correctly. It's only when I save the object through the app that the Ñ gets replaced by the ?.
Other apps are running on the same server and they don't have this issue.
The database has the UTF-8 charset/collation.
Can anybody help?
Thanks a million!

Make sure your DataSource URL includes the UTF-8 character encoding!
Example:
jdbc:mysql://localhost:3306/databaseName?useUnicode=yes&characterEncoding=UTF-8

You have to set the right encoding in your properties file or in your DataSource URL (characterEncoding=UTF-8).
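For a Spring setup, a minimal sketch of a DataSource bean with the encoding parameters on the JDBC URL might look like the following; the driver class, bean layout, database name, and credentials are placeholders for illustration, not details taken from the question above.

import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

@Configuration
public class DataSourceConfig {

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource ds = new DriverManagerDataSource();
        ds.setDriverClassName("com.mysql.jdbc.Driver"); // placeholder driver class
        // useUnicode/characterEncoding make the driver talk UTF-8 to the server
        // instead of falling back to the JVM/OS default encoding.
        ds.setUrl("jdbc:mysql://localhost:3306/databaseName?useUnicode=yes&characterEncoding=UTF-8");
        ds.setUsername("dbUser");     // placeholder
        ds.setPassword("dbPassword"); // placeholder
        return ds;
    }
}

If the URL comes from a properties file instead, the same ?useUnicode=yes&characterEncoding=UTF-8 suffix goes on the URL value there.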

Related

How to set UTF8 encoding for ClearDB (MySQL) on Heroku

The database I have on Heroku doesn't support special characters, so I want to set UTF-8 encoding. When I was working on the local version I simply changed the config files, but I wonder how I can achieve this with Heroku's DB.
Adding this to the connection URL doesn't help:
?useUnicode=true&characterEncoding=UTF-8
I resolved this problem by adding ?useUnicode=true&characterEncoding=UTF-8 (the part after the question mark) to the database URL in "Config Vars", under the "Settings" section on heroku.com. Screenshot: http://images.vfl.ru/ii/1588540106/e4c9cfa2/30400062.jpg
Switching from the ClearDB Heroku add-on to JawsDB MySQL solved my problem.
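If you would rather build the JDBC URL in application code than edit the dashboard value, a rough Java sketch could look like the one below. The config var name (CLEARDB_DATABASE_URL here; JAWSDB_URL for JawsDB) and the mysql://user:pass@host/dbname URL shape are assumptions about how these add-ons typically expose credentials, so adjust them to what your app actually receives.

import java.net.URI;
import java.sql.Connection;
import java.sql.DriverManager;

public class HerokuMySqlConnect {
    public static void main(String[] args) throws Exception {
        // Heroku exposes the add-on's database URL as a config var,
        // e.g. mysql://user:pass@host/dbname?reconnect=true (assumed format).
        URI dbUri = new URI(System.getenv("CLEARDB_DATABASE_URL"));
        String[] userInfo = dbUri.getUserInfo().split(":");

        // Preserve any existing query parameters and append the UTF-8 ones.
        String query = dbUri.getQuery() == null ? "" : dbUri.getQuery() + "&";
        String jdbcUrl = "jdbc:mysql://" + dbUri.getHost() + dbUri.getPath()
                + "?" + query + "useUnicode=true&characterEncoding=UTF-8";

        try (Connection con = DriverManager.getConnection(jdbcUrl, userInfo[0], userInfo[1])) {
            System.out.println("Connected with: " + jdbcUrl);
        }
    }
}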

iconv() illegal character issue on Live server in Laravel 4 / dom2pdf

I have a Laravel 4.2 application which renders text to a PDF using dom2pdf (0.4). It works fine on the dev server but not on live, where it throws an 'iconv(): Detected an illegal character in input string' error, which I believe is due to a long hyphen ( – ) in the text. Each system is using exactly the same data.
Is there a setting on the live server to work around this? All text is stored as utf8_unicode_ci in the MySQL database, and the HTML for the PDF has a UTF-8 charset directive in the header. Both servers run Apache/CentOS/cPanel; the dev server runs PHP 5.5.38 while live is on 5.5.34.
Found the issue: the mbstring PHP extension was not installed on the live server. When it's absent, Laravel/dom2pdf falls back to its own mbstring implementation, which was failing in my case.
Now that mbstring is installed, it works fine. Thanks to those who looked.

Azure website IIS not encoding utf8

My website is on Azure and the database is MySQL, ASP.NET 4.0.
When I run on localhost (connected to the production DB), I can read and write to the DB in UTF-8.
When I run on Azure (i.e. production) I can only read the DB in UTF-8; when I try to write, it inserts '???? ???? ????'.
So if it's the same database and the same code, the difference must be IIS, no?
Can anyone tell me how to configure it to work?
(BTW, the MySQL connection string has 'charset=utf8'.)
Update:
The web.config file has:
<globalization requestEncoding="utf-8" responseEncoding="utf-8" />
Aha!
It was Azure!
I needed to go to the Azure Portal and change the connection string there, because Azure ignores my web.config connection string and uses the portal's one instead.
By the way, adding charset=utf8 there did the trick.
I hope someone finds this useful.

UTF-8 Character encoding issue with DataNucleus JDO application (exposing MySQL database)

I have a DataNucleus JDO web application that exposes a MySQL database table as a web service. One of the table rows contains text in Russian. When the web service returns that row, the text displays as ?'s.
Here is what I've tried so far:
I have confirmed that the data in the MySQL database table is in fact encoded correctly. When I view the data in MySQL Workbench, it displays correctly.
I tried writing a small MySQL client in Java that simply connects to the database and prints out the row. At first, I was receiving similar output (all ?'s). I then tried running the client with a JVM argument, -Dfile.encoding=UTF-8, and it worked.
I am running the web services in Tomcat. I tried adding -Dfile.encoding=UTF-8 to JAVA_OPTS in catalina.bat, and there was no change.
I have my JDBC connection URL specified as a property in my pom.xml file, and it is referenced in the jdoconfig.xml. I tried appending ?useEncoding=true&characterEncoding=UTF-8 to the connection URL, and I keep receiving the following NullPointerException when I try to reach the web service:
java.lang.NullPointerException
org.datanucleus.api.rest.RestServlet.doGet(RestServlet.java:271)
javax.servlet.http.HttpServlet.service(HttpServlet.java:621)
javax.servlet.http.HttpServlet.service(HttpServlet.java:728)
I'm all out of ideas now. I've been careful to make sure that I am viewing any output in a medium that supports UTF-8 encoding. I don't have a lot of experience with DataNucleus and the JDO API, so I'm wondering if I am missing something related to that.
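As a sanity check outside Tomcat and DataNucleus, a bare-bones JDBC client along the lines of the standalone test described above might look like the sketch below. Host, schema, table, column, and credentials are placeholders, and it assumes Connector/J on the classpath; note that the standard Connector/J switch is useUnicode rather than useEncoding.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class EncodingCheck {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://localhost:3306/mydb"
                + "?useUnicode=true&characterEncoding=UTF-8";
        try (Connection con = DriverManager.getConnection(url, "user", "password");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT text_column FROM my_table LIMIT 5")) {
            while (rs.next()) {
                // Write the bytes explicitly as UTF-8 so the check does not
                // depend on the JVM's -Dfile.encoding setting.
                System.out.write(rs.getString(1).getBytes("UTF-8"));
                System.out.println();
            }
        }
    }
}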

MySQL console (Windows -> Linux), wrong character set?

When I make a query from the mysql console and it contains accents or any character that needs to be UTF-8 encoded, it gets mangled:
INSERT INTO users (userName) VALUES ("José Alarcón");
SELECT userName FROM users;
José Alarcón
SET NAMES utf8 changes nothing; --default-character-set=utf8 as a parameter changes nothing.
Keep in mind that this is ONLY from the console. If I use phpMyAdmin or run any query from a program, there is no problem at all, but a row inserted from the console gets mangled.
I'm using PuTTY on Windows as the client.
~$ locale
LANG=en_US.UTF-8
LC_CTYPE="en_US.UTF-8"
LC_NUMERIC="en_US.UTF-8"
LC_TIME="en_US.UTF-8"
LC_COLLATE="en_US.UTF-8"
LC_MONETARY="en_US.UTF-8"
LC_MESSAGES="en_US.UTF-8"
LC_PAPER="en_US.UTF-8"
LC_NAME="en_US.UTF-8"
LC_ADDRESS="en_US.UTF-8"
LC_TELEPHONE="en_US.UTF-8"
LC_MEASUREMENT="en_US.UTF-8"
LC_IDENTIFICATION="en_US.UTF-8"
LC_ALL=
Clarification:
My local computer is Windows XP, and I'm using PuTTY 0.60 as the terminal client. The target system where MySQL is running is Debian Linux.
I can't find any configuration in PuTTY for character encoding...
Update: Stupid PuTTY, hiding the encoding configuration inside a menu called "Translation". WTF?
Set PuTTY to interpret received data as UTF-8 in Window -> Translation ("Character set on received data").
Windows can't handle UTF-8 in the console and system messages (which PuTTY uses); it wants to use your locale codepage. This is a common and well-known problem, and it isn't solvable without rewriting cmd.exe or using a different command-line tool.
Microsoft has never really bothered about encodings outside its own world, which results in weird Windows-specific codesets.
Maybe you can change the PuTTY encoding somewhere in its options so that it can at least communicate correctly with the mysql CLI?
Your terminal client must be configured to use UTF-8. Your shell environment on the server must also be configured for UTF-8. You can check it with the following command:
locale
How the system prefers you to fix the locale information (if needed) depends on the distribution (I'm assuming you are using Linux). For instance, Debian (and, I guess, Ubuntu) asks you to use the following command to reconfigure the locale settings:
dpkg-reconfigure locales
Note: I'm not sure whether they've changed this; I haven't tested it in a while. :-)
You can of course set the locales in the shell each time you log in, or in your profile, but I recommend using the distribution's method (if you need to do it at all :-)).