I would like to update MySQL from v5.1 to v5.6 so that we can use utf8mb4 instead of utf8.
After updating MySQL, I changed the connection URL in my Hibernate configuration file to
jdbc:mysql://xxx.xxx.xxx.xxx:3306/app?characterEncoding=utf8mb4, i.e. utf8 to utf8mb4, and
after that I could no longer connect to the database. If I change it back to utf8, it works.
The error info:
com.mchange.v2.resourcepool.CannotAcquireResourceException: A ResourcePool could not acquire a resource from its primary factory or source
Please help.
Hope this will help:
http://info.michael-simons.eu/2013/01/21/java-mysql-and-multi-byte-utf-8-support/
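In short (a sketch, in case the link ever goes stale; exact behavior depends on your Connector/J version): the driver does not accept utf8mb4 as a characterEncoding value, which is why the connection fails. The usual approach is to make utf8mb4 the server default and drop the characterEncoding parameter so the driver picks the character set up from the server:

# my.cnf (server side) -- assumed settings, restart MySQL afterwards
[mysqld]
character-set-server = utf8mb4
collation-server = utf8mb4_unicode_ci

# Hibernate connection URL -- no characterEncoding parameter
jdbc:mysql://xxx.xxx.xxx.xxx:3306/app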
Best regards,
I'm trying to set up a SonarQube server using the official 7.1 Docker image, connecting to an Oracle 12c 12.2.0.1 database with the AL32UTF8 character set. The thing is, UTF8 is deprecated in Oracle; they use AL32UTF8 instead, which is pretty much the same but with more room to store data.
When trying to start the Sonar server, this error is thrown: "web server startup failed: Oracle NLS_CHARACTERSET does not support UTF8: WE8MSWIN1252". I can't find any docs or workarounds for this problem.
If anyone has experienced this or has any clues, it would be very helpful. I'm stuck on this problem and can't find a solution.
Thanks in advance.
I was finally able to solve this issue, which had been tormenting me for some time. SonarQube performs a verification to ensure the database supports UTF8 encoding:
private void expectUtf8(Connection connection) throws SQLException {
  String charset = this.getSqlExecutor().selectSingleString(connection, "select value from nls_database_parameters where parameter='NLS_CHARACTERSET'");
  if (!StringUtils.containsIgnoreCase(charset, "utf8")) {
    throw MessageException.of(String.format("Oracle NLS_CHARACTERSET does not support UTF8: %s", charset));
  }
}
If you see the error shown in the question, run this query on your database: **select value from nls_database_parameters where parameter='NLS_CHARACTERSET'** to find out which character set your database uses.
The value returned by that query is what the check above inspects. In my specific case the result was WE8MSWIN1252, which is why I was getting that error.
So, setting NLS_CHARACTERSET to AL32UTF8 fixed the problem (the check passes because AL32UTF8 contains the string "UTF8").
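For reference, you can run the same check yourself (for example in SQL*Plus or SQL Developer) before starting SonarQube; the result should contain "UTF8", e.g. AL32UTF8:

-- the same query SonarQube's startup check runs
select value from nls_database_parameters where parameter='NLS_CHARACTERSET';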
Hope this can help someone out there.
When deploying my JHipster-based application to Cloud Foundry (in my case Pivotal, with the ClearDB service), I have no option to change the DB character set or to update the JDBC parameters, as it is a shared DB.
The charset of the DB is latin1 and I need it to be UTF-8 in order to support languages like Arabic and Hebrew.
So the only option I can think of to support those languages is to initialize the DB session/connection when it's created, e.g. by running the SQL below:
SET session character_set_client = charset_name;
SET session character_set_results = charset_name;
SET session character_set_connection = charset_name;
How can this be done in JHipster? I don't see a place where we can set DB connection/session init SQL. Or do you have any other recommendation?
Currently what happens is that Arabic/Hebrew input data coming from the client is saved in the DB as ????.
BTW, if I update the DB entries using MySQL Workbench, the Arabic/Hebrew values are saved correctly and also displayed correctly.
Thanks,
Rabiaa
The data was destroyed during INSERTion.
The bytes to be stored are not encoded as utf8/utf8mb4. Fix this.
The column in the database is CHARACTER SET utf8 (or utf8mb4). Fix this.
Also, check that the connection during reading is UTF-8.
See "question marks" in http://stackoverflow.com/questions/38363566/trouble-with-utf8-characters-what-i-see-is-not-what-i-stored
Is that one question mark per character? Or two (in the case of Arabic / Hebrew)?
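If it helps, here is a minimal way to test the write path in isolation (a sketch only: the URL, credentials and the greetings table are made up; the useUnicode/characterEncoding parameters assume MySQL Connector/J):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class Utf8RoundTripCheck {
  public static void main(String[] args) throws Exception {
    // Ask the driver to send bytes as UTF-8; without this the connection may
    // fall back to latin1 and Arabic/Hebrew arrives at the server as '?'.
    String url = "jdbc:mysql://localhost:3306/app?useUnicode=true&characterEncoding=UTF-8";
    try (Connection con = DriverManager.getConnection(url, "user", "password")) {
      String sample = "مرحبا שלום"; // Arabic + Hebrew test value
      try (PreparedStatement ins = con.prepareStatement("INSERT INTO greetings (text) VALUES (?)")) {
        ins.setString(1, sample);
        ins.executeUpdate();
      }
      try (PreparedStatement sel = con.prepareStatement("SELECT text FROM greetings ORDER BY id DESC LIMIT 1");
           ResultSet rs = sel.executeQuery()) {
        rs.next();
        // Question marks here mean the data was already mangled on INSERT.
        System.out.println(sample.equals(rs.getString(1)) ? "round-trip OK" : "mangled");
      }
    }
  }
}

If the value is mangled even over this connection, the column or server character set is the next thing to check.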
Workaround to solve the issue, by creating a proxy service:
1. Unbind the ClearDB service (but keep it, just unbind it).
2. Create a new user-provided service which will call the ClearDB service with a custom URI.
See the following commands:
cf create-user-provided-service mysql-db -p '{"uri":"mysql://<uri of the clearDB service>?useUnicode=true&characterEncoding=utf8&reconnect=true"}'
cf bind-service <app name> mysql-db
cf restart <app name>
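As an alternative, if you ever control how the connection pool is built yourself (not the case with the auto-configured ClearDB binding above, but e.g. when running outside Cloud Foundry), HikariCP's connectionInitSql can run that setup once per pooled connection; SET NAMES covers character_set_client, character_set_connection and character_set_results in one statement. A sketch with made-up names:

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import javax.sql.DataSource;

public class Utf8DataSourceFactory {
  // Builds a pooled DataSource whose connections start with SET NAMES utf8.
  public static DataSource create(String jdbcUrl, String user, String password) {
    HikariConfig config = new HikariConfig();
    config.setJdbcUrl(jdbcUrl); // e.g. the URI of the bound database service
    config.setUsername(user);
    config.setPassword(password);
    config.setConnectionInitSql("SET NAMES utf8");
    return new HikariDataSource(config);
  }
}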
My website is on Azure, the database is MySQL, and the app is ASP.NET 4.0.
When I run on localhost (connected to the production DB), I can read and write to the DB in UTF-8.
When I run on Azure (i.e. production), I can only read the DB in UTF-8; when trying to write, it inserts '???? ???? ????'.
So, if it's the same database and the same code, the difference must be IIS, no?
Can anyone help me figure out how to make this work?
(BTW, the MySQL connection string has 'charset=utf8'.)
update:
the web.config file has:
<globalization requestEncoding="utf-8" responseEncoding="utf-8" />
Aha!
It was Azure!
I needed to go to the Azure Portal and change the connection string there, as Azure ignores my web.config connection string and uses that one instead.
By the way, adding the charset=utf8 there did the trick.
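In case it helps, the important part is simply having charset=utf8 in whatever connection string Azure actually uses, along these lines (server/user/database names are placeholders):

Server=myserver;Database=mydb;Uid=myuser;Pwd=mypassword;charset=utf8;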
I hope someone will find it useful.
Our MySQL error log contains several thousand occurrences of the following warning: Client failed to provide its character set. 'utf8' will be used as client character set.
We're using MySQL 5.6.14 CE Server on Windows Server 2012. I can't figure out how to get the warning to go away.
I've tried updating the ODBC DSN to use a specific character set, tried setting the charset in the connection string, and tried setting numerous variables in my.ini, and restarted the database. Nothing seems to work.
I updated the connector for our websites to the latest version and this warning has stopped appearing. I'm just going to go out on a limb and guess that the 1.10.1.0 version of MySQL Connector/.Net was not passing the charset variable I had placed in our connection strings.
Still curious why the --log-warnings=0 parameter prevents the MySQL service from starting, and why log-warnings=0 in my.ini did not stop the warning from showing up in the error log, but I guess that is another question.
I'm running Java on Jetty, on an EC2 Linux instance, using a MySQL DB. The column is a VARCHAR, set to accept utf8mb4 encoding.
After playing around with it, I've found that it works when I run this code through gradle jettyRunWar, or even when running the same code on a Tomcat server.
It doesn't work when I place the exact same WAR that was working before in $JETTY_HOME/webapps/root.war and then run Jetty with sudo service jetty start.
The error shown is -
java.sql.BatchUpdateException: Incorrect string value: '\xF0\x9F\x99\x89' for column 'name' at row 1
Current column definition -
`name` varchar(50) CHARACTER SET utf8mb4 DEFAULT NULL
The value is set through preparedStatement.setString(...) and I made sure that the MySQL Connector JAR is the same.
Any ideas?
The problem was resolved once I set all the character_set_... MySQL variables to utf8mb4.
That probably means newer versions of Jetty (8+) treat the JDBC connection differently than Tomcat or older versions, because the exact same connection string, bean definitions & DB were used in all cases.
Until now, the character_set_... params were set to utf8 with specific columns defined as utf8mb4, and that used to be enough.
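For anyone wondering what that looks like in practice, these are roughly the my.cnf settings involved (a sketch; character_set_system itself always stays utf8 and cannot be changed):

[mysqld]
character-set-server = utf8mb4
collation-server = utf8mb4_unicode_ci

[client]
default-character-set = utf8mb4

Then restart MySQL and verify with SHOW VARIABLES LIKE 'character_set%';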
Hope that saves someone else the day I was stuck on it.