In my local environment everything is fine encoding-wise, but when I build a dist, run the app on the server (Ubuntu), and do a POST, the Cyrillic characters of the JSON request body turn into └я▀п╡я└п╟я▀я└' (as it turned out, this part is only a terminal display issue) in my controllers:
def editUser = SecuredAction(WithRole(ADMIN)).async(parse.json) { implicit request =>
log.debug(request.body) // here I have └я▀п╡я└п╟я▀я└' instead of Cyrillic characters
I checked the request headers:
Accept:application/json
Accept-Encoding:gzip, deflate
Accept-Language:ru-RU,ru;q=0.8,en-US;q=0.6,en;q=0.4
Connection:keep-alive
Content-Length:192
Content-Type:application/json; charset=UTF-8
Maybe some of you have encountered this. Thanks!
The problem was in MySQL; here is the answer.
First I added useUnicode=true&characterEncoding=UTF-8 to the JDBC URL:
db.default.url="jdbc:mysql://localhost:3306/mydb?useUnicode=true&characterEncoding=UTF-8"
but it didn't help on its own. So I also added this to my.cnf on the server:
[mysql]
default-character-set=utf8
[mysqld]
character-set-server=utf8
That fixed it!
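The box-drawing garbage in the log is classic mojibake: UTF-8 bytes rendered by a terminal that assumes a single-byte Cyrillic codepage. A minimal Python sketch of the effect (assuming KOI8-R as the terminal codepage; the actual codepage on your server may differ):

```python
# UTF-8 encodes each Cyrillic letter as two bytes; a terminal that
# interprets those bytes as KOI8-R shows two unrelated characters instead.
utf8_bytes = "ф".encode("utf-8")       # b'\xd1\x84'
garbled = utf8_bytes.decode("koi8-r")  # 0xD1 -> 'я', 0x84 -> '└'
print(garbled)                         # я└
```

The data itself is intact; only the display is wrong, which is why the question calls it "only the terminal issues".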
I have such a route:
@app.route('/wikidata/api/v1.0/ask', methods=['POST'])
def get_tasks():
    print(request.data)
    print(request.json)
    return jsonify(1)
I send a request:
curl -i -H "Content-Type:application/json" -X POST -d "{\"название\": \"значение?\",\"param1\": \"Q29424\"}" http://localhost:8529/wikidata/api/v1.0/ask
and get the error:
HTTP/1.0 400 BAD REQUEST
Content-Type: text/html
Content-Length: 223
Server: Werkzeug/0.14.1 Python/3.6.5
Date: Fri, 15 Feb 2019 08:34:27 GMT
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<title>400 Bad Request</title>
<h1>Bad Request</h1>
<p>Failed to decode JSON object: 'utf-8' codec can't decode byte 0xed in position 2: invalid continuation byte</p>
Meanwhile print(request.data) shows that request.data is b'{"\xed\xe0\xe7\xe2\xe0\xed\xe8\xe5": "\xe7\xed\xe0\xf7\xe5\xed\xe8\xe5?","param1": "Q29424"}'
The only thing that helped so far is
decoded_data = request.data.decode('windows-1251')
question = json.loads(decoded_data)
I'm looking for a way to send the request properly (or to configure the server) so that I can use request.json without errors.
Thank you.
That's a Windows-specific issue, likely caused by the default charset of the Windows console prompt. The Cyrillic characters in your command line are encoded with a non-UTF-8 charset before curl sends them.
Since you already use Python, the easiest way IMHO to send your request is the requests module (which you can install with pip install requests). Put this in a Python file saved with UTF-8 as its charset:
import requests
requests.post("http://localhost:8529/wikidata/api/v1.0/ask", json={"название": "значение?","param1": "Q29424"})
Run it and it will have the same effect as your curl command line, only with proper handling of the Cyrillic characters.
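You can confirm this diagnosis from the bytes the Flask server printed: they are exactly the windows-1251 encoding of the Cyrillic key, not UTF-8. A quick check:

```python
# Bytes the server received for "название" (from the printed request.data).
# They match windows-1251, the default ANSI codepage on Russian Windows --
# which is why Flask's UTF-8 JSON decoder rejects them with a 400.
received = b"\xed\xe0\xe7\xe2\xe0\xed\xe8\xe5"
print(received == "название".encode("windows-1251"))  # True
print(received.decode("windows-1251"))                # название
```

This is also why the decode('windows-1251') workaround happens to work: it undoes exactly the encoding the console applied.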
I have a Phoenix app, and on the JavaScript side I use the Filestack client. Filestack requests a JSON file from my server. I put the file in my assets directory and it gets loaded, but the Filestack JavaScript client crashes with an error because it can't read the JSON due to German umlauts (öäü). I looked at the headers and the file is served with Content-Type: application/json. I think what I need is Content-Type: application/json; charset=utf-8. I also use webpack 2, btw.
How do I accomplish this?
Plug.Static uses the mime package to set the content-type header. You can override the value for json as described in the mime package's README. Make sure your app is using mime version 1.1.0 or later, because the built-in MIME types were not overridable due to a bug that was fixed in 1.1.0.
Add this to config/config.exs:
config :mime, :types, %{"application/json; charset=utf-8" => ["json"]}
Then, force recompile mime:
mix deps.clean --build mime
and then start Phoenix:
mix phoenix.server
After this, the content-type of .json files served by Plug.Static should be application/json; charset=utf-8:
$ curl -I localhost:4000/js/foo.json
HTTP/1.1 200 OK
server: Cowboy
date: Sat, 18 Feb 2017 14:36:51 GMT
content-length: 3
cache-control: public
etag: 8EA91E
content-type: application/json; charset=utf-8
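The explicit charset matters because a client that sees a bare application/json header may fall back to a legacy default such as Latin-1 when decoding the body. A small Python illustration of what that does to an umlaut (the Latin-1 fallback here is an assumption for demonstration; Filestack's actual fallback behavior isn't documented in this thread):

```python
import json

# A JSON body containing a German umlaut, encoded as UTF-8 on the server.
payload = json.dumps({"name": "Müller"}, ensure_ascii=False).encode("utf-8")

# Decoded correctly as UTF-8:
print(json.loads(payload.decode("utf-8")))    # {'name': 'Müller'}

# Decoded with a Latin-1 fallback, the umlaut becomes mojibake:
print(json.loads(payload.decode("latin-1")))  # {'name': 'MÃ¼ller'}
```

Declaring charset=utf-8 removes the guesswork on the client side.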
I am trying to activate gzip compression on all JSON output on sails.js.
I added this in config/http.js:
order: [
'startRequestTimer',
'cookieParser',
'session',
'myRequestLogger',
'bodyParser',
'handleBodyParserError',
'compress',
'methodOverride',
'poweredBy',
'$custom',
'router',
'www',
'favicon',
'404',
'500'
],
compress: require('compression')(),
I know the compress: require('compression')() line is picked up, because when I try a wrong value it crashes.
I restarted Sails, but the response headers do not show gzip compression.
The request headers show that I accept gzip compression:
Accept: application/json, text/plain, */*
Accept-Encoding: gzip, deflate
Thank you for your help!
I was struggling with the same thing. Then I went through the Sails source code and found that the compress middleware is only activated when the app runs in the production environment (i.e. NODE_ENV === 'production').
Could it be that you're running this locally? I bet it will work if you set NODE_ENV to production.
This applies at least to the default compress middleware, so maybe try removing the one you added yourself.
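What the compress middleware does is ordinary gzip on the response body, and JSON compresses very well because of its repetitive structure. A standalone sketch of the effect:

```python
import gzip
import json

# A repetitive JSON payload, like a typical API list response.
records = [{"id": i, "status": "active", "country": "FR"} for i in range(200)]
body = json.dumps(records).encode("utf-8")

compressed = gzip.compress(body)
print(len(body), "->", len(compressed))  # the compressed body is much smaller

# The round trip is lossless:
assert gzip.decompress(compressed) == body
```

Enabling it only in production is a common default, since compression costs CPU and mainly pays off over real network links.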
I'm running Apache 2.4.7 on Ubuntu Server 14.04
I have a web server running. It returns 304 Not Modified for images, but it doesn't return the same for JSON files. I have checked the answers and comments on this post and this post; however, they don't work for me.
In my .conf file, when I do not load mod_deflate, the server returns a 304 response for my JSON file. But when mod_deflate gzips this file, the server returns 200 OK.
This is what I add to my apache2.conf file:
<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE text/html text/plain text/css application/json
AddOutputFilterByType DEFLATE application/javascript
</IfModule>
Is there any workaround to enable both mod_deflate and 304 responses for .json files?
Thanks!
Simply set the ETag to NONE in the Apache config.
With mod_deflate on, Apache appends -gzip to the ETag, which the server then doesn't accept in later conditional requests.
Look at the mod_deflate spec:
AddSuffix: Append the compression method onto the end of the ETag, causing compressed and uncompressed representations to have unique ETags. This has been the default since 2.4.0, but it prevents serving "HTTP Not Modified" (304) responses to conditional requests for compressed content.
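At its core the 304 decision is a string comparison between the If-None-Match request header and the resource's current ETag, which is why the -gzip suffix breaks it. A deliberately simplified sketch (real servers also handle weak comparison and multiple ETags, which this omits):

```python
def is_not_modified(if_none_match: str, current_etag: str) -> bool:
    """Simplified conditional-request check: strong comparison only."""
    return if_none_match == current_etag

# The client cached the ETag of the uncompressed JSON file...
cached = '"8EA91E"'

# ...so it matches only while the server keeps serving that exact ETag:
print(is_not_modified(cached, '"8EA91E"'))       # True  -> 304 Not Modified
print(is_not_modified(cached, '"8EA91E-gzip"'))  # False -> full 200 response
```

With mod_deflate's AddSuffix behavior, every conditional request for the compressed file fails this comparison, hence the permanent 200s.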
I set charset utf-8; in the http block, but error.log gives:
2012/08/22 10:47:33 [error] 6588#1560: *1 no "charset_map" between the charsets "GB2312" and "utf-8" while reading response header from upstream, client: 127.0.0.1, server: localhost, request: "GET /index2.php HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000"....
Any ideas? I want to set the default encoding to UTF-8 only.
From the error you gave, it would seem that your FastCGI app is passing GB2312-encoded text to nginx. So either:
make sure it's sending utf-8 text, or
make sure you have an appropriate charset_map set up in nginx (see http://nginx.org/en/docs/http/ngx_http_charset_module.html#charset_map)
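The first option is usually the simpler fix, because GB2312 and UTF-8 encode the same text as entirely different bytes; nginx cannot just relabel the header, it needs either UTF-8 input or a conversion table (the charset_map). A small Python illustration:

```python
text = "中文"                # the same two characters...
gb = text.encode("gb2312")   # ...as GB2312 bytes
utf8 = text.encode("utf-8")  # ...as UTF-8 bytes

# The byte sequences differ, so changing only the charset header would
# leave clients decoding the wrong bytes:
print(gb == utf8)  # False

# A real transcoding step recovers identical text:
print(gb.decode("gb2312") == utf8.decode("utf-8"))  # True
```

In PHP this transcoding would typically happen before output, so nginx receives UTF-8 and the charset directive matches the body.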