how to get around "Content-encoding gzip deflate" header sent by Chrome? - google-chrome

We have a simple HTML login form on our embedded device's web server. The web server is custom coded because of severe memory limitations. Regardless of these limitations, we like Chrome and would like to support it.
All browsers POST an HTTP request to our login form containing the expected "username=myname&password=mypass" string, but not Chrome. From Chrome we instead receive a "Content-encoding gzip deflate" request. By "all browsers" I mean this was tested and works fine on Internet Explorer 9 beta, 8, 7, and 6; Firefox 4 beta, 3, and 2; Opera 10 and 9; Safari 5, 4, and 3; and SeaMonkey 2.
Referring to section "14.2 Accept-Charset" of http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html, we tried sending back an HTTP 406 code to indicate that this server does not support that encoding, hoping that Chrome would try again and post the expected strings the standard way. The 406 code returned by the web server is clearly displayed in Chrome's "Inspect Element" window, but Chrome seems to treat it as an error code: no further requests are sent to the web server, and the login fails. We also tried HTTP return codes 405 and 200, with the same result.
Is there a way to get around this behavior, either with client-side JavaScript that prevents Chrome from sending the "Content-encoding gzip deflate" request, or with a server-side response that explains nicely to Chrome that we don't do gzip and it should just send the data the regular way?
We tried posting to the Google Chrome Troubleshooting forum with no response.
Any help would be greatly appreciated!
Best regards,
Bert

You're looking in the wrong section for the error code: Section 14.11 of RFC 2616 specifies that you send a 415 (Unsupported Media Type) if you can't deal with the Content-Encoding.
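For a hand-rolled server that just means emitting a different status line. A minimal sketch (Python; the surrounding socket handling is assumed and the function name is mine, not from the original post):
def reject_unsupported_encoding(conn):
    # conn is an already-accepted TCP socket; tell the client we cannot
    # decode its Content-Encoding, per RFC 2616 section 14.11
    body = b"Unsupported Media Type\r\n"
    headers = (
        "HTTP/1.1 415 Unsupported Media Type\r\n"
        "Content-Type: text/plain\r\n"
        "Content-Length: %d\r\n"
        "Connection: close\r\n"
        "\r\n" % len(body)
    )
    conn.sendall(headers.encode("ascii") + body)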

It sounds like when Chrome POSTs to a server for the first time, it defaults to using gzip encoding. Pretty strange.
The easy way out is to place your username/password as GET parameters; as long as your response doesn't use gzip content encoding, Chrome should switch to non-gzipped POSTs from that point on. Hope that works?

I tested this out a bit with a simple Python script that printed to stdout. I thought I was getting the same problem, but then I realized that I was just forgetting to flush stdout. It seems that Chrome always sends the request up to the end of the headers before sending the request content, and you have to use a second recv call to get the POST data. In contrast, the entire Firefox request is returned in a single recv call.
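For anyone hitting the same wall, the read loop ends up looking roughly like the sketch below (Python; the buffer size and the naive header parsing are my own illustration, not the original test script):
def read_request(conn):
    # conn is an already-accepted socket; read until the blank line that
    # ends the headers has arrived
    data = b""
    while b"\r\n\r\n" not in data:
        chunk = conn.recv(4096)
        if not chunk:
            break
        data += chunk
    headers, _, body = data.partition(b"\r\n\r\n")
    # Chrome tends to send the POST body in a later packet, so keep calling
    # recv until Content-Length bytes of body have arrived
    length = 0
    for line in headers.split(b"\r\n"):
        if line.lower().startswith(b"content-length:"):
            length = int(line.split(b":", 1)[1].strip())
    while len(body) < length:
        chunk = conn.recv(4096)
        if not chunk:
            break
        body += chunk
    return headers, body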

Related

HTTP No Authorization field in the digest authentication requests

I have an HTTP server with digest authentication on my SoC. On an authentication attempt, the server correctly sends a 401 response with a WWW-Authenticate header containing a nonce and the Digest scheme. However, on some hosts the browsers do not include the Authorization field (with the nonce etc.) in subsequent requests, which they are supposed to include.
Here is the Edge login attempt:
Response with WWW-Authenticate - https://i.imgur.com/tcw1XYL.png.
In the screenshot above, the server returns a correct WWW-Authenticate field.
Request without Authorization - https://i.imgur.com/4z61rU5.png.
I expect an Authorization field in the next request, but there is none!
The Chrome attempt is similar, except it instantly shows the 401 page without a login prompt because there is no Authorization field in the header.
Chrome and Edge both are latest 64bit versions on Windows 10.
What possible issues could cause this behavior?
Apparently the problem was a multi-line WWW-Authenticate header. You can see the "\r\n" separators (0x0d 0x0a bytes) between the header field values in the screenshots.
Such multi-line (folded) headers were allowed in the original RFC 2616 but deprecated by the newer RFC 7230. See https://stackoverflow.com/a/31324422/8876135 for details and links.
After fixing the header by making it a single line, the problem was gone. Still, I have no idea why the exact same browsers had this issue with the header on some hosts but were completely fine on my work/home PCs.
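For illustration, the difference is just whether the whole value goes out as one header line or is folded across several. A rough sketch (Python, with placeholder realm/nonce values rather than the device's real ones):
# broken (folded) form: the value continued on extra lines, each new line
# starting with \r\n plus whitespace, which modern clients no longer accept
# working form: everything in a single header line
www_authenticate = (
    'Digest realm="device", qop="auth", algorithm=MD5, '
    'nonce="0123456789abcdef", opaque="fedcba9876543210"'
)
response_lines = [
    "HTTP/1.1 401 Unauthorized",
    "WWW-Authenticate: " + www_authenticate,  # one line, no embedded \r\n
    "Content-Length: 0",
    "",
    "",
]
response = "\r\n".join(response_lines)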

Why does Chrome ignore Set-Cookie header?

Chrome has a long history of ignoring the Set-Cookie header. Some of the reasons have been treated as bugs and fixed; others persist. None of them are easy to find in documentation. For example:
Set-Cookie not allowed in 302 redirects
Set-Cookie not allowed if host is localhost
Set-Cookie not allowed if Expires is out of acceptable range
I am currently struggling to get Chrome to accept a simple session cookie. Firefox and Safari seem to accept almost any RFC-compliant string for Set-Cookie. Chrome stubbornly refuses to acknowledge that a Set-Cookie header was even sent in the response (it does not show up in Developer Tools under Network). curl looks fine.
So does anyone have either 1) modern best practices for cross-browser Set-Cookie formatting or 2) more information regarding what can cause Chrome to bork here?
Thanks.
One thing that has bitten me and is not on your list: if you are trying to set a secure cookie through HTTP on localhost, Chrome will reject it because you are not using HTTPS.
This kind of makes sense, but it is annoying for local development. (Firefox apparently makes an exception for this case and allows setting secure cookies over HTTP on localhost.)
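One way to live with that during local development (my own workaround, not something covered above) is to only add the Secure attribute when the request actually arrived over HTTPS. A minimal sketch in Python; the cookie name and attributes are illustrative:
def session_set_cookie(value, request_is_https):
    # build the Set-Cookie value by hand; mark it Secure only when serving
    # HTTPS, so Chrome still accepts it on plain-HTTP localhost
    attrs = ["session=" + value, "Path=/", "HttpOnly", "SameSite=Lax"]
    if request_is_https:
        attrs.append("Secure")
    return "; ".join(attrs)

# e.g. session_set_cookie("abc123", request_is_https=False)
# -> "session=abc123; Path=/; HttpOnly; SameSite=Lax"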

Cannot play wav file on safari

I'm trying to play a WAV file on Safari. It's pretty much the same question as this one: playing a WAV file on iOS Safari
But the accepted answers aren't working. I'm using a Rails server with Apache and Phusion Passenger. The audio file plays fine in Chrome but not in any Safari (desktop, mobile, or through UIWebView).
I'm sending the file in Rails with
send_file filename, :type => "audio/x-wav", :disposition => "inline"
Following the other Stack Overflow Q&A, I tried adding Content-Range and Content-Length headers to the response:
size = File.size(filename)
response.header["Content-Range"] = "bytes 0-#{size-1}/#{size}"
response.header["Content-Length"] = "#{size}"
The error I'm receiving is a pretty nondescript "Failed to load resource: Plug-in handled load".
Here are the headers of the response:
x-runtime 0.589797
Date Wed, 17 Aug 2016 16:38:49 GMT
X-Content-Type-Options nosniff
Server Apache/2.2.15 (CentOS)
X-Powered-By Phusion Passenger 5.0.7
Status 200 OK
Content-Type audio/x-wav
content-range bytes 0-3243/3244
Cache-Control private
Content-Transfer-Encoding binary
Content-Disposition inline; filename="eng-182-msg0026.wav"
Connection close
Content-Length 3244
X-XSS-Protection 1; mode=block
Requesting a direct link to a media file would work, but I was requesting it through a controller. Safari makes multiple HTTP requests, each for a certain byte range. The server was responding with the entire file every time, which is why it failed.
The annoying thing is that the Safari Web Inspector decides not to show the "Range" header in the network request. It'll show other HTTP headers, but not the most important one in this case...
Anyway, short answer: respond with the correct byte range that each request asks for.
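In generic terms, the shape of that is roughly the following (a Python sketch with deliberately naive Range parsing; the function and variable names are mine, not from any library):
import os
import re

def range_response(path, range_header):
    # answer a Range request with 206 plus a matching Content-Range;
    # fall back to a normal 200 with the whole file when no range was asked for
    size = os.path.getsize(path)
    match = re.match(r"bytes=(\d*)-(\d*)$", range_header or "")
    if not match:
        with open(path, "rb") as f:
            return 200, {"Content-Length": str(size), "Accept-Ranges": "bytes"}, f.read()
    start = int(match.group(1) or 0)
    end = min(int(match.group(2) or size - 1), size - 1)
    with open(path, "rb") as f:
        f.seek(start)
        body = f.read(end - start + 1)
    headers = {
        "Content-Range": "bytes %d-%d/%d" % (start, end, size),
        "Content-Length": str(len(body)),
        "Accept-Ranges": "bytes",
    }
    return 206, headers, body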
I had the same problem: playing audio files didn't work on Safari, but in Chrome and every other browser everything worked as expected.
Setting a proper Content-Range in the response header solved my problem. I used the forked gem send_file_with_range, which contains a fix for Rails 5.1.
I recommend this gem because its code is pretty simple and short, and this solution seems like the simplest and quickest.
Note: this solution is specific to Ruby on Rails developers, but the general solution is to set a proper Content-Range in the response header for every request.
I hope this will be helpful to somebody.

Chrome is not sending if-none-match

I'm making requests to my REST API. I have no problems with Firefox, but in Chrome I can't get caching to work: the server always returns 200 OK, because no If-None-Match (or similar) header is sent to it.
With Firefox I get 304 perfectly.
I think I'm missing something. I tried Cache-Control: max-age=10 to test, but nothing changed.
One reason Chrome may not send If-None-Match is when the response includes an "HTTP/1.0" instead of an "HTTP/1.1" status line. Some servers, such as Django's development server, send an older header (probably because they do not support keep-alive) and when they do so, ETags don't work in Chrome.
In the "Response Headers" section, click "view source" instead of the parsed version. The first line will probably read something like HTTP/1.1 200 OK — if it says HTTP/1.0 200 OK Chrome seems to ignore any ETag header and won't use it the next load of this resource.
There may be other reasons too (e.g. make sure your ETag header value is sent inside quotes), but in my case I eliminated all other variables and this is the one that mattered.
UPDATE: looking at your screenshots, it seems this is exactly the case (HTTP/1.0 server from Python) for you too!
Assuming you are using Django, put the following hack in your local settings file, otherwise you'll have to add an actual HTTP/1.1 proxy in between you and the ./manage.py runserver daemon. This workaround monkey patches the key WSGI class used internally by Django to make it send a more useful status line:
# HACK: without HTTP/1.1, Chrome ignores certain cache headers during development!
# see https://stackoverflow.com/a/28033770/179583 for a bit more discussion.
from wsgiref import simple_server
simple_server.ServerHandler.http_version = "1.1"
Also check that caching is not disabled in the browser, as is often done when developing a web site so you always see the latest content.
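To make the ETag quoting and the 304 revalidation concrete, the server side of that exchange is roughly the following (a generic sketch, not Django-specific; the function name and the MD5-based tag are my own choices):
import hashlib

def respond_with_etag(body, if_none_match):
    # the ETag value itself must be sent inside double quotes
    etag = '"%s"' % hashlib.md5(body).hexdigest()
    if if_none_match == etag:
        # the client's copy is still current: 304 with no body
        return 304, {"ETag": etag}, b""
    return 200, {"ETag": etag, "Content-Length": str(len(body))}, body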
I had a similar problem in Chrome: I was using http://localhost:9000 for development (and Chrome didn't send If-None-Match).
By switching to http://127.0.0.1:9000, Chrome [1] automatically started sending the If-None-Match header in requests again.
Additionally, ensure the DevTools > Network > Disable cache checkbox is unchecked.
[1] I can't find this documented anywhere; I'm assuming Chrome itself was responsible for this logic.
Chrome is not sending the appropriate headers (If-Modified-Since and If-None-Match) because the cache mode is not set, which forces the default (and that is what you're experiencing). Read more about the cache options here: https://developer.mozilla.org/en-US/docs/Web/API/Request/cache.
You can get the desired behaviour on the server by setting the Cache-Control: no-cache header, or on the browser/client through the Request.cache = 'no-cache' option.
Chrome was not sending the 'If-None-Match' header for me either, and I didn't have any Cache-Control headers. I closed the browser, opened it again, and it started sending 'If-None-Match' as expected. So restarting your browser is one more thing to check if you have this kind of problem.

how to reverse engineer an http API call using REST console

I'm trying to replicate a request I make on a website (i.e. zoominfo.com) with the same HTTP POST parameters, using the Chrome REST Console, but it fails for some reason. I'm not sure if there is a missing field or if it's not working because the origin of the request isn't valid. Can someone point me in the right direction? Below is a detailed explanation of the experiment:
ORIGINAL CASE
Basically, if I go to zoominfo.com (registered and all) I see a form page that I need to fill in:
If I hit enter, the site makes an AJAX call. If I open the Chrome dev tools and go to the Network tab, I see the details of the AJAX call:
Notice the body of the POST has the name John Becker in it:
{"boardMember":{"value":"Include","isUsed":true},"workHistory":{"value":"CurrentAndPast","isUsed":true},"includePartialProfiles":{"value":true,"isUsed":true},"personName":{"value":"john%20becker","isUsed":true},"lastUpdated":{"value":0,"isUsed":true}}
The response is shown under the Response tab:
WHAT I'M TRYING TO DO
Basically, replicate what I've done above using a REST console. (Note: there is nothing illegal here; I'm just replacing a Chrome browser action with a REST client action. I'm not hacking anyone and I'm not getting information I couldn't get the normal way, but if someone feels otherwise, please let me know.)
So I plug the same parameters as above into the REST console:
Now, I'm not sure about authentication, but just to be safe I entered the same username and password I have for the site into the REST console:
But then I keep getting an error as the response to my REST console's request:
UPDATE: CORRECT ANSWER:
According to JMTyler's answer, I simply had to include criteria in the RAW body and convert it to URL encoding. In addition, I had to explicitly set the encoding in the REST console body.
Looking at the Chrome inspector more closely, it turns out I simply had to click on "view source":
to get the URL-encoded value that I needed to put in the RAW body in the REST console:
I also had to set the encoding to gzip,deflate,sdch and things worked fine!
The form is posting all that JSON under the field criteria. You can see this in the screencap of the Chrome dev console you posted.
Just start your raw body in the REST console with criteria= and make sure the JSON has been URL-encoded. That should do it.
No authentication is needed because none is passed through the headers in your screencap. Any cookies you have when you load the page normally will also be sent by the REST console, so you don't need to worry about explicitly setting them.
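If you want to double-check that body outside the browser, the encoding step is just this (a Python sketch; the JSON is the one from the question, written with a literal space so the quoting produces the %20 itself):
import urllib.parse

criteria_json = (
    '{"boardMember":{"value":"Include","isUsed":true},'
    '"workHistory":{"value":"CurrentAndPast","isUsed":true},'
    '"includePartialProfiles":{"value":true,"isUsed":true},'
    '"personName":{"value":"john becker","isUsed":true},'
    '"lastUpdated":{"value":0,"isUsed":true}}'
)
# the raw POST body is the form field name followed by the url-encoded JSON
raw_body = "criteria=" + urllib.parse.quote(criteria_json)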
Reading through your problems, I'll make an educated guess:
zoominfo does not provide a RESTful API.
REST Console understands and uses HTTP authentication, which is different from the authentication handler zoominfo implemented.
A possible workaround may be:
Make a call to the login page via the REST console; you'll get back cookies and a lot more.
In subsequent requests to zoominfo, be sure to include those cookies (likely holding some session information), therefore acting like a browser.
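If you end up scripting this instead of using the REST console, Python's requests library carries those cookies over for you automatically; a sketch with placeholder URL paths and form field names (not the site's real ones):
import requests

session = requests.Session()
# 1) hit the login page; whatever cookies come back are stored on the session
session.post("https://www.zoominfo.com/login",            # placeholder path
             data={"username": "me", "password": "secret"})
# 2) later requests reuse those cookies automatically, so the server
#    sees the same "browser" it just authenticated
resp = session.post("https://www.zoominfo.com/search",    # placeholder path
                    data={"criteria": "..."})
print(resp.status_code)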