So I have the same website making the same request to the same server on (1) Chrome 76 and (2) Chrome 77 from different networks and computers.
One request has (1) Sec-Fetch-Mode: no-cors, Sec-Fetch-Site: cross-site and the other one (2) Sec-Fetch-Mode: cors, Sec-Fetch-Site: same-site.
The one with no-cors fails with a 400 from a C# Web API endpoint that has had CORS enabled for years, serving thousands of different users on all kinds of devices.
What is going on? There is talk of a Chrome bug where that header is not sent for preflight requests, but here the header is present and set to no-cors.
Security setting or bug in Chrome? Fixable server-side or front-end-side?
The request is sent by an XMLHttpRequest, not the newer Fetch API.
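For reference, the browser picks the Sec-Fetch-Mode value from the kind of code that initiated the request, and an XMLHttpRequest is always issued in CORS mode. A rough sketch of that mapping (the table below is my own summary of the Fetch Metadata behavior, not something from the question):

```javascript
// Rough mapping of request initiator -> the Sec-Fetch-Mode value a
// conforming browser is expected to send. The keys are my own labels,
// not spec identifiers.
const expectedSecFetchMode = {
  "XMLHttpRequest": "cors",          // XHR always uses CORS mode
  "fetch() default": "cors",         // fetch(url) without options
  "fetch() no-cors": "no-cors",      // fetch(url, { mode: "no-cors" })
  "<img>/<script> tag": "no-cors",   // classic embedded resources
  "top-level navigation": "navigate" // address bar, links, form submits
};

// So a request issued via XHR that arrives with Sec-Fetch-Mode: no-cors
// did not come from that XHR code path in a conforming browser.
console.log(expectedSecFetchMode["XMLHttpRequest"]);
```

If the mapping holds, the no-cors request in (1) was most likely not produced by the XHR at all, which points toward an extension, a different code path, or a browser bug rather than a server-side fix.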
I'm trying to track down a problem in our Java Spring MVC/WebFlow web application that seems to occur whenever some extra, unexpected requests arrive from Firefox, with no session cookie attached and with Sec-Fetch headers different from the requests before and after them:
Sec-Fetch-Dest=empty;Sec-Fetch-Mode=cors;Sec-Fetch-Site=same-origin
These requests do not appear in the Firefox developer panel's network tab, even with all requests selected.
But both the HTTP access log and a custom "log everything" filter show the above, with the same Firefox User Agent, no cookies, and no Referer header.
e.g.
9:13:12,426 DEBUG RequestLogFilter:GET->http://localhost:9080/payment/pos?execution=e1s1 (null/false), headers:Host=localhost:9080;User-Agent=Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:107.0) Gecko/20100101 Firefox/107.0;Accept=*/*;Accept-Language=en-US,en;q=0.8,es-MX;q=0.5,es;q=0.3;Accept-Encoding=gzip, deflate, br;DNT=1;Connection=keep-alive;Sec-Fetch-Dest=empty;Sec-Fetch-Mode=cors;Sec-Fetch-Site=same-origin;, cookies:
(Where "null/false" is HttpServletRequest.getRequestedSessionId() and isRequestedSessionIdValid().)
I can run the same sequence of web requests dozens of times or more without seeing these extra requests appear; it's quite unpredictable, with or without a fresh private/incognito window, cleared cookies, etc.
I'm looking for any ideas why Firefox is sending these requests. And no, I don't think this is specific to Firefox: we've seen log evidence that Chrome at least seems to be doing something similar in our production environment, but I haven't yet been able to duplicate it under Chrome (107) locally, where I have all this additional logging enabled.
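One way to make such requests stand out while hunting for them is to condense each request's Sec-Fetch headers and session-cookie presence into a single log-friendly summary. A minimal sketch, in JavaScript for brevity (the function name, field names, and the JSESSIONID assumption are mine, not from the application above):

```javascript
// Hypothetical helper: summarize the Sec-Fetch-* headers and session
// cookie presence of a request so odd combinations stand out in logs.
// `headers` is a plain object of lowercased header names -> values.
function fetchMetadataSummary(headers, cookieHeader) {
  return {
    dest: headers["sec-fetch-dest"] || "(none)",
    mode: headers["sec-fetch-mode"] || "(none)",
    site: headers["sec-fetch-site"] || "(none)",
    // Assumes a servlet-container-style JSESSIONID session cookie.
    hasSessionCookie: /(^|;\s*)JSESSIONID=/.test(cookieHeader || "")
  };
}

// The mystery requests from the question would summarize as:
const summary = fetchMetadataSummary(
  { "sec-fetch-dest": "empty",
    "sec-fetch-mode": "cors",
    "sec-fetch-site": "same-origin" },
  "" // no cookies attached
);
// { dest: "empty", mode: "cors", site: "same-origin", hasSessionCookie: false }
```

That dest=empty/mode=cors combination is what script-initiated subresource requests (fetch/XHR) produce, which is at least a hint that the extra requests are not ordinary navigations.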
Users have started having problems with Flash-based traffic under Chrome 80: Cookies are not being sent with POST requests.
I'm aware of the SameSite updates, but our traffic is all same-domain, so I assumed this wouldn't affect us.
Debugging the request headers in the dev tools, here's what I note:
In an older version of Chrome 73:
there are no Sec-Fetch-* headers
Origin header is always correct
cookies are sent properly
In Chrome 80, GET requests:
Origin is correct, and cookies are sent
now has Sec-Fetch-* headers
the Sec-Fetch-Site header says cross-site -- is this right? This is determined by the browser, correct? Why would Chrome label the traffic as cross-site? The request URL is the same as my page, same as window.location.hostname.
In Chrome 80, POST requests:
the Sec-Fetch-* headers are the same as for GET
the Origin header is null -- wait, why? This is also assigned by the browser, right? Why null?
cookies are not sent
This makes absolutely no sense to me. It's always worked, we don't use multiple domains, and our cookies are Secure and HttpOnly. Can someone help me understand:
why Chrome 80 would label my requests as Sec-Fetch-Site: cross-site?
why Chrome 80 would send Origin: null and no cookies for POSTs?
I am experiencing the same problem. For POST requests from Flash, sometimes the request doesn't contain cookies at all, and sometimes it contains only one cookie key (I have multiple keys in my cookie). It looks like a bug in Chrome 80, unless they did it on purpose because they have wanted to kill Flash for a long time.
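If the cause is Chrome 80's new SameSite defaults classifying the plugin-initiated POSTs as cross-site, the commonly suggested mitigation is to mark the affected cookies SameSite=None; Secure so they are still attached to cross-site requests. A minimal sketch of building such a Set-Cookie value (the helper and cookie names are illustrative, not from the application above):

```javascript
// Hypothetical helper: build a Set-Cookie header value that opts a
// cookie out of Chrome 80's SameSite=Lax-by-default behavior.
// Note: browsers only honor SameSite=None together with Secure (HTTPS).
function crossSiteCookie(name, value) {
  return `${name}=${value}; SameSite=None; Secure; HttpOnly; Path=/`;
}

const header = crossSiteCookie("session", "abc123");
// "session=abc123; SameSite=None; Secure; HttpOnly; Path=/"
```

Whether this actually fixes the Flash case depends on why Chrome classifies those requests as cross-site in the first place; it addresses the missing-cookie symptom, not the null Origin.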
Chrome 80 and up (released 4 Feb 2020) enforces the SameSite cookie attribute (I downloaded it one day early using Chrome Beta to test my site).
It gives my site the following error:
A cookie associated with a cross-site resource at URL was set without the SameSite attribute. It has been blocked, as Chrome now only delivers cookies with cross-site requests if they are set with SameSite=None and Secure. You can review cookies in developer tools under Application>Storage>Cookies and see more details at https://www.chromestatus.com/feature/5088147346030592 and https://www.chromestatus.com/feature/5633521622188032.
I've looked in Application>Storage>Cookies but I have only one cookie set which looks like this:
How do I know which Cookie was blocked and which request it was blocked on?
Does this issue cause a cookie not to be set, OR does it cause a cookie not to be sent in an HTTP request?
Does it treat cookies set client side and cookies set server side (using the Set-Cookie response header) differently?
Thanks
See here for more info: https://www.chromium.org/updates/same-site/test-debug
How do I know which Cookie was blocked and which request it was blocked on?
You will need to look through the Network panel in DevTools, find the request, and look at the filtered out cookies.
Does this issue cause a cookie not to be set, OR does it cause a cookie not to be sent in an HTTP request?
Both are possible.
Does it treat cookies set client side and cookies set server side (using the Set-Cookie response header) differently?
No.
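Since client-set and server-set cookies are treated the same, the same attribute check applies to both a document.cookie string and a Set-Cookie header. A hypothetical sketch of the rule Chrome's new default enforces for cross-site delivery (my own simplification, not Chrome's actual implementation):

```javascript
// Hypothetical check: would this cookie string still be delivered on a
// cross-site request under Chrome 80's defaults? It is delivered only
// if it carries both SameSite=None AND Secure; otherwise it is treated
// as SameSite=Lax and withheld from cross-site requests.
function allowedCrossSite(cookieString) {
  const attrs = cookieString.toLowerCase();
  return /samesite=none/.test(attrs) && /(^|;)\s*secure(\s*;|$)/.test(attrs);
}

allowedCrossSite("id=1; SameSite=None; Secure"); // true
allowedCrossSite("id=1");                        // false, Lax by default
allowedCrossSite("id=1; SameSite=None");         // false, None requires Secure
```

The same strings work whether the cookie was written via document.cookie or via a Set-Cookie response header, which is the practical meaning of the "No." above.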
I was working on a Chrome extension with the goal of intercepting all HTTP(S) requests/responses and logging all headers to a persistent file on disk. I was almost at my goal. But when I looked at some requests closely, I found that in many requests "If-None-Match" and "If-Modified-Since" were missing from requestHeaders, even though I can see them in the Network panel of the Developer Tools.
I tried hard to figure out a pattern causing this behavior, but unfortunately there doesn't seem to be one.
Can anybody help?
To receive the list of request headers, you need to use the onBeforeSendHeaders event of the chrome.webRequest API.
The onBeforeSendHeaders documentation mentions that some headers are not available for reading or modification:
Authorization
Cache-Control
Connection
Content-Length
Host
If-Modified-Since
If-None-Match
If-Range
Partial-Data
Pragma
Proxy-Authorization
Proxy-Connection
Transfer-Encoding
I'm afraid you will not be able to read or modify these headers, because this is a limitation enforced by the API.
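For a logging extension, one workable approach is to keep that list in code, so the log can explicitly distinguish "header absent" from "header withheld by the API". A minimal sketch, assuming the list above; the helper names and the logToFile callback are my own, not part of the chrome.webRequest API:

```javascript
// Headers the onBeforeSendHeaders docs list as unavailable, lowercased
// for case-insensitive comparison.
const UNAVAILABLE = new Set([
  "authorization", "cache-control", "connection", "content-length",
  "host", "if-modified-since", "if-none-match", "if-range",
  "partial-data", "pragma", "proxy-authorization", "proxy-connection",
  "transfer-encoding"
]);

// Hypothetical helper: split a header array ({ name, value } objects,
// as webRequest provides) into headers the API shows and headers it
// would silently withhold.
function partitionHeaders(requestHeaders) {
  const visible = [], withheld = [];
  for (const h of requestHeaders) {
    (UNAVAILABLE.has(h.name.toLowerCase()) ? withheld : visible).push(h);
  }
  return { visible, withheld };
}

// In the extension itself (not runnable outside Chrome):
// chrome.webRequest.onBeforeSendHeaders.addListener(
//   details => logToFile(partitionHeaders(details.requestHeaders).visible),
//   { urls: ["<all_urls>"] },
//   ["requestHeaders"]
// );
```

In practice the API simply omits these headers from details.requestHeaders, so the set is mainly useful for annotating the log with why they are missing compared to the DevTools Network panel.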
When I try to use e.g. the Google Maps API, the browser sometimes asks for a login and password, and then returns 401 Unauthorized (Firefox 37).
Then when I go to www.google.com and try to search something, the same thing happens: 401 Unauthorized.
What does this mean and why is it showing? After a couple of minutes it goes away (sometimes clearing the browser cache is enough).
Now I tried to load www.google.com:
Connection: close
Content-Type: text/html
Date: Thu, 01 Jan 1970 00:02:23 GMT
Server: httpd
WWW-Authenticate: Basic realm="NAT Router"
Auth:
No Proxy-Authenticate Header is present.
WWW-Authenticate Header is present: Basic realm="NAT Router"
The message in the dialog box of the browser (where login and password are asked) is "NAT Router".
I'm afraid some networking equipment at your office intercepts HTTP sessions and acts as a transparent HTTP proxy that requires authentication and prompts you for a login/password. This could be a simple misconfiguration, a security feature, or even malicious activity.
I would recommend investigating this in more detail with a networking specialist who knows the topology of your local network. The information provided in the question is not enough to tell for sure.
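One quick piece of evidence you can collect yourself is the realm string in the WWW-Authenticate challenge, since it usually names the intercepting device (here, "NAT Router", which no Google server would send). A small sketch of extracting it, simplified to the Basic scheme only (the helper name is mine):

```javascript
// Hypothetical helper: pull the realm out of a Basic WWW-Authenticate
// challenge, e.g. to identify which device on the path demands auth.
function basicRealm(wwwAuthenticate) {
  const m = /^Basic\s+realm="([^"]*)"/i.exec(wwwAuthenticate || "");
  return m ? m[1] : null;
}

basicRealm('Basic realm="NAT Router"'); // "NAT Router"
basicRealm("Bearer token");             // null, not a Basic challenge
```

A realm like "NAT Router" on a 401 for google.com is a strong sign the response came from a local gateway, not the origin server.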