Chrome dev tools: displaying cookies - google-chrome

1) Visit a random site, for example example.com.
2) Open DevTools: Application -> Cookies.
3) Make sure it's empty.
4) Open the Console tab and make a fetch request to another site, for example fetch('https://api.chucknorris.io/jokes/random').
5) Open Application -> Cookies again and notice cookies from the domain in step 4.
Many times a day I clear a website's cookies for debugging purposes. But in Chrome, clearing one website's cookies also clears other sites' cookies, which wipes my sessions on those sites; that is not what I want.
Is this a bug or a feature in Chrome? I think it's a bug and cookies from other domains shouldn't be shown, but maybe I'm missing something.
P.S. Firefox doesn't show cookies from other domains.

What I have observed is that the list of URLs under the 'Cookies' entry is the page that made a request to the origin server for the cookies shown. If you look at the network traffic, you can see that the URLs in the 'Cookies' list are the referers of the requests to the origin servers whose responses set the cookies. This is a common way for tracking cookies to be set: A.com in the 'Cookies' URL list will have some page with lots of IMG, script, or iframe elements that make requests to the domains in the cookie list, and the responses from those domains set the cookies. What I found confusing is that the Chrome documentation (https://developers.google.com/web/tools/chrome-devtools/storage/cookies) refers to the list of URLs under the 'Cookies' entry as 'Origins'. They are not the cookies' origins as defined in RFC 6265; they are the referer pages that made the requests to the cookie origin servers.
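Python's http.cookiejar applies the same bookkeeping, which makes the distinction easy to see. In this sketch (a.example and tracker.example are made-up names), a page embeds a tracking pixel from a third-party domain; the cookie ends up stored under the domain that sent the response, not under the page that embedded the resource:

```python
# Sketch: a third-party response's Set-Cookie is stored under the domain
# that sent the response, not the page that embedded the resource.
import http.cookiejar
import urllib.request
from email.message import Message

class FakeResponse:
    """Minimal stand-in for an HTTP response, enough for extract_cookies()."""
    def __init__(self, headers):
        self._headers = headers
    def info(self):
        return self._headers

headers = Message()
headers["Set-Cookie"] = "session=abc123; Path=/"

# The page at http://a.example embeds <img src="http://tracker.example/pixel.gif">;
# the request the cookie jar sees is the one to tracker.example.
request = urllib.request.Request("http://tracker.example/pixel.gif")

jar = http.cookiejar.CookieJar()
jar.extract_cookies(FakeResponse(headers), request)

for cookie in jar:
    print(cookie.domain, cookie.name, cookie.value)
```

The cookie's domain comes out as tracker.example, exactly what DevTools shows under the third-party domain, while the 'Origins' label points at the embedding page.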

Related

How do I validate HSTS is being enforced by the browser

I set the HSTS header on my site and I want to test that the different browsers (Chrome, Firefox, IE, Opera) actually enforce it.
I installed a trusted certificate, connected to the site, and I can see the header in the HTTP response, but I want to validate that the browser enforces the protocol.
In Chrome it's easy and it works:
- I can query the site at chrome://net-internals/#hsts
- When trying to connect over HTTP I get a 0 KB response with status 307.
- If I switch back to a self-signed cert I can't connect to the site, and there is no "proceed" option.
The other browsers behave differently: I can't query the HSTS list, the response status and size differ, and when switching to a self-signed cert (after a first trusted connection) I do get a "proceed" option.
So how can I validate that the protocol is enforced in each browser?
Although Chrome's ability to query the HSTS cache and see the fake 307 redirect is handy, you can check whether HSTS is enforced in any browser.
HSTS offers you two options:
- Automatically load HTTP resources over HTTPS
- Prevent click-through of certificate errors
You are concentrating on the second option, but why not use the first as the test? Just load the site over HTTP and check whether the server redirects it (i.e. the HTTP URL loads first, so no HSTS rule is in use) or whether the browser jumps straight to the HTTPS URL (i.e. HSTS is in use).
So in Firefox, for example, open the Network tools and enable the "Persist Logs" option (and "Disable Cache", to avoid any confusion). Then go to a site that sends an HSTS header, over HTTP (e.g. http://stackoverflow.com), and you'll see a 301 redirect if this is your first visit.
The next time you go to it (after it has cached the HSTS header), it should go directly to the HTTPS URL even though you typed the HTTP URL into the address bar.
If you've already been on stackoverflow.com then you can clear the HSTS cache to try this again.
Once you've confirmed whether HSTS is being used, you can then investigate the click-through issue. Browsers should not allow click-through when HSTS is in place, including for self-signed certs, but maybe there's a bug, or the browser still has your old cert cached somewhere, or the HSTS policy has expired, or something else...
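One way to make the network-panel check mechanical is to classify the redirect you see on the HTTP request. A plain 301/302 with an https Location is the server redirecting (no HSTS rule cached yet); Chrome labels its own synthetic HSTS upgrade as a 307 carrying a Non-Authoritative-Reason: HSTS header. A small sketch (that header is a Chrome implementation detail, so treat it as an assumption for other browsers):

```python
def redirect_source(status, headers):
    """Classify an HTTP -> HTTPS redirect seen in the network panel.

    A 301/302 with an https Location is an ordinary server-side redirect
    (first visit; no HSTS rule cached yet). Chrome reports its internal
    HSTS upgrade as a synthetic 307 with Non-Authoritative-Reason: HSTS.
    """
    if status == 307 and headers.get("Non-Authoritative-Reason") == "HSTS":
        return "browser-hsts"
    if status in (301, 302) and headers.get("Location", "").startswith("https://"):
        return "server-redirect"
    return "unknown"

# First visit: the server answers with a real redirect.
print(redirect_source(301, {"Location": "https://stackoverflow.com/"}))
# Later visits: the browser upgrades the request itself.
print(redirect_source(307, {"Non-Authoritative-Reason": "HSTS"}))
```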

Delete site from Chrome's preloaded HSTS list

Is there any way to remove entries from Chrome's preloaded HSTS list?
For development reasons I need to point google-analytics.com at an IP address different from the original. But google-analytics.com is on Chrome's preloaded HSTS list. This results in an error while loading the page, because my SSL certificate for google-analytics.com is not properly signed.
I know that I can remove entries from the dynamically created HSTS list via chrome://net-internals/#hsts - but not entries that come with the browser.
Is there any way to tell Chrome that I know what I'm doing?
1) Navigate to chrome://net-internals/#hsts
2) First, to confirm that the domain's HSTS settings are recorded by Chrome, type the host name into the "Query domain" section and click the Query button. If the box returns "Found" with settings information below it, the domain's HSTS settings are saved in your browser.
For your purpose (if you want to delete):
3) Type the same domain name into the "Delete domain" section and click the Delete button.
Your browser will no longer force an HTTPS connection for that site! You can test this by refreshing or navigating to the site.
If you create your own CA, create the certificate for google-analytics.com with this CA, and import the CA as trusted into the browser/OS certificate store, then it should work: Chrome ignores pinning information if the certificate is signed by an explicitly imported CA.
See also Man in the middle attack to a website which uses public key pinning.
Nope: https://bugs.chromium.org/p/chromium/issues/detail?id=483634
You could try using a self-signed cert, but I imagine they preload pinning for their sites as well, so I doubt that would work either.

Fiddler not seeing all cookies.

When I access the URL http://kgnzb.rvxrg.servertrust.com/login.asp using Chrome, I can see that there are 3 cookies in the browser (using Chrome Developer Tools, with JavaScript disabled, to view the cookies).
However when I look at the Fiddler traffic, I see only two cookies. Screenshot http://prntscr.com/27pecx.
I see the same behavior as Fiddler when I scrape the page.
Could someone explain why Fiddler and the scraper see only two cookies where the browser sees 3?
Thanks
Fiddler shows you exactly what is sent to and from the server.
You're only showing the Set-Cookie responses of one HTTP response; those are the cookies set by only that response.
If you want to look at all cookies being sent to the server for a request, look at the Request Header Inspector's Cookie value for a subsequent request.
This web page is made up of multiple resources, each of which may include a Set-Cookie header. Also, when JavaScript is enabled, cookies can be added or removed by manipulating the document.cookie property. Such changes would only be observable on the network in a subsequent request.
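A toy model of that last point: the browser's cookie store for the site is the union of every response's Set-Cookie plus any document.cookie writes, so the Cookie header Fiddler shows on a later request can contain more cookies than any single response set. A sketch (cookie attribute handling omitted for brevity):

```python
# Toy model: cookies accumulate across the page's responses and scripts.
responses = [
    {"Set-Cookie": "a=1; Path=/"},   # main document
    {"Set-Cookie": "b=2; Path=/"},   # a sub-resource (image, script, frame...)
]

store = {}
for response in responses:
    # keep only name=value; real browsers also honor Path, Domain, Expires...
    name, _, value = response["Set-Cookie"].split(";")[0].partition("=")
    store[name.strip()] = value.strip()

store["c"] = "3"  # a later document.cookie = "c=3" write by JavaScript

# What the *next* request's Cookie header would carry:
cookie_header = "; ".join(f"{k}={v}" for k, v in store.items())
print(cookie_header)  # a=1; b=2; c=3
```

Looking at a single response in Fiddler only shows you the first slice of this; the Request Header Inspector on a later request shows the accumulated result.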

How does Firefox implement HSTS in detail?

I was doing some research on how Firefox and Chrome are implementing HSTS (HTTP Strict Transport Security) in detail.
Turns out that they have a predefined list of sites that already implement HSTS. This can be seen here, here, and/or here.
These lists seem to be compiled into the source code itself, which makes sense... but how do Firefox and Chrome handle my own HSTS headers? How and where do they store my URL, my max-age, and whether I set includeSubDomains or not?
I wasn't able to find this in about:config or anywhere similar...
So maybe somebody knows more about this than I do; I'm just curious (:
Thanks!
See http://hg.mozilla.org/mozilla-central/file/20bbf73921f4/netwerk/protocol/http/nsHttpChannel.cpp#l1072 and then http://hg.mozilla.org/mozilla-central/file/20bbf73921f4/security/manager/boot/src/nsStrictTransportSecurityService.cpp#l249 which calls http://hg.mozilla.org/mozilla-central/file/20bbf73921f4/security/manager/boot/src/nsStrictTransportSecurityService.cpp#l147
So the data ends up stored in the permission manager, which is the normal place per-host information gets stored in Firefox. The permission manager stores its state in permissions.sqlite, I think.
Sites that want HTTP Strict Transport Security (HSTS) enforced send a header in the response: Strict-Transport-Security: max-age=31536000
max-age is the number of seconds until the policy expires. It is sent with each response, so the expiry is pushed that much further into the future every time the site is requested.
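That sliding window can be sketched like this; each response carrying the header moves the expiry forward by max-age seconds from the moment it was received (the parsing here is simplified relative to RFC 6797):

```python
from datetime import datetime, timedelta, timezone

def hsts_expiry(header_value, received_at):
    """Return (expiry, include_subdomains) for a Strict-Transport-Security
    header received at `received_at`. Simplified directive parsing."""
    max_age = 0
    include_subdomains = False
    for directive in header_value.split(";"):
        d = directive.strip().lower()
        if d.startswith("max-age="):
            max_age = int(d.split("=", 1)[1].strip('"'))
        elif d == "includesubdomains":
            include_subdomains = True
    return received_at + timedelta(seconds=max_age), include_subdomains

received = datetime(2024, 1, 1, tzinfo=timezone.utc)
expiry, subs = hsts_expiry("max-age=31536000; includeSubDomains", received)
print(expiry, subs)  # 31536000 s = 365 days after the response arrived
```

Each later visit calls this again with a fresh `received_at`, which is why the policy never lapses on a site you keep visiting.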
The browser (I have only tried Firefox) stores this data and uses it every time the site is accessed. This is true even for incognito mode: if you have ever accessed the site in non-incognito mode, its details are saved and used even if you now open it in incognito mode.
For Firefox, this data is stored in a file called SiteSecurityServiceState.txt in your Firefox profile folder. You can enter about:support in the browser and then select "Show in Folder" to open your profile folder, where you can locate this file.
I am not sure about the predefined sites, but the file above is where normal sites' HSTS details are kept by Firefox.
More details - Understanding HTTP Strict Transport Security (HSTS)
PS: Above link goes to my personal blog that has more details on HSTS.

More Odd Firefox Cache Manifest Behavior: Redirect to Outside Domain Results in 404 Failure

I have an HTML5/Javascript (PHP/MySQL on the server) app with a cache manifest and it runs fine on almost all mobile and desktop browsers except for Firefox.
When I remove the cache manifest, it works fine in Firefox. Therefore, it's something odd with the cache manifest in Firefox that I can't figure out. It's loading a file from the cache, even though the file sends a Cache-Control: no-store, no-cache header.
The file handles the OAuth dance for getting a LinkedIn access token and follows these steps:
The app calls the file via Javascript using window.location.replace('file.php')
file.php loads and is redirected to file.php?param=initiate
file.php?param=initiate loads, gets a request token from LinkedIn, then redirects to the LinkedIn authorization page, then gets redirected to file.php?param=initiate&otherparameters
file.php?param=initiate&otherparameters loads, otherparameters is used to get an access token from LinkedIn, then reloads the app because now it has access.
However, on Firefox (16.0.2 on Windows 7), I get the following:
The app calls the file via Javascript using window.location.replace('file.php')
file.php loads and is redirected to file.php?param=initiate
(FireBug shows Status 302 Found and the Response Headers show the location /file.php?param=initiate)
file.php?param=initiate loads, gets a request token from LinkedIn, but does NOT redirect to the LinkedIn authorization page: it shows the 404 page. (FireBug shows Status 302 Found, and the Response Headers show the Location as the https://linkedin.com authentication link, but Firefox does not go to the LinkedIn page; it makes another GET request for file.php?param=initiate, loads it from the cache with Status 200 OK (BF Cache), and shows the 404 page.)
file.php is NOT in the cache manifest.
Basically it does not go to the Location in the response header from step 3 that should take it to the LinkedIn authorization page, but I can't figure out why not.
Any ideas on how to fix this?
If you want to reproduce this problem, here's a link to a test event. Try to send a LinkedIn connection request and watch Firebug. All the LinkedIn profiles for this event (except mine) are dummy profiles, so don't worry about sending a LinkedIn connection request to a random stranger. You have to register first with your e-mail to get an activation link, but you can use a disposable e-mail address if you want to.
Some things I've tried:
No cache manifest: this fixes it, but I need offline functionality
Sending headers with various permutations of no-store, no-cache, must-revalidate, a past Expires date, etc.
Reducing the number of entries in the cache manifest
Various combinations of SETTINGS: prefer-online, NETWORK: *, NETWORK: https://*, etc.
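For reference, a minimal sketch of the kind of manifest this setup used (file names here are placeholders): file.php is deliberately absent from the CACHE section, and the NETWORK: * fallback is what should let it, and its redirects, go to the network:

```
CACHE MANIFEST
# v1 2012-11-20

CACHE:
index.html
app.js
app.css

NETWORK:
*
```

Even with file.php uncached, Firefox still served it from cache once the redirect crossed to an outside domain.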
I solved this problem by re-writing my LinkedIn/OAuth library so that it does everything via ajax instead of sending the Location header via PHP.
After much frustration I figured out why this problem was happening, so hopefully this will help others who face a similar predicament.
It turns out the cache manifest does not allow redirects to outside domains (this is probably documented somewhere, but it didn't show up for me when I searched for a solution).
Part of the problem was that I didn't know the redirect was causing the problem. I thought it was just the weirdness of the cache manifest. After all, it worked fine on Chrome and Safari and I didn't get any useful debugging info from Firebug.
Here's a useful article on the gotchas of cache manifest (this issue is listed as Gotcha #8).
Here's a link to the HTML Offline Spec (this issue appears to be listed in section 6.7.6(4), but the document is so opaque I can't even tell whether that's really what it's referring to).