How to preserve SSL with HTML5 application cache

I have an existing site that works fine over http and https (SSL). The SSL certificate is valid and can be confirmed by inspecting in the browser.
I am starting to use a manifest file to enable the HTML5 application cache on my website. This is useful for making the page load faster, and eventually for offline capabilities. It works great over a regular http connection. The problem happens when accessing the site over https (SSL). When I do this, I can access my website's content just fine, and the URL says "https"; however, I see the following behavior:
Safari: It displays the lock icon, but when I click the lock icon to inspect the certificate, it says that the certificate is invalid.
Firefox: Does not display the colored address bar indicating encryption, and when inspecting the certificate, it says that there is no certificate.
Chrome and Opera: Correctly displays the secure nature of the URL, and when clicking the lock icon it displays the SSL certificate information. Yes!
I understand that using the application cache causes resources to be served locally from the browser, and as such no encryption is happening for those loads; however, customers don't necessarily know that an application cache is operating in the background, and they expect to see a valid SSL certificate and indications that the connection is secure. Safari and Firefox appear to be handling this incorrectly, unless I am missing something. That is my question: does anyone know how to get Safari and Firefox to display the SSL certificate for pages served from the application cache? Is there something special you need to do, or is it a Safari and Firefox bug?
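For reference, the application cache manifest involved here is a plain-text file along these lines (the entries below are hypothetical placeholders, not from the question):

```
CACHE MANIFEST
# v1 - change this comment to force the cache to refresh

CACHE:
/styles/main.css
/scripts/app.js

NETWORK:
*
```

It has to be served with the text/cache-manifest MIME type, and for an HTTPS page the manifest itself must be fetched from the same secure origin.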

I believe I have discussed this with someone before. Please let me know if this helps.
Change all of your script and css references from
http:// or https:// to //.
If you don't have any, the point is moot, but if you do, please let me know whether it has an effect.
I believe this may be related to not being able to verify the references from a cached page.
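As a sketch of the change this answer suggests, here is a hypothetical helper (not from the original answer) that rewrites absolute src/href URLs to protocol-relative ones:

```python
import re

def make_protocol_relative(html):
    """Rewrite absolute http:// or https:// URLs in src/href attributes
    to protocol-relative // URLs, so assets inherit the page's scheme."""
    return re.sub(r'\b(src|href)=(["\'])https?://', r'\1=\2//', html)

print(make_protocol_relative('<script src="https://cdn.example.com/app.js"></script>'))
# → <script src="//cdn.example.com/app.js"></script>
```

In practice you would make this change in your templates rather than post-processing the HTML; the function just illustrates the before/after.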

Based on the history of vulnerabilities, I'd guess this may have been overlooked for the sake of fixing more critical issues. That said, I think this should be reported to both vendors now that some of the glaring vulnerabilities have been patched. Have you tested this with the latest releases of Firefox and Safari?

Did you serve the application manifest over SSL?

Related

Third-Party Cookies in Chrome

On the latest version of Chrome (Version 90.0.4430.85 on macOS), I found that third-party cookies are disabled even though I checked "Allow all cookies" in my browser settings. I also tried adding the site to "Sites that can always use cookies" and checked "Including third-party cookies on this site", but it still doesn't seem to work. I'm accessing a course over LTI that requires third-party cookies. (The course is quite old, so that might be an issue.)
I'm getting an error when the course tries to load (screenshots of the error and of my Chrome settings not shown here).
The reason I know this is that my other laptop has an earlier version of Chrome (around 80) and the cookies work there (the course loads), but it stops working when I update Chrome to the latest version.
When I tried in Safari, I'm able to allow third-party cookies by disabling "Prevent cross-website tracking", but I cannot find such settings in Chrome.
Any ideas on what I might try to be able to use third-party cookies on the latest version of Chrome? Also, why is it the case that this site functions in an earlier version of Chrome but not the latest? Thanks in advance.
Your browser settings for third-party cookies look OK, as far as testing by allowing all of them goes.
The issue may be with the LTI tool/app provider who delivers their product through Canvas and other LMSs. For example, if the provider hasn't set their cookies with SameSite=None, newer Chrome versions will reject them in a third-party context; it sounds like that's what you suspect, given that the course/app is older.
I think you can test this by temporarily disabling the same site requirement here: chrome://flags/#same-site-by-default-cookies
There are a few other decent testing tips from Chromium here.
If that's the issue and you still need to provide access quickly for a bunch of users, but can't wait for the LTI tool/app to be updated, you can usually update the LTI app/tool settings in Canvas to open it in a separate tab/window instead of as an iframe - e.g. these settings in Canvas.
Hope it works out!
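For anyone on the tool-provider side, the fix alluded to above is marking the cookie SameSite=None; Secure. A minimal sketch with Python's standard library (requires Python 3.8+ for the samesite attribute; the cookie name and value are made up):

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["lti_session"] = "abc123"            # hypothetical session cookie
cookie["lti_session"]["samesite"] = "None"  # allow third-party (cross-site) use
cookie["lti_session"]["secure"] = True      # SameSite=None requires Secure
print(cookie["lti_session"].OutputString())
```

Whatever framework the LTI tool uses, the resulting Set-Cookie header needs to carry both attributes, or current Chrome will drop the cookie inside the LMS iframe.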

Chrome uses different versions of HTTP randomly

I have a website that supports HTTP2. If I use Firefox or Chrome in anonymous mode then all the calls are made with HTTP2 (shown as h2 in Chrome developer tools Network tab).
When I use ordinary Chrome, though, it makes exactly the same call but uses HTTP/1.1 for some reason. I have tried turning off all extensions, but it is still HTTP/1.1.
Sometimes if I restart Chrome it starts using HTTP/2 again, but then falls back to HTTP/1.1 after a while or after one of the restarts. I cannot figure out what the problem can be, and it looks very mysterious.
I did not change any settings explicitly in Chrome if they even exist. What can be the problem here?
P.S. I did "clear browser cache" and "clear browser cookies" and suddenly I got only HTTP/2; I don't know how this can affect the protocol. This was a temporary improvement, though, and I got HTTP/1.1 again after a while.
P.P.S. I have even had a case where, when loading a website, one request was HTTP/1.1 but another was HTTP/2 to exactly the same server.
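The HTTP version is negotiated per TLS connection via ALPN, which is why two connections to the same server can end up on different protocols. A sketch of that negotiation using Python's ssl module (the hostname is a placeholder, and actually calling the function needs network access):

```python
import socket
import ssl

def negotiated_protocol(host, port=443):
    """Return the ALPN-negotiated protocol for a host: 'h2', 'http/1.1', or None."""
    ctx = ssl.create_default_context()
    ctx.set_alpn_protocols(["h2", "http/1.1"])  # offer h2 first, as browsers do
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.selected_alpn_protocol()

# e.g. negotiated_protocol("example.com") may return "h2" if the server supports HTTP/2
```

If the browser reuses an older HTTP/1.1 connection from its pool, new requests ride on it even though a fresh connection would negotiate h2, which matches the on-and-off behavior described.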

Local virtual hosts show Privacy Error on Chrome due to HSTS

I have created several virtual hosts for my development process. They were working just fine until yesterday, but today they stopped working in Chrome. Chrome shows: NET::ERR_CERT_AUTHORITY_INVALID
All my vhosts end with .dev. I changed one .dev to .work and it is working again. But I cannot do this for all vhosts, as there are too many of them. What do I do?
PS:
They are working fine in Firefox.
The error remains the same in Chrome incognito mode.
I tried clearing the cache and hard reloading, deleted my history and cache, and restarted Chrome and even Windows multiple times; nothing works.
One suggested solution was that an exception can be added in chrome://net-internals/#hsts. I tried deleting the domain there, but it somehow still appears in the Query Domain search.
Chrome has switched the .dev top-level domain to HTTPS only.
They have done this by turning on HSTS for this top-level domain, but by preloading it in the Chrome code rather than relying on the HSTS header being sent. This means it cannot be switched off in the chrome://net-internals/#hsts screen.
More info:
https://ma.ttias.be/chrome-force-dev-domains-https-via-preloaded-hsts/
So your only options are:
Update your vhosts to a different TLD (e.g. .test). And yes, this might be painful because you have so many.
Move to HTTPS by creating a certificate and updating your URLs. A self-signed certificate that you create yourself will do; however, note that HSTS not only blocks accessing the site over plaintext HTTP, but also prevents you from clicking through certificate errors. So you'll need to manually add any certificate to your trust store before it can be used.
The Chrome team has been pushing HTTPS more and more, and certain features are now HTTPS-only, so even dev environments will need it now. So maybe it's finally time to take the effort to make the switch.
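As a sketch of the second option, a self-signed certificate for a local vhost can be generated with OpenSSL (the domain name is a made-up example; -addext requires OpenSSL 1.1.1 or newer):

```shell
# Generate a private key and a self-signed certificate for a hypothetical local vhost.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout myapp.dev.key -out myapp.dev.crt -days 365 \
  -subj "/CN=myapp.dev" \
  -addext "subjectAltName=DNS:myapp.dev"
```

The certificate then has to be imported into your OS or browser trust store, since, as noted above, HSTS prevents clicking through the certificate warning.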

ERR_TOO_MANY_REDIRECTS error after using Cloudflare

After I started using Cloudflare, I started facing this error on my website (frequently in Chrome and Opera).
Some of the points I have found out after testing multiple times are:
So far I have faced this problem in Google Chrome, Chromium, and Opera, Chrome being the major one for my visitors.
The inner pages are working fine. For example, example.com/about-us works fine, but the error occurs when we visit example.com.
I once thought this was because of too many links on my home page, so I tried removing all the links on the home page, but the error continued.
If I type example.com/node instead of example.com, it works fine (example.com/node being the default home page of a Drupal website). But users generally type example.com, so that can't be the solution.
In the Cloudflare settings I have changed the SSL mode from "Flexible" to "Full (Strict)", as suggested in this answer. It's been more than 10 hours, but it hasn't helped so far.
I am using Godaddy's linux hosting. It's a Drupal Website.
Any clue is appreciated. Thanks in advance.
Since you are using SSL, the issue is most likely due to the fact that you have some redirects on your website (.htaccess) to the HTTPS version of the site. On top of that, Cloudflare has features in its Crypto section that allow you to set up redirects directly from there. Thus, if you have set such rules in both places, you will most likely get a redirect loop.
My advice would be for you to check the Crypto section of your CloudFlare account and make sure that
Automatic HTTPS Rewrites
and
HTTP Strict Transport Security
are disabled.
Then clear the Cloudflare cache and try accessing your website from a fresh browser.
This should do the trick; if not, you should provide us with your .htaccess content so that the case can be resolved.
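If the redirect does live in .htaccess, an alternative to disabling it is to make it loop-proof by redirecting only when the original client request was plain HTTP, as reported by the proxy's X-Forwarded-Proto header (a sketch to adapt, not the asker's actual rules):

```
RewriteEngine On
# Redirect to HTTPS only if the visitor-facing request was plain HTTP.
RewriteCond %{HTTP:X-Forwarded-Proto} =http
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

With Flexible SSL, Cloudflare contacts the origin over plain HTTP even for HTTPS visitors, so an unconditional origin-side HTTPS redirect loops forever; checking the forwarded protocol breaks that cycle.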

How does Firefox implement HSTS in detail?

I was doing some research on how Firefox and Chrome are implementing HSTS (HTTP Strict Transport Security) in detail.
Turns out that they have a predefined list of sites that already implement HSTS. This can be seen here and/or here.
And these lists seem to be linked into the source code itself, which somehow makes sense... but how do Firefox and Chrome handle my own HSTS headers? How and where do they store my URL, my max-age, and whether I set includeSubDomains or not?
I wasn't able to find this in about:config or likewise....
So maybe somebody knows more about this issue than me, I'm just curious (:
Thx!
See http://hg.mozilla.org/mozilla-central/file/20bbf73921f4/netwerk/protocol/http/nsHttpChannel.cpp#l1072 and then http://hg.mozilla.org/mozilla-central/file/20bbf73921f4/security/manager/boot/src/nsStrictTransportSecurityService.cpp#l249 which calls http://hg.mozilla.org/mozilla-central/file/20bbf73921f4/security/manager/boot/src/nsStrictTransportSecurityService.cpp#l147
So the data ends up stored in the permission manager, which is the normal place per-host information gets stored in Firefox. The permission manager stores its state in permissions.sqlite, I think.
Sites that want HTTP Strict Transport Security (HSTS) enforced send a header in the response: Strict-Transport-Security: max-age=31536000
max-age being the time until the entry expires. It is sent with every response, so the expiry window is refreshed each time the site is requested.
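A small sketch of how such a header value breaks down into its directives (the parser below is illustrative, not the browsers' actual code):

```python
def parse_hsts(value):
    """Parse a Strict-Transport-Security header value into
    (max_age_seconds, include_subdomains)."""
    max_age = None
    include_subdomains = False
    for directive in value.split(";"):
        directive = directive.strip()
        if directive.lower().startswith("max-age="):
            max_age = int(directive.split("=", 1)[1].strip('"'))
        elif directive.lower() == "includesubdomains":
            include_subdomains = True
    return max_age, include_subdomains

print(parse_hsts("max-age=31536000; includeSubDomains"))
# → (31536000, True)
```

These two pieces of information, plus the expiry time derived from max-age, are essentially what the browser persists per host.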
The browser (I have only tried Firefox) stores this data and will use it every time the site is accessed. This is true even for incognito mode: if you have ever accessed the site in non-incognito mode, the details of that site are saved and used even if you open it in incognito mode now.
For Firefox, this data is stored in a file called SiteSecurityServiceState.txt, which is in your Firefox profile folder. You can enter about:support in the browser and then select "Show in Folder" to open your profile folder, where you can locate this file.
I am not sure about the predefined sites, but the above is the file where normal sites' HSTS details are stored for Firefox.
More details - Understanding HTTP Strict Transport Security (HSTS)
PS: Above link goes to my personal blog that has more details on HSTS.