iFrame accessed with https is marked 'unencrypted' by Chrome ... why? - google-chrome

I am writing a basic web application that uses <iframe> elements for certain features.
The entire site uses https, including the links to and from the iframe, but when I view the website in Google Chrome, right-click on the displayed frame, and look at "View Frame Info" --> Connection, I see the following message:
Your connection to www.example.com is not encrypted
The connection uses TLS 1.1.
The connection is encrypted using AES_256_CBC, with SHA1 for message authentication and ECDHE_RSA as the key exchange mechanism.
Confusingly, it says that the connection is not encrypted ... and then it says that it IS encrypted.
I thought that whenever you use https://, the content is encrypted. Am I wrong? Or is Google Chrome referring to something else?

I ended up posting this issue to the Chromium bug discussion board and someone there indicated that it's a bug.

Related

Mixed-content warning from Chrome 87 when accessing HTTP image source from an HTTPS page

We have an in-house (.Net) application that runs on our corporate desktops. It runs a small web server listening for HTTP requests on a specific port on localhost. We have a separate HTTPS website that communicates with this application by setting the ImageUrl of a hidden image to the URL of the local application - this triggers an HTTP request to localhost, which the application picks up and acts on. For example, the site will set the URL of the image to:
http://127.0.0.1:5000/?command=dostuff
This was to work around any kind of "mixed content" messages from the site, as images seemed to be exempt from mixed-content rules. A bit of a hack but it worked well.
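For illustration, a minimal sketch of that hidden-image trick (the port and command parameter are the ones from the example URL above; the log messages and variable names are just illustrative):

// Setting an image's src fires a plain GET request, so pointing it at the
// local web server effectively sends a command over HTTP from the HTTPS page.
const img = new Image();
img.style.display = "none";                         // keep the image invisible
img.onload = () => console.log("local app handled the command");
img.onerror = () => console.log("local app unreachable or request blocked");
img.src = "http://127.0.0.1:5000/?command=dostuff"; // triggers the HTTP request
document.body.appendChild(img);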
I'd seen that Chrome was making moves towards completely blocking mixed content on pages, and sure enough Chrome 87 (currently on the beta channel) now shows these warnings in the Console:
Mixed Content: The page at 'https://oursite.company.com/' was loaded
over HTTPS, but requested an insecure element
'http://127.0.0.1:5000/?command=dostuff'. This request was
automatically upgraded to HTTPS, For more information see
https://blog.chromium.org/2019/10/no-more-mixed-messages-about-https.html
However, despite the warning saying the request is being automatically upgraded, it hasn't been - the application still gets a plain HTTP request and continues to work normally.
I can't find any clear guidance on whether this warning is a "soft fail", and whether future versions of Chrome will enforce the auto-upgrade to HTTPS (which would break things). We have plans to replace the application in the longer term, but I'd like to be ahead of anything that will suddenly stop the application from working before then.
Will using HTTP to localhost for images and other mixed content, as used in the scenario above, be an actual issue in the future?
This answer will focus on your main question: Will using HTTP to localhost for images and other mixed content, as used in the scenario above, be an actual issue in the future?
The answer is yes.
The blog post you linked to says:
Update (April 6, 2020): Mixed image autoupgrading was originally scheduled for Chrome 81, but will be delayed until at least Chrome 84. Check the Chrome Platform Status entry for the latest information about when mixed images will be autoupgraded and blocked if they fail to load over https://.
That status entry says:
In developer trial (Behind a flag) (tracking bug) in:
Chrome for desktop release 86
Chrome for Android release 86
Android WebView release 86
…
Last updated on 2020-11-03
So this feature has been delayed, but it is coming.
Going through your question and all the comments, and putting myself in your shoes, I would do the following:
Don't touch either the currently working .Net app/localhost server (HTTP) or the user-facing (HTTPS) front-end.
Write a simple/cheap cloud function (GCP Cloud Function or AWS Lambda) to completely abstract your .Net app away from the front-end. Your current HTTPS app would only call the cloud function (HTTPS to HTTPS - no more praying that Google won't shut down mixed traffic, which will happen eventually, although nobody knows when).
The cloud function would simply temporarily copy the image/data coming from the (insecure) .Net app to cloud storage and then serve it straight away through HTTPS to your client-side.
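A rough sketch of what such a proxy function could look like (Node-style HTTP cloud function; the handler name and upstream URL are made up, and it streams the response straight back instead of going through cloud storage, purely to keep the sketch short):

import * as http from "http";

// Hypothetical HTTPS-fronted cloud function that fetches a resource from an
// insecure HTTP source and relays it to the caller, so the front-end only
// ever talks HTTPS.
export function proxyImage(req: any, res: any): void {
  http.get("http://internal-host.example.com/image.png", (upstream) => {
    res.setHeader("Content-Type", upstream.headers["content-type"] ?? "application/octet-stream");
    upstream.pipe(res);                // relay the bytes back over HTTPS
  }).on("error", () => {
    res.statusCode = 502;              // upstream fetch failed
    res.end("could not reach the HTTP source");
  });
}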

Can a Chrome extension get information about a connection?

I want my Chrome extension to collect information about how the current browser tab negotiated its secure connection.
In particular, I want to know the protocol and the cipher/auth and key exchange mechanisms used in the HTTPS connection: SSL 3? TLS 1.2? And those ugly strings like AES_128_GCM or CHACHA20_POLY1305, ECDHE_RSA or ECDHE_ECDSA.
Is this even possible?
Within the API index, the best fitting module seems to be chrome.webRequest. But I can't see any means to gather connection data. Am I missing something?
You cannot get any information about the TLS connection via the Chrome extension APIs. A few days ago, a popular feature request on Chromium's issue tracker was marked as WontFix because of the complexity of implementing such a feature in Chrome (Issue 107793: Provide information about the TLS connections to extensions via the webRequest API).
The only way to get the certificate information in Chrome is by clicking on the lock icon, then the Connection tab.
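For illustration, this is roughly what webRequest does give you: per-request metadata, with no field describing the TLS handshake (a sketch of extension background code; it assumes the "webRequest" permission and matching host permissions in the manifest):

// Logs request-level details for completed requests. Note there is nothing
// here about the protocol version, cipher suite, or key exchange.
chrome.webRequest.onCompleted.addListener(
  (details) => {
    console.log(details.url, details.statusCode, details.ip, details.fromCache);
  },
  { urls: ["<all_urls>"] },
  ["responseHeaders"]
);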

Chrome extension losing requested permissions after browser restart

I developed a Chrome extension which communicates with IP phones.
The communication is done in an event page which sends POST requests via the XMLHttpRequest object.
Because the hostname or IP address of the phone is configured in the options page, I added optional_permissions to the manifest file and request them from the user with chrome.permissions.request after the options are saved.
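Roughly, the setup looks like this (a sketch; the manifest excerpt is shown as a comment, and the origin pattern and element id are made up):

// manifest.json (excerpt): "optional_permissions": ["http://*/*"]
// Options page script: request the origin only from a user gesture,
// e.g. the click on the Save button.
document.getElementById("save")?.addEventListener("click", () => {
  const phoneOrigin = "http://192.168.0.10/*"; // built from the saved option
  chrome.permissions.request({ origins: [phoneOrigin] }, (granted) => {
    console.log(granted ? "permission granted" : "permission refused");
  });
});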
Cross-origin XHR now works without any problems until I restart Chrome...
After restarting Chrome it seems like the requested permission is lost and I get the typical
"is not allowed by Access-Control-Allow-Origin" error.
When I click on the extensions permissions I can also see that my requested permission is no longer listed.
Because chrome.permissions.request only works in response to a user gesture, I can't request the permission during the load of my extension or on the fly. If I request it again in my options page, I'm not asked again whether I want to allow it or not, but the permission is granted and everything works again as usual.
Is there a way to make this permission persist after requesting it? I only want the extension to have access to the endpoints it needs.
Thank you very much.
For me the following reported issue answered my question:
Issue 158004: chrome.permissions.request support for user-supplied URL.
To make it clear: It is not possible to request a subset of the permissions defined in optional_permissions. If you define http://*/* then you need to request exactly this string! A subset like http://example.org/* won't work!
Here is a quote from a comment in the issue description which makes that clear:
"There's no wildcard handling, just plain string comparison between the URLPatterns"
The Issue has been fixed in Revision 182287
The only thing left is to cross your fingers that this fix gets included in an upcoming Chrome release soon. We'll have to use the bloody Access your data on all websites permission in the meantime.

How does Firefox implement HSTS in detail?

I was doing some research on how Firefox and Chrome are implementing HSTS (HTTP Strict Transport Security) in detail.
It turns out that they have a predefined list of sites that already implement HSTS. This can be seen here, here, and/or here.
And these lists seem to be somehow linked to the source code itself, which sort of makes sense... but how do Firefox and Chrome handle my own HSTS headers? How and where do they store my URL, my max-age, and whether I set includeSubDomains or not?
I wasn't able to find this in about:config or anywhere similar...
So maybe somebody knows more about this issue than me, I'm just curious (:
Thx!
See http://hg.mozilla.org/mozilla-central/file/20bbf73921f4/netwerk/protocol/http/nsHttpChannel.cpp#l1072 and then http://hg.mozilla.org/mozilla-central/file/20bbf73921f4/security/manager/boot/src/nsStrictTransportSecurityService.cpp#l249 which calls http://hg.mozilla.org/mozilla-central/file/20bbf73921f4/security/manager/boot/src/nsStrictTransportSecurityService.cpp#l147
So the data ends up stored in the permission manager, which is the normal place per-host information gets stored in Firefox. The permission manager stores its state in permissions.sqlite, I think.
Sites that want HTTP Strict Transport Security (HSTS) enforced send a header in response - Strict-Transport-Security: max-age=31536000
max-age is the time in seconds until the entry expires. The header is sent with each response, so the expiry is pushed back by that amount every time the site is requested.
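For illustration, a minimal sketch of how a server could add the header (Node-style; the function name is made up, and the includeSubDomains directive is optional):

import { ServerResponse } from "http";

// Adds the HSTS header to a response served over HTTPS.
// max-age is in seconds (31536000 = one year).
function addHstsHeader(res: ServerResponse): void {
  res.setHeader("Strict-Transport-Security", "max-age=31536000; includeSubDomains");
}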
The browser (I have tried only Firefox) stores this data and will use it every time the site is accessed. This is true even for incognito mode: if you have ever accessed the site in a normal window, its details are saved and used even if you try to open it now in incognito mode.
For Firefox, this data is stored in a file called SiteSecurityServiceState.txt, which is in your Firefox profile folder. You can enter about:support in the browser and then select "Show in folder" to open your profile folder, where you can locate this file.
I am not sure about the predefined sites, but the above is the file where normal-site HSTS details are stored for Firefox.
More details - Understanding HTTP Strict Transport Security (HSTS)
PS: Above link goes to my personal blog that has more details on HSTS.

How to preserve SSL with HTML5 application cache

I have an existing site that works fine over http and https (SSL). The SSL certificate is valid and can be confirmed by inspecting in the browser.
I am starting to use a manifest file to enable the HTML5 application cache on my website. This is useful for making the page load faster, and eventually for offline capabilities. This is working great when using a regular http connection. The problem happens when accessing the site over https (SSL). When I do this, I can access my website's content just fine, and the URL says "https", however I see the following behavior:
Safari: It displays the lock icon, but when I click the lock icon to inspect the certificate, it says that the certificate is invalid.
Firefox: Does not display the colored address bar indicating encryption, and when inspecting the certificate, it says that there is no certificate.
Chrome and Opera: Correctly displays the secure nature of the URL, and when clicking the lock icon it displays the SSL certificate information. Yes!
I understand that using the application cache causes resources to be served locally from the browser, so no encryption is happening for those resources. However, customers don't necessarily know that an application cache is at work in the background, and they expect to see a valid SSL certificate and an indication that the connection is secure. Safari and Firefox appear to be handling this incorrectly, unless I am missing something. That is my question: does anyone know how to get Safari and Firefox to display the SSL certificate for pages served from the application cache? Is there something special that you need to do, or is it a Safari and Firefox bug?
I believe someone has discussed this with me before. Please let me know if this helps.
Change all of your script and CSS references from http:// or https:// to //.
If you don't have any then it is moot, but if you do, please let me know if that has an effect.
I believe this may be related to not being able to verify the references from a cached page.
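For example (with example.com as a placeholder host), a stylesheet reference pointing at https://example.com/site.css would become //example.com/site.css, so the browser fetches it over whatever scheme the page itself was loaded with.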
Based on the history of vulnerabilities, I'd guess this may have been overlooked for the sake of fixing more critical issues. That said, I think this should be reported to both vendors now that some of the glaring vulnerabilities have been patched. Have you tested this with the latest releases of Firefox and Safari?
Did you serve the application manifest over SSL?