I am using SharedArrayBuffer for some functionality in one of my web apps. On Chrome version 92 I have enabled cross-origin isolation as per the instructions here and added the following headers to the root page and wasm files:
Cross-Origin-Resource-Policy: cross-origin
Cross-Origin-Embedder-Policy: require-corp
Cross-Origin-Opener-Policy: same-origin
After that, the wasm files loaded successfully.
But now I am not able to add reCAPTCHA v2. The first request, which fetches the main script from https://www.google.com/recaptcha/api.js..., succeeds, but the subsequent iframe load of the https://www.google.com/recaptcha/api2/anchor... URL is blocked by Chrome (reason: "This resource needs a Cross-Origin-Resource-Policy: same-site/cross-origin header").
One way to avoid this issue for a while might be to get an Origin Trial token from Chrome that allows continued usage of SharedArrayBuffer. Ref https://developer.chrome.com/blog/enabling-shared-array-buffer/#origin-trial. But this might not be a scalable thing to do, as I have several origins to take care of.
Is there any other way to use reCAPTCHA with cross-origin isolation (COEP headers)?
SharedArrayBuffer is currently supported in Firefox 79+, and will arrive in Android Chrome 88.
However, it's only available to pages that are cross-origin isolated.
SharedArrayBuffer is currently available in Desktop Chrome, but from Chrome 92 it will be limited to cross-origin isolated pages. If you don't think you can make this change in time, you can register for an origin trial to retain the current behavior until at least Chrome 96.
If you intend to enable cross-origin isolation to continue using SharedArrayBuffer, evaluate the impact this will have on other cross-origin elements on your website, such as ad placements. Also check whether SharedArrayBuffer is used by any of your third-party resources to understand the impact.
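As a quick runtime check (a sketch; self.crossOriginIsolated is the standard global for this), you can verify whether isolation actually took effect:

// True only when the COOP/COEP headers below have taken effect,
// i.e. when SharedArrayBuffer may safely be constructed.
if (self.crossOriginIsolated) {
  const sab = new SharedArrayBuffer(1024);
  // ... hand sab to a worker ...
} else {
  console.warn('Page is not cross-origin isolated');
}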
You can make a page cross-origin isolated by serving the page with these headers:
Cross-Origin-Embedder-Policy: require-corp
Cross-Origin-Opener-Policy: same-origin
Once you do this, your page will not be able to load cross-origin content unless the resource explicitly allows it via a Cross-Origin-Resource-Policy header or CORS headers (Access-Control-Allow-* and so forth).
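As an illustration only (assuming a Node.js/Express server, which is not part of the question), the headers could be served like this:

const express = require('express');
const app = express();

// Make every page cross-origin isolated.
app.use((req, res, next) => {
  res.set('Cross-Origin-Opener-Policy', 'same-origin');
  res.set('Cross-Origin-Embedder-Policy', 'require-corp');
  next();
});

// Resources meant to be embeddable from other origins (e.g. the
// wasm files) additionally opt in via CORP.
app.use('/wasm', (req, res, next) => {
  res.set('Cross-Origin-Resource-Policy', 'cross-origin');
  next();
}, express.static('wasm'));

app.listen(3000);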
There's also a reporting API, so you can gather data on requests that failed as a result of Cross-Origin-Embedder-Policy and Cross-Origin-Opener-Policy.
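For example, continuing the hypothetical Express sketch above (the reporting endpoint URL is a placeholder), the report-only variant lets you observe violations without enforcing anything:

// Report COEP violations to a named endpoint instead of enforcing.
app.use((req, res, next) => {
  res.set('Report-To', JSON.stringify({
    group: 'coep',
    max_age: 86400,
    endpoints: [{ url: 'https://example.com/coep-reports' }],
  }));
  res.set('Cross-Origin-Embedder-Policy-Report-Only',
          'require-corp; report-to="coep"');
  next();
});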
If you don't think you can make these changes in time for Chrome 92, you can register for an origin trial to retain current Desktop Chrome behavior until at least Chrome 96.
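Delivering the trial token is then a one-line response header (a sketch, again assuming the Express setup; the token value is whatever the origin trial registration issues for your origin):

// Origin trial tokens are delivered via the Origin-Trial response
// header (a <meta http-equiv="origin-trial"> tag also works).
app.use((req, res, next) => {
  res.set('Origin-Trial', 'TOKEN_FROM_YOUR_REGISTRATION');
  next();
});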
Related
I am making an ASP.NET application and would like my file upload to accept attachments dragged straight from Gmail. The problem is that I cannot load the data from the dragged-in links because of cross-origin rules.
There are 2 problems:
First, cross-origin rules prevent me from making requests to the Gmail attachment server.
Second, even if I could make the request cross-origin, the cookies would not be included.
I am using Chrome and only interested in doing this on my own computers.
One option is that I could make a Chrome extension which allows cross-origin requests, but only from my website (a rough manifest sketch follows below).
Another option would be for my locally hosted server to communicate with Chrome to make the request itself.
Which of these would be the best option and how would I do it?
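For the extension route, a minimal manifest sketch might look like this (the host patterns are illustrative guesses, not verified Gmail attachment hosts):

{
  "name": "Gmail attachment bridge",
  "version": "1.0",
  "manifest_version": 2,
  "permissions": [
    "https://mail.google.com/*",
    "https://*.googleusercontent.com/*"
  ],
  "background": { "scripts": ["background.js"] }
}

With those host permissions, requests made from the extension's background page go through the browser's normal cookie jar, which should also address the second problem.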
We have an in-house (.Net) application that runs on our corporate desktops. It runs a small web server listening for HTTP requests on a specific port on localhost. We have a separate HTTPS website that communicates with this application by setting the ImageUrl of a hidden image to a URL on that localhost port - this invokes an HTTP request to localhost, which the application picks up on and actions. For example, the site will set the URL of the image to:
http://127.0.0.1:5000/?command=dostuff
This was to work around any kind of "mixed content" messages from the site, as images seemed to be exempt from mixed-content rules. A bit of a hack but it worked well.
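The client-side half of that hack is essentially this (an illustrative sketch, not the actual ASP.NET markup):

// Setting an image src fires a GET to the local app; the request
// itself is the point - the "image" response is never rendered.
var img = new Image();
img.src = 'http://127.0.0.1:5000/?command=dostuff';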
I'd seen that Chrome was making moves towards completely blocking mixed content on pages, and sure enough Chrome 87 (currently on the beta channel) now shows these warnings in the Console:
Mixed Content: The page at 'https://oursite.company.com/' was loaded
over HTTPS, but requested an insecure element
'http://127.0.0.1:5000/?command=dostuff'. This request was
automatically upgraded to HTTPS, For more information see
https://blog.chromium.org/2019/10/no-more-mixed-messages-about-https.html
However, despite the warning saying the request is being automatically upgraded, it hasn't been - the application still gets a plain HTTP request and continues to work normally.
I can't find any clear guidance on whether this warning is a "soft fail", and whether future versions of Chrome will enforce the auto-upgrade to HTTPS (which would break things). We have plans to replace the application in the longer term, but I'd like to be ahead of anything that will suddenly stop the application from working before then.
Will using HTTP to localhost for images and other mixed content, as used in the scenario above, be an actual issue in the future?
This answer will focus on your main question: Will using HTTP to localhost for images and other mixed content, as used in the scenario above, be an actual issue in the future?
The answer is yes.
The blog post you linked to says:
Update (April 6, 2020): Mixed image autoupgrading was originally scheduled for Chrome 81, but will be delayed until at least Chrome 84. Check the Chrome Platform Status entry for the latest information about when mixed images will be autoupgraded and blocked if they fail to load over https://.
That status entry says:
In developer trial (Behind a flag) (tracking bug) in:
Chrome for desktop release 86
Chrome for Android release 86
Android WebView release 86
…
Last updated on 2020-11-03
So this feature has been delayed, but it is coming.
Going through your question and all comments - and putting myself in your shoes, I would do the following:
Not mess with either the currently working .Net app/localhost server (HTTP) or the user-facing (HTTPS) front-end.
Write a simple/cheap cloud function (GCP Cloud Function or AWS Lambda) to completely abstract your .Net app away from the front-end. Your current HTTPS app would only call the cloud function (HTTPS to HTTPS - no more praying that Google will not shut down mixed traffic, which will happen eventually, although nobody knows when).
The cloud function would simply temporarily copy the image/data coming from the (insecure) .Net app to cloud storage and then serve it straight away through HTTPS to your client-side.
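A very rough sketch of such a function, assuming a Node.js runtime and assuming the .Net app is reachable from the function's network (the upstream host, port, command, and function name are all placeholders, since the original localhost setup would not be directly reachable from the cloud):

// Hypothetical HTTPS facade for the HTTP-only app. Deployable as a
// GCP Cloud Function (Express-style req/res) behind an HTTP trigger.
const http = require('http');

exports.proxyCommand = (req, res) => {
  // Whitelist commands rather than forwarding arbitrary input.
  if (req.query.command !== 'dostuff') {
    res.status(400).send('Unknown command');
    return;
  }
  http.get('http://internal-app.example:5000/?command=dostuff', (upstream) => {
    let body = '';
    upstream.on('data', (chunk) => { body += chunk; });
    upstream.on('end', () => res.status(upstream.statusCode).send(body));
  }).on('error', () => res.status(502).send('Upstream unreachable'));
};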
My web client application sends HTTP POST requests via the Fetch API.
I can see the OPTIONS preflight requests in a debugging proxy (Charles Proxy), but they are not displayed in the Network tab of Google Chrome Developer Tools.
I don't have any filters setup on the network tab. I remember OPTIONS requests being visible there, but not anymore. How do I bring them back?
You'll need to go to: chrome://flags/#out-of-blink-cors, disable the flag, and restart Chrome.
This is an expected behavior change according to:
https://bugs.chromium.org/p/chromium/issues/detail?id=995740#c1
I originally came across this via:
https://support.google.com/chrome/thread/11089651?hl=en
As of 2021, in Chrome the OPTIONS request is visible in the Network tab under the Other request filter.
To see it together with XHR requests, Ctrl+click and pick the request filters you want to see.
UPDATE (April 17): Chrome version 90.0.4430.72 has made the OPTIONS requests hidden again :(
Chrome 81 does not seem to display anything even after changing the option and restarting on my computer.
As an alternative solution, I started to use Firefox and its Network tab for development.
https://getfirefox.com
I'm Takashi from the Chromium Project, and I drove the Out-Of-Blink/Renderer CORS project.
The project introduced a process-isolated CORS implementation for better security and privacy, and many new network-related features rely on this new implementation. Unfortunately, we temporarily disabled preflight support in DevTools, as it turned out that continuing to support it weakened security and privacy. Sorry for the inconvenience during this period.
The good news is that Chrome 83 implements CORS preflight support in DevTools again, in a way that preserves security. So you can monitor CORS preflight requests as you could before Out-Of-Blink/Renderer CORS.
Best,
Looks like push notifications are finally usable for web apps! Unfortunately, this requires HTTPS for the ServiceWorker, which not all sites may have.
One thing I noticed in the spec is that it mentions:
if r's url's scheme is not one of "http" and "https", then:
Throw a TypeError.
So I'm confused - can the site be http, as long as it includes a serviceworker that is from https? For example, mydomain.com could include an https serviceworker from https://anotherdomain.com?
Another standard, web-api simple-push, doesn't mention requiring HTTPS (likely an omission in the documentation?), and says "The user experience on Firefox Desktop has not been drawn out yet". Is the documentation on this outdated, or is push really only supported in Firefox OS?
Simple-push, which is the current push solution in Firefox OS, doesn't have anything to do with ServiceWorkers.
The next generation of push, implemented by both Google and Mozilla, will be done through ServiceWorkers:
Push API spec
In that case yes, your content will need to be served over HTTPS.
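For reference, a minimal sketch of that ServiceWorker-based flow (the worker file name is a placeholder, and error handling is omitted):

// Must run on a page served over HTTPS (or localhost in development).
navigator.serviceWorker.register('/sw.js').then(function (registration) {
  // Newer browsers also require an applicationServerKey (VAPID) here.
  return registration.pushManager.subscribe({ userVisibleOnly: true });
}).then(function (subscription) {
  // Send the subscription (endpoint, etc.) to your server; it uses
  // this to deliver push messages later.
  console.log(JSON.stringify(subscription));
});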
You will probably be interested in the Let's Encrypt initiative:
letsencrypt.org
It is a new certificate authority that will help developers transition their content to HTTPS.
Also, just for development purposes, both Google's and Mozilla's implementations of ServiceWorkers allow you to bypass the secure-content check if you develop against localhost.
In the case of Mozilla you will need to enable the flag:
devtools.serviceWorkers.testing.enabled: true
But again, this is just for development. AFAIK, Mozilla's push implementation has landed or is about to land and will be available in the Nightly builds; you can follow the work here:
https://bugzilla.mozilla.org/show_bug.cgi?id=1038811
No, the new generation of push notifications (i.e. Push API) requires HTTPS.
If you need to add push notifications to a website without HTTPS you can use a third-party service like Pushpad (I am the founder) that delivers notifications on your behalf.
The text you cited from the spec is from the Cache.addAll() section (5.4).
Here's the summary of addAll() on MDN:
The addAll() method of the Cache interface takes an array of URLs, retrieves them, and adds the resulting response objects to the given cache. The request objects created during retrieval become keys to the stored response operations.
Service workers can request & cache URLs that are either HTTP or HTTPS, but a service worker itself only works within its registered scope (which must be HTTPS).
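For context, a typical addAll() call runs in the worker's install handler (file names here are just examples):

// Inside the service worker (itself registered from an HTTPS page):
self.addEventListener('install', function (event) {
  event.waitUntil(
    caches.open('v1').then(function (cache) {
      // addAll fetches each URL and stores request/response pairs.
      return cache.addAll(['/index.html', '/app.js', '/styles.css']);
    })
  );
});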
simple-push is not related to Service Workers; it seems comparable to the approaches other platforms have taken:
Apple Push Notifications
Google Cloud Messaging
I found a nice workaround to allow notifications from websites and domains without SSL, i.e. http:// rather than https://, in Firefox.
Firefox keeps a file inside the Mozilla profile directory called permissions.sqlite, an SQLite database that holds the permissions for domains. You can add your domain there (http://yourdomainname) with the notification permission and it will work.
I have created a demonstration for Windows, written in Go, here: https://gist.github.com/caviv/8df5fa11a98e0e33557f75215f691d54
The users of our website run our Chrome extension which, amongst other things, performs cross-origin requests via XMLHttpRequest, as described on the Chrome extension development pages. This had been running just fine for a few years. However, ever since our users upgraded to the latest version of Chrome (v38), these requests have failed. Our site runs on HTTPS and some of the URLs loaded via our content script are on HTTP. The message is:
[blocked] The page at 'https://www.ourpage.com/' was loaded over
HTTPS, but ran insecure content from 'http://www.externalpage.com':
this content should also be loaded over HTTPS.
The reported line where the error occurred is in the content script where I'm issuing the HTTP call:
xhr.send(null);
I have no control over the external page and I would rather not remove SSL from our own page. Question: Is this a bug or is there a workaround that I am not aware of?
(Note: The permissions in the manifest were always set to <all_urls> which had worked for a long time. Setting it to http://*/ and https://*/ did not help.)
If possible, use the HTTPS version of that external page.
If that is not possible, use the background page to handle the AJAX request (example).
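A sketch of that second approach, using the standard messaging pattern (the message shape and URL are illustrative):

// Content script: delegate the HTTP request to the background page.
chrome.runtime.sendMessage(
  { action: 'fetchUrl', url: 'http://www.externalpage.com/data' },
  function (response) {
    console.log(response.text);
  }
);

// Background page: requests made here are not subject to the
// page's mixed-content blocking.
chrome.runtime.onMessage.addListener(function (msg, sender, sendResponse) {
  if (msg.action === 'fetchUrl') {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', msg.url);
    xhr.onload = function () { sendResponse({ text: xhr.responseText }); };
    xhr.send(null);
    return true; // keep the channel open for the async sendResponse
  }
});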