We are having a strange issue with CORS from our Chrome extension. We are using Manifest V3 and have the path set up correctly in host_permissions.
We know it's correct because when you load the extension for the first time, nothing breaks. But if you switch the extension off and then back on again, we get a CORS issue.
The backend is still receiving the requests, so I know it's not an 'allowance' issue. I'm not sure how this is happening but would welcome some help.
"host_permissions": [
"http://localhost:3000/*",
"https://*.ourdomain.com/*",
"https://maps.googleapis.com/maps/api/place/autocomplete/json"
],
Error
Access to fetch at 'https://api.ourdomain.com/api/v1/auto_login/' from origin 'chrome-extension://nlbdcdgjnplflacipfcamfcpogbfmbjl' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
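For reference, the call itself is a plain fetch from the extension (a minimal sketch; only the endpoint comes from the error above, and the POST method, body, and service-worker context are assumptions):
// Background service worker (sketch). The endpoint is taken from the CORS error above;
// the method and JSON body are assumptions for illustration only.
fetch('https://api.ourdomain.com/api/v1/auto_login/', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ token: 'example-token' }),
})
  .then((res) => res.json())
  .then((data) => console.log('auto_login response:', data))
  .catch((err) => console.error('Blocked by CORS or network error:', err));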
Many thanks
We fixed it. We have a Rails backend with the rack-cors gem.
The issue with this gem is that it doesn't accept any scheme other than http, https, or file, so an origin of chrome-extension://******** naturally failed. I still don't understand why it worked and then stopped working, and this potentially highlights an issue with the gem.
Removing the scheme, so that only our Chrome extension ID was listed as the allowed origin, allowed CORS to pass.
Related
All requests come in over HTTPS and are handled internally over HTTP. Everything works fine until it hits the redirect: return "redirect:/link/somePage.htm";
The moment the redirect is issued, Chrome complains about an insecure page with the message "The information you're about to submit is not secure." The same flow works fine on Firefox.
Of course, the redirect causes the link to change from HTTPS to HTTP in Chrome, whereas Firefox has no issue.
Has anyone encountered this recently on the browsers mentioned above, or has something underlying changed that would require deeper investigation?
Firefox - 81.0.2 & 83.0, Chrome - 87.0.4280.88
We searched for a solution all day long.
We found that we needed to add the following headers:
For an Apache server: RequestHeader set X-Forwarded-Proto https
For an NGINX server: proxy_set_header X-Forwarded-Proto $scheme;
Another possible solution on NGINX with a reverse proxy is to use the directive:
proxy_redirect http:// https://;
Source: https://community.bitnami.com/t/ssl-https-connection-implemented-but-getting-error-while-accessing-some-pages/89877/4
We had this problem, reported to me on Monday (12/14/2020). It was reproducible on Tuesday (12/15/2020). Today, Wednesday (12/16/2020), we do not get the warning from Chrome. No changes have been made to the site. However, accessing the site with Lighthouse does still report the problem. Did something happen to change the detection or reporting?
I am using the npm module Axios to make HTTP requests to a server from a webpage on my localhost. The server URL uses http, but when I check the request in the browser dev tools, I see the URL in the request is being changed to https, and the request fails. How can I prevent this redirect from happening? This is happening in Chrome and Safari.
Browsers will always follow HTTP redirects, so a library like Axios will not be able to prevent this from happening. There is an issue on their GitHub explaining that this can't be fixed by them.
If you are running in Node.js, then you can prevent this from happening by setting maxRedirects: 0 in the request config.
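A minimal Node.js sketch (the URL here is made up for illustration):
// Node.js only — in the browser the redirect is handled by the browser itself.
const axios = require('axios');

axios
  .get('http://localhost:3000/api/data', { maxRedirects: 0 })
  .then((res) => console.log(res.status, res.data))
  .catch((err) => {
    // With maxRedirects: 0, a 3xx response is rejected instead of being followed,
    // so the redirect target can be inspected here.
    if (err.response && err.response.status >= 300 && err.response.status < 400) {
      console.log('Would have redirected to:', err.response.headers.location);
    } else {
      throw err;
    }
  });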
I am currently doing some debugging on my website, which involves calling the Facebook API.
I've installed dnsmasq on my Mac OS X machine to redirect all requests to facebook.com to 127.0.0.1.
I have an echo server on port 80 on my laptop which prints out all the raw HTTP request headers.
Now comes my problem: when I access facebook.com, I realize Chrome automatically forwards http:// to https:// for facebook.com.
I googled and found a way to delete this HSTS entry. I visit chrome://net-internals#hsts and see something like this:
HSTS chrome image
After entering "facebook.com" under "Delete domain", I can still query "facebook.com" in the input box below.
I tried clearing all user data in Chrome, closing and reopening Chrome, and even using incognito mode.
Why is Chrome still redirecting all requests to facebook.com to https?
How can I disable this if chrome://net-internals#hsts is not reliable?
The text next to the Delete domain box on chrome://net-internals/#hsts clearly states that preloaded entries cannot be deleted. This feature request was closed as WontFix in the Chrome bug tracker.
facebook.com and quite a few of its subdomains are included in Chrome's preload list.
You could use another domain name for your tests.
Just make your API calls to facebook-api-test.com, map that domain to localhost, and proxy the calls.
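A rough Node sketch of that setup (facebook-api-test.com, the port, and the upstream host are assumptions, not part of the original answer): point the test domain at 127.0.0.1 via dnsmasq or /etc/hosts, then run a tiny proxy that relays each call to the real API host.
// Minimal forwarding proxy (sketch). Runs on the machine that facebook-api-test.com
// resolves to and relays every request to the assumed upstream host.
const http = require('http');
const https = require('https');

http.createServer((req, res) => {
  const upstream = https.request(
    {
      host: 'graph.facebook.com', // assumed upstream API host
      path: req.url,
      method: req.method,
      headers: { ...req.headers, host: 'graph.facebook.com' },
    },
    (upstreamRes) => {
      res.writeHead(upstreamRes.statusCode, upstreamRes.headers);
      upstreamRes.pipe(res);
    }
  );
  upstream.on('error', (err) => {
    res.writeHead(502);
    res.end(String(err));
  });
  req.pipe(upstream);
}).listen(80);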
I am using Google Chrome version 43.0.2357.81 on OS X and attempting to display a webpage within an iframe.
ie:
I followed this link with instructions to disable web security and found it helpful for displaying local files within iframes, but I am still encountering the same-origin error when trying to display disparate web pages.
Disable same origin policy in Chrome
I ran the command open -a Google\ Chrome --args --disable-web-security in terminal and received the banner message confirming that it worked:
You are using an unsupported command-line flag: --disable-web-security. Stability and security will suffer.
However, when I view my webpage in Chrome, I still get a same-origin error and am unable to view the site within the iframe.
This has nothing to do with Chrome itself; the server you call within the iframe sends back an HTTP header with the
X-Frame-Options SAMEORIGIN
setting. Even "chrome.exe --user-data-dir=c:\tmp\chrome2 --allow-file-access-from-files --disable-web-security" does not disable the iframe same origin check in Chrome. The only option you have is to switch the X-Frame-Options of your server to
X-Frame-Options ALLOWALL
(if you can).
I've been reading about Access-Control-Allow-Origin because it seems effective at allowing cross-domain requests, since I have access to the external site. My question is, how do I use Access-Control-Allow-Origin to allow cross-domain requests? I tried this (don't laugh; by the way, all I want is for a single number, 1 or 0, to be returned):
<html>
<head>
Access-Control-Allow-Origin: *
</head>
<body>
1
</body>
</html>
Am I close? Thanks for your help. If there is an easier way to do a simple cross-domain request let me know.
There are 3 ways to allow cross-domain requests (excluding JSONP):
Set the header in the page directly using a language like PHP. Keep in mind there can be no output (HTML) before your header call or it will fail.
Modify the server configuration file (apache.conf) and add these lines. Note that "*" means allow all. Some setups might also need the credentials header set; a client-side sketch of a credentialed request follows after this list. In general, allowing all origins is a security risk and should be avoided:
Header set Access-Control-Allow-Origin "*"
Header set Access-Control-Allow-Credentials true
To allow multiple domains on Apache web servers, add the following to your config file:
SetEnvIf Origin "http(s)?://(www\.)?(example\.org|example\.com)$" AccessControlAllowOrigin=$0
Header add Access-Control-Allow-Origin %{AccessControlAllowOrigin}e env=AccessControlAllowOrigin
Header set Access-Control-Allow-Credentials true
For development use only, you can hack your browser and allow unrestricted CORS using the Chrome Allow-Control-Allow-Origin extension.
Disable CORS in Chrome: quit Chrome completely, open a terminal, and execute the following. Just be aware that you are disabling web security:
open -a Google\ Chrome --args --disable-web-security --user-data-dir
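On the client side, a credentialed cross-origin request against a server configured as in option 2 looks roughly like this (a sketch; api.example.com is a placeholder). Note that browsers reject a wildcard Access-Control-Allow-Origin when credentials are included, which is why the multi-domain variant that echoes back a specific origin is needed:
// Credentialed cross-origin request (sketch; the URL is made up).
// The server must answer with a specific Access-Control-Allow-Origin (not "*")
// and Access-Control-Allow-Credentials: true for the response to be readable.
fetch('https://api.example.com/check', { credentials: 'include' })
  .then((res) => res.text())
  .then((flag) => console.log(flag)) // e.g. the single "1" or "0" the question asks for
  .catch((err) => console.error(err));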
That is an HTTP header. Ideally you would configure your web server or web app to send this header, perhaps in .htaccess or PHP.
Alternatively, you might be able to use:
<head>...<meta http-equiv="Access-Control-Allow-Origin" content="*">...</head>
I do not know if that would work. Not all HTTP headers can be configured directly in the HTML.
The meta approach works as an alternative for many HTTP headers, but see @EricLaw's comment below: this particular header is different.
Caveat
This answer is strictly about how to set headers. I do not know anything about allowing cross domain requests.
About HTTP Headers
Every request and response has headers. The browser sends this to the web server:
GET /index.htm HTTP/1.1
Then the headers
Host: www.example.com
User-Agent: (Browser/OS name and version information)
.. Additional headers indicating supported compression types and content types and other info
Then the server sends a response
Content-type: text/html
Content-length: (number of bytes in file (optional))
Date: (server clock)
Server: (Webserver name and version information)
Additional headers, for example Cache-Control, can also be configured; it all depends on your language (PHP, CGI, Java, htaccess) and web server (Apache, etc.):
<?php header("Access-Control-Allow-Origin: http://example.com"); ?>
This command only disables the first console warning.
If you are using npm and want to serve some files/folders with cross-origin resource sharing (CORS) allowed, try the following.
install the http-server:
npm install http-server -g
Go to your files/folders directory and run the command below to make them available at http://127.0.0.1:8080. This is a good choice when you want to keep the files in a safe place and control who can request them. The -c1 parameter disables caching, and the --cors flag enables cross-origin resource sharing (CORS), allowing the hosted files to be used by client-side JavaScript on a given domain:
http-server -c1 --cors .
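A page served from another origin can then load one of the hosted files with a plain fetch (a sketch; data.json is a made-up file name):
// Runs in the browser on a page served from a different origin.
// This only works because http-server was started with the --cors flag.
fetch('http://127.0.0.1:8080/data.json')
  .then((res) => res.json())
  .then((data) => console.log(data))
  .catch((err) => console.error('CORS or network error:', err));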
It worked for me in the WSL Ubuntu terminal on Windows 10, and it should work with the npm CLI on any OS.
Thank you.
If you use Java and Spring MVC, you just need to add the following annotation to the method that returns your page:
@CrossOrigin(origins = "*")
"*" allows your page to be accessible from anywhere.
See https://developer.mozilla.org/fr/docs/Web/HTTP/Headers/Access-Control-Allow-Origin for more details about that.