I have a gulp task that runs BrowserSync.
var options = {
    proxy: 'localhost:9000/html',
    port: 3000,
    files: [
        config.root + config.srcPaths.htmlBundle,
        config.htmlRoot + 'main.css',
        '!' + config.htmlRoot + '**/*.scss'
    ],
    injectChanges: false,
    logFileChanges: true,
    logPrefix: 'browserSync ->',
    notify: true,
    reloadDelay: 1000
};
browserSync(options);
BrowserSync detects changes and tries to inject them, but Chrome blocks the websocket connection with this error:
Refused to connect to
'ws://localhost:3000/browser-sync/socket.io/?EIO=3&transport=websocket&sid=gOQQPSAc3RBJD2onAAAA'
because it violates the following Content Security Policy directive:
"default-src 'self'". Note that 'connect-src' was not explicitly set,
so 'default-src' is used as a fallback.
Uncaught
SecurityError: Failed to construct 'WebSocket': Refused to connect to
'ws://localhost:3000/browser-sync/socket.io/?EIO=3&transport=websocket&sid=gOQQPSAc3RBJD2onAAAA'
because it violates the document's Content Security Policy.
How can I overcome this issue? Can I turn off the Content Security Policy?
Alternatively, you can add rules to the Content Security Policy in your main HTML file (e.g. index.html) so it accepts websocket connections from browser-sync. You can do this by adding ws://localhost:* to your default-src, for example like this:
<meta http-equiv="Content-Security-Policy"
      content="default-src 'self' ws://localhost:*">
You can also specify the exact browser-sync port, like this:
<meta http-equiv="Content-Security-Policy"
      content="default-src 'self' ws://localhost:3000">
Just remember to remove this from the policy before publishing to production servers!
Not sure if it's the best solution, but what I ended up doing was installing a Chrome extension that disables the CSP:
https://chrome.google.com/webstore/detail/disable-content-security/ieelmcmcagommplceebfedjlakkhpden
If anyone has a better solution, I'll be glad to hear it.
If the CSP is set in an HTML meta tag, then a slightly less ugly solution is to have browser-sync disable it itself. Adding something like this to the browser-sync config should do the trick:
rewriteRules: [
    {
        match: /Content-Security-Policy/,
        fn: function (match) {
            return 'DISABLED-Content-Security-Policy';
        }
    }
],
If you're really smart, you could instead inject the correct CSP rules that permit browser-sync to do its work. Perhaps one diligent soul will end up writing a plugin to do just this?
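As a hedged sketch of that idea (the regex, the assumption that the policy uses default-src, and the ws://localhost:3000 origin are all assumptions; adjust them to your actual policy and browser-sync port), a rewrite rule could append the websocket origin to the policy instead of disabling it:

```javascript
// Sketch: rewrite the page's CSP so default-src also allows
// browser-sync's websocket, rather than disabling the policy entirely.
// The port (3000) and the default-src-based policy are assumptions.
var bsCspRules = [
    {
        match: /default-src/,
        fn: function (match) {
            // splice the websocket origin into the source list
            return match + ' ws://localhost:3000';
        }
    }
];

// e.g. browserSync({ rewriteRules: bsCspRules, /* ... */ });
```

Applied to a policy like `default-src 'self'`, this yields `default-src ws://localhost:3000 'self'`, which lets the socket connect while keeping the rest of the policy intact.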
I have been using a Content-Security-Policy-Report-Only header for several weeks and have been seeing violations for multiple domains that are allowlisted in that same CSP header. I placed all the domains I want to allowlist in default-src and have no other directives (apart from a style-src for a nonce). Other traffic seems to pass, and when I test the URL in my own browser it succeeds without any violations. Looking into possible reasons led me to posts such as Content-Security-Policy Blocking Whitelisted Domains, Why is script-src-elem not using values from script-src as a fallback?, and https://csplite.com/csp277/
These links say that a status code of 0 or empty indicates the request was blocked while the browser was trying to load the resource, which can happen due to ad blockers. I was also seeing these status code 0 violation reports and have filtered out status code 0/empty. But even after that change I still see a few reports that violate on allowlisted domains with status code 200. Could these also be due to ad blockers?
I did notice that some violations caused by extensions list chrome-extension in source-file, so I'm unsure whether these status code 200 reports could also be due to extensions.
One thing I did notice was that if a report had a blocked-uri that was not in the default-src allowlist, the violated directive would be frame-src (which falls back to default-src, where the domain is absent, so a violation is expected). But for a report with a blocked-uri that was in the allowlist, the violated directive and effective directive would be img-src (which should also fall back to default-src, but perhaps the allowlisted domain is not being seen there).
Example Report
{
    "line-number": "",
    "request": "",
    "document-uri": "mysite.com",
    "original-policy": "default-src 'self' mysite.com *.redirectsite.com redirectsite.com; style-src 'nonce-d93e18cc'; report-uri /csp-reports",
    "violated-directive": "img-src",
    "status-code": "200",
    "referrer": "",
    "script-sample": "",
    "effective-directive": "img-src",
    "column-number": "",
    "request-headers": "",
    "blocked-uri": "https://x.redirectsite.com/s........",
    "source-file": ""
}
Does anyone have any experience with this?
I ended up switching from the Content-Security-Policy-Report-Only header to the Content-Security-Policy header, even though I was seeing these status code 200 violations. After switching, I still saw status code 0 violations, but the status code 200 ones disappeared. Perhaps it is a bug in how browsers support the Content-Security-Policy-Report-Only header, but it ended up working for this use case. Hope this helps someone else.
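For reference, the only difference between the two delivery modes is the header name; a minimal sketch of toggling between them (the policy string here is a made-up example, and the Express usage in the comment is an assumption about your stack):

```javascript
// Build the CSP header name/value pair, toggling between the enforcing
// and report-only delivery modes. The policy string is an example.
function cspHeader(enforce) {
    var policy = "default-src 'self' mysite.com; report-uri /csp-reports";
    var name = enforce
        ? 'Content-Security-Policy'              // blocks and reports
        : 'Content-Security-Policy-Report-Only'; // reports only, blocks nothing
    return [name, policy];
}

// e.g. in an Express handler: res.setHeader.apply(res, cspHeader(true));
```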
I followed the README instructions from vfat.tools (https://github.com/vfat-tools/vfat-tools), i.e. ran npm install and finally npm run dev. I see the following on the console:
[Browsersync] Access URLs:
Local: http://localhost:3000
External: http://192.168.0.197:3000
UI: http://localhost:3001
UI External: http://localhost:3001
[Browsersync] Serving files from: dist
[Browsersync] Watching files...
However, when I open localhost:3000 to access the UI, I see the following error in Chrome's console:
Refused to execute inline script because it violates the following Content Security Policy directive: "default-src 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-ACotEtBlkqjCUAsddlA/3p2h7Q0iHuDXxk577uNsXwA='), or a nonce ('nonce-...') is required to enable inline execution. Note also that 'script-src' was not explicitly set, so 'default-src' is used as a fallback.
Options to solve this problem include adding 'unsafe-inline' somewhere in the code (for example -> Script causes "Refused to execute inline script: Either the 'unsafe-inline' keyword, a hash… or a nonce is required to enable inline execution"), but I have the impression this is not good practice.
How can I get the webpage to load properly?
There are three kinds of inline script: <script>...</script> blocks, javascript: URLs such as <a href='javascript:void(0)'>, and inline event handlers such as <a onclick='eventHandler()'>.
The first kind can be allowed with a 'nonce-value' or 'hash-value'; the last two require 'unsafe-inline' (or refactoring the code).
Therefore, to get rid of this error you need to know which kind of inline script caused the violation.
As far as I can see, vfat.tools uses document.getElementById("theme-button").onclick = setTheme (and something similar may be used elsewhere). Therefore you have to use 'unsafe-inline', or rewrite that part of the code to bind the event listener with addEventListener().
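A small sketch of that refactoring (bindThemeButton is a hypothetical helper; the "theme-button" id and setTheme come from the question; the document object is passed in only to keep the sketch testable): bind the listener from an external script, which default-src 'self' already allows, instead of using an inline onclick attribute:

```javascript
// Hypothetical helper: wire the click handler from a script file
// instead of an inline onclick="..." attribute, so the page needs
// no 'unsafe-inline' in its CSP.
function bindThemeButton(doc, setTheme) {
    doc.getElementById('theme-button')
       .addEventListener('click', setTheme);
}

// In the page's external JS file:
//   bindThemeButton(document, setTheme);
```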
I am doing a project developing a website. I chose Django for my backend and uploaded my static files to an Amazon S3 bucket. All my CSS files, images, and other static files load, except the icons from Font Awesome. I tried using their CDN, yet with no result.
<link href="{% static 'vendor/fontawesome-free/css/all.min.css' %}" rel="stylesheet">
<link href="https://stackpath.bootstrapcdn.com/font-awesome/4.7.0/css/font-awesome.min.css" rel="stylesheet">
I see no results whatsoever. The URL of my website:
https://fine-arts-club.herokuapp.com/
This is usually caused by improperly configured CORS handling.
Open your browser developer tools. If there is an error saying that Font Awesome could not be loaded because it is blocked by CORS policy, this is the reason.
To solve this you need to set the access-control-allow-origin HTTP header.
Setting the value to * will allow any domain.
access-control-allow-origin: *
You can also set the header value to a specific domain. This is usually the best option for files hosted for a specific site:
access-control-allow-origin: https://www.your-domain.com
There are two ways to set the CORS header: you can create a CORS configuration on your S3 bucket, or you can modify the header using a Lambda@Edge function.
S3 Configuration
How to add CORS Documentation
CORS Configuration Documentation
<CORSConfiguration>
    <CORSRule>
        <!-- Use * to allow any origin, or a specific origin such as
             https://www.your-domain.com -->
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>GET</AllowedMethod>
        <AllowedHeader>*</AllowedHeader>
    </CORSRule>
</CORSConfiguration>
Lambda@Edge Function
You can also add the access-control-allow-origin HTTP header using a Lambda@Edge function. It is probably overkill for what you need, but it can be helpful if you have a more complex setup.
To do this you need to create a Lambda function in the us-east-1 region.
An example of the code you would use looks something like this:
exports.handler = async (event, context) => {
    const request = event.Records[0].cf.request;
    const response = event.Records[0].cf.response;
    // Decide which origin to allow, e.g. echo the request's Origin
    // header, or fall back to allowing any origin
    const origin = request.headers.origin
        ? request.headers.origin[0].value
        : '*';
    response.headers['access-control-allow-origin'] =
        [{ key: 'Access-Control-Allow-Origin', value: origin }];
    return response;
};
Once you have created the Lambda function, you can hook it into your CloudFront distribution by editing a behavior and adding your function to Lambda Function Associations as either "Viewer Response" or "Origin Response".
I would recommend using the S3 method for almost all situations.
[
    {
        "AllowedHeaders": ["*"],
        "AllowedMethods": ["GET"],
        "AllowedOrigins": ["*"],
        "ExposeHeaders": []
    }
]
This configuration (the newer JSON format for the S3 CORS settings) works perfectly for me. You can try it.
I am trying to implement single sign-out using IdentityServer3. I have two client apps (an MVC 5 app and an ASP.NET Core app), both registered as clients, and logging in works perfectly.
MVC 5 - Client A; ASP.NET Core - Client B
When both apps are logged in and I click the log out link on Client B, Client A is logged out successfully. But vice versa (logging out of Client A first), Client B is not logged out. Checking the browser's console (Chrome Version 56.0.2924.87), I get the following error:
Refused to frame
'https://client_B/myDomain/Signout_oidc/?sid=2adc40bd3ae432a81671118b09a'
because it violates the following Content Security Policy directive:
"frame-src 'self' https://client_B.myDomain https://client_A.myDomain".
How can I resolve this?
Try adding the code below to your IdentityServerOptions instance. (Note: "*" permits framing from any origin; for production you may prefer to list your client origins explicitly.)
CspOptions = new CspOptions
{
FrameSrc = "*"
}
More information on how to configure CSP in IdentityServer3 can be found in the documentation:
IdentityServer3 > CSP
IdentityServer Options
Thanks @Damian, I found where the issue was.
The issue was with Client A's URL: it had an underscore (_) character in it. In some way that violated a CSP rule or something else. Removing the '_' character from the URL solved the problem.
I am working on an extension where a proxy is set through my extension using the Chrome extension proxy API (chrome.proxy.settings). Everything works fine and I get all traffic on my proxy server, except the URLs in the bypass list, of course.
Now my question is: how do we bypass the proxy dynamically? Is there a way to bypass the proxy for some URLs (not in the proxy bypass list) programmatically? I understand that URLs/patterns included in the bypass list are bypassed, but I need some URLs to bypass the proxy on the fly.
Has anyone faced a similar requirement? Any direct method to bypass it dynamically, or any workaround, would be appreciated.
Bypass the proxy on the fly?
You can pull the addresses to bypass from your own web server. For example:
chrome.windows.onCreated.addListener(function () {
    var config = {
        mode: 'fixed_servers',
        rules: {
            proxyForHttp: {
                scheme: 'http',
                host: '127.0.0.1',
                port: 80
            },
            bypassList: []
        }
    };
    // Ask your server which proxy and bypass URLs to use
    $.get('path/to/proxiesByUrl', function (data) {
        // Set up the proxy, depending on the server's answer
        config.rules.proxyForHttp.host = data.proxyAddress;
        config.rules.proxyForHttp.port = data.proxyPort;
        // URLs excluded from this proxy
        config.rules.bypassList = data.bypassUrls;
        // Apply the new settings
        chrome.proxy.settings.set({ value: config, scope: 'regular' });
    });
});
Match all hostnames that match the pattern HOST_PATTERN. A leading "." is interpreted as "*.".
Examples: "foobar.com", "*foobar.com", "*.foobar.com", "*foobar.com:99", "https://x.*.y.com:99".
For more detailed bypassList patterns, read the official documentation.
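To illustrate those pattern rules, here is a hedged example config (the hostnames and addresses are made up):

```javascript
// Example bypassList entries following the pattern rules above.
// The hostnames and addresses are made-up examples.
var config = {
    mode: 'fixed_servers',
    rules: {
        proxyForHttp: { scheme: 'http', host: '127.0.0.1', port: 80 },
        bypassList: [
            '*.internal.example.com', // any subdomain of internal.example.com
            '<local>',                // simple hostnames (no dots), e.g. http://intranet/
            '127.0.0.1/8'             // loopback addresses, by IP prefix
        ]
    }
};
```

This would be applied the same way as above, via chrome.proxy.settings.set({ value: config, scope: 'regular' }).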