Error with migrating my Chrome Extension to Manifest v3 - google-chrome

I want to migrate my Chrome Extension to manifest V3.
The content_security_policy looks as follows in Manifest V2:
{...
"content_security_policy": "script-src 'self' 'sha256-...'; object-src 'self'"
}
Notice that I'm using a sha256 hash, which is the most specific value you can allow.
Furthermore, I performed a "semi-official" conversion using this tool.
When I convert the manifest to V3 and then update the extension, I get the following error. I don't understand why the hash is considered an insecure CSP value, when it is accepted in Manifest V2 and allowing the specified hash of the code is considered secure there.
How can I overcome this?

In Manifest V3 the content_security_policy key is an object, whereas the value shown in your error is still a string, so it needs to be reformatted.
Example and instructions from the migration guide:
Manifest V2
"content_security_policy": "..."
Manifest V3
"content_security_policy": {
"extension_pages": "...",
"sandbox": "..."
}
extension_pages: This policy covers pages in your extension, including HTML files and service workers. These page types are served from the chrome-extension:// protocol. For instance, a page in your extension is chrome-extension://<extension-id>/foo.html.
sandbox: This policy covers any sandboxed extension pages that your extension uses.
Important!
In addition, MV3 disallows certain CSP modifications for extension_pages that were permitted in MV2. The script-src, object-src, and worker-src directives may only have the following values:
self
none
Any localhost source (http://localhost, http://127.0.0.1, or any port on those domains)
CSP modifications for sandbox have no such new restrictions.
Going through this guide, it seems sha256 hash values are not allowed for extension_pages. Hashes are typically used for inline scripts, so you can save the script as a .js file and load it with a <script src> tag instead; an external script that ships with your extension is covered by 'self' and does not require any CSP modification.
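For illustration, a minimal sketch of what the converted manifest and page could look like, assuming the formerly inline script is moved into a file named inline.js (the file names here are placeholders, not from the original extension):
In manifest.json:
"content_security_policy": {
  "extension_pages": "script-src 'self'; object-src 'self'"
}
In the page that used to contain the inline script (e.g. popup.html):
<!-- was: <script>/* inline code matched by the sha256 hash */</script> -->
<script src="inline.js"></script>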

Related

Receiving Content Security Policy errors with no Content Security Policy header present

I am integrating with a third party library (Chargebee) and I am receiving Content Security Policy violation errors.
As far as I can tell, I don't have CSP defined in my web page as it doesn't return a CSP response header:
However, when trying to utilise this library I'm getting errors along the lines of:
Questions
I'm using other third-party JS libraries; why am I not getting this error for any of them? I've never had to specify a CSP for any of them previously.
If I do actually define a CSP, I'm then blocking the other third party libraries. Can I allow Chargebee without then blocking others (and without having to include them all in the CSP)?
It doesn't make sense. Everything suggests that I don't have a CSP defined, seeing as it's not returned in my headers and I can use third-party resources; I have even verified this using the Chrome CSP evaluator extension. And yet, the error messages show that I do have a CSP policy defined.
Am I missing something?
If I do actually define a CSP, I'm then blocking the other third party libraries. Can I allow Chargebee without then blocking others (and without having to include them all in the CSP)?
No - CSP is an allowlist protocol.
If you define a CSP, you need to map out all the various needed resources - at least for the directives that you are using (for example you can confine your CSP to only script-src and object-src, and ignore others).
Since you cannot control or predict which directives Chargebee's CSP needs, it's best to use a predetermined CSP package and a CSP generator.
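As a rough sketch, assuming Chargebee's script is served from js.chargebee.com (check Chargebee's own documentation for the full list of required sources), adding one extra origin to an otherwise self-only policy could look like this:
Content-Security-Policy: script-src 'self' https://js.chargebee.com; object-src 'self'
Once this header is in place, every other third-party script origin you rely on has to be listed in script-src the same way.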

ETags for server-side rendered pages that contain CSP nonce

I have a server-side-rendered React app, and so far Node/Express has been able to generate correct, stable ETags, allowing me to take advantage of client-side caching.
Additionally, the generated HTML contains fragments of render-blocking (above-the-fold) CSS and JS inlined as <script> and <style> tags for faster client-side first renders (as promoted by Google and its PageSpeed and Lighthouse tools).
Now I want to enable Content Security Policy (CSP), and I provide a nonce as an attribute to those <script> and <style> tags on every page request to avoid unsafe-inline violations. However, the ever-changing nonce makes the ETag change on every request as well, so the HTML is never cached and every request hits my Express server.
Is there a way to combine simultaneously:
inlined CSS and JS,
CSP features (that is, a nonce or similar),
and ETags or alternatives?
So far I see a contradiction between current performance and security guidelines.
Are there equivalents to a CSP nonce, or can a CSP nonce be provided while keeping the HTML intact? Is there a way to otherwise cache pages that contain a CSP nonce?
Ideally, I would like a solution to be contained within the Express server, without resorting to tinkering with my reverse proxy config, but any options are welcome.
One solution is to leave all content generation and caching to the web application (Node in your case) and CSP nonce generation to the front-end web server (e.g. Nginx). I have implemented this with Django, which does page caching with ETag, handles all the Vary header logic, etc., and the HTML it produces contains a static CSP nonce placeholder:
<script nonce="+++CSP_NONCE+++"> ... </script>
This placeholder is then filled in by Nginx using ngx_http_subs_filter_module:
sub_filter_once off;
sub_filter +++CSP_NONCE+++ $ssl_session_id;
add_header Content-Security-Policy "script-src 'nonce-$ssl_session_id'";
I have seen solutions using an additional Nginx module to generate a truly unique random nonce for each request, but I believe that is overkill; I'm just using the TLS session identifier, which is unique per connecting client and may be cached for some time (e.g. 10 minutes) depending on your Nginx configuration.
Just make sure the web application returns uncompressed HTML, as Nginx won't be able to do string substitution on a compressed response body.
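On the Express side, a minimal sketch of the same idea, assuming the Nginx config above performs the substitution (the route and inline fragments are placeholders, not the original app):
const express = require('express');
const app = express();

app.get('/', (req, res) => {
  // The placeholder is constant, so the rendered HTML and its ETag stay stable
  // across requests; Nginx's sub_filter swaps in the real nonce downstream.
  const html = `<!doctype html>
<html>
<head><style nonce="+++CSP_NONCE+++">/* critical CSS */</style></head>
<body>
<div id="root"><!-- SSR markup --></div>
<script nonce="+++CSP_NONCE+++">/* critical JS */</script>
</body>
</html>`;
  // Send uncompressed HTML; Express computes a weak ETag from the body by default.
  res.type('html').send(html);
});

app.listen(3000);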

Google Drive's save to drive with Laravel?

I'm trying to use the "Save to drive" button that Google provides to make Drive uploads even easier. It looks like this:
<script src="https://apis.google.com/js/platform.js" async defer></script>
<div class="g-savetodrive"
data-src="//example.com/path/to/myfile.pdf"
data-filename="My Statement.pdf"
data-sitename="My Company Name">
</div>
My question is, since I am using Laravel and the php artisan serve command to serve my project, how am I supposed to write the path to my file? It's located at 'Project name'/storage/app/docs/. I've tried //storage/app/docs/{{ $file->path }}, but it doesn't work, and using storage_path() didn't change anything. What am I missing here?
EDIT:
I tried using another file, one that was hosted somewhere else. So I enabled CORS on my project and, using Postman, checked the headers it was returning:
Access-Control-Allow-Headers: Content-Type, X-Auth-Token, Origin, Range
Access-Control-Allow-Methods: POST, GET, OPTIONS, PUT, DELETE
Access-Control-Allow-Origin: *
Access-Control-Expose-Headers: Cache-Control, Content-Encoding, Content-Range
According to the Google documentation, it should be working now, yet it's not.
This is the error that I'm getting in the console:
Response to preflight request doesn't pass access control check:
No 'Access-Control-Allow-Origin' header is present on the requested resource.
Origin 'http://localhost:8000' is therefore not allowed access.
The response had HTTP status code 400.
And I'm officially out of ideas.
As stated in the documentation's Troubleshooting section:
If you get an XHR error when downloading your data-src URL, verify that the resource actually exists, and that you do not have a CORS issue.
If the Save to Drive button works with all browsers except Internet Explorer 9, you may need to configure your browser to enable CORS, which is disabled by default.
If large files are truncated to 2MB, it is likely that your server is not exposing Content-Range, likely a CORS issue.
Take note of the answer on the related SO question - Save To Drive Button Doesn't Work - and the documentation, which states:
The data-src URL can be served from another domain, but the HTTP server needs to support HTTP OPTIONS requests and its responses must include the following special HTTP headers:
Access-Control-Allow-Origin: *
Access-Control-Allow-Headers: Range
Access-Control-Expose-Headers: Cache-Control, Content-Encoding, Content-Range
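Purely as an illustration of those requirements (a generic Node/Express sketch, not Laravel code; the /docs route and storage path are assumptions), the server hosting the data-src file would answer preflight and file requests roughly like this:
const express = require('express');
const app = express();

// Header values taken from the Google documentation quoted above.
app.use('/docs', (req, res, next) => {
  res.set('Access-Control-Allow-Origin', '*');
  res.set('Access-Control-Allow-Headers', 'Range');
  res.set('Access-Control-Expose-Headers', 'Cache-Control, Content-Encoding, Content-Range');
  if (req.method === 'OPTIONS') return res.sendStatus(204); // answer the preflight
  next();
});

// Serve the actual files; express.static honours Range requests.
app.use('/docs', express.static('storage/app/docs'));

app.listen(8000);
The same headers can be set in a Laravel CORS middleware; what matters is that the file's URL answers OPTIONS and exposes Content-Range.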

Chrome Extension Content Security Policy not Allowing Resources

I receive the following errors when loading a local chrome extension:
Refused to load the script 'https://widget.intercom.io/widget/APPID' because it violates the following Content Security Policy directive: "script-src 'self' http://localhost https://widget.intercome.io/ 'unsafe-eval'".
index.html:1 XMLHttpRequest cannot load http://localhost:5000/login. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'chrome-extension://bbmhkchajfbhfjkfiaadlnohbfhbnegj' is therefore not allowed access.
I attempted to put the following line in my manifest.json:
"content_security_policy": "script-src 'self' http://localhost https://widget.intercome.io/ 'unsafe-eval'; object-src 'self'"
Shouldn't this last line in particular allow me access to these two resources?
You have the right idea with the https://widget.intercome.io CSP rule, but you may need to
correct the spelling of your domain name (your error has intercom.io, but your CSP says intercome.io)
remove the trailing slash (I'm not sure, but none of the CSP examples I've found use a trailing slash)
reload your extension after making changes
The localhost error is caused by enforcement of the same-origin policy, not the CSP. (Ajax requests, which are governed by the connect-src CSP directive, are not restricted by Chrome extensions' default CSP.) You need to add http://localhost/* as a host permission in your manifest's permissions field:
"permissions": [
"http://localhost/*",
...
]
To expand on apsillers' answer, for anyone that comes here in the future:
This is correct, but make sure you include the script in a separate file. I was running into an inline-script issue, so I created a file called intercom.js, put the script there, and included it in my HTML with <script src="intercom.js" charset="utf-8"></script>.
Also this is what worked for me:
"content_security_policy": "script-src 'self' http://localhost https://widget.intercom.io https://js.intercomcdn.com 'unsafe-eval'; object-src 'self'"

EvalError when trying to load realtime data model

I'm trying to load the realtime data model with the gapi.drive.realtime.load method in my Chrome packaged app:
gapi.drive.realtime.load(fileId, onLoad)
but I'm getting an EvalError:
Uncaught EvalError: Refused to evaluate a string as JavaScript because 'unsafe-eval' is not an allowed source of script in the following Content Security Policy directive: "script-src 'self' apis.google.com drive.google.com".
Of course, I could just add 'unsafe-eval' to the manifest, but I guess that is a bad way to solve this problem. Can you suggest a better solution?
A packaged app can't have a less restrictive CSP than the default, so your proposed change to the manifest wouldn't have worked. Instead, create a sandboxed iframe that allows eval(), then message back and forth between the iframe and your app. This is a good example with links to further documentation.
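A rough sketch of that pattern, with file names (sandbox.html, sandbox.js) and message shapes chosen only for illustration:
In manifest.json, declare the sandboxed page:
"sandbox": {
  "pages": ["sandbox.html"]
}
In the main app page, embed the sandbox and exchange messages with it:
<iframe id="rt-frame" src="sandbox.html"></iframe>

// main app script (runs under the strict app CSP, no eval here)
var frame = document.getElementById('rt-frame');
frame.addEventListener('load', function () {
  frame.contentWindow.postMessage({ command: 'load', fileId: fileId }, '*');
});
window.addEventListener('message', function (event) {
  console.log('sandbox says:', event.data);
});

// sandbox.js, included by sandbox.html alongside the gapi scripts;
// the sandboxed page is allowed to eval, so the realtime load happens here.
window.addEventListener('message', function (event) {
  if (event.data && event.data.command === 'load') {
    gapi.drive.realtime.load(event.data.fileId, function (doc) {
      event.source.postMessage({ loaded: true }, '*');
    });
  }
});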