<Geolocation API> "granted" on Firefox BUT "denied" on Chrome - google-chrome

Greetings,
I am facing the following issue; the question follows below.
I have a production application using the Geolocation API from the navigator:
Yes => the app is served over HTTPS
Yes => all third-party sources are served over HTTPS
Yes => I have granted geolocation on browsers
The results I get are:
localhost => geolocation is OK on both Firefox & Chrome
production => geolocation is OK on Firefox, but not OK on Chrome
Firefox => before granting geolocation, I get the browser pop-up asking me to allow or block geolocation; OK
Chrome => either before or after granting geolocation, I get no such browser pop-up
On Chrome:
Permissions API => tells me that geolocation is denied
Geolocation's getCurrentPosition => tells me that Geolocation has been disabled in this document by permissions policy.
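For reference, the two checks above boil down to something like this (a minimal sketch; the log messages are just illustrative):
// Minimal sketch of the two checks above (illustrative only).
navigator.permissions.query({ name: "geolocation" }).then((status) => {
  console.log("permission state:", status.state); // Firefox: "granted", Chrome: "denied"
});
navigator.geolocation.getCurrentPosition(
  (pos) => console.log("coords:", pos.coords.latitude, pos.coords.longitude),
  (err) => console.error("geolocation error:", err.code, err.message) // Chrome reports the permissions-policy message here
);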
More info:
tech stack => Vue3 app hosted on Netlify
production version => there are several subdomains depending on the city selected by the user
the production app is actually for demo purposes, so I cannot share any screenshots
Now, here is the question:
What could make all of this work without any issue on Firefox, yet still
displease Chrome?
Any help is highly appreciated.
Thanks much in advance.

Problem solved
I am grateful to a wonderful colleague, whose fresh pair of eyes made me look in the right place: the _headers file!
This file contained:
Permissions-Policy: fullscreen=(), geolocation=()
In the Permissions-Policy header you list the origins that are allowed to use each feature; an empty allowlist () disables the feature entirely, so you have to spell out what you actually permit!
https://www.w3.org/TR/permissions-policy-1/#policy-controlled-feature :
SecureCorp Inc. wants to disable use of Fullscreen and Geolocation APIs within their application. It can do so by delivering the following HTTP response header to define a permissions policy:
Permissions-Policy: fullscreen=(), geolocation=()
By specifying an empty origin list, the specified features will be disabled for all documents, including nested documents, regardless of their origin.
SecureCorp Inc. wants to completely disable use of the Geolocation API within all browsing contexts except for its own origin and those whose origin is "https://example.com", even in the presence of an attacker who can embed their own iframes on SecureCorp’s pages. It can do this by delivering the following HTTP response header to define a restricted permissions policy for Geolocation:
Permissions-Policy: geolocation=(self "https://example.com")
Firefox, however, seems to ignore this header, which is why everything kept working in that browser.
Now in my _headers file:
Permissions-Policy: fullscreen=(), geolocation=(self)
Feature-Policy: geolocation 'self'
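For anyone else on Netlify: these header lines live under a path matcher in the _headers file, so the full block looks roughly like this (a sketch; adjust the path pattern to your routes):
/*
  Permissions-Policy: fullscreen=(), geolocation=(self)
  Feature-Policy: geolocation 'self'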
Cheers.

Related

How to check if browser add-on contacts developer?

For desktop applications there is a firewall that determines which apps can connect to the internet.
Of course, internet browsers and other internet apps are always unblocked; otherwise they would be useless.
Now let's change the point of view from OS->APPS and look the same way at BROWSER->PLUGINS.
The browser is always online, so how do I get something similar to an OS firewall, but for browser plugins? How do I know which add-on is actually connecting to its developer's server and sending data about my browsing activity, add-on usage, and so on?
Read the code
Unfortunately, AFAIK you have to read the code. For example, the extension
https://github.com/m0rtem/CloudFail/ does call home. You can search for "http" in the code.
Inspect the extension
On Firefox you can inspect an extension.
For example, inspect (aka debug) uBlock. You get the full dev tools on the extension's background page. Go to the "Network" tab. Now, for testing, go to the extension's options and update your filter lists. Then go back to the extension inspector's Network tab: you will see all the remote calls the extension made at your request. But you could also spot any hidden calls.
Content security policy
Sending the user's data to a remote server is not the same thing as writing code with poor security practices, exposing the user to malicious code execution from hackers. But still, it's related.
For Firefox, the default CSP is "script-src 'self'; object-src 'self';" https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/Content_Security_Policy.
So you can read the extension's manifest.json to see whether they changed the default policy.
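For example, a (Manifest V2) manifest.json that loosens the default policy would contain something roughly like this excerpt, where the extra origin is only a placeholder:
{
  "manifest_version": 2,
  "content_security_policy": "script-src 'self' https://example.com; object-src 'self'"
}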
You can also search for "google analytics" in the code.
Now, be aware that on the official stores, every extension's code base is reviewed by Google or Mozilla, so the worst practices (like outright malicious behaviour) are forbidden.
https://wiki.mozilla.org/WebExtensions/policy#II.Security.2F_Privacy

Are push notifications possible in html5 without fully https site?

Looks like Push notifications are finally usable for web-apps! Unfortunately, this requires https for ServiceWorker, which not all sites may have.
One thing I noticed in the spec it mentions:
if r's url's scheme is not one of "http" and "https", then:
Throw a TypeError."
So I'm confused: can the site be http, as long as it includes a service worker that is served from https? For example, could mydomain.com include an https service worker from https://anotherdomain.com?
Another standard, web-api simple-push, doesn't mention requiring https (likely an omission in the documentation?), and "The user experience on Firefox Desktop has not been drawn out yet". Is the documentation on this outdated, or is push really only supported in Firefox OS?
Simple Push, which is the current push solution in Firefox OS, doesn't have anything to do with Service Workers.
The next generation of push, implemented by both Google and Mozilla, will be done through Service Workers:
Push API spec
In that case yes, your content will need to be served over HTTPS.
Probably you will be interested in the Let's Encrypt initiative:
letsencrypt.org
A new certificate authority that will help developers transition their content to HTTPS.
Also, just for development purposes, both the Google and Mozilla implementations of Service Workers allow you to bypass the secure-context check if you develop against localhost.
In the case of Mozilla you will need to enable the flag:
devtools.serviceWorkers.testing.enabled: true
But again, this is just for development. AFAIK, Mozilla's push implementation has landed or is about to land and will be available in the Nightly builds; you can follow the work here:
https://bugzilla.mozilla.org/show_bug.cgi?id=1038811
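To make that concrete, the Service-Worker-based flow looks roughly like this (a sketch only; /sw.js is a placeholder, and current browsers additionally require an applicationServerKey, i.e. a VAPID public key, in the subscribe call):
// Sketch: register a service worker and subscribe to push (secure context or localhost only).
navigator.serviceWorker.register("/sw.js").then(async (registration) => {
  const subscription = await registration.pushManager.subscribe({
    userVisibleOnly: true, // required: the browser insists on user-visible notifications
  });
  console.log("push endpoint:", subscription.endpoint); // send this endpoint to your server
});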
No, the new generation of push notifications (i.e. Push API) requires HTTPS.
If you need to add push notifications to a website without HTTPS you can use a third-party service like Pushpad (I am the founder) that delivers notifications on your behalf.
The text you cited from the spec is from the Cache.addAll() section (5.4).
Here's the summary of addAll() on MDN:
The addAll() method of the Cache interface takes an array of URLs, retrieves them, and adds the resulting response objects to the given cache. The request objects created during retrieval become keys to the stored response operations.
Service workers can request & cache URLs that are either HTTP or HTTPS, but a Service Worker itself can only work in its registered Scope (which must be HTTPS).
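In other words, the quoted scheme check applies to the URLs passed to addAll() inside the worker, roughly like this (a sketch; the cache name and file names are made up):
// sw.js - cache a few same-scope resources at install time.
self.addEventListener("install", (event) => {
  event.waitUntil(
    caches.open("demo-cache-v1").then((cache) =>
      // Each URL must be http: or https:, otherwise addAll() rejects with a TypeError.
      cache.addAll(["/", "/styles.css", "/app.js"])
    )
  );
});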
simple-push is not related to Service Workers; it seems comparable to the approaches other platforms have taken:
Apple Push Notifications
Google Cloud Messaging
I found a nice workaround to allow notifications from websites and domains without SSL (i.e. http:// rather than https://) in Firefox.
Firefox keeps a file inside the Mozilla profile directory called permissions.sqlite, an SQLite database that holds per-domain permissions. You can add your domain there (http://yourdomainname) with permission for notifications and it will work.
I have created a demonstration for Windows, written in Go, here: https://gist.github.com/caviv/8df5fa11a98e0e33557f75215f691d54

How does Firefox implement HSTS in detail?

I was doing some research on how Firefox and Chrome are implementing HSTS (HTTP Strict Transport Security) in detail.
Turns out that they ship with a predefined (preload) list of sites that already implement HSTS, and this list can be found in each browser's source.
Since these lists are baked into the source code itself, that part makes sense... but how do Firefox and Chrome handle my own HSTS headers? How and where do they store my URL, my max-age, and whether I set includeSubDomains or not?
I wasn't able to find this in about:config or anywhere similar...
So maybe somebody knows more about this than I do; I'm just curious (:
Thx!
See http://hg.mozilla.org/mozilla-central/file/20bbf73921f4/netwerk/protocol/http/nsHttpChannel.cpp#l1072 and then http://hg.mozilla.org/mozilla-central/file/20bbf73921f4/security/manager/boot/src/nsStrictTransportSecurityService.cpp#l249 which calls http://hg.mozilla.org/mozilla-central/file/20bbf73921f4/security/manager/boot/src/nsStrictTransportSecurityService.cpp#l147
So the data ends up stored in the permission manager, which is the normal place per-host information gets stored in Firefox. The permission manager stores its state in permissions.sqlite, I think.
Sites that want HTTP Strict Transport Security (HSTS) enforced send a header in the response: Strict-Transport-Security: max-age=31536000
max-age being the time until the entry expires. The header is sent with every response, so the expiry gets extended by that much every time the site is requested.
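For example, a site that also wants its subdomains covered would send:
Strict-Transport-Security: max-age=31536000; includeSubDomains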
The browser (I have tried only Firefox) stores this data and will use it every time the site is accessed. This is true even in incognito mode: if you have ever accessed the site before in non-incognito mode, then the details of that site are saved and used even if you now open it in incognito mode.
For Firefox, this data is stored in a file called SiteSecurityServiceState.txt, which lives in your Firefox profile folder. You can enter about:support in the browser and then select "Show in folder" to open your profile folder, where you can locate this file.
I am not sure about the predefined sites, but the above is the file where HSTS details for normal sites are kept up to date by Firefox.
More details - Understanding HTTP Strict Transport Security (HSTS)
PS: Above link goes to my personal blog that has more details on HSTS.

Disable Chrome Same Origin Policy for a specific domain

Is it possible to modify the same origin policy in Chrome?
I plan to allow the specific domain foo.com to access an iframe with a different origin. The idea is to have a "login machine" which knows the login data and where to put it. I know about CORS but it's not the solution because I don't have access to some of the different origins to set the custom header.
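(For clarity, the header I would need those other origins to send is something like the following, which I have no way to add:)
Access-Control-Allow-Origin: https://foo.com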
I know there are many other solutions to build a "login machine" :)
I tried Selenium, and I tried a Chrome browser extension, but neither was that good and the user experience was bad.
I like Firefox's enablePrivilege, but it isn't supported in newer versions.
Is it a lack of functionality that I can't apply --disable-web-security to a specific domain?
best regards

Avoiding mixed content messages in IE

We have a secure (SSL) website from which we want to make calls to Google's map server. The map server is HTTP, not HTTPS, and every time this screen refreshes (every minute for us) IE pops up its annoying mixed content message (about viewing a page with secure and non-secure content).
What I am looking for is a way around this. For example, is there a way to proxy the request so that our internal request is https but the other side of the proxy is not secure? I'm trying essentially to spoof the data to trick the browser.
Any ideas here? The actual security of the end point is less important than avoiding the error message itself.
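What I have in mind is roughly the following (a sketch only, written as a Node-style handler for illustration; the route and query parameter names are made up, and the handler would sit behind our existing HTTPS site):
import * as http from "http";

// Sketch: /maps-proxy?url=http://... is requested over our HTTPS origin;
// the server fetches the plain-HTTP resource and streams it back.
const server = http.createServer((req, res) => {
  const target = new URL(req.url ?? "/", "http://placeholder").searchParams.get("url");
  if (!target || !target.startsWith("http://")) {
    res.writeHead(400);
    res.end("missing or non-http url");
    return;
  }
  http.get(target, (upstream) => {
    res.writeHead(upstream.statusCode ?? 502, upstream.headers);
    upstream.pipe(res); // relay the insecure content through our secure origin
  });
});

server.listen(8080);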
Thanks!
Don
There is a way to suppress this at the browser level, which might not be desirable for you, but I thought I'd throw it out there. In IE, under Tools | Internet Options | Security | Internet Zone | Custom Level, you can set "Display mixed content" to Enable; it's probably set to Prompt right now. Again, this is a per-user, browser-level setting, so it probably will not work for you. It also opens up a lot of security problems (DNS poisoning, man-in-the-middle, etc.), and most admins will not do this.
Your second option is to become a premier customer: http://code.google.com/apis/maps/faq.html#ssl
Your third option is to use Virtual Earth, which supports SSL natively with no strings attached.
EDIT: see a similar question: here
As of March 2011, the Google Maps API is available to everyone over SSL:
http://googlegeodevelopers.blogspot.com/2011/03/maps-apis-over-ssl-now-available-to-all.html
Here's the problem with that: even though the API is served over SSL, the thumbnail images the map uses for locations are NOT served over SSL, so you can still get the message.
Remove runat="server" from the head tag where you are using code to link the API to your page.