how to exclude my site from Google's interest-cohort policy? - google-chrome

Today I have some new warnings in the Chrome console:
Error with Permissions-Policy header: Unrecognized feature: 'interest-cohort'.
Error with Permissions-Policy header: Unrecognized feature: 'ch-ua-full-version-list'.
I don't want my site to be included in this Google feature.
Searching the web, I found the following solution (.htaccess file):
<IfModule mod_headers.c>
Header always set Permissions-Policy: interest-cohort=()
</IfModule>
but it doesn't help. The warnings are still in the console.
Is there an efficient way to stop these messages from appearing?

Google has stopped running the FLoC experiment and is looking at a new approach: https://blog.google/products/chrome/get-know-new-topics-api-privacy-sandbox/amp/
So you don't need to exclude your site from it anymore (hence the error message).
Note also that even when they were running that experiment, it only generated and used an interest cohort when using Google Ads. So you didn't need to do anything if you weren't running ads. And if you're that interested in privacy, then you're probably not running any ads that would use it anyway.
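If you do still want to send (or suppress) the header, note that the directive in the question has a syntax problem: the colon after Permissions-Policy becomes part of the header name, so Apache emits a malformed header. A hedged .htaccess sketch:

```apache
<IfModule mod_headers.c>
    # Correct syntax: header name without the colon, value quoted
    Header always set Permissions-Policy "interest-cohort=()"
    # Or, since the feature is now unrecognized, remove the header entirely
    # if something upstream injects it:
    # Header always unset Permissions-Policy
</IfModule>
```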

CORS still working after disabling browser web security

So I have been testing a website using Selenium, specifically a page with a credit card form embedded in an iframe. I want to access the content of said iframe but due to CORS I get the error message:
Uncaught DOMException: Blocked a frame with origin "<url>" from accessing a cross-origin frame.
I made a quick Google search and realized you can bypass CORS using the "--disable-web-security" flag, so my code now looks like the following:
import os
from selenium import webdriver

options = webdriver.ChromeOptions()
options.add_argument("--disable-web-security")
self.driver = webdriver.Chrome(os.getenv("CHROME_DRIVER"), options=options)
Surprisingly, the CORS exception keeps popping up and I'm currently stuck on how to proceed. I really do have to access the iframe's content; there isn't a workaround for this.
Since I was confused as to why this didn't work, I replicated the problem on another website, in this case Amazon, which functions similarly (credit card form embedded in an iframe). I ran the code with web security enabled and got the same CORS error, as expected. But then I disabled web security, exactly as mentioned before, and it works! I can now access the iframe.
I also downgraded from the current most stable version of Chrome (88) to an older one (86), and again nothing changed. I'm using Ubuntu 20.04.
So now I'm wondering - why isn't the flag working for the first scenario I mentioned? Is there a chance the first website is enforcing the browser's web security or something related? I'm not an expert in web development, so any input on this would be valuable.
Turns out, all I needed was to add --disable-site-isolation-trials and a non-empty --user-data-dir along with --disable-web-security, as stated here.
For me, adding these flags to my Chrome launcher fixed my CORS issues:
--disable-web-security --user-data-dir=~/chromeTemp
Please note, this only turns the warnings off; it is definitely not a CORS solution.
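Putting the two answers together, a sketch of the full flag set (the temp-directory name and flag order here are assumptions, not from the question):

```python
import tempfile

# --disable-web-security is reportedly ignored on recent Chrome unless
# --disable-site-isolation-trials and a non-empty --user-data-dir accompany it.
user_data_dir = tempfile.mkdtemp()  # must be non-empty and writable
chrome_flags = [
    "--disable-web-security",
    "--disable-site-isolation-trials",
    f"--user-data-dir={user_data_dir}",
]

# With Selenium (hypothetical usage; requires chromedriver on PATH):
#   options = webdriver.ChromeOptions()
#   for flag in chrome_flags:
#       options.add_argument(flag)
#   driver = webdriver.Chrome(options=options)
```

Remember this disables a core browser protection, so it belongs in test environments only.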

XMLHttpRequest cannot load. Unloaded resources show caution: Provisional headers are shown

I actually don't know the right title for this question; I've found many similar ones but still couldn't figure out the solution.
My problem is that (on Chrome) the website I'm working on keeps showing this console log: XMLHttpRequest cannot load http://resource.domain.com/file.css. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://www.my-site-url.tld' is therefore not allowed access.
I've tried to allow those resources via the .htaccess file using this code:
<FilesMatch "\.(jpg|jpeg|png|gif|js|css)$">
<IfModule mod_headers.c>
Header set Access-Control-Allow-Origin http://wp.com,http://ajax.googleapis.com,http://fonts.googleapis.com,http://zor.fyre.co
Header set Access-Control-Allow-Credentials true
</IfModule>
</FilesMatch>
Doesn't work. I've also tried PHP and JSONP versions.
Can anyone please help me fix this issue?
Additional info:
Later, in Inspect Element > Network > Headers, I found this message:
Your issue most likely stems from the fact that you are trying to load content from an external domain, which is prohibited in a normal XHR call, hence the No 'Access-Control-Allow-Origin' error. The technique you may need is known as JSONP. I would link the documentation; however, it really depends on the JavaScript framework you are using. I recommend reviewing this post for a walk-through on how to invoke this behaviour via plain JavaScript: https://stackapps.com/questions/891/how-to-call-the-api-via-jsonp-in-plain-old-javascript
Hope this helps!
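For reference, `Access-Control-Allow-Origin` takes a single origin (or `*`), not the comma-separated list used in the question, and it must be sent by the server hosting the resource (here, resource.domain.com), not by the requesting site. A minimal .htaccess sketch for that server:

```apache
<IfModule mod_headers.c>
    # Allow the requesting site's origin. Only one value is permitted;
    # "*" allows any origin but cannot be combined with
    # Access-Control-Allow-Credentials: true.
    Header set Access-Control-Allow-Origin "http://www.my-site-url.tld"
</IfModule>
```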

Application Cache Error event: Manifest fetch failed (404)

I already have one site; for it, I created a mobile site using jQuery Mobile with Application Cache features (both have the same URLs).
Using Apache2 User-Agent settings, I redirect Android and iPhone users to the mobile site.
When it redirects to the mobile site, I get the error below:
Application Cache Error event: Manifest fetch failed (404)
test.manifest
CACHE MANIFEST
# version 1
CACHE
/index.html
/static/js/main.js
/static/css/style.css
Example:
main site: www.example.com/
mobile site: www.example.com/
www.example.com/test.manifest opens directly, but with the redirection it randomly gives a 404 error.
Please suggest a solution.
One thing is wrong with your manifest: you need a colon after CACHE.
Is your redirect using code something like this?
$.mobile.changePage
I see similar issues if I use bookmarks to some of my pages rather than $.mobile.changePage.
404 is the standard response for a page that doesn't exist. My guess is your link is incorrect, or you might have forgotten to upload your manifest file.
Maybe your paths should start without the leading /; that's often the case for me, even if the first element is a folder. But you've probably already checked that.
In my case I was able to make it work like this:
CACHE MANIFEST
#v1.0.1
NETWORK:
*
I had the 404 error when it was like:
CACHE MANIFEST
#v1.0.1
NETWORK
*
i.e. without the colon after the NETWORK section header.
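Combining the answers above (a colon after each section name, and relative paths instead of leading slashes), a manifest that avoids these problems might look like this sketch; the version comment and NETWORK wildcard are illustrative:

```
CACHE MANIFEST
# version 2
CACHE:
index.html
static/js/main.js
static/css/style.css
NETWORK:
*
```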

Broken link issues from w3c link checker

When I check my website with the W3C link checker, it shows:
Status: 403 Forbidden
"The link is forbidden! This needs fixing. Usual suspects: a missing index.html or Overview.html, or a missing ACL"
But that link works fine in a browser. How can I resolve this issue with the link checker?
That issue is usually caused by over-zealous security filtering which assumes either that all bots are evil or that all bots using certain libraries are evil.
The checklink software uses the LWP library, which often finds its way onto such blacklists. The two ways to get checklink to pass the link are:
Download checklink, edit its user-agent string, then run it from your local system rather than from the W3C servers. (See also: installing CPAN modules.)
Change the security filtering on the server the link points to (this obviously requires that you have access to that server).
Alternatively, you can check the links manually each time you perform a check.
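One way to confirm the 403 really is User-Agent-based filtering is to request the flagged link yourself with a browser-like User-Agent; the URL and UA string below are placeholders, not from the question:

```python
import urllib.request

# Build a request that mimics a browser's User-Agent instead of LWP's default.
url = "http://example.com/"  # replace with the link the checker flags
req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})

# response = urllib.request.urlopen(req)  # uncomment to actually fetch;
# print(response.status)                  # a 200 here, versus a 403 from
#                                         # checklink, points at UA blocking
```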
It's not always possible to follow all the standards.
If you have tested that the link works fine, then go ahead.

Cannot find URL when duplicating site

So I have a booking system on my website at the URL domain.co.uk/booking. I wanted to add it to a different site I have, domain.com.au/booking. It is a simple drag-and-drop, then running an install.php file. However, my domain.com.au/booking does not find anything there. The files are there in the EXACT same way as on my domain.co.uk site, which runs fine. Are there any reasons why the path would not be found that I may simply be overlooking?
N.B. the error that appears is
The requested URL /booking/ was not found on this server.
Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
If you are on a Linux server, /booking/ does not equal /booking. That could explain the difference between the two sites (one is hosted on Linux, the other is not).
Though this is just a wild guess.