HTTP header to detect a preload request by Google Chrome

Google Chrome 17 introduced a new feature that preloads a webpage to improve rendering speed when the request is actually made (i.e., when the user hits Enter in the omnibox).
A few questions:
Is there an HTTP header to detect such a request on the server side, and if one exists, what is the proper response to prevent such preloading (to avoid unintended requests that might have unwanted side effects)?
Does Google Chrome check the robots.txt before making preload requests?
Is there a robots.txt setting which targets only this specific behaviour? (I suppose/hope Disallow already works.)
Is there a meta tag to inform Google Chrome to never preload again on the current domain?

When Firefox pre-fetches content (at the behest of the referrer page's markup), it sends the following header with the request: X-moz: prefetch
Safari does similarly, using X-Purpose: preview. According to this ticket, Chrome does, too.
For pre-rendering, Chrome does not send any identifying header with the request at all. Instead, one must use the Page Visibility API in JavaScript (see the linked source and additional reading).
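For what it's worth, here is a minimal TypeScript sketch of that Page Visibility API approach, assuming it runs in the page that may be pre-rendered. The "prerender" visibility state and the startSideEffects hook are illustrative, and this is client-side detection only, not something the server sees:

```typescript
// Runs in the page that might be pre-rendered. At the time, Chrome exposed
// a "prerender" visibility state for pages rendered ahead of navigation.
function isPrerendered(): boolean {
  // Older Chrome builds used the vendor-prefixed webkitVisibilityState.
  const state: string =
    document.visibilityState || (document as any).webkitVisibilityState;
  return state === "prerender";
}

// Hypothetical hook for anything with side effects (analytics, counters,
// state changes) that should not run during a speculative render.
function startSideEffects(): void {
  console.log("Page is actually visible; safe to run side effects.");
}

if (isPrerendered()) {
  // Wait until the user really navigates to the page.
  document.addEventListener("visibilitychange", () => {
    if (document.visibilityState === "visible") {
      startSideEffects();
    }
  });
} else {
  startSideEffects();
}
```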

Chrome stopped sending the X-Purpose header in 2011, and the Chromium team marked the issue as WontFix: https://code.google.com/p/chromium/issues/detail?id=86175.
They re-introduced a Purpose: prefetch header with all NoState Prefetch requests in 2018, as stated in the last comment on that issue: https://bugs.chromium.org/p/chromium/issues/detail?id=86175#c65
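As a rough server-side sketch in TypeScript (Node's http module), here is how one might check for the headers mentioned above. Answering speculative requests with an empty 204 is my assumption about how to avoid side effects, not documented browser behavior:

```typescript
import * as http from "http";

// Inspect the request headers the browsers above have used to mark
// prefetch/preview requests, and skip side effects if one is present.
function isPrefetchRequest(req: http.IncomingMessage): boolean {
  const purpose = (req.headers["purpose"] || req.headers["x-purpose"] || "")
    .toString()
    .toLowerCase();
  const mozHint = (req.headers["x-moz"] || "").toString().toLowerCase();
  return purpose === "prefetch" || purpose === "preview" || mozHint === "prefetch";
}

const server = http.createServer((req, res) => {
  if (isPrefetchRequest(req)) {
    // Don't trigger side effects (logins, counters, state changes) for a
    // speculative request; answer with an empty response instead.
    res.statusCode = 204;
    res.end();
    return;
  }
  res.statusCode = 200;
  res.setHeader("Content-Type", "text/plain");
  res.end("Normal request handled here.\n");
});

server.listen(8080);
```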

Related

Published Visio Web Page but browser showing cached page

I am publishing a Visio 2013 drawing to a site (using File -> Export -> Change File Type -> Web Page (*.htm)). The website is checked into CM and then labeled. My web server has a (ClearCase) view based on this label that automatically refreshes itself.
What I'm finding is that my browsers always show the cached (old) version of the pages. I've been able to change my IE browser settings so that it always refreshes the cache (Internet Options -> General tab -> Settings -> Check for newer versions of stored pages = Every time I visit the webpage). When I do this, I see the changes.
But this isn't a real solution. I don't want to have to tell my viewers to change their browser settings so they automatically refresh. Is there something I need to do to the page contents to tell all browsers to refresh?
You can control browser caching through the HTTP response your server provides.
Setting the Cache-Control header of the HTTP response to no-cache can prevent clients from caching the drawing if it is something you expect to be refreshed often: Mozilla Docs
Alternatively, you can set the Expires header of the HTTP response so the page expires after a day, if the image is updated on a periodic basis: Mozilla Docs
The following question has a good comparison of the two: What's the difference between Expires and Cache-Control headers?
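For example, a minimal Node-style TypeScript sketch of the two options from this answer; the file name and the one-day lifetime are illustrative assumptions, not anything specific to the Visio export:

```typescript
import * as http from "http";
import { readFileSync } from "fs";

const server = http.createServer((req, res) => {
  const page = readFileSync("./exported-drawing.htm"); // hypothetical exported page

  // Option 1: force browsers to revalidate on every visit.
  res.setHeader("Cache-Control", "no-cache");

  // Option 2 (alternative): let browsers cache, but only for one day.
  // res.setHeader("Cache-Control", "max-age=86400");
  // res.setHeader("Expires", new Date(Date.now() + 86_400_000).toUTCString());

  res.setHeader("Content-Type", "text/html");
  res.end(page);
});

server.listen(8080);
```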

Chrome extension - Disable Blocking of Mixed Content [duplicate]

This question already has an answer here:
Since v38, Chrome extension cannot load from HTTP URLs anymore, workaround?
So I'm building a Chrome extension that takes images from the current tab and sends them to a server that hosts them. It works great for many sites, but on major sites like Instagram and Pinterest it won't work, because the browser blocks mixed content (HTTP and HTTPS). I get the following error message in the console:
Mixed Content: The page at 'https://www.instagram.com/' was loaded over HTTPS, but requested an insecure XMLHttpRequest endpoint. This request has been blocked; the content must be served over HTTPS.
I checked this post and it doesn't appear to help me with regard to extensions specifically: https://productforums.google.com/forum/#!topic/chrome/OrwppKWbKnc
Also, I tried to add the server URL to the permissions in manifest.json and that did nothing for me, either.
My question is this: is there a way for me to have a Chrome extension that allows mixed content for just my server or is my only option to switch my server over to HTTPS?
If you send an HTTP request from a content script, it will be restricted by the same-origin policy (SOP), since the content script lives in the same context as the webpage; this is browser behavior.
You could move the HTTP request from the content script to the background page (triggered either by message passing or by some other mechanism such as a browser action), since the background page lives in the context of the extension, and the extension itself can bypass the SOP by adding the server URL to its permissions.
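A rough TypeScript sketch of that message-passing split, assuming a Manifest V2 background page and a hypothetical image host; the upload URL and message shape are placeholders, and the host must also be listed under "permissions" in manifest.json:

```typescript
// content-script.ts: runs in the page, so it cannot call the http://
// endpoint itself; it only forwards the image URL to the background page.
chrome.runtime.sendMessage(
  { type: "uploadImage", imageUrl: "https://www.instagram.com/some-image.jpg" },
  (response) => {
    console.log("Upload result:", response);
  }
);

// background.ts: runs in the extension's own context, so it may call the
// http:// host as long as that host is listed under "permissions" in
// manifest.json (e.g. "http://myimagehost.example/*", a hypothetical URL).
chrome.runtime.onMessage.addListener((message, _sender, sendResponse) => {
  if (message.type === "uploadImage") {
    fetch("http://myimagehost.example/upload", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ url: message.imageUrl }),
    })
      .then((res) => res.json())
      .then((data) => sendResponse({ ok: true, data }))
      .catch((err) => sendResponse({ ok: false, error: String(err) }));
    return true; // keep the message channel open for the async response
  }
});
```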

Chrome and Safari caching 302 redirect

Various flavors of this have been asked already, but I've yet to see a real answer for this.
We have a separate image service that our web app uses to get some of its images. The image service is well tested and is functioning properly. To make it concrete, our app is served from domain.com. The src attribute of img elements is images.domain.com/{imageId}. The image service retrieves the URL for the image and sends back an HTTP 302 redirect to the image.
The application allows users to change images. So say user 5 has image A as a profile image and decides to change it by uploading image B. When the user uploads the image, the application cache is properly invalidated and the database is updated. The application does a standard redirect-after-POST, and one of the elements in the page that the user is redirected to after changing her image is something like:
<img src="example.domain.com/5">
The problem is that Chrome never makes the call to example.domain.com/5 to retrieve the image upon the initial redirect or a regular reload of the page, it simply serves image A from the browser cache. A standalone call to example.domain.com/5 correctly returns image B, and a hard refresh or clearing Chrome's cache forces Chrome to request the image's src, which correctly returns image B. Note that I'm not talking about serving either image from the cache after getting a 304 Not Modified response, I'm talking about Chrome deciding not to visit the img src at all and just returning image A. Also, appending some unique query string to the img's src attribute fixes the problem, but that's a hack that we'd rather not have to do.
It's worth noting that Firefox was doing the same thing initially. There were no Cache-Control headers in the response initially. We added a Cache-Control: no-cache header (and tried no-store as well) to the response, and this fixed the behavior in Firefox, but Chrome and Safari still serve the outdated, cached image without making a call to the image's src.
It looks like this was a longstanding bug in Chromium (https://code.google.com/p/chromium/issues/detail?id=103458) that was allegedly fixed about six weeks ago, but we're using the latest version of Chrome.
We've looked at the answers here and here but they don't actually answer the question.
Per section 14.9.1 of RFC 2616:
If the no-cache directive does not specify a field-name, then a cache MUST NOT use the response to satisfy a subsequent request without successful revalidation with the origin server. This allows an origin server to prevent caching even by caches that have been configured to return stale responses to client requests.
Unless we're missing something or doing something wrong, it would seem that Chrome (and Safari) are not complying with the RFC's behavior for no-cache headers on 302 redirects. Has anyone experienced this before, or have any insight?
Cache-Control: no-store
I had the same maddening problem you described (slightly different in that mine was a redirect back to the login page when a cookie was missing); it worked everywhere but Safari.
In desperation, I came across this open WebKit bug and saw the fateful comment (and finally a glimmer of hope):
CachedRawResource now keeps the redirect chain, and has some trivial logic for checking correctness, but it's nowhere near complete (only checks cacheControlContainsNoStore()). And of course other resource types don't have anything.
Adding no-store to my Cache-Control header made the problem go away.
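For reference, a minimal Node-style TypeScript sketch of what the image service's redirect might look like with that header attached to the 302 itself; the image URL below is a placeholder:

```typescript
import * as http from "http";

// Sketch of the image service's redirect response: the 302 itself carries
// Cache-Control: no-store, so the browser should not reuse the old
// redirect target from its cache.
const server = http.createServer((req, res) => {
  const imageUrl = "https://cdn.example.com/images/image-b.jpg"; // hypothetical
  res.writeHead(302, {
    Location: imageUrl,
    "Cache-Control": "no-store",
  });
  res.end();
});

server.listen(8080);
```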
Please read this: https://www.hochmanconsultants.com/articles/301-versus-302.shtml
Chrome has pretty aggressive caching.
When you use a temporary redirect, you are basically saying that the actual URL is temporarily unavailable, so use this other URL instead. Chrome rightly caches the old URL because it is still valid. Try a 301 instead; then Chrome should know that the original URL is no longer valid.

Partial SSL in Chrome

Visiting my site over SSL in Chrome (12.0), I get:
Your connection to someWebsite is encrypted with 256-bit encryption. However, this page includes other resources which are not secure. These resources can be viewed by others while in transit, and can be modified by an attacker to change the behaviour of the page.
The connection uses TLS 1.0. The connection is encrypted using AES-256_CBC, with SHA1 for message authentication and DHE_RSA as the key exchange mechanism. The connection is compressed with DEFLATE.
I checked with Firebug (Net tab) and the Chrome Inspector, and all resources are accessed via https. I have already cleared the cache. Where could the problem be?
Chrome will give this error if you've visited another https page on the same domain that had mixed content; however, this should not be the problem if you've already cleared your cache.
You might want to try Ctrl-Shift-J to open the JavaScript console; it should show the insecure content.
I have the same thing. I read on the Google Chrome help site that some elements on the site, such as videos, are not encrypted. I looked in Firefox (right click -> View Page Info -> Media tab) and saw that every time I use a YouTube video in my video player I get plain http addresses like:
http://s.ytimg.com/yt/swfbin/watch_as3-vflrEm9Nq.swf and
http://img.youtube.com/vi/V6JgyNy59yA/1.jpg
I think these non-https links are causing the security message site-wide. Thus, it appears that using videos from third-party sites will always throw a security error in Google Chrome on https pages.
That's my answer, but I have no solution yet. I need to be able to share videos from YouTube in our news section, but my online store section needs to use https without scary red letters and a slash through it for my clients.
Has anyone dealt with this effectively?
Thanks
Had the same problem on my Magento site. Be sure to change all image and JS links (even in .css files) from http:// to simply // (protocol-relative). That solved it for me.
I had the same issue; my problem was that some img tags had src attributes pointing to http instead of https. It does not matter if they link to another domain, like <img src="http://otherdomain.com/image.jpg" />; it still shows the warning. As soon as I changed all internal and external img links to https, the warning disappeared.
If you check the page and it seems to have no insecure content, check to make sure that something on the page is not submitting data to an insecure location.
Content should be submitted over HTTPS, not HTTP.
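If hunting for the offending resource by hand is tedious, here is a rough TypeScript snippet (essentially plain JavaScript once the type annotation is dropped) that can be run from the console mentioned above; it lists elements and form targets that still reference plain http:// URLs:

```typescript
// Paste into the browser console on the affected https page: it lists
// elements and form targets that still reference plain http:// URLs,
// which is what triggers the mixed-content warning.
const insecure: { tag: string; url: string }[] = [];

document
  .querySelectorAll("img[src], script[src], iframe[src], audio[src], video[src], source[src]")
  .forEach((el) => {
    const url = el.getAttribute("src") ?? "";
    if (url.startsWith("http://")) {
      insecure.push({ tag: el.tagName.toLowerCase(), url });
    }
  });

document.querySelectorAll("link[href], form[action]").forEach((el) => {
  const url = el.getAttribute("href") ?? el.getAttribute("action") ?? "";
  if (url.startsWith("http://")) {
    insecure.push({ tag: el.tagName.toLowerCase(), url });
  }
});

console.table(insecure);
```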

Copying cookies cross-domain, why is IE blocking cookies other browsers are sending with the SCRIPT tag

Trying to copy a cookie from second.com to first.com, with full control of both domains.
Previously an iframe was used; however, this did not work across all browsers, as it relied on 'third-party cookies', which are hard to implement and impossible in Safari and Chrome.
The new approach uses a SCRIPT tag pointing to second.com, included in the HEAD of first.com. The server-side script is actually a piece of Java which reads the cookies sent with the request (the cookies from second.com), and the JavaScript returned executes on first.com and essentially duplicates the cookie there. This is working great in all browsers except IE, where IE appears not to be sending the second.com cookies with the SCRIPT request, so the Java is not able to pick up the cookie value from second.com.
This is surely to do with IE's security settings, as it works when I set privacy to the lowest level; but my question is, why are the cookies being blocked at all? I thought the SCRIPT tag was not subject to the same-origin policy (which AJAX and other technologies have to comply with).
Any solution to this without heading down the P3P privacy policy route?
It's definitely IE security settings. If you're attempting this, you'll need to set a P3P compact privacy policy on the page which sets the cookie on first.com, even before you've reached second.com.
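For illustration, a Node-style TypeScript sketch of a second.com endpoint adding a P3P header to the response that serves the script and reads the cookie. The CP tokens and the returned script are placeholders, not a real privacy policy, and P3P itself only matters to older IE versions:

```typescript
import * as http from "http";

// second.com endpoint included via a <script src=...> tag on first.com.
// The P3P compact-policy header is what persuades IE's default privacy
// settings to send/accept the third-party cookie; the CP tokens below
// are placeholder values.
const server = http.createServer((req, res) => {
  const cookieHeader = req.headers.cookie ?? ""; // cookies sent for second.com
  res.writeHead(200, {
    "Content-Type": "application/javascript",
    P3P: 'CP="CAO PSA OUR"',
  });
  // The returned JavaScript executes on first.com and re-creates the value there.
  res.end('document.cookie = "copied=' + encodeURIComponent(cookieHeader) + '";');
});

server.listen(8080);
```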