Chrome cache doesn't work on some PCs - google-chrome

I have a web application and two computers, both running Windows 7 with Chrome v56.0.2924.76.
The app needs several JS files. When I request these resources, both machines receive the same response headers:
Cache-Control:max-age=7200, must-revalidate
Connection:Keep-Alive
Content-Encoding:gzip
Content-Language:en-US
Content-Type:application/javascript
Date:Mon, 06 Mar 2017 09:40:11 GMT
Expires:Mon, 06 Mar 2017 11:40:11 GMT
Keep-Alive:timeout=5, max=496
Last-Modified:Wed, 22 Feb 2017 18:47:28 GMT
Transfer-Encoding:chunked
X-Powered-By:Servlet/3.0
On one desktop everything works fine and all the JS files are served from cache, while on the other desktop most are fine but some are still fetched from the remote server.
The difference is that on the non-caching desktop each request carries full request headers (Accept, Accept-Encoding, Cookie, etc.), while on the caching desktop DevTools shows only
Provisional headers are shown
Since one of them works, the remote server should be fine; the problem must be on the Chrome side. Both machines run the same version, so what else can I check?

Here is the root cause: the "not working" machine did not have the correct certificate installed, so its HTTPS connection was not secure and Chrome showed a "Not secure" warning before the URL. Chrome apparently does not cache content from insecure connections, so the files were never cached. After I installed the correct certificate, it works fine.

Related

Cache-Control response header not forcing browser to cache

I'm setting the Cache-Control and Expires HTTP headers to allow caching of static resources. This works as expected on Chrome and Firefox. However, IE11 and Safari make a fresh request for the static resources every time.
Accept-Ranges:bytes
Cache-Control:max-age=31535999
Content-Length:186824
Content-Type:application/x-font-woff
Date:Thu, 21 Apr 2016 09:54:15 GMT
ETag:W/"186824-1461231024000"
Expires:Fri, 21 Apr 2017 09:54:15 GMT
Last-Modified:Thu, 21 Apr 2016 09:30:24 GMT
Server:Apache-Coyote/1.1
Do I need to set any special headers for IE and Safari? I'm using org.apache.catalina.filters.ExpiresFilter to set the response headers.
Turns out this was because of the "Always refresh from server" option, which is turned on by default when the IE Developer Tools are opened.

Why isn't the browser loading the CDN file from cache?

Here is a very simple example to illustrate my question, using jQuery from a CDN to modify the page:
<html>
<body>
<p>Hello Dean!</p>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js"></script>
<script>$("p").html("Hello, Gabe!")</script>
</body>
</html>
When you load this page with an internet connection, the page displays "Hello, Gabe!". When I then turn off the internet connection and reload, the page displays "Hello Dean!" with an error: jQuery is not available.
My understanding is that CDNs set a long Cache-Control and Expires in the response headers, which I take to mean that the browser caches the file locally.
$ curl -s -D - https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js | head
HTTP/1.1 200 OK
Server: cloudflare-nginx
Date: Fri, 17 Apr 2015 16:30:33 GMT
Content-Type: application/javascript
Transfer-Encoding: chunked
Connection: keep-alive
Last-Modified: Thu, 18 Dec 2014 17:00:38 GMT
Expires: Wed, 06 Apr 2016 16:30:33 GMT
Cache-Control: public, max-age=30672000
But this does not appear to be happening. Can someone please explain what is going on? Also, how can I get the browser to use the copy of jQuery it has cached somewhere?
This question came up because we want to use CDNs to serve external libraries, but also want to be able to develop the page offline -- like on an airplane.
I get similar behavior in both Chrome and Firefox.
This has nothing to do with the CDN. When the browser encounters a script tag, it requests the file from the server, whether it's hosted on a CDN or on your own server. If the browser has previously loaded it from the same address, the server tells it whether the file should be re-downloaded or not (by sending a 304 HTTP status code).
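That revalidation step can be sketched as a small decision function. This is an illustrative sketch of what a server does with a conditional GET, not a real framework API; the function and parameter names are made up:

```javascript
// Decide whether a conditional GET can be answered with 304 Not Modified.
// lastModified: the resource's Last-Modified time (ms since epoch)
// reqHeaders:   incoming request headers, with lowercased keys
function revalidate(lastModified, reqHeaders) {
  const since = reqHeaders['if-modified-since'];
  if (since !== undefined) {
    const sinceMs = Date.parse(since);
    // HTTP dates have one-second resolution, so compare at that granularity.
    if (!Number.isNaN(sinceMs) && Math.floor(lastModified / 1000) * 1000 <= sinceMs) {
      return { status: 304, body: null };          // client's cached copy is still good
    }
  }
  return { status: 200, body: 'resource bytes' };  // send the full response
}

const mtime = Date.parse('Thu, 18 Dec 2014 17:00:38 GMT');
console.log(revalidate(mtime, { 'if-modified-since': 'Thu, 18 Dec 2014 17:00:38 GMT' }).status); // 304
console.log(revalidate(mtime, {}).status); // 200
```

The point is that even a 304 still requires a round trip to the server, which is exactly what fails on an airplane.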
What you are probably looking for is to cache your application for offline use. That is possible with an HTML5 cache manifest file: you create a file listing every resource that needs to be cached for explicit offline use.
Since the previous answer recommended using a cache manifest file, I just want to make people aware that this feature is being dropped from the web standards.
Info available from Mozilla:
https://developer.mozilla.org/en-US/docs/Web/HTML/Using_the_application_cache
It's recommended to use service workers instead of the cache manifest.
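The cache-first strategy a service worker typically implements can be sketched in plain JavaScript, with a Map standing in for the browser's Cache API and a stubbed fetch function. The names here are illustrative, not a real API:

```javascript
// Cache-first lookup: return the cached copy if present,
// otherwise fetch from the network and store the result.
async function cacheFirst(cache, url, fetchFn) {
  if (cache.has(url)) return cache.get(url);   // offline-safe: no network hit
  const body = await fetchFn(url);             // fall back to the network
  cache.set(url, body);                        // remember it for next time
  return body;
}

// Usage with a stubbed network: the second lookup succeeds even though
// the "network" is down, because the first call populated the cache.
const cache = new Map();
cacheFirst(cache, '/jquery.min.js', async () => 'jquery source')
  .then(() => cacheFirst(cache, '/jquery.min.js', () => { throw new Error('offline'); }))
  .then((body) => console.log(body));
```

In a real service worker the same shape appears in a `fetch` event handler using `caches.match()` and `cache.put()`.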

Why does Chrome not request a more recent version of a cached file?

I have a caching issue. Chrome does not always load newer versions of site assets, most often JavaScript files loaded by Require.js. Right now I've been having this problem for over 24 hours with a particular file.
If I load the page with DevTools (Network tab) open, the offending files typically show an HTTP 200 response, but the "Size" column shows "(from cache)". The headers detail shows "Provisional headers are shown". Wireshark confirms that the file is indeed not requested from the server.
Chrome shows the Last-Modified date of the file as Sat, 06 Dec 2014 01:27:55 GMT, but my raw request to the server below clearly indicates the file has changed much more recently.
If I do a raw request myself I don't see anything in the headers returned by the server that should cause this problem:
GET /js/path/to/file.js HTTP/1.1
Host: static.mydomain.com
User-Agent: Matt
HTTP/1.1 200 OK
Access-Control-Allow-Origin: *
Vary: Accept-Encoding
Content-Type: application/javascript
Accept-Ranges: bytes
ETag: "4203477418"
Last-Modified: Fri, 16 Jan 2015 18:28:30 GMT
Content-Length: 5704
Date: Fri, 16 Jan 2015 21:05:06 GMT
Server: lighttpd/1.4.33
.... data here ...
The issue has been reported by Chrome users on multiple OSes with different versions of Chrome, but I do not typically receive reports of caching issues on other browsers. (Right now I'm on "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.71 Safari/537.36")
Edit:
The problem seems most pronounced with files loaded by Require.js, though I have encountered it with JavaScript directly referenced in the page as well.
What am I missing here? Why won't Chrome check for a new version of the file?
As it turns out, browsers have poorly documented behavior regarding caching when no Cache-Control header is specified in the server response (well, at least when no caching behavior is specified). In general, it seems that in this case the browser determines how long to cache the item based on the file's Last-Modified date (if present in the response), the current date, and a browser-specific heuristic.
See: https://webmasters.stackexchange.com/questions/53942/why-is-this-response-being-cached
Sadly, Google's official page on HTTP caching does not mention what happens if the header is not set: https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/http-caching
Edit:
I ran across some more specific information about the heuristics used, here: What heuristics do browsers use to cache resources not explicitly set to be cachable?
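To make the heuristic concrete: RFC 7234 suggests (without mandating it) using a fraction of the time since Last-Modified as the freshness lifetime, and 10% is the commonly cited factor. A sketch, with the exact factor being implementation-defined:

```javascript
// Heuristic freshness lifetime when no Cache-Control/Expires is present.
// RFC 7234 suggests a fraction of (Date - Last-Modified); browsers are
// commonly said to use 10%, but the factor is implementation-defined.
function heuristicFreshnessMs(dateHeader, lastModifiedHeader, factor = 0.10) {
  const responseTime = Date.parse(dateHeader);
  const lastModified = Date.parse(lastModifiedHeader);
  return Math.max(0, (responseTime - lastModified) * factor);
}

// With the headers from this question (~42 days since Last-Modified),
// Chrome could consider the file fresh for roughly 4 days without
// ever revalidating -- consistent with the 24+ hours observed.
const ms = heuristicFreshnessMs('Fri, 16 Jan 2015 21:05:06 GMT',
                                'Sat, 06 Dec 2014 01:27:55 GMT');
console.log((ms / 86400000).toFixed(1)); // freshness lifetime in days
```

The practical fix is to always send an explicit `Cache-Control` so no heuristic applies.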

iOS 7 memory issue loading a JSON file via AngularJS (Ionic) httpPromise.then

I have an odd issue. I am using AngularJS (Ionic) to load an external JSON file via an httpPromise.
Everything had been working correctly until yesterday, when the remote files were moved to another host.
Now, on an iPhone 4 running iOS 7, the app tries to load the file, can't, and crashes with memory-usage issues. Inspecting it in Xcode, memory quickly climbs to over 300 MB and then the app crashes. It does the same thing on two devices, but runs fine on other phones, in emulators, etc.
Now if I host the file on another server it works as expected.
the different response headers are:
the one that fails:
Accept-Ranges bytes
Connection close
Content-Length 721255
Content-Type application/json
Date Thu, 11 Dec 2014 06:04:15 GMT
Last-Modified Thu, 11 Dec 2014 05:12:57 GMT
Server LiteSpeed
Vary User-Agent
Working host:
Accept-Ranges bytes
Connection keep-alive
Content-Encoding gzip
Content-Type application/json
Date Thu, 11 Dec 2014 06:05:01 GMT
Last-Modified Thu, 11 Dec 2014 03:29:48 GMT
Server nginx/1.6.2
Transfer-Encoding chunked
Vary Accept-Encoding,User-Agent
Code used to get the JSON file:
var deferred = $q.defer(),
    httpPromise = $http.get('http://someurl.com.au/deals.json');
httpPromise.then(function success(response) {
    // ... use response.data ...
});
So, after all of that, my question is: why would the JSON file not load, or return an error of some sort, but instead cause such a memory spike?
The only major difference I see between the servers is the connection configuration: the working host uses Connection: keep-alive, while the failing one uses Connection: close.
Thoughts?
Additionally, I've just tried it over 3G and it works fine, but over Wi-Fi it doesn't.
I think you need to focus on Content-Encoding: gzip, which is probably what is saving you from memory issues on the working host.

Chrome over-caching .json files served from nginx

I'm using nginx and Dojo to build an embedded UI driven by a set of JSON files. Our primary target browser is Chrome, but it should work with all modern browsers.
Changing the JSON files can change the UI drastically, and I use this to give different presentations to different users. See my previous question for the details (Configure nginx to return different files to different authenticated users with the same URI), but basically my nginx configuration is such that the same URI with different users can yield different content.
This all works very well, except when someone switches to a different user. Some browsers will grab those JSON files from their own internal cache without even checking with the server, which leaves the UI displaying the previous user's presentation. Reloading the page fixes it, but boy! would I rather the right thing happened automatically.
The obvious solution is to use the various cache headers, but they don't appear to help. I'm using the following nginx directives:
expires epoch;
etag off;
if_modified_since off;
add_header Last-Modified "";
... which yields the following response headers:
HTTP/1.1 200 OK
Server: nginx/1.4.1
Date: Wed, 24 Sep 2014 16:58:32 GMT
Content-Type: application/octet-stream
Content-Length: 1116
Connection: keep-alive
Expires: Thu, 01 Jan 1970 00:00:01 GMT
Cache-Control: no-cache
Accept-Ranges: bytes
This looks pretty conclusive to me, but the problem still occurs with Chrome 36 for OS X and Opera 24 for OS X (although Firefox 29 and 32 do the right thing). Chrome is content to grab files from its cache without even referring to the server.
Here's a detailed example, with headers pulled from Chrome's Network debug panel. The first time Chrome fetches /app/resources/states.json, Chrome reports
Remote Address:75.144.159.89:8765
Request URL:http://XXXXXXXXXXXXXXX/app/resources/screens.json
Request Method:GET
Status Code:200 OK
with request headers:
Accept:*/*
Accept-Encoding:gzip,deflate,sdch
Accept-Language:en-US,en;q=0.8
Authorization:Basic dm9sdGFpcndlYjp2b2x0YWly
Cache-Control:max-age=0
Connection:keep-alive
Content-Type:application/x-www-form-urlencoded
DNT:1
Host:suitable.dyndns.org:8765
Referer:http://XXXXXXXXXXXXXXXXXXXXXX/
User-Agent:Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_8) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/36.0.1985.125 Safari/537.36
X-Requested-With:XMLHttpRequest
and response headers:
Accept-Ranges:bytes
Cache-Control:no-cache
Connection:keep-alive
Content-Length:2369
Content-Type:application/octet-stream
Date:Wed, 24 Sep 2014 17:19:46 GMT
Expires:Thu, 01 Jan 1970 00:00:01 GMT
Server:nginx/1.4.1
Again, all fine and good. But, when I change the user (by restarting Chrome and then reloading the parent page), I get the following Chrome report:
Remote Address:75.144.159.89:8765
Request URL:http://suitable.dyndns.org:8765/app/resources/states.json
Request Method:GET
Status Code:200 OK (from cache)
with no apparent contact to the server.
This doesn't seem to happen with all files. A few .js files are cached, most are not; none of the .css files seem to be cached; all the .html files are cached, and all of the .json files are cached.
How can I tell the browser (I'm looking at you, Chrome!) that these files are good at the moment it requests them, but will never again be good? Is this a Chrome bug? (If so, it's strange that Opera also shows the problem.)
I believe I've found the problem. Apparently "Cache-Control: no-cache" is insufficient to tell the browser to, um, not cache the data. I added "no-store":
Cache-Control:no-store, no-cache
and that did the trick. No more caching by Chrome or Opera.
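For completeness, here is a sketch of the nginx directives from the question with that fix applied. Note the caveat that `expires epoch` itself already emits a `Cache-Control: no-cache` header, so depending on your nginx version you may end up with two Cache-Control headers and may prefer to drop `expires` and set the header once:

```
expires epoch;
etag off;
if_modified_since off;
add_header Last-Modified "";
add_header Cache-Control "no-store, no-cache";
```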
I had the same problem, with JSON being cached...
If you control the client application-code, a possible workaround is to just add a random-value query-parameter at the end of the URL.
So instead of calling:
http://XXXXXXXXXXXXXXX/app/resources/screens.json
you call, for example:
http://XXXXXXXXXXXXXXX/app/resources/screens.json?rand=rrrrrrrrrr
where rrrrrrrrrr is some random value that is different on each call.
Then the browser will not be able to reuse any cached values.
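That workaround can be sketched in plain JavaScript; the parameter name `rand` is just the one used above, and any name the server ignores will do:

```javascript
// Append a cache-busting query parameter so every request has a unique URL,
// which the browser cannot match against any cached entry.
function cacheBust(url) {
  const sep = url.includes('?') ? '&' : '?';
  return `${url}${sep}rand=${Date.now()}-${Math.random().toString(36).slice(2)}`;
}

console.log(cacheBust('http://example.com/app/resources/screens.json'));
```

Bear in mind this defeats caching entirely: that is exactly what you want for per-user JSON like this, but it wastes bandwidth if applied to genuinely static assets.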