I'm using Angular with i18n translations in JSON files such as de.json and en.json. In my production environment (nginx) I have the problem that these JSON files are cached by the web browser. After an upgrade, Chrome will not download the new version of the current JSON file even though the Date header has changed.
Request Information (Chrome):
Request URL: https://[my-site]/assets/i18n/de.json
Request Method: GET
Status Code: 200 (from disk cache)
Remote Address: X.X.X.X:443
Referrer Policy: no-referrer-when-downgrade
Response Headers:
content-encoding: gzip
content-type: application/json
date: Fri, 15 Feb 2019 09:04:42 GMT
etag: W/"5c62bf4d-2aea"
last-modified: Tue, 12 Feb 2019 12:42:53 GMT
server: nginx/1.14.0 (Ubuntu)
status: 304
Does anyone have experience with this problem and can help me?
Not specifically a fix for Angular/nginx, but a practice I often use is to append a query-string parameter to the resource when you load it. For me, this is typically derived from the version number of the .js file / application, e.g. using it as a seed for an RNG.
So, instead of:
<script src="/assets/de.json"></script>
use:
<script src="/assets/de.json?_=12345"></script>
Bonus points: in your Angular application, you can keep track of which version of the assets you want to include, meaning that you can release new asset files without having clients immediately update to them if they already have a version in local cache (although note that new clients will get the new version regardless).
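To illustrate the idea, here is a minimal, framework-agnostic sketch (APP_VERSION and loadTranslations are hypothetical names; in a real Angular app the version string would typically be stamped in by the build and appended wherever the translation loader builds its URLs):

// Hypothetical: APP_VERSION is assumed to be injected by your build pipeline.
var APP_VERSION = '1.4.2';

function loadTranslations(lang) {
  // The ?v= parameter changes on every release, so the browser sees a
  // brand-new URL and bypasses its cached copy of the old file.
  return fetch('/assets/i18n/' + lang + '.json?v=' + APP_VERSION)
    .then(function (res) { return res.json(); });
}

loadTranslations('de').then(function (translations) { /* use them */ });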
Related
I have a React app and save the bundle.js on a CDN (or S3, for this example).
On save I run gzip -9.
On upload to the CDN / S3 I add the header Content-Encoding: gzip.
Now each time a browser / HTTP client downloads the bundle, it gets:
curl -I https://cdn.example.com/bundle.min.js
HTTP/2 200
content-type: application/javascript
content-length: 3304735
date: Wed, 27 Feb 2019 22:27:19 GMT
last-modified: Wed, 27 Feb 2019 22:26:53 GMT
content-encoding: gzip
accept-ranges: bytes
This works fine when I test it in a browser. My only concern is that we now only store a gzipped version of the JS bundle, and users will get it regardless of whether they send Accept-Encoding: gzip in the request.
I can't think of any issues this will cause for browsers, but I might be missing something.
Is it bad practice to "enforce" gzip in the response for the bundle.js file?
It's several months late, but the answer may still be relevant to someone trying the same thing.
Pre-compression with high compression settings can help save a few more percentage points of bandwidth on static resources. However, forcing a single encoding may create problems for some users. As per a study conducted in 2010, ~15% of users with gzip-capable browsers were not sending an appropriate Accept-Encoding request header, the reason being anti-virus software or intermediaries/proxies stripping the Accept-Encoding header to force the server to send the content in plain text.
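To cover those users while still serving the pre-compressed file to everyone else, the origin can negotiate per request. A minimal sketch, assuming a Node/Express origin and that the gzip -9 output is stored as bundle.min.js.gz (both names are assumptions, not part of the original setup):

var express = require('express');
var fs = require('fs');
var zlib = require('zlib');

var app = express();

app.get('/bundle.min.js', function (req, res) {
  res.setHeader('Content-Type', 'application/javascript');
  res.setHeader('Vary', 'Accept-Encoding');
  var gz = fs.createReadStream('./bundle.min.js.gz');
  if (/\bgzip\b/.test(req.headers['accept-encoding'] || '')) {
    // The client advertises gzip support: send the pre-compressed bytes as-is.
    res.setHeader('Content-Encoding', 'gzip');
    gz.pipe(res);
  } else {
    // Accept-Encoding is missing or was stripped: decompress on the fly.
    gz.pipe(zlib.createGunzip()).pipe(res);
  }
});

app.listen(3000);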
If you need a CDN that covers this gap, PageCDN serves exactly the same purpose, but offers superior brotli-11 compression and falls back to gzip for browsers that do not support brotli. You do not need to connect to S3 or any external storage. Just connect it to your website or GitHub, or manually upload files to the CDN, and configure the compression level for the files.
I'm new to this with very little background, so bear with me. I want to debug an issue where loading a page from a reverse-proxied site terminates prematurely when it requests a file like filename.json. I noticed the request header has Content-Type: application/x-www-form-urlencoded, and the response headers are:
$WSEP:
Cache-Control:no-cache
Connection:Keep-Alive
Content-Encoding:gzip
Content-Language:en-US
Content-Length:55
Content-Type:text/html;charset=UTF-8
Date:Sun, 12 Jun 2016 04:54:18 GMT
Expires:Thu, 01 Jan 1970 00:00:00 GMT
Keep-Alive:timeout=5, max=97
Server:Apache
Vary:Accept-Encoding
X-Cnection:Close
but I'm wondering why the Content-Type is text/html when the file is JSON. So I tried manipulating the response to application/json, and I even tried, instead of downloading the file from the remote server, serving it locally from my server's /var/www/html by creating a dummy JSON file. I was thinking that because this is a JSON file, the Content-Type should be application/json, not text/html. But the issue still exists: the page just stops loading when it reaches the request for that JSON file.
Now my question is: how can I make sure that the request for that JSON file is the culprit? Or, more appropriately, how can I debug mod_perl? I tried adding
use warnings;
in the code and
PerlWarn On
in the Apache config, but I still don't see any warnings or errors that would guide my debugging.
Can you give me advice on how to find the fault?
By the way, the request for that JSON file returns a 404, but I assume the proxy server should ignore that and continue downloading the other files.
Here is a very simple example to illustrate my question, using jQuery from a CDN to modify the page:
<html>
<body>
<p>Hello Dean!</p>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js"></script>
<script>$("p").html("Hello, Gabe!")</script>
</body>
</html>
When you load this page with an internet connection, the page displays "Hello, Gabe!". When I then turn off the internet connection and reload, the page displays "Hello Dean!" with an error: jQuery is not available.
My understanding is that CDNs set a long Cache-Control and Expires in the response headers, which I understand to mean that the browser caches the file locally.
$ curl -s -D - https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js | head
HTTP/1.1 200 OK
Server: cloudflare-nginx
Date: Fri, 17 Apr 2015 16:30:33 GMT
Content-Type: application/javascript
Transfer-Encoding: chunked
Connection: keep-alive
Last-Modified: Thu, 18 Dec 2014 17:00:38 GMT
Expires: Wed, 06 Apr 2016 16:30:33 GMT
Cache-Control: public, max-age=30672000
But this does not appear to be happening. Can someone please explain what is going on? Also: how can I get the browser to use the copy of jQuery it has cached somewhere?
This question came up because we want to use CDNs to serve external libraries, but also want to be able to develop the page offline, like on an airplane.
I get similar behavior using Chrome and Firefox.
This has nothing to do with the CDN. When the browser encounters a script tag, it requests the script from the server, whether it's hosted on a CDN or on your own server. If the browser has previously loaded it from the same address, the server tells it whether the file should be reloaded or not (by sending a 304 HTTP status code).
What you are probably looking for is to cache your application for offline use. That's possible with an HTML5 cache manifest file: you create a file listing all the files that need to be cached for explicit offline use.
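A minimal sketch of such a manifest (the file name offline.appcache is an assumption; the file must be served with the MIME type text/cache-manifest and referenced as <html manifest="offline.appcache">):

CACHE MANIFEST
# v1 - change this comment to force clients to re-download the entries below
https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js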
Since the previous answer recommended using a cache manifest file, I just wanted to make people aware that this feature is being dropped from the web standards.
Info available from Mozilla:
https://developer.mozilla.org/en-US/docs/Web/HTML/Using_the_application_cache
It's recommended to use service workers instead of the cache manifest.
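For completeness, here is a minimal service-worker sketch that pre-caches the CDN copy of jQuery and answers from the cache when the network is unavailable (the file name sw.js and the cache name are assumptions):

// sw.js
var CACHE = 'offline-libs-v1';
var LIBS = ['https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js'];

self.addEventListener('install', function (event) {
  // Download and store the libraries while the network is still available.
  event.waitUntil(caches.open(CACHE).then(function (cache) {
    return cache.addAll(LIBS);
  }));
});

self.addEventListener('fetch', function (event) {
  // Answer from the cache first; fall back to the network for everything else.
  event.respondWith(
    caches.match(event.request).then(function (cached) {
      return cached || fetch(event.request);
    })
  );
});

Register it once from the page with navigator.serviceWorker.register('/sw.js'), and subsequent offline loads can resolve jQuery from the cache.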
I have an odd issue. I am using AngularJS (Ionic) to load an external JSON file via an $http promise.
Everything had been working correctly until yesterday, when the remote files were moved to another host.
Now the problem is that on an iPhone 4 running iOS 7 the app tries to load the file but can't, and crashes out with memory-usage issues. Inspecting it with Xcode, memory usage quickly climbs to over 300 MB and then the app crashes. It does the same thing on two devices, but runs fine on other phones, emulators, etc.
If I host the file on another server, it works as expected.
The differing response headers are:
The one that fails:
Accept-Ranges bytes
Connection close
Content-Length 721255
Content-Type application/json
Date Thu, 11 Dec 2014 06:04:15 GMT
Last-Modified Thu, 11 Dec 2014 05:12:57 GMT
Server LiteSpeed
Vary User-Agent
Working host:
Accept-Ranges bytes
Connection keep-alive
Content-Encoding gzip
Content-Type application/json
Date Thu, 11 Dec 2014 06:05:01 GMT
Last-Modified Thu, 11 Dec 2014 03:29:48 GMT
Server nginx/1.6.2
Transfer-Encoding chunked
Vary Accept-Encoding,User-Agent
Code used to get the JSON file:
var deferred = $q.defer(),
    httpPromise = $http.get('http://someurl.com.au/deals.json');
httpPromise.then(function success(response) {
    deferred.resolve(response.data); // resolve with the parsed JSON payload
}, deferred.reject);                 // propagate HTTP errors to the caller
So after all of that, my question is: why would the JSON file not load, or return an error of some sort, but instead cause such a memory spike?
The only main difference I see between the servers is the connection configuration: the working one uses Connection: keep-alive and the one that fails uses Connection: close.
Thoughts?
Additionally, I've just tried it on 3G and it works fine, but over Wi-Fi it doesn't. Why?
I think you need to focus on the Content-Encoding: gzip header, which is probably what is saving you from memory issues on the working host.
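One quick way to verify that theory is to request the file from each host with compression explicitly allowed and compare the downloaded sizes, e.g.:

curl -H "Accept-Encoding: gzip" -s -o /dev/null -w "%{size_download}\n" http://someurl.com.au/deals.json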
I can't seem to get Chrome to play videos with the HTML5 video tag when I host them on a Rackspace Cloud Files server.
It works perfectly on the regular hosting, but as soon as I link the video with the Rackspace CDN URL, Chrome freezes (a total freeze: the website UI is completely blocked, and after a while Chrome pops up a message saying "The following page has become unresponsive bla bla bla").
The video file is fine, as it's the same one as on the regular hosting.
I did a bit of spying on the requests, and I initially thought the problem was that the webm files were served by default with the application/octet-stream MIME type. I lodged a ticket with Rackspace and they gave me a way to force the MIME type when uploading the file. I did that, and the file is now correctly sent as video/webm, but Chrome still freezes.
Any idea what could be going wrong here?
EDIT: using iheartvideo, loading the video from Rackspace triggers a MEDIA_ERR_SRC_NOT_SUPPORTED. The same video off a local web server works totally fine (??)
EDIT 2: Happens on both Mac and Windows with the latest mainstream Chrome.
EDIT 3: curl -I results:
Rackspace (no worky):
HTTP/1.1 200 OK
Server: nginx/0.7.65
Content-Type: video/webm
Last-Modified: Thu, 24 Feb 2011 23:45:12 GMT
ETag: 7029f83b241aa691859012dfa047e20d
Content-Length: 20173074
Cache-Control: public, max-age=900
Expires: Fri, 25 Feb 2011 01:32:11 GMT
Date: Fri, 25 Feb 2011 01:17:11 GMT
Connection: keep-alive
Web Server (worky)
HTTP/1.1 200 OK
Date: Fri, 25 Feb 2011 01:17:51 GMT
Server: Apache
Last-Modified: Thu, 24 Feb 2011 03:56:26 GMT
ETag: "11a0b47-133d112-49cff32940e80"
Accept-Ranges: bytes
Content-Length: 20173074
Content-Type: text/plain
EDIT 4: For those interested, this is what the Rackspace crew told me to do to set a webm content type on a file:
The file browser is not smart enough to determine the content type video/webm. Unfortunately, there is not a way to change the content type of a file that has already been uploaded. You'll need to use one of the APIs to re-upload your files with the correct content type.
You can also use curl from a Linux/macOS command line if available. Using your username and API key, run this command...
curl -I -X GET -H "X-Auth-User: USERNAME" -H "X-Auth-Key: API_KEY" https://auth.api.rackspacecloud.com/v1.0
From the output there are two important values:
X-Storage-Url: https://storage101.......
X-Storage-Token: (a long hash)
You can upload the files with:
curl -X PUT -T test.webm -H "Content-Type: video/webm" -H "Content-Length: FILESIZEINBYTE" -H "X-Auth-Token: TOKEN FROM RESPONSE ABOVE" https://STORAGE URL FROM RESPONSE ABOVE/test.webm
You must specify the content type, and you must give the correct length in bytes of what is being uploaded. If not, you will get an invalid request error.
I work quite a bit with the Rackspace API.
Their API actually allows you to set a container as streaming enabled. My first instinct tells me that you haven't done this.
I stream various file types and they all work an absolute treat.
There is more information about CDN Streaming Enabled containers here:
http://docs.rackspace.com/files/api/v1/cf-devguide/content/Streaming-CDN-Enabled_Containers-d1f3721.html
Hopefully this helps, but if not, let me know and I don't mind putting some PHP example code together to help you along.
It's all quite easy, but getting your head around the various API operations that Rackspace has implemented can sometimes be a daunting task.
I don't have a concrete answer, just some thoughts:
What about other browsers?
It works on the web server where the content type is text/plain, so why force video/webm?
Can Rackspace provide you (or can you find on their site or someone else's) some sample content that does play, so you can inspect it?
You could try the free trial from Brightcove or BitGravity and see if that works...