I have a React app and save the bundle.js on a CDN (S3 in this example).
On save I run gzip -9.
On upload to the CDN / S3 I add the header Content-Encoding: gzip.
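For reference, here is a minimal sketch of those two steps with the AWS SDK for JavaScript (v2) in Node.js; the bucket name and local path are placeholders, not part of the original setup:

const fs = require('fs');
const zlib = require('zlib');
const AWS = require('aws-sdk');

// Compress at the highest level, then upload with Content-Type and
// Content-Encoding set on the object so S3 / the CDN serves them back.
const gzipped = zlib.gzipSync(fs.readFileSync('dist/bundle.min.js'), { level: 9 });

new AWS.S3().putObject({
  Bucket: 'my-cdn-bucket',               // placeholder bucket name
  Key: 'bundle.min.js',
  Body: gzipped,
  ContentType: 'application/javascript',
  ContentEncoding: 'gzip'                // served back as Content-Encoding: gzip
}, (err) => {
  if (err) throw err;
  console.log('uploaded');
});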
Now each time a browser / HTTP client downloads the bundle, it gets:
curl -I https://cdn.example.com/bundle.min.js
HTTP/2 200
content-type: application/javascript
content-length: 3304735
date: Wed, 27 Feb 2019 22:27:19 GMT
last-modified: Wed, 27 Feb 2019 22:26:53 GMT
content-encoding: gzip
accept-ranges: bytes
This works fine when I test it in a browser. My only concern is that we now store only a gzipped version of the JS bundle, and users will get it regardless of whether they send Accept-Encoding: gzip in the request.
I can't think of any issues this will cause for browsers, but I might be missing something.
Is it bad practice to "enforce" gzip in the response for the bundle.js file?
It's several months late, but the answer may still be relevant to someone trying the same thing.
Pre-compression with high compression settings can help save a few more percentage points of bandwidth on static resources. However, forcing a single encoding may create problems for some users. As per a study conducted in 2010, ~15% of users with gzip-capable browsers were not sending an appropriate Accept-Encoding request header, the reason being anti-virus software or intermediaries/proxies stripping the Accept-Encoding header to force the server to send the content in plain-text form.
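You can see what such a client ends up with by sending no Accept-Encoding header at all. Here is a quick check, a sketch only, using Node's https module and the example URL from the question; it looks for the gzip magic bytes in the body:

const https = require('https');

// Node's https.get sends no Accept-Encoding header by default and does not
// decompress responses, so this mimics a client whose header was stripped.
https.get('https://cdn.example.com/bundle.min.js', (res) => {
  res.once('data', (chunk) => {
    const isGzip = chunk[0] === 0x1f && chunk[1] === 0x8b; // gzip magic bytes
    console.log('Content-Encoding:', res.headers['content-encoding']);
    console.log('Body is still gzip-compressed:', isGzip);
    res.destroy(); // the first chunk is enough for the check
  });
}).on('error', console.error);

A client (or intermediary) that cannot decompress on its own is left with a garbled body, which is exactly the ~15% case described above.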
If you need a CDN that covers this gap, PageCDN serves exactly the same purpose but offers superior Brotli-11 compression and falls back to gzip for browsers that do not support Brotli. You do not need to connect to S3 or any external storage: just connect it to your website or GitHub, or manually upload files to the CDN, and configure the compression level per file.
Here is a very simple example to illustrate my question, using jQuery from a CDN to modify the page:
<html>
<body>
<p>Hello Dean!</p>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js"></script>
<script>$("p").html("Hello, Gabe!")</script>
</body>
</html>
When you load this page with an internet connection, the page displays "Hello, Gabe!". When I then turn off the internet connection, the page displays "Hello Dean!" with an error: jQuery is not available.
My understanding is that CDNs set a long Cache-Control and Expires in the response headers, which I take to mean that the browser caches the file locally.
$ curl -s -D - https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js | head
HTTP/1.1 200 OK
Server: cloudflare-nginx
Date: Fri, 17 Apr 2015 16:30:33 GMT
Content-Type: application/javascript
Transfer-Encoding: chunked
Connection: keep-alive
Last-Modified: Thu, 18 Dec 2014 17:00:38 GMT
Expires: Wed, 06 Apr 2016 16:30:33 GMT
Cache-Control: public, max-age=30672000
But this does not appear to be happening. Can someone please explain what is going on? Also, how can I get the browser to use the copy of jQuery that should be in the cache somewhere?
This question came up because we want to use CDNs to serve external libraries, but we also want to be able to develop the page offline, like on an airplane.
I get similar behavior using Chrome and Firefox.
This has nothing to do with the CDN. When the browser encounters a script tag, it requests the file from the server, whether it's hosted on a CDN or on your own server. If the browser has previously loaded it from the same address, the server tells it whether the file needs to be reloaded or not (by sending a 304 HTTP status code).
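To make that revalidation concrete, here is a small sketch of a conditional request against the jQuery URL from the question, reusing the Last-Modified value from the curl output above (assuming Node.js, and assuming the server honors If-Modified-Since):

const https = require('https');

// Ask the server whether the file has changed since the cached copy's
// Last-Modified date; if not, it answers 304 with no body.
https.get('https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js', {
  headers: { 'If-Modified-Since': 'Thu, 18 Dec 2014 17:00:38 GMT' }
}, (res) => {
  console.log(res.statusCode); // 304 Not Modified when the cached copy is still current
  res.resume();
});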
What you are probably looking for is to cache your application for offline use. That's possible with an HTML5 cache manifest file: you create a manifest listing all the files that need to be cached for explicit offline use.
Since the previous answer recommended using a cache manifest file, I just wanted to make people aware that this feature is being dropped from the web standards.
Info available from Mozilla:
https://developer.mozilla.org/en-US/docs/Web/HTML/Using_the_application_cache
It's recommended to use Service workers instead of the cache manifest.
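As a rough illustration of that approach, a minimal cache-first service worker for the jQuery URL from the question could look like the sketch below (the cache name and file list are made up for the example):

// sw.js -- cache jQuery at install time, then answer fetches from the cache first
const CACHE = 'offline-libs-v1';
const LIBS = ['https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.3/jquery.min.js'];

self.addEventListener('install', (event) => {
  event.waitUntil(caches.open(CACHE).then((cache) => cache.addAll(LIBS)));
});

self.addEventListener('fetch', (event) => {
  event.respondWith(
    caches.match(event.request).then((cached) => cached || fetch(event.request))
  );
});

Register it from the page with navigator.serviceWorker.register('/sw.js'); after one online visit, the library is served from the cache even with the connection turned off.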
I have a caching issue. Chrome does not always load newer versions of site assets, most often JavaScript files loaded by Require.js. Right now I've been having this problem for over 24 hours with a particular file.
If I load the page with devtools (network tab) open, the offending files typically show a HTTP 200 response, but in the "Size" column it shows "(from cache)". In the Headers details it shows "Provisional headers are shown". Wireshark shows that the file is indeed not requested from the server.
Chrome shows the Last-Modified date of the file as Sat, 06 Dec 2014 01:27:55 GMT, but my raw request to the server below clearly indicates the file has changed much more recently.
If I do a raw request myself I don't see anything in the headers returned by the server that should cause this problem:
GET /js/path/to/file.js HTTP/1.1
Host: static.mydomain.com
User-Agent: Matt
HTTP/1.1 200 OK
Access-Control-Allow-Origin: *
Vary: Accept-Encoding
Content-Type: application/javascript
Accept-Ranges: bytes
ETag: "4203477418"
Last-Modified: Fri, 16 Jan 2015 18:28:30 GMT
Content-Length: 5704
Date: Fri, 16 Jan 2015 21:05:06 GMT
Server: lighttpd/1.4.33
.... data here ...
The issue has been reported by Chrome users on multiple OSes with different versions of Chrome, but I do not typically receive reports of caching issues on other browsers. (Right now I'm on "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.71 Safari/537.36")
Edit:
The problem seems to be most pronounced with files loaded by Require.js, though I have encountered it with JavaScript directly referenced in the page as well.
What am I missing here? Why won't Chrome check for a new version of the file?
As it turns out, browsers have poorly documented behavior regarding caching when the Cache-Control header is not specified in the server response (well, at least when no caching behavior is specified). In general, it seems that in this case a browser determines how long to cache the item for based on the file's last modified date (if declared in the response), the current date, and ????
See: https://webmasters.stackexchange.com/questions/53942/why-is-this-response-being-cached
Sadly, Google's official page on HTTP caching does not mention what happens if the header is not set: https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/http-caching
Edit:
I ran across some more specific information about the heuristics used, here: What heuristics do browsers use to cache resources not explicitly set to be cachable?
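The heuristic most commonly cited (and suggested by RFC 7234) is to treat the resource as fresh for some fraction, typically 10%, of the time since Last-Modified. A back-of-the-envelope sketch with the dates from this question, where the fetch time is an assumption since we don't know when Chrome originally cached the file:

// The 10%-of-age heuristic; exact behaviour still varies between browsers.
const lastModified = new Date('Sat, 06 Dec 2014 01:27:55 GMT'); // Last-Modified of the copy Chrome cached
const fetchedAt = new Date('Fri, 16 Jan 2015 21:05:06 GMT');    // assumed fetch time (Date of the raw request above)

const freshnessMs = 0.1 * (fetchedAt - lastModified);
console.log((freshnessMs / 36e5).toFixed(0) + ' hours');        // roughly 100 hours, i.e. days without revalidation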
I have an odd issue. I am using AngularJS (Ionic) to load an external JSON file via an $http promise.
All had been working correctly until yesterday, when the remote files were moved to another host.
Now the problem is that on an iPhone 4 running iOS 7 it tries to load the file but can't, and crashes out with memory usage issues. Inspecting it in Xcode, memory quickly climbs to over 300 MB and then the app crashes. It does the same thing on two devices, but runs fine on other phones, emulators, etc.
Now if I host the file on another server it works as expected.
The different response headers are:
The one that fails:
Accept-Ranges bytes
Connection close
Content-Length 721255
Content-Type application/json
Date Thu, 11 Dec 2014 06:04:15 GMT
Last-Modified Thu, 11 Dec 2014 05:12:57 GMT
Server LiteSpeed
Vary User-Agent
Working host:
Accept-Ranges bytes
Connection keep-alive
Content-Encoding gzip
Content-Type application/json
Date Thu, 11 Dec 2014 06:05:01 GMT
Last-Modified Thu, 11 Dec 2014 03:29:48 GMT
Server nginx/1.6.2
Transfer-Encoding chunked
Vary Accept-Encoding,User-Agent
Code used to get the JSON file:
var deferred = $q.defer(),
    httpPromise = $http.get('http://someurl.com.au/deals.json');

httpPromise.then(function success(response) {
    deferred.resolve(response.data); // resolve the deferred with the parsed JSON
});
So after all of that, my question is: why would the JSON file not load, or at least return an error of some sort?
And why does it cause such a memory spike?
The only main difference I see between the servers is the connection configuration:
the working one uses Connection: keep-alive and the one that fails uses Connection: close.
Thoughts?
Additionally, I've just tried it on 3G and it works fine, but via Wi-Fi it doesn't work.
I think you need to focus on the Content-Encoding: gzip, which is probably what is saving you from memory issues on the working host.
I am reasonably new to browser caching. I am attempting to get Chrome to permanently cache any static file with a query parameter (for cache busting purposes). I have set Cache-Control and Expires headers way into the future, which should be adequate to say "cache this forever". The resulting response headers:
HTTP/1.1 200 OK
Cache-Control: public, max-age=315360000
Connection: keep-alive
Content-Encoding: gzip
Content-Type: application/x-javascript
Date: Wed, 16 Jul 2014 09:29:54 GMT
Last-Modified: Wed, 16 Jul 2014 03:44:14 GMT
Server: nginx/1.6.0
Transfer-Encoding: chunked
Vary: Accept-Encoding
Firefox and Safari seem to respect this for all cache-busted (?v= query parameter) files. Chrome mostly follows the directives, except for JavaScript. Most of the time it does a request with an If-Modified-Since header rather than loading from cache. Sometimes one of them will load from cache and the other will yield a request resulting in a 304. Usually when loading the page from a new tab it will load from cache, but not if you hit Enter in the address bar.
I've observed other websites using what I think are the exact same headers, and the files are always loaded from cache. Some of them load from cache even if you do a refresh.
I understand cache behaviour is somewhat unpredictable, but I want to make sure I'm not overlooking something that's making Chrome do that.
I had the same issue with Chrome, and after some hours of trial and error I figured out that Chrome seems to have a problem with the Vary header.
I've got this snippet in my Apache / .htaccess config, and as soon as I comment out the line "Header append Vary Accept-Encoding", Chrome starts caching .js and .css files:
<FilesMatch "(\.js\.gz|\.css\.gz)$">
# Serve correct encoding type.
Header set Content-Encoding gzip
# Force proxies to cache gzipped & non-gzipped css/js files separately.
#Header append Vary Accept-Encoding
</FilesMatch>
It still does not work when the request goes through our nginx server, because nginx also adds the Vary: Accept-Encoding header when delivering gzip-compressed content.
So far my guess is that this is a problem that only happens with Chrome. As a workaround, until there is a better fix, I would change the configuration to append the header only if Chrome is not the client (I haven't checked Safari):
<FilesMatch "(\.js\.gz|\.css\.gz)$">
# Serve correct encoding type.
Header set Content-Encoding gzip
# Force proxies to cache gzipped & non-gzipped css/js files separately.
BrowserMatch "Chrome" ChromeFound
Header append Vary Accept-Encoding env=!ChromeFound
</FilesMatch>
I can't seem to get Chrome to play videos with the HTML5 video tag when I host them on a Rackspace Cloud Files server.
It works perfectly on the regular hosting, but as soon as I link the video with the Rackspace CDN URL, Chrome freezes (a total freeze, website UI completely blocked; after a while Chrome pops up a message saying "The following page has become unresponsive", etc.).
The video file is fine, as it's the same one I link to on the regular hosting.
I did a bit of spying on the requests, and I initially thought the problem was that the webm files were served by default with the application/octet-stream mime-type. I lodged a ticket with Rackspace and they gave me a way to force the mime-type when uploading the file. I did that, and the file is now correctly sent as video/webm, but Chrome still freezes.
Any idea what could be going wrong here?
EDIT: using iheartvideo, loading the video from Rackspace triggers a MEDIA_ERR_SRC_NOT_SUPPORTED. The same video off a local web server works totally fine (??)
EDIT 2: Happens on both Mac and Windows with the latest mainstream Chrome.
EDIT 3: curl -I results:
Rackspace (no worky):
HTTP/1.1 200 OK
Server: nginx/0.7.65
Content-Type: video/webm
Last-Modified: Thu, 24 Feb 2011 23:45:12 GMT
ETag: 7029f83b241aa691859012dfa047e20d
Content-Length: 20173074
Cache-Control: public, max-age=900
Expires: Fri, 25 Feb 2011 01:32:11 GMT
Date: Fri, 25 Feb 2011 01:17:11 GMT
Connection: keep-alive
Web Server (worky)
HTTP/1.1 200 OK
Date: Fri, 25 Feb 2011 01:17:51 GMT
Server: Apache
Last-Modified: Thu, 24 Feb 2011 03:56:26 GMT
ETag: "11a0b47-133d112-49cff32940e80"
Accept-Ranges: bytes
Content-Length: 20173074
Content-Type: text/plain
EDIT 4: For those interested, this is what the Rackspace crew told me to do to set a webm content type on a file:
The file browser is not smart enough to determine the content type video/webm. Unfortunately, there is not a way to change the content type of a file that has already been uploaded.
You'll need to use one of the APIs to re-upload your files with the correct content type.
You can also use curl from a Linux/macOS command line if available. Using your username and API key, run this command:
curl -I -X GET -H "X-Auth-User: USERNAME" -H "X-Auth-Key: API_KEY" https://auth.api.rackspacecloud.com/v1.0
From the output there are 2 important values:
X-Storage-Url: https://storage101.......
X-Storage-Token: Long hash
You can upload the files with:
curl -X PUT -T test.webm -H "Content-Type: video/webm" -H "Content-Length: FILESIZEINBYTE" -H "X-Auth-Token: TOKEN FROM RESPONSE ABOVE" https://STORAGE URL FROM RESPONSE ABOVE/test.webm
You must specify the content type, and you must give the correct length in bytes of what is being uploaded. If not, you will get an invalid request error.
I work quite a bit with the Rackspace API.
Their API actually allows you to set a container as streaming enabled. My first instinct tells me that you haven't done this.
I stream various file types and they all work an absolute treat.
There is more information about CDN Streaming Enabled containers here:
http://docs.rackspace.com/files/api/v1/cf-devguide/content/Streaming-CDN-Enabled_Containers-d1f3721.html
Hopefully this helps, but if not let me know and I don't mind putting some PHP example code together to help you along.
It's all quite easy, but getting your head around the various API operations that Rackspace has implemented can sometimes be a daunting task.
I don't have a concrete answer, just some thoughts:
What about other browsers?
It works on the web server where the content type above is text/plain, so why force it to video/webm?
Can Rackspace provide you (or can you find on their site or someone else's) some sample content that does play, so you can inspect it?
You could try the free trial from Brightcove or Bitgravity and see if that works...