I'm currently using the Google CDN on a load balancer backend. On my site we have a large number of static files that are a unique file-type (not the standard files cached on the CDN). Is it possible to cache non-standard files on the CDN? Probably not, but wanted to check. Thank you.
When you say unique file-types, are you referring to custom MIME types? Have you tried using the FORCE_CACHE_ALL cache mode?
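If the backend service already has Cloud CDN enabled, something along these lines should switch it over (the backend service name is a placeholder, and the --cache-mode flag assumes a reasonably recent gcloud):

gcloud compute backend-services update my-backend-service \
    --global \
    --cache-mode=FORCE_CACHE_ALL

FORCE_CACHE_ALL caches responses regardless of content type or origin cache headers, which would cover non-standard static files; just be careful not to apply it to a backend that also serves dynamic or per-user content.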
I was wondering how you can handle pages with variables in the URL, like
www.mysite.com?id=myvariable
Surely I can't have to store every possibility?
I use these variables because, when users are online, they should be able to share links to their social media.
Kind regards
If you want these pages to work offline as is then yes, you have to store every possibility. You surely can't expect the browser to guess what all the possibilities are?
If you want to use dynamic data in your offline app then request it with AJAX and keep it in local storage. Don't store your data in pages, keep the pages static.
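A minimal sketch of that pattern (the endpoint and storage key here are made up for illustration): request the dynamic data with XHR when you can, and fall back to the locally stored copy when you can't.

// Hypothetical endpoint and storage key, purely for illustration
function loadItem(id, render) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/api/item?id=' + encodeURIComponent(id));
  xhr.onload = function () {
    // Online: use the fresh data and keep a copy for offline use
    localStorage.setItem('item-' + id, xhr.responseText);
    render(JSON.parse(xhr.responseText));
  };
  xhr.onerror = function () {
    // Offline (or request failed): fall back to the stored copy, if any
    var cached = localStorage.getItem('item-' + id);
    if (cached) render(JSON.parse(cached));
  };
  xhr.send();
}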
If you need to have bookmarkable URLs for different bits of your app that use different data on the same page then use the History API.
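A rough sketch of the History API side of this (the handler names are illustrative, and loadItem is the function from the sketch above):

// Push a bookmarkable URL without reloading, then render from data
function showItem(id, data) {
  history.pushState({ id: id }, '', '?id=' + encodeURIComponent(id));
  renderItem(data); // your own rendering function
}

// Re-render the right data when the user navigates back/forward
window.onpopstate = function (event) {
  if (event.state) {
    loadItem(event.state.id, renderItem);
  }
};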
I need to store my JS files in the browser to reduce load time.
I know that I can use local storage, but that's not the proper way of storing files in the browser; it's made to store data, not files.
We can use the cache manifest, but can I access it while I am online?
Also, is there any better way to store JS/CSS files in the browser?
Your JavaScript files will be automatically stored in browser caches according to normal cache principles and procedures, see http://www.mnot.net/cache_docs/
You can use a cacheability tester like http://www.ircache.net/cgi-bin/cacheability.py to check the cacheability of a resource, including .js files.
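For example, a .js response with headers along these lines will normally be stored and reused by the browser cache (the values are just illustrative):

Cache-Control: public, max-age=604800
ETag: "a1b2c3"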
The so-called cache manifest in HTML5 drafts is meant for use in applications that change rarely and can be used offline. For normal pages, it just makes things more complicated and risky.
HTML5 has a special feature for caching that uses a manifest file.
Please visit the following link:
http://html5doctor.com/go-offline-with-application-cache/
It should be helpful for you.
I am a little new to HTML5 caching, so I just have some simple questions.
1) How long is data in a caching manifest cached?
2) If I update the data, how can I make sure the client checks for a newer version when it is available, or is this already done?
3) Also, is this completely useless for a non-mobile environment, or can it speed up load times on a desktop?
<html lang="en" manifest="offline.manifest">
offline.manifest
CACHE MANIFEST
index.html
style.css
image.jpg
image-med.jpg
image-small.jpg
notre-dame.jpg
1) As long as the user cares to cache it. The only way to completely get rid of the cache is to go into the browser settings and explicitly remove it.
2) If you update the manifest file, the client will download new versions of all the files. This download is still governed by 'old' HTTP caching rules, so set headers appropriately, also make sure you send a 'no-cache' header on the manifest file itself. The rules from HTML5 Boilerplate are probably a good place to start.
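For the manifest file itself, response headers along these lines should stop browsers and intermediaries from holding on to a stale copy (values illustrative):

Cache-Control: no-cache
Expires: 0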
3) Remember desktops can lose connectivity too. Also, having files in application cache means they are always served locally so, providing you're sensible about what you put in it, the application cache can reduce bandwidth and latency. What I mean by sensible is: if most visitors only see a couple of pages of your site and you update the manifest of your entire site every week, then they could end up using more bandwidth if you're forcing them to cache a load of static files for pages they never look at.
To really cut down on bandwidth and latency in your HTML5 website of the future: use the application cache for all your assets and a static framework; use something like mustache to render all your content from JSON; send that JSON over Web Sockets instead of HTTP, saving you ~800 bytes and a two way network handshake per request; cache data with Local Storage to save you fetching it again, and manage navigation with the History API.
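For instance, rendering cached page structure from JSON with Mustache might look roughly like this (the template, element id and storage key are made up):

// Assumes mustache.js is itself one of the assets in your application cache
var template = '<h1>{{title}}</h1><p>{{body}}</p>';

function renderArticle(article) {
  document.getElementById('content').innerHTML = Mustache.render(template, article);
  // Keep the JSON around so the next visit doesn't need to fetch it again
  localStorage.setItem('last-article', JSON.stringify(article));
}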
1) How long is data in a caching manifest cached?
Once an application is cached, it remains cached until one of the following happens:
The user clears the browser's cache
The manifest file is modified
The application cache is programmatically updated (see the sketch below)
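A rough sketch of that programmatic route, using the window.applicationCache API:

var appCache = window.applicationCache;

// Ask the browser to re-check the manifest
appCache.update();

appCache.addEventListener('updateready', function () {
  if (appCache.status === appCache.UPDATEREADY) {
    appCache.swapCache(); // switch to the freshly downloaded cache
    location.reload();    // reload so the page actually uses it
  }
});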
2) If I update the data, how can I make sure the client checks for a newer version when it is available, or is this already done?
You can specify which files should not be cached in the NETWORK: section.
If you want to update your cached files, you should modify something in the manifest file. The best way is to put a comment in the file and change it whenever you want the browser to update the cache.
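For example, a manifest like this caches the static files, always fetches anything under /api/ from the network, and uses the version comment as the thing you bump to trigger an update (file names here just echo the earlier example):

CACHE MANIFEST
# v2 - change this comment to force clients to re-download the cache

CACHE:
index.html
style.css

NETWORK:
/api/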
3) Also, is this completely useless for a non-mobile environment or can it speed up load times on a desktop?
Yes, it is useful, because the connection can drop on desktops just as it can on mobile devices.
I have a multi-page website. I want to make two of those pages available offline, using the HTML5 manifest. However, I want the online counterparts to be used instead of the local cached version when possible. Currently, the cached versions are being loaded even when the network is available.
Add this to the header:
<META HTTP-EQUIV="Pragma" CONTENT="no-cache">
If you want certain pages to always load from the server when on-line, the implication is that they are in some way more up-to-date?
If that's the case, you need to ensure that your site's off-line cache recognises that these elements have changed, and thus update them instead. I guess the only way is to force this refresh by ensuring that when the relevant elements are updated on the server, so is your off-line cache manifest file.
(You can of course use the NETWORK directive in your cache manifest to compel the user agent to always go to the server for certain resources but then as you imply, you won't have these pages available when off-line).
So you don't have to necessarily invalidate the cache file, but you do need to ensure that it triggers an update and cache-swap.
The solution I found was to ensure that the pages I cache in the manifest do not contain any dynamically generated content. When online, my JavaScript code performs an Ajax request to fetch the dynamically generated content. The JavaScript detects when the browser is offline and will refuse to perform the Ajax requests, essentially shifting into offline-only mode.
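In outline, that check looks something like this (the endpoint and element id are placeholders):

if (navigator.onLine) {
  // Online: pull the dynamic parts of the page with Ajax
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/dynamic-content');
  xhr.onload = function () {
    document.getElementById('dynamic').innerHTML = xhr.responseText;
  };
  xhr.send();
} else {
  // Offline: skip the request and let the cached static page stand on its own
  document.getElementById('dynamic').innerHTML = 'You are offline.';
}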
We have several images and PDF documents that are available via our website. These images and documents are stored in source control and are copied content on deployment. We are considering creating a separate image server to put our stock images and PDF docs on - thus significantly decreasing the bulk of our deployment package.
Does anyone have experience with this approach?
I am wondering about any "gotchas" - like XSS issues and/or browser issues delivering content from the alternate sub-domain?
Pro:
Many browsers will only allocate two sockets to downloading assets from a single host. So if index.html is downloaded from www.domain.com and it references 6 image files, 3 javascript files, and 3 CSS files (all on www.domain.com), the browser will download them 2 at a time, with the others blocking until a socket is free.
If you pull the 6 image files off onto a separate host, say images.domain.com, you get an extra two sockets dedicated to download your images. This parallelizes the asset download process so, in theory, your page could render twice as fast.
Con:
If you're using SSL, you would need to either get an additional single-host SSL certificate for images.domain.com or a wildcard SSL certificate for *.domain.com (matches any subdomain). Failure to do so will generate a warning in the browser saying the page contains mixed secure and insecure content.
With a different domain, cookies will also not be sent with every request, which can improve performance.
Another thing not yet mentioned is that you can use different web servers to serve different sorts of content. For example, your static content could be served via lighttpd or nginx while still serving your dynamic content off Apache.
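As a sketch, a minimal nginx server block for such a cookie-less static host might look like this (the host name and paths are just examples):

server {
    listen 80;
    server_name images.domain.com;
    root /var/www/images;

    location / {
        expires 30d;                       # far-future caching for static assets
        add_header Cache-Control "public";
    }
}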
Pros:
-load balancing
-isolating different functionality
Cons:
-more work (when you create a page on the main site you would have to maintain the resources on the separate server)
Things like XSS are a problem of code not sanitizing input (or output, for that matter). The only issue that could arise is if you have sub-domain-specific cookies that are used for authentication, but that's really a trivial fix.
If you're serving HTTPS and you serve an image from an HTTP domain, then you'll get browser security warnings popping up when you use it.
So if you do HTTPS, you'll need to buy an HTTPS certificate for your image domain as well if you don't want to annoy the hell out of your users :)
There are other ways around this, but it's not particularly in the scope of this answer - it was just a warning!