After changing the domain name where the Flash application is hosted, I need to change the crossdomain.xml file. That crossdomain.xml is hosted on the API server the Flash application uses. I can see that Flash uses the crossdomain.xml from the browser's cache. Is there any trick to make Flash not load crossdomain.xml from the cache? Maybe there is a parameter I can pass to Flash in its object tag?
Annoying issue - no doubt.
First of all: I like Caching - as long as I'm in control.
This is how I gain control over crossdomain.xml caching:
Let's say we have a Flash app that requires some input from a different server.
In my case, we have this configured as a flashvar: dataSrc=http://www.company.com/data/calendar.xml
So Flash is looking for
www.company.com/crossdomain.xml
... which is loaded once and then taken from the user's browser cache until he manually flushes it.
The solution is to change the subdomain the crossdomain.xml is taken from:
Make sure that, for example(!), noCache.company.com points to company.com's document root.
The flashvar is modified to dataSrc=http://noCache.company.com/data/calendar.xml. In fact, you're addressing the same file as before.
Flash looks for noCache.company.com/crossdomain.xml, which will now be fetched from the server because there is no cached file for that URI.
It's up to your imagination... if you allow subdomains like noCache-{numeric_value} (underscores aren't valid in hostnames, so use a hyphen), you could easily implement your own TTL by accessing http://noCache-{week_of_year}.company.com/data/calendar.xml ...
You can also simply increment that numeric value each time crossdomain.xml changes.
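A minimal sketch of that week-of-year scheme, assuming the noCache-* subdomains all point at the same document root (host and path are from the example above; this would run wherever you generate the flashvars):
import datetime

# Derive the dataSrc flashvar with a week-of-year "cache TTL" baked into
# the subdomain, so a fresh crossdomain.xml is fetched once per week.
week = datetime.date.today().isocalendar()[1]  # ISO week of year, 1..53
data_src = "http://noCache-%d.company.com/data/calendar.xml" % week
print("dataSrc=" + data_src)  # e.g. dataSrc=http://noCache-42.company.com/...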
Use the following Apache directives (mod_headers must be enabled) to specify a caching policy for the file:
<Directory /var/www/mysite>
  <FilesMatch "crossdomain\.xml">
    Header set Cache-Control "max-age=86400, public, must-revalidate"
  </FilesMatch>
</Directory>
I append random numbers to the end of XML URLs if I don't want them to be cached,
e.g. var myXMLURL:String = "myXML.xml?r=" + Math.random()*1000;
The browser sees it as a different file each time, e.g. myXML.xml?r=645 / myXML.xml?r=239
I'm not sure if this would work with crossdomain.xml files, but it should be worth a quick try.
I would force-reload (F5 or CTRL/CMD-F5) the crossdomain.xml file directly in the browser until I see it change. Just type the crossdomain file's URL into the browser and keep refreshing. I would also clear the browser cache.
You could also try Firefox with Firebug, which shows you whether the downloaded files are cached or not.
http://getfirebug.com/
Good luck,
Rob
When navigating my site, my browser is loading the JS files from cache, but not the CSS files. This happens both running a local server and on the live site, to me and apparently to other users, since the logs show mostly .css files being loaded.
I've tried the other solutions (example): I am clicking around on hyperlinks (not refreshing), and my Chrome DevTools do not have "Disable cache" checked.
Here is the initial request (using CTRL+F5 for a hard refresh):
Then navigating back to that page creates another request:
(Note: there is no Cache-Control sent in the second request, proving that I indeed did not refresh)
As expected, the server responds with a 304 Not Modified for the .css file, but I don't understand why it makes a trip to the server at all (notice below that the .js file is retrieved without a server request).
I believe you can look at the issue first-hand on your own machine by going to https://up.codes. I'm using Chrome 71.0.
Why are the CSS files not being cached?
@Allen found the issue (the Vary header included Cookie, and the cookie was changing between requests), but I'll show how to fix it in Flask for posterity. You can use Flask's @app.after_request() hook to make sure Flask does not add the Vary: Cookie header for static assets:
from flask import request, session

@app.after_request
def add_header(response):
    url = request.url
    if ('.css' in url or '.js' in url or '.svg' in url or '.png' in url or
            '.gif' in url):
        # Flask adds `Vary: Cookie` to the response, meaning the client should
        # re-download the asset if the cookie changed. If you look at the Flask
        # source code that runs right after this return, it adds `Vary: Cookie`
        # if and only if session.accessed is true.
        session.accessed = False
    return response
Your server responded with a different session cookie when these CSS files were requested, and the response header was set with Vary: Cookie, which caused the browser to send the request again because the session cookie changed.
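A quick way to observe this from outside (a sketch using Python's requests library; the exact asset path on up.codes is hypothetical): if Vary includes Cookie and the session cookie differs between responses, cached copies get invalidated.
import requests

# Fetch the stylesheet twice and compare the Vary header and session cookie.
url = "https://up.codes/static/style.css"  # hypothetical asset path
r1 = requests.get(url)
r2 = requests.get(url, cookies=r1.cookies)
print(r1.headers.get("Vary"), r1.cookies.get("session"))
print(r2.headers.get("Vary"), r2.cookies.get("session"))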
Check your web.config's compilation attribute. If it's:
<compilation debug="true"/>
CSS will get continually re-downloaded by clients on each page view and not cached locally in the browser.
If it's set to false, the resource is only downloaded once to the client and cached there.
Check out this post: Chrome will not cache CSS files. .js files work fine
Chrome uses multiple types of caches.
Blink (the rendering engine Chrome uses) has an in-memory cache and a disk cache, which it uses for images, fonts, and JS files.
As long as a file of one of those types is still in the memory cache or the disk cache, it is loaded from there and skips the WebRequest API, meaning no call is made to the server.
I don't know exactly why CSS files are not cached by Blink, but this is the reason you see an HTTP request for CSS files and not for JS ones.
Note that if, for some reason, a JS file is evicted from both the memory cache and the disk cache, you will see an HTTP request for the JS file as well.
I have crawled the web quite a lot these days but couldn't find any accurate information on how crossdomain.xml files behave in the case of 302 redirects, especially since the sandboxes have changed significantly over the last versions!
I am relatively new to Flash... so any advice is more than appreciated!
I have been working on a project lately that uses audio streams with some sort of CDN distribution. What happens is that a common URL is requested, and the user is then dynamically redirected to the next best server available. In my case, I have no access to the server side of things (at least not anytime soon), and the only path providing an appropriate crossdomain.xml is the one performing the redirect. All the other dynamic paths serve content exclusively:
http://resource.domain.com (valid crossdomain.xml)
302 => http://dyn1.domain.com/...
302 => http://dyn2.domain.com/...
302 => http://dyn3.domain.com/...
I noticed that Flash doesn't care much if I try to load the audio stream with something like...
var req:URLRequest = new URLRequest("http://resource.domain.com");
var sound:Sound = new Sound(req); // i.e. effectively playing http://dyn3.domain.com
sound.play();
It gets both the redirect and the streaming done well, never asks for any crossdomain file, and just starts playing!
But when I try something different, like setting custom headers on the request and loading the file with URLStream instead, everything gets messy! The redirect happens as expected, but all of a sudden I need another crossdomain file at the redirected location!
Is there any explanation for what's happening, and possibly a way to resolve this?!
Thanks for your time!
As a side question: I noticed everything works flawlessly in the local-trusted sandbox, and errors happen mainly if not exclusively in the remote sandbox. Is it possible that the local-trusted sandbox doesn't care about cross-domain policy files at all!?
Summary
Add crossdomain.xml to each CDN host, or adapt to the limited Sound functionality.
Details
SWF files that are assigned to the local-trusted sandbox can interact with any other SWF files and can load data from anywhere (remote or local).
Sound can load content from other domains that don't allow access via a cross-domain policy file, with certain restrictions:
Certain operations dealing with sound are restricted. The data in a loaded sound cannot be accessed by a file in a different domain unless you implement a cross-domain policy file. Sound-related APIs that fall under this restriction are Sound.id3, SoundMixer.computeSpectrum(), SoundMixer.bufferTime, and the SoundTransform class.
Flash in general has pretty complex cross-domain policies, but in your case the bottom line is that you'll need a proper crossdomain.xml on each host except the one that serves the SWF:
3.1. If your SWF is served from http://resource.domain.com, it's not required to have http://resource.domain.com/crossdomain.xml, but it's really good to have one.
3.2. You will need a proper http://dyn2.domain.com/crossdomain.xml explicitly allowing your SWF's domain to access dyn2.domain.com in order to use URLLoader, URLStream, and other APIs that give access to raw loaded data (see the sketch after this list).
3.3. There's a reason for these restrictions: cookies (and other ambient user credentials). If Flash did not require a proper cross-domain policy after a redirect, one could access any domain with the user's cookies attached simply by loading one's own redirector first. That would mean any SWF running in your browser could access cookie-protected data (e.g. mail.google.com).
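For reference, a minimal crossdomain.xml on one of the dynamic hosts might look like this (hostnames are the ones from the question; adjust as needed, and keep the allow list as narrow as you can):
<?xml version="1.0"?>
<cross-domain-policy>
    <!-- Allow SWFs served from resource.domain.com to read raw data here -->
    <allow-access-from domain="resource.domain.com"/>
    <!-- Also needed if the loading request sends custom headers -->
    <allow-http-request-headers-from domain="resource.domain.com" headers="*"/>
</cross-domain-policy>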
I have a lot of images in a folder that are used in the application. When using the cache manifest, it would be easier maintenance-wise if I could specify a wildcard so that all the images or files in a certain directory get cached.
E.g.
CACHE MANIFEST
# 2011-11-3-v0.1.8
#--------------------------------
# Pages
#--------------------------------
../index.html
../edit.html
#--------------------------------
# JavaScript
#--------------------------------
../js/jquery.js
../js/main.js
#--------------------------------
# Images
#--------------------------------
../img/*.png
Can this be done? I have tried it in a few browsers with ../img/* as well, but it doesn't seem to work.
It would be easier, but how would it work? The manifest file is parsed and acted upon in the browser, which has no special knowledge of the files on your server beyond what you've told it. If the browser sees this:
../img/*.png
What is the first image the browser should request from the server? Let's start with these:
../img/1.png
../img/2.png
../img/3.png
../img/4.png
...
../img/2147483647.png
That's all the images that might exist with a numeric name, stopping semi-arbitrarily at 2^31 - 1. How many of those two billion files actually exist in your img directory? Do you really want a browser making all those requests only to get two billion 404s? For completeness the browser would probably also want to request all the zero-padded equivalents:
../img/01.png
../img/02.png
../img/03.png
../img/04.png
...
../img/001.png
../img/002.png
../img/003.png
../img/004.png
...
../img/0001.png
../img/0002.png
../img/0003.png
../img/0004.png
...
Now the browser has made more than 4 billion HTTP requests for files that mostly aren't there, and it hasn't even got on to letters or punctuation in constructing the possible filenames that might exist on the server. This is not a feasible way for the manifest file to work. The server is where the files in the img directory are known, so the list of files has to be constructed on the server.
I don't think it works that way. You'll have to specify all of the images one by one, or have a simple PHP script loop through the directory and output the file (with the correct text/cache-manifest header, of course).
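A minimal sketch of such a generator (shown in Python rather than PHP; the img directory and the ../img/ paths are the ones from the question):
import os

def build_manifest(img_dir="img"):
    # List every .png in the img directory and emit one manifest entry each.
    lines = ["CACHE MANIFEST", "# auto-generated"]
    for name in sorted(os.listdir(img_dir)):
        if name.endswith(".png"):
            lines.append("../img/" + name)
    return "\n".join(lines) + "\n"

# Serve the output with Content-Type: text/cache-manifest.
print(build_manifest())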
It would be a big security issue if browsers could request folder listings - that's why Tomcat turns that capability off by default now.
In principle, the browser could match wildcards against only the URLs referenced by the pages it caches. That approach would still be problematic (what about images not used initially but set dynamically by JavaScript?), and it would require that all cached items be not only downloaded but parsed as well.
If you are trying to automate this process instead of doing it manually, use a script, or do as I do and use manifestR. It will output your manifest/appcache file, and all you have to do is copy and paste. I've used it successfully and usually only have to make a few changes.
Also, I recommend using the NETWORK header with the wildcard:
NETWORK:
*
This allows assets not listed in the manifest, including those from other linked domains (JSON responses, for instance), to be fetched over the network. I believe this is the only section where you can specify a wildcard; like the others have said here, that's for security reasons.
The cache manifest is now deprecated, and you should use HTTP headers (or the equivalent meta tags) to control caching.
For example:
<meta http-equiv="Cache-control" content="public">
Public - may be cached in public shared caches.
Private - may only be cached in a private cache.
No-Cache - must be revalidated with the server before each reuse.
No-Store - may not be stored in any cache.
What is the purpose of the NETWORK section in the HTML5 manifest file? If I add a file to that section, doesn't that mean the browser should not cache it, and that it should be available only online?
I've added the file to the NETWORK section, but once I've visited it online, it is always available offline too. I have checked with FF5 and Chrome.
Here is my full manifest; please see what is wrong with it?
Thanks.
CACHE MANIFEST
# cache files
CACHE:
index.html
offline.html
images/logo.jpg
# offline.html for all uncached pages
FALLBACK:
/ offline.html
# this should be available online only
NETWORK:
network.html
Obviously, it's a bug: http://code.google.com/p/chromium/issues/detail?id=91524
The manifest file enables offline web applications: it 'caches' all the files listed in the manifest and keeps them up to date for offline usage.
The NETWORK section is, in theory, where you exclude things from that cache: either * (everything not listed) or a single file, as you're trying to do with network.html. However, application caching via the manifest file doesn't rule out the 'old-fashioned' caching mechanisms browsers have.
You've probably set some static content to be cacheable by the browser, so depending on your server (IIS/Apache) you need to adjust your Expires / Cache-Control settings.
Adding the file to the NETWORK section still saves the file in the cache, and it is served from the cache when I am online, whereas my expectation is that it should ALWAYS be fetched from the server.
When I add <meta http-equiv="Pragma" content="no-cache">, it always fetches that file from the server.
Pragma is a meta type primarily used for IE. You might try setting Cache-Control to no-cache, adding the Pragma for IE, and setting the metas for Expires, public, store, etc. to control the page (a sketch follows below). At this point, creating the manifest file does cause browser caching to be enabled. You must serve the manifest with the mime-type text/cache-manifest and save the file with the extension .appcache.
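For instance, those meta tags might look like this (values are illustrative):
<meta http-equiv="Cache-Control" content="no-cache, no-store">
<meta http-equiv="Pragma" content="no-cache">
<meta http-equiv="Expires" content="0">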
Example manifest:
CACHE MANIFEST
# the above is a required line
# this is a comment
# spaces are ignored
# blank lines are ignored
CACHE:
/favicon.ico
index.cfm
# offline.html for all uncached pages
FALLBACK:
/ offline.html
# this should be available online only
NETWORK:
network.html
Best regards,
Link Worx Seo
I had similar problems.
Try configuring your server to expire content immediately in the HTTP Response Headers section like in the link above if the site is hosted on IIS.
If it is hosted on Apache you'll probably want to look at this.
I have an application which used to use the HTML5 offline cache. Now I've decided to not use it anymore and removed the manifest attribute from the index.html file. However, browsers still regard this site as cached and refuse to update the index.html file.
Even updating the manifest doesn't help. How can I remove the site from users' offline caches? Am I stuck with a cached website forever?
You need to make sure the manifest file isn't being cached, which by default it will be.
Adding
ExpiresActive On
ExpiresDefault "access"
to your .htaccess (mod_expires must be enabled) will stop everything from being cached, though you really only want the manifest file treated this way, like this (remember to update the filename):
<Files cache.manifest>
ExpiresActive On
ExpiresDefault "access"
</Files>
You really need to do that first, but this should alleviate the problem.
I'd recommend reading through Mark Pilgrim's page on this as well.
Try changing the contents of your manifest to simply CACHE MANIFEST with no files listed. The clients should retrieve the new manifest the next time they hit the site, and their cache should be removed.
Note however that they won't be using this new, empty manifest until they refresh the page.
I've found that in some cases some browsers don't necessarily grab the new manifest right away. This behavior seems inconsistent, though. When this happens I tend to clear their caches / offline storage manually to force an update (though I understand you can't necessarily get users to do this).