We are developing a Vue.js-based application and we have a serious caching problem.
Team members are constantly updating the site, but we keep getting feedback about problems that were already fixed, such as typos and misplaced elements.
I personally inspected the situation and found that Chrome reads the files from the disk cache or memory cache until the page is refreshed. Sometimes Chrome even loads the old page when we re-enter the site after a hard refresh (Ctrl + Shift + R).
Sorry for my English, but I tried my best to explain what I am encountering. I also found a topic where the OP describes exactly what I am seeing; you can check that out as well:
How to clear cache of service worker?
I created a website on IIS (local machine, Windows 10), published the project, and tried reaching it at a local IP address (127.0.0.1:8093). In the Network tab I can see the .js and .css files being downloaded. Then I restart the browser and try again, and this time the files are served from the disk cache. I tried a couple of times; sometimes the files are served from the cache and sometimes they are downloaded.
I tried to add a service worker but came up empty-handed. I also created a base project to test some Vue.js features and added the same service worker code to it; it cached again.
Our servers run Windows Server 2012 with IIS 8.
If possible, we want either a no-caching approach or a way to manage what is cached and what is not. Any help would be appreciated.
You can check out the base project:
vue-base project
What I tried
As I said above, I tried to add service workers following this GitHub pull request:
https://github.com/vuejs-templates/pwa/pull/21/files
I also tried deleting the cached data with caches.delete(cacheName), but it did not seem to work (a sketch of how this is usually wired up follows below).
I don't know whether the service workers are related to this problem, but they did not solve it; maybe I did not add the code properly. Any help would be much appreciated.
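For reference, this is a minimal sketch of how stale caches are typically cleared in a service worker's activate handler. The cache name CURRENT_CACHE is a placeholder rather than the name used by the PWA template above, so treat it as an illustration, not the exact fix.
// service-worker.js (sketch): on activate, delete every cache whose name does
// not match the one this version of the worker expects.
const CURRENT_CACHE = 'app-cache-v2'; // placeholder name, bump it per release

self.addEventListener('activate', event => {
  event.waitUntil(
    caches.keys().then(names =>
      Promise.all(
        names
          .filter(name => name !== CURRENT_CACHE)
          .map(name => caches.delete(name))
      )
    ).then(() => self.clients.claim()) // take control of open pages right away
  );
});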
Thank you for your help.
Edit1: Screen GIF
I don't know what you are using to bundle your code and assets, but with webpack it is possible to emit files with a hash in the name, which means that every time the browser finds a new file reference in your HTML it will download the new file.
For example: yesterday you deployed code that contained main.34534534534.js.
Today you deploy again, but the file is now main.94565342.js. Your browser will automatically invalidate its cached copy.
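As a rough illustration (not your exact config), a webpack setup that emits hashed file names could look like the sketch below; depending on your webpack version the token is [contenthash] or [chunkhash].
// webpack.config.js (sketch): [contenthash] embeds a hash of each file's
// contents in its name, so any change produces a new file name and the
// browser cannot keep serving a stale cached copy.
const path = require('path');

module.exports = {
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: '[name].[contenthash].js',
  },
};
If you are using the standard Vue webpack or PWA template, the production build may already do this; checking the file names in your generated dist folder will confirm it.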
Related
There seems to be some kind of caching issue going on with Chrome lately.
I updated an image on a web page, but it will not update in Chrome. Other browsers are fine.
I'll also note that I changed the HTML and gave the image a new name to force browsers to grab the updated version,
i.e. image.jpg > image-v1.jpeg.
How on earth does Chrome not grab the updated version of this?
Has anyone seen this going on lately and know of a fix?
Thanks
When you clear browsing data, make sure to check the box for cached images, and make sure the time range is set to "All time".
https://i.ibb.co/jzYk6DK/ex.png
If that doesn't work, try renaming the image.
If that also doesn't work, look at the web server's expiry headers and check how long images are cached. The setting could be in .htaccess or elsewhere, depending on whether you're using Apache or Nginx.
For anyone stumped by this kind of issue: if you are using a web farm, check that DFS replication across the servers is working correctly.
It was failing for one server in my case, which is why some users got the updated version while others didn't.
To verify that replication worked, log into the servers via RDP and check the last-modified timestamps of the file(s) in question.
This may sound like a very basic question, but I feel like I've tried everything.
This is a follow-up to this post I made earlier, where I resolved the issue only for it to come back again.
To summarize, I was making some changes to the contact.css file on the contact page of my website when I noticed the changes worked offline but didn't appear online. In the post above, I narrowed this down to a caching issue (others could see the changes, but I couldn't).
In the example above, I couldn't get my website to show background-color: blue. Eventually it worked and I thought I'd fixed it... then I went to change the color back to normal and, boom, it stopped picking up the changes again.
So I think it's some sort of caching issue, but for the life of me I can't get my cache to clear properly so that I can refresh and see the changes.
Here are the things I have tried already:
Clearing cache (many times) on Chrome, Firefox, and Opera
Hard refresh on Chrome, Firefox, and Opera
Disabling cache through dev tools on Chrome and Firefox (this worked initially then stopped working when I re-updated the website)
Checked multiple times that the CSS file uploaded correctly and the file path was correct. This was confirmed because the correct changes were seen by other people.
Flushed my DNS
Changed from my ISP's DNS to Google's 8.8.8.8 and 8.8.4.4
I'm using HostGator to host my website, and at this point I'm wondering whether it has something to do with them. I really have no idea anymore.
Here's what I see online:
Here's what I should be seeing and what I do see on the offline version of my website:
I noticed you said "I'd really like to get to the bottom of the underlying issue", so I figured I'd write an answer with a few options (and if anyone wants me to add others, feel free to comment). Overall, determining the root cause is likely much harder than just solving the problem itself, but let's start with the possible causes I can think of off the top of my head:
Multiple CDN servers taking a while to update, so some return the old data (your current session) and some return the new data (incognito)
Server-side session caching, so when you reload the page within one HTTP session you get back the same content (I've seen this with product search queries, for example)
The solution is relatively simple, though: it's called cache busting. Basically, every time you update your source code, add a unique key to the query string or file name so that the URL changes. For example, for your CSS you can link https://path/to.css?v2.0.1 and keep increasing the version number as you go (a small sketch of automating this follows this answer). If you use webpack for your build outputs, it has a content-hash variable you can use as a token in the file names.
As for the CDNs possibly caching out-of-date content: the content-hash solution will fix that, because it produces an entirely different file name, so the CDN will fetch it from the origin if it isn't in its cache. I'm not sure whether the version query parameter will do the same; maybe someone else can shed some light on that.
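As a rough sketch of the query-string approach described above (the script name, the regex, and the use of a timestamp as the version are all assumptions for illustration), a small Node script run before each deploy could rewrite the stylesheet links:
// bump-css-version.js (sketch): rewrite every stylesheet link in index.html so
// it carries a fresh ?v=... query string before the site is deployed.
const fs = require('fs');

const version = Date.now(); // or read the version from package.json instead
const html = fs.readFileSync('index.html', 'utf8');

// Append or replace a ?v=... parameter on every .css href.
const busted = html.replace(/(href="[^"]+\.css)(\?v=[^"]*)?"/g, `$1?v=${version}"`);

fs.writeFileSync('index.html', busted);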
Have you tried using Incognito in Chrome?
My website only works on my laptop and phone! On my iPad (it is a school iPad, managed by the district) the website doesn't update. I have cleared the cache. I need to present this next Monday (my classmates will also be using their school iPads), and it was working fine earlier. PLEASE HELP
www.andescloudforestnmea.com/next.html
EDIT: The website was updating fine before today
Replying to the comments:
Yes, I tried incognito mode, but it made no difference.
Yes, what you see is right, but on my school iPad I think my school is blocking my server or something. Is that possible?
Since we have already determined that your website has been updated correctly and the files are fine on the server, we've narrowed it down to your school iPad possibly blocking your server. For a quick test, you can try accessing the site through a web-based proxy. Try this service:
https://www.hidemyass.com/proxy
Something else you can try is to upload your web page to another server. Since it's just one page, you don't need anything special. In fact, you could save the file on any free hosting service. Here is an example I created on HTML Pasta:
https://33d08f2c-9365-4692-bc3c-3ed68c454c41.htmlpasta.com
You can create your own at https://htmlpasta.com
EDIT: P.S. Your school iPad is probably not blocking anything, but it has some kind of cache that you have not cleared. Since you mentioned that you already cleared the browser cache, your traffic may be routed through a school proxy or something similar that is caching the old site at the network level (or some other type of cache).
I am working on a new site, and whenever I change CSS settings Chrome will not pick up those changes unless I close Chrome completely with Task Manager and relaunch it. I have tried quite a few things. Below is a list of what I've tried:
Versioning the CSS file (I am appending a PHP date stamp to the end of the CSS file URL)
Enabling "Clear Cache while developer window is open" in the Developer console
Using Ctrl + F5 to clear cache on refresh
Going to Application and Clear Storage in the developer Console
Clearing Cache folder in local AppData
Deleting the CSS file from the site, refreshing, and re-adding the file.
Incognito mode
Adding the launch option --disk-cache-dir=null to the Chrome shortcut
Adding browser plugins that delete the cache.
Does anyone have any ideas? It is extremely annoying and inefficient to close Chrome every time I want to check a CSS change. On top of that, I listen to music in the browser, so if I close Chrome I have to go back and get my music playing again, which makes the whole thing even more time-consuming than it needs to be.
I've looked at other articles online about cache busting and related questions on Stack Overflow, tried most of what they suggest, and haven't seen any positive outcome yet. Most articles say to add some sort of random string or version to the end of the CSS URL as a query parameter, but that isn't working, even though I know it has worked for me in the past.
Press F12 > F1 > Network > "Disable cache (while DevTools is open)". This should solve your problem.
The development server was running various caching tools even though they should have been turned off. After disabling them, Chrome started to behave better, and most of the time Ctrl + F5 did the trick.
"clearing cache" is not as easy as it should be. Instead of clearing cache on my browsers, I realized that "touching" the server files cached will actually change the date and time of the source file cached on the server (Tested on Edge, Chrome and Firefox) and most browsers will automatically download the most current fresh copy of whats on your server (code, graphics any multimedia too). I suggest you just copy the most current scripts on the server and "do the touch thing" solution before your program runs, so it will change the date of all your problem files to a most current date and time, then it downloads a fresh copy to your browser:
<?php
// Update each file's modification time so browsers that compare Last-Modified
// will download a fresh copy instead of reusing their cached one.
touch('/www/sample/file1.css');
touch('/www/sample/file2.css');
?>
then ... the rest of your program...
It took me some time to resolve this issue (browsers react differently to different commands, but they all compare file times against the copy they have downloaded, and if the date and time differ they will refresh). If you can't go the supposedly right way, there is always another usable solution. Best regards and happy camping. By the way, touch() or an equivalent exists in many languages, including JavaScript, Bash, sh, and PHP, and you can include or call it from your pages.
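As a rough sketch of the same idea from Node.js (the file paths are simply reused from the PHP example above):
// Node.js sketch of the "touch" idea: bump each file's modification time so
// browsers that compare Last-Modified will download a fresh copy.
const fs = require('fs');

const files = ['/www/sample/file1.css', '/www/sample/file2.css'];
const now = new Date();

for (const file of files) {
  fs.utimesSync(file, now, now); // set access and modification times to "now"
}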
I used to have the same problem, and I believe it's a (pretty annoying) bug in Chrome. You can use the CSS Reloader Chrome extension to work around it. Not ideal, but better than nothing.
If you are trying out new CSS updates, I suggest using Chrome's "Inspect" function to dynamically update CSS settings and observe the results interactively. This may save some time during update cycles as compared to manual edits alone.
Another option to try is defining "cache-control" meta tags in your head section. For development/testing you may want no caching at all; for a production website you may want a shorter max-age. Refer to the following SO Q&A:
Using meta tags to turn off caching in all browsers?
I was wondering whether anybody could direct me to any tools for debugging the cache.manifest file for offline HTML5 access. I recently downloaded a program called Manifesto, which lets me look up the cache manifest when a page loads. Everything seems to be working fine, but it keeps saying that the status is "uncached". So although it is checking that the cached files are there, it isn't actually caching them on load. What's going on, and more importantly, how do I figure out how to solve it?
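Not a tool recommendation, but one way to see why the page stays "uncached" is to listen to the AppCache events from the page itself (the API is long deprecated and removed from modern browsers, but it was the relevant one for cache.manifest); a minimal sketch:
// Log every AppCache event together with the resulting status. An "error"
// event usually means the manifest is not served with the text/cache-manifest
// MIME type or that a resource listed in it failed to download.
var cache = window.applicationCache;
var statusNames = ['UNCACHED', 'IDLE', 'CHECKING', 'DOWNLOADING', 'UPDATEREADY', 'OBSOLETE'];
var events = ['checking', 'downloading', 'progress', 'cached', 'noupdate', 'updateready', 'obsolete', 'error'];

events.forEach(function (type) {
  cache.addEventListener(type, function () {
    console.log(type, '-> status:', statusNames[cache.status]);
  });
});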
I got it to work. To be honest, it might have been working before, but because I had some different PHP scripts running in the background, I had to be a little careful about how I pulled it up.