UPDATE 28 July 2020: There is an ongoing discussion on this issue in the Chromium project: https://bugs.chromium.org/p/chromium/issues/detail?id=1107442&q=svg&can=2. The issue has been resolved in M86 canary and will likely be merged into M85 as well (84 may remain broken).
Cross-post from https://support.google.com/chrome/thread/60499004?hl=en
As of Chrome 84, I'm noticing issues with rendering icons from SVG sprite files if they are not served from the local disk cache. I am able to reliably reproduce with the following example code (assuming cache is disabled in DevTools and/or a force refresh is used):
<svg>
<use xlink:href="/path/to/sprites.svg#icon-name"></use>
</svg>
...where sprites.svg is a static file on the application server, and icon-name is the id of a <symbol> in that file. Pages containing the above code fail to render the icons on the first page load (i.e. when the file is not yet cached). I've added a server-side Cache-Control max-age greater than 0, as suggested by a potentially related thread. This appears to resolve the issue in HTTPS environments, but the SVGs still fail on non-cached loads in HTTP environments.
Converting the references to inline SVGs does resolve the issue across both HTTP and HTTPS environments, but such an approach loses the advantage of caching the entire icon set in sprite form for usage across an application. The issue first appeared following an upgrade to Chrome 84 from 83 with no associated code changes.
This is a known issue in Chrome M84 and will be patched in M85 at the latest (currently targeted for stable release on Aug. 22). As suggested in the linked Chromium forum thread, the following solution can serve as a workaround until a fix is made available either as an M84 hotfix or within M85:
document.querySelectorAll('svg').forEach(x => {x.innerHTML = x.innerHTML});
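A sketch of how this might be wired up so it runs after the sprite has had a chance to load (the load-event timing is my own assumption, not something from the Chromium thread):
// Re-assigning innerHTML forces Chrome to re-evaluate the <use> references;
// waiting for the load event gives the external sprite a chance to arrive first.
window.addEventListener('load', function () {
  document.querySelectorAll('svg').forEach(function (el) {
    el.innerHTML = el.innerHTML;
  });
});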
I ran into the same problem with Chrome 84.
It looks like this only happens if you use the same SVG file multiple times on your page. During page load, once the first icon from the SVG sprite is ready to be rendered, Chrome appears to stop reading the rest of the SVG file, so the other icons are never rendered (or are only rendered partially).
As a workaround, you can try adding a query parameter with a different value to each SVG href:
<!-- first icon, add v=1 param -->
<svg>
<use xlink:href="/path/to/sprites.svg?v=1#icon-name"></use>
</svg>
<!-- second icon, add v=2 param -->
<svg>
<use xlink:href="/path/to/sprites.svg?v=2#other-icon"></use>
</svg>
This way, Chrome should treat each reference as a separate cached resource, so the loading interruption that leaves some icons unrendered should no longer happen.
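If the page has many icons, numbering the params by hand gets tedious, so here is a small sketch of my own (not from the Chromium thread) that appends a unique counter to every <use> reference; it assumes every <use> on the page points at the sprite file:
// Give every <use> reference a unique query param so Chrome treats each
// fetch of the sprite as a separate resource.
var XLINK = 'http://www.w3.org/1999/xlink';
var counter = 0;
document.querySelectorAll('svg use').forEach(function (use) {
  var href = use.getAttribute('href') || use.getAttributeNS(XLINK, 'href');
  if (href && href.indexOf('#') !== -1) {
    var parts = href.split('#');            // ["/path/to/sprites.svg", "icon-name"]
    var newHref = parts[0] + '?v=' + (++counter) + '#' + parts[1];
    use.setAttributeNS(XLINK, 'xlink:href', newHref); // for markup that uses xlink:href
    use.setAttribute('href', newHref);                // SVG2-style plain href
  }
});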
Related
I'm still fairly new to fixing website-related problems. I found that after clearing the cache in IE 11 and loading my site, Google Fonts takes 240 seconds to download before my webpage finally loads. I've included an image:
Could someone tell me what I am doing wrong, please?
Thank you!
IE11 has some issues with Google Fonts, especially fonts over SSL. You can try these solutions:
Web Font Loader: Use the Web Font Loader javascript instead of the default method of adding fonts (CSS way). It provides a common interface to loading fonts regardless of the source, then adds a standard set of events you may use to control the loading experience. The Web Font Loader is able to load fonts from Google Fonts, Typekit, Fonts.com, and Fontdeck, as well as self-hosted web fonts. It is co-developed by Google and Typekit.
Sample usage:
<script src="https://ajax.googleapis.com/ajax/libs/webfont/1.6.26/webfont.js"></script>
<script>
WebFont.load({
google: {
families: ['Droid Sans', 'Droid Serif']
}
});
</script>
If this solution works, you can implement a specific case just for IE11.
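For example, a minimal sketch of an IE11-only path; the document.documentMode check is a common IE detection heuristic, and the font families are just the ones from the sample above:
// Only switch to the Web Font Loader on IE11; every other browser keeps the
// normal Google Fonts <link>. document.documentMode only exists in IE.
var isIE11 = !!window.MSInputMethodContext && !!document.documentMode;
if (isIE11) {
  var script = document.createElement('script');
  script.src = 'https://ajax.googleapis.com/ajax/libs/webfont/1.6.26/webfont.js';
  script.onload = function () {
    WebFont.load({
      google: { families: ['Droid Sans', 'Droid Serif'] }
    });
  };
  document.head.appendChild(script);
}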
Append style tag: A weird one but it seems to work in some cases.
With the help of jQuery:
$('head').append('<style></style>')
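If jQuery happens not to be available, the plain-JS equivalent of that one-liner would be:
// Append an empty <style> element to <head> -- same trick, no jQuery.
document.head.appendChild(document.createElement('style'));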
Debug and validate HTML: Third, but maybe the most important solution, is to check your HTML markup. Parsing errors can cause all sorts of problems that are easy to miss. https://validator.w3.org/nu/?doc=https://nutikell.com/ The W3 Validator gives 5 errors and a fatal error that you should really fix. As the validator says about the fatal error, it "Cannot recover after last error. Any further errors will be ignored." That could well be IE's problem.
From validation results:
3. [Error]: Element style not allowed as child of element div in this context. (Suppressing further errors from this subtree.)
From line 162, column 5; to line 162, column 11
...
5. [Error]: End tag "a" violates nesting rules.
From line 463, column 3; to line 463, column 89
<a href="https://nutikell.com/nutikellad/nutikell-westyx-king/" class="main-thumbnail">
6. [Fatal Error]: Cannot recover after last error. Any further errors will be ignored.
From line 463, column 3; to line 463, column 89
Browser Plugins: IE plugins like script blockers or ad blockers may also delay loading the font files and cause timeout of the request. Check if you have one.
Max http connections: Each browser has a limit on the number of simultaneous HTTP connections. Your site, for example, has 21 CSS files, which is extremely high. You can see this in Firefox's devtools (F12 > Network Monitor > CSS). Maybe IE11 can't fetch all the CSS files at the same time. You could use a task runner to join/minify your CSS files.
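As an illustration only, a minimal gulpfile that concatenates and minifies the stylesheets (assuming gulp, gulp-concat and gulp-clean-css are installed; the paths are placeholders):
// gulpfile.js -- illustrative sketch, not part of the original answer
const { src, dest } = require('gulp');
const concat = require('gulp-concat');
const cleanCSS = require('gulp-clean-css');

// Bundle every stylesheet into a single minified file so the browser
// makes one request instead of 21.
function css() {
  return src('assets/css/*.css')      // placeholder source path
    .pipe(concat('bundle.min.css'))
    .pipe(cleanCSS())
    .pipe(dest('dist/css'));
}

exports.css = css;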
Test on other machines: Check out Browserstack to test on another machine. If you prefer local dev, Microsoft offers free downloads of virtual machines for testing IE.
EOT and WOFF fonts not loading over HTTPS in IE: An old post may give an idea: "We had the same issue with EOT and WOFF fonts in all versions of Internet Explorer (7-11) not loading over HTTPS. After hours of trial and error and comparing our headers with other working sites, we found it was the Vary header that was messing things up. Unsetting that header for those file types fixed our issue right away."
If your server is Apache this code below would go in the .htaccess file:
<FilesMatch "\.(woff)$">
Header unset Vary
</FilesMatch>
<FilesMatch "\.(eot)$">
Header unset Vary
</FilesMatch>
There are plenty of other solutions for this case in that post; you can check them as well.
We are having issues with the <object> tag in chromium.
In our web application we push base64 content into the tag, and it renders the content as expected in Chrome, but the issues happen when we do this in CEF/CefSharp. The object only renders the first document selected. Any subsequent selection still renders the initial document. I can see the data attribute being updated with the new content, and the object appears to load, but it always shows the first document that was selected.
If we replace the tag's outerHTML with a new object tag containing the new content, the data displays correctly, but this is not really an option because the web application works as expected in Chrome.
We have a web application that is also required to run embedded inside a bespoke application, and for this we are using CefSharp. We are currently on version 49.0.1 and unfortunately cannot upgrade to the latest version because the client's machines do not yet have the latest version of the .NET framework installed.
It appears there is (was) an issue with chromium (see here).
What we had to do to solve the problem was set the display of the <div> that holds the <object> to 'none', set the data, and then set the display back to 'block'. It was probably just a rendering issue. Crisis averted.
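A rough sketch of that sequence in JavaScript (the element ids, MIME type and wrapper structure are assumptions for illustration, not taken from the actual application):
// Hide the wrapper, swap the data, then show it again so CEF re-renders the <object>.
function showDocument(base64Pdf) {
  var container = document.getElementById('viewer-container'); // hypothetical wrapper <div>
  var obj = document.getElementById('viewer-object');          // hypothetical <object>
  container.style.display = 'none';
  obj.setAttribute('data', 'data:application/pdf;base64,' + base64Pdf);
  container.style.display = 'block';
}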
In previous version of chrome, on a webpage with the following:
<script>
document.write('<plaintext>');
</script>
<img src="http://example.com/image.jpg">
the image would not be downloaded. At some point a chrome update changed this behavior. Now when I look at the network tab, I see the image is downloaded. (fiddle here: https://jsfiddle.net/doojunqx/)
I have a script on the page, and I would like to use it to stop the browser from downloading (and using up network bandwidth on) images and other unwanted assets that appear below my script tag.
Mobify does something similar here:
http://cdn.mobify.com/mobifyjs/examples/capturing-grumpycat/index.html
As they say on the page "Open your web inspector and note the original imgs did not load." However, when I open chrome developer tools and look at the network tab, I see the original images ARE now loading. I'm not sure what version of chrome changed this, but I think it is recent, within the last month or two.
Is there any way to force chrome back to the old behavior? Or any other way to stop these unwanted assets from loading?
Thanks,
Great question, and you're correct that it is a recent change in Chromium that affected the plaintext tag behaviour. In versions up to and including version 42.*, the HTML document parser would not spawn an asynchronous parsing thread until an external resource was found in the original HTML document. Once such a resource was found, an asynchronous thread would be spawned that would aggressively download all resources referenced within the HTML.
The recent change simplified the parsing behaviour by moving all document parsing to the asynchronous thread which now kicks off automatically. Whereas before, using the plaintext tag would ensure that no resources would be loaded if it was inserted before the first external resource, the plaintext tag is now racy as resources will download up to the moment the plaintext tag is executed in the main HTML document. As there is a time delay for the script to execute, an unknown number of resources will be retrieved.
There is as yet no solution to this new behaviour, nor is there a way to disable the preload scanner as you would like. You will need to rely on workarounds, such as polyfills, to control your resource downloads. This new behaviour is present in all versions of Chrome >= 43.* and has not been implemented in Safari, Firefox, or other browsers.
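One workaround in that spirit is to keep the real URLs out of src entirely, so the preload scanner never sees them, and only swap them in from your own script. A minimal sketch, assuming you control the served markup:
<!-- Serve the markup with a data-src placeholder so there is nothing for the preload scanner to fetch -->
<img data-src="http://example.com/image.jpg" alt="">
<script>
// Later, when (and if) you actually want the images, copy data-src into src.
document.querySelectorAll('img[data-src]').forEach(function (img) {
  img.src = img.getAttribute('data-src');
});
</script>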
I'm having some problems with the resources I try to load on my web application, using img or video HTML tags.
I use angular for my app and dynamically set the src param of the img tags, using the ng-src="{{ src }}" directive.
There are not that many images and resources to load (the model I'm currently testing loads about 30 images or videos), but it seems the browsers (tested on Chrome and Firefox on Windows and OSX) don't handle that many parallel requests correctly: the requests appear as "pending" in Chrome and show no status at all in Firefox.
At the moment I took these screenshots, there were no other requests running. I have this problem all the time, but the number of successful requests versus pending requests changes almost every time I refresh the page.
The resources URIs are of course correct, and I can open pictures in another tab with no problem.
If it can be of any relevance, even though it doesn't seem like the request is even sent from the browser, the resources are loaded from an Azure blob storage, and CORS options are correctly set.
The problem happens both when I test my app locally (with localhost as hostname) and when it is hosted on a web server.
Have you any idea on why this problem is happening and how I can fix it?
Thanks !
I finally found what caused the problem!
I had several videos loading via a video tag with the preload="metadata" attribute. This caused many connections to start, and it seems that even after the preload finished, it prevented the requests for the other resources from being sent. I think it is a browser bug; this should not be the intended behavior.
With preload="none" on the video tags, I don't have any problem like that.
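For reference, this is all it takes (the file name is just a placeholder):
<!-- preload="none": the browser opens no connection for the video until playback starts -->
<video src="videos/example.mp4" controls preload="none"></video>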
I am experiencing this weird issue where my Chrome browser keeps loading an old version of my website whose code doesn't even exist on my server any more. I assume it's a typical cache issue.
I tried clearing the browser cache, using incognito mode, and flushing the DNS cache. The old cached page is still being loaded.
This issue seems to have been discussed on this Google group for three years, but there is still no solution. https://productforums.google.com/forum/#!topic/chrome/xR-6YAkcASQ
Using firefox or any other web browsers works perfectly.
It doesn't just happen to me. All my coworkers experience the same issue on my website.
<?php Header("Cache-Control: max-age=3000, must-revalidate"); ?>
You can send this header from a PHP statement that must come before any output in your index file; Cache-Control is an HTTP header typically issued by web servers. You can also rename the resource that is considered "stale". This tutorial will give you more details: https://www.mnot.net/cache_docs/
I'm not sure if I understand your problem correctly, but I was experiencing something similar and instead of clearing the cache I disabled it by doing this:
Open chrome and then go to your website
Press Command + Option + C (Mac)
Now that you've opened Chrome's DevTools, go to the main menu where it says: Elements, Console, Sources, ...
Click on the menu element that says Network
Make sure that the "Disable Cache" checkbox is checked
Then reload the page without closing the DevTools
This worked for me.
Let me know if it worked for you :)
A short term fix to view the new version of your site would normally be to clear out the cache and reload, for some reason this doesn't always work on Chrome. This short term solution is not going to fix the problem for every user that's on your site though, it will just allow you to see the new version of your site.
Adding version numbers to CSS and JS files will allow you, and every other user, to see the most recent version of your site. A version number will force any browser not to load the files from the user's local cache and instead load the actual files from the server, whenever the version number differs from the one in the user's cache.
So if you have these files on your server:
ExJS.js
ExCSS.css
and change them to:
ExJS.js?v=1.01
ExCSS.css?v=1.01
the new version of these files will load in any browser.
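In the HTML, the references would then look something like this (using the example file names from above):
<script src="ExJS.js?v=1.01"></script>
<link rel="stylesheet" href="ExCSS.css?v=1.01">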
Normally, a browser will always load the HTML file from the server, but there are some HTML meta tags you can use to make sure that the most recent HTML version will load for any user:
<meta http-equiv="Cache-Control" content="no-cache, no-store, must-revalidate" />
<meta http-equiv="Pragma" content="no-cache" />
<meta http-equiv="Expires" content="0" />
There are also ways to make sure that files in other languages always load the most recent version as well, which is discussed on this post:
How to add version number to HTML file (not only to css and js files)
You can open Inspect (DevTools), go to the Network tab and check "Disable cache".
Change the names of the images and make the corresponding image-name changes in the HTML file. I found this to be a quick fix for my website.
I ran into the same issue, and I also tried to disable caching on my JSP pages
<%
response.setHeader("Cache-Control", "no-cache");
response.setHeader("Pragma", "no-cache");
response.setDateHeader("Expires", 0);
%>
But it didn't help.
This is a known issue with Google Chrome and Chromium-based browsers, and it can occur even after you clear the cache and cookies.
However, it may or may not affect most users.
It has also remained unresolved for the last 9-10 years.
Hence, for testing purposes I would highly recommend using Mozilla Firefox or Opera.
Admittedly, that makes your application sound limited to certain browsers for the best experience, which may not be convincing to business/end users.
Having said that, this caching issue may or may not affect most of us.
You should be able to clear the problem by resetting Chrome. This is the only way I found to clear this condition - after tearing my hair out for half a day.
Prior to finding this, I tried clearing the cache, deleting the contents of the various cache directories etc. in vain.
[As of today, May 3 2021] You can do this by going to 'Settings > Advanced > Reset and clean up > Restore settings to their original defaults'. Note that this will not remove any bookmarks, but it will log you out of all accounts you are signed into.
Adding a CNAME record may also help: if you always access the site without www, try www.example.com instead.
I came across this issue developing locally, and tried the following things:
Clearing Cache + generally ALL files in Chrome
Setting the Cache-Control Header like Eli Duhon mentioned.
Setting the Cache Control Header in multiple other ways.
And the only thing that fixed the problem for me was to basically re-start my docker containers on which the app was running.
so I did this:
docker-compose down
And then
docker-compose up
and everything was updated after that.
HOWEVER, if you make changes again, they are still not updated...
So this is certainly not a fix for this problem, as I don't even know what causes this behaviour in the first place, but I assume it has something to do with hot reloading and/or Docker. That was the only thing that did the trick for me, so I thought I would mention it here...
I had this problem moving a Wordpress site to new hosting where the URL redirects to .../wp, which hadn't been the case before.
Chrome was helpfully presenting a directory listing showing the file dates from the old server, despite the DNS having updated fully a week ago. So it was obviously demonstrating the problem discussed here.
I added an index.html file with just the following in it:
<meta http-equiv="refresh" content="0; URL='http://my-wp-site.com/wp'"/>
which fixed the problem straight away, including on Chrome browsers that had not had their cache cleared and that had no knowledge of any Google account of mine.
I don't know why this worked, however, given all the problems people have listed above.
You have two options:
a) Consider fingerprinting the stale resources, like:
<script src="js/app-4829382839238882882bb3442bbbbdhh3kh3.js" type="text/javascript"></script>
b) Add cache-control headers such as Cache-Control and Expires on your web server.
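As an illustration, a .htaccess sketch for Apache with mod_headers enabled (the file types and max-age value are only examples):
# Cache fingerprinted static assets aggressively; the fingerprint in the
# file name ensures users still pick up new versions.
<FilesMatch "\.(js|css)$">
Header set Cache-Control "public, max-age=31536000"
</FilesMatch>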
This is a good read on browser caching