What does Blink in-memory cache store?

Besides the browser cache, there are a few other ways browsers cache data. In Chrome, the rendering engine Blink has its own cache that stores images, styles, scripts and fonts (and possibly more) in memory.
This cache is used for consecutive navigations on a site. Resources delivered from the Blink cache are tagged with (from memory cache) in the network tab. Resources served from the browser cache are tagged with (from disk cache).
My question is: which resources are stored in and delivered from this very fast cache? From my tests, it varies a lot:
It works extremely well for images and script tags that are directly in the HTML.
It sometimes works for style (link) tags that are directly in the HTML, and sometimes it does not (in the same browser with the same session).
It almost never works for script tags that are inserted into the HTML programmatically, although occasionally it does.
One huge difference between disk cache hits and memory cache hits becomes visible in combination with Service Workers. Requests that are served by the in-memory cache cannot be observed in the Service Worker (because the memory cache sits in front of the Service Worker), while requests that are served by the disk cache do pass through the Service Worker (since the browser cache lies behind it).
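A simple way to see this yourself is to register a Service Worker whose fetch handler does nothing but log what reaches it; memory-cache hits never show up in that log. A minimal sketch (the file name sw.js is just an example):

// sw.js - log every request that actually reaches the Service Worker.
// Responses answered by Blink's in-memory cache never fire this event.
self.addEventListener('fetch', function (event) {
  console.log('SW saw request for:', event.request.url);
  // Pass the request through; fetch() still consults the HTTP (disk) cache.
  event.respondWith(fetch(event.request));
});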
To show the explained behavior, I built a test page with all resource types: https://dm-clone-optimized.app.baqend.com/
You can navigate through the site with the links at the top and observe how the requests behave in the network tab and console. Every page loads the same resources.
After a bit of navigating (Chrome 70.0.3538.67), I get this behavior most of the time:
HTML is fetched from network
Script tags scripts.js and scripts2.js are from in-memory cache
Image tag logo.png is from in-memory as well
Style link tag styles.css is from disk cache :(
Programmatically added script tag scripts2.js?id=1 is from disk cache as well :(
Sometimes though, I get really lucky and everything is served from in-memory cache:
I would love to understand how the Blink in-memory cache works and how I can tune my site to use it for all resources with appropriate Cache-Control headers.
---- edit ----
What concerns me the most is: Why are dynamically added scripts apparently never served from the in-memory cache? This has a noticeable impact on frameworks like require.js, since they insert all dependencies as dynamically added script tags.
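For reference, this is the kind of injection meant here; a minimal sketch of how a loader such as require.js adds a dependency (the file name and query string just mirror the test page above):

// Dynamically injected script tag, the way require.js loads dependencies.
// In the tests above these requests tend to hit the disk cache, not the memory cache.
var script = document.createElement('script');
script.src = '/scripts2.js?id=1';
script.async = true;
document.head.appendChild(script);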

How the Blink in-memory cache works:
Blink has four memory allocators:
PartitionAlloc, Oilpan, tcmalloc, and the system allocator.
The team working on Chrome's Blink has removed tcmalloc and the system allocator from Blink.
Blink (PartitionAlloc + Oilpan) is the second-largest consumer of the renderer's memory, consuming 10-20% in typical cases, and it retains some memory in Discardable, CC and V8.
Inside Blink, the primary memory consumers are:
Large StringImpls (used by JavaScript source code)
shared buffers (used by Resources)
Vectors and HashTables
The recommendation is to: "identify caches that have an impact on Blink’s total memory and implement purgeMemory() only on them."
Reducing the size of DOM objects won't have an impact.
Discarding caches won't have an effect in most cases.
They are working on getting rid of "DiscardableMemory" items, which will help to do things like forcibly detaching all layout objects, which in turn will release the memory retained by the layout tree.

I believe this is the result of an optimization in Chrome, and DevTools simply makes it visible to you.
The files always go into the disk cache. They also go into memory, and are flushed from there fairly soon.
Chrome is smart enough to ask its running processes whether they still have a loaded copy in memory before seeking on disk.
This step has a high hit rate, since those images/JS files are actively being used for something.
You will not have any control over how Chrome manages their TTL or how much memory it uses to keep blobs hot. The Chrome dev team does quite a lot of dynamic tuning based on actual hardware capacity and system load.
P.S. If you are asking how to keep YOUR APP in memory, you are falling into the Sun/Adobe way of evil: keeping their app DLLs hot in memory (via a tray icon/service) and slowing everyone else down.
P.P.S. If it is an app end users might want to keep open, use Electron and follow WhatsApp/Slack/etc. in building an app that is always running.

Related

Chrome - Images disappearing from cache storage after a few hours

I have a website that is used as a kiosk app. When online, I preload data and images from a WordPress API and store the images in Cache Storage.
A service worker intercepts HTTP GETs to these images and serves the cached data instead. This way, the app can run offline (API calls included).
But after a few hours running offline (generally around 6h), some images disappear from Cache Storage. And it's always the same ones.
But not all.
Any idea where I should check to see what's going wrong?
There can be two possible reasons for such clearing of the cache: either your storage gets full (Chrome localStorage) and therefore gets cleared, or the data sent from the server expires; check the headers, which might give you insight into how long it takes to expire.
And to check whether the data gets evicted from storage, try testing your website in Safari or Edge, where such eviction does not occur.
How did you configure Cache-Control? The entries that keep getting deleted might be configured with max-age.
It's still recommended that you configure the Cache-Control headers on your web server, even when you know that you're using the Cache Storage API. You can usually get away with setting Cache-Control: no-cache for unversioned URLs, and/or Cache-Control: max-age=31536000 for URLs that contain versioning information, like hashes.
as stated at https://web.dev/service-workers-cache-storage
Also, you should check the amount of storage used by Cache Storage (https://developers.google.com/web/tools/chrome-devtools/storage/cache). Even though they claim that either all caches are deleted or none of them, since this is currently an experimental technology, it is normal to be suspicious about this point.
You are also responsible for periodically purging cache entries. Each browser has a hard limit on the amount of cache storage that a given origin can use. Cache quota usage estimates are available via the StorageEstimate API. The browser does its best to manage disk space, but it may delete the Cache storage for an origin. The browser will generally delete all of the data for an origin or none of the data for an origin. Make sure to version caches by name and use the caches only from the version of the script that they can safely operate on. See Deleting old caches for more information.
as stated at https://developer.mozilla.org/en-US/docs/Web/API/Cache
Service Workers use the Storage API to cache assets.
There are fixed quotas available, and they differ from browser to browser.
You can get some more hints from your app using the following call and analysing the results:
navigator.storageQuota.queryInfo("temporary")
  .then(function (info) {
    console.log(info.quota); // the quota in bytes
    console.log(info.usage); // the used data in bytes
  });
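Note that navigator.storageQuota is an older, Chrome-specific interface; in current browsers the standardized StorageManager API gives the same numbers. A minimal sketch (requires a secure context):

// Standardized replacement for the older storageQuota call above.
if (navigator.storage && navigator.storage.estimate) {
  navigator.storage.estimate().then(function (estimate) {
    console.log(estimate.quota); // available quota in bytes
    console.log(estimate.usage); // currently used bytes
  });
}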
UPDATE 1
Do you have anywhere a cleanup method or logic that removes old cache entries and might somehow be triggered unexpectedly? Something like the code below.
caches.keys().then(function (keyList) {
  return Promise.all(keyList.map(function (key) {
    if (key !== cacheName) {
      return caches.delete(key);
    }
  }));
});
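For completeness, this kind of cleanup is usually paired with a versioned cache name that is bumped on every release, roughly like this sketch (the cache name and asset list are hypothetical):

var cacheName = 'kiosk-assets-v2'; // hypothetical versioned name, bump on each release

self.addEventListener('install', function (event) {
  event.waitUntil(
    caches.open(cacheName).then(function (cache) {
      // Pre-cache the assets this version of the service worker depends on.
      return cache.addAll(['/index.html', '/logo.png']);
    })
  );
});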
A couple more questions:
Do you have any business logic in the SW dealing with caching the images?
Could you get any info by using the storageQuota methods to see the available quota and usage? This might be very useful to understand whether the images are evicted because the limit has been reached. (Even by marking the storage as persistent, you cannot be 100% sure the assets are retained.)
From "Building PWAs" (o'Reilly - Tal Alter):
Each browser behaves differently when it comes to managing CacheStorage, allocating space in the cache for each site, and clearing out old cache entries.
In addition to a storage limit per site (also known as an origin), most browsers will also set a size limit for their entire cache. When the cache passes this limit, the browser will delete the caches of the site accessed the longest time ago (also known as the least recently used).
Browsers will never delete only part of your site’s cache. Either your site’s entire cache will be deleted or none of it will be. This ensures your site never has an unpredictable partial cache.
Considering the last words, "Browsers will never delete only part of your site's cache," makes me think that maybe it is not a problem of the cache limit being reached, as otherwise all images would have been deleted from the cache.
If interested, I wrote an article about this topic here.
You could mark your data storage unit (aka "box") as "persistent", making the user agent retain it as long as possible (more details in the Storage API link above).
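Requesting persistence is a single call on the StorageManager API; a minimal sketch (the browser may still deny the request or evict data later):

// Ask the browser to treat this origin's storage as persistent (best effort).
if (navigator.storage && navigator.storage.persist) {
  navigator.storage.persist().then(function (granted) {
    console.log(granted ? 'Storage marked as persistent.'
                        : 'Storage may still be evicted under pressure.');
  });
}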

Bootstrap CDN Rendering Delay

I have been working on a site and have used Bootstrap via MaxCDN by putting
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
in the head section.
Google Pagespeed is showing:
Your page has 1 blocking CSS resources. This causes a delay in rendering your page.
None of the above-the-fold content on your page could be rendered without waiting for the following resources to load. Try to defer or asynchronously load blocking resources, or inline the critical portions of those resources directly in the HTML.
Optimize CSS Delivery of the following:
https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css
Is there any way of fixing this?
Thanks
The other answer is simply incorrect, so I am writing this in case anyone still needs help on this.
CDNs speed up your production websites too, not only your local development environment. Hosting a static file locally on your own host does not solve the render-blocking issue that the PageSpeed tool is pointing out here. A static file hosted locally is still render blocking.
The straightforward fix would be to inline the render-blocking CSS files completely. However, I would recommend not inlining the resources, for several reasons:
Resources delivered over a CDN are delivered faster. If you inline them, the entire payload is expected to be delivered at a slower pace.
Resources delivered separately are cached by the browser, while if you inline CSS resources the browser will not be able to reuse them across requests.
Inlined resources are compressed on every request along with the other dynamic contents of the file. You cannot take advantage of the pre-compression that comes with a CDN.
A bigger HTML document due to inlined CSS files is expected to be slower to parse.
What should one do to resolve this issue? You might think differently, but I would suggest you do nothing.
It's not that bad if the browser has to wait a little for resources to download. If you are using a CDN, chances are visitors will not perceive the download, as most CDNs nowadays have less than 50 ms average global latency.
Resources are cached by the browser. While the PageSpeed tool loads fresh resources on every request, note that the browser may be loading some of the resources from its cache, completely eliminating the CDN request.
Resources are shared across websites. If you are using Bootstrap from a public CDN, chances are that the same Bootstrap file is already in the visitor's browser cache, downloaded when they visited another website that used Bootstrap. This gives 0 ms latency and 100% bandwidth savings for that particular resource, even for first-time visitors who have none of your site's other resources in their browser cache. The browser can then spend the saved bandwidth elsewhere to speed other things up.
Manually inlining external libraries makes it a little more difficult to keep track of all inlined copies of the library, and makes edits and updates harder.
Dynamic inlining adds a few more disk seeks per request, which just adds to the server load. These IOs can be saved if the files are served separately; your OS may simply cache the files if they are hot enough. Using a CDN eliminates this load completely.
Although not exactly a solution that will remove this PageSpeed warning, it is possible to reduce the impact of render-blocking resources for real visitors (as opposed to performance measurement tools):
Serve resources with aggressive compression to reduce the payload size.
Serve resources with an immutable Cache-Control header to tell the browser it can confidently store the file for a long period, since it is not going to change in the future (see the sketch after this list). If you use the Bootstrap CDN by PageCDN, these two optimizations are enabled by default.
If you know a file is going to be loaded immediately after a page load, you can use HTTP/2 Server Push to deliver the file before the browser even asks for it. However, if you do this, you will need to make sure that the same file is not aggressively pushed on every request (that is not a good option, as the file should load from the browser cache from the second request onwards).
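To illustrate the immutable header mentioned above, here is a rough sketch of the response side using only Node's built-in http and fs modules (the file path and port are arbitrary examples):

// Serve a versioned CSS file with an aggressive, immutable cache policy.
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  if (req.url === '/css/bootstrap.min.css') {
    res.writeHead(200, {
      'Content-Type': 'text/css',
      // One year, immutable: the browser will not even revalidate the file.
      'Cache-Control': 'public, max-age=31536000, immutable'
    });
    fs.createReadStream('./css/bootstrap.min.css').pipe(res);
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);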
Either change the CDN (which will most likely do nothing for you) or simply store the file locally. This will improve your Google PageSpeed score.
Download a copy of the Bootstrap file you are using and store it like so: root/css/bootstrap.min.css,
where root is your project folder.
CDNs are used mainly for testing purposes (instant access to files) or in larger-scale projects which have multiple requirements that can't always be met locally.
Read this thread to better understand.
While the error that Google gives you might not be a serious issue for your project, it is always good practice to serve your resources locally, so that your website can load by itself, without referencing external sources that require a separate request.
Static files = better load times = happy Google.

Why does Chrome's Content Download take longer for cached resources?

I am trying to make the page load as fast as possible. The problem is that I have one file in particular that takes a long time to download (even when the size is 0, i.e. loaded from cache).
Here is how it looks without cache (first load):
And with cache:
I agree that the time has dropped by almost 50%, but why is the Content Download time from my local cache higher than the one from the server?
NOTE:
I am not sure if the times are explained the same in Safari, but it doesn't seem to take that long to get the file there.
Image from Safari Network tab:
After running tests on several projects, and without any code or further breakdown of the content download process, here is the conclusion I've reached.
System threads set a priority level for each process they are assigned. When a file is stored in the cache, the download performance is, for the most part, dependent on the performance of the machine. If threads are bogged down, this can result in a slower content download than a download from a server. However, a download from a server is typically slower because of all of the processes involved in contacting, requesting, receiving and handling the download.
The advantage of the cache is easily seen when handling slow connections and/or large files. For example, if there are 10+ seconds of waiting due to a slow connection, then it's not ideal to load the full page more than once. It would take an incredibly slow system to justify sending that page twice.
Since the browser cache removes the send/receive process, it is faster, but it doesn't remove load times entirely. In this case, it happens to yield a higher load time in one category.
Side note: it seems odd to classify a file read from the cache as content downloaded, but this seems to be consistent across all of the projects I tested.

Chrome running out of memory

I've developed a Chrome Extension that cycles through approximately 1500-2000 pages to collect information from a website and push it to my own server. I use a Chrome extension because, given the requirements, it's much easier to configure than fetching and parsing from the server side.
The extension is used only for internal purposes, and we trigger this job using the chrome.alarms API to run at 3:00 am every day. When this alarm triggers, it pops open a new tab and runs through the 1500-2000 pages. The problem is that when we check in the morning, we see the "Aw, Snap!" dialog after the extension has cycled through about 1500 pages (approx. 3/4). I'm presuming this is due to the memory demand placed on Chrome to maintain such an unusually large history?
My question is: what would be the best way to mitigate this? Presumably killing the tab and reopening it (after x number of pages) would work, but that would slow down the feed and require quite a bit of code refactoring. Is there any way to force Chrome to dump the history, and would doing so free memory in the immediate session?
Just to add some context, I am running this on an extra-small VM with only 1 GB of memory. I appreciate that I could upgrade the VM, but that really just defers the problem.
It's hard to make a solid recommendation without seeing your code, but here's something a bit more general:
The problem is likely the information that is being collected. It continues to build up in memory until you run out. You might consider saving the results to local storage using Chrome's unlimited storage capability. In the manifest file:
"permissions": [
"unlimitedStorage"
],
What I would probably do is save to local storage after every page, or something similar, and then reset the array (or whatever you're using).
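A rough sketch of that idea, assuming the scraped results accumulate in an in-memory array (the storage key and variable names are hypothetical):

// Collected page data; grows while the extension cycles through pages.
var results = [];

// Append the current batch to chrome.storage.local, then clear the in-memory copy.
// Requires the "storage" (and ideally "unlimitedStorage") permission in the manifest.
function flushResults() {
  chrome.storage.local.get({ scrapedPages: [] }, function (data) {
    var combined = data.scrapedPages.concat(results);
    chrome.storage.local.set({ scrapedPages: combined }, function () {
      results = []; // release the memory held by the in-memory copy
    });
  });
}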
UPDATE
If indeed history is the problem, you could consider deleting it on the fly. This function will delete entries for a specific URL:
https://developer.chrome.com/extensions/history.html#method-deleteUrl
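A minimal sketch of calling it after each page has been processed (the URL is a placeholder; requires the "history" permission):

// Remove a single visited page from Chrome's history once it has been scraped.
chrome.history.deleteUrl({ url: 'https://example.com/page-1500' }, function () {
  console.log('History entry deleted.');
});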

What is the difference between a primed cache and empty cache?

What is the difference between a primed cache and empty cache?
For example, the Statistics result of YSlow provides graphical data for an empty cache vs. a primed cache. What is the difference between them?
Simply put, a primed cache means the browser already has the resources cached. It has been to the page before, or (though I don't think YSlow means it this way) it has been to a page that uses some of the same resources (images, CSS, JavaScript).
This was asked 3 years ago, but I bumped into this question myself since I had it too. So I did a small search on the internet and found this:
Statistics is the third tab and provides a graphical representation of the number of HTTP requests made to the server and the total weight of the page in kilobytes for both Empty Cache and Primed Cache scenarios.
The Empty Cache scenario is when the browser makes the first request to the page and the Primed Cache scenario is when the browser has a cached version of the page. In a Primed Cache scenario, the components are already in the cache, so this will reduce the number of HTTP requests and thereby the weight of the page.
The keyword here is "scenarios". This does not mean that the graphs will change once you have already cached the page. I ran the test two times even though the page was cached, and it always displays both graphs, since it shows the "scenarios". So if I have cached the page, I am looking at the Primed Cache scenario, but my new visitors get the Empty Cache scenario.
So in the above example, when I request the page and it is cached, my browser will still make 3 requests with a total weight of 86.6K.
This page explains what actually yslow displays. http://www.devcurry.com/2010/07/understanding-yslow-firebug-extension.html