Chrome doesn't use cache after power loss? - google-chrome

I am creating a digital signage player that uses Chrome as its display engine. We need to be able to still muddle along if the network goes down without too much interruption.
Chrome works fine caching images, and I've set the "Expires" header to be a month after access. I can set the player computer offline and have the app run for days with no problem. If I reboot the machine the right way (Start->Shut Down), caching still works as expected.
The issue is that when Chrome exits abnormally - either a crash or power loss - on reboot, Chrome ignores the cache and refuses to load images. This happens even if I cut power 5 minutes after it loads the page, so the content is not expiring.
My guess is that Chrome is set to ignore the cache after an abnormal exit to prevent corrupted cache from continually crashing the browser. However, this behavior is not what I need.
Does anyone know of a command line arg or flag I can set to keep this from happening?
Thanks for your help.

I tried everything I could think of to make Chrome not invalidate the local cache on system failure, and came up empty. There are a few other people who have had the same question, and I didn't see an answer.
Here's what I did that made this work, and if someone else is having the same problem, it might be the workaround that you need.
I added a service worker that would cache images. The code below isn't perfect yet, but should be a starting place for someone... (FYI, I learned this 5 minutes ago, so if someone wants to give me a pointer or two on how to make this more elegant, I'm all ears.)
We cache anything that has a response type of "cors" so we cache only images coming from the remote server. Note that your images must be loaded via https for this to work.
Taken (mostly) from: https://developers.google.com/web/fundamentals/getting-started/primers/service-workers
var CACHE_NAME = 'shine_cache';
var urlsToCache = [
  '/'
];

self.addEventListener('install', function(event) {
  // Perform install steps
  event.waitUntil(
    caches.open(CACHE_NAME)
      .then(function(cache) {
        console.log('Opened cache');
        return cache.addAll(urlsToCache);
      })
  );
});

self.addEventListener('fetch', function(event) {
  //console.log('Handling fetch event for', event.request);

  if (event.request.method == 'POST') {
    //console.log("Skipping POST");
    event.respondWith(fetch(event.request));
    return;
  }

  // Some requests have no Accept header, so guard against null
  var accept = event.request.headers.get('Accept');
  if (accept && accept.indexOf('image') !== -1) {
    event.respondWith(
      caches.match(event.request)
        .then(function(response) {
          // Cache hit - return the cached response
          if (response) {
            console.log("Returning from cache.", event.request);
            return response;
          }

          // IMPORTANT: Clone the request. A request is a stream and
          // can only be consumed once. Since we are consuming this
          // once by the cache and once by the browser for fetch, we
          // need to clone the request.
          var fetchRequest = event.request.clone();

          return fetch(fetchRequest).then(
            function(response) {
              console.log("Have a response.", response);

              // Check if we received a valid response
              if (!response || response.status !== 200 || response.type !== 'cors') {
                return response;
              }

              // IMPORTANT: Clone the response. A response is a stream,
              // and because we want the browser to consume the response
              // as well as the cache consuming the response, we need
              // to clone it so we have two streams.
              var responseToCache = response.clone();

              caches.open(CACHE_NAME)
                .then(function(cache) {
                  console.log("Caching response", event.request);
                  cache.put(event.request, responseToCache);
                });

              return response;
            }
          );
        })
    );
  }
});
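
One note for anyone copying this: the worker also has to be registered from the page it should control. Here's a minimal registration sketch, assuming the code above is saved and served as /sw.js (a path I've made up for the example):

if ('serviceWorker' in navigator) {
  // Register the worker so it can intercept this page's fetches
  navigator.serviceWorker.register('/sw.js')
    .then(function(registration) {
      console.log('Service worker registered with scope:', registration.scope);
    })
    .catch(function(err) {
      console.log('Service worker registration failed:', err);
    });
}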

Related

SERVICE WORKER: The service worker navigation preload request failed with network error: net::ERR_INTERNET_DISCONNECTED in Chrome 89

I have a problem with my Service Worker.
I'm currently implementing offline functionality with an offline.html site to be shown in case of network failure. I have implemented Navigation Preloads as described here: https://developers.google.com/web/updates/2017/02/navigation-preload#activating_navigation_preload
Here is my install EventListener, where I call skipWaiting() and initialize a new cache:
const version = 'v.1.2.3'
const CACHE_NAME = '::static-cache'
const urlsToCache = ['index~offline.html', 'favicon-512.png']
self.addEventListener('install', function(event) {
  self.skipWaiting()
  event.waitUntil(
    caches
      .open(version + CACHE_NAME)
      .then(function(cache) {
        return cache.addAll(urlsToCache)
      })
      .then(function() {
        console.log('WORKER: install completed')
      })
  )
})
Here is my activate EventListener, where I feature-detect navigationPreload and enable it. Afterwards I check for old caches and delete them:
self.addEventListener('activate', event => {
  console.log('WORKER: activated')
  event.waitUntil(
    (async function() {
      // Feature-detect
      if (self.registration.navigationPreload) {
        // Enable navigation preloads!
        console.log('WORKER: Enable navigation preloads')
        await self.registration.navigationPreload.enable()
      }
    })().then(function() {
      // Delete any caches from older versions
      return caches.keys().then(function(cacheNames) {
        return Promise.all(cacheNames.map(function(cacheName) {
          if (cacheName !== version + CACHE_NAME) {
            console.log(cacheName + ' CACHE deleted')
            return caches.delete(cacheName)
          }
        }))
      })
    })
  )
})
This is my fetch EventListener:
self.addEventListener('fetch', event => {
  const { request } = event
  // Always bypass for range requests, due to browser bugs
  if (request.headers.has('range')) return
  event.respondWith(
    (async function() {
      // Try to get from the cache:
      const cachedResponse = await caches.match(request)
      if (cachedResponse) return cachedResponse

      try {
        const response = await event.preloadResponse
        if (response) return response

        // Otherwise, get from the network
        return await fetch(request)
      } catch (err) {
        // If this was a navigation, show the offline page:
        if (request.mode === 'navigate') {
          console.log('Err: ', err)
          console.log('Request: ', request)
          return caches.match('index~offline.html')
        }
        // Otherwise throw
        throw err
      }
    })()
  )
})
Now my problem:
On my local machine on localhost, everything works just as it should. If the network is offline, the index~offline.html page is delivered to the user. If I deploy to my test server, everything works as expected too, except for a strange error message in Chrome during normal browsing (not offline mode):
The service worker navigation preload request failed with network error: net::ERR_INTERNET_DISCONNECTED.
I logged the error and the request to get more information
Error:
DOMException: The service worker navigation preload request failed with a network error.
Request:
It's strange, because somehow index.html is requested no matter which site is loaded.
Additional information: this is happening in Chrome 89; in Chrome 88 everything seems fine (I checked in BrowserStack). I just saw there was a change in PWA offline detection in Chrome 89...
https://developer.chrome.com/blog/improved-pwa-offline-detection/
Does anybody have an idea what the problem might be?
Update
I reproduced the problem here so everybody can check it out: https://dreamy-leavitt-bd4f0e.netlify.app/
This error is directly caused by the improved PWA offline detection you linked to:
https://developer.chrome.com/blog/improved-pwa-offline-detection/
The browser fakes an offline context and tries to request the start_url of your manifest, e.g. the index.html specified in your https://dreamy-leavitt-bd4f0e.netlify.app/site.webmanifest
This is to make sure that your service worker is actually returning a valid 200 response in this situation, i.e. the valid cached response for your index~offline.html page.
The error you're asking about specifically is from the await event.preloadResponse part and it apparently can't be suppressed.
The await fetch call produces a similar error, but that one can be suppressed: just don't console.log in the catch section.
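As a sketch of that last point, the body of the async function in the fetch handler above becomes (same code as in the question, minus the logging):

// Try to get from the cache:
const cachedResponse = await caches.match(request)
if (cachedResponse) return cachedResponse
try {
  const response = await event.preloadResponse
  if (response) return response
  return await fetch(request)
} catch (err) {
  if (request.mode === 'navigate') {
    // No console.log here, so the error is swallowed silently
    return caches.match('index~offline.html')
  }
  throw err
}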
Hopefully Chrome will stop showing this error from preload responses during offline PWA detection in the future, as it's needlessly confusing.

Debugging service worker 'trying to install' in Chrome

I have what I assumed was a pretty standard service worker at http://www.espruino.com/ide/serviceworker.js for the page http://www.espruino.com/ide
However recently, when I have "Update on reload" set in the Chrome dev console for Service Workers the website stays with its loading indicator on, and the status shows a new service worker is "Trying to Install".
Eventually I see a red 'x' by the new service worker and a '1' with a link, but that doesn't do anything or provide any tooltip. Clicking on serviceworker.js brings me to the source file with the first line highlighted in yellow, but there are no errors highlighted.
I've done the usual and checked that all files referenced by the service worker exist and they do, and I have no idea what to look at next.
Does anyone have any clues how to debug this further?
thanks!
I'm on Chrome Beta.
I updated to the newest release and magically everything works. So I guess it was a bug in Chrome or the devtools, not my code.
For those running in to this issue with the latest version of Chrome, I was able to fix it by caching each resource in its own cache. Just call caches.open for every file you want to store. You can do this because caches.match will automatically find the file in your sea of caches.
As a messy example:
self.addEventListener('install', event => {
  var swpromise = new Promise(function(resolve, reject) {
    var havedone = 0;
    for (var i = 0; i < resources_array.length; i++) {
      addToCache(i, resources_array[i]);
    }
    // Open one cache per file, keyed by version and index
    function addToCache(index, url) {
      caches.open(version + "-" + index)
        .then(cache => cache.addAll([url]))
        .then(function() { cacheDone() });
    }
    // Resolve once every file has been cached
    function cacheDone() {
      havedone++;
      if (havedone == resources_array.length) {
        resolve();
      }
    }
  });
  event.waitUntil(swpromise);
});
I used the version number and the index of the file as the cache key for each file. I then delete all of the old caches on activation of the service worker with something similar to the following code:
addEventListener('activate', e => {
  e.waitUntil(caches.keys().then(keys => {
    return Promise.all(keys.map(key => {
      if (key.indexOf(version + "-") == -1) return caches.delete(key);
    }));
  }));
});
Hopefully this works for you, it did in my case.
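
For what it's worth, the same one-cache-per-file scheme can be written more compactly with Promise.all. A sketch under the same assumptions (version and resources_array defined as above); I haven't tested it against the original setup:

self.addEventListener('install', event => {
  // Open one cache per resource, keyed by version and index, and let
  // waitUntil track all of them through a single Promise.all
  event.waitUntil(Promise.all(
    resources_array.map((url, index) =>
      caches.open(version + "-" + index).then(cache => cache.add(url))
    )
  ));
});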

IndexedDB flush to disk on Chrome

I'm facing an issue with IndexedDB on Chrome where I reload my page once the transaction returns a successful write.
The problem is that sometimes the data is not reflected after the reload. I can work around this by adding a timeout of about 100ms before reloading, which leads me to believe that the data is not flushed to disk every time.
Firefox has an experimental readwriteflush mode which ensures data is flushed to disk before a success is returned, but I can't find a similar one for Chrome. Any suggestions?
Here's my insert code:
const data = {type: type, value: value};
const objectStore = StorageService.db.transaction(['localData'], 'readwrite').objectStore('localData');
// readwriteflush doesn't work in Chrome
// const objectStore = StorageService.db.transaction(['localData'], 'readwriteflush').objectStore('localData');
const requestSet = objectStore.put(data);

requestSet.onerror = function (event) {
  alert('Error in saving data locally');
};

requestSet.onsuccess = function (event) {
  console.log('Data was successfully saved locally: ' + type);
  if (callback != undefined) {
    callback();
  }
};
The callback has location.reload = '/'; executed in it (along with some other things), so the page reloads after onsuccess has fired.
After the page reloads, I cannot see any data in my IndexedDB storage, either via code or in the developer tools. This does not always happen, however; I've observed it only when the data is larger than usual.
"success" fired at a request does not indicate that the transaction has committed successfully. The transaction could later fail due to a separate failed request (e.g. a conflicting add call), I/O error, or e.g. power loss.
You need to wait for the "complete" event to be fired at the transaction. Chrome flushes to disk before firing the "complete" event.
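As a sketch of what that looks like with the insert code from the question (reusing its StorageService, type, value, and callback names), move the reload trigger into the transaction's complete handler:

const data = {type: type, value: value};
const transaction = StorageService.db.transaction(['localData'], 'readwrite');
const objectStore = transaction.objectStore('localData');
objectStore.put(data);

// "complete" fires only after the transaction has committed (and Chrome
// has flushed to disk), so it is safe to reload the page from here
transaction.oncomplete = function () {
  console.log('Data was successfully saved locally: ' + type);
  if (callback != undefined) {
    callback();
  }
};
transaction.onerror = function (event) {
  alert('Error in saving data locally');
};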

net::ERR_CONNECTION_RESET with service worker in Chrome

I have a very simple service worker to add offline support. The fetch handler looks like:
self.addEventListener("fetch", function (event) {
var url = event.request.url;
event.respondWith(fetch(event.request).then(function (response) {
//var cacheResponse: Response = response.clone();
//caches.open(CURRENT_CACHES.offline).then((cache: Cache) => {
// cache.put(url, cacheResponse).catch(() => {
// // ignore error
// });
//});
return response;
}).catch(function () {
// check the cache
return getCachedContent(event.request);
}));
});
Intermittently we are seeing a net::ERR_CONNECTION_RESET error for a particular script we load into the page when online. The error is not coming from the server, as the service worker is picking up the file from the browser cache. Chrome's network tab shows the service worker has successfully fetched the file from the disk cache, but the request from the browser to the service worker shows as (failed).
Does anyone know the underlying issue causing this? Is there a problem with my service worker implementation?
This is likely due to a bug in Chrome (and potentially other browsers as well) that could result in a garbage collection event removing a reference to the response stream while it's still being read.
Its fix in Chrome is being tracked at https://bugs.chromium.org/p/chromium/issues/detail?id=934386.

Google chrome web push api bug

What is this bug? When sending a web push, the Google Chrome browser sometimes shows a second message with the text: "This site has been updated in the background."
I want only one message to be shown.
I found this text in the Chrome source:
This site has been updated in the background.
github.com/scheib/chromium/blob/master/chrome/app/resources/generated_resources_en-GB.xtb
How do I get rid of this message?
The way it works is a feature, not a bug.
Here is an issue that explains your situation in Chrome: https://code.google.com/p/chromium/issues/detail?id=437277
And more specific code comment in Chromium code:
https://code.google.com/p/chromium/codesearch#chromium/src/chrome/browser/push_messaging/push_messaging_notification_manager.cc&rcl=1449664275&l=287
What might have happened is that some of the push messages sent to the client did not result in a notification being shown.
Hope that helps
The reason this often occurs is the promise returned to event.waitUntil() didn't resolve with a notification being shown.
An example that might show the default push notification:
function handlePush() {
  // BAD: The fetch's promise isn't returned
  fetch('/some/api')
    .then(function(response) {
      return response.json();
    })
    .then(function(data) {
      // BAD: the showNotification promise isn't returned
      self.registration.showNotification(data.title, {body: data.body});
    });
}

self.addEventListener('push', function(event) {
  event.waitUntil(handlePush());
});
Instead, you should write this as:
function handlePush() {
  // GOOD
  return fetch('/some/api')
    .then(function(response) {
      return response.json();
    })
    .then(function(data) {
      // GOOD
      return self.registration.showNotification(data.title, {body: data.body});
    });
}

self.addEventListener('push', function(event) {
  const myNotificationPromise = handlePush();
  event.waitUntil(myNotificationPromise);
});
The reason this is all important is that browsers wait for the promise passed into event.waitUntil to resolve / finish so they know the service worker needs to be kept alive and running.
When the promise resolves for a push event, Chrome checks whether a notification has been shown; whether this default notification appears then comes down to a race condition / the specific circumstances. Your best bet is to ensure you have a correct promise chain.
I put some extra notes on promises in this post (see 'Side Quest: Promises': https://gauntface.com/blog/2016/05/01/push-debugging-analytics)