Accessing data from local storage by content scripts - google-chrome

I'm trying to access local storage from content scripts but even though the message passing is working, the output isn't as expected.
CONTENT SCRIPT
var varproxy = localStorage.getItem('proxy'); // gets data the options page saved to local storage
var proxy = "proxystring";

chrome.runtime.sendMessage({message: "hey"},
  function(response) {
    proxy = response.proxy;
    console.log(response.proxy);
  }
);
console.log(proxy);
BACKGROUND PAGE (For message passing)
chrome.runtime.onMessage.addListener(
  function(request, sender, sendResponse) {
    if (request.message == "hey") {
      sendResponse({proxy: varproxy});
      console.log('response sent');
    } else {
      sendResponse({});
    }
  });
The console logs the value of varproxy inside the callback, and also logs "response sent", but the final
console.log(proxy);
still logs "proxystring".
Why isn't the value of proxy getting changed? How do I change it as required?

Message sending -- like many Chrome API functions -- is asynchronous. The interpreter doesn't wait for the response; it jumps straight to the next line. So it can easily happen that console.log(proxy) is evaluated first, since communicating with the background page takes some time. Only once the response arrives does the value of proxy change.
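A minimal sketch of the usual fix, keeping everything that depends on the response inside the callback (the variable names mirror the question's content script):

// content script: only use `proxy` inside the callback, where the response is guaranteed to exist
chrome.runtime.sendMessage({message: "hey"}, function(response) {
  var proxy = response.proxy;
  console.log(proxy); // logs the value sent by the background page
  // ...continue any work that needs `proxy` from here
});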

Might I recommend you try out another implementation? What about Chrome Storage?
Then you don't need any message passing at all, because you can access chrome storage within content scripts.
For example, this is something I do in my extensions' content script to grab several values from chrome storage:
chrome.storage.sync.get({HFF_toolbar: 'yes', HFF_logging: 'yes', HFF_timer: '1 Minute'},
  function (obj) {
    toolbar_option = obj.HFF_toolbar;
    logging_option = obj.HFF_logging;
    timer_option = obj.HFF_timer;
    /* the rest of my content script, using those options */
  });
I personally found this approach much easier, for my purposes anyway, than message passing implementations.
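For completeness, a rough sketch (assuming the same HFF_* keys and the "storage" permission in the manifest) of how the options page could save those values in the first place:

// options page: persist the user's choices under the same HFF_* keys read above
chrome.storage.sync.set({
  HFF_toolbar: 'no',
  HFF_logging: 'yes',
  HFF_timer: '5 Minutes'
}, function () {
  console.log('Options saved');
});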

Related

Google Apps Script: I want to display Script Property in the client-side code, but its value is undefined [duplicate]

I am trying to write a Google Apps script which has a client and server side component. The client side component displays a progress bar. The client calls server side functions (which are called asynchronously), whose progress has to be shown in the client side progress-bar. Now, what I want is to be able to update the client side progress bar based on feedback from the server side functions. Is this possible?
The complexity comes from the fact that JS makes the server-side calls asynchronously, and hence I cannot really have a loop on the client side calling the functions and updating the progress bar.
I could of course split up the execution of the server side function in multiple steps, call one by one from the client side, each time updating the status bar. But I'm wondering if there's a better solution. Is there a way to call a client side function from the server side, and have that update the progress bar based on the argument passed? Or is there a way to access the client side progress-bar object from server side and modify it?
The way I've handled this is to have a middleman (giving a shout out now to Romain Vialard for the idea) handle the progress: Firebase
The HTML/client side can connect to your Firebase account (they're free!) and "watch" for changes.
The server-side Apps Script code can update the database as it progresses through its work; those changes are immediately fed back to the HTML page via Firebase. With that, you can update a progress bar.
Romain has a small example/description here
The code I use on the HTML (client) side:
// Connect to Firebase
var fb = new Firebase("https://YOUR_DATABASE.firebaseio.com/");
// Grab the 'child' holding the progress info
var ref = fb.child('Progress');
// When the value changes
ref.on("value", function(data) {
  if (data.val()) {
    var perc = data.val() * 100;
    document.getElementById("load").innerHTML = "<div class='determinate' style='width:" + perc + "%'></div>";
  }
});
On the Apps Script (server) side, I use the FirebaseApp library to update the progress:
var fb = FirebaseApp.getDatabaseByUrl("https://YOUR_DATABASE.firebaseio.com/");
var data = { "Progress": 0.25 };
fb.updateData("/", data);
Rather than tying the work requests and progress updating together, I recommend you separate those two concerns.
On the server side, functions that are performing work at the request of the client should update a status store; this could be a ScriptProperty, for example. The work functions don't need to respond to the client until they have completed their work. The server should also have a function that can be called by the client to simply report the current progress.
When the client first calls the server to request work, it should also call the progress reporter. (Presumably, the first call will get a result of 0%.) The onSuccess handler for the status call can update whatever visual you're using to express progress, then call the server's progress reporter again, with itself as the success handler. This should be done with a delay, of course.
When progress reaches 100%, or the work is completed, the client's progress checker can be shut down.
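A minimal sketch of that pattern, using PropertiesService as the status store; doWork(), getProgress(), and the element id 'bar' are illustrative names, not part of the original answer:

// Code.gs (server): the work function updates a script property as it goes
function doWork() {
  var props = PropertiesService.getScriptProperties();
  for (var i = 1; i <= 10; i++) {
    // ...do one slice of the real work here...
    props.setProperty('progress', String(i * 10)); // 10, 20, ... 100
  }
  return 'done';
}

function getProgress() {
  return Number(PropertiesService.getScriptProperties().getProperty('progress') || 0);
}

// client (HTML): kick off the work, then poll getProgress() until it reports 100%
google.script.run.doWork();
(function poll() {
  google.script.run.withSuccessHandler(function(pct) {
    document.getElementById('bar').style.width = pct + '%';
    if (pct < 100) setTimeout(poll, 1000); // re-check with a delay, as described above
  }).getProgress();
})();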
Building on Jens' approach, you can use the CacheService as your data proxy, instead of an external service. The way that I've approached this is to have my "server" application generate an interim cache key which it returns to the "client" application's success callback. The client application then polls this cache key at an interval to see if a result has been returned into the cache by the server application.
The server application returns an interim cache key and contains some helper functions to simplify checking this on the client-side:
function someAsynchronousOperation() {
  var interimCacheKey = createInterimCacheKey();
  doSomethingComplicated(function(result) {
    setCacheKey(interimCacheKey, result);
  });
  return interimCacheKey;
}

function createInterimCacheKey() {
  return Utilities.getUuid();
}

function getCacheKey(cacheKey, returnEmpty) {
  var cache = CacheService.getUserCache();
  var result = cache.get(cacheKey);
  if (result !== null || returnEmpty) {
    return result;
  }
}

function setCacheKey(cacheKey, value) {
  var cache = CacheService.getUserCache();
  return cache.put(cacheKey, value);
}
Note that by default getCacheKey doesn't return a value. This is so that google.script.run's successHandler doesn't get invoked until the cache entry comes back non-null.
In the client application (in which I'm using Angular), you call the asynchronous operation on the server and wait for its result:
google.script.run.withSuccessHandler(function(interimCacheKey) {
  var interimCacheCheck = $interval(function() {
    google.script.run.withSuccessHandler(function(result) {
      $interval.cancel(interimCacheCheck);
      handleSomeAsynchronousOperation(result);
    }).getCacheKey(interimCacheKey, false);
  }, 1000, 600); // Check the result once per second for 10 minutes
}).someAsynchronousOperation();
Using this approach you could also report progress, and only cancel your check after the progress reaches 100%. You'd want to eliminate the interval expiry in that case.
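As a rough sketch of that variant (assuming the server writes a numeric progress value into a separate cache entry under a hypothetical progressCacheKey, and updateProgressBar() is a placeholder UI helper), the client-side check could become:

// poll a progress entry and stop only once it reports 100
var progressCheck = $interval(function() {
  google.script.run.withSuccessHandler(function(progress) {
    updateProgressBar(progress); // hypothetical: e.g. set a progress bar's width
    if (Number(progress) >= 100) {
      $interval.cancel(progressCheck); // cancel only when the work is done
    }
  }).getCacheKey(progressCacheKey, true); // returnEmpty = true, so partial values come back
}, 1000); // no expiry count, per the note above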

Service Worker not caching API content on first load

I've created a service worker enabled application that is intended to cache the response from an AJAX call so it's viewable offline. The issue I'm running into is that the service worker caches the page, but not the AJAX response the first time it's loaded.
If you visit http://ivesjames.github.io/pwa and switch to airplane mode after the SW toast it shows no API content. If you go back online and load the page and do it again it will load the API content offline on the second load.
This is what I'm using to cache the API response (taken from the Polymer docs):
(function(global) {
  global.untappdFetchHandler = function(request) {
    // Attempt to fetch(request). This will always make a network request, and will include the
    // full request URL, including the search parameters.
    return global.fetch(request).then(function(response) {
      if (response.ok) {
        // If we got back a successful response, great!
        return global.caches.open(global.toolbox.options.cacheName).then(function(cache) {
          // First, store the response in the cache, stripping away the search parameters to
          // normalize the URL key.
          return cache.put(stripSearchParameters(request.url), response.clone()).then(function() {
            // Once that entry is written to the cache, return the response to the controlled page.
            return response;
          });
        });
      }
      // If we got back an error response, raise a new Error, which will trigger the catch().
      throw new Error('A response with an error status code was returned.');
    }).catch(function(error) {
      // This code is executed when there's either a network error or a response with an error
      // status code was returned.
      return global.caches.open(global.toolbox.options.cacheName).then(function(cache) {
        // Normalize the request URL by stripping the search parameters, and then return a
        // previously cached response as a fallback.
        return cache.match(stripSearchParameters(request.url));
      });
    });
  };
})(self);
And then I define the handler in the sw-import:
<platinum-sw-import-script href="scripts/untappd-fetch-handler.js">
</platinum-sw-import-script>

<platinum-sw-fetch handler="untappdFetchHandler"
                   path="/v4/user/checkins/jimouk?client_id=(apikey)&client_secret=(clientsecret)"
                   origin="https://api.untappd.com">
</platinum-sw-fetch>

<paper-toast id="caching-complete"
             duration="6000"
             text="Caching complete! This app will work offline.">
</paper-toast>

<platinum-sw-register auto-register
                      clients-claim
                      skip-waiting
                      base-uri="bower_components/platinum-sw/bootstrap"
                      on-service-worker-installed="displayInstalledToast">
  <platinum-sw-cache default-cache-strategy="fastest"
                     cache-config-file="cache-config.json">
  </platinum-sw-cache>
</platinum-sw-register>
Is there somewhere I'm going wrong? I'm not quite sure why it works on load #2 instead of load #1.
Any help would be appreciated.
While the skip-waiting + clients-claim attributes should cause your service worker to take control as soon as possible, it's still an asynchronous process that might not kick in until after your AJAX request is made. If you want to guarantee that the service worker will be in control of the page, then you'd need to either delay your AJAX request until the service worker has taken control (following, e.g., this technique), or alternatively, you can use the reload-on-install attribute.
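A minimal sketch of the first option, assuming the AJAX call can be wrapped in a function of your own (loadApiContent() here is a placeholder):

// Delay fetching API content until a service worker controls this page.
function whenControlled() {
  if (navigator.serviceWorker.controller) {
    return Promise.resolve();
  }
  return new Promise(function(resolve) {
    navigator.serviceWorker.addEventListener('controllerchange', resolve);
  });
}

whenControlled().then(function() {
  loadApiContent(); // placeholder for the code that makes the AJAX request
});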
Equally important, though, make sure that your <platinum-sw-import-script> and <platinum-sw-fetch> elements are children of your <platinum-sw-register> element, or else they won't have the intended effect. This is called out in the documentation, but unfortunately it's just a silent failure at runtime.

Modify local storage via popup, and use stored values in contentscript

I'm trying my hand at creating a chrome extension, but am running into a wall.
I want to be able to use the browser-action popup to write/modify values into local storage (extension storage).
Then, I want to use the stored values in a content script.
From what I've read, it looks like I need a background file? but I'm not sure.
Some coded examples would be extremely appreciated!
Thanks for your help!
You can avoid using a background page as a proxy if you use chrome.storage API. It's a storage solution that is available from Content Scripts directly.
Here is a comparison between it and localStorage in the context of Chrome extensions.
An important thing to note is that it's asynchronous, making code slightly more complicated than using localStorage:
/* ... */
chrome.storage.local.get('key', function(value) {
  // You can use value here
});
// But not here, as it will execute before the callback
/* ... */
But to be fair, if you go with the background being the proxy for data, message passing is still asynchronous.
One can argue that once the data is passed, localStorage works as a synchronous cache.
But that localStorage object is shared with the web page, which is insecure, and nobody stops you from having your own synchronous storage cache, initialized once with chrome.storage.local.get(null, /*...*/) and kept up to date with a chrome.storage.onChanged listener.
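A rough sketch of that cache pattern in a content script (assuming whatever writes the data, e.g. the popup, goes through chrome.storage.local.set):

// content script: keep a synchronous in-memory mirror of chrome.storage.local
var cache = {};

// initialize once with everything currently in storage
chrome.storage.local.get(null, function(items) {
  cache = items;
  // it's safe to read from `cache` synchronously from here on
});

// keep the mirror up to date when the popup (or anything else) changes a value
chrome.storage.onChanged.addListener(function(changes, areaName) {
  if (areaName !== 'local') return;
  for (var key in changes) {
    cache[key] = changes[key].newValue;
  }
});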
Background pages can access the localStorage variables saved by your extension. Your content script only has access to the localStorage of the website open in a specific tab. You will therefore need to send the variables from the background page to the content script. The content script can then access these variables.
Since you requested a coded example, I've written one. The project has a background page and a content script: values saved to localStorage from your popup are visible to the background page, which then sends them to the content script for use.
Something like this:
background.js
// When a tab is updated
chrome.tabs.onUpdated.addListener(function(tabId, changeInfo) {
  // When the tab has finished loading
  if (changeInfo.status == 'complete') {
    // Query open tabs
    chrome.tabs.query({'active': true, 'lastFocusedWindow': true}, function(tabs) {
      // Get the URL of the current tab
      var tabURL = tabs[0].url;
      // If the extension's localStorage is not empty
      if (localStorage.length != 0) {
        // Set a local storage variable
        localStorage.helloworld = "Hello World";
        // Send the stored value to the content script in the active tab
        chrome.tabs.query({active: true, currentWindow: true}, function(tabs) {
          chrome.tabs.sendMessage(tabs[0].id, {greeting: "hello", helloworld: localStorage.helloworld}, function(response) {
          });
        });
      }
    });
  }
});
contentscript.js
chrome.runtime.onMessage.addListener(function(request, sender, sendResponse) {
  // Use the value passed from the background page
  if (request.greeting == "hello") {
    var hello = request.helloworld;
    // do something with the variable here
  }
});
Once you have this working, consider switching to chrome.storage.
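For reference, a minimal sketch of the chrome.storage version, with the popup writing a value and the content script reading it directly (no background page needed; the key name is just an example, and the manifest needs the "storage" permission):

// popup.js: save a value chosen in the popup UI
chrome.storage.local.set({helloworld: "Hello World"}, function() {
  console.log('Value saved');
});

// contentscript.js: read the same value directly
chrome.storage.local.get('helloworld', function(items) {
  var hello = items.helloworld;
  // do something with the variable here
});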

Access-Control-Allow-Origin error in a chrome extension

I have a chrome extension which monitors the browser in a special way, sending some data to a web server. In the current configuration this is localhost. So the content script contains code like this:
var xhr = new XMLHttpRequest();
xhr.onreadystatechange = function(data)...
xhr.open('GET', url, true);
xhr.send();
where the url parameter is 'http://localhost/ctrl?params' (or http://127.0.0.1/ctrl?params - it doesn't matter which).
Manifest-file contains all necessary permissions for cross-site requests.
The extension works fine on most sites, but on one site I get the error:
XMLHttpRequest cannot load http://localhost/ctrl?params. Origin http://www.thissite.com is not allowed by Access-Control-Allow-Origin.
I've tried several of the permissions proposed here (*://*/*, http://*/*, and <all_urls>), but none of them solved the problem.
So, the question is what can be wrong with this specific site (apparently there may be another sites with similar misbehaviour, and I'd like to know the nature of this), and how to fix the error?
(tl;dr: see two possible workarounds at the end of the answer)
This is the series of events that happens, which leads to the behavior that you see:
http://www.wix.com/ begins to load
It has a <script> tag that asynchronously loads the Facebook Connect script:
(function() {
  var e = document.createElement('script');
  e.type = 'text/javascript';
  e.src = document.location.protocol +
      '//connect.facebook.net/en_US/all.js';
  e.async = true;
  document.getElementById('fb-root').appendChild(e);
}());
Once the HTML (but not resources, including the Facebook Connect script) of the wix.com page loads, the DOMContentLoaded event fires. Since your content script uses "run_at" : "document_end", it gets injected and run at this time.
Your content script runs the following code (as best as I can tell, it wants to do the bulk of its work after the load event fires):
window.onload = function() {
  // code that eventually does the cross-origin XMLHttpRequest
};
The Facebook Connect script loads, and it has its own load event handler, which it adds with this snippet:
(function() {
  var oldonload = window.onload;
  window.onload = function() {
    // Run new onload code
    if (oldonload) {
      if (typeof oldonload == 'string') {
        eval(oldonload);
      } else {
        oldonload();
      }
    }
  };
})();
(this is the first key part) Since your script set the onload property, oldonload is your script's load handler.
Eventually, all resources are loaded, and the load event handler fires.
Facebook Connect's load handler is run, which runs its own code, and then invokes oldonload. (this is the second key part) Since the page is invoking your load handler, it's not running it in your script's isolated world, but in the page's "main world". Only the script's isolated world has cross-origin XMLHttpRequest access, so the request fails.
To see a simplified test case of this, see this page (which mimics http://www.wix.com), which loads this script (which mimics Facebook Connect). I've also put up simplified versions of the content script and extension manifest.
The fact that your load handler ends up running in the "main world" is most likely a manifestation of Chrome bug 87520 (the bug has security implications, so you might not be able to see it).
There are two ways to work around this:
1. Instead of using "run_at" : "document_end" and a load event handler, you can use the default running time (document_idle, after the document loads) and just have your code run inline.
2. Instead of adding your load event handler by setting the window.onload property, use window.addEventListener('load', func). That way your event handler will not be visible to Facebook Connect, so it will run in the content script's isolated world (see the sketch below).
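For the second workaround, the change in the content script is just (a sketch):

// content script: register the handler without touching window.onload,
// so page scripts like Facebook Connect never see or re-invoke it
window.addEventListener('load', function() {
  // code that eventually does the cross-origin XMLHttpRequest
});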
The access control origin issue you're seeing is likely manifest in the headers for the response (out of your control), rather than the request (under your control).
Access-Control-Allow-Origin is a policy for CORS, set in the header. Using PHP, for example, you use a set of headers like the following to enable CORS:
header('Access-Control-Allow-Origin: http://blah.com');
header('Access-Control-Allow-Credentials: true' );
header('Access-Control-Allow-Headers: Content-Type, Content-Disposition, attachment');
It sounds like the server is setting a specific origin in this header, so your Chrome extension is following that directive and allowing cross-domain (POST?) requests from only that domain.

Can local storage databases be cross-accessed between separate Chrome extensions?

Question I think is self explanatory, but if you need more, here it is:
Chrome Extension A saves an email address in localstorage.
Chrome Extension B wants to see that email address.
Is this permitted? (This might be more of an HTML5 thing than a Chrome-specific thing, but my knowledge is limited so I'll frame it within the context of my desire to know the answer).
If you own the two extensions, i.e. you're the one maintaining both of them, you can definitely use cross-extension message passing to hand that email address, or even the whole localStorage, to the other extension.
For example, take a look at my extension here:
https://github.com/mohamedmansour/reload-all-tabs-extension/tree/v2
One extension is the core, and the other one is just the browser action (as of v3 they are merged, but v2 lets them communicate with each other). The browser action sends a "ping" event, and the core extension listens for that event and returns a "pong". The browser action extension is an "Add-On" to the core extension; when you open its "Options", it uses the options from the core one.
Back to your question: to access localStorage across extensions, you can do something like this:
main core extension:
localStorage['foo'] = 'bar';

var secondary_extension_id = 'pecaecnbopekjflcoeeiogjaogdjdpoe';

chrome.extension.onRequestExternal.addListener(
  function(request, sender, response) {
    // Verify the request is coming from the Add-On.
    if (sender.id != secondary_extension_id)
      return;
    // Handle the request.
    if (request.getLocalStorage) {
      response({result: localStorage});
    } else {
      response({}); // Snub them.
    }
  }
);
secondary extension:
var main_extension_id = 'gighmmpiobklfepjocnamgkkbiglidom';

chrome.extension.sendRequest(main_extension_id, {getLocalStorage: 1},
  function (response) {
    var storage = response.result;
    alert(storage['foo']); // This should print out 'bar'.
  }
);
BTW, I really didn't test this extension. I just copied and pasted from the reload all tabs extension that did something similar.
Not directly, but you can send messages between extensions. So if an extension that stores emails is expecting a request from some external extension, it could read the required data and send it back. More about it here.
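As a sketch with the newer chrome.runtime messaging API (which superseded onRequestExternal/sendRequest; the extension IDs below are placeholders):

// Extension A (owns the data): answer requests from Extension B
chrome.runtime.onMessageExternal.addListener(function(request, sender, sendResponse) {
  if (sender.id === 'EXTENSION_B_ID' && request.getEmail) { // placeholder ID check
    sendResponse({email: localStorage['email']});
  }
});

// Extension B: ask Extension A for the stored email address
chrome.runtime.sendMessage('EXTENSION_A_ID', {getEmail: true}, function(response) {
  console.log(response.email);
});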