chrome.scripting.executeScript woes with async functions

I have an MV3 Chrome extension on Chrome 106. I want to run an async function with executeScript like this:
async function doSomeWork() {
    console.log("starting");
    await chrome.runtime.sendMessage({foo: 'bar1'}); // call an arbitrary awaitable method
    console.log("weird, this is never seen");
    // ...do some more work
}

async function doit() {
    await chrome.scripting.executeScript({target: {tabId: tab.id}, func: doSomeWork});
}
The documentation seems to imply this should work:
If the resulting value of the script execution is a promise, Chrome
will wait for the promise to settle and return the resulting value.
https://developer.chrome.com/docs/extensions/reference/scripting/#promises
However, the function never continues executing after the first await call. The second log message doesn't show up on the console. The sendMessage works, though.
Am I completely misreading the documentation?
A complete example is available in this git repo: https://github.com/stickfigure/chrome-extension-test/blob/master/src/app.tsx

Related

Composable functions in Puppeteer's page.evaluate

I'm relatively new to Puppeteer and I'm trying to understand the patterns that can be used to build more complex APIs with it. I am building a CLI where I run a WebGL app in Puppeteer and call various functions in it, and with my current implementation I have to copy and paste a lot of setup code.
Usually, in every CLI command I have to set up Puppeteer, set up the app and get access to its api object, and then run an arbitrary command on that api and get the data back in Node.
It looks something like this.
const {page, browser} = await createBrowser() // Here I set up the browser and add some script tags.

let data;
await page.exposeFunction('extractData', (extracted) => {
    data = extracted; // NB: the parameter must not shadow the outer `data` variable
})

await page.evaluate(async (input) => {
    // Setup work
    const requestEvent = new CustomEvent('requestAppApi', {
        detail: { api: undefined }
    })
    window.dispatchEvent(requestEvent);
    const api = requestEvent.detail.api;
    // Then I call some arbitrary function that will always return
    // some data, which gets extracted by the exposed function.
    const data = api.arbitraryFunction(input);
    window.extractData(data)
}, input)
What I would like is to wrap all of the setup code in a function, so that I could call it and just specify what to do with the api object once I have it.
My initial idea was to have a function that will take a callback that has this api object as a parameter.
const { page, browser } = await createBrowser();
page.exposeFunction(async (input) =>
    setupApiObject((api) =>
        api.callSomeFunction(input),
    input)
);
However, this does not work. I understand that Puppeteer requires any communication between the Node context and the browser to be serialised as JSON, and obviously a function can't be. What's tripping me up is that I don't actually want to call these methods in the Node context, just have a way to reuse them. The actual data transfer is already handled by page.exposeFunction.
How would a more experienced Puppeteer dev accomplish this?
I'll answer my own question here, since I managed to figure out a way to do it. Basically, you can use page.evaluate to create a function on the window object that can later be reused.
So I did something like:
await page.evaluate(() => {
    window.useApiObject = function(callback: (api) => void) {
        // Perform setup code that produces the `api` object
        callback(api)
    }
})
Meaning that later on I could use that method in the browser context and avoid redoing the setup code:
page.evaluate(() => {
    window.useApiObject((api) => {
        api.someMethod()
    })
})
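For what it's worth, here's a fuller sketch of the same idea, folding the setup from the question into useApiObject so each CLI command only states what to do with the api object. The requestAppApi event and arbitraryFunction names come from the question's code; the rest is illustrative:

// Register the reusable helper once, right after creating the page.
await page.evaluate(() => {
    window.useApiObject = function (callback) {
        // Same setup as in the question: ask the app for its api object.
        const requestEvent = new CustomEvent('requestAppApi', { detail: { api: undefined } });
        window.dispatchEvent(requestEvent);
        return callback(requestEvent.detail.api);
    };
});

// Later, each command passes in its work. page.evaluate serializes the
// return value back to Node, so exposeFunction isn't strictly needed here.
const data = await page.evaluate(
    (input) => window.useApiObject((api) => api.arbitraryFunction(input)),
    input
);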

Catch MySQL errors before sending HTTP response

I am coming from a PHP background where SQL queries are blocking.
My usual setup is a try/catch block to handle any possible errors that might happen during table updating. I wanted to transfer this logic to Node.js, but its asynchronous nature is making this troublesome. Take a look at the example code:
var httpServer = Https.createServer(httpsOptions, function (request, response) {
    try {
        // many queries in if-else, switch statements
        if (something) {
            mysqlConnection.query(query, [], function (err, result) {});
        } else {
            // many queries will follow
            if (somethingelse) {
                mysqlConnection.query(query2, [], function (err, result) {});
            } else {
                mysqlConnection.query(query3, [], function (err, result) {});
                if (andsoon) {}
            }
        }
        response.writeHead(200, { "Content-Type": "text/plain" });
        response.end("Got it.");
        console.log("Response successful");
    } catch (error) {
        console.error(error);
        response.writeHead(400, { "Content-Type": "text/plain" });
        response.end("Error.");
    }
}).listen(8000);
So, response.end() will trigger before the queries execute, meaning I send the message before I know whether they succeeded. I could wrap the response.end() code in a callback, but there are MANY queries executing, so now I have to track how many have completed so far. It gets worse: one if-else branch requires just 1 query, another requires 10, so there is crazy overhead in tracking when the code has completed.
This all can't be the optimal workflow. What do you do when you have a complex query system and you want to send the response only after all queries have executed (or failed)?
You've passed empty callbacks to your .query invocations. That's your problem.
mysqlConnection.query(query, [], function(err, result){});
                                 ^^^^^^^^^^^^^^^^^^^^^^^  empty callback
JavaScript goes on to run the next statement immediately after the one shown, without waiting for the query to finish. You know the query is complete (or hit an error) only when the .query method invokes your callback function.
So, you need to do something like this instead.
mysqlConnection.query(query, [], function(err, result) {
    if (err) throw Error(err);
    // many queries will follow
    if (somethingelse) {
        ...
    }
});
A sequence of queries like you need ends up as an absurdly deeply nested set of callback functions. But don't despair: JavaScript's Promise pattern works around this for you and gives you clean code. Use npm's promise-mysql package and write code like this:
mysqlConnection.query(query, [])
    .then(function(results) {
        if (somethingelse)
            return mysqlConnection.query(query2, []);
        else
            return mysqlConnection.query(query3, []);
    })
    .then(function(results) {
        return mysqlConnection.query(query4, []);
    })
    .then(function(results) {
        response.writeHead(200, { "Content-Type": "text/plain" });
        response.end("Got it.");
    })
    .catch(function(err) {
        /* failure somewhere! */
    });
You have a series of then functions, with a catch at the end. The Promise subsystem hides all the callback invocation for you, organizing it this way instead. (Notice that I did not debug this sample code.)
You must figure out this Promise stuff to program SQL / Node.js successfully. It's counterintuitive at first (at least it was for me when I was learning it), but it's worth your effort. Pro tip: if you're stepping through this kind of code in a debugger, use Step Into liberally.
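For completeness, the same flow reads even closer to the original PHP style with async/await. A sketch, assuming the same promise-mysql connection and the somethingelse flag from above, wrapped in a hypothetical handleRequest for illustration:

async function handleRequest(request, response) {
    try {
        if (somethingelse) {
            await mysqlConnection.query(query2, []);
        } else {
            await mysqlConnection.query(query3, []);
        }
        await mysqlConnection.query(query4, []);
        response.writeHead(200, { "Content-Type": "text/plain" });
        response.end("Got it.");
    } catch (err) {
        // Any failed query lands here, before the response is sent.
        console.error(err);
        response.writeHead(400, { "Content-Type": "text/plain" });
        response.end("Error.");
    }
}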

Cloud Functions for Firebase: Could not handle the request after a successful request

TL;DR: after successfully writing a JSON document to my Firestore, the next request gives me Internal Server Error (500). I suspect the problem is that the insert is not yet complete.
So basically, I have this code:
const jsonToDb = express();
exports.jsondb = functions.region('europe-west1').https.onRequest(jsonToDb);

jsonToDb.post('', (req, res) => {
    let doc;
    try {
        doc = JSON.parse(req.body);
    } catch (error) {
        res.status(400).send(error.toString()).end();
        return;
    }
    myDbFuncs.saveMyDoc(doc);
    res.status(201).send("OK").end();
});
The database functions are in another JS file.
module.exports.saveMyDoc = function (myDoc) {
    let newDoc = db.collection('insertedDocs').doc(new Date().toISOString());
    newDoc.set(myDoc).then().catch();
    return;
};
So I have several theories, and maybe one of them is right, but please help me with this. (Also, if I made some mistakes in this little snippet, just tell me.)
Reproduction:
I send the first request => everything is OK, JSON in the database.
I send a second request after the first gives me an OK status => it does nothing for a few seconds, then 500: Internal Server Error.
Logs: Function execution took 4345 ms, finished with status: 'connection error'.
I just don't understand. Let's imagine I'm using this as an API, with several requests coming in simultaneously. Can't it handle that? (I suppose it can; I'm just doing something stupid.) I'm deliberately sending the second request after the first has finished, and this still occurs.
Should I make saveMyDoc async?
saveMyDoc isn't returning a promise that resolves when all the async work is complete. If you lose track of a promise, Cloud Functions will shut down the work and clean up before the work is complete, making it look like it simply doesn't work. You should only send a response from an HTTP type function after all the work is fully complete.
Minimally, it should look more like this:
module.exports.saveMyDoc = function (myDoc) {
    let newDoc = db.collection('insertedDocs').doc(new Date().toISOString());
    return newDoc.set(myDoc);
};
Then you would use the promise in your main function:
myDbFuncs.saveMyDoc(doc).then(() => {
    res.status(201).send("OK").end();
});
See how the response is only sent after the data is saved.
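You'll likely also want a catch, so a failed write reports an error instead of leaving the function hanging until timeout. A sketch:

myDbFuncs.saveMyDoc(doc)
    .then(() => {
        res.status(201).send("OK").end();
    })
    .catch((error) => {
        // Report the failure rather than losing the promise.
        res.status(500).send(error.toString()).end();
    });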
Read more about async programming in Cloud Functions in the documentation. Also watch this video series that talks about working with promises in Cloud Functions.

How to parse or stringify JSON in an asynchronous way in JavaScript

I see that JSON.stringify and JSON.parse are both synchronous.
I would like to know if there is a simple npm library that does this in an asynchronous way.
Thank you
You can make anything "asynchronous" by using Promises:
function asyncStringify(str) {
    return new Promise((resolve, reject) => {
        resolve(JSON.stringify(str));
    });
}
Then you can use it like any other promise:
asyncStringify(str).then(ajaxSubmit);
Note that because nothing here is actually asynchronous, the promise will be resolved right away (stringifying JSON is pure CPU work with no I/O or system calls, so there is nothing to wait on).
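You can see this for yourself: the Promise executor runs synchronously, so all the stringify work happens before the next statement runs. An illustrative snippet, with bigObject as a stand-in for your real data:

const bigObject = { items: new Array(1e6).fill(0) };
const start = Date.now();
asyncStringify(bigObject).then(() => console.log('resolved'));
// By the time we get here, the stringify has already run to completion:
console.log('blocked for', Date.now() - start, 'ms');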
You can also use the async/await API if your platform supports it:
async function asyncStringify(str) {
    return JSON.stringify(str);
}
Then you can use it the same way:
asyncStringify(str).then(ajaxSubmit);
// or use the "await" API (inside an async function)
const strJson = await asyncStringify(str);
ajaxSubmit(strJson);
Edited: one way to add truly asynchronous parsing/stringifying (perhaps because what we're parsing is too large) is to hand the job to another process (or service) and wait for the response.
You can do this in many ways (such as creating a separate service that exposes a REST API); here I will demonstrate a way of doing it with message passing between processes:
First create a file that will take care of doing the parsing/stringifying. Call it async-json.js for the sake of the example:
// async-json.js
function stringify(value) {
    return JSON.stringify(value);
}

function parse(value) {
    return JSON.parse(value);
}

// Wait for a request from the parent process, do the work, and reply.
process.on('message', function(message) {
    let result;
    if (message.method === 'stringify') {
        result = stringify(message.value);
    } else if (message.method === 'parse') {
        result = parse(message.value);
    }
    process.send({ callerId: message.callerId, returnValue: result });
});
All this process does is wait for a message asking it to stringify or parse JSON, then respond with the result.
Now, in your code, you can fork this script and send messages back and forth. Whenever a request is sent you create a new promise, and whenever a response to that request comes back you resolve it:
const fork = require('child_process').fork;

const asyncJson = fork(__dirname + '/async-json.js');
const callers = {};

asyncJson.on('message', function(response) {
    // Find the pending promise for this response and resolve it.
    callers[response.callerId].resolve(response.returnValue);
});

function callAsyncJson(method, value) {
    // A random id to match responses to requests (real code should
    // guarantee uniqueness and clean up the callers map afterwards).
    const callerId = Math.floor(Math.random() * 1000000);
    const callPromise = new Promise((resolve, reject) => {
        callers[callerId] = { resolve: resolve, reject: reject };
        asyncJson.send({ callerId: callerId, method: method, value: value });
    });
    return callPromise;
}

function JsonStringify(value) {
    return callAsyncJson('stringify', value);
}

function JsonParse(value) {
    return callAsyncJson('parse', value);
}

JsonStringify({ a: 1 }).then(console.log.bind(console));
JsonParse('{ "a": "1" }').then(console.log.bind(console));
Note: this is just one example, but knowing this you can figure out other improvements or other ways to do it. Hope this is helpful.
Check out another npm package:
async-json is a library that provides an asynchronous version of the standard JSON.stringify.
Install:
npm install async-json
Example:
var asyncJSON = require('async-json');

asyncJSON.stringify({ some: "data" }, function (err, jsonValue) {
    if (err) {
        throw err;
    }
    jsonValue === '{"some":"data"}'; // the resulting JSON string
});
Note: I didn't test it; you need to manually check its dependencies and required packages.
By asynchronous I assume you actually mean non-blocking asynchronous - i.e., if you have a large (megabytes large) JSON string, and you stringify, you don't want your web server to hard freeze and block newly incoming web requests for 500+ milliseconds while it processes the object.
Option 1
The generic answer is to iterate through your object piece by piece, and to call setImmediate whenever a threshold is reached. This allows other functions in the event queue to run for a bit.
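To make the idea concrete, here is a rough sketch of chunked serialization for an array, yielding with setImmediate between chunks. This is illustrative only; it is not how yieldable-json is implemented, and it only handles arrays of JSON-safe values:

function stringifyArrayAsync(arr, chunkSize = 1000) {
    return new Promise((resolve) => {
        const pieces = [];
        let i = 0;
        function work() {
            const end = Math.min(i + chunkSize, arr.length);
            for (; i < end; i++) {
                pieces.push(JSON.stringify(arr[i]));
            }
            if (i < arr.length) {
                setImmediate(work); // yield so other queued events can run
            } else {
                resolve('[' + pieces.join(',') + ']');
            }
        }
        work();
    });
}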
For JSON (de)serialization, the yieldable-json library does this very well. It does however drastically sacrifice JSON processing time (which is somewhat intentional).
Usage example from the yieldable-json readme:
const yj = require('yieldable-json')
yj.stringifyAsync({key: "value"}, (err, data) => {
    if (!err)
        console.log(data)
})
Option 2
If processing speed is extremely important (such as with real-time data), you may want to consider spawning multiple Node processes instead. I've used the PM2 Process Manager with great success, although the initial setup was quite daunting. Once it works, though, the final result is magic, and it does not require modifying your source code, just your package.json file. It acts as a proxy, load balancer, and monitoring tool for Node applications. It's somewhat analogous to Docker swarm, but bare metal, and does not require a special client on the server.
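For reference, a minimal cluster-mode config might look like this (a sketch of a PM2 ecosystem.config.js; the app name and script path are placeholders):

// ecosystem.config.js
module.exports = {
    apps: [{
        name: 'my-app',        // placeholder name
        script: './server.js', // placeholder entry point
        instances: 'max',      // one worker per CPU core
        exec_mode: 'cluster'   // workers share the listening port
    }]
};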

How do I use the data returned by an ajax call?

I am trying to return an array of data from a JSON object that is returned from a URL. I can see the data being returned using console.log.
However, when I try to capture the returned array in a variable, for example:
var arr = list();
console.log(arr.length);
The length output by this code is "0" despite the fact that the returned data has content (so the length should be greater than zero). How can I use the data?
list: function() {
    var grades = [];
    $.getJSON("https://api.mongolab.com/api/1/databases", function(data) {
        console.log(data);
        grades[0] = data[0].name;
        console.log(grades.length);
    });
    return grades;
},
The issue you are facing is easy to get snagged on if you aren't used to the concept of asynchronous calls! Never fear, you'll get there.
What's happening is that when you make the AJAX call, your code continues to run even though the request has not completed. This is because AJAX requests can take a long time (usually a few seconds), and if the browser had to sit and wait, the user would be staring in anguish at a frozen screen.
So how do you use the result of your AJAX call?
Take a closer look at the getJSON documentation and you will see a few hints. Asynchronous functions like getJSON can be handled in two ways: Promises or Callbacks. They serve a very similar purpose because they are just two different ways to let you specify what to do once your AJAX is finished.
Callbacks let you pass in a function to getJSON. Your function will get called once the AJAX is finished. You're actually already using a callback in the example code you wrote, it's just that your callback is being defined inside of your list() method so it isn't very useful.
Promises let you pass in a function to the Promise returned by getJSON, which will get called once the AJAX is finished.
Since you are doing all this inside of a method, you have to decide which one you're going to support. You can either have your method take in callbacks (and pass them along) or you can have your method return the promise returned by getJSON. I suggest you do both!
Check it out:
var list = function(success) {
    // Pass the callback along to getJSON; it will be called when the data arrives.
    var promise = $.getJSON("https://api.mongolab.com/api/1/databases", success);
    // By returning the promise that getJSON provides, we let the caller attach promise handlers.
    return promise;
};
// Using callbacks
list(function(grades) {
    console.log(grades);
});

// Using promises
// (.done is the jqXHR promise method; the older .success alias was removed in jQuery 3.0)
list().done(function(grades) {
    console.log(grades);
});