I'm new to Node.js and Socket.io. I would like to accumulate the data that users enter from the UI, collect it on the server, and push it to the database at intervals. For example, different users continuously enter messages from their accounts, and these messages are collected on the server. Every minute, the collected data is pushed to the database and the pool is drained. Is this possible with Node.js and Socket.io?
You can achieve this pretty easily in Node.js.
Just accumulate your messages in a hashmap (say, one key per user) and schedule a timeout every minute to flush the hashmap into your database.
var messages = {};

io.sockets.on('connection', function (socket) {
  socket.on('message', function (data) {
    // data.id is the user id, data.content is the message text
    if (!(data.id in messages)) {
      messages[data.id] = [];
    }
    messages[data.id].push(data.content);
  });
});
var flush = function() {
  for (var userId in messages) {
    // ... write messages[userId] to the database
    // then effectively flush them
    messages[userId] = [];
  }
  // Don't forget to schedule the next flush
  setTimeout(flush, 60 * 1000);
};
setTimeout(flush, 60 * 1000);
It's a naive solution that is very easy to set up and test.
Don't use setInterval, which is very greedy on CPU time.
As this example is single-threaded, your server will be unavailable during the (short) time it spends writing to the database.
If you need a highly available server, consider using another process for storing and flushing the messages, and send your messages to it instead of storing them in the main (communication) process.
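A minimal sketch of that split, assuming child_process.fork and a worker file named flush-worker.js (the file name and the database write are placeholders, not part of the original answer):
// main (communication) process
var fork = require('child_process').fork;
var writer = fork(__dirname + '/flush-worker.js');
io.sockets.on('connection', function (socket) {
  socket.on('message', function (data) {
    // hand the message off to the worker instead of buffering it here
    writer.send({ id: data.id, content: data.content });
  });
});

// flush-worker.js (separate process)
var messages = {};
process.on('message', function (msg) {
  (messages[msg.id] = messages[msg.id] || []).push(msg.content);
});
var flush = function () {
  // ... write `messages` to the database here, then empty it
  messages = {};
  setTimeout(flush, 60 * 1000);
};
setTimeout(flush, 60 * 1000);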
I am building a user interface with a GAS web app.
The script is connected to a spreadsheet to read and record data.
I am able to record data from the user entry to the spreadsheet with my current code, but my concern is the reliability of this app when multiple users are connected and try to record data at the same time.
In order to test the tool, I created a 10-iteration loop on the client side which sends pre-formatted data to the server side to be recorded in the spreadsheet.
When I launch the function on one computer it works (10 lines are properly recorded), but when a second user runs the same function in their session the total number of lines recorded is random (15 or 17 instead of 20).
Looking at the spreadsheet while the scripts are running, I can see that sometimes a value on a line is overwritten by another value (just as if the new row was not recorded at the proper place).
The web app is shared as "execute as me" (so the server-side code runs under my account).
In order to control what is going on, a LockService has been implemented in the server-side script (the part which records the data in the spreadsheet) and a new promise / async function on the client side.
function recordDataOnSheet(data) { // server-side function to record data in the spreadsheet
  // dataSheet is assumed to be defined elsewhere in the script, e.g.:
  // var dataSheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Data');
  var lock = LockService.getScriptLock();
  var success = lock.tryLock(10000); // wait up to 10 s to get the lock
  if (success) {
    var dat = dataSheet.getDataRange().getValues();
    dat.push(data);
    dataSheet.getRange(1, 1, dat.length, dat[0].length).setValues(dat);
    Logger.log(dat.length);
  } else {
    Logger.log("Lock failed");
  }
  lock.releaseLock();
}
and the client-side piece of script:
function runGoogleSript(serverFunc, dat) { // function to manage the asynchronous call of a server-side function
  return new Promise((resolve, reject) => {
    google.script.run.withSuccessHandler(data => {
      resolve(data);
    }).withFailureHandler(er => {
      reject(er);
    })[serverFunc](dat);
  });
}
async function test() {
  for (let i = 0; i < 10; i++) {
    let d = new Date();
    setTimeout(() => { console.log(i) }, 2000);
    try {
      const data = await runGoogleSript("recordDataOnSheet", ["06323", "XX:", user.id, "2022-02-07", d.getMinutes(), d.getSeconds()]);
    }
    catch (er) { alert(er); }
  }
}
After dozens of tests, I discovered that it was only an issue with the spreadsheet call.
As I am calling the same file/sheet in a loop, sometimes the previous activity (updating the data in the sheet) was not finished yet. This was the origin of the issue.
I simply added a SpreadsheetApp.flush(); at the end of my recordDataOnSheet function, and now it works and is reliable.
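For reference, a minimal sketch of where that call fits (dataSheet as in the function above); the point is that the pending write is forced through before the lock is released:
function recordDataOnSheet(data) {
  var lock = LockService.getScriptLock();
  if (lock.tryLock(10000)) {
    var dat = dataSheet.getDataRange().getValues();
    dat.push(data);
    dataSheet.getRange(1, 1, dat.length, dat[0].length).setValues(dat);
    SpreadsheetApp.flush(); // apply the pending write before releasing the lock
  } else {
    Logger.log("Lock failed");
  }
  lock.releaseLock();
}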
UrlFetch has a limit on the amount of data allowed, so in order to load a large CSV or TSV file (50K) I've created a proxy server (REST)
which makes the request under the hood, maps it to a list and caches it for 6 hours.
Then I added an endpoint to fetch data based on a range (see the signature below).
signature:
var PROXY = 'http://lagoja-services.rcb.co.il/api/adwordsProxy/feed/{base64URL}/{seperator}/{from}/{to}'
When someone hits the range endpoint, I create a CSV on the fly.
The first load of this large file on the server takes between 60-80 seconds; subsequent requests take the content from the cache, so they take only about 2 seconds.
The problem I have is that I'm getting a timeout after 60 seconds using the built-in UrlFetch class, so it works intermittently (sometimes it works and sometimes I get a timeout exception).
There is no way of reducing the first load!
It seems that I can't control the client timeout (for some unknown reason).
Any other solution for me?
thanks, Roby (-:
Hi, I've created a dedicated "test" endpoint for you to reproduce the issue. This endpoint tries to load the massive file every time (no cache):
function main() {
  var response = UrlFetchApp.fetch('http://lagoja-services.rcb.co.il/api/adwordsProxy/test');
  var status_code = response.getResponseCode();
  Logger.log('status code %s', status_code);
}
endpoint:
http://lagoja-services.rcb.co.il/api/adwordsProxy/test
exception:
Timeout: http://lagoja-services.rcb.co.il/api/adwordsProxy/test
Hi all,
I found a solution (which I don't like, of course),
but I didn't get any better suggestion.
The idea is to ping the server to fetch the file, wait until it is loaded into the cache, and then request the loaded content... an ugly solution, but at least it works lol
api:
[HttpGet]
[Route("api/fire")]
public HttpResponseMessage FireGetCSVFeed(...) {
    HostingEnvironment.QueueBackgroundWorkItem(async ct => {
        // make a request to the original CSV file - takes 60-80 seconds
    });
    return Request.CreateResponse(HttpStatusCode.OK, "Working...");
}
adwords script:
function sleep(milliseconds) {
  var start = new Date().getTime();
  while (new Date().getTime() < start + milliseconds); // busy-wait
}

function main() {
  var response = UrlFetchApp.fetch(baseURL + 'fire');
  sleep(1000 * 120); // yes I know, I'm blocking everything )-:
  response = UrlFetchApp.fetch(baseURL + 'getCSV');
  var status_code = response.getResponseCode();
  Logger.log('status code %s', status_code);
}
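As a side note, Ads/Apps scripts already provide Utilities.sleep(milliseconds), so the hand-rolled busy-wait above shouldn't be needed; a minimal variant of main() under that assumption:
function main() {
  UrlFetchApp.fetch(baseURL + 'fire');  // kick off the background load on the proxy
  Utilities.sleep(1000 * 120);          // still blocks, but without spinning the CPU
  var response = UrlFetchApp.fetch(baseURL + 'getCSV');
  Logger.log('status code %s', response.getResponseCode());
}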
I am making a chat application, and I wish to monitor which users are online and which have left.
When a user joins, on connect it will add his IP to a MySQL "users online" table along with the username etc.
When a user leaves, on disconnect it will remove his IP from "users online".
In case anything unpredicted happens, I want to get all IP addresses of clients that are currently connected to the server and compare them to the ones in the table, and that way sort out which clients are connected and which aren't.
So how can I obtain a list of IPs of connected clients?
The reason I want to use MySQL and a table for this is that I want to monitor how many users are currently online from an external PHP site. If there is a better way, I am open to suggestions.
One solution would be to keep an object around that contains all connected sockets (adding on connect and removing on close). Then you just iterate over the sockets in the object, as sketched below.
Or, if you're feeling adventurous, you could use an undocumented method to get all of the active handles in Node and filter them. Example:
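A minimal sketch of that first approach, assuming a plain http server; the connections object and the /ips route are illustrative names:
var http = require('http');
var connections = {}; // "address:port" -> socket, for every client currently connected

var srv = http.createServer(function (req, res) {
  if (req.url === '/ips') {
    res.end(JSON.stringify(Object.keys(connections))); // report who is connected right now
    return;
  }
  res.end('ok');
});

srv.on('connection', function (socket) {
  var key = socket.remoteAddress + ':' + socket.remotePort;
  connections[key] = socket;          // add on connect
  socket.on('close', function () {
    delete connections[key];          // remove on close
  });
});

srv.listen(8000);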
var http = require('http');

var srv = http.createServer(function(req, res) {
  console.dir(getIPs(srv));
  // ...
});
srv.listen(8000);

function getIPs(server) {
  var handles = process._getActiveHandles(),
      ips = [];
  for (var i = 0, handle, len = handles.length; i < len; ++i) {
    handle = handles[i];
    // keep only socket handles that belong to this server
    if (handle.readable
        && handle.writable
        && handle.server === server
        && handle.remoteAddress) {
      ips.push(handle.remoteAddress);
    }
  }
  return ips;
}
I am trying to set up caching for a spreadsheet custom function, but the results seem inconsistent/unexpected. Sometimes I get the cached results, sometimes it refreshes the data. I've set the timeout to 10 seconds, and when I refresh within 10 seconds, sometimes it grabs new data and sometimes it returns the cached version. Even after waiting more than 10 seconds since the last call, I sometimes still get the cached results. Why is there so much inconsistency in the spreadsheet function (or am I just doing something wrong)? When I call the function directly from the script editor it seems much more consistent, but I still occasionally get unexpected results.
function getStackOverflow() {
  var cache = CacheService.getPublicCache();
  var cached = cache.get("stackoverflow");
  if (cached != null) {
    Logger.log('this is cached');
    return 'this is cached version';
  }
  // Fetch the data and create an object.
  var result = UrlFetchApp.fetch('http://api.stackoverflow.com/1.1/tags/google-apps-script/top-answerers/all-time');
  var json = Utilities.jsonParse(result.getContentText()).top_users;
  var rows = [], data;
  for (var i = 0; i < json.length; i++) {
    data = json[i].user;
    rows.push(data.display_name);
  }
  Logger.log("This is a refresh");
  cache.put("stackoverflow", JSON.stringify(rows), 10); // cache for 10 seconds
  return rows;
}
You can't use custom functions like that. It's documented:
Custom functions must be deterministic; they always return the same output for the same input (in your case none, since you are passing no parameters).
The spreadsheet will remember the values for each input set, which is basically a second layer of cache that you have no control over.
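To illustrate the point about inputs, a deterministic variant would take its input as a parameter, so the sheet's own value cache keys off that argument; getTopAnswerers and the tag parameter here are hypothetical names, not part of the original script:
// Used in a cell as =getTopAnswerers("google-apps-script"); the same tag may
// return the value the spreadsheet remembered, a different tag forces a real call.
function getTopAnswerers(tag) {
  var result = UrlFetchApp.fetch('http://api.stackoverflow.com/1.1/tags/' + tag + '/top-answerers/all-time');
  var json = JSON.parse(result.getContentText()).top_users;
  return json.map(function (u) { return u.user.display_name; });
}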
Hi, I am trying to delete a record in an indexed database by passing its id, but the function is not working properly, and even the Visual Studio IntelliSense is not showing any such function. Has the objectStore.delete() function of the IndexedDB API been deprecated, or am I doing something wrong in calling it?
Following is the code snippet:
var result = objectStore.delete(key);
result.onsuccess = function() {
  alert('Success');
};
Deleting by key works fine in all browsers: Chrome, FF and IE10. Here is sample code:
var connection = indexedDB.open(dbName);
connection.onsuccess = function(e) {
  var database = e.target.result;
  var transaction = database.transaction(storeName, 'readwrite');
  var objectStore = transaction.objectStore(storeName);
  var request = objectStore.delete(parseInt(key));
  request.onsuccess = function (event) {
    database.close();
  };
};
Almost everything in IndexedDB works the same way, and your question betrays a misunderstanding of this model: everything happens in a transaction.
Almost nothing in the IndexedDB API is synchronous except opening the database. So you'll never see anything like database.delete() or database.set() when dealing with records.
To delete a record, as with getting or setting, you start by creating a new transaction on the database. You then use that transaction (as in Deni's example) to invoke the method for your change.
The transaction then "disappears" when it goes out of scope of all functions, and your change is committed to the database. It is on this transaction (not on the database itself) that you hook event listeners such as success and error callbacks.
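Putting that together, a minimal delete with transaction-level listeners looks roughly like this (dbName, storeName and key as in the earlier snippet):
var connection = indexedDB.open(dbName);
connection.onsuccess = function (e) {
  var database = e.target.result;
  var transaction = database.transaction(storeName, 'readwrite');

  // listeners are hooked on the transaction and its requests, not on the database
  transaction.oncomplete = function () {
    database.close(); // the delete has been committed at this point
  };
  transaction.onerror = function (event) {
    console.log('delete failed: ', event.target.error);
  };

  var request = transaction.objectStore(storeName).delete(key);
  request.onsuccess = function () {
    // the request succeeded; the commit happens when the transaction completes
  };
};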