How to send dry run notifications with VAPID push notifications

GCM provides a way to send dry run messages to test request formats, as explained in the reference https://developers.google.com/cloud-messaging/http-server-ref. How can similar dry_run support be achieved with the VAPID (FCM) standard?

I know this is an old post, but I'll answer. Maybe it helps someone.
You can use DryRun to test in FCM too. Look:
// Send a message in dry run mode.
var dryRun = true;
admin.messaging().send(message, dryRun)
  .then((response) => {
    // Response is a message ID string.
    console.log('Dry run successful:', response);
  })
  .catch((error) => {
    console.log('Error during dry run:', error);
  });
You need to pass a boolean as the second parameter of the send() function.
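For reference, message in that snippet is an ordinary FCM message object; a minimal sketch (the registration token is a placeholder):
// Minimal FCM message; in dry run mode it is validated but never delivered.
const message = {
  token: 'DEVICE_REGISTRATION_TOKEN', // placeholder device token
  notification: {
    title: 'Test',
    body: 'Validating the request format only.'
  }
};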

Related

Is sending login credentials to server for authentication and authorization in body with content-type: application/json acceptable?

I was looking through multiple stack overflow questions and answers but wasn't able to get anything definitive when it comes to making a request to a server for login authentication and authorization.
My question: Is sending login credentials to server for authentication and authorization in body with content-type: application/json acceptable?
const handleSubmit = (e) => {
  e.preventDefault();
  const formData = new FormData(e.target);
  const [email, password] = formData.values();
  fetch('/login', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ email, password })
  }).then(result => { // result is a Response object
    return result.json(); // result.json() parses the JSON body into a usable object
  }).then(data => {
    if (data.isAuthenticated) {
      handleUserAuthChange(true, () => {
        history.push('/vehicles');
      });
    }
  }).catch(err => console.log(err));
}
As long as you are using HTTPS, yes. This is a pretty common way of handling login requests.
There is a great "tutorial" here on Stack Overflow:
Unless the connection is already secure (that is, tunneled through HTTPS using SSL/TLS), your login form values will be sent in cleartext, and anyone eavesdropping on the line between browser and web server will be able to read logins as they pass through. This type of wiretapping is done routinely by governments, but in general, we won't address 'owned' wires other than to say this: Just use HTTPS.
In short, always use HTTPS to be sure it's safe.
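To round this out, a minimal Express sketch of the receiving end (verifyCredentials is a hypothetical placeholder for your own credential check):
const express = require('express');
const app = express();

app.use(express.json()); // parses application/json request bodies

app.post('/login', async (req, res) => {
  const { email, password } = req.body;
  // verifyCredentials is a placeholder; substitute your own lookup/hash check
  const isAuthenticated = await verifyCredentials(email, password);
  res.json({ isAuthenticated });
});
Because TLS encrypts the entire request, the JSON body (and headers) are protected in transit; it is the transport, not the body format, that makes this safe.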

Cloud Functions for Firebase: "Could not handle the request" after a successful request

TLDR: After writing a JSON (successfully) to my Firestore, the next request will give me Internal Server Error (500). I have a suspicion that the problem is that inserting is not yet complete.
So basically, I have this code:
const jsonToDb = express();

exports.jsondb = functions.region('europe-west1').https.onRequest(jsonToDb);

jsonToDb.post('', (req, res) => {
  let doc;
  try {
    doc = JSON.parse(req.body);
  } catch (error) {
    res.status(400).send(error.toString()).end();
    return;
  }
  myDbFuncs.saveMyDoc(doc);
  res.status(201).send("OK").end();
});
The database functions are in another JS file.
module.exports.saveMyDoc = function (myDoc) {
  let newDoc = db.collection('insertedDocs').doc(new Date().toISOString());
  newDoc.set(myDoc).then().catch();
  return;
};
So I have several theories; maybe one of them is right. Please help me with this. (Also, if I made some mistakes in this little snippet, just tell me.)
Reproduction:
I send the first request => everything is OK, the JSON is in the database.
I send a second request after the first one returns OK => it does nothing for a few seconds, then 500: Internal Server Error.
Logs: Function execution took 4345 ms, finished with status: 'connection error'.
I just don't understand. Let's imagine I'm using this as an API with several simultaneous requests. Can't it handle that? (I suppose it can; I'm just doing something stupid.) Deliberately, I'm sending the second request after the first has finished, and this occurs.
Should I make the saveMyDoc async?
saveMyDoc isn't returning a promise that resolves when all the async work is complete. If you lose track of a promise, Cloud Functions will shut down the work and clean up before the work is complete, making it look like it simply doesn't work. You should only send a response from an HTTP type function after all the work is fully complete.
Minimally, it should look more like this:
module.exports.saveMyDoc = function (myDoc) {
  let newDoc = db.collection('insertedDocs').doc(new Date().toISOString());
  return newDoc.set(myDoc);
};
Then you would use the promise in your main function:
myDbFuncs.saveMyDoc(doc).then(() => {
  res.status(201).send("OK").end();
});
See how the response is only sent after the data is saved.
Read more about async programming in Cloud Functions in the documentation. Also watch this video series that talks about working with promises in Cloud Functions.
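Putting both pieces together, a sketch of the corrected handler, with a .catch so a failed write surfaces as a 500 instead of a silently dropped promise:
jsonToDb.post('', (req, res) => {
  let doc;
  try {
    doc = JSON.parse(req.body);
  } catch (error) {
    return res.status(400).send(error.toString());
  }
  // Only respond once the write has actually completed (or failed).
  myDbFuncs.saveMyDoc(doc)
    .then(() => res.status(201).send('OK'))
    .catch((error) => res.status(500).send(error.toString()));
});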

How do I filter the fields returned from an internal service call?

I need some help. While consuming another internal service, e.g.:
await Promise.all(messages.map(async message => {
  // Also pass the original `params` to the service call
  // so that it has the same information available (e.g. who is requesting it)
  message.user = await app.service('users').get(message.userId, params);
}));
I get all the fields back: email, avatar, _id.
How do I get only email back?
Thanks
Use $select in the query; see the Feathers documentation on $select.
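For example, adapted from the snippet in the question (a sketch; merging $select into the existing params this way assumes params is a plain object):
// Restrict the fields returned by the users service to just `email`.
message.user = await app.service('users').get(message.userId, {
  ...params,
  query: { ...params.query, $select: ['email'] }
});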

How to parse or stringify in an asynchronous way in JavaScript

I see that JSON.stringify and JSON.parse are both synchronous.
I would like to know if there is a simple npm library that does this in an asynchronous way.
Thank you
You can make anything "asynchronous" by using Promises:
function asyncStringify(str) {
  return new Promise((resolve, reject) => {
    resolve(JSON.stringify(str));
  });
}
Then you can use it like any other promise:
asyncStringify(str).then(ajaxSubmit);
Note that the promise resolves right away, because the code is not actually asynchronous: JSON.stringify still runs synchronously on the current thread (it makes no system calls), so wrapping it in a promise only changes how callers consume the result, not when the work happens.
You can also use the async/await API if your platform supports it:
async function asyncStringify(str) {
  return JSON.stringify(str);
}
Then you can use it the same way:
asyncStringify(str).then(ajaxSubmit);
// or use the "await" API
const strJson = await asyncStringify(str);
ajaxSubmit(strJson);
Edited: One way of adding true asynchronous parsing/stringifying (maybe because we're parsing something too complex) is to pass the job to another process (or service) and wait for the response.
You can do this in many ways (like creating a new service that exposes a REST API). I will demonstrate here a way of doing this with message passing between processes:
First create a file that will take care of doing the parsing/stringifying. Call it async-json.js for the sake of the example:
// async-json.js
function stringify(value) {
  return JSON.stringify(value);
}

function parse(value) {
  return JSON.parse(value);
}

process.on('message', function(message) {
  let result;
  if (message.method === 'stringify') {
    result = stringify(message.value);
  } else if (message.method === 'parse') {
    result = parse(message.value);
  }
  process.send({ callerId: message.callerId, returnValue: result });
});
All this process does is wait for a message asking it to stringify or parse JSON and then respond with the result.
Now, in your code, you can fork this script and send messages back and forth. Whenever a request is sent, you create a new promise; whenever a response to that request comes back, you resolve that promise:
const fork = require('child_process').fork;
const asyncJson = fork(__dirname + '/async-json.js');
const callers = {};

asyncJson.on('message', function(response) {
  callers[response.callerId].resolve(response.returnValue);
  delete callers[response.callerId]; // free the slot once resolved
});

function callAsyncJson(method, value) {
  const callerId = Math.floor(Math.random() * 1000000); // simple correlation id
  const callPromise = new Promise((resolve, reject) => {
    callers[callerId] = { resolve: resolve, reject: reject };
    asyncJson.send({ callerId: callerId, method: method, value: value });
  });
  return callPromise;
}

function JsonStringify(value) {
  return callAsyncJson('stringify', value);
}

function JsonParse(value) {
  return callAsyncJson('parse', value);
}

JsonStringify({ a: 1 }).then(console.log.bind(console));
JsonParse('{ "a": "1" }').then(console.log.bind(console));
Note: this is just one example, but knowing this you can figure out other improvements or other ways to do it. Hope this is helpful.
Check out another npm package:
async-json is a library that provides an asynchronous version of the standard JSON.stringify.
Install:
npm install async-json
Example:
var asyncJSON = require('async-json');

asyncJSON.stringify({ some: "data" }, function (err, jsonValue) {
  if (err) {
    throw err;
  }
  jsonValue === '{"some":"data"}'; // jsonValue now holds the serialized string
});
Note: I didn't test it; you need to manually check its dependencies and required packages.
By asynchronous I assume you actually mean non-blocking asynchronous - i.e., if you have a large (megabytes-large) object and you stringify it (or a large JSON string and you parse it), you don't want your web server to hard-freeze and block newly incoming web requests for 500+ milliseconds while it processes the data.
Option 1
The generic answer is to iterate through your object piece by piece and then call setImmediate whenever a threshold is reached, which allows other functions in the event queue to run for a bit.
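As an illustration of that pattern, a minimal sketch that serializes a large array in chunks and yields to the event loop between chunks (stringifyArrayAsync is a hypothetical helper, not a library API):
// Stringify a large array of items in chunks, yielding between chunks
// via setImmediate so pending I/O and timers can run.
function stringifyArrayAsync(items, chunkSize = 1000) {
  return new Promise((resolve) => {
    const parts = [];
    let i = 0;
    (function next() {
      const end = Math.min(i + chunkSize, items.length);
      for (; i < end; i++) {
        parts.push(JSON.stringify(items[i]));
      }
      if (i < items.length) {
        setImmediate(next); // yield to the event loop, then continue
      } else {
        resolve('[' + parts.join(',') + ']');
      }
    })();
  });
}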
For JSON (de)serialization, the yieldable-json library does this very well. It does, however, drastically sacrifice processing speed (which is somewhat intentional).
Usage example from the yieldable-json readme:
const yj = require('yieldable-json')
yj.stringifyAsync({key:"value"}, (err, data) => {
  if (!err)
    console.log(data)
})
Option 2
If processing speed is extremely important (such as with real-time data), you may want to consider spawning multiple Node processes instead. I've used the PM2 process manager with great success, although the initial setup was quite daunting. Once it works, however, the final result is magic, and it does not require modifying your source code, just your package.json file. It acts as a proxy, load balancer, and monitoring tool for Node applications. It's somewhat analogous to Docker swarm, but bare metal, and it does not require a special client on the server.
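For illustration, cluster mode is commonly configured through a PM2 ecosystem file (or a start script) rather than code changes; a minimal sketch (the app name and entry point are placeholders, the option names are standard PM2 fields):
// ecosystem.config.js
module.exports = {
  apps: [{
    name: 'api',            // placeholder app name
    script: './server.js',  // placeholder entry point
    instances: 'max',       // one worker per CPU core
    exec_mode: 'cluster'    // PM2's built-in cluster-mode load balancing
  }]
};
Running pm2 start ecosystem.config.js should then launch one worker per core behind PM2's load balancer.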

Customize Loopback response after save

I have a LoopBack 2.x app, in which I have a model Conversation and a model Message, with a relationship "Conversation has many messages". I want to customize the response for POST conversations/:id/messages with a JSON response different from the default, say {status: 'success'}. I tried to use a remote hook for the method __create__messages, but it did not work:
Conversation.afterRemote('__create__messages', function(ctx, next) {
  ctx.result.data = {
    success: 'yes'
  };
  next();
});
This still returns the default response. How can I return custom JSON for a remote method? I have seen examples only for all models or for all methods (multiple models, multiple methods).
Maybe you can try a version of the following code. Also, I think you mean to manipulate the data before the method finishes, not after; if you wait, the response will already have been created, preventing your intended goal. Let me know if this works (replace with methods that will work for your use case).
Conversation.observe('before save', function(context, next) {
  var instance = context.instance || context.data;
  if (!instance) return next();
  // Your code here
  next();
});
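If the goal really is to change the response body itself, another thing worth trying (a sketch, assuming LoopBack's documented remote-hook behavior where the response is serialized from ctx.result) is to replace ctx.result wholesale instead of assigning to a property on it:
Conversation.afterRemote('__create__messages', function(ctx, instance, next) {
  // Replacing the entire result changes the response; assigning to
  // ctx.result.data only adds a property alongside the default payload.
  ctx.result = { status: 'success' };
  next();
});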