Where to specify whether it's a GET or POST? - google-cloud-functions

When I create a new Google Cloud function, the default code given is:
const functions = require('@google-cloud/functions-framework');

functions.http('helloHttp', (req, res) => {
  res.send(`Hello ${req.query.name || req.body.name || 'World'}!`);
});
However, in this tutorial, the function is specified as:
exports.validateTemperature = async (req, res) => {
  try {
    if (req.body.temp < 100) {
      res.status(200).send("Temperature OK");
    } else {
      res.status(200).send("Too hot");
    }
  } catch (error) {
    // return an error
    console.log("got error: ", error);
    res.status(500).send(error);
  }
};
What is the difference between the two? How do they work in the bigger scheme of things?
In the second example, the code is listening for an HTTP POST request. Where is this specified?

With both approaches the result is the same: an HTTP path receives the Request and Response objects.
Inside the Request object you can find the Body (usually populated in POST and PUT requests, but not limited to those) and the Method (GET, POST, PUT, etc.).
Therefore, in either solution your Cloud Function code will be invoked for both GET and POST calls.

Functions Framework
Functions Framework turns a Cloud Functions snippet into a working server. It is roughly equivalent to registering a handler function on an Express router and running the Express app behind the scenes.
The main use cases are local development and migration to App Engine or other services. Both need an HTTP server to be started, and Functions Framework does exactly that.
Express
The Node.js runtime interprets your code snippet using the Express framework.
The validateTemperature function accepts all HTTP methods.
We usually filter HTTP methods with a router, although you can also do it yourself with req.method. Routing is web-server-level logic that you don't want to repeat again and again.
If you want to split requests by method when building an API service, consider putting Cloud Endpoints or an API gateway in front of your functions.

Related

use of $timeout with 0 milliseconds

HttpMethod.CallHttpPOSTMethod('POST', null, path).success(function (response) {
    console.log(response);
    $scope.htmlString = $sce.trustAsHtml(response.data[0]);
    $timeout(function () {
        var temp = document.getElementById('form');
        if (temp != null) {
            temp.submit();
        }
    }, 0);
});
I get an HTML string in the RESPONSE of my API call, and then I add the HTML to my view page.
If I write the code outside the $timeout service it won't work, whereas it does work when written inside $timeout.
What is the difference between two ways?
How is $timeout useful here?
Changes made to the controller from asynchronous code do not automatically trigger two-way binding. If the asynchronous code is wrapped in one of the special services ($timeout, $scope.$apply(), etc.), binding will happen. For your code example, I would try replacing your code with:
HttpMethod.CallHttpPOSTMethod('POST', null, path).success(function (response) {
    console.log(response);
    $scope.htmlString = $sce.trustAsHtml(response.data[0]);
    var temp = document.getElementById('form');
    if (temp != null) {
        temp.submit();
    }
    $scope.$apply();
});
I will try to give you an answer in very simple language; I hope it helps you understand your issue.
Generally, when an HTTP request fires, it is sent to the server and data comes back from the server. Due to network latency, the response may arrive with a delay.
An AngularJS application has its own lifecycle.
The root scope is created during application bootstrap by the $injector. During template linking, directive binding creates new child scopes.
While templates are linked, watchers are registered on the scope to detect particular changes.
In your case, a watcher is registered when the template is linked and the directive is bound. Due to network latency or another reason, your $http request receives a delayed response, and in the meantime the scope variable has already changed; because of that, the view does not show the updated response.
An $http request to a server is an asynchronous operation. When you use $timeout, your scope binding waits the amount of time you specified in the $timeout call. After that delay, the watcher on your scope variable executes and updates the value, provided the response arrived in time.

Communicate "out" from Chromium via DevTools protocol

I have a page running in a headless Chromium instance, and I'm manipulating it via the DevTools protocol, using the Puppeteer NPM package in Node.
I'm injecting a script into the page. At some point, I want the script to call me back and send me some information (via some event exposed by the DevTools protocol or some other means).
What is the best way to do this? It'd be great if it can be done using Puppeteer, but I'm not against getting my hands dirty and listening for protocol messages by hand.
I know I can sort-of do this by manipulating the DOM and listening to DOM changes, but that doesn't sound like a good idea.
Okay, I've discovered a built-in way to do this in Puppeteer. Puppeteer defines a method called exposeFunction.
page.exposeFunction(name, puppeteerFunction)
This method defines a function with the given name on the window object of the page. The function is async on the page's side. When it's called, the puppeteerFunction you define is executed as a callback, with the same arguments. The arguments aren't JSON-serialized, but passed as JSHandles so they expose the objects themselves. Personally, I chose to JSON-serialize the values before sending them.
I've looked at the code, and it actually works by sending console messages (just like in Pasi's answer) which the Puppeteer console hooks ignore. However, if you listen to the console directly (e.g. by piping stdout), you'll still see them along with the regular messages.
Since the console information is actually sent by WebSocket, it's pretty efficient. I was a bit averse to using it because in most processes, the console transfers data via stdout which has issues.
Example
Node
async function example() {
    const puppeteer = require("puppeteer");
    let browser = await puppeteer.launch({
        // arguments
    });
    let page = await browser.newPage();
    await page.exposeFunction("callPuppeteer", function (data) {
        console.log("Node receives some data!", data);
    });
    await page.goto("http://www.example.com/target");
}
Page
Inside the page's javascript:
window.callPuppeteer(JSON.stringify({
    thisCameFromThePage: "hello!"
}));
Update: DevTools protocol support
There is DevTools protocol support for something like puppeteer.exposeFunction.
https://chromedevtools.github.io/devtools-protocol/tot/Runtime#method-addBinding
If executionContextId is empty, adds binding with the given name on
the global objects of all inspected contexts, including those created
later, bindings survive reloads. If executionContextId is specified,
adds binding only on global object of given execution context. Binding
function takes exactly one argument, this argument should be string,
in case of any other input, function throws an exception. Each binding
function call produces Runtime.bindingCalled notification.
If the script sends all its data back in one call, the simplest approach would be to use page.evaluate and return a Promise from it:
const dataBack = page.evaluate(`new Promise((resolve, reject) => {
    setTimeout(() => resolve('some data'), 1000)
})`)
dataBack.then(value => { console.log('got data back', value) })
This could be generalized to sending data back twice, etc. For sending back an arbitrary stream of events, perhaps console.log would be slightly less of a hack than DOM events? At least it's super-easy to do with Puppeteer:
page.on('console', message => {
    if (message.text.startsWith('dataFromMyScript')) {
        message.args[1].jsonValue().then(value => console.log('got data back', value))
    }
})
page.evaluate(`setInterval(() => console.log('dataFromMyScript', {ts: Date.now()}), 1000)`)
(The example uses a magic prefix to distinguish these log messages from all others.)

How to migrate from express js to feathers js server?

I built a simple REST API with Express to post data to my server:
app.post("/register", function (request, response) {
    var username = request.body.username;
});
How can I do this with Feathers? And how can I call it from my React app?
Feathers is a drop-in replacement for Express. That means you can replace your const app = express(); with const app = feathers() and everything will work just the same, so what you are showing above you can also do with Feathers.
The real Feathers way to accomplish this however is through services which - with the other important concepts - are described in the basics guide.
There are pre-built services for several databases (which can be customized through hooks), but you can always create your own service. It is important to note that Feathers services, unlike the Express middleware you showed, will be available via HTTP (REST) and websockets (which also gets you real-time events). See here how service methods map to REST endpoints.
Your example in Feathers simply looks like this:
app.use('/register', {
    create(data, params) {
        // data is the request body, e.g.
        // data.username
        // Always return a promise with the result data
        return Promise.resolve(data);
    }
});
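To see why this maps cleanly onto a POST, note that a service is just an object whose create method receives the parsed request body. A minimal dependency-free sketch (the registerService object below is hypothetical, standing in for what Feathers wires up for you):

```javascript
// Hypothetical stand-in for a Feathers service: POST /register
// becomes a call to create(data) with the parsed request body.
const registerService = {
  async create(data, params) {
    // data.username is what the React app sent in the POST body
    return { username: data.username, registered: true };
  }
};

// What Feathers does internally when POST /register arrives:
registerService.create({ username: 'alice' }).then(result => {
  console.log(result);
});
```

From React, the matching call with the Feathers client is along the lines of `client.service('register').create({ username })`; a plain fetch against `POST /register` works too, since the service is exposed as a normal REST endpoint.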

U2F with multi-facet App ID

We have been directly using U2F on our auth web app with the hostname as our app ID (https://auth.company.com) and that's working fine. However, we'd like to be able to authenticate with the auth server from other apps (and hostnames, e.g. https://customer.app.com) that communicate with the auth server via HTTP API.
I can generate the sign requests and what-not through API calls and return them to the client apps, but it fails server-side (auth server) because the app ID doesn't validate (clients are using their own hostnames as app ID). This is understandable, but how should I handle this? I've read about facets but I cannot get it to work at all.
The client app JS is like:
var registerRequests = // ...
var signRequests = // ...
u2f.register('http://localhost:3000/facets', registerRequests, signRequests, function (registerResponse) {
    if (registerResponse.errorCode) {
        return alert("Registration error: " + registerResponse.errorCode);
    }
    // etc.
});
This gives me an Error code 5 (timeout error) after a while. I don't see any request to /facets. Is there a way around this, or am I barking up the wrong tree (or a different forest)?
————
Okay, so after a few hours of researching this; I'm pretty sure this fiendish bit of the Firefox U2F plugin is the source of some of my woes:
if (u.scheme == "http")
    if (url2str(u, true) == url2str(ou, true))
        return resolve(challenge);
    else
        return reject("Not matching appID");
https://github.com/prefiks/u2f4moz/blob/master/ext/appIdValidator.js#L106-L110
It essentially says: if the appID's scheme is http, allow it only if it is exactly the same as the page's origin (it goes on to implement the behaviour for fetching the trusted facets JSON, but only for https).
Still not sure if I'm on the right track though in how I'm trying to design this.
I didn't need to worry about facets for my particular situation. In the end I just pass the client app hostname through to the Auth server via the secure API interface and it uses that as the App ID. Seems to work okay so far.
The issue I was having with facets was due to using http in dev and the Firefox U2F plugin not permitting that with JSON facets.

Get the ISP of an IP in node.js

Is there a way to perform a whois lookup on an IP to get the ISP that provides that IP, in a Node.js/Express server?
I already got the IP, I'm not looking for a way to get the client's IP.
I've found ways with external request to paid services that sends back JSON, but I would like to find a native way.
Do you guys know anything that could help me ?
Edit: I'm not trying to build a whois server, I just need for the application I build to get the client's ISP name.
You can get ISP information by using the node-whois module, but in its response it is quite complex to access the value for a particular key. Another way is to use the satelize module, which gives a quick response in JSON format, so you can access any key's value easily.
Here is the code.
var satelize = require('satelize');
var externalIP = "173.194.70.100"; // I assume you already have the external (public) IP

satelize.satelize({ ip: externalIP }, function (err, geoData) {
    if (err) {
        console.log("Error in retrieving ISP information");
    } else {
        console.log("ISP information for " + externalIP + ": " + JSON.stringify(geoData));
    }
});
This is a Node.js module implementing a whois client.
As correctly pointed out by @robertklep, the above module does not work with IP addresses. Still, node-whois does (I personally tested the code this time):
"use strict";

var whois = require('node-whois');

whois.lookup('173.194.70.100', function (err, data) {
    console.log(err, data);
});
The only issue is that the output is not very nice.
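If the plain-text output is the problem, a small parser can turn the typical `Key: value` lines into an object. A hedged sketch (the sample string below is made up; real registry output varies a lot between whois servers):

```javascript
// Sketch: convert "Key: value" whois lines into a plain object.
function parseWhois(raw) {
  const result = {};
  for (const line of raw.split('\n')) {
    const match = line.match(/^([\w-]+):\s*(.+)$/);
    if (match) {
      result[match[1].toLowerCase()] = match[2].trim();
    }
  }
  return result;
}

// Made-up sample resembling ARIN-style output:
const sample = 'OrgName: Google LLC\nNetName: GOOGLE\nCountry: US';
console.log(parseWhois(sample));
// { orgname: 'Google LLC', netname: 'GOOGLE', country: 'US' }
```

This is only a convenience for flat `Key: value` records; multi-line fields and comments would need more careful handling.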
https://github.com/xreader/whois has nice JSON output. Hope this helps somebody.