Google Functions v2 and NestJs - google-cloud-functions

I'm looking into spinning up an API with Nest using Google Cloud Functions v2. It looks like some people are doing it with Nx: https://itnext.io/a-perfect-match-nestjs-cloud-functions-2nd-gen-nx-workspace-f13fb044e9a4. Can this be done without Nx?
I'm looking for a more vanilla example using just functions-framework and Nest. Can somebody point me to a repo or example?
Thanks!

Nx is just tooling; the implementation should be the same without it. Can you try it in a normal NestJS project?
// main.ts
import * as express from 'express'
import { Logger } from '@nestjs/common'
import { NestFactory } from '@nestjs/core'
import { ExpressAdapter } from '@nestjs/platform-express'
import { http } from '@google-cloud/functions-framework'
// adjust these two paths to your project layout
import { AppModule } from './app/app.module'
import { environment } from './environments/environment'

const server = express()

export const createNestServer = async (expressInstance: express.Express) => {
  const app = await NestFactory.create(AppModule, new ExpressAdapter(expressInstance))
  const globalPrefix = 'api'
  app.setGlobalPrefix(globalPrefix)
  app.enableCors()
  return app.init()
}

createNestServer(server)
  .then((v) => {
    if (environment.production) {
      Logger.log('🚀 Starting production server...')
    } else {
      Logger.log(`🚀 Starting development server on http://localhost:${process.env.PORT || 3333}`)
      v.listen(process.env.PORT || 3333)
    }
  })
  .catch((err) => Logger.error('Nest broken', err))

// register the Express app as the Cloud Functions HTTP entry point
http('apiNEST', server)
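To sanity-check this locally you can point the Functions Framework CLI at the exported apiNEST target. This is only a sketch of the dev loop; it assumes the compiled main.js is what your package.json "main" field points to:
npm install @google-cloud/functions-framework
npx functions-framework --target=apiNEST --port=3333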

Related

Batching with useQuery react hooks getting back undefined

I am currently working on a project which requires me to make multiple queries/mutations. I tried setting up my Apollo client with BatchHttpLink, and I can see the data I am requesting in the network tab in the browser. It comes back as an array of objects instead of a single JSON object.
The issue is that when I try to grab the data in my component, data is undefined. If I use HttpLink instead of BatchHttpLink, I can get the data back from the hook.
My suspicion is that the shape of the response object is different, but I tried looking into the documentation and can't find much about batching.
Currently using "@apollo/client": "^3.0.2".
Here's my client set up.
import { ApolloClient, InMemoryCache, ApolloLink, from } from '@apollo/client'
import { BatchHttpLink } from '@apollo/client/link/batch-http'
import { onError } from '@apollo/client/link/error'

const BASE_URL = 'http://localhost:4000'

const httpLink = new BatchHttpLink({
  uri: BASE_URL,
  credentials: 'include',
})

const csrfMiddleware = new ApolloLink((operation, forward) => {
  operation.setContext(({ headers = {} }) => ({
    headers: {
      ...headers,
      'X-CSRF-Token': getCSRFToken(),
    },
  }))
  return forward(operation)
})

const errorMiddleware = onError(({ networkError }) => {
  if (networkError && 'statusCode' in networkError && networkError.statusCode === 401) {
    window.location.assign('/accounts/login')
  }
})

const client = new ApolloClient({
  link: from([errorMiddleware, csrfMiddleware, httpLink]),
  cache: new InMemoryCache(),
})
This is the React hook whose data I'm trying to console.log:
const {data} = useQuery(GET_USER_PERMISSIONS_AND_PREFERENCES)
Figured it out. You need to add another link (middleware) so the data comes back in a shape the useQuery hook can recognize. The data that comes back in the batch call is an array of objects shaped like:
{
  payload: {
    data: { ... }
  }
}
So something like this did the trick for me
const batchParseMiddleware = new ApolloLink((operation, forward) => {
  return forward(operation).map((data: any) => data.payload)
})
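If you go this route, the mapping link needs to sit in the chain ahead of the BatchHttpLink so it can transform each result on the way back. With the client setup from the question, that would look roughly like this (same names as above, just reordered):
const client = new ApolloClient({
  link: from([errorMiddleware, csrfMiddleware, batchParseMiddleware, httpLink]),
  cache: new InMemoryCache(),
})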
I have been having a similar issue, and have so far only been able to solve it by giving up on batching and converting back to a normal HttpLink.

Is it possible to perform an action with `context` on the init of the app?

I'm simply looking for something like this
app.on('init', async context => {
...
})
Basically I just need to make two calls to the GitHub API, but I'm not sure there is a way to do it without using the API client inside the Context object.
I ended up using probot-scheduler
const createScheduler = require('probot-scheduler')

module.exports = app => {
  createScheduler(app, {
    delay: false
  })
  app.on('schedule.repository', context => {
    // this is called on startup and can access context
  })
}
I tried probot-scheduler but it no longer seems to exist; perhaps it was removed in an update?
In any case, I managed to do it after lots of digging by using the actual app object; its .auth() method returns a promise containing the GitHubAPI interface:
https://probot.github.io/api/latest/classes/application.html#auth
module.exports = app => {
  // `router` here is an Express router, e.g. obtained via app.route()
  router.get('/hello-world', async (req, res) => {
    const github = await app.auth();
    const result = await github.repos.listForOrg({ org: 'org name' });
    console.log(result);
  })
}
.auth() takes the ID of the installation if you wish to access private data. If called with no arguments, the client can only retrieve public data.
You can get the installation ID by calling .auth() without parameters and then calling listInstallations():
const github = await app.auth();
const result = await github.apps.listInstallations();
console.log(result);
You get back an array that includes the installation IDs, which you can then pass to .auth().
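Putting that together, a rough sketch (the first installation and the repos.listForOrg call are only placeholders for whatever private data you actually need):
const appClient = await app.auth();
const { data: installations } = await appClient.apps.listInstallations();
// authenticate as a specific installation to access its private data
const installationClient = await app.auth(installations[0].id);
const repos = await installationClient.repos.listForOrg({ org: 'org name' });
console.log(repos.data);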

Resolving an ES6 module imported from a URL with Rollup

It is perfectly valid to import from a URL inside an ES6 module and as such I've been using this technique to reuse modules between microservices that sit on different hosts/ports:
import { authInstance } from "http://auth-microservice/js/authInstance.js"
I'm approaching a release cycle and have started down my usual path of bundling to IIFEs using Rollup. Rollup doesn't appear to support ES6 module imports from URLs; I think it should, as this is allowed in the spec :(
module-name
The module to import from. This is often a relative or absolute path name to the .js file containing the module. Certain bundlers may permit or require the use of the extension; check your environment. Only single quotes and double quotes Strings are allowed. (https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/import)
I've dug through the interwebs for an hour now and have come up with nothing. Has anybody seen a resolver similar to rollup-plugin-node-resolve for resolving modules from URLs?
I had to move on from this quickly, so I ended up just writing a skeleton of a rollup plugin. I still feel that resolving absolute paths should be a core feature of rollup.
Updated snippet
We have been using this to transpile production code for several of our apps for a considerable amount of time now.
const fs = require('fs'),
  path = require('path'),
  axios = require("axios")

const createDir = dir => !fs.existsSync(dir) && fs.mkdirSync(dir)

const mirrorDirectoryPaths = async ({ cacheLocation, url }) => {
  createDir(cacheLocation)
  const dirs = [], scriptPath = url.replace(/:\/\/|:/g, "-")
  let currentDir = path.dirname(scriptPath)
  while (currentDir !== '.') {
    dirs.unshift(currentDir)
    currentDir = path.dirname(currentDir)
  }
  dirs.forEach(d => createDir(`${cacheLocation}${d}`))
  return `${cacheLocation}${scriptPath}`
}

const cacheIndex = {}

const writeToDiskCache = async ({ cacheLocation, url }) => {
  // Write a file to the local disk cache for rollup to pick up.
  // If the file already exists, use it instead of writing a new one.
  const cached = cacheIndex[url]
  if (cached) return cached
  const cacheFile = await mirrorDirectoryPaths({ cacheLocation, url }),
    data = (await axios.get(url).catch((e) => { console.log(url, e) })).data
  fs.writeFileSync(cacheFile, data)
  cacheIndex[url] = cacheFile
  return cacheFile
}

const urlPlugin = (options = {}) => {
  return {
    async resolveId(importee, importer) {
      // We are importing from a URL
      if (/^https?:\/\//.test(importee)) {
        return await writeToDiskCache({ cacheLocation: options.cacheLocation, url: importee })
      }
      // We are importing from a file within the cacheLocation (originally from a URL) and need to continue the cache import chain.
      if (importer && importer.startsWith(options.cacheLocation) && /^\.\.?\//.test(importee)) {
        const importerUrl = Object.keys(cacheIndex).find(key => cacheIndex[key] === importer),
          importerPath = path.dirname(importerUrl),
          importeeUrl = path.normalize(`${importerPath}/${importee}`).replace(":\\", "://").replace(/\\/g, "/")
        return await writeToDiskCache({ cacheLocation: options.cacheLocation, url: importeeUrl })
      }
    }
  }
}
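For reference, a rollup.config.js using this plugin might look like the sketch below; the cacheLocation directory and the input/output paths are assumptions about your project, not something the plugin dictates:
// rollup.config.js (sketch)
// assumes the plugin file above ends with: module.exports = urlPlugin
const urlPlugin = require('./rollup-plugin-url')

module.exports = {
  input: 'src/main.js',
  output: { file: 'dist/bundle.js', format: 'iife' },
  plugins: [
    urlPlugin({ cacheLocation: './.url-cache/' }),
  ],
}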
The rollup-plugin-url-resolve plugin, together with the following config, works for me:
https://github.com/mjackson/rollup-plugin-url-resolve
import typescript from "@rollup/plugin-typescript";
import urlResolve from "rollup-plugin-url-resolve";
export default {
output: {
format: "esm",
},
plugins: [
typescript({ lib: ["es5", "es6", "dom"], target: "es5" }),
urlResolve(),
],
};
You can of course remove the TypeScript plugin.

Node Elasticsearch - Bulk indexing not working - Content-Type header [application/x-ldjson] is not supported

Being new to Elasticsearch, I am exploring it by integrating it with Node and trying to run the following Git example on Windows:
https://github.com/sitepoint-editors/node-elasticsearch-tutorial
While trying to import the 1000 items from data.json, running 'node index.js' fails with the following error.
With trace enabled, I now see the following as the root cause, coming from the bulk function:
"error": "Content-Type header [application/x-ldjson] is not supported",
"status": 406
I see a changelog at https://www.elastic.co/guide/en/elasticsearch/client/javascript-api/current/changelog.html which says the following:
13.0.0 (Apr 24 2017): bulk and other APIs that send line-delimited JSON bodies now use the Content-Type: application/x-ndjson header #507
Any idea how to resolve this content-type issue in index.js?
index.js
(function () {
  'use strict';

  const fs = require('fs');
  const elasticsearch = require('elasticsearch');
  const esClient = new elasticsearch.Client({
    host: 'localhost:9200',
    log: 'error'
  });

  const bulkIndex = function bulkIndex(index, type, data) {
    let bulkBody = [];
    data.forEach(item => {
      bulkBody.push({
        index: {
          _index: index,
          _type: type,
          _id: item.id
        }
      });
      bulkBody.push(item);
    });

    esClient.bulk({body: bulkBody})
      .then(response => {
        console.log(`Inside bulk3...`);
        let errorCount = 0;
        response.items.forEach(item => {
          if (item.index && item.index.error) {
            console.log(++errorCount, item.index.error);
          }
        });
        console.log(`Successfully indexed ${data.length - errorCount} out of ${data.length} items`);
      })
      .catch(console.error);
  };

  // only for testing purposes
  // all calls should be initiated through the module
  const test = function test() {
    const articlesRaw = fs.readFileSync('data.json');
    const articles = JSON.parse(articlesRaw);
    console.log(`${articles.length} items parsed from data file`);
    bulkIndex('library', 'article', articles);
  };

  test();

  module.exports = {
    bulkIndex
  };
} ());
my local windows environment:
java version 1.8.0_121
elasticsearch version 6.1.1
node version v8.9.4
npm version 5.6.0
The bulk function doesn't return a promise. It accepts a callback function as a parameter.
esClient.bulk(
  {body: bulkBody},
  function(err, response) {
    if (err) { console.error(err); return; }
    console.log(`Inside bulk3...`);
    let errorCount = 0;
    response.items.forEach(item => {
      if (item.index && item.index.error) {
        console.log(++errorCount, item.index.error);
      }
    });
    console.log(`Successfully indexed ${data.length - errorCount} out of ${data.length} items`);
  }
)
Or use util.promisify to convert a function accepting an (err, value) => ... style callback into a function that returns a promise:
const util = require('util')
// bind so the client keeps its `this` context when the method is promisified
const esClientBulk = util.promisify(esClient.bulk.bind(esClient))
esClientBulk({body: bulkBody})
  .then(...)
  .catch(...)
EDIT: Just found out that elasticsearch-js supports both callbacks and promises. So this should not be an issue.
Looking at the package.json of the project you've linked, it uses elasticsearch-js version ^11.0.1, which is an old version that sends bulk requests with the application/x-ldjson Content-Type header; newer Elasticsearch versions don't support that header. Upgrading elasticsearch-js to a newer version (the current latest is 14.0.0) should fix it.
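If you go the upgrade route, it's just a dependency bump in the tutorial project (the exact version is whatever is current for you; 14.x was simply the latest at the time):
npm install elasticsearch@latest --save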

How can I have multiple API endpoints for one Google Cloud Function?

I have a Google Cloud Function which contains multiple modules to be invoked on different paths.
I am using the serverless framework to deploy my functions, but it has the limitation of only one path per function.
I want to use multiple paths in one function just like we can in the AWS serverless framework.
Suppose a user Cloud Function has two paths, /user/add and /user/remove; both paths should invoke the same function.
Something like this:
serverless.yml
functions:
user:
handler: handle
events:
- http: user/add
- http: user/remove
How can I have multiple API endpoints for one GCF?
Indeed, there is no actual REST routing layer backing Google Cloud Functions; out of the box you only get plain HTTP triggers.
To work around this, I'm using my request payload to determine which action to perform: in the body, I add a key named "path".
For example, consider the Function USER.
To add a user:
{
  "path": "add",
  "body": {
    "first": "John",
    "last": "Doe"
  }
}
To remove a user:
{
  "path": "remove",
  "body": {
    "first": "John",
    "last": "Doe"
  }
}
If your operations are purely CRUD, you can use request.method which offers verbs like GET, POST, PUT, DELETE to determine operations.
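A minimal dispatcher along those lines could look like the sketch below; the add/remove handlers and the user entry point name are made up for illustration, not taken from the question:
// index.js (sketch)
const handlers = {
  // stand-ins for the real add/remove logic
  add: (body, res) => res.json({ message: 'user added', user: body }),
  remove: (body, res) => res.json({ message: 'user removed', user: body }),
};

exports.user = (req, res) => {
  const { path, body } = req.body || {};
  const handler = handlers[path];
  if (!handler) {
    return res.status(400).json({ error: `unknown path: ${path}` });
  }
  return handler(body, res);
};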
You could use Firebase Hosting to rewrite URLs.
In your firebase.json file:
"hosting": {
"rewrites": [
{
"source": "/api/v1/your/path/here",
"function": "your_path_here"
}
]
}
Keep in mind this is a workaround and it has a major drawback: you will pay for a double hit. Consider this if your app has to scale.
You can write your functions in different runtimes. The Node.js runtime uses the Express framework. So you can use its router to build different routes within a single function.
Add the dependency
npm install express@4.17.1
The following example uses TypeScript. Follow these guidelines to initialize a TypeScript project.
// index.ts
import { HttpFunction } from '@google-cloud/functions-framework';
import * as express from 'express';
import foo from './foo';

const app = express();
const router = express.Router();

app.use('/foo', foo)

const index: HttpFunction = (req, res) => {
  res.send('Hello from the index route...');
};
router.get('', index)

app.use('/*', router)

export const api = app

// foo.ts
import { HttpFunction } from '@google-cloud/functions-framework';
import * as express from 'express';

const router = express.Router();

const foo: HttpFunction = (req, res) => {
  res.send('Hello from the foo route...');
};
router.get('', foo)

export default router;
To deploy, run:
gcloud functions deploy YOUR_NAME \
--runtime nodejs16 \
--trigger-http \
--entry-point api \
--allow-unauthenticated
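Once deployed, each route is just a sub-path of the function URL (the host below is illustrative):
curl https://REGION-PROJECT.cloudfunctions.net/YOUR_NAME/foo
# Hello from the foo route...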
Currently, Google allows only one event definition per function.
Express can be installed with npm i express, then imported and used more or less as normal to handle routing:
const express = require("express");
const app = express();

// enable CORS if desired
app.use((req, res, next) => {
  res.set("Access-Control-Allow-Origin", "*");
  next();
});

app.get("/", (req, res) => {
  res.send("hello world");
});

exports.example = app; // `example` is whatever your GCF entrypoint is
If Express isn't an option for some reason or the use case is very simple, a custom router may suffice.
If parameters or wildcards are involved, consider using route-parser. A deleted answer suggested this app as an example.
The Express request object has a few useful parameters you can take advantage of:
req.method which gives the HTTP verb
req.path which gives the path without the query string
req.query object of the parsed key-value query string
req.body the parsed JSON body
Here's a simple proof-of-concept to illustrate:
const routes = {
  GET: {
    "/": (req, res) => {
      const name = (req.query.name || "world");
      res.send(`<!DOCTYPE html>
<html lang="en"><body><h1>
hello ${name.replace(/[\W\s]/g, "")}
</h1></body></html>
`);
    },
  },
  POST: {
    "/user/add": (req, res) => { // TODO stub
      res.json({
        message: "user added",
        user: req.body.user
      });
    },
    "/user/remove": (req, res) => { // TODO stub
      res.json({message: "user removed"});
    },
  },
};

exports.example = (req, res) => {
  if (routes[req.method] && routes[req.method][req.path]) {
    return routes[req.method][req.path](req, res);
  }
  res.status(404).send({
    error: `${req.method}: '${req.path}' not found`
  });
};
Usage:
$ curl https://us-east1-foo-1234.cloudfunctions.net/example?name=bob
<!DOCTYPE html>
<html lang="en"><body><h1>
hello bob
</h1></body></html>
$ curl -X POST -H "Content-Type: application/json" --data '{"user": "bob"}' \
> https://us-east1-foo-1234.cloudfunctions.net/example/user/add
{"message":"user added","user":"bob"}
If you run into trouble with CORS and/or preflight issues, see Google Cloud Functions enable CORS?