How do I read environment variables in my Appwrite functions?

In the Appwrite console, I'm adding a test environment variable to pass into a function...
In my function code (Node.js, index.js), I'm logging out the value of the above variable...
I save the code and use the Appwrite CLI (createTag) to push/publish the code.
Then, in the Appwrite console, I activate the new function, execute it, and see this in the log...
Clearly I'm missing something, but I've searched the Appwrite docs and I don't see it.
What am I doing incorrectly?
Thank you for helping :-)

OK, it looks like, as of this post, this is a bug in the Appwrite web UI code. You can do this two ways right now: you can set the environment variables in code, or you can use the Appwrite CLI. I ended up putting the CLI command in my Node.js package.json scripts for quick, easy access.
Here are both ways that worked for me...
appwrite functions create --functionId=regions_get_all --name=regions_get_all --execute=[] --runtime=node-16.0 --vars='{ "LT_API_ENDPOINT": "https://appwrite.league-tracker.com/v1", "LT_PROJECT_ID": "61eb...7e4ff", "LT_FUNCTIONS_SECRET": "3b4b478e5a5576c1...ef84ba44e5fc2261cb8a8b3bfee" }'
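For reference, the package.json shortcut mentioned above is just an npm script wrapping that same command. A sketch (the script name is my own choice, the --vars JSON has to be quoted for your shell, and the remaining variables are elided):
{
    "scripts": {
        "appwrite:create": "appwrite functions create --functionId=regions_get_all --name=regions_get_all --execute=[] --runtime=node-16.0 --vars='{ \"LT_API_ENDPOINT\": \"https://appwrite.league-tracker.com/v1\" }'"
    }
}
You can then run it with npm run appwrite:create.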
const sdk = require('node-appwrite');

const endpoint = 'https://appwrite.league-tracker.com/v1';
const projectId = '61eb3...7e4ff';
const functionsSecret = '3b4b478e5a557ab8a...c121ff21977a';
const functionId = process.argv[2];
const name = process.argv[2]; // reuse the ID as the function's display name
const execute = [];
const runtime = 'node-16.0';
const envVars = {
    "LT_API_ENDPOINT": endpoint,
    "LT_PROJECT_ID": projectId,
    "LT_FUNCTIONS_SECRET": functionsSecret
};

// Init SDK
const client = new sdk.Client();
const functions = new sdk.Functions(client);

client
    .setEndpoint(endpoint)   // Your API endpoint
    .setProject(projectId)   // Your project ID
    .setKey('33facd6c0d792e...359362efbc35d06bfaa'); // Your secret API key

functions.get(functionId)
    .then(func => {
        // Guard against clobbering a function that already exists
        if (typeof func === 'object' && func['$id'] === functionId) {
            throw new Error(`Function '${functionId}' already exists. Cannot 'create'.`);
        }
        // Create the function with its environment variables
        return functions.create(functionId, name, execute, runtime, envVars);
    })
    .then(response => console.log(response))
    .catch(error => console.error(`>>> ERROR! ${error}`));
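Since the script reads the function ID from process.argv[2], you run it with the ID as the first argument, along these lines (the file name is whatever you saved the script as):
node create-function.js regions_get_all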

As of Appwrite 0.13.0, an Appwrite Function must expose a function that accepts a request and a response. To return data, use the response object and call either response.json() or response.send(). The request object has an env object containing all of the function's variables. Here is an example Node.js function:
module.exports = async (req, res) => {
    const payload =
        req.payload ||
        'No payload provided. Add custom data when executing function.';

    const secretKey =
        req.env.SECRET_KEY ||
        'SECRET_KEY environment variable not found. You can set it in Function settings.';

    const randomNumber = Math.random();
    const trigger = req.env.APPWRITE_FUNCTION_TRIGGER;

    res.json({
        message: 'Hello from Appwrite!',
        payload,
        secretKey,
        randomNumber,
        trigger,
    });
};
In the example above, you can see req.env.SECRET_KEY being referenced. For more information, refer to the Appwrite Functions docs.

Related

How to save a Stripe webhook response to MySQL

I am using Flutter on the client side and the Firebase extension to get the checkout URL for payment purposes.
This works fine, but I want to save the webhook response data to my MySQL DB. The Stripe webhook sends data on each event, and I want to store each event's response in my DB so that I can filter the data from there.
I am using Node.js on the backend, but I don't have much knowledge of Node.js.
I followed the Stripe Node.js documentation, which is where I am triggering the events from,
and this is the documentation code -
// server.js
//
// Use this sample code to handle webhook events in your integration.
//
// 1) Paste this code into a new file (server.js)
//
// 2) Install dependencies
//    npm install stripe
//    npm install express
//
// 3) Run the server on http://localhost:4242
//    node server.js
const stripe = require('stripe');
const express = require('express');
const app = express();

// This is your Stripe CLI webhook secret for testing your endpoint locally.
const endpointSecret = "whsec_a86c898114804f99965ae4a14a7a93731a4c0bdffa506fa3d24d73ac89dd56f3";

app.post('/webhook', express.raw({type: 'application/json'}), (request, response) => {
    const sig = request.headers['stripe-signature'];
    let event;

    try {
        event = stripe.webhooks.constructEvent(request.body, sig, endpointSecret);
    } catch (err) {
        response.status(400).send(`Webhook Error: ${err.message}`);
        return;
    }

    // Handle the event. Each case gets its own block so the const
    // declarations don't collide within the shared switch scope.
    switch (event.type) {
        case 'checkout.session.async_payment_failed': {
            const session = event.data.object;
            // Then define and call a function to handle the event checkout.session.async_payment_failed
            break;
        }
        case 'checkout.session.async_payment_succeeded': {
            const session = event.data.object;
            // Then define and call a function to handle the event checkout.session.async_payment_succeeded
            break;
        }
        case 'payment_intent.created': {
            const paymentIntent = event.data.object;
            // Then define and call a function to handle the event payment_intent.created
            break;
        }
        case 'payment_intent.processing': {
            const paymentIntent = event.data.object;
            // Then define and call a function to handle the event payment_intent.processing
            break;
        }
        // ... handle other event types
        default:
            console.log(`Unhandled event type ${event.type}`);
    }

    // Return a 200 response to acknowledge receipt of the event
    response.send();
});

app.listen(4242, () => console.log('Running on port 4242'));
Now, how can I connect to MySQL in each switch case and save the data to my MySQL DB?
(Note - I don't have prior knowledge of Node.js and MySQL; I have just read the documentation and am following everything.)
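One approach, sketched below, is to write each event into MySQL with the mysql2 package. Everything here is illustrative: the connection settings and the stripe_events table (with event_id, event_type, and payload columns) are assumptions, not part of the Stripe sample.
const mysql = require('mysql2/promise');

// Hypothetical connection settings - adjust to your own database
const pool = mysql.createPool({
    host: 'localhost',
    user: 'root',
    password: 'your_password',
    database: 'your_database'
});

// Persist one Stripe event; assumes a stripe_events(event_id, event_type, payload) table
async function saveEvent(event) {
    await pool.execute(
        'INSERT INTO stripe_events (event_id, event_type, payload) VALUES (?, ?, ?)',
        [event.id, event.type, JSON.stringify(event.data.object)]
    );
}
You could then call saveEvent(event) inside each switch case (or once before the switch, if you want every event stored) and await or .catch() it before sending the 200 response.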

Is it possible to perform an action with `context` on the init of the app?

I'm simply looking for something like this:
app.on('init', async context => {
    ...
})
Basically, I just need to make two calls to the GitHub API, but I'm not sure there is a way to do it without using the API client inside the Context object.
I ended up using probot-scheduler:
const createScheduler = require('probot-scheduler')

module.exports = app => {
    createScheduler(app, {
        delay: false
    })

    app.on('schedule.repository', context => {
        // this is called on startup and can access context
    })
}
I tried probot-scheduler, but it no longer seems to exist - perhaps it was removed in an update?
In any case, I managed to do it after lots of digging by using the actual app object - its .auth() method returns a promise containing the GitHubAPI interface:
https://probot.github.io/api/latest/classes/application.html#auth
module.exports = app => {
    // app.route() exposes an Express router for the app
    const router = app.route()

    router.get('/hello-world', async (req, res) => {
        const github = await app.auth()
        const result = await github.repos.listForOrg({ org: 'org name' })
        console.log(result)
    })
}
.auth() takes the ID of the installation if you wish to access private data. If called with no arguments, the client can only retrieve public data.
You can get the installation IDs by calling .auth() without parameters and then calling listInstallations():
const github = await app.auth();
const result = await github.apps.listInstallations();
console.log(result);
You get back an array that includes the IDs, which you can then pass to .auth().
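Putting the two together - a sketch, with the installation ID hard-coded for illustration:
// Authenticate as a specific installation to access its private data
const installationId = 12345; // hypothetical value taken from listInstallations()
const installationClient = await app.auth(installationId);
const repos = await installationClient.repos.listForOrg({ org: 'org name' });
console.log(repos);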

DialogflowSDK middleware return after resolving a promise

I'm currently playing around with the actions-on-google Node SDK and I'm struggling to work out how to wait for a promise to resolve in my middleware before it executes my intent. I've tried using async/await and returning a promise from my middleware function, but neither method appears to work. I know you typically wouldn't override the intent like I'm doing here, but this is to test what's going on.
const {dialogflow} = require('actions-on-google');
const functions = require('firebase-functions');

const app = dialogflow({debug: true});

function promiseTest() {
    return new Promise((resolve, reject) => {
        setTimeout(() => {
            resolve('Resolved');
        }, 2000);
    });
}

app.middleware(async (conv) => {
    let r = await promiseTest();
    conv.intent = r;
});

app.fallback(conv => {
    const intent = conv.intent;
    conv.ask("hello, your intent was " + intent);
});
It looks like I should at least be able to return a promise: https://actions-on-google.github.io/actions-on-google-nodejs/interfaces/dialogflow.dialogflowmiddleware.html
But I'm not familiar with TypeScript, so I'm not sure if I'm reading those docs correctly.
Is anyone able to advise how to do this correctly? For instance, a real-life case might be that I need to make a DB call in my middleware and wait for it to return before proceeding to the next step.
My function is using the Node.js 8 beta in Google Cloud Functions.
The output of this code is whatever the actual intent was (e.g. the Default Welcome Intent) rather than "Resolved", but there are no errors. So the middleware fires, but then moves on to the fallback intent before the promise resolves, i.e. before conv.intent = r is set.
Async stuff is really fiddly with the V2 API, and for me it only worked properly with Node.js 8. The reason is that from V2 onwards, unless you return the promise, the action returns empty, as it finishes before the rest of the function is evaluated. There is a lot to work through to figure it out; here's some sample boilerplate I have that should get you going:
'use strict';

const functions = require('firebase-functions');
const {WebhookClient} = require('dialogflow-fulfillment');
const {BasicCard, MediaObject, Card, Suggestion, Image, Button} = require('actions-on-google');
var http_request = require('request-promise-native');

process.env.DEBUG = 'dialogflow:debug'; // enables lib debugging statements

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
    const agent = new WebhookClient({ request, response });
    console.log('Dialogflow Request headers: ' + JSON.stringify(request.headers));
    console.log('Dialogflow Request body: ' + JSON.stringify(request.body));

    function welcome(agent) {
        agent.add(`Welcome to my agent!`);
    }

    function fallback(agent) {
        agent.add(`I didn't understand`);
        agent.add(`I'm sorry, can you try again?`);
    }

    function handleMyIntent(agent) {
        let conv = agent.conv();
        let key = request.body.queryResult.parameters['MyParam'];
        var myAgent = agent;

        // Returning the promise is what keeps the action alive until the
        // HTTP request completes
        return new Promise((resolve, reject) => {
            http_request('http://someurl.com').then(async function(apiData) {
                if (key === 'Hey') {
                    conv.close('Howdy');
                } else {
                    conv.close('Bye');
                }
                myAgent.add(conv);
                return resolve();
            }).catch(function(err) {
                conv.close(' \nUh, oh. There was an error, please try again later');
                myAgent.add(conv);
                return resolve();
            });
        });
    }

    let intentMap = new Map();
    intentMap.set('Default Welcome Intent', welcome);
    intentMap.set('Default Fallback Intent', fallback);
    intentMap.set('myCustomIntent', handleMyIntent);
    agent.handleRequest(intentMap);
});
A brief overview of what you need:
You have to return the promise resolution.
You have to use the 'request-promise-native' package for HTTP requests.
You have to upgrade your plan to allow for outbound HTTP requests (https://firebase.google.com/pricing/).
So it turns out my issue was to do with an outdated version of the actions-on-google SDK. The Dialogflow Firebase example was using v2.0.0; changing this to 2.2.0 in package.json resolved the issue.
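That is, updating the dependency (and redeploying):
npm install actions-on-google@2.2.0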

Testing saves data in database

Hi, I am trying to run Mocha and Chai tests against my Node.js router, which saves a user in a MySQL database and then returns the same data back.
The problem I am facing at the moment is that I would not like to save the information to the database when I run the tests locally, and when I use a continuous integration service like Travis CI, the tests fail since there is no database connection. I would like to know how I can test the database save with the current setup without actually saving to the database.
Basically, that means having a fake, virtual database to save to, or stubbing out the save entirely.
I read that Sinon.js can help, but I am not quite sure how to use it.
Here is my code:
var expect = require('chai').expect;
var faker = require('faker');
const request = require('supertest');
const should = require('should');
const sinon = require('sinon');
const helper = require('../../models/user');
const app = require('../../app'); // the Express app under test (path assumed)

describe('POST /saveUser', () => {
    it('should save a new user', (done) => {
        request(app)
            .post('/saveUser')
            .send({
                Owner: faker.random.number(),
                firstname: faker.name.firstName(),
                lastname: faker.name.lastName(),
                email: faker.internet.email(),
                password: faker.random.number(),
                token: faker.random.uuid()
            })
            .expect(200)
            .expect((res) => {
                expect(res.body.firstname).to.be.a("string");
                expect(res.body.lastname).to.be.a("string");
                expect(res.body.Owner).to.be.a("number");
            })
            .end(done);
    });
});
This is the router:
router.post('/saveUser', (req, res, next) => {
    saveUser(req.body).then((result) => {
        return res.send(req.body);
    }).catch((e) => {
        return res.send('All info not saved');
    });
});
And here is the model:
saveUser = (userinfo) => new Promise((resolve, reject) => {
    db.query('INSERT INTO user SET ?', userinfo, function(error, results, fields) {
        if (error) {
            reject();
        } else {
            resolve(userinfo);
        }
    });
});
What you are describing is a stub. With Sinon you can stub methods and have them call fake implementations instead, like this:
sinon.stub(/* module that implements saveUser */, 'saveUser').callsFake(() => Promise.resolve());
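In context, that could look something like the sketch below. Note the assumption: saveUser must be exported from ../../models/user (in the code above it is assigned as a bare global, so you would need to export it, and have the router call it through the module, for the stub to take effect):
const sinon = require('sinon');
const userModel = require('../../models/user'); // assumed to export saveUser

describe('POST /saveUser', () => {
    let saveUserStub;

    beforeEach(() => {
        // Replace the real saveUser with a fake that resolves immediately,
        // so no MySQL connection is ever made
        saveUserStub = sinon.stub(userModel, 'saveUser')
            .callsFake((userinfo) => Promise.resolve(userinfo));
    });

    afterEach(() => {
        // Put the real implementation back after each test
        saveUserStub.restore();
    });

    // ...run the supertest request exactly as before; the router now
    // "saves" through the stub instead of the database
});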

Using Streams in MySQL with Node

Following the example on Piping results with Streams2, I'm trying to stream results from MySQL to stdout in Node.js.
Code looks like this:
connection.query('SELECT * FROM table')
    .stream()
    .pipe(process.stdout);
I get this error: TypeError: invalid data
Explanation
From this GitHub issue for the project:
.stream() returns stream in "objectMode". You can't pipe it to stdout or network socket because "data" events have rows as payload, not Buffer chunks
Fix
You can fix this using the csv-stringify module.
var stringify = require('csv-stringify');
var stringifier = stringify();

connection.query('SELECT * FROM table')
    .stream()
    .pipe(stringifier)
    .pipe(process.stdout);
Notice the extra .pipe(stringifier) before the .pipe(process.stdout).
There is another solution now with the introduction of pipeline in Node v10 (view documentation).
The pipeline method does several things:
Allows you to pipe through as many streams as you like.
Provides a callback once completed.
Importantly, it provides automatic cleanup, which is a benefit over the standard pipe method.
const fs = require('fs')
const mysql = require('mysql')
const {pipeline} = require('stream')
const stringify = require('csv-stringify')

const stringifier = stringify()
const output = fs.createWriteStream('query.csv')
const connection = mysql.createConnection(...)
const input = connection.query('SELECT * FROM table').stream()

// Pipe the row stream through the CSV stringifier into the file,
// with a single callback for both errors and completion
pipeline(input, stringifier, output, err => {
    if (err) {
        console.log(err)
    } else {
        console.log('Output complete')
    }
})