Slack webhooks cause cls-hooked request context to orphan mysql connections - mysql

The main issue:
We have a lovely little express app, which has been crushing it for months with no issues. We manage our DB connections by opening a connection on demand, but then caching it "per request" using the cls-hooked library. Upon the request ending, we release the connection so our connection pool doesn't run out. Classic. Over the course of months and many connections, we've never "leaked" connections. Until now! Enter... slack! We are using the slack event handler as follows:
app.use('/webhooks/slack', slackEventHandler.expressMiddleware());
and we sort of think of it like any other request; however, Slack requests seem to play weirdly with our cls-hooked usage. For example, we use ts-node and nodemon to run our app locally (i.e. you change code, the app restarts automatically). Every time the app restarts locally on our dev machines and you try to play with Slack events, when our middleware that releases the connection tries to do so, it suddenly thinks there is nothing in the session. When you then hit a normal endpoint... it works fine and essentially seems to reset Slack to working okay again. We are now scared to go to prod with our Slack integration, because we're worried our Slack "requests" are going to starve our connection pool.
Background
Relevant subset of our package.json:
{
  "@slack/events-api": "^2.3.2",
  "@slack/web-api": "^5.8.0",
  "express": "~4.16.1",
  "cls-hooked": "^4.2.2",
  "mysql2": "^2.0.0"
}
The middleware that makes the cls-hooked session
import { session } from '../db';
const context = (req, res, next) => {
session.run(() => {
session.bindEmitter(req);
session.bindEmitter(res);
next();
});
};
export default context;
The middleware that releases our connections
export const dbReleaseMiddleware = async (req, res, next) => {
res.on('finish', async () => {
const conn = session.get('conn');
if (conn) {
incrementConnsReleased();
await conn.release();
}
});
next();
};
the code that creates the connection on demand and stores it in "session"
const poolConn = await pool.getConnection();
if (session.active) {
session.set('conn', poolConn);
}
return poolConn;
the code that sets up the session in the first place
export const session = clsHooked.createNamespace('our_company_name');
If you got this far, congrats. Any help appreciated!
Side note: you couldn't pay me to write a more confusing title...

Figured it out! We seem to have identified the following behavior in the Node version of Slack's API (it seems to only happen on Mac computers... sometimes).
The issue is that this is in the context of an express app, so Slack is managing the interface between its own event handler system + the http side of things with express (e.g. returning 200, or 500, or whatever). So what seems to happen is...
// you have some slack event handler
slackEventHandler.on('message', async (rawEvent: any) => {
  let i = 0;
  i = i + 1;
  // at this point, the http request has not returned 200, it is "pending" from express's POV
  await myService.someMethod();
  // ^^ while this was doing its async thing, the express request returned 200.
  // so things like res.on('finish') all fired and all your middleware ran
  // but your event handler code is still going
});
So we ended up creating a manual call to release connections in our slack event handlers. Weird!
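For illustration, a minimal sketch of what that manual release might look like, assuming the cls-hooked session namespace and the release bookkeeping from the question are importable, and assuming the connection is still reachable through the session inside the handler (myService is a stand-in for your own code):
// Sketch only: release the pooled connection at the end of the Slack handler,
// because res.on('finish') may have already fired before this code finishes.
slackEventHandler.on('message', async (rawEvent: any) => {
  try {
    await myService.someMethod();
  } finally {
    // The normal dbReleaseMiddleware may have found nothing in the session,
    // so check for a connection that is still cached here and release it.
    const conn = session.get('conn');
    if (conn) {
      incrementConnsReleased();
      await conn.release();
    }
  }
});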


SERVICE WORKER: The service worker navigation preload request failed with network error: net::ERR_INTERNET_DISCONNECTED in Chrome 89

I have a problem with my Service Worker.
I'm currently implementing offline functionality with an offline.html site to be shown in case of network failure. I have implemented Navigation Preloads as described here: https://developers.google.com/web/updates/2017/02/navigation-preload#activating_navigation_preload
Here is my install EventListener, where I call skipWaiting() and initialize a new cache:
const version = 'v.1.2.3'
const CACHE_NAME = '::static-cache'
const urlsToCache = ['index~offline.html', 'favicon-512.png']
self.addEventListener('install', function(event) {
self.skipWaiting()
event.waitUntil(
caches
.open(version + CACHE_NAME)
.then(function(cache) {
return cache.addAll(urlsToCache)
})
.then(function() {
console.log('WORKER: install completed')
})
)
})
Here is my activate EventListener, where I feature-detect navigationPreload and enable it. Afterwards I check for old caches and delete them:
self.addEventListener('activate', event => {
  console.log('WORKER: activated')
  event.waitUntil(
    (async function() {
      // Feature-detect
      if (self.registration.navigationPreload) {
        // Enable navigation preloads!
        console.log('WORKER: Enable navigation preloads')
        await self.registration.navigationPreload.enable()
      }
      // Check for old caches and delete them
      const cacheNames = await caches.keys()
      await Promise.all(
        cacheNames
          .filter(cacheName => cacheName !== version + CACHE_NAME)
          .map(cacheName => {
            console.log(cacheName + ' CACHE deleted')
            return caches.delete(cacheName)
          })
      )
    })()
  )
})
This is my fetch eventListener
self.addEventListener('fetch', event => {
const { request } = event
// Always bypass for range requests, due to browser bugs
if (request.headers.has('range')) return
event.respondWith(
(async function() {
// Try to get from the cache:
const cachedResponse = await caches.match(request)
if (cachedResponse) return cachedResponse
try {
const response = await event.preloadResponse
if (response) return response
// Otherwise, get from the network
return await fetch(request)
} catch (err) {
// If this was a navigation, show the offline page:
if (request.mode === 'navigate') {
console.log('Err: ',err)
console.log('Request: ', request)
return caches.match('index~offline.html')
}
// Otherwise throw
throw err
}
})()
)
})
Now my Problem:
On my local machine on localhost everything works just as it should. If the network is offline, the index~offline.html page is delivered to the user. If I deploy to my test server, everything works as expected as well, except for a strange error message in Chrome during normal browsing (not offline mode):
The service worker navigation preload request failed with network error: net::ERR_INTERNET_DISCONNECTED.
I logged the error and the request to get more information
Error:
DOMException: The service worker navigation preload request failed with a network error.
Request:
It's strange because somehow index.html is requested no matter which site is loaded.
Additional information: this is happening in Chrome 89; in Chrome 88 everything seems fine (I checked in BrowserStack). I just saw there was a change in PWA offline detection in Chrome 89...
https://developer.chrome.com/blog/improved-pwa-offline-detection/
Does anybody have an idea what the problem might be?
Update
I rebuilt the problem here so everybody can check it out: https://dreamy-leavitt-bd4f0e.netlify.app/
This error is directly caused by the improved PWA offline detection you linked to:
https://developer.chrome.com/blog/improved-pwa-offline-detection/
The browser fakes an offline context and tries to request the start_url of your manifest, e.g. the index.html specified in your https://dreamy-leavitt-bd4f0e.netlify.app/site.webmanifest
This is to make sure that your service worker is actually returning a valid 200 response in this situation, i.e. the valid cached response for your index~offline.html page.
The error you're asking about specifically comes from the await event.preloadResponse part, and it apparently can't be suppressed.
The await fetch call produces a similar error, but that one can be suppressed: just don't console.log in the catch section.
Hopefully Chrome won't show this error from preload responses during offline PWA detection in the future, as it's needlessly confusing.
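For reference, a minimal sketch of the fetch handler from the question with the logging removed from the navigation fallback, so the suppressible fetch error stays quiet during Chrome's offline probe (same caching logic as above, nothing else changed):
self.addEventListener('fetch', event => {
  const { request } = event
  // Always bypass for range requests, due to browser bugs
  if (request.headers.has('range')) return
  event.respondWith(
    (async function() {
      // Try to get from the cache:
      const cachedResponse = await caches.match(request)
      if (cachedResponse) return cachedResponse
      try {
        const response = await event.preloadResponse
        if (response) return response
        // Otherwise, get from the network
        return await fetch(request)
      } catch (err) {
        // If this was a navigation, fall back to the offline page
        // without logging, so the faked offline request isn't surfaced
        if (request.mode === 'navigate') {
          return caches.match('index~offline.html')
        }
        // Otherwise throw
        throw err
      }
    })()
  )
})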

Issue listening for custom event via puppeteer

I am currently working on a GitLab CI test environment, and I have a test harness which we use to test our SDK. I have set up a custom event that is fired on the page and designates the end of the test run. In my puppeteer implementation I want to listen for this custom event "TEST_COMPLETE".
I have not been successful in getting this to work, so I figured I would at least make sure the custom-event.js example in the puppeteer repo worked, and there too I am not seeing what I believe I should be getting. I cloned the main repo below and performed an npm install. When I execute the JS test below with headless: false and don't close the browser, I do not see any log in the console that shows any custom event being fired.
It is my understanding that I should see some console event message with 'fired' and then 'app-ready' event and info, but this is not the case. Even if I interact with the page I don't see anything outside of some 'features_loaded' and 'features_unveil' logs.
https://github.com/puppeteer/puppeteer/blob/main/examples/custom-event.js
Anyone able to get the expected behavior with this code today? Not sure if this worked previously and has broken since, or if I am just doing something wrong. Any info would be of great help, thanks!
Not sure if this is what you need, but I can get the message 'TEST_COMPLETE fired.' in Node.js console with this simplified code (puppeteer 8.0.0):
import puppeteer from 'puppeteer';
const browser = await puppeteer.launch();
try {
const [page] = await browser.pages();
await page.goto('https://example.org/');
await page.exposeFunction('onCustomEvent', async (type) => {
console.log(`${type} fired.`);
await browser.close();
});
await page.evaluate(() => {
document.addEventListener('TEST_COMPLETE', (e) => {
window.onCustomEvent('TEST_COMPLETE');
});
document.dispatchEvent(new Event('TEST_COMPLETE'));
});
} catch (err) { console.error(err); }

How to link Node.js Post script to HTML form?

I have created a RESTful API, which works as I would expect when I test it in Postman. I run it from an index.js file which has the routes set up as in the file below.
const config = require('config');
const mongoose = require('mongoose');
const users = require('./routes/users');
const auth = require('./routes/auth');
const express = require('express');
const app = express();

//mongoose.set();
if (!config.get('jwtPrivateKey')) {
  console.log('Fatal ERROR: jwtPrivateKey key is not defined');
  process.exit(1);
}

// `uri` holds the MongoDB connection string (e.g. read from config or an environment variable)
mongoose.connect(uri, {
  useNewUrlParser: true,
  useUnifiedTopology: true,
  useCreateIndex: true
})
  .then(() => console.log('Connected to MongoDB...'))
  .catch(err => console.log('Not Connected, bad ;(', err));

app.use(express.json());

// This is only for posting the user, e.g. registering them
app.use('/api/users', users);
app.use('/api/auth', auth);

const port = process.env.PORT || 3000;
app.listen(port, () => console.log(`Listening on port ${port}...`));
The real work is happening here. Testing this in Postman, I could establish that the values are saved in MongoDB.
router.post('/', async (req, res) => {
  // Validates the request.
  const { error } = validate(req.body);
  if (error) return res.status(400).send(error.details[0].message);

  let user = await User.findOne({ email: req.body.email });
  if (user) return res.status(400).send('User already registered, try again!');

  user = new User(_.pick(req.body, ['firstName', 'lastName', 'email', 'password', 'subscription']));
  const salt = await bcrypt.genSalt(15);
  user.password = await bcrypt.hash(user.password, salt);

  // Here the user is being saved in the database.
  await user.save();

  //const token = user.generateAuthToken();
  //const token = jwt.sign({_id: user._id}, config.get('jwtPrivateKey'));
  const token = user.generateAuthToken();

  // We are sending the authentication token in the header, and the information back to the client
  res.header('x-auth-token', token).send(_.pick(user, ['_id', 'firstName', 'lastName', 'email', 'subscription']));
});
Now my questions are:
How can I call the second code block from a form in one particular HTML file? When using action="path to the users.js", the browser opens the JS file's code but doesn't do anything.
Do I need to rewrite the POST block so that it also includes the connection details for the DB? And would this mean keeping the connection to MongoDB open once I insert, read, etc.? Wouldn't this eat a lot of resources if multiple users were to log in at the same time, for example?
Or is there a way I can use the index.js and the users.js (which is referenced in the index.js file) together?
All of these are theoretical questions, as I am not quite sure how to use, from HTML, the API that I created while walking through a tutorial.
Do I need to change the approach here?
After some long hours I finally understood my own issue and question.
What I wanted to achieve was to post data from an HTML page into MongoDB through the API (I assume that is the best way to describe it).
In order to do this I needed to:
Started the server for the API, e.g. nodemon index.js, which has the information regarding the API.
Opened VS Code, opened the terminal and started the API server (if I can call it that).
Opened CMD and started a local host for index.html by navigating to its folder and running http-server; now I could access it at http://127.0.0.1:8080.
For the form in register.html, I needed to post to the API (see the sketch below).
This is the part which I didn't understand, but now it makes sense. Basically I start the API server separately, and once it is started I can use e.g. Postman and other apps which can access this link. I somehow thought HTML needed some more direct calls.
So after the local host is started, register.html knows where to post the data via the API.
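For what it's worth, a minimal sketch of client-side code that register.html could use to post the form fields as JSON to the users route (the form id, the element ids and the http://localhost:3000 origin are assumptions; since the server uses express.json(), the body is sent as JSON, and posting across origins like this also requires CORS to be enabled on the API):
// Sketch only: post the register form as JSON to the users route.
// Assumes register.html contains a form with id "registerForm" and inputs
// with ids firstName, lastName, email and password.
document.querySelector('#registerForm').addEventListener('submit', async (event) => {
  event.preventDefault(); // stop the default form navigation

  const body = {
    firstName: document.querySelector('#firstName').value,
    lastName: document.querySelector('#lastName').value,
    email: document.querySelector('#email').value,
    password: document.querySelector('#password').value,
  };

  const response = await fetch('http://localhost:3000/api/users', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
  });

  if (response.ok) {
    // The x-auth-token header is set by the route above
    console.log('Registered, token:', response.headers.get('x-auth-token'));
  } else {
    console.error('Registration failed:', await response.text());
  }
});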
Now I have a JOI validate issue, though on a different more simple case this worked, so I just need to fix the code there.
Thank You For reading through and Apologize if was not clear, still learning the terminology!

net::ERR_CONNECTION_RESET with service worker in Chrome

I have a very simple service worker to add offline support. The fetch handler looks like
self.addEventListener("fetch", function (event) {
var url = event.request.url;
event.respondWith(fetch(event.request).then(function (response) {
//var cacheResponse: Response = response.clone();
//caches.open(CURRENT_CACHES.offline).then((cache: Cache) => {
// cache.put(url, cacheResponse).catch(() => {
// // ignore error
// });
//});
return response;
}).catch(function () {
// check the cache
return getCachedContent(event.request);
}));
});
Intermittently we are seeing a net::ERR_CONNECTION_RESET error for a particular script we load into the page when online. The error is not coming from the server, as the service worker is picking up the file from the browser cache. Chrome's network tab shows that the service worker has successfully fetched the file from the disk cache, but the request from the browser to the service worker shows as (failed).
Does anyone know the underlying issue causing this? Is there a problem with my service worker implementation?
This is likely due to a bug in Chrome (and potentially other browsers as well) that could result in a garbage collection event removing a reference to the response stream while it's still being read.
The fix in Chrome is being tracked at https://bugs.chromium.org/p/chromium/issues/detail?id=934386.

Raspberry PI server - GPIO ports status JSON response

I've been struggling with this for a couple of days. The question is simple: is there a way I can create a server on a Raspberry Pi that will return the current status of the GPIO ports in JSON format?
Example:
http://192.168.1.109:3000/led
{
"Name": "green led",
"Status": "on"
}
I found the Adafruit gpio-stream library useful but don't know how to send the data in JSON format.
Thank you
There are a variety of libraries for GPIO interaction in Node.js. One issue is that you might need to run as root to have access to GPIO, unless you can adjust the read access for those devices. This is supposed to be fixed in the latest version of Raspbian, though.
I recently built a node.js application that was triggered from a motion sensor, in order to activate the screen (and deactivate it after a period of time). I tried various gpio libraries but the one that I ended up using was "onoff" https://www.npmjs.com/package/onoff mainly because it seemed to use an appropriate way to identify changes on the GPIO pins (using interrupts).
Now, you say that you want to send data, but you don't specify how that is supposed to happen. If we use the example that you want to send data using a POST request via HTTP, and send the JSON as body, that would mean that you would initialize the GPIO pins that you have connected, and then attach event handlers for them (to listen for changes).
Upon a change, you would invoke the http request and serialize the JSON from a javascript object (there are libraries that would take care of this as well). You would need to keep a name reference yourself since you only address the GPIO pins by number.
Example:
var GPIO = require('onoff').Gpio;
var request = require('request');
var x = new GPIO(4, 'in', 'both');
function exit() {
x.unexport();
}
x.watch(function (err, value) {
if (err) {
console.error(err);
return;
}
request({
uri: 'http://example.org/',
method: 'POST',
json: true,
body: { x: value } // This is the actual JSON data that you are sending
}, function () {
// this is the callback from when the request is finished
});
});
process.on('SIGINT', exit);
I'm using the npm modules onoff and request. request is used for simplifying the JSON serialization over an HTTP request.
As you can see, I only set up one GPIO here. If you need to track multiple pins, you must make sure to initialize them all, distinguish them with some sort of name, and also remember to unexport them in the exit callback. Not sure what happens if you don't do it, but you might lock them for other processes.
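A rough sketch of what that could look like with two named input pins (the pin numbers and names are just examples):
var GPIO = require('onoff').Gpio;

// Keep a name next to each pin, since onoff only addresses pins by number.
var pins = {
  'motion sensor': new GPIO(4, 'in', 'both'),
  'door switch': new GPIO(17, 'in', 'both')
};

function exit() {
  // Unexport every pin we initialized so the GPIOs are freed for other processes.
  Object.keys(pins).forEach(function (name) {
    pins[name].unexport();
  });
  process.exit();
}

Object.keys(pins).forEach(function (name) {
  pins[name].watch(function (err, value) {
    if (err) {
      console.error(name, err);
      return;
    }
    // Do whatever you need with the change, e.g. log it or send it somewhere.
    console.log(JSON.stringify({ Name: name, Status: value ? 'on' : 'off' }));
  });
});

process.on('SIGINT', exit);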
Thank you, this was very helpful. I did not express myself well, sorry for that. I don't want to send data (for now); I just want to enter a web address like 192.168.1.109/led and receive a JSON response. This is what I managed to do for now. I don't know if this is the right way. Please can you review this or suggest a better method?
var http = require('http');
var url = require('url');
var Gpio = require('onoff').Gpio;
var led = new Gpio(23, 'out');
http.createServer(function (req, res) {
res.writeHead(200, {'Content-Type': 'text/html'});
var command = url.parse(req.url).pathname.slice(1);
switch(command) {
case "on":
//led.writeSync(1);
var x = led.readSync();
res.write(JSON.stringify({ msgId: x }));
//res.end("It's ON");
res.end();
break;
case "off":
led.writeSync(0);
res.end("It's OFF");
break;
default:
res.end('Hello? yes, this is pi!');
}
}).listen(8080);
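For comparison, a rough sketch of the same idea that always answers with JSON in the shape from the original question (the pin number, the LED name and the port are assumptions):
var http = require('http');
var url = require('url');
var Gpio = require('onoff').Gpio;

var led = new Gpio(23, 'out');

http.createServer(function (req, res) {
  var command = url.parse(req.url).pathname.slice(1);
  // Every response is JSON, so set the matching content type once.
  res.writeHead(200, { 'Content-Type': 'application/json' });

  switch (command) {
    case 'led':
      // Report the current pin state in the shape from the question.
      res.end(JSON.stringify({ Name: 'green led', Status: led.readSync() === 1 ? 'on' : 'off' }));
      break;
    case 'on':
      led.writeSync(1);
      res.end(JSON.stringify({ Name: 'green led', Status: 'on' }));
      break;
    case 'off':
      led.writeSync(0);
      res.end(JSON.stringify({ Name: 'green led', Status: 'off' }));
      break;
    default:
      res.end(JSON.stringify({ message: 'Hello? yes, this is pi!' }));
  }
}).listen(8080);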