Redirect issues when using MySQL as session storage

I have set up a Node.js app that uses sessions and stores them in MySQL. With the default MemoryStore, redirects work fine, but with the MySQL store, req.session doesn't update until you reload or move to a different page, and I'm forced to replace every single res.redirect('/...') with a res.render() of that same page to display anything from req.session immediately.
I've tried both with and without return res.redirect(), as well as wrapping it in setTimeout; neither works. I can't figure it out, and I need the sessions to be stored in the DB.
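For reference, a MySQL-backed session setup generally looks something like this (a sketch; express-mysql-session is an assumed store package, the post doesn't say which one is in use):
const express = require('express');
const session = require('express-session');
// Assumption: express-mysql-session is the store; the post doesn't name one
const MySQLStore = require('express-mysql-session')(session);

const app = express();

app.use(session({
    secret: 'keyboard cat', // use a real secret
    store: new MySQLStore({ // writes hit MySQL asynchronously, unlike MemoryStore
        host: 'localhost',
        user: 'app',
        password: 'secret',
        database: 'sessions'
    }),
    resave: false,
    saveUninitialized: false
}));
The route in question: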
router.get('/student-sign-up', function (req, res, next) {
    res.render('student/signUp', {
        title: 'Sign up',
        errors: req.session.errors
    });
    req.session.errors = null; // to flush them on reload
}).post('/student-sign-up', function (req, res, next) {
    // Some form checks
    let errors = req.validationErrors();
    if (errors) {
        req.session.errors = errors;
        req.session.signUpSuccess = false;
        return res.redirect('/student-sign-up');
    }
    //...
});
The above should redirect to the same page and display the errors (I use Handlebars as my view engine) if there are any, but it simply redirects, and only if you refresh manually or submit a faulty form again does it display them. Same thing for logins (success doesn't go to the platform's home, and failure doesn't show errors either). It's like everything is lagging behind by one step...

OK, I found the solution. According to the express-session docs, all I had to do was force a save and then redirect, like so:
req.session.save((err) => {
    if (err) {
        res.locals.error = err;
        return res.redirect('/');
    }
    return res.redirect('/next-section');
});
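Applied to the sign-up route from the question, the POST handler ends up looking something like this:
router.post('/student-sign-up', function (req, res, next) {
    // Some form checks
    let errors = req.validationErrors();
    if (errors) {
        req.session.errors = errors;
        req.session.signUpSuccess = false;
        // Wait for the MySQL store to finish writing before redirecting,
        // so the GET handler sees the errors immediately
        return req.session.save(function (err) {
            if (err) return next(err);
            res.redirect('/student-sign-up');
        });
    }
    //...
});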
I'll leave this here for anyone that might have the same issue!


Slack webhooks cause cls-hooked request context to orphan mysql connections

The main issue:
We have a lovely little Express app, which has been crushing it for months with no issues. We manage our DB connections by opening a connection on demand, then caching it "per request" using the cls-hooked library. When the request ends, we release the connection so our connection pool doesn't run out. Classic. Over the course of months and many connections, we've never "leaked" connections. Until now! Enter... Slack! We are using the Slack event handler as follows:
app.use('/webhooks/slack', slackEventHandler.expressMiddleware());
and we sort of think of it like any other request. However, Slack requests seem to play weirdly with our cls-hooked usage.
For example, we use ts-node and nodemon to run our app locally (i.e. you change code, the app restarts automatically). Every time the app restarts locally on our dev machines and you try to play with Slack events, our middleware that releases the connection suddenly thinks there is nothing in the session when it tries to do so. When you then use a normal endpoint... it works fine, and essentially seems to reset Slack to working okay again. We are now scared to go to prod with our Slack integration, because we're worried our Slack "requests" are going to starve our connection pool.
Background
Relevant subset of our package.json:
{
    "@slack/events-api": "^2.3.2",
    "@slack/web-api": "^5.8.0",
    "express": "~4.16.1",
    "cls-hooked": "^4.2.2",
    "mysql2": "^2.0.0"
}
The middleware that makes the cls-hooked session
import { session } from '../db';

const context = (req, res, next) => {
    session.run(() => {
        session.bindEmitter(req);
        session.bindEmitter(res);
        next();
    });
};

export default context;
The middleware that releases our connections
export const dbReleaseMiddleware = async (req, res, next) => {
    res.on('finish', async () => {
        const conn = session.get('conn');
        if (conn) {
            incrementConnsReleased();
            await conn.release();
        }
    });
    next();
};
The code that creates the connection on demand and stores it in "session"
const poolConn = await pool.getConnection();
if (session.active) {
    session.set('conn', poolConn);
}
return poolConn;
The code that sets up the session in the first place
export const session = clsHooked.createNamespace('our_company_name');
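For completeness, the middleware is registered in the app roughly like this (import paths are illustrative):
import express from 'express';
import context from './middleware/context'; // the cls-hooked middleware above
import { dbReleaseMiddleware } from './middleware/dbRelease';

const app = express();

// every request runs inside the cls-hooked namespace...
app.use(context);
// ...and releases its cached connection when the response finishes
app.use(dbReleaseMiddleware);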
If you got this far, congrats. Any help appreciated!
Side note: you couldn't pay me to write a more confusing title...
Figured it out! It seems we have identified the following behavior in the Node version of Slack's API (it seems to only happen on Mac computers... sometimes).
The issue is that this is in the context of an Express app, so Slack is managing the interface between its own event handler system and the HTTP side of things with Express (e.g. returning 200, or 500, or whatever). So what seems to happen is...
// you have some slack event handler
slackEventHandler.on('message', async (rawEvent: any) => {
    let i = 0;
    i = i + 1;
    // at this point, the http request has not returned 200, it is "pending" from express's POV
    await myService.someMethod();
    // ^^ while this was doing its async thing, the express request returned 200,
    // so things like res.on('finish') all fired and all your middleware ran,
    // but your event handler code is still going
});
So we ended up creating a manual call to release connections in our slack event handlers. Weird!
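In practice the manual release looked something like this (a sketch reusing the helpers from the question; the try/finally placement is ours):
slackEventHandler.on('message', async (rawEvent: any) => {
    try {
        await myService.someMethod();
    } finally {
        // the express response finished long ago, so dbReleaseMiddleware
        // never saw this connection; release it by hand
        const conn = session.get('conn');
        if (conn) {
            incrementConnsReleased();
            await conn.release();
        }
    }
});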

Chrome doesn't use cache after power loss?

I am creating a digital signage player that uses Chrome as its display engine. We need to be able to still muddle along without too much interruption if the network goes down.
Chrome caches images fine, and I've set the "Expires" header to be a month after access. I can take the player computer offline and have the app run for days with no problem. If I reboot the machine the right way (Start -> Shut Down), caching still works as expected.
The issue is that when Chrome exits abnormally, either from a crash or a power loss, Chrome ignores the cache on reboot and refuses to load images. This happens even if I cut power 5 minutes after it loads the page, so the content is not expiring.
My guess is that Chrome is set to ignore the cache after an abnormal exit to prevent corrupted cache from continually crashing the browser. However, this behavior is not what I need.
Does anyone know of a command line arg or flag I can set to keep this from happening?
Thanks for your help.
I tried everything I could think of to make Chrome not invalidate the local cache on system failure, and came up empty. There are a few other people who have had the same question, and I didn't see an answer.
Here's what I did that made this work, and if someone else is having the same problem, it might be the workaround that you need.
I added a service worker that caches images. The code below isn't perfect yet, but it should be a starting place for someone... (FYI, I learned this 5 minutes ago, so if someone wants to give me a pointer or two on how to make this more elegant, I'm all ears.)
We cache anything that has a response type of "cors", so we cache only images coming from the remote server. Note that your images must be loaded via https for this to work.
Taken (mostly) from: https://developers.google.com/web/fundamentals/getting-started/primers/service-workers
var CACHE_NAME = 'shine_cache';
var urlsToCache = [
    '/'
];

self.addEventListener('install', function(event) {
    // Perform install steps
    event.waitUntil(
        caches.open(CACHE_NAME)
            .then(function(cache) {
                console.log('Opened cache');
                return cache.addAll(urlsToCache);
            })
    );
});

self.addEventListener('fetch', function(event) {
    //console.log('Handling fetch event for', event.request);
    if (event.request.method == 'POST') {
        //console.log("Skipping POST");
        event.respondWith(fetch(event.request));
        return;
    }

    if (event.request.headers.get('Accept').indexOf('image') !== -1) {
        event.respondWith(
            caches.match(event.request)
                .then(function(response) {
                    // Cache hit - return response
                    if (response) {
                        console.log("Returning from cache.", event.request);
                        return response;
                    }

                    // IMPORTANT: Clone the request. A request is a stream and
                    // can only be consumed once. Since we are consuming this
                    // once by cache and once by the browser for fetch, we need
                    // to clone the request.
                    var fetchRequest = event.request.clone();

                    return fetch(fetchRequest).then(
                        function(response) {
                            console.log("Have a response.", response);
                            // Check if we received a valid response
                            if (!response || response.status !== 200 || response.type !== 'cors') {
                                return response;
                            }

                            // IMPORTANT: Clone the response. A response is a stream
                            // and because we want the browser to consume the response
                            // as well as the cache consuming the response, we need
                            // to clone it so we have two streams.
                            var responseToCache = response.clone();

                            caches.open(CACHE_NAME)
                                .then(function(cache) {
                                    console.log("Caching response", event.request);
                                    cache.put(event.request, responseToCache);
                                });

                            return response;
                        }
                    );
                })
        );
    }
});
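One more piece: the worker has to be registered from the page itself, something like this (assuming the script above is served as /sw.js):
if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('/sw.js')
        .then(function (registration) {
            console.log('Service worker registered, scope:', registration.scope);
        })
        .catch(function (err) {
            console.log('Service worker registration failed:', err);
        });
}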

Meteor Dev Tools Auditor is marking collections as insecure

I use the Meteor Dev Tools plugin in Chrome, and I've noticed a cool new feature that is worrying me about the way I've coded my app.
The audit collection tool is telling me that some of my collections are insecure.
I am still using Meteor 1.2 with Blaze.
1. One of them is meteor_autoupdate_clientVersions, where insert, update and remove are marked as insecure.
1.1. Should I worry about this one?
1.2. How do I protect it?
2. Then I have a cycles collection, where update and remove are marked as insecure.
This collection is updated in the database now and then, but it is not supposed to be accessed from the frontend and is not meant to be related to any client interaction.
For this collection I have the allow/deny rules below in a common folder (loaded on both client and server). I've tried applying these rules only on the server side, but I didn't see a difference in the audit results.
2.1. Should these rules be only on the server side?
Cycles.allow({
    insert: function () {
        return false;
    },
    remove: function () {
        return false;
    },
    update: function () {
        return false;
    }
});

Cycles.deny({
    insert: function () {
        return true;
    },
    remove: function () {
        return true;
    },
    update: function () {
        return true;
    }
});
2.2. How do I protect this collection?
3. And then I also have another collection with an insecure check, which is users, where remove is marked as insecure.
On this web app I don't make any use of users, there is no login, etc. I might want to implement this in the future, though.
3.1. Should I worry about this collection being insecure, since I don't use it at all?
3.2. How do I protect this collection?
You do not have to allow or deny. Just remove the insecure package from the Meteor app.
Then you can use publish/subscribe and methods for data inserts, updates and deletes; a minimal sketch follows.
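For example, after running meteor remove insecure, a cycles publication and an update method might look something like this (names and fields are illustrative, not from the question):
// server: publish the cycles the client is allowed to see
Meteor.publish('cycles', function () {
    return Cycles.find();
});

// server: a method is now the only way the client can write
Meteor.methods({
    'cycles.update': function (cycleId, fields) {
        check(cycleId, String);
        check(fields, Object);
        // add your own authorization checks here
        return Cycles.update(cycleId, { $set: fields });
    }
});

// client: subscribe for reads, call the method for writes
Meteor.subscribe('cycles');
Meteor.call('cycles.update', someCycleId, { status: 'done' }); // someCycleId is illustrative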
Please remove this code from your app:
Cycles.allow({
    insert: function () {
        return false;
    },
    remove: function () {
        return false;
    },
    update: function () {
        return false;
    }
});

Cycles.deny({
    insert: function () {
        return true;
    },
    remove: function () {
        return true;
    },
    update: function () {
        return true;
    }
});
For 1.1:
This happens while the user is logging in. Basically, the issue is not with this publication but with the login method.
See the wait time: https://ui.kadira.io/pt/2fbbd026-6302-4a12-add4-355c0480f81d
Why is the login method slow? This happens every time your app gets reconnected: after a successful login, it re-runs all the publications again. That's why you see such a delay on login from this publication.
There is no real remedy for this, but it is generally fine unless your app has a lot of throughput/subRate on this method/publication.
For 3.1:
You do not have to worry about insecure collections anymore after removing the allow/deny rules and the insecure package. But make sure you write secure methods.

How to create an angular form that uses session storage that can be called throughout the html pages

I want to create a form on an index page that can store data via session storage. I also want to make sure that whatever data (let's say a name) ... is remembered and used throughout the site with Angular. I have researched pieces of this process, but I do not understand how to write it, or really even what it's called.
Any help in the right direction would be useful, as I am in the infant stages of all of this Angular business. Let me know.
The service you want is angular-local-storage.
Just configure it in your app.js file:
angular.module("pstat")
    .config(function (localStorageServiceProvider) {
        localStorageServiceProvider
            .setStorageType('sessionStorage');
    });
Then use it in the controller that contains whatever data you want to remember. Here is an example of a controller that loads the session-storage data on initialization and saves it when a user fires $scope.doSearch through the UI. This should give you a good place to start.
(function () {
    angular.module("pstat")
        .controller("homeCtrl", homeCtrl);

    homeCtrl.$inject = ['$scope', '$log', 'dataService', 'localStorageService', '$http'];

    function homeCtrl ($scope, $log, dataService, localStorageService, $http) {
        var query = localStorageService.get("query"); // returns null when the 'query' key is missing
        if (query) {
            // Or store the results directly if they aren't too large.
            // Do something with your saved query on page load, probably get data.
            // Example:
            dataService.getData(query)
                .success(function (data) { /* ... */ })
                .error(function (err) { /* ... */ });
        }

        $scope.doSearch = function (query) {
            localStorageService.set("query", query);
            // Then actually do your search
        };
    }
})();
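Any other controller on the site can then read the same key back (illustrative):
angular.module("pstat")
    .controller("otherCtrl", ['$scope', 'localStorageService',
        function ($scope, localStorageService) {
            // the same 'query' key set in homeCtrl is visible here
            $scope.savedQuery = localStorageService.get("query");
        }
    ]);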

Edit on Express outputting JSON to database field

I'm trying to create my first simple CRUD in Express JS and I can't seem to find this annoying bug. When I try to update a field, the JSON from that field gets output to the view instead of the new data.
Screenshot: http://i59.tinypic.com/wi5yj4.png
Controller gist: https://gist.github.com/tiansial/2ce28e3c9a25b251ff7c
The update method is used for finding and updating documents without returning the documents that are updated. Basically, what you're doing is finding documents without updating them, since the first parameter of the update function is the search criteria. You need to use the save function to update an existing document after updating its properties.
Your code below, modified (not tested):
// PUT to update a blob by ID
.put(function(req, res) {
    // find the document by ID
    mongoose.model('Email').findById(req.id, function (err, email) {
        // add some logic to handle err
        if (email) {
            // Get our REST or form values. These rely on the "name" attributes
            email.email = req.body.email;
            email.password = req.body.password;
            email.servico = req.body.servico;

            // save the updated document
            email.save(function (err) {
                if (err) {
                    res.send("There was a problem updating the information to the database: " + err);
                }
                else {
                    // HTML responds by going back to the page, or you can be fancy
                    // and create a new view that shows a success page.
                    res.format({
                        html: function(){
                            res.redirect("/emails");
                        },
                        // JSON responds showing the updated values
                        json: function(){
                            res.json(email);
                        }
                    });
                }
            });
        }
    });
})
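Alternatively, Mongoose's findByIdAndUpdate does the find and the update in one call and can hand back the updated document; a quick sketch (also not tested):
mongoose.model('Email').findByIdAndUpdate(
    req.id,
    { email: req.body.email, password: req.body.password, servico: req.body.servico },
    { new: true }, // return the updated document rather than the original
    function (err, email) {
        if (err) {
            return res.send("There was a problem updating the information to the database: " + err);
        }
        res.json(email);
    }
);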