Knex NodeJS and inserting into the database - mysql

I am new to Node.js and am trying to set up an API server; here is my first attempt. I wanted to use MySQL instead of MongoDB.
My problem is that 'knex('user').insert({email: req.body.email});' doesn't seem to save anything to the database.
var dbConfig = {
  client: 'mysql',
  connection: {
    host     : 'localhost',
    user     : 'root',
    password : '',
    database : 'db_nodeapi'
  }
};
var express = require('express'); // call express
var bodyParser = require('body-parser'); // call body-parser
var knex = require('knex')(dbConfig); // set up database connection
var app = express(); // define our app using express
app.use(bodyParser.urlencoded({ extended: true })); // configure app to use bodyParser()
app.use(bodyParser.json()); // this will let us get the data from a POST
var router = express.Router(); // get an instance of the express Router
router.use(function(req, res, next) {             // middleware for authentication
  console.log(' -Logging- ');
  next();                                         // continue to next route without stopping
});
router.get('/', function(req, res) {              // listen for a GET on root
  res.json({ message: ' -Success- ' });
});
router.route('/user')                             // set up user route
  .post(function(req, res) {                      // listen for a POST on /user
    console.log(' -Post- ');                      // report a post
    knex('user').insert({email: req.body.email}); // insert user into user table
    res.json({ success: true, message: 'ok' });   // respond back to request
  });
app.use('/api', router); // register routes beginning with /api
var port = process.env.PORT || 8080; // set server port number
app.listen(port); // setup listener
console.log('Magic happens on port ' + port); // report port number chosen
Problem is I can't get knex to add to the database!
Here is the database:
CREATE TABLE IF NOT EXISTS `user` (
  `id` int(11) NOT NULL,
  `email` varchar(255) NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=latin1 AUTO_INCREMENT=2;

The problem in your code is that you are missing the ".then" call, which is what actually triggers execution of the query.
knex('user').insert({email: req.body.email})
  .then(function (result) {
    res.json({ success: true, message: 'ok' }); // respond back to request
  });
That should work. Knex's insert() only returns a query builder, which behaves like a promise; you need to call .then() (or otherwise await it) for the query to actually run.
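If you prefer async/await over an explicit .then(), the same handler can be written like this; a minimal sketch of my own, assuming the same router, 'user' table, and knex setup as in the question:
router.route('/user')
  .post(async function(req, res) {
    try {
      await knex('user').insert({ email: req.body.email }); // awaiting coerces the builder and runs the query
      res.json({ success: true, message: 'ok' });
    } catch (err) {
      console.error(err);
      res.status(500).json({ success: false, message: err.message });
    }
  });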

A solution has already been given; I just want to explain why adding a then statement solves the problem.
In fact, a then or a catch statement both work. Please refer to the knex documentation (http://knexjs.org/#Interfaces-then), which mentions:
Coerces the current query builder chain into a promise state.
So select, update, insert, etc. only build the query statement; you have to call then or catch to coerce the builder into a promise, which is what actually executes it.
Examples are as follows:
knex('user').insert({email: req.body.email})                 // not working
knex('user').insert({email: req.body.email}).then(()=>{})    // working
knex('user').insert({email: req.body.email}).catch(()=>{})   // working

// inside another .then callback:
.then(()=>{
  knex('user').insert({email: req.body.email})               // not working
  knex('user').insert({email: req.body.email}).then(()=>{})  // working
  knex('user').insert({email: req.body.email}).catch(()=>{}) // working
  return knex('user').insert({email: req.body.email})        // working: the returned builder is coerced by the outer then
})

Solved, try this. I think there should be some after hook in knex where we could do this automatically, but until then this should work:
knex('users')
  .insert(data)
  .then(async () => {
    const result = await knex.raw('select LAST_INSERT_ID() as id'); // fetch the id of the row we just inserted
    const id = result[0][0].id;
    const user = await knex.from('users').where('id', id);
    res.json({ success: true, message: 'ok', user: user[0] });
  });
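As a side note, with the MySQL client knex's insert() should itself resolve with the id of the inserted row, so the raw LAST_INSERT_ID() round trip may not be needed. A sketch, assuming the same 'users' table, data object, and res as above:
knex('users')
  .insert(data)
  .then(([id]) => knex('users').where('id', id).first())     // on MySQL, insert resolves with [insertId]
  .then((user) => res.json({ success: true, message: 'ok', user }))
  .catch((err) => res.status(500).json({ success: false, message: err.message }));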

Had a similar issue once, try this:
//...
router.route('/user').post(function(req, res) {
  knex('user').insert({email: req.body.email}).then(function(ret){
    res.json({ success: true, message: 'ok'/*, ret: ret*/ });
  });
});
//...

The solutions above are all good. If you want to insert more than one field without naming each field explicitly, you can use the syntax below:
knex('students').insert(req.body).then((newUser) => {
  res.json({ newUser });
}).catch((e) => console.log(e));
For this to work, your form input names must match the column names of the database table.
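One caveat of my own: passing req.body straight through means any extra keys the client sends end up in the insert and will fail on unknown columns. A small sketch that whitelists the expected fields, assuming hypothetical name and email columns:
const { name, email } = req.body;               // keep only the columns we actually expect
knex('students').insert({ name, email }).then((newUser) => {
  res.json({ newUser });
}).catch((e) => console.log(e));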

Related

Restify: socket hangup error when copying a file and querying a database using a promise chain

I am using the restify framework to build a small app that copies an uploaded file from its temporary location to a permanent location and then inserts that new location into a MySQL database. However, when attempting to copy the file and then run the promisified query, the system throws a silent error that is not caught by the promise chain, causing a 502 error on the web server end. A minimal working example is below. This example has been tested and does fail out of the gate.
If one of the steps in the process is removed (copying the file or storing the string in the database), the silent error disappears and the API response is sent. However, both steps are needed for later file retrieval.
Main Restify File
const restify = require('restify');
const corsMiddleware = require('restify-cors-middleware');
const cookieParser = require('restify-cookies');
const DataBugsDbCredentials = require('./config/config').appdb;
const fs = require('fs');
const { host, port, name, user, pass } = DataBugsDbCredentials;
const database = new (require('./lib/database'))(host, port, name, user, pass);
const server = restify.createServer({
name: 'insect app'
});
// enable options response in restify (anger) -- this is so stupid!! (anger)
const cors = corsMiddleware({});
server.pre(cors.preflight);
server.use(cors.actual);
// set query and body parsing for access to this information on requests
server.use(restify.plugins.acceptParser(server.acceptable));
server.use(restify.plugins.queryParser({ mapParams: true }));
server.use(restify.plugins.bodyParser({ mapParams: true }));
server.use(cookieParser.parse);
server.post('/test', (req, res, next) => {
  const { files } = req;
  let temporaryFile = files['file'].path;
  let permanentLocation = '/srv/www/domain.com/permanent_location';
  // copy file
  return fs.promises.copyFile(temporaryFile, permanentLocation)
    // insert into database
    .then(() => database.query(
      `insert into Specimen (
        CollectorId,
        HumanReadableId,
        FileLocation
      ) values (
        1,
        'AAA004',
        ${permanentLocation}
      )`
    ))
    .then(() => {
      console.log('success!!!')
      return res.send('success!')
    })
    .catch(error => {
      console.error(error)
      return res.send(error);
    });
});
./lib/database.js
'use strict';
const mysql = require('mysql2');
class Database {
constructor(host, port, name, user, pass) {
this.connection = this.connect(host, port, name, user, pass);
this.query = this.query.bind(this);
}
/**
* Connects to a MySQL-compatible database, returning the connection object for later use
* @param {String} host The host of the database connection
* @param {Number} port The port for connecting to the database
* @param {String} name The name of the database to connect to
* @param {String} user The user name for the database
* @param {String} pass The password for the database user
* @return {Object} The database connection object
*/
connect(host, port, name, user, pass) {
let connection = mysql.createPool({
connectionLimit : 20,
host : host,
port : port,
user : user,
password : pass,
database : name,
// debug : true
});
connection.on('error', err => console.error(err));
return connection;
}
/**
* Promisifies database queries for easier handling
* @param {String} queryString String representing a database query
* @return {Promise} The results of the query
*/
query(queryString) {
// console.log('querying database');
return new Promise((resolve, reject) => {
// console.log('query promise before query, resolve', resolve);
// console.log('query promise before query, reject', reject);
// console.log('query string:', queryString)
this.connection.query(queryString, (error, results, fields) => {
console.log('query callback', queryString);
console.error('query error', error, queryString);
if (error) {
// console.error('query error', error);
reject(error);
} else {
// console.log('query results', results);
resolve(results);
}
});
});
}
}
module.exports = Database;
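As an aside (not part of the question), mysql2 also ships a built-in promise API, so the hand-rolled promisification above could lean on it instead. A minimal sketch with placeholder connection values; the module and variable names here are hypothetical:
// lib/database-promise.js (hypothetical alternative using mysql2/promise)
const mysql = require('mysql2/promise');

const pool = mysql.createPool({
  connectionLimit: 20,
  host: 'localhost',      // replace with your config values
  port: 3306,
  user: 'user',
  password: 'pass',
  database: 'name'
});

// pool.query() already returns a promise resolving to [rows, fields]
async function query(queryString, params) {
  const [rows] = await pool.query(queryString, params);
  return rows;
}

module.exports = { query };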
./testfile.js (used to quickly query the restify API)
'use strict';
const fs = require('fs');
const request = require('request');
let req = request.post({
url: 'https://api.databugs.net/test',
}, (error, res, addInsectBody) => {
if (error) {
console.error(error);
} else {
console.log('addInsectBody:', addInsectBody);
}
});
let form = req.form();
form.append('file', fs.createReadStream('butterfly.jpg'), {
filename: 'butterfly.jpg',
contentType: 'multipart/form-data'
});
If the request is made to the localhost, then an 'ECONNRESET' error is thrown as shown below:
Error: socket hang up
at connResetException (internal/errors.js:570:14)
at Socket.socketOnEnd (_http_client.js:440:23)
at Socket.emit (events.js:215:7)
at endReadableNT (_stream_readable.js:1183:12)
at processTicksAndRejections (internal/process/task_queues.js:80:21) {
code: 'ECONNRESET'
}
This error is only thrown if both the database and the file I/O are both present in the promise chain. Additionally, the error does not occur if the database request is made first with the file I/O occurring second; however, another rapid request to the server will immediately lead to the 'ECONNRESET' error.
I feel as though I should edit this answer, despite the solution revealing a rookie mistake, in the hope that it may help someone else. I will keep the previous answer below for full transparency, but please note that it is incorrect.
Correct Answer
TL;DR
PM2 restarted the NodeJS service with each new file submitted to and saved by the API. The fix: tell PM2 to ignore the directory where the API stores its files. See this answer
Long Answer
While the OP did not mention it, my setup utilized PM2 as the NodeJS service manager for the application, and I had turned on the 'watch & reload' feature that restarted the service with each file change. Unfortunately, I had forgotten to instruct PM2 to ignore file changes in the child directory storing new files submitted through the API. As a result, each new file submitted into the API caused the service to reload. If more instructions remained to be executed after storing the file, they were terminated as PM2 restarted the service. The 502 gateway error was a simple result of the NodeJS service becoming temporarily unavailable during this time.
Changing the database transaction to occur first (incorrectly described as a solution below) simply ensured that the service restart happened at the very end, when no other instructions were pending.
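For reference, a minimal sketch of what that PM2 change might look like, assuming an ecosystem.config.js and hypothetical app, script, and upload-directory names; adjust the path to wherever your API stores files:
// ecosystem.config.js
module.exports = {
  apps: [{
    name: 'insect-app',
    script: './server.js',
    watch: true,                                            // keep watch-and-reload for source files
    ignore_watch: ['permanent_location', 'node_modules']    // but do not restart when uploaded files land here
  }]
};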
Previous Incorrect Answer
The only solution that I have found thus far is to switch the file I/O and the database query so that the file I/O operation comes last. Additionally, changing the file I/O operation to rename rather than copy the file prevents rapidly successive API queries from throwing the same error (having a database query rapidly come after any file I/O operation that is not a rename seems to be the problem). Sadly, I do not have a reasonable explanation for the socket hang up in the OP, but below is the code from the OP modified to make it functional.
const restify = require('restify');
const corsMiddleware = require('restify-cors-middleware');
const cookieParser = require('restify-cookies');
const DataBugsDbCredentials = require('./config/config').appdb;
const fs = require('fs');
const { host, port, name, user, pass } = DataBugsDbCredentials;
const database = new (require('./lib/database'))(host, port, name, user, pass);
const server = restify.createServer({
name: 'insect app'
});
// enable options response in restify (anger) -- this is so stupid!! (anger)
const cors = corsMiddleware({});
server.pre(cors.preflight);
server.use(cors.actual);
// set query and body parsing for access to this information on requests
server.use(restify.plugins.acceptParser(server.acceptable));
server.use(restify.plugins.queryParser({ mapParams: true }));
server.use(restify.plugins.bodyParser({ mapParams: true }));
server.use(cookieParser.parse);
server.post('/test', (req, res, next) => {
  const { files } = req;
  let temporaryFile = files['file'].path;
  let permanentLocation = '/srv/www/domain.com/permanent_location';
  // insert into database first
  return database.query(
    `insert into Specimen (
      CollectorId,
      HumanReadableId,
      FileLocation
    ) values (
      1,
      'AAA004',
      ${permanentLocation}
    )`
  )
    // then move the file into place
    .then(() => fs.promises.rename(temporaryFile, permanentLocation))
    .then(() => {
      console.log('success!!!')
      return res.send('success!')
    })
    .catch(error => {
      console.error(error)
      return res.send(error);
    });
});
You did not handle the database promise with then and catch:
Main Restify File
const restify = require('restify');
const corsMiddleware = require('restify-cors-middleware');
const cookieParser = require('restify-cookies');
const DataBugsDbCredentials = require('./config/config').appdb;
const fs = require('fs');
const { host, port, name, user, pass } = DataBugsDbCredentials;
const database = new (require('./lib/database'))(host, port, name, user, pass);
const server = restify.createServer({
name: 'insect app'
});
// enable options response in restify (anger) -- this is so stupid!! (anger)
const cors = corsMiddleware({});
server.pre(cors.preflight);
server.use(cors.actual);
// set query and body parsing for access to this information on requests
server.use(restify.plugins.acceptParser(server.acceptable));
server.use(restify.plugins.queryParser({ mapParams: true }));
server.use(restify.plugins.bodyParser({ mapParams: true }));
server.use(cookieParser.parse);
server.post('/test', (req, res, next) => {
  const { files } = req;
  let temporaryFile = files['file'].path;
  let permanentLocation = '/srv/www/domain.com/permanent_location';
  // copy file
  return fs.promises.copyFile(temporaryFile, permanentLocation)
    // insert into database
    .then(() => {
      // Your database class instance query method returns a promise
      database.query(
        `insert into Specimen (
          CollectorId,
          HumanReadableId,
          FileLocation
        ) values (
          1,
          'AAA004',
          ${permanentLocation}
        )`
      ).then(() => {
        console.log('success!!!')
        return res.send('success!')
      })
      .catch(error => {
        console.error('Inner database promise error', error)
        return res.send(error);
      });
    }).catch(error => {
      console.error('Outer fs.copyfile promise error', error)
      return res.send(error);
    })
});
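A further note of my own, not from the answer above: the inner database.query(...) promise is never returned from the outer .then, so the outer chain does not wait for it; returning it keeps a single flat chain. The string value in the insert also needs to be quoted (an unquoted ${permanentLocation} would not be accepted as a value by MySQL), or, better, passed as a placeholder if the query helper were extended to support parameters. A sketch of the flat chain under those assumptions:
return fs.promises.copyFile(temporaryFile, permanentLocation)
  // return the query promise so the outer chain actually waits for the insert
  .then(() => database.query(
    `insert into Specimen (
      CollectorId,
      HumanReadableId,
      FileLocation
    ) values (
      1,
      'AAA004',
      '${permanentLocation}'
    )`
  ))
  .then(() => res.send('success!'))
  .catch(error => {
    console.error(error);
    return res.send(error);
  });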

Node.JS and MySQL - queries lock up and execute extremely slowly

I am getting strange behavior using Node.JS and MySQL with this driver - https://github.com/mysqljs/mysql
Essentially, I have a button on the frontend that triggers an app.get that makes a query in the database and I can happily use the results in my backend.
This works nicely until I press the button 4-5 times in a second, at which point the queries lock up and I have to wait for 2-3 minutes until they continue executing. I have a similar write function that behaves the same way.
Is it possible this is a problem, because I'm trying to execute the exact same query asynchronously? I.e. do I have to limit this from the front end or is it a backend problem?
Any ideas on how to debug what exactly is going on?
// database.js
var mysql = require('mysql');
var pool = mysql.createPool({
connectionLimit: 100,
host : 'localhost',
user : 'secret',
password : 'secret',
database : 'mydb'
});
exports.getConnection = function(callback) {
pool.getConnection(function(err, connection) {
callback(err, connection);
});
};
// dbrw.js
var con = require('../config/database');
function read(id, done) {
  con.getConnection(function(err, connection){
    if (!err) {
      connection.query("SELECT * FROM users WHERE id = ?", [id], function(err, rows) {
        connection.release();
        if (err)
          done(err);
        if (rows.length) {
          console.log("rows " + JSON.stringify(rows));
          done(rows[0].progress);
        };
      });
    }
    else {
      console.log(err);
    }
  });
}
exports.read = read;
// routes.js
var dbrw = require('./dbrw.js');
app.get('/read', isLoggedIn, function(req, res) {
dbrw.read(req.user.id, function(result) {
console.log(result);
});
});
// Frontend - angular app.js
$scope.tryread = function() {
$http.get('/read');
}
Thanks in advance for any input.
I see a few issues:
function read(id, done) {
con.getConnection(function(id, connection){...}
}
Notice how you overwrite the id passed to read by giving that same name to an argument of the callback to getConnection.
Also, your Express route doesn't actually end the request by sending back a response, which will make your browser time out the connection. At some point, it will even refuse to send more requests because too many are still pending.
So make sure to end the request:
app.get('/read', isLoggedIn, function(req, res) {
  dbrw.read(req.user.id, function(result) {
    console.log(result);
    res.end(); // or `res.send(result)`
  });
});
And a tip: you should use the Node callback calling convention, where the first argument is an error (if there is one) and the second argument is the return value.
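To illustrate that convention, here is a minimal sketch of my own, keeping the same modules and route as above and only switching to error-first callbacks:
// dbrw.js
function read(id, done) {
  con.getConnection(function(err, connection) {
    if (err) return done(err);
    connection.query("SELECT * FROM users WHERE id = ?", [id], function(err, rows) {
      connection.release();                        // always release the pooled connection
      if (err) return done(err);
      if (!rows.length) return done(null, null);   // no such user
      done(null, rows[0].progress);                // error-first: (err, result)
    });
  });
}

// routes.js
app.get('/read', isLoggedIn, function(req, res) {
  dbrw.read(req.user.id, function(err, result) {
    if (err) return res.status(500).end();
    res.send(String(result));                      // always end the response
  });
});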

Node JS Error: write after end

In the following code, I am trying to retrieve data from a MySQL database and show it to the user using response.write. The error that I get is Error: write after end:
var http = require("http");
var mysql = require('mysql');
var express = require('express');
var app = express();
var bodyParser = require('body-parser');
var urlencodedParser = bodyParser.urlencoded({ extended: false })
app.use(express.static('public'));
app.get('/Search.html', function (req, res) {
res.sendFile( __dirname + "/" + "Search.html" );
})
var connection = mysql.createConnection(
{
host : 'localhost',
user : 'root',
password : 'somepass',
database : 'SocialQuery',
}
);
connection.connect();
app.post('/process_post', urlencodedParser, function (req, res) {
  // Prepare output in JSON format
  response = {
    SearchType: req.body.SearchTypes,
    Term: req.body.term
  };
  //var vas = JSON.stringify(response);
  var search = req.body.SearchTypes;
  var term = req.body.term;
  var query = connection.query('Select * from ?? where Lable = ?', [search, term], function(err, rows) {
    res.write(rows);
  });
  console.log(query.sql);
  res.end();
})
//}).listen(8081);
http.createServer(app).listen(8081);
console.log('Server running at http://127.0.0.1:8081/');
I changed res.write(rows); to res.end(rows); but that didn't work. Can someone help me solve this problem?
The problem is that MySQL queries are asynchronous in Node.js, so the result won't be in the variable query; it is delivered to the callback, in the variable rows. What happens is that res.end() is called first, and then the callback fires and res.write() is called, so it runs after end().
You are making an asynchronous call when fetching data from the database. res.write() is inside the callback function, so res.end() is called before the data has been fetched, and res.write() only runs after the data arrives. That's why you are getting Error: write after end. You should call res.end() in the same callback function.
var query = connection.query('Select * from ?? where Lable = ?', [search, term], function(err, rows) {
  res.write(rows, function(err){
    res.end();
  });
});
Now the res.end() function will be called after the write process has been done.
It worked after I made two changes:
var query = connection.query('Select * from ?? where Lable = ?', [search, term], function(err, rows) {
  console.log(rows);
  res.write(JSON.stringify(rows));
  res.end();
});
First, I moved res.end(); inside the connection.query part.
Second, instead of writing rows directly, I changed it to res.write(JSON.stringify(rows));
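Since the server is built on Express, another option (my own sketch, not from the answers above) is to let res.json() handle serialization, the Content-Type header, and ending the response in one call, and to handle a possible query error too:
app.post('/process_post', urlencodedParser, function (req, res) {
  var search = req.body.SearchTypes;
  var term = req.body.term;
  connection.query('Select * from ?? where Lable = ?', [search, term], function(err, rows) {
    if (err) {
      console.error(err);
      return res.status(500).json({ error: 'query failed' });
    }
    res.json(rows);   // serializes the rows, sets Content-Type and ends the response
  });
});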

passport send error by json

I'm making an app with Express + Passport and AngularJS. I want to be able to send any errors produced by Passport (such as 'username taken' or 'no email provided') as JSON, so my AngularJS app can receive these errors in a JSON response. More specifically, right now I want a JSON response to my signup POST method that outputs any errors. I have tried to do this myself and I've searched all over the web and Stack Overflow, but I just cannot work this out!
Here is my users route file in express:
var express = require('express');
var router = express.Router();
var isAuthenticated = require('../config/isAuthenticated');
module.exports = function(passport){
router.get('/loggedin', function(req, res){
res.send(req.isAuthenticated() ? req.user : '0');
});
router.post('/signup', passport.authenticate('local-signup', {
successRedirect : '/',
failureRedirect : '/signup',
failureFlash: true
}));
router.post('/login', passport.authenticate('local-login'), function(req, res){
res.send(req.user);
});
router.post('/signout', function(req,res){
req.logout();
res.json({redirect: '/'});
});
router.get('/authtest', isAuthenticated, function(req, res){
res.render('authtest', {user: req.user});
});
return router;
};
This is my passport signup strategy:
passport.use('local-signup', new LocalStrategy({
usernameField : 'username',
passwordField : 'password',
passReqToCallback : true
},
function(req, username, password, done){
process.nextTick(function(){
User.findOne({'local.username' : username}, function(err, user){
if(err) return done(err);
if (user) { //username already exists
return done(null, false, {message: 'Username already exists'});
} else if(!req.body.email) { //no email address provided
return done(null, false, {message: 'You must provide an email address!'});
} else {
var newUser = new User();
newUser.local.username = username;
newUser.generateHash(password, function(err, hash){
if(err) return done(err);
newUser.local.password = hash;
});
newUser.email = req.body.email;
newUser.servers = [];
newUser.save(function(err){
if(err) throw err;
return done(null, newUser);
});
};
});
});
}
));
I know looking at my code right now it looks like I haven't tried to solve this myself at all but this is just my latest working code; I have been stuck at this for the past few days!
Any help would be greatly appreciated :)
According to the current code of Passport, this is probably achievable by passing a custom callback to handle all results of authentication yourself. This callback is given after the options, or instead of them.
passport( "local-signup", { ... }, callbackFn );
or
passport( "local-login", callbackFn );
This callback is used for all possible outcomes of trying to authenticate. It is thus invoked on processing errors like this:
callbackFn( err )
If (all configured) authentications have failed it is called with
callbackFn( null, false, challenge(s), status(es) )
On successfully having authenticated a user, the callback is invoked like so:
callbackFn( null, user, infos )
with infos optionally provided by strategies.
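To make this concrete, here is a rough sketch of my own (not part of the original answer) of how such a callback could be wired into the signup route so the strategy's message ends up in a JSON response; note that with this form you invoke the middleware returned by passport.authenticate() yourself and log the user in manually:
router.post('/signup', function(req, res, next) {
  passport.authenticate('local-signup', function(err, user, info) {
    if (err) return res.status(500).json({ error: err.message });
    if (!user) return res.status(400).json({ error: info ? info.message : 'Signup failed' });
    req.logIn(user, function(err) {                 // establish the session manually
      if (err) return res.status(500).json({ error: err.message });
      return res.json({ user: user });
    });
  })(req, res, next);
});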
Now comes the downside: in either situation passport.authenticate() skips its usual processing and instantly invokes the provided callback to take care of the rest. That includes the processing of any options passed to passport.authenticate(), like flashing messages, preparing the session, attaching the authenticated user to the request, etc.
Since the options given to passport.authenticate() are never applied when a callback is provided, there is actually no obvious reason to use both.
When I stumbled over the very same problem (linking a Passport service with an AngularJS POST request) I declined to consider use of the callback a proper solution. The callback isn't well documented, and it doesn't even look very useful, since it isn't passed req, res and next to act on the actual request. Thus it makes very little sense to use it at all, and I'd expect it to vanish soon or to change its behaviour quite a bit.
So the second approach was about trying to figure out why there is a problem in AngularJS. Passport sends the plain text Unauthorized in the response with status code 401. AngularJS tries to parse this as JSON and produces a syntax error. The text Unauthorized results from Passport ending the response very simply by invoking:
res.statusCode = 401;
res.end(http.STATUS_CODES[res.statusCode]);
Thus a proper workaround might replace
either the text in http.STATUS_CODES, though that affects the processing of further requests and thus isn't preferable,
or res.end() with an overloaded method that acts differently when res.statusCode is 401.
Because it only affects the current request, I tried the latter. The replaced res.end() might be used to send any text you want:
router.post('/login',
  function(req, res, next) {
    var _end = res.end;
    res.end = function() {
      if (res.statusCode === 401) {
        return _end.call(this, '{"status":"Unauthorized"}'); // keep the response as `this` when calling the original end()
      }
      return _end.apply(this, arguments);
    };
    next();
  },
  passport.authenticate('local-login'),
  function(req, res) {
    res.send(req.user);
  }
);
Alternatively, the replaced method might add the previously missing Content-Type response header, since its absence was what caused AngularJS to try to process that response as JSON by default.
router.post('/login',
  function(req, res, next) {
    var _end = res.end;
    res.end = function() {
      if (res.statusCode === 401) {
        res.set("Content-Type", "text/plain");
      }
      return _end.apply(this, arguments);
    };
    next();
  },
  passport.authenticate('local-login'),
  function(req, res) {
    res.send(req.user);
  }
);
Finally, either approach is really just a workaround. I think Passport is in need of revising this annoying limitation.

NodeJS sessions, cookies and mysql

I'm trying to build an auth system. Here is my app.js:
var express = require('express')
, MemoryStore = require('express').session.MemoryStore
, app = express();
app.use(express.cookieParser());
app.use(express.session({ secret: 'keyboard cat', store: new MemoryStore({ reapInterval: 60000 * 10 })}));
app.use(app.router);
and the route.index as
var express = require('express')
, mysql = require('mysql')
, crypto = require('crypto')
, app = module.exports = express();
app.get('/*', function(req, res) {
  var url = req.url.split('/');
  if (url[1] == 'favicon.ico')
    return;
  if (!req.session.user) {
    if (url.length == 4 && url[1] == 'login') {
      var connection = mysql.createConnection({
        host     : 'localhost',
        user     : 'user',
        password : 'pass',
      });
      var result = null;
      connection.connect();
      connection.query('use database');
      var word = url[3];
      var password = crypto.createHash('md5').update(word).digest("hex");
      connection.query('SELECT id,level FROM users WHERE email = "'+url[2]+'" AND password = "'+password+'"', function(err, rows, fields) {
        if (err) throw err;
        for (i in rows) {
          result = rows[i].level;
        }
        req.session.user = result;
      });
      connection.end();
    }
  }
  console.log(req.session.user)
});
When I access http://mydomain.com/login/user/pass the first time, the user level shows up in that last console.log, but on a second access the cookie is clean (the session is empty again).
Why not just use Express's session handling? If you use the Express command-line tool as express --sessions, it will create the project template with session support. From there you can copy the session lines into your current project. There is more information in How do sessions work in Express.js with Node.js? (which this looks like it may be a duplicate of).
As for sanitizing your SQL, you seem to be using the mysql library, which will sanitize your inputs for you if you use parameterized queries (i.e. ? placeholders).
Final thing: you are using Express wrong (no offence). Express's router will let you split up a lot of your routes (along with allowing you to configure the favicon; see Unable to Change Favicon with Express.js, second answer).
Using the '/*' route will just catch all GET requests, which greatly limits what the router can do for you; a parameterized login route is sketched at the end of this answer.
(continued from comments; putting it here for code blocks)
Now that you have an app with session support, try these two routes:
app.get('/makesession', function (req, res) {
req.session.message = 'Hello world';
res.end('Created session with message : Hello world');
});
app.get('/getsession', function (req, res) {
if (typeof req.session.message == 'undefined') {
res.end('No session');
} else {
res.end('Session message: '+req.session.message);
}
});
If you navigate in your browser to /makesession, it will set a session message and notify you that it did. Now if you navigate to /getsession, it will send you back the session message if it exists, or else it will tell you that the session does not exist.
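Coming back to the routing and SQL advice above, here is a rough sketch of my own (not from the answer) of what a dedicated, parameterized login route could look like, keeping the question's table and column names and assuming a mysql connection set up as in the question and session middleware already configured:
app.get('/login/:email/:password', function (req, res) {
  var password = crypto.createHash('md5').update(req.params.password).digest('hex');
  connection.query(
    'SELECT id, level FROM users WHERE email = ? AND password = ?', // ? placeholders are escaped by the mysql library
    [req.params.email, password],
    function (err, rows) {
      if (err) { res.statusCode = 500; return res.end('Database error'); }
      if (!rows.length) { res.statusCode = 401; return res.end('Invalid credentials'); }
      req.session.user = rows[0].level;   // set the session only after the query has returned
      res.end('Logged in with level ' + req.session.user);
    }
  );
});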
You need to save your cookie value in the response object:
res.cookie('session', 'user', result);
http://expressjs.com/api.html#res.cookie