I'm an Android dev trying to build a simple REST API with Node.js, so I'm basically new to JS.
I'm setting up a new REST API and want to connect to a MySQL database.
I tried to solve it this way, but I'm getting errors.
Also, what should I set the connection limit to?
const express = require('express');
const db = require('../db');

const mainNewsRouter = express.Router();

mainNewsRouter.get('/', async (req, res, next) => {
  try {
    let result = await db.getMainNews();
    console.log(res.json(result));
    res.json(result);
  } catch (e) {
    console.log(e);
  }
});

module.exports = mainNewsRouter;
//DbHandler.js
var mysql = require('mysql2');
const url = require('url');
var SocksConnection = require('socksjs');

var remote_options = {
  host: 'xxx',
  port: 3306
};

var proxy = url.parse('http://xxx:xxx#us-east-static-06.quotaguard.com:xxx');
var auth = proxy.auth;
var username = auth.split(":")[0];
var pass = auth.split(":")[1];

var sock_options = {
  host: proxy.hostname,
  port: 1080,
  user: username,
  pass: pass
};

var sockConn = new SocksConnection(remote_options, sock_options);

var dbConnection = mysql.createPool({
  connectionLimit: 10,
  user: 'xxx',
  database: 'xxx',
  password: 'xxx',
  stream: sockConn
});

getMainNews = () => {
  return new Promise((resolve, reject) => {
    dbConnection.query('SELECT ... * from ...;',
      (err, results) => {
        if (err) {
          return reject(err);
        }
        // sockConn.dispose();
        return resolve(results);
      });
  });
  dbConnection.end();
};
On the first API call I get data from the database, but with this error:
Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client
at ServerResponse.setHeader (_http_outgoing.js:470:11)
at ServerResponse.header (node_modules\express\lib\response.js:771:10)
at ServerResponse.send (node_modules\express\lib\response.js:170:12)
at ServerResponse.json (node_modules\express\lib\response.js:267:15)
at mainNewsRouter.get (server\routes\mainNews.js:10:11)
at process._tickCallback (internal/process/next_tick.js:68:7)
And after the second API call there is no data; I only get this exception:
> Server is running on port: { Error: This socket has been ended by the
> other party
> at Socket.writeAfterFIN [as write] (net.js:395:12)
> at SocksConnection._write (node_modules\socksjs\socks.js:72:24)
> at doWrite (_stream_writable.js:415:12)
> at writeOrBuffer (_stream_writable.js:399:5)
> at SocksConnection.Writable.write (_stream_writable.js:299:11)
> at PoolConnection.write (node_modules\mysql2\lib\connection.js:221:17)
> at PoolConnection.writePacket(node_modules\mysql2\lib\connection.js:279:12)
> at ClientHandshake.sendCredentials (node_modules\mysql2\lib\commands\client_handshake.js:63:16)
> at ClientHandshake.handshakeInit (node_modules\mysql2\lib\commands\client_handshake.js:136:12)
> at ClientHandshake.execute (node_modules\mysql2\lib\commands\command.js:39:22) code: 'EPIPE',
> fatal: true }
Although I am by no means an expert, I think one of the issues lies with closing the connection. The whole idea of a pool is to release each connection back to the pool, not to close it.
I have done testing on connection pools and have used a pool size of min 4 / max 12 with hundreds of requests per second without running into connection issues with MySQL.
Personally, I use Knex to manage my DB connections; it manages the pools too, which takes care of a lot of the headache with little overhead. I think it would be worth porting that part of your code over to it. Once the connection issue is sorted out, you can tackle the other issues as they crop up.
Again, I am not an expert and cannot pinpoint exactly where the code above should release the MySQL connection back to the pool, but I do think that is why you don't get data after your initial call.
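To illustrate the point, here is a minimal sketch of letting mysql2's pool hand out and reclaim connections itself (this ignores the QuotaGuard/SOCKS part; the credentials and the table name are placeholders, not from the question):

```js
// DbHandler.js -- sketch only; host/user/password/database are placeholders
const mysql = require('mysql2');

const pool = mysql.createPool({
  connectionLimit: 10,
  host: 'xxx',
  user: 'xxx',
  password: 'xxx',
  database: 'xxx'
});

// pool.query() checks a connection out, runs the statement, and releases the
// connection back to the pool automatically -- no dbConnection.end() per request.
const getMainNews = () =>
  new Promise((resolve, reject) => {
    pool.query('SELECT * FROM main_news', (err, results) => {  // table name is an assumption
      if (err) return reject(err);
      resolve(results);
    });
  });

module.exports = { getMainNews };
```

On the proxy side, as far as I can tell the original code hands one already-opened SOCKS socket to the whole pool, so once that single socket is ended by the other party every new pooled connection fails, which would line up with the EPIPE on the second call.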
It won't answer your full question, but still: the error "Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client" means that the headers have already been set and you are trying to set them again. Headers are set when we send a response; they include the content type, content length, status, and all the other information about the response we are sending. When we call res.send, res.json, or res.render, i.e. send a response, the headers get set automatically from the required information (Express does it for us; in pure Node.js we would have to set every header ourselves). Notice that you call res.json twice, which means the headers have to be set twice. Also, wrapping res.json inside console.log doesn't make any sense. Why have you done that?
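In other words, the route only needs to send once. A minimal sketch (passing the error to next() is just one common way to let Express report it, not something from the original code):

```js
mainNewsRouter.get('/', async (req, res, next) => {
  try {
    const result = await db.getMainNews();
    res.json(result);   // send the response exactly once
  } catch (e) {
    console.log(e);
    next(e);            // hand the error to Express instead of replying twice
  }
});
```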
Related
I've started learning Node.js/Express, and in building a web app I decided to use mysql2. I've been following tutorials on how to get it all working, and it all seemed to work fine until I discovered promises. To cut a long story short, I've now started to understand the value of promises; however, when I tried to alter my code to accommodate them, I did not get the same result.
My previous functions looked like this:
function checkIfLogedIn(req, res, next) {
  let membershipnumber = req.session.membershipno;
  if (req.session.loggedin) {
    db.query('SELECT * FROM customers WHERE membershipnumber = ?', [membershipnumber], function(error, results, fields) {
      if (error) {
        res.send('Error connecting to the database ' + error);
      }
      if (results.length > 0) {
        req.user = {
          name: `${results[0].first_name} ${results[0].last_name}`,
          membershipnumber: results[0].membershipnumber,
          customer_id: results[0].id
        };
      }
      next();
    });
  }
}
This worked fine. In an attempt to implement promises, I first changed my database file:
const mysql = require('mysql2');
//const mysql = require('mysql2/promise');

/////////////////////////////////////////////
//local SQL connection
/////////////////////////////////////////////
module.exports = mysql.createConnection({
  host     : 'localhost',
  user     : 'root',
  password : 'letsgetkinky',
  database : 'nodelogin'
});
Then in my function, I did this as a test:
const db = require('../../database');

async function checkIfLogedIn(req, res, next) {
  let membershipnumber = req.session.membershipno;
  if (req.session.loggedin) {
    const data = await db.query('SELECT * FROM customers WHERE membershipnumber = 4');
    console.log(data);
  }
}
This results in the error below:
Error: You have tried to call .then(), .catch(), or invoked await on the result of query that is not a promise, which is a programming error. Try calling con.promise().query(), or require('mysql2/promise') instead of 'mysql2' for a promise-compatible version of the query interface.
I'm not sure where I'm going wrong with this. I've tried following the documentation, but it doesn't seem to work or be consistent: when I change to mysql2/promise, it then tells me that the function query does not exist. I would really like to understand where I'm going wrong and how I should be interpreting the documentation, as it looks as though there are multiple ways of achieving this.
For posterity, I managed to resolve my own issue. My database file seemed to be causing it. I'm yet to fully understand why, but nevertheless I have managed to use await in my functions. Instead of createConnection, which apparently does not return a promise-compatible interface here, I used createPool. With this, I could then do const [rows, fields] = await db.query('SELECT * FROM customers WHERE membershipnumber = 4');
database.js file

const mysql = require('mysql2');

const pool = mysql.createPool({
  host     : 'localhost',
  user     : 'root',
  password : 'letsgetkinky',
  database : 'nodelogin'
});

module.exports = pool.promise();
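For reference, a small sketch of how the exported promise pool can then be consumed (same table and session fields as in the question; note that the destructuring gives rows first, then fields):

```js
const db = require('../../database');

async function checkIfLogedIn(req, res, next) {
  if (req.session.loggedin) {
    // the promise wrapper resolves to [rows, fields]
    const [rows] = await db.query(
      'SELECT * FROM customers WHERE membershipnumber = ?',
      [req.session.membershipno]
    );
    if (rows.length > 0) {
      req.user = {
        name: `${rows[0].first_name} ${rows[0].last_name}`,
        membershipnumber: rows[0].membershipnumber,
        customer_id: rows[0].id
      };
    }
  }
  next();
}
```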
I have implemented load balancing for the read database connection: when the read DB load rises above 60%, a new read database instance is started to spread the load.
But when I look at all the API calls in the AWS developer console dashboard, the new read database instance is indeed started, yet most of the load still lands on database 1 (up to 90%, around 10 req/sec), while read DB instance 2 only sees 1-5% usage, roughly 1 req/sec.
The API requests should be divided equally across both databases, but that isn't happening.
This happens because mysql.createPool will not close its connections to database 1 (createPool reuses its open connections), so API calls never move over to the second database instance.
To solve this, I replaced mysql.createPool with mysql.createConnection on each API call and created two middlewares:
1 - for createConnection
2 - for connection.end()
Whenever a request comes in, middleware 1 runs and creates a new connection, and when the request finishes, middleware 2 runs and ends the connection. This solved the load-balancing problem, but a new issue took its place: I now run into "too many database connections" errors with this method.
Does anyone have a proper solution, or has anyone faced this issue and can help?
Sample Code:

var readDB = mysql.createConnection({
  database: process.env.READ_DB_DATABASE,
  host: process.env.READ_DB_HOST,
  user: process.env.READ_DB_DB_USER,
  password: process.env.READ_DB_DB_PASSWORD,
  charset: "utf8mb4"
});
utils.js

async onFinish(req, res, next) {
  return new Promise(async (resolve, reject) => {
    try {
      let readDB = req.readDB;
      const dbEnd = util.promisify(readDB.end).bind(readDB);
      const response = await dbEnd();
      resolve(response);
    } catch (error) {
      reject(error);
    }
  });
}
app.js

/**
 * middleware to create a connection and end it when the response finishes
 */
app.use(async (req, res, next) => {
  try {
    const readDB = await utils.readDBCreateConnection();
    req.readDB = readDB;
    res.on("finish", function () {
      console.log("onFinish called");
      utils.onFinish(req, res, next);
    });
    next();
  } catch (error) {
    res.status(error.status || 500).send({
      code: 500,
      message: error.message || `Internal Server Error`,
    });
  }
});

/**
 * Code to initialize routing
 */
require("./modules/v2-routes")(app); // v2 app routes
I am trying to set up an Alexa skill that runs MySQL queries when a certain question is asked. Nothing I tried seemed to work, because I either get an error or nothing happens at all.
What I am using / working with:
Alexa Developer Console
Cloud9 as IDE (which uploads the code to AWS Lambda, where I defined the environment variables used in my code)
AWS Lambda, NodeJS
Amazon RDS, which hosts my DB instance
MySQL Workbench (where I created a few tables to test the database, which works fine)
I tried several ways to solve my problem, like creating a connection or a pool, but I think it has to be handled differently, because Alexa has to wait for the response.
const GetOeffnungszeiten_Handler = {
  canHandle(handlerInput) {
    const request = handlerInput.requestEnvelope.request;
    return request.type === 'IntentRequest' && request.intent.name === 'GetOeffnungszeiten';
  },
  handle(handlerInput) {
    const request = handlerInput.requestEnvelope.request;
    const responseBuilder = handlerInput.responseBuilder;
    let sessionAttributes = handlerInput.attributesManager.getSessionAttributes();

    let say = 'OUTPUT: ';

    var mysql = require('mysql');

    var connection = mysql.createPool({
      host     : process.env.MYSQL_HOSTNAME,
      user     : process.env.MYSQL_USERNAME,
      password : process.env.MYSQL_PASSWORD,
      database : process.env.MYSQL_DATABASE,
      port     : process.env.MYSQL_PORT
    });

    exports.handler = (event, context, callback) => {
      context.callbackWaitsForEmptyEventLoop = false;
      pool.getConnection(function(err, connection) {
        connection.query('select name from persons where id=1', function (error, results, fields) {
          connection.release();
          if (error) {
            callback(error);
            say = say + '0';
          } else {
            callback(null, results[0].name);
            say = say + ' 1';
          }
        });
      });
    };

    return responseBuilder
      .speak(say)
      .reprompt('try again, ' + say)
      .getResponse();
  },
};
I expect the output to be either "OUTPUT: 1" or "OUTPUT: 0", but it is just "OUTPUT: ".
By "output" I mean the say variable.
Your function is returning responseBuilder...getResponse() before the SQL connection finishes and callback is called.
I would suggest refactoring your code using async and await to make it easier to read and understand (see https://stormacq.com/2019/06/22/async-js.html for help).
Be sure to return the Alexa response only when your call to MySQL returns, and not before. Remember that the Alexa timeout is 8 seconds, so your code needs to return before that. Make sure the AWS Lambda timeout is aligned with the Alexa timeout too (set it to 7 seconds).
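A rough, untested sketch of what that refactor could look like (this swaps the mysql package for mysql2/promise so the query can be awaited; it is not the asker's code):

```js
const mysql = require('mysql2/promise');

// create the pool once, outside the handler, so warm Lambda containers reuse it
const pool = mysql.createPool({
  host: process.env.MYSQL_HOSTNAME,
  user: process.env.MYSQL_USERNAME,
  password: process.env.MYSQL_PASSWORD,
  database: process.env.MYSQL_DATABASE,
  port: process.env.MYSQL_PORT
});

const GetOeffnungszeiten_Handler = {
  canHandle(handlerInput) {
    const request = handlerInput.requestEnvelope.request;
    return request.type === 'IntentRequest' && request.intent.name === 'GetOeffnungszeiten';
  },
  async handle(handlerInput) {
    let say = 'OUTPUT: ';
    try {
      // wait for MySQL before building the Alexa response
      const [rows] = await pool.query('select name from persons where id=1');
      say = say + ' 1';   // query succeeded; rows[0].name holds the result
    } catch (err) {
      console.error(err);
      say = say + '0';    // query failed
    }
    return handlerInput.responseBuilder
      .speak(say)
      .reprompt('try again, ' + say)
      .getResponse();
  }
};
```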
Finally, I would advise against using MySQL for Alexa skills. Because each Lambda invocation might be served by a different container, your code will create a connection pool for each interaction between customers and your skill, adding a significant delay before customers get a response. DynamoDB and ElastiCache are much better suited to Alexa skills.
Running into some issues trying to figure out how an Azure Function (Node.js-based) can connect to our MySQL database (also hosted on Azure). We're using mysql2 and following tutorials pretty much exactly (https://learn.microsoft.com/en-us/azure/mysql/connect-nodejs, and similar). Here's the meat of the call:
const mysql = require('mysql2');
const fs = require('fs');

module.exports = async function (context, req) {
  context.log('JavaScript HTTP trigger function processed a request.');
  if (req.query.fname || (req.body && req.body.fname)) {
    context.log('start');
    var config = {
      host: process.env['mysql_host'],
      user: process.env['mysql_user'],
      password: process.env['mysql_password'],
      port: 3306,
      database: 'database_name',
      ssl: {
        ca: fs.readFileSync(__dirname + '\\certs\\cacert.pem')
      },
      connectTimeout: 5000
    };
    const conn = mysql.createConnection(config);
    /*context.log(conn);*/
    conn.connect(function (err) {
      context.log('here');
      if (err) {
        context.error('error connecting: ' + err.stack);
        context.log("shit is broke");
        throw err;
      }
      console.log("Connection established.");
    });
    context.log('mid');
    conn.query('SELECT 1+1', function (error, results, fields) {
      context.log('here');
      context.log(error);
      context.log(results);
      context.log(fields);
    });
Basically, I'm running into an issue where conn.connect(function(err) ...) doesn't return anything: no error message, no logs, etc. conn.query behaves similarly.
Everything seems set up properly, but I don't even know where to look next to resolve the issue. Has anyone come across this before, or does anyone have advice on how to handle it?
Thanks!!
Ben
I believe the link that Baskar shared covers debugging your function locally.
As for your function, you can make some changes to improve performance:
Create the connection to the DB outside the function code; otherwise it will create a new instance and connect every time. Also, enable pooling to reuse connections and stay below the 300-connection limit of the sandbox in which Azure Functions run.
Use promises along with async/await.
You can basically update your code to something like this:
const mysql = require('mysql2/promise');
const fs = require('fs');

var config = {
  host: process.env['mysql_host'],
  user: process.env['mysql_user'],
  password: process.env['mysql_password'],
  port: 3306,
  database: 'database_name',
  ssl: {
    ca: fs.readFileSync(__dirname + '\\certs\\cacert.pem')
  },
  connectTimeout: 5000,
  connectionLimit: 250,
  queueLimit: 0
};

const pool = mysql.createPool(config);

module.exports = async function (context, req) {
  context.log('JavaScript HTTP trigger function processed a request.');
  if (req.query.fname || (req.body && req.body.fname)) {
    context.log('start');
    const conn = await pool.getConnection();
    context.log('mid');
    // with mysql2/promise, query() resolves to [results, fields] instead of taking a callback
    const [results, fields] = await conn.query('SELECT 1+1');
    context.log('here');
    context.log(results);
    context.log(fields);
    conn.release();
  }
};
PS: I haven't tested this code as such, but I believe something like this should work.
Debugging on serverless is challenging for obvious reasons. You can try one of the hacky solutions to debug locally (like Serverless Framework), but that won't necessarily help you if your issue is to do with a connection to a DB. You might see different behaviour locally.
Another option is to see if you can step debug using Rookout, which should let you catch the full stack at different points in the code execution and give you a good sense of what's failing and why.
I'm trying to build an auth system, and I have this app.js:
var express = require('express')
, MemoryStore = require('express').session.MemoryStore
, app = express();
app.use(express.cookieParser());
app.use(express.session({ secret: 'keyboard cat', store: new MemoryStore({ reapInterval: 60000 * 10 })}));
app.use(app.router);
and the routes index as:
var express = require('express')
  , mysql = require('mysql')
  , crypto = require('crypto')
  , app = module.exports = express();

app.get('/*', function (req, res) {
  var url = req.url.split('/');
  if (url[1] == 'favicon.ico')
    return;
  if (!req.session.user) {
    if (url.length == 4 && url[1] == 'login') {
      var connection = mysql.createConnection({
        host     : 'localhost',
        user     : 'user',
        password : 'pass',
      });
      var result = null;
      connection.connect();
      connection.query('use database');
      var word = url[3];
      var password = crypto.createHash('md5').update(word).digest("hex");
      connection.query('SELECT id,level FROM users WHERE email = "'+url[2]+'" AND password = "'+password+'"', function(err, rows, fields) {
        if (err) throw err;
        for (i in rows) {
          result = rows[i].level;
        }
        req.session.user = result;
      });
      connection.end();
    }
  }
  console.log(req.session.user)
When I access http://mydomain.com/login/user/pass the first time, the user shows up in that last console.log, but on a second access the cookie is clean.
Why do you not just use Express's session handling? If you use the Express command-line tool as express --sessions, it will create the project template with session support. From there you can copy the session lines into your current project. There is more information in How do sessions work in Express.js with Node.js? (which this looks like it may be a duplicate of).
As for sanitizing your SQL, you seem to be using the mysql library, which will sanitize your inputs for you if you use parameterized queries (i.e. ? placeholders).
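For example, the login query from the question could be rewritten with placeholders like this (same table and columns; the driver escapes the values for you):

```js
connection.query(
  'SELECT id, level FROM users WHERE email = ? AND password = ?',
  [url[2], password],
  function (err, rows, fields) {
    if (err) throw err;
    // ... same row handling as before
  }
);
```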
Final thing: you are using Express wrong (no offence). Express's router will let you split out a lot of your routes (and also lets you configure the favicon); see Unable to Change Favicon with Express.js (second answer).
Using the '/*' route just catches all GET requests, which greatly limits what the router can do for you.
(continued from comments; putting it here for code blocks)
Now that you have an app with session support, try these two routes:
app.get('/makesession', function (req, res) {
  req.session.message = 'Hello world';
  res.end('Created session with message: Hello world');
});

app.get('/getsession', function (req, res) {
  if (typeof req.session.message == 'undefined') {
    res.end('No session');
  } else {
    res.end('Session message: ' + req.session.message);
  }
});
If you navigate in your browser to /makesession, it will set a session message and notify you that it did. Now if you navigate to /getsession, it will send you back the session message if it exists, or else it will tell you that the session does not exist.
You need to save your cookie value in the response object:
res.cookie('session', 'user', result);
http://expressjs.com/api.html#res.cookie