node.js responds to only one HTTP request - mysql

I use node.js and the mysql module to write a simple SELECT statement.
The problem is that it only responds to one request; subsequent responses are empty.
When I load the page in a browser for the first time, it returns the complete result, but the browser keeps loading. What is happening?
Code:
var server = http.createServer(function (request, response) {
  response.writeHead(200, { "Content-Type": "text/plain" });
  client.query('SELECT * FROM ' + tbl, function selectDb(err, results, fields) {
    if (err) {
      throw err;
    }
    for (var i in results) {
      var result = results[i];
      response.write(result['CUSTOMERNAME']); // write each customer name to the browser
    }
    response.end("END RESULT");
    client.end();
  });
});

According to the node-mysql docs (which I assume you are using) found here,
client.end();
closes the MySQL connection.
When the next request comes in, there is no open connection; node-mysql does no connection pooling or auto-reconnect, it's all left up to you.
If you don't mind keeping a single connection open for the lifetime of the app (not the best design), you can just move that client.end() outside your connection handler.
Otherwise, write a small helper that checks for an open connection, or use a connection pool; see this post for more info.
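If you go the pooling route, the fix looks roughly like the sketch below. This is not the asker's exact code: the pool is stubbed with canned rows (an assumption, so the control flow runs without a database), and the table and column names are carried over from the question. In a real app the stub would be replaced by mysql.createPool(...) from the mysql package.

```javascript
// Stub standing in for require('mysql').createPool({ host, user, password, database }).
// Assumption: a callback-style pool.query like node-mysql's, answering with canned rows.
const pool = {
  query: function (sql, cb) {
    cb(null, [{ CUSTOMERNAME: 'Alice' }, { CUSTOMERNAME: 'Bob' }]);
  }
};

function handler(request, response) {
  response.writeHead(200, { 'Content-Type': 'text/plain' });
  // Each request issues its own query; there is no client.end() here,
  // so the next request still has a live connection available.
  pool.query('SELECT * FROM customers', function (err, results) {
    if (err) {
      response.end('database error'); // answer the request instead of throwing
      return;
    }
    results.forEach(function (row) {
      response.write(row.CUSTOMERNAME + '\n');
    });
    response.end('END RESULT');
  });
}
// In the real app: require('http').createServer(handler).listen(8080);
```

Because the pool hands out (and reclaims) connections per query, every request gets served, not just the first one.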

Related

ERROR Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client

I am trying to do authentication using MySQL and Node.js, and the terminal returns Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client. How can I solve it?
  res.json({
    text: 'protected'
  });
});

app.post('/api/add', (req, res) => {
  const sql = 'INSERT INTO users SET ?';
  const userOBJ = {
    users: req.body.name,
    fullname: req.body.fullname,
    email: req.body.email,
    telephone: req.body.telephone,
    password: req.body.password
  };
  connection.query(sql, userOBJ, err => {
    if (err) throw err;
    res.send('added customer');
  });
  jwt.sign({ user: userOBJ }, 'users', (err, token) => {
    res.json({
      token
    });
  });
});

function verifyToken(req, res, next) {
  const bearerHeader = req.headers['authorization'];
  if (typeof bearerHeader !== 'undefined') {
    const bearerToken = bearerHeader.split(' ')[1];
    req.token = bearerToken;
    next();
  } else {
    res.sendStatus(403);
  }
}
In the '/api/add' route, you're calling both res.send() and res.json() on the same request. That causes the error you see about headers having already been sent.
Pick one or the other; you can only send one response per HTTP request.
In addition, you have asynchronous sequencing issues and a lack of error handling on the database call. You need to nest the second block of code inside the database callback so the steps run in order, and you need to actually send an error response if your database returns an error.
Here you are using two responses in one API, res.send() and res.json(); please send only one response per request.
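A sketch of the repaired route, with the pieces nested as the answers above describe. The database call and the token signing are replaced by stubs with made-up behavior (assumptions, so the flow runs standalone); in the real app they would be connection.query from the mysql package and jwt.sign from jsonwebtoken.

```javascript
// Stubs standing in for connection.query and jwt.sign.
// Assumption: both call back synchronously with canned values.
function queryStub(sql, values, cb) { cb(null, { insertId: 1 }); }
function signStub(payload, secret, cb) { cb(null, 'fake.jwt.token'); }

function addUser(req, res) {
  const userObj = {
    users: req.body.name,
    fullname: req.body.fullname,
    email: req.body.email,
    telephone: req.body.telephone,
    password: req.body.password
  };
  queryStub('INSERT INTO users SET ?', userObj, function (err) {
    if (err) return res.status(500).send('database error'); // error response, not throw
    // Sign only after the insert succeeded, and send exactly ONE response.
    signStub({ user: userObj }, 'users', function (signErr, token) {
      if (signErr) return res.status(500).send('token error');
      res.json({ token }); // the single response for this request
    });
  });
}
```

Whichever path executes (success or either error), the request gets exactly one response, so ERR_HTTP_HEADERS_SENT cannot occur.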

How to put the results of MySQL database in a variable in Node.js?

I have a problem in my node.js application. I'm connecting to the database and getting the data
let users = {};
let currentUser = "example"; // this variable changes every time a user connects

connection.query('SELECT * FROM users WHERE name="' + currentUser + '"', function (err, result) {
  if (err) console.log(err);
  users[currentUser] = result[0];
});
console.log(users[currentUser]);
When I try to console.log the result[0] from inside the function, it returns this:
RowDataPacket {
n: 11,
id: 'VKhDKmXF1s',
name: 'user3',
status: 'online',
socketID: 'JbZLNjKQK15ZkzTXAAAB',
level: 0,
xp: 0,
reg_date: 2018-07-16T20:37:45.000Z }
I want to put that result from MySQL into users.example or something like that, but when I try the code it returns undefined. So I tried console.log(users[currentUser].id) as an example and it shows an error
TypeError: Cannot read property 'id' of undefined
So how do I put the data from the result inside my variable users[currentUser]?
So how do I put the data from the result inside my variable users[currentUser]?
That is already happening. The problem is how it is being tested.
The correct test puts the console.log inside the callback function (err, result):
function (err, result) {
  if (err) console.log(err);
  users[currentUser] = result[0];
  console.log(users[currentUser]);
});
Issue: users[currentUser] is still undefined outside that function
Well, yes and no. It is undefined in code that executes before the callback is fired.
And how do I fix that?
Anything that needs the result of the query must be executed from within the callback function, because that's the only code location where you know for certain that the data exists.
Well, I need that data outside in a global variable.
You can stick the query data in a global, but that doesn't solve the timing issue
of only accessing that global when it is defined and contains current data. That will cause lots of frustration.
If you don't want to call one or more specific functions to process the query data
within the callback, an alternative is to use a nodejs EventEmitter to coordinate the data production and data consumption.
Yet another alternative is to not use a callback function, and use Promise and/or async/await, both of which are supported by modern nodejs. This alternative doesn't involve global variables, but provides different ways to code the fact that some operations need to wait for the results of others.
connection.query is an asynchronous call. The console.log runs before the query fetches the data from the db.
Because you're using a callback, your console.log happens before the result comes back from the database. Additionally, you have a typo in your code: user -> users.
let users = {};
let currentUser = "example"; // this variable changes every time a user connects

connection.query('SELECT * FROM users WHERE name="' + currentUser + '"', function (err, result) {
  if (err) console.log(err);
  users[currentUser] = result[0];
  console.log(users[currentUser]);
});
Side note: research "SQL injection" before you use this code. The way you're concatenating the query string lets anyone access your database. Parameterized queries (e.g. connection.query('SELECT * FROM users WHERE name = ?', [currentUser], ...)) or a library like squel.js or knex.js will help you avoid this.
For an explanation of why things happen in the order they do, take a look at the JavaScript event loop.

How do pool.query() and pool.getConnection() differ on connection.release()?

As I understand it, every pool.query() costs a connection, which is automatically released when it ends, based on this comment on a GitHub issue. But what about nested queries performed using pool.getConnection()?
pool.getConnection(function (err, connection) {
  // First query
  connection.query('query_1', function (error, results, fields) {
    // Second query
    connection.query('query_2', function (error, results, fields) {
      // Release the connection
      // DOES THIS ALSO RELEASE query_1?
      connection.release();
      if (error) throw error;
      // you can't use connection any longer here..
    });
  });
});
UPDATE
Here is my code using a transaction when performing nested queries.
const pool = require('../config/db');

function create(request, response) {
  try {
    pool.getConnection(function (err, con) {
      if (err) {
        con.release();
        throw err;
      }
      con.beginTransaction(function (t_err) {
        if (t_err) {
          con.rollback(function () {
            con.release();
            throw t_err;
          });
        }
        con.query(`insert record`, [data], function (i_err, result, fields) {
          if (i_err) {
            con.rollback(function () {
              con.release();
              throw i_err;
            });
          }
          // get inserted record id.
          const id = result.insertId;
          con.query(`update query`, [data, id], function (u_err, result, fields) {
            if (u_err) {
              con.rollback(function () {
                con.release();
                throw u_err;
              });
            }
            con.commit(function (c_err) {
              if (c_err) {
                con.release();
                throw c_err;
              }
            });
            con.release();
            if (err) throw err;
            response.send({ msg: 'Successful' });
          });
        });
      });
    });
  } catch (err) {
    throw err;
  }
}
I added a lot of defensive error catching and con.release() calls since, at this point, I do not know how to properly release every connection that is active.
I also assumed that every con.query() inside pool.getConnection() costs a connection.
EDIT:
A connection is like a wire that connects your application to your database. Each time you call connection.query(), all you're doing is sending a message along that wire; you're not replacing the wire.
When you ask the pool for a connection, it will either give you a 'wire' it already has in place or create a new wire to the database. When you release() a pooled connection, the pool reclaims it, but keeps it in place for a while in case you need it again.
So a query is a message along the connection wire. You can send as many messages along as you want, it's only one wire.
Original Answer
pool.query(statement, callback) is essentially
const query = (statement, callback) => {
  pool.getConnection((err, conn) => {
    if (err) {
      callback(err);
    } else {
      conn.query(statement, (error, results, fields) => {
        conn.release();
        callback(error, results, fields);
      });
    }
  });
};
Ideally you shouldn't be worrying about connections as much as the number of round trips you're making. You can enable multiple statements in your pool config (multipleStatements: true on construction of the pool) and then take advantage of transactions.
BEGIN;
INSERT ...;
SELECT LAST_INSERT_ID() INTO #lastId;
UPDATE ...;
COMMIT;
It sounds like you are not closing the first query as quickly as you should.
Please show us the actual code. You do not need to hang onto the first query to get the insert id.
(After Update to Question:) I do not understand the need for "nesting". The code is linear (except for throwing errors):
BEGIN;
INSERT ...;
get insertid
UPDATE ...;
COMMIT;
If any step fails, throw an error. I see no need for two "connections". You are finished with the INSERT before starting the UPDATE, so I don't see any need for "nesting" SQL commands. And getting the insert id is a meta operation that does not involve a real SQL command.
I don't know Node.js, but looking at the code and the GitHub documentation, it is almost certain that pool.getConnection gets a connection from a connection pool and calls your function with the connection object obtained and any error encountered while getting it.
Within the function body we may use the connection object any number of times, but once it is released it is no longer usable, as it goes back to the pool (and presumably loses its reference to the underlying MySQL connection).
We have to release the connection object exactly once, and we must release it if we don't want to run out of free connections in the pool; otherwise, subsequent calls to pool.getConnection won't find any connection in the pool's "free" list, because they have all been moved to the "in use" list and never released.
Generally, after getting a connection from the connection pool, it may be used for any number of operations/queries and is released once to give it back to the pool's "free" list. That is how connection pooling generally works.
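The lifecycle described above can be sketched with a stubbed pool. The free/in-use counters and synchronous callbacks are assumptions made so the sketch runs standalone; the point is that two queries on one checked-out connection consume a single pooled connection and need exactly one release().

```javascript
// Stub pool with fake bookkeeping (assumption: callbacks fire synchronously
// so the counters can be inspected right after the calls).
const pool = {
  free: 2, inUse: 0, queriesSent: 0,
  getConnection: function (cb) {
    const self = this;
    self.free--; self.inUse++;
    cb(null, {
      query: function (sql, done) { self.queriesSent++; done(null, []); },
      release: function () { self.free++; self.inUse--; }
    });
  }
};

pool.getConnection(function (err, connection) {
  connection.query('query_1', function () {
    connection.query('query_2', function () { // second message on the same "wire"
      connection.release();                   // one release for the one connection
    });
  });
});
```

After this runs, two queries have gone out but the pool is back to two free connections: releasing is per connection, not per query.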

Showing that node shell is async and mongo shell is not

So I have MongoDB set up and I have some test data in it. I want to be able to show that the mongo shell runs our script synchronously and node runs our script asynchronously. I have set up the two following js files, which I got while doing a MongoDB University course. This is really more of a test so that I understand what's going on. I am going to cd into the directory where I have mongo installed using npm and where the scripts are, then call the scripts. I will call mongoshell.js using
>mongo mongoshell.js
and nodeshell.js using:
>node nodeshell.js
Here are the two scripts:
mongoshell.js
//Find one document in our collection
var doc = db.allClasses.findOne();
print('before');
//Print the result
printjson(doc);
print('after');
And the result I get from running that in the shell is:
So my thinking here is that the print command should return quicker than the query to mongo. By placing a print before and after, and everything coming out in the right order, it must be synchronous.
Next I have the nodeshell.js
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://127.0.0.1:27017/test', function (err, db) {
  if (err) throw err;
  // Find one document in our collection
  db.collection('allClasses').findOne({}, function (err, doc) {
    // Print the result
    console.dir(doc);
    // close the DB
    db.close();
  });
});

setTimeout(function () {
  console.dir("10 Milliseconds!");
}, 10);

setTimeout(function () {
  console.dir("100 Milliseconds!");
}, 100);
And the result from the console is:
My thinking here is that I have determined that mongo usually takes between 10 and 100 milliseconds to return my data. If I put two print commands with timeouts, one at 10 ms and one at 100 ms, one should fire before the JSON is returned BECAUSE THE NODE SHELL IS ASYNC, and the other should fire after.
MY QUESTION:
Does this hillbilly test actually show that each of the shells is what it is, synchronous and asynchronous? If yes, cool; if not, why?
I don't see how that trick with timeouts demonstrates the async nature. How about this?
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://127.0.0.1:27017/test', function (err, db) {
  if (err) throw err;
  // Find one document in our collection
  db.collection('allClasses').findOne({}, function (err, doc) {
    console.log("Got the data!");
    // Print the result
    console.dir(doc);
    // close the DB
    db.close();
  });
  console.log("Data is being fetched and I do something else");
});
console.log("Mongo connection is being set up and I do something else");
Output
sergio#soviet-russia ‹ master ●● › : ~
[0] % node test.js
Mongo connection is being set up and I do something else
Data is being fetched and I do something else
Got the data!
null
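The same ordering can be shown without MongoDB at all, since any asynchronous Node API behaves this way: a callback only runs after the surrounding synchronous code has finished. A dependency-free sketch:

```javascript
const order = [];

order.push('connection being set up');
setImmediate(function () {          // stands in for the driver's callback
  order.push('got the data');
  console.log(order.join(' -> '));
});
order.push('doing something else');
```

The "got the data" entry always lands last, even though setImmediate was called in the middle, which is exactly the reordering the node output above demonstrates.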

socketstream async call to mysql within rpc actions

First, I need to tell you that I am very new to the wonders of node.js, socketstream, angularjs and JavaScript in general. I come from a Java background, which might explain my ignorance of the correct way of doing things asynchronously.
To toy around with things, I installed the ss-angular-demo from americanyak. My problem is that the RPC seems to be a synchronous interface, while my call to the MySQL database has an asynchronous interface. How can I return the database results from a call to the RPC?
Here is what I did so far with socketstream 0.3:
In app.js I successfully tell ss to allow my MySQL database connection to be accessed by putting ss.api.add('coolStore', mysqlConn); in there at the right place (as explained in the socketstream docs). I use the mysql npm package, so I can call MySQL within the RPC.
server/rpc/coolRpc.js
exports.actions = function (req, res, ss) {
  // use session middleware
  req.use('session');
  return {
    get: function (threshold) {
      var sql = "SELECT cool.id, cool.score, cool.data FROM cool WHERE cool.score > " + threshold;
      if (!ss.arbStore) {
        console.log("connecting to mysql arb data store");
        ss.coolStore = ss.coolStore.connect();
      }
      ss.coolStore.query(sql, function (err, rows, fields) {
        if (err) {
          console.log("error fetching stuff", err);
        } else {
          console.log("first row = " + rows[0].id);
        }
      });
      var db_rows = ???
      return res(null, db_rows || []);
    }
  };
};
The console logs the id of my database entry, as expected. However, I am clueless how I can make the Rpc's return statement return the rows of my query. What is the right way of addressing this sort of problem?
Thanks for your help. Please be friendly with me, because this is also my first question on stackoverflow.
It's not synchronous. When your results are ready, you can send them back:
exports.actions = function (req, res, ss) {
  // use session middleware
  req.use('session');
  return {
    get: function (threshold) {
      ...
      ss.coolStore.query(sql, function (err, rows, fields) {
        res(err, rows || []);
      });
    }
  };
};
You need to make sure that you always call res(...) from an RPC function, even when an error occurs, otherwise you might get dangling requests (where the client code keeps waiting for a response that's never generated). In the code above, the error is forwarded to the client so it can be handled there.