One response after a few async functions - MySQL

I have a web page with a form where a user can edit personal info, education, work history, etc.
The user can add more than one degree (for example: BS, MS, PhD) and several job positions as well.
When the user pushes the 'save' button I send all of this data to my server in one request. On the server I have an endpoint to handle the request:
app.post(config.version + '/profile', (req, res, next) => {});
There I run a few MySQL queries to insert/update/delete data. I use the mysql package from npm to do that.
new Promise((resolve, reject) => {
  const userQuery = `INSERT INTO user ...;`;
  const degreesQuery = 'INSERT INTO degree ...;';
  const positionsQuery = 'UPDATE position SET ...;';

  this.connection.query(userQuery, err => {});
  this.connection.query(degreesQuery, err => {});
  this.connection.query(positionsQuery, err => {});

  resolve({});
})
At the end I call resolve({}), but I actually want to select the updated profile and send it back (because in the MySQL tables for degrees I add ids that help me avoid inserting duplicate data again). So my question is: how do I call resolve({}) only when all of my async this.connection.query calls have finished?

My suggestion is to run all the queries in a Promise.all().
Example:
const queries = [
  `INSERT INTO user ...;`,
  'INSERT INTO degree ...;',
  'UPDATE position SET ...;'
];
Promise.all(queries.map((query) => {
  return new Promise((resolve, reject) => {
    this.connection.query(query, err => {
      return err ? reject(err) : resolve();
    });
  });
}))
.then(() => {
  // continue
  // get your updated data here and send it as the response
})
If your db library has support for Promises, you can write it this way:
Promise.all(queries.map((query) => {
  return this.connection.query(query);
}))
.then(() => {
  // continue
  // get your updated data here and send it as the response
})
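For completeness, here is a minimal sketch of how the whole route handler could look with the callback-based mysql package, so the final SELECT and the response only happen after every query has finished. This is only an illustration: connection stands in for the mysql connection used in the question, and the SQL strings are placeholders.

// Sketch only: `connection` is the mysql connection from the question; queries are placeholders.
const runQuery = (sql) => new Promise((resolve, reject) => {
  connection.query(sql, (err, result) => (err ? reject(err) : resolve(result)));
});

app.post(config.version + '/profile', (req, res, next) => {
  const queries = [
    `INSERT INTO user ...;`,
    'INSERT INTO degree ...;',
    'UPDATE position SET ...;'
  ];

  Promise.all(queries.map(runQuery))
    // runs only once every insert/update above has finished
    .then(() => runQuery('SELECT ... FROM user ...;')) // placeholder: fetch the updated profile
    .then((rows) => res.json(rows))                    // send the fresh profile back
    .catch(next);                                      // hand errors to the Express error handler
});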

Related

How to delete both an element and the associated elements in a join table with express.js and react.js

I have a project with two main tables, Contacts and Workers, and a join table called WorkerContacts.
In my project I give the user the option of deleting contacts, which also requires deleting the corresponding rows in the join table. My concern with my current setup (seen below) is that I could successfully delete a contact but then fail to delete the associated join-table rows because of an error, which would throw everything off. So my question is: is there a way to refactor this so that both deletions are guaranteed to complete together before the promise is sent back to the front end?
Here's my current situation:
Frontend:
export const destroyContact = (contact_id) => dispatch => {
  axios.post(`http://localhost:3001/contacts/destroy`, {id: contact_id})
    .then(() => {
      dispatch({type: 'CONTACT_DESTROYED', payload: contact_id});
      // I'm scared that the first thing will run but the second one won't, causing a lot of problems.
      // We can deal with this by just throwing a big error message for the user hopefully
      axios.post(`http://localhost:3001/workerContacts/destroy`, {id: contact_id})
        .then(() => {
          dispatch({type: 'JOIN_TABLE_ROWS_DESTROYED', payload: contact_id});
        })
        .catch(err => dispatch({type: 'ERROR_CAUGHT', payload: {err_message: err.response.data.message, err_code: err.response.request.status, err_value: err.response.request.statusText}}))
    })
    .catch(err => dispatch({type: 'ERROR_CAUGHT', payload: {err_message: err.response.data.message, err_code: err.response.request.status, err_value: err.response.request.statusText}}))
}
I'm using Redux as well, which is why I have all of the dispatch calls, but essentially I've split the deletions into two axios calls: one that deletes the contact and one that deletes the join-table rows.
Backend:
For the contact I have this:
export const destroy = (req, res) => {
  // Here is where we want to remove an existing contact
  Contact.deleteMe(req.body.id)
    .then(() => res.json("Contact deleted"))
    .catch((err) => res.status(500).json({message: "Something went wrong when trying to delete this. Try reloading the page and try again."}))
}
And the associated deleteMe function:
static deleteMe(customer_id){
  // Uses SQL to delete an individual customer element
  return db.execute('DELETE FROM contacts WHERE id = ?', [customer_id]);
}
For the jointable, I have this:
export const destroy = (req, res) => {
  // Here is where we want to remove the join-table rows for a contact
  JoinTable.deleteMe(req.body.id)
    .then(() => res.json("Join tables deleted"))
    .catch(err => res.status(500).json({message: "Something went wrong on our end. Try to reload the page and start again"}))
}
And the associated deleteMe function:
static deleteMe(customer_id){
  // Uses SQL to delete the join-table rows for an individual customer
  return db.execute('DELETE FROM workercontacts WHERE workerContacts.contact_id = ?', [customer_id]);
}
I'm using a MySQL database if that helps.
Hopefully this is enough information, but if you require more, I can definitely provide you with it.
Just use a single call and execute the DELETE commands in a transaction:
export const destroyContact = (contact_id) => (dispatch) => {
  axios
    .post(`http://localhost:3001/contacts/destroy`, { id: contact_id })
    .then(() => {
      dispatch({ type: 'CONTACT_DESTROYED', payload: contact_id });
      dispatch({ type: 'JOIN_TABLE_ROWS_DESTROYED', payload: contact_id });
    })
    .catch((err) =>
      dispatch({
        type: 'ERROR_CAUGHT',
        payload: {
          err_message: err.response.data.message,
          err_code: err.response.request.status,
          err_value: err.response.request.statusText,
        },
      })
    );
};
On the backend:
static async deleteMe(customer_id) {
  await db.execute('START TRANSACTION');
  try {
    await db.execute('DELETE FROM contacts WHERE id = ?', [customer_id]);
    await db.execute('DELETE FROM workercontacts WHERE workerContacts.contact_id = ?', [customer_id]);
    await db.execute('COMMIT');
  } catch (err) {
    await db.execute('ROLLBACK');
    throw err; // re-throw so the route handler still answers with a 500 after a rollback
  }
}
...
export const destroy = (req, res) => {
  // Here is where we want to remove an existing contact (and its join-table rows)
  Contact.deleteMe(req.body.id)
    .then(() => res.json("Contact deleted"))
    .catch((err) => res.status(500).json({message: "Something went wrong when trying to delete this. Try reloading the page and try again."}))
}
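One hedged caveat to the above: if db is a mysql2 connection pool rather than a single connection, db.execute('START TRANSACTION') and the following statements may run on different pooled connections, which would silently break the transaction. A sketch of the same idea using one dedicated connection, assuming the mysql2/promise API, could look like this:

// Sketch only: assumes `db` is a mysql2/promise pool (mysql.createPool(...).promise()).
static async deleteMe(customer_id) {
  const conn = await db.getConnection();      // pin one connection for the whole transaction
  try {
    await conn.beginTransaction();
    await conn.execute('DELETE FROM workercontacts WHERE contact_id = ?', [customer_id]); // child rows first
    await conn.execute('DELETE FROM contacts WHERE id = ?', [customer_id]);
    await conn.commit();
  } catch (err) {
    await conn.rollback();                    // undo both deletes if either fails
    throw err;                                // let the route handler respond with a 500
  } finally {
    conn.release();                           // always return the connection to the pool
  }
}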

How do I change nodejs admin hashed password or add another admin account?

I have a Node.js application I'm working on; unfortunately, the wrong email address was used for the admin account and we can't remember the password.
The hash type used is bcrypt: $2a$09$OczLcHx7lZQd1cgbLLmTrewUUx.nwEoZAuDembLxXI00mVEobyQZ6
I would like to know if there is a way to change the password or add a new admin account entirely.
Things I've done to try to resolve the admin login issue:
I have tried modifying the password in the MySQL table via phpMyAdmin and also changing the admin email address, but the Node.js app still doesn't recognize the changes.
This is my password.js file
const bcrypt = require("bcryptjs");
const db = require("../models/db");

exports.hashPassword = (password) => {
  return new Promise((resolve, reject) => {
    const newHashPassword = bcrypt.hash(password, 9);
    resolve(newHashPassword)
  })
}

exports.comparePassword = (password, hashPassword) => {
  return new Promise((resolve, reject) => {
    const same = bcrypt.compareSync(password, hashPassword);
    resolve(same)
  })
}

//SAVE RESET PASSWORD TOKEN TO DB
exports.saveResetPasswordIntoDB = (obj) => {
  return new Promise((resolve, reject) => {
    db.query("INSERT INTO f_password_reset_token SET ?", obj, (err, data) => {
      if (err) reject(err)
      else resolve(data)
    })
  })
}

//DELETE USER RESET PASSWORD TOKENs FROM DB
exports.deleteResetPasswordByUserId = (userId) => {
  return new Promise((resolve, reject) => {
    db.query("DELETE FROM f_password_reset_token WHERE token_user = ?", parseInt(userId), (err, data) => {
      if (err) reject(err)
      else resolve(data)
    })
  })
}

//DELETE ALL RESET PASSWORD TOKENs FROM DB
exports.deleteAllResetPassword = () => {
  return new Promise((resolve, reject) => {
    db.query("DELETE FROM f_password_reset_token", (err, data) => {
      if (err) reject(err)
      else resolve(data)
    })
  })
}

//CHECK IF USER HAS A TOKEN
exports.checkPasswordTokenByUserId = (userId, token) => {
  return new Promise((resolve, reject) => {
    db.query("SELECT * FROM f_password_reset_token WHERE token_user = ? AND token_code = ?", [parseInt(userId), token], (err, data) => {
      if (err) reject(err)
      else resolve(data[0])
    })
  })
}
If it's not in production, why not just temporarily override the bcrypt.compareSync call inside the comparePassword function? According to the npm docs, bcrypt.compareSync returns true or false, so this is a quick-and-dirty one-liner that lets you bypass the password check. It's not the nicest way, success is not guaranteed, and you certainly should not do this in production, but it could help you get around the wrong password so you can log in and then reset/change your own password. Example:
exports.comparePassword = (password, hashPassword) => {
  return new Promise((resolve, reject) => {
    const same = true; //bcrypt.compareSync(password, hashPassword);
    resolve(same)
  })
}
Again, whether this works depends on the rest of the authorisation flow, but in theory every submitted password now counts as a matching password. You may then be able to simply log in, go to your account, and change your password so you can log in normally next time.
If you absolutely have to do it while in production, at least do it a little more safely... The variable password appears to be plaintext, so you could do something like:
const tmpPwd = [temp_pwd];      // temporary "root" password (placeholder)
let same = tmpPwd === password; // true only when the submitted password is our temp password
if (!same) {                    // otherwise fall back to the regular check for regular users
  same = bcrypt.compareSync(password, hashPassword);
}
resolve(same)
Be cautious with this: you wouldn't be the first developer to use a dirty trick like this, forget about it, and later find out there was a data breach. Most breaches I work on involve a developer forgetting something.
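As a hedged aside, not part of the original answer: if the value pasted into the password column via phpMyAdmin was plaintext rather than a bcrypt hash, compareSync will always fail. A one-off script like the sketch below, reusing the same bcryptjs setup as password.js, prints a valid hash that can be pasted into the admin row; the cost factor 9 matches the $2a$09$ prefix shown above. The filename and table/column names are placeholders.

// One-off sketch: generate a bcrypt hash for a known password so it can be
// pasted into the admin user's password column.
const bcrypt = require("bcryptjs");

const newPassword = process.argv[2]; // run as: node make-hash.js "MyNewPassword"
bcrypt.hash(newPassword, 9).then((hash) => {
  console.log(hash); // e.g. $2a$09$...
});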

Bulk insert with mysql2 and NodeJs throws 500

I have a method with which I want to bulk insert into MySQL. I am using Node.js and mysql2.
My method:
createWorklog = async ({ sqlArray }) => {
  const sql = `INSERT INTO ${this.tableName}
    (project_id, user_id, date, duration, task, description) VALUES ?`
  const result = await query(sql, [sqlArray])
  const affectedRows = result ? result.affectedRows : 0;
  return affectedRows;
}
Here sqlArray is an array of arrays, where all the child arrays have the same length.
And the query method that is called in this method is the following:
query = async (sql, values) => {
  return new Promise((resolve, reject) => {
    const callback = (error, result) => {
      if (error) {
        reject(error);
        return;
      }
      resolve(result);
    }
    // execute will internally call prepare and query
    this.db.execute(sql, values, callback);
  }).catch(err => {
    const mysqlErrorList = Object.keys(HttpStatusCodes);
    // convert mysql errors which are in the mysqlErrorList to http status codes
    err.status = mysqlErrorList.includes(err.code) ? HttpStatusCodes[err.code] : err.status;
    throw err;
  });
}
My problem is that the body parameters are OK (as I said, an array of arrays) but the method throws a 500.
Could this be because of the execute command in mysql2, or is it another mistake?
Thank you for your time!
EDIT
I changed my method from using execute to query based on Gaurav's answer below and it's working well.
This is a known issue with the execute and query methods in mysql2.
I've found a working alternative.
createWorklog = async ({ sqlArray }) => {
  const sql = `INSERT INTO ${this.tableName}
    (project_id, user_id, date, duration, task, description) VALUES ?`
  const result = await query(sql, [sqlArray], true) // adding true for multiple insert
  const affectedRows = result ? result.affectedRows : 0;
  return affectedRows;
}
Then query can be written as below
query = async (sql, values, multiple) => {
  return new Promise((resolve, reject) => {
    const callback = ...
    if (multiple) this.db.query(sql, values, callback);
    else this.db.execute(sql, values, callback);
  }).catch(err => {
    ...
    ...
  });
}
More info regarding this issue can be found here https://github.com/sidorares/node-mysql2/issues/830
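For reference, a minimal sketch of the nested-array shape the VALUES ? placeholder expects; the column values below are made-up sample data and connection stands in for your mysql2 connection. The bulk expansion only works with query (text protocol), not execute (prepared statements), which is why the switch above helps.

// Hypothetical sample rows matching (project_id, user_id, date, duration, task, description)
const sqlArray = [
  [1, 10, '2020-01-06', 480, 'DEV-123', 'Implemented login form'],
  [1, 11, '2020-01-06', 240, 'DEV-124', 'Code review'],
];

// query expands the nested array into a multi-row VALUES list, roughly:
// INSERT INTO worklog (...) VALUES (1, 10, '2020-01-06', ...), (1, 11, '2020-01-06', ...)
connection.query(
  'INSERT INTO worklog (project_id, user_id, date, duration, task, description) VALUES ?',
  [sqlArray],
  (err, result) => {
    if (err) throw err;
    console.log(result.affectedRows); // 2
  }
);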

node js mysql rest api - queue http requests - persist data

What is the best architectural design for the following scenario?
I want to build a CRUD microservice using Node, Express and MySQL. The CREATE portion is quite complex due to a large piece of JSON with many relational properties in each HTTP POST request. The request.body looks something like this:
{
  key1: string    <-- saved as foreign_key
  key2: {...}
  key3: int
  key4: [         <-- saved as n:m with corresponding table
    {...},
    {...},
  ]
  ...
  ...
  keyXYZ: ...
  key46: int      <-- saved as foreign_key
  key47: string
}
The module which does all query-operations looks like this:
persistData = async (data, dbConnection) => {
  const idSomething1 = await fetchOrCreateSomething1(data.key1).catch(err => console.log(err));
  const idSomething2 = await fetchOrCreateSomething2(data.key46).catch(err => console.log(err));
  const idSomething3 = await fetchOrCreateSomething3(data.keyXYZ).catch(err => console.log(err));
  const idManyThings = await fetchOrCreateManyThings(idSomething1, idSomething2, idSomething3, data.moreStuff...).catch(err => console.log(err));
}
All fetchOrCreateSomethingX = async () => {} functions are async to let the main function persistData wait for a newly created or retrieved record id.
This is wrapped inside an exported constructor function:
function DataHandler(data, res) {
  db.getConnection()
    .then((dbConnection) => {
      persistData(data, dbConnection)
        .then(() => {
          dbConnection.release();
        });
    });
}

module.exports = DataHandler;
The endpoint does the following:
const createFunc = (req, res) => {
  new DataHandler(req.body, res);
};

app.post("/create", createFunc);
I know that the last part in particular does not work, because the new DataHandler object is overwritten as soon as the endpoint gets hit again: if the persisting process hasn't finished before the next request arrives, the data from the first request is lost. I also know that Express won't be able to send back responses this way, which isn't ideal. If the new DataHandler were stored in a new variable or const, at least both processes would run, but the main problem is that the data gets shuffled, because the persistData() calls run in parallel without being encapsulated from each other.
I can't find any example or best practice for how to design this well. Any hint or resource would be great!
Is a queuing system like the kue library the way to go?
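As a hedged aside on the concurrency worry above (not an answer from the thread): each Express request gets its own handler invocation, so one common pattern is simply to await persistData per request and send the response from the handler itself. A minimal sketch, assuming the same db, persistData and Express setup as in the question and that persistData does not yet return anything:

// Sketch only: every request gets its own connection and its own persistData call,
// so concurrent POSTs don't share state; errors end up at the Express error handler.
const createFunc = async (req, res, next) => {
  let dbConnection;
  try {
    dbConnection = await db.getConnection();
    await persistData(req.body, dbConnection); // each request awaits its own persistData call
    res.sendStatus(201);                       // respond only after persisting has finished
  } catch (err) {
    next(err);                                 // let Express report the failure
  } finally {
    if (dbConnection) dbConnection.release();  // always give the connection back
  }
};

app.post("/create", createFunc);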

how do I force PouchDB to really delete records?

I am creating a PouchDB like so:
var db = new PouchDB('my_db',
{ auto_compaction: true, revs_limit: 1, adapter: 'websql' });
Then I create and delete a number of records:
db.put({ _id: '1'});
db.put({ _id: '2'});
db.put({ _id: '3'});

db.get('1')
  .then(function(doc) {
    db.remove(doc)
  });
db.get('2')
  .then(function(doc) {
    db.remove(doc)
  });
db.get('3')
  .then(function(doc) {
    db.remove(doc)
  });
From my reading of the documentation, this is the correct way to delete and remove records.
And this SO question and answer also seems to suggest that this is the way to do things.
However, if I use the Chrome inspector to look at my Web SQL DB, the records are still there.
I don't believe this is a timing issue or anything like that, as I can refresh with just the delete code and then get a 404 not_found error.
My application creates and keeps records in a local PouchDB until they have been synced to a central server, at which time I want to clear them from the local database.
I'm creating lots of records, and if I cannot clear them out then eventually I'm going to run out of space on the device (it is a hybrid HTML5 mobile app).
Is it even possible to actually remove records from a local PouchDB?
If so, how do I do it?
If not, what is a good solution that I can easily swap in place of PouchDB?
(I'm really hoping it is possible because I've gone down this path of development, so if the answer to the first question is No, then I need a good answer to the third question)
As mentioned in the comments above, this is not yet possible but is being worked on (source 1, source 2). However, there is a workaround which you might be able to use.
The workaround is to replicate the database locally to another PouchDB database and, once the replication is complete, delete the original database. Deleted documents won't be replicated (source).
Here is a working demo:
(() => {
  // DECLARATION
  const dbName = 'testdb';
  const tmpDBName = 'tmpdb';
  const deleteFilter = (doc, req) => !doc._deleted;
  const doc1 = { _id: 'd1' };
  const doc2 = { _id: 'd2' };

  // CREATION
  // create database
  const maindb = new PouchDB(dbName);
  // insert two documents
  maindb.post(doc1)
    .then(() => maindb.post(doc2))
    // query for one document
    .then(() => maindb.get(doc1._id))
    // delete this document
    .then((doc) => { console.log(doc); return maindb.remove(doc) })
    // query for the same document
    .then(() => maindb.get(doc1._id))
    .catch((err) => { console.log(err) });

  // CLEANUP
  // delete a database with tmpdb name
  new PouchDB(tmpDBName).destroy()
    // create a database with tmpdb name
    .then(() => Promise.resolve(new PouchDB(tmpDBName)))
    // replicate original database to tmpdb with filter
    .then((tmpDB) => new Promise((resolve, reject) => {
      maindb.replicate.to(tmpDB, { filter: deleteFilter })
        .on('complete', () => { resolve(tmpDB) })
        .on('denied', reject)
        .on('error', reject)
    }))
    // destroy the original db
    .then((tmpDB) => {
      console.log(tmpDB.name);
      return maindb.destroy().then(() => Promise.resolve(tmpDB))
    })
    // create the original db
    .then((tmpDB) => new Promise((resolve, reject) => {
      console.log(tmpDB.name);
      try {
        resolve({ db: new PouchDB(dbName), tmpDB: tmpDB })
      } catch (e) {
        reject(e)
      }
    }))
    // replicate the tmpdb to original db
    .then(({db, tmpDB}) => new Promise((resolve, reject) => {
      tmpDB.replicate.to(db)
        .on('complete', () => { resolve(tmpDB) })
        .on('denied', reject)
        .on('error', reject)
    }))
    // destroy the tmpdb
    .then((tmpDB) => tmpDB.destroy())
    .then(() => { console.log('Cleanup complete') })
    .catch((err) => { console.log(err) });
})()
If you check the state of the database after executing this code, it'll contain only one document. Note that at times, I had to refresh the browser to be able to see the latest state of the database (a right click + Refresh IndexedDB wasn't enough).
If you want to cleanup the database while testing this, you can use this snippet:
['testdb', 'tmpdb'].forEach((d) => { new PouchDB(d).destroy() })