I'm trying to validate a date field based on information stored in the database through another model.
When I test the API, the validation works correctly and throws the exception; however, the insertion happens before this exception. That is, it does not prevent the insert into the database. Where did I go wrong?
This is my validate function:
module.exports = (sequelize, DataTypes) => {
  const Step = sequelize.define('Step', {
    ...
    resultDate: {
      type: DataTypes.DATE,
      validate: {
        isEven(value){
          sequelize.models.Call
            .findById(this.call_id)
            .then(call => {
              if(value >= call.endingDate) throw new Error('Error message here!');
            });
...
And this is the result:
Executing (default): SELECT [...] `Call`.`id` = '19c7e81e-5c23-4fd5-8623-0170deee6cd4');
Executing (default): INSERT INTO `Steps` [...];
Unhandled rejection Error message here!
Clearly, the initial SELECT is run for the validation; however, before the validation can finish and throw the exception, the API inserts the row into the database and already returns success!
How do I ask the model to wait for all validations before inserting?
By changing the custom validator's arity (the second argument is a callback) you turn it into an asynchronous handler that Sequelize waits for. So your code should look like this:
module.exports = (sequelize, DataTypes) => {
  const Step = sequelize.define('Step', {
    ...
    resultDate: {
      type: DataTypes.DATE,
      validate: {
        isEven(value, next){
          sequelize.models.Call
            .findById(this.call_id)
            .then(call => {
              next(value >= call.endingDate ? 'Error message here!' : null)
            })
            .catch(next);
...
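The root problem in the original snippet is that the validator starts the findById query but never returns the promise (and has no callback), so Sequelize has nothing to wait on before running the INSERT. As a minimal sketch of an alternative, assuming a Sequelize version that awaits promise-returning custom validators, simply returning the promise also works:

isEven(value) {
  // Returning the promise lets the validation pipeline wait for it
  // (assumes your Sequelize version supports promise-returning validators).
  return sequelize.models.Call
    .findById(this.call_id)
    .then(call => {
      if (value >= call.endingDate) throw new Error('Error message here!');
    });
}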
Related
I am working on an application using NextJS and Typescript and am attempting to determine the best way to properly type my MySQL responses. Here is the API endpoint:
import { hash } from "bcrypt";
import type { NextApiRequest, NextApiResponse } from "next";
import randomstring from "randomstring";
import { executeQuery } from "../../../lib/db";
const Test = async (req: NextApiRequest, res: NextApiResponse) => {
  // Manage password generation
  const password = randomstring.generate(16);
  const hashedPassword = hash(password, 10);
  // Create new auth using email and password
  const auth = await executeQuery(
    "INSERT INTO auth (email, password) VALUES (?, ?)",
    ["test#test.com", (await hashedPassword).toString()]
  );
  res.statusCode = 200;
  res.json(auth.insertId);
};
export default Test;
I want to strongly type insertId to remove all warnings and errors from ESLint, but unfortunately every effort I have made has been unsuccessful. The error I am getting is:
Property 'insertId' does not exist on type 'RowDataPacket[] | RowDataPacket[][] | OkPacket | OkPacket[] | ResultSetHeader | { error: unknown; }'.
Property 'insertId' does not exist on type 'RowDataPacket[]'.ts(2339)
My executeQuery function is defined as:
import mysql from "mysql2/promise";
export const executeQuery = async (query: string, params: unknown[] = []) => {
  try {
    const db = await mysql.createConnection({
      host: process.env.MYSQL_HOST,
      database: process.env.MYSQL_DATABASE,
      user: process.env.MYSQL_USER,
      password: process.env.MYSQL_PASSWORD,
    });
    const [results] = await db.execute(query, params);
    db.end();
    return results;
  } catch (error) {
    return { error };
  }
};
One of my implementation attempts was this SO response but I could not get it to work...
Any and all help is greatly appreciated!
So, I managed to solve my own problem after tackling it for a while.
It turns out, I was doing my checks incorrectly.
Before calling auth.insertId, you'll want to include the following check:
if (!auth || !("insertId" in auth)) {
  // Do something
}
That way, you don't actually have to type anything, because it can get super complicated when you are attempting to do that with mysql.
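For context, here is a minimal sketch of the full flow with that guard in place (the early return and the 500 response are my own assumptions about what "Do something" should be):

const auth = await executeQuery(
  "INSERT INTO auth (email, password) VALUES (?, ?)",
  ["test#test.com", (await hashedPassword).toString()]
);

// The guard removes RowDataPacket[] and { error } from the union,
// so TypeScript knows insertId exists on what remains (OkPacket / ResultSetHeader).
if (!auth || !("insertId" in auth)) {
  res.statusCode = 500;
  return res.json({ error: "Insert failed" });
}

res.statusCode = 200;
res.json(auth.insertId); // no more TS2339 here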
I ran into the same problem as you. After looking through the mysql2 type definition files, I found that you can pass the expected result type of the query.
For example:
connection.query<OkPacket>('INSERT INTO posts SET ?', {title: 'test'},
  function (error, results, fields) {
    if (error) throw error;
    console.log(results.insertId);
  });
For inserts and updates you can use the OkPacket type, and for selects you can use RowDataPacket. You can type it even further by extending RowDataPacket with the expected response type of the query and passing that in the query function.
For example:
export interface Post {
title: string;
}
export interface PostRow extends RowDataPacket, Post {}
Then when you are querying you can pass as follows:
connection.query<PostRow[]>('SELECT * FROM posts WHERE id = 1',
  function (error, results, fields) {
    if (error) throw error;
    // ...
  });
Here are my 2 cents; hope it helps someone in the future.
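Applied to the executeQuery helper from the question, one possible sketch looks like this (the generic parameter, the ResultSetHeader choice and the omitted try/catch are my own assumptions based on the mysql2 typings, not something confirmed in the thread):

import mysql from "mysql2/promise";
import type { ResultSetHeader, RowDataPacket } from "mysql2/promise";

// Hypothetical generic wrapper: the caller decides what result shape to expect.
export const executeQuery = async <T extends RowDataPacket[] | ResultSetHeader>(
  query: string,
  params: unknown[] = []
) => {
  const db = await mysql.createConnection({
    host: process.env.MYSQL_HOST,
    database: process.env.MYSQL_DATABASE,
    user: process.env.MYSQL_USER,
    password: process.env.MYSQL_PASSWORD,
  });
  const [results] = await db.execute<T>(query, params);
  await db.end();
  return results; // typed as T
};

// Usage: insertId is now known to exist on the result.
// const auth = await executeQuery<ResultSetHeader>(
//   "INSERT INTO auth (email, password) VALUES (?, ?)", [email, password]);
// auth.insertId;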
I have this relationship
User -> Cartlists <- Product
I am trying to show the products I have in my cartlists by querying them like this, using the where function:
router.get('/cart', ensureAuthenticated, async function (req, res) {
  let cartlists = await Cartlist.findAll()
  Product.findAll({
    include: User,
    where: { productId: cartlists.productProductId },
    raw: true
  })
    .then((cartlists) => {
      res.render('checkout/cart', { cartlists });
    })
    .catch(err =>
      console.log(err));
})
However, when I debugged it, it showed me that cartlists.productProductId is undefined even though there is a value in MySQL. I also get an error message saying
'WHERE parameter "productId" has invalid "undefined" value'
Does anyone know why it gives me an undefined value even though there are indeed values in the MySQL table, and how I might be able to fix this? Or how can I query just the products I have added to my cartlists table?
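For what it's worth, Cartlist.findAll() returns an array of rows, so reading productProductId directly off the array is undefined, which is what the where clause then receives. A minimal sketch of one way around that (the productProductId attribute name and the Op.in usage are assumptions based on the code above, not a confirmed answer from the thread):

const { Op } = require('sequelize');

router.get('/cart', ensureAuthenticated, async function (req, res) {
  try {
    // findAll returns an array, so collect the product ids first
    const cartlists = await Cartlist.findAll({ raw: true });
    const productIds = cartlists.map(item => item.productProductId);

    const products = await Product.findAll({
      include: User,
      where: { productId: { [Op.in]: productIds } },
      raw: true
    });

    res.render('checkout/cart', { cartlists: products });
  } catch (err) {
    console.log(err);
  }
});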
We are trying to prevent SQL injection by implementing bind parameters in our Sequelize code.
The Session table has been created with SessionId as a VARCHAR(36) field that holds a UUID value
We call the function below and pass the sessionId in as a string. When the sessionId starts with a numeric character the function works; when it starts with an alpha character it returns null.
e.g. sessionId = '0b0885f7-110f-4a11-9432-74b0ef8940a7' works
and sessionId = 'acaf8037-be47-454b-a41d-f62cdc1e08ac' fails
We are running a MySQL '5.7.30-0ubuntu0.16.04.1' database.
getSession = (sessionId) => {
  return Session.findOne({
    where: {
      SessionId: sequelize.literal('$1')
    },
    bind: [sessionId],
  })
    .then((session) => {
      if (session === null) {
        throw new Error('Session doesnt exist');
      } else {
        return session.get({plain: true});
      }
    })
    .catch(err => err);
};
Any help/suggestions would be much appreciated
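One observation (not a definitive diagnosis of the numeric-vs-alpha behaviour): when a plain value is used in a where object, Sequelize escapes it itself, so a literal bind placeholder isn't required for injection safety. A minimal sketch of that simpler form, assuming the same model and error handling as above:

getSession = (sessionId) => {
  return Session.findOne({
    // Sequelize escapes plain where values itself, so the UUID string
    // is not interpolated into the SQL unescaped.
    where: { SessionId: sessionId },
  })
    .then((session) => {
      if (session === null) {
        throw new Error('Session doesnt exist');
      }
      return session.get({ plain: true });
    })
    .catch(err => err);
};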
I'm trying to insert a simple user object into a mysql database using sequelize orm.
User model image
body payload & error image
Insert code:
try {
  const { body } = req
  const user = await User.create(body) // It's breaking here :(
  const userJson = user.toJSON()
  res.send({
    user: userJson,
    token: jwtSignUser(userJson)
  })
} catch (err) {
  res.status(400).send({
    error: 'Something went wrong!'
  })
}
I believe that you have not defined the hooks correctly. They are supposed to be functions and I do not see hashPassword being defined in the code that has been shared.
From the docs
hooks: {
  beforeValidate: (user, options) => {
    user.mood = 'happy';
  },
  afterValidate: (user, options) => {
    user.username = 'Toni';
  }
}
Make sure the reference to the hashPassword function declaration is correct if it's in a different file :)
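In case it helps, here is a minimal sketch of what a working hashPassword hook might look like (the beforeCreate/beforeUpdate hook names, the password field and the bcrypt salt rounds are my assumptions; the original hook definition isn't shown in the question):

const bcrypt = require('bcrypt');

// Hypothetical hook: replaces the plain-text password with a bcrypt hash before saving
const hashPassword = async (user, options) => {
  // Only hash when the password field was actually set/changed
  if (user.changed('password')) {
    user.password = await bcrypt.hash(user.password, 10);
  }
};

const User = sequelize.define('User', {
  email: DataTypes.STRING,
  password: DataTypes.STRING
}, {
  hooks: {
    beforeCreate: hashPassword,
    beforeUpdate: hashPassword
  }
});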
I am using Knex with Node.js to create a table and insert some data into it. At first I was creating the table and then inserting the data, but sometimes the table was not created yet when the data was about to be inserted. So I ended up using callbacks like below. Now I'm mixing callbacks and promises and I'm not sure it's a very good thing. What could I do to make the following work without a callback while still making sure the table is created before inserting data?
function executeCallback(next, tableName) {
  knex.schema.hasTable(tableName)
    .then((exists) => {
      if (!exists) {
        debug('not exists');
        // Table creation for mysql
        knex.raw(`CREATE TABLE ${tableName} ( id INT(6) UNSIGNED AUTO_INCREMENT PRIMARY KEY, timestamp BIGINT NOT NULL, deviceId VARCHAR(255) NOT NULL, data JSON )`)
          .then((rows) => {
            debug(rows);
            next('Table created (mysql)');
          })
          .catch((err) => {
            debug(`Error: ${err}`);
            next(`Error: ${err}`);
          });
      } else {
        debug('Table exists');
        next('Table exists');
      }
    });
}
The function is then called like this:
executeCallback((response) => {
  debug('back from callback', response);
  debug('insert');
  knex(req.body.tableName).insert({
    timestamp: req.body.timestamp,
    deviceId: req.body.deviceId,
    data: req.body.data,
  })
    .catch((err) => {
      debug(`Error: ${err}`);
      res.status(500).json({ success: false, message: `Error: ${err}` });
    })
    .then((dataid) => {
      debug(`Inserted with id: ${dataid}`);
      res.status(201).json({ success: true });
    });
}, req.body.tableName);
In general, mixing callbacks and Promises is discouraged. I would suggest looking into the async/await pattern for using Promises, as that is often easier to read in code. It works well with Knex.js too.
One trick with Node callbacks is the convention for the function parameters, where the first parameter is the error and the second is the success result, like this: function (error, results) {...}. This makes the result easy to check:
if (error) {
  // do error stuff
  return
}
// do success stuff with `results`
One could call that function like next(new Error('bad')) for an error, or next(null, 'success object') for a success.
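As a small illustration of the convention applied to this case (the function name ensureTable and the table name 'readings' are just placeholders, not something from the question):

// Hypothetical error-first version of the table check
function ensureTable(tableName, next) {
  knex.schema.hasTable(tableName)
    .then((exists) => {
      if (exists) return next(null, 'Table exists');
      return knex.raw(`CREATE TABLE ${tableName} ( id INT(6) UNSIGNED AUTO_INCREMENT PRIMARY KEY )`)
        .then(() => next(null, 'Table created'));
    })
    .catch((err) => next(err));
}

ensureTable('readings', (error, result) => {
  if (error) {
    // do error stuff
    return;
  }
  // do success stuff with `result`
});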
Your callback next only takes one parameter, and you are not checking its value. Whether the result was 'Table exists', 'Table created' or 'Error' matters for what you do next.
You might try something like this:
async function handleInsert(tableName, req, res) {
  try {
    let hasTable = await knex.schema.hasTable(tableName)
    if (!hasTable) {
      let createResult = await knex.raw(`CREATE TABLE...`)
      // check create results, throw if something went wrong
    }
    // table guaranteed to exist at this point
    let insertResult = await knex(tableName).insert({
      timestamp: req.body.timestamp,
      deviceId: req.body.deviceId,
      data: req.body.data,
    })
    debug(`Inserted with id: ${insertResult}`) // might need insertResult[0]
    res.status(201).json({ success: true })
  } catch (err) {
    // any error thrown comes here
    console.log('Server error: ' + err)
    // bad thing happened, but do not tell the client about your DB
    res.status(500).json({ success: false, message: 'Server error' })
  }
}
One more thing: in general, you can either assume the tables you need already exist, or use a migration to build your DB on server start/update.
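For the migration route, a minimal sketch using the Knex migration API could look like the following (the fixed table name 'readings' replaces the dynamic req.body.tableName and is my own assumption; the columns mirror the raw CREATE TABLE in the question):

// e.g. migrations/20200101000000_create_readings.js (hypothetical filename)
exports.up = function (knex) {
  return knex.schema.createTable('readings', (table) => {
    table.increments('id');
    table.bigInteger('timestamp').notNullable();
    table.string('deviceId', 255).notNullable();
    table.json('data');
  });
};

exports.down = function (knex) {
  return knex.schema.dropTable('readings');
};

Running knex migrate:latest via the Knex CLI before the server starts handling requests then guarantees the table exists.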