Getting an auto-generated (via trigger) field from an insert in Sequelize - MySQL

I have a base controller for generic insert/update operations across the whole API, using only a table dictionary so we can use the same function to insert data into many tables.
The problem is that one of the tables uses a correlative number generated via a trigger, and when Sequelize returns the inserted row it includes the new ID, but the correlative field comes back empty, and I need to show it on the interface.
I've thought of just querying the new row again from the API, or querying it again inside the same save function when one of these particular table names is involved, but is there a way to tell Sequelize to "wait" for this newly generated value and then return the data correctly, just like it returns the new ID?
Or maybe this needs to be fixed on the database side? I don't have much experience in that area, but we are using MySQL if that helps.
function Init(models, dictionary) {
  this.post = (req, res, next) => {
    const { obj } = req.body;
    const model = models[dictionary[obj._type]];
    // Just stripping fields starting with "_"
    const objClear = {};
    for (const attr in obj) {
      if (attr.charAt(0) !== '_') {
        objClear[attr] = obj[attr];
      }
    }
    // Saving
    model.create(objClear).then(
      (objSaved) => {
        const data = {
          obj: objSaved.get({ plain: true }),
          action: 'inserted',
        };
        // I guess I could query the new row here again
        res.json(data);
      },
    ).catch(next);
  };
}

module.exports = {
  Init,
};
The response looks like:
{"obj":{"TOTAL":"0","ID":14,...,"TRANSACTION_NO":""},"action":"inserted"}
Where TRANSACTION_NO is the field generated with a trigger.

AFAIK, you have to query the new row again unless you use Postgres (in which case you might try the Model.create option called options.returning).
Two quick tests that did NOT solve the problem:
- an afterCreate hook - the model still shows fields created by a trigger as null.
- a model attribute with a default value taken from a DB function - the model shows the function call, not the result of the function (which does make it into the DB field).
Hope someone else has a solution!
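For what it's worth, a minimal sketch of the re-query approach, using Sequelize's instance.reload() to SELECT the row again (trigger-populated columns included) before responding; the names are the ones from the question:
// Sketch: re-fetch the row after create so trigger-generated
// columns (e.g. TRANSACTION_NO) are populated before responding
model.create(objClear)
  .then((objSaved) => objSaved.reload()) // re-selects the row by its primary key
  .then((objSaved) => {
    res.json({
      obj: objSaved.get({ plain: true }),
      action: 'inserted',
    });
  })
  .catch(next);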

Related

How to update a value in an object stored in Redis

I'm very new to Redis, but it seems like something my program needs in order to work faster.
I have built my whole database with mongoose/MongoDB Atlas.
But is there a way to update one item in the object I got from the database and set in the cache? I want to update a location in the Redis key I set many times, and only save the last updated location to the actual database.
So far I have some code that gets one object from the database and stores it in Redis, but I want to implement the updating part in this function, as it is used by the PUT request that updates a person's location every second.
const updateLocation = async (req, res) => {
  const { id } = req.params;
  if (!redisClient.isOpen) {
    await redisClient.connect();
    console.log('connected');
  }
  const value = await redisClient.get(`person-${id}`);
  if (value) {
    // Redis stores strings, so parse before sending
    res.json(JSON.parse(value));
    // Here I would like to update the document's location every time
    // this endpoint is called from the frontend
  } else {
    // renamed so we don't shadow the Express `res` object
    const person = await Person.findById(id);
    await redisClient.set(`person-${id}`, JSON.stringify(person));
    console.log('from source data');
    res.status(200).json(person);
  }
};
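No answer was posted for this one, but a minimal sketch of the read-modify-write cache pattern might look like the following; the location shape ({ lat, lng }) and the request-body field names are hypothetical:
// Hypothetical sketch: update the cached person's location on every PUT
const updateLocation = async (req, res) => {
  const { id } = req.params;
  const { lat, lng } = req.body; // assumed request payload

  if (!redisClient.isOpen) {
    await redisClient.connect();
  }

  const key = `person-${id}`;
  const cached = await redisClient.get(key);

  // Fall back to the database on a cache miss
  const person = cached
    ? JSON.parse(cached)
    : (await Person.findById(id)).toObject();

  // Update only the location in the cached copy
  person.location = { lat, lng };
  await redisClient.set(key, JSON.stringify(person));

  // Persisting to MongoDB can happen later (e.g. on an interval or when
  // the session ends), so the DB only ever sees the last location
  res.status(200).json(person);
};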

Sequelize update ignores invalid column names

When I try to update a certain entry in the db, sequelize's Model.update() ignores wrong column names passed to it. Say my table has columns 'id' and 'password'; if I pass an object containing 'id' and 'pwd' to the update function, 'pwd' is simply ignored and 'id' is updated.
Is there a way, through sequelize, to check if an invalid column name is being passed to the update function?
You can do this by adding a custom instance method to your Model via its prototype, which lets you access `this` and pass through to this.update() once your checks pass.
const MyModel = sequelize.define(
  'table_name',
  { ...columns },
  { ...options },
);

// Use a regular function (not an arrow function) so `this`
// is bound to the model instance at call time.
MyModel.prototype.validatedUpdate = async function (updates, options) {
  // get a list of all the column names from the attributes;
  // this will include VIRTUAL, but you could filter first.
  const columnNames = Object.keys(MyModel.attributes);
  // the keys from the updates we are trying to make
  const updateNames = Object.keys(updates);
  // check to see if each of the updates exists in the list of attributes
  updateNames.forEach((updateName) => {
    // throw an Error if we can't find one
    if (!columnNames.some((columnName) => columnName === updateName)) {
      throw new Error(`The field ${updateName} does not exist.`);
    }
  });
  // pass it along to the normal update() chain
  return this.update(updates, options);
};

module.exports = MyModel;
Then use it like this:
try {
  const myInstance = await MyModel.findById(someId);
  await myInstance.validatedUpdate({ fake: 'field' });
} catch (err) {
  console.log(err); // field does not exist
}

Using KnexJS to query X number of tables?

I have a unique situation here which I am having trouble solving in an elegant fashion.
A user passes up an array of signals which they want to export data for. This array can contain anywhere from one signal to any number of them, so first I go fetch the table names (each signal stores data in a separate table) based on the signals passed, and store those in an object.
The next step is to iterate over that object (which contains the table names I need to query), execute the query per table, and store the results in an object which will be passed to the next link in the promise chain. I haven't seen any examples online of good ways to handle this, but I know it's a fairly unique scenario.
My code prior to attempting to add support for arrays of signals was simply the following:
exports.getRawDataForExport = function(data) {
  return new Promise(function(resolve, reject) {
    var getTableName = function() {
      return knex('monitored_parameter')
        .where('device_id', data.device_id)
        .andWhere('internal_name', data.param)
        .first()
        .then(function(row) {
          if (row) {
            var resp = { "table": 'monitored_parameter_data_' + row.id, "param": row.display_name };
            return resp;
          }
        });
    };

    var getData = function(runningResult) {
      return knexHistory(runningResult.table)
        .select('data_value as value', 'unit', 'created')
        .then(function(rows) {
          runningResult.data = rows;
          return runningResult;
        });
    };

    var createFile = function(runningResult) {
      var fields = ['value', 'unit', 'created'],
        csvFileName = filePathExport + runningResult.param + '_export.csv',
        zipFileName = filePathExport + runningResult.param + '_export.gz';
      var csv = json2csv({ data: runningResult.data, fields: fields, doubleQuotes: '' });
      fs.writeFileSync(csvFileName, csv);
      // create streams for gZipping
      var input = fs.createReadStream(csvFileName);
      var output = fs.createWriteStream(zipFileName);
      // gZip
      input.pipe(gzip).pipe(output);
      return zipFileName;
    };

    getTableName()
      .then(getData)
      .then(createFile)
      .then(function(zipFile) {
        resolve(zipFile);
      });
  });
};
Obviously that works fine for a single table, and I have already updated the getTableName() and createFile() methods to handle arrays of data, so this question only pertains to the getData() method.
Cheers!
This kind of problem is far from unique and, approached the right way, is very simply solved.
Don't rewrite any of the three internal functions.
Just purge the explicit promise construction antipattern from .getRawDataForExport() so that it returns a naturally occurring promise and propagates asynchronous errors to the caller:
return getTableName()
.then(getData)
.then(createFile);
Now, .getRawDataForExport() is the basic building-block for your multiple "gets".
Then comes a design choice: parallel versus sequential operations. Both are very well documented.
Parallel:
exports.getMultiple = function(arrayOfSignals) {
  return Promise.all(arrayOfSignals.map(getRawDataForExport));
};
Sequential:
exports.getMultiple = function(arrayOfSignals) {
  return arrayOfSignals.reduce(function(promise, signal) {
    return promise.then(function() {
      return getRawDataForExport(signal);
    });
  }, Promise.resolve());
};
In the first instance, for best potential performance, try parallel.
If the server chokes, or is likely ever to choke, on parallel operations, choose sequential.

Finding out how many status objects are in a store

I have a store called CreativeStore, and one of its fields is Status. The data is sent as JSON. I created a variable that gets the Creative store. How would I find out what the status is, and how many statuses there are?
In my Creative Model I have a field
}, {
  type: 'int',
  name: 'Status'
}, {
In my View Controller I have a method that checks if the store I created for the Creative Model exists (it does), and I assign it to a var called test.
var test = this.getCreativeStore();
getCreativeStore: function () {
  var creativeStore = this.getStore('creativeStore');
  if (!creativeStore) {
    this.logError('creativeStore is undefined');
  }
  return creativeStore;
}
How do I find out how many Statuses are in the variable test?
You can use collect:
Collects unique values for a particular dataIndex from this store.
For example:
test.collect('Status').length;
var statusCount = 0;
test.each(function(record) {
  // Your status value in myStatus
  if (record.get('Status') === myStatus) {
    ++statusCount;
  }
});
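And if what you actually need is a tally per distinct status, a small sketch combining the two ideas (plain Ext store iteration, nothing beyond each()):
// Sketch: count how many records carry each distinct Status value
var counts = {};
test.each(function (record) {
  var status = record.get('Status');
  counts[status] = (counts[status] || 0) + 1;
});
// counts now looks like e.g. { 0: 3, 1: 7, 2: 1 }, and
// Object.keys(counts).length is the number of distinct statuses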

Convert Mongoose docs to json

I returned mongoose docs as json in this way:
UserModel.find({}, function (err, users) {
  return res.end(JSON.stringify(users));
});
However, user.__proto__ was also returned. How can I return the docs without it? I tried this, but it did not work:
UserModel.find({}, function (err, users) {
  return res.end(users.toJSON()); // has no method 'toJSON'
});
You may also try mongoosejs's lean() :
UserModel.find().lean().exec(function (err, users) {
  return res.end(JSON.stringify(users));
});
Late answer but you can also try this when defining your schema.
/**
 * toJSON implementation
 */
schema.options.toJSON = {
  transform: function(doc, ret, options) {
    ret.id = ret._id;
    delete ret._id;
    delete ret.__v;
    return ret;
  }
};
Note that ret is the JSON'ed object, not an instance of the mongoose model; you'll be operating directly on a plain object hash, without getters/setters.
And then:
Model
  .findById(modelId)
  .exec(function (dbErr, modelDoc) {
    if (dbErr) return handleErr(dbErr);
    return res.send(modelDoc.toJSON(), 200);
  });
Edit: Feb 2015
Because I didn't provide a solution to the missing toJSON (or toObject) method(s), I will explain the difference between my usage example and the OP's usage example.
OP:
UserModel
  .find({}) // will get all users
  .exec(function(err, users) {
    // supposing that we don't have an error
    // and we had users in our collection,
    // the users variable here is an array
    // of mongoose instances;

    // wrong usage (from OP's example)
    // return res.end(users.toJSON()); // has no method toJSON

    // correct usage:
    // to apply the toJSON transformation on instances, you have to
    // iterate through the users array
    var transformedUsers = users.map(function(user) {
      return user.toJSON();
    });

    // finish the request (stringified, since res.end expects a string or buffer)
    res.end(JSON.stringify(transformedUsers));
  });
My Example:
UserModel
  .findById(someId) // will get a single user
  .exec(function(err, user) {
    // handle the error, if any
    if (err) return handleError(err);
    if (null !== user) {
      // user might be null if no user matched
      // the given id (someId)

      // the toJSON method is available here,
      // since the user variable here is a
      // mongoose model instance
      return res.end(JSON.stringify(user.toJSON()));
    }
  });
First of all, try toObject() instead of toJSON() maybe?
Secondly, you'll need to call it on the actual documents and not the array, so maybe try something more annoying like this:
var flatUsers = users.map(function(user) {
  return user.toObject();
});
return res.end(JSON.stringify(flatUsers));
It's a guess, but I hope it helps
model.find({ Branch: branch }, function (err, docs) {
  if (err) return res.send(err); // return so we don't send twice
  res.send(JSON.parse(JSON.stringify(docs)));
});
I found out I made a mistake. There's no need to call toObject() or toJSON() at all. The __proto__ in the question came from jQuery, not mongoose. Here's my test:
UserModel.find({}, function (err, users) {
  console.log(users[0].save); // { [Function] numAsyncPres: 0 }
  var json = JSON.stringify(users);
  users = users.map(function (user) {
    return user.toObject();
  });
  console.log(users[0].save); // undefined
  console.log(json == JSON.stringify(users)); // true
});
doc.toObject() removes the prototype from a doc, but it makes no difference to JSON.stringify(doc), and it's not needed in this case.
Maybe a bit astray from the answer, but if anyone is looking to do it the other way around, you can use Model.hydrate() (since mongoose v4) to convert a plain JavaScript object (JSON) into a mongoose document.
A useful case is when using Model.aggregate(...): because it actually returns plain JS objects, you may want to convert the result into mongoose documents in order to get access to model methods (e.g. a virtual property defined in the schema).
PS. I thought there would already be a thread like "Convert json to Mongoose docs", but there actually isn't, and since I've found out the answer, I figured it wasn't good to do a self-post-and-self-answer.
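To illustrate, a minimal sketch of that round trip; the query field and handleError are hypothetical placeholders:
// Sketch: aggregate() returns plain objects; hydrate() turns one back
// into a full mongoose document so schema methods/virtuals work again
UserModel.aggregate([{ $match: { active: true } }]).exec(function (err, results) {
  if (err) return handleError(err); // hypothetical error handler
  var doc = UserModel.hydrate(results[0]);
  // doc is a mongoose document again: doc.toJSON(), doc.save(),
  // and any schema virtuals are now available
});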
You can use res.json() to jsonify any object.
lean() makes the query return plain JavaScript objects instead of full Mongoose documents, stripping the mongoose internals from the result.
UserModel.find().lean().exec(function (err, users) {
  return res.json(users);
});
It worked for me:
Products.find({}).then(a => console.log(a.map(p => p.toJSON())))
Also, if you want getters to apply, you should add that option as well (when defining the schema):
new mongoose.Schema({...}, {toJSON: {getters: true}})
Try these options:
UserModel.find({}, function (err, users) {
  // I ran into errors using res.end(), so I changed to res.send()
  return res.send(JSON.parse(JSON.stringify(users)));
  // Or
  // return JSON.parse(JSON.stringify(users));
});
I was kinda laughing at how cumbersome this was for a second, given that it must be extremely common.
I did not bother digging into the docs and hacked this together instead.
const data = await this.model.logs.find({ "case_id": { $regex: /./, $options: 'i' } });
let res = data.map(e => e._doc);
res.forEach(element => {
  // delete unwanted data
  delete element._id;
  delete element.__v;
});
return res;
First I get all docs which have any value at all for the case_id field (effectively all docs in the collection).
Then I get the actual data out of each mongoose document via Array.map.
Finally I remove unwanted props on each object by mutating it directly.
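For what it's worth, the same result can usually be had straight from the query with standard Mongoose options: select() with minus-prefixed paths excludes fields, and lean() skips the document wrapping entirely.
// Equivalent sketch using query options instead of manual cleanup
const data = await this.model.logs
  .find({ "case_id": { $regex: /./, $options: 'i' } })
  .select('-_id -__v') // exclude the unwanted fields
  .lean();             // plain objects, not mongoose documents
return data;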