ExpressJS / MySQL / Prisma - Save DB entity changes

I'm looking for a way to save database entity changes for certain entities. That is, I need to record in a database table every change made to some tables (create, update, delete), with the ability to track which user made the change.
I'm working on NextJS with a custom ExpressJS server and a MySQL database, where I use Prisma as the ORM. I think it might be possible to write an ExpressJS middleware for this, but I have no idea yet how to do it, and I'm wondering whether a library for this already exists.
I usually work with PHP Symfony and handle this with StofDoctrineExtensionsBundle, which is great and works as expected. But my current project is a TypeScript-only project with Express/NextJS/React/Prisma/MySQL.
Any feedback from your experience would be much appreciated.
Thanks in advance.
Regards,
Gulivert
EDIT: My current API, which has to be moved to Express/NextJS, is still running on Symfony, and the table where all changes are logged looks like this:
{
"id": 59807,
"user": "ccba6ad2-0ae8-11ec-813f-0242c0a84005",
"patient": "84c3ef66-548a-11ea-8425-0242ac140002",
"action": "update",
"logged_at": "2021-11-02 17:55:09",
"object_id": "84c3ef66-548a-11ea-8425-0242ac140002",
"object_class": "App\\Entity\\Patient",
"version": 5,
"data": "a:2:{s:10:\"birth_name\";s:2:\"--\";s:10:\"profession\";s:2:\"--\";}",
"username": "johndoe",
"object_name": "patient",
"description": null
}
Explanation of the database columns (a rough TypeScript sketch of this row follows the list):
user => relation to the user table
patient => relation to the patient table
action => can be "create" / "update" / "delete"
logged_at => date and time when the change was made
object_id => row ID of the entity that was changed
object_class => the entity class that was updated
version => how many times the object has been changed
data => all the data changed during the modification
username => the username of the logged-in user who made the change
object_name => a string identifying the modified object without using the namespace of object_class
description => a value that can be set on some specific changes, usually on delete, to keep a trace of what was deleted
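For reference, this is roughly how one row of that changelog could be typed on the TypeScript side. This is only a sketch: the ChangeLogEntry and ChangeAction names and the exact types are assumptions derived from the column list above, not part of any existing schema.
// Hypothetical shape of one changelog row; names and types are assumptions
// derived from the column descriptions above.
type ChangeAction = 'create' | 'update' | 'delete';

interface ChangeLogEntry {
  id: number;
  user: string;              // FK to the user table (UUID)
  patient: string;           // FK to the patient table (UUID)
  action: ChangeAction;
  loggedAt: Date;            // when the change was made
  objectId: string;          // row ID of the changed entity
  objectClass: string;       // entity class that was updated
  version: number;           // how many times the object has been changed
  data: string;              // serialized diff of the changed fields
  username: string;          // username of the logged-in user
  objectName: string;        // short identifier without the namespace
  description: string | null;
}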

You might find Prisma middleware useful for this.
Check out the example with session data middleware, which is somewhat similar to what you're doing.
For your use case, the middleware might look something like this:
import { PrismaClient } from '@prisma/client'

const prisma = new PrismaClient()
const contextLanguage = 'en-us' // session state (carried over from the docs example)

prisma.$use(async (params, next) => {
  // You can find all possible params.action values in the `PrismaAction` type
  // in `.prisma/client/index.d.ts`.
  if (
    params.model === '_modelWhereChangeIsTracked_' &&
    (params.action === 'create' || params.action === 'update')
  ) {
    // Business logic to create an entry in the change-logging table,
    // using the session data of the current user.
  }
  return next(params)
})
// this will trigger the middleware
const create = await prisma._modelWhereChangeIsTracked_.create({
  data: {
    foo: "bar"
  },
})
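As a sketch of what that logging business logic could look like: the following assumes a hypothetical changeLog model in your Prisma schema, an example Patient model, and a placeholder getCurrentUser() helper for resolving the authenticated user. None of these are part of the original answer.
// Placeholder for however your app resolves the authenticated user
// (e.g. AsyncLocalStorage or a request-scoped context). Assumption only.
declare function getCurrentUser(): Promise<{ username: string } | null>

prisma.$use(async (params, next) => {
  const result = await next(params) // run the actual query first

  // 'Patient' and 'changeLog' are example/hypothetical model names.
  if (
    params.model === 'Patient' &&
    (params.action === 'create' || params.action === 'update' || params.action === 'delete')
  ) {
    const user = await getCurrentUser()
    await prisma.changeLog.create({
      data: {
        action: params.action,
        objectName: 'patient',
        objectId: result?.id ?? null,
        data: JSON.stringify(params.args?.data ?? {}),
        username: user?.username ?? 'unknown',
        loggedAt: new Date(),
      },
    })
  }

  return result
})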
However, do note that there are some performance considerations when using Prisma middleware.
You can also create Express middleware for the routes where you anticipate changes that need to be logged in the change table. Personally, I would prefer this approach in most cases, especially if the number of API routes where changes need to be logged is known in advance and limited.
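A rough sketch of that Express-level approach is below. The auditLog helper, the changeLog Prisma model, and req.user being populated by your auth layer are all assumptions about your setup, not an established API.
import express, { Request, Response, NextFunction } from 'express'
import { PrismaClient } from '@prisma/client'

const app = express()
const prisma = new PrismaClient()

app.use(express.json())

// Hypothetical audit middleware attached only to the routes you care about.
function auditLog(objectName: string, action: string) {
  return async (req: Request, _res: Response, next: NextFunction) => {
    try {
      await prisma.changeLog.create({
        data: {
          action,
          objectName,
          objectId: req.params.id ?? null,
          data: JSON.stringify(req.body),
          username: (req as any).user?.username ?? 'anonymous', // assumes auth middleware sets req.user
          loggedAt: new Date(),
        },
      })
    } catch (err) {
      console.error('audit logging failed', err)
    }
    next()
  }
}

// Example: log updates on the patient route only.
app.put('/api/patients/:id', auditLog('patient', 'update'), async (req, res) => {
  // ...perform the actual update with Prisma here...
  res.sendStatus(204)
})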

Related

Race conditions in Laravel when using split "read" and "write" database connections

I have a Laravel application which uses a lot of AJAX POST and GET requests (Single Page Application). Once an item is saved via POST, a GET request is sent to reload parts of the page and get any new data.
After enabling split read and write database connections using the Laravel connection configuration, the application runs incredibly quickly (I never thought this would be a problem!). It saves and then requests so quickly that the RO database (reporting just 22ms behind) doesn't get a chance to update, and I end up with stale information.
I have enabled the sticky parameter in the database configuration, which I thought would mitigate the problem, but the POST and GET requests are separate, so the stickiness gets lost.
I could rewrite a large portion of the application so that POST requests respond with the correct data, but this doesn't work for reloading many components at once and is an enormous job, so I see this as a last resort.
Another idea I had was to modify the getReadPdo(){...} method and the $recordsModified value inside the database Connection class so that the stickiness is saved in the user's session for up to 1 second. I was unsure whether this would cause further issues with speed or excessive session loading and end up creating more problems.
Has anyone experienced this before or have any ideas on how to tackle the problem?
Thanks in advance.
Thought I'd update and answer this in case anyone else came across the same issue.
This isn't a perfect solution but has worked well over the last week or so.
Inside the AppServiceProvider boot() method, I added the following:
DB::listen(function ($query) {
    if (strpos($query->sql, 'select') !== FALSE) {
        if (time() < session('force_pdo_write_until')) {
            DB::connection()->recordsHaveBeenModified(true);
        }
    } else {
        session(['force_pdo_write_until' => time() + 1]);
    }
});
In a nutshell, this listens to every DB query. If the current query is a SELECT (DB read), we check whether the "force_pdo_write_until" key in the user's session holds a timestamp later than the current time. If it does, we trick the current DB connection into using the write PDO by calling the recordsHaveBeenModified() method - this is how Laravel's core sticky-session behaviour is normally triggered.
If the current query is not a SELECT (most likely a DB write), we set the "force_pdo_write_until" session variable to 1 second in the future.
Any time a POST request is sent, if the next GET request is within 1 second of the previous query, we can be sure that the current user will be using the RW DB connection and get the correct results.
Update (09/12/19):
It turns out the solution above doesn't actually modify the DB connection at all; it was just adding a few milliseconds of processing time to each request, so it looked like it was working about 75% of the time (because the DB replica lag fluctuates depending on load).
In the end I decided to go a bit deeper, override the DB connection class directly and modify the relevant functions. My Laravel instance uses MySQL, so I overrode the Illuminate\Database\MySqlConnection class. This new class is registered through a new service provider, which in turn is loaded through the config.
I've copied the config and files I used below to make it easier for any new developers to understand. If you're copying these directly, make sure you also add the 'sticky_by_session' flag to your connection config.
config/database.php
'connections' => [
    'mysql' => [
        'sticky' => true,
        'sticky_by_session' => true,
        ...
    ],
],
config/app.php
'providers' => [
    App\Providers\DatabaseServiceProvider::class,
    ...
],
app/Providers/DatabaseServiceProvider.php
<?php

namespace App\Providers;

use App\Database\MySqlConnection;
use Illuminate\Database\Connection;
use Illuminate\Support\ServiceProvider;

class DatabaseServiceProvider extends ServiceProvider
{
    /**
     * Register the service provider.
     *
     * @return void
     */
    public function register()
    {
        if (config('database.connections.mysql.sticky_by_session')) {
            Connection::resolverFor('mysql', function ($connection, $database, $prefix, $config) {
                return new MySqlConnection($connection, $database, $prefix, $config);
            });
        }
    }
}
app/Database/MySqlConnection.php
<?php

namespace App\Database;

use Illuminate\Database\MySqlConnection as BaseMysqlConnection;

class MySqlConnection extends BaseMysqlConnection
{
    public function recordsHaveBeenModified($value = true)
    {
        session(['force_pdo_write_until' => time() + 1]);

        parent::recordsHaveBeenModified($value);
    }

    public function select($query, $bindings = [], $useReadPdo = true)
    {
        if (time() < session('force_pdo_write_until')) {
            return parent::select($query, $bindings, false);
        }

        return parent::select($query, $bindings, $useReadPdo);
    }
}
Inside recordsHaveBeenModified(), we just add a session variable for later use. This method is used by Laravel's normal sticky-session detection, as mentioned previously.
Inside select(), we check whether the session variable was set less than a second ago. If so, we manually force the query to use the RW connection; otherwise we just continue as normal.
Now that we're directly modifying the request, I haven't seen any RO race conditions or effects from the replica lag.
I've published this as a package!
mpyw/laravel-cached-database-stickiness: Guarantee database stickiness over the same user's consecutive requests
Installing
composer require mpyw/laravel-cached-database-stickiness
The default implementation is provided by ConnectionServiceProvider; however, package discovery is not available.
Be careful: you MUST register it in config/app.php yourself.
<?php

return [

    /* ... */

    'providers' => [
        /* ... */
        Mpyw\LaravelCachedDatabaseStickiness\ConnectionServiceProvider::class,
        /* ... */
    ],

    /* ... */

];
That's all! All problems will be solved.

Email verification from database using Express.js, Node.js and Angular 6 with a MySQL database

I am creating a user with an 'email' field, so I want to verify whether that email already exists; if it does, an error must be displayed. My code uses Express.js, Node.js, Angular 6 and a MySQL database, and below is the code to create a new user:
exports.create = (req, res) => {
  // Save to MySQL database
  let customer = req.body;
  Customer.create(customer).then(result => {
    // Send created customer to client
    res.json(result);
  });
};
Where should I use the if statement in the above code?
Thanks in advance.
I'm thinking the simplest way of solving your problem is making the email column in the database unique. If you try to insert a new user with an already existing email, the query will fail.
Another solution would be to first run a query that checks whether an existing user already has that email (from req.body.email). But that would require two separate SQL queries, which I personally would not prefer.
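If you go with the unique column approach, a rough sketch of handling the failed insert might look like this (it assumes Customer is a Sequelize model and that the email column has a unique index; the exact responses are just placeholders):
exports.create = (req, res) => {
  let customer = req.body;
  Customer.create(customer)
    .then(result => {
      // Send created customer to client
      res.json(result);
    })
    .catch(err => {
      // Sequelize raises SequelizeUniqueConstraintError when a unique index
      // (here: the email column) is violated.
      if (err.name === 'SequelizeUniqueConstraintError') {
        res.status(400).send(`${req.body.email} already exists.`);
      } else {
        res.status(500).send('Something went wrong.');
      }
    });
};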
I think you are using the Sequelize ORM.
You can do it like this:
Customer.findOrCreate({
  where: {
    email: req.body.email,
  },
  // other data that needs to be inserted
  defaults: {
    name: req.body.name,
    username: req.body.username,
  },
}).spread((data, created) => {
  if (created) {
    // your logic
  } else {
    res.status(400).send(`${req.body.email} already exists.`);
  }
});
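One small note in case you are on a newer Sequelize version: .spread() comes from the Bluebird promise library that older Sequelize releases used; with native promises you destructure the resolved array instead, with the same logic otherwise (sketch below):
// Inside an async handler, e.g. exports.create = async (req, res) => { ... }
const [data, created] = await Customer.findOrCreate({
  where: { email: req.body.email },
  defaults: { name: req.body.name, username: req.body.username },
});

if (!created) {
  res.status(400).send(`${req.body.email} already exists.`);
}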

Hooks not triggering when inserting raw queries via sequelize.query()

I have the following Employee model for a MySQL database:
var bcrypt = require('bcrypt');

module.exports = (sequelize, DataTypes) => {
  const Employee = sequelize.define(
    "Employee",
    {
      username: DataTypes.STRING,
      password: DataTypes.STRING,
    }, {}
  );
  return Employee;
};
Seeding the database is done by reading a .sql file containing 10,000+ employees via raw queries:
sequelize.query(mySeedingSqlFileHere);
The problem is that the passwords in the SQL file are plain text, and I'd like to use bcrypt to hash them before inserting them into the database. I've never done bulk inserts before, so I was looking into the Sequelize docs for adding a hook to the Employee model, like so:
hooks: {
  beforeBulkCreate: async (employees, options) => {
    for (const employee of employees) {
      if (employee.password) {
        employee.password = await bcrypt.hash(employee.password, 10);
      }
    }
  }
}
This isn't working, as I'm still getting the plain-text values after reseeding - should I be looking at another way? I was also looking at "sequelize capitalize name before saving in database - instance hook".
Your hooks won't be called unless you use the model's functions for DB operations, so if you are running a raw query, the hooks will never be fired.
Reason: you can write anything inside your raw query - select/insert/update/delete, anything - so how would Sequelize know that it has to fire the hooks? This is only possible when you use methods like:
Model.create();
Model.bulkCreate();
Model.update();
Model.destroy();
And as per the docs, raw queries don't have a hooks option to add. For model queries, you can check that there is an option to enable/disable hooks.
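So for the seeding use case, one option is to parse the employee rows into JavaScript objects and insert them with Employee.bulkCreate(), which does fire beforeBulkCreate (and, with individualHooks: true, the per-row hooks too). A minimal sketch, assuming Employee is the model defined above - the employees argument and the seedEmployees function are placeholders for however you read your seed data, not existing code:
const bcrypt = require('bcrypt');

async function seedEmployees(employees) {
  // `employees` is assumed to be an array like [{ username, password }, ...]
  // parsed from your seed file; hash the plain-text passwords first.
  const rows = await Promise.all(
    employees.map(async (e) => ({
      ...e,
      password: await bcrypt.hash(e.password, 10),
    }))
  );

  // bulkCreate fires the beforeBulkCreate hook; pass { individualHooks: true }
  // if you also want per-row beforeCreate hooks to run.
  return Employee.bulkCreate(rows);
}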

How do you insert / find rows related by foreign keys from different tables using Sequelize?

I think I've done enough research on this subject and I've only got a headache.
Here is what I have done and understood: I have restructured my MySQL database so that I keep my users' data in different tables, and I am using foreign keys. So far I have only concluded that foreign keys are used for consistency and control, and they do not automate anything else (for example, to insert data about the same user in two tables I need to use two separate insert statements, and the foreign key will not make this any different or automatic in some way).
Fine. Here is what I want to do: I want to use Sequelize to insert, update and retrieve data from all the related tables at once, and I have absolutely no idea how to do that. For example, if a user registers, I want to be able to insert the data into table "A" containing some user information and, in the same task, insert into table "B" some other data (like the user's settings in the dedicated table or whatever). Same with retrievals: I want to be able to get an object (or array) with all the related data from different tables fitting the criteria I want to search by.
The Sequelize documentation covers things in a way where everything depends on the previous thing, and Sequelize is pretty bloated with a lot of stuff I do not need. I do not want to use .sync(). I do not want to use migrations. I already have my database structure created and I want Sequelize to attach to it.
Is it possible to insert and retrieve several related rows at the same time, getting/using a single Sequelize command/object? How?
Again, by "related data" I mean data "linked" by sharing the same foreign key.
Is it possible to insert and retrieve several related rows at the same time, getting/using a single Sequelize command/object? How?
Yes. What you need is eager loading.
Look at the following example
const User = sequelize.define('user', {
  username: Sequelize.STRING,
});

const Address = sequelize.define('add', {
  address: Sequelize.STRING,
});

const Designation = sequelize.define('designation', {
  designation: Sequelize.STRING,
});

User.hasOne(Address);
User.hasMany(Designation);

sequelize.sync({ force: true })
  .then(() => User.create({
    username: 'test123',
    add: {
      address: 'this is dummy address'
    },
    designations: [
      { designation: 'designation1' },
      { designation: 'designation2' },
    ],
  }, { include: [Address, Designation] }))
  .then(user => {
    User.findAll({
      include: [Address, Designation],
    }).then((result) => {
      console.log(result);
    });
  });
In the console.log output, you will get all the data, together with all the associated models you included in the query.
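Since the question mentions not wanting to use .sync(): in the example above, sync() only creates the tables. If your tables already exist and your models and associations match them, you can skip it and call the same create/findAll with include directly, for instance like this (same models as above; this is only a sketch):
// Assumes the tables already exist and match the models/associations above.
async function createUserWithRelations() {
  await User.create({
    username: 'test123',
    add: { address: 'this is dummy address' },
    designations: [{ designation: 'designation1' }],
  }, { include: [Address, Designation] });

  // Eagerly load users together with their related rows.
  return User.findAll({ include: [Address, Designation] });
}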

Why Are My CakePHP AROs Not Being Created?

I followed the CakePHP Cookbook's simple ACL application tutorial, and for a while all was fine and dandy. When I created a user, my AROs were automagically created too, and without too much effort I was able to give everyone permissions for the correct actions.
My application has become more complex now though. When I create a "Realtor", I create a user for them in the Realtor model's afterSave function, like so:
App::import('Component', 'Auth');
$this->Auth = new AuthComponent();

$this->User->create();
$this->User->set(array(
    'username' => $this->data['Realtor']['email'],
    'password' => $this->Auth->password($this->data['Realtor']['password']),
    'usergroup_id' => 2,
    'realtor_num' => $this->id
));

if ($this->User->save()) {
    $this->save(array('user_id' => $this->User->id));
} else {
    //error
}
Unfortunately, while this is successfully creating users, and the data all seems to match up with my expectations, I'm seemingly no longer getting AROs.
My Usergroup model contains the line
var $actsAs = array('Acl' => array('type' => 'requester'));
Beyond that, I have no idea how I would persuade my application to generate an ARO.
Is there anything I could have forgotten that would help me get my ACL back on track?
EDIT:
I had this in the User model's afterSave, which seems to have been causing various kinds of trouble:
function afterSave($created) {
    if (!$created) {
        $parent = $this->parentNode();
        $parent = $this->node($parent);
        $node = $this->node();
        $aro = $node[0];
        $aro['Aro']['parent_id'] = $parent[0]['Aro']['id'];
        $this->Aro->save($aro);
    }
}
(courtesy of this article: http://mark-story.com/posts/view/auth-and-acl-automatically-updating-user-aros) I don't know if that would have been fouling up my ARO creation somehow... probably teach me to add in random code snippets without fully understanding what they're doing, at the very least!
OK, I'm a newbie and I can't fully understand your problem, but I now use the Acl component with the Alaxos ACL plugin. I tried to understand the modified tree traversal algorithm, set up the basic data for the aro, aco, aro_aco and group tables, and met some requirements of the plugin. I suggest you use it.