Querying with Associations in Sequelize - MySQL

I am using Sequelize in my Node.js project and I couldn't make much out of the official docs. Still, I managed to create tables, add rows, etc. The problem is that I need to query for objects with some special conditions on their associations.
I have a User model, a Vote model and a Position model. A user can vote for another user in a position. This is the code that sets up the relations:
Position.hasMany(User,{as: 'Candidate'});
Votes.belongsToMany(User,{through: 'UserVote'});
User.belongsToMany(Votes,{through: 'UserVote'});
The problem is that I haven't found a way to query for the Positions that one known user hasn't voted for yet. How do I do that?

I am not 100% sure how to do that with associations, but I will try to show you how I would do it using a regular query. Excuse the syntax if it isn't exactly perfect; this is from memory. I'm assuming you have a user identifier available (I'll call it userId).
Edit: I also noticed you are using MySQL. This works in Postgres but you might need to tweak the syntax for it to work for you. I don't know MySQL specifics but this should be close enough for you to get the idea!
Position.findAll({
  where: [
    `NOT EXISTS(
      SELECT 1
      FROM "votes"
      WHERE "votes"."position_id" = "position"."id" AND
            "votes"."user_id" = ${userId}
    )`
  ]
}).then(...);
I have not found a better way to query existence yet myself. This pattern is usually my go-to for these kinds of queries. I made a few assumptions about your actual database model.
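For reference, in plain SQL (MySQL-flavoured, with assumed table and column names: positions, votes, position_id, user_id) the same idea looks roughly like this:
-- Assumed schema: positions(id), votes(position_id, user_id)
SELECT p.*
FROM positions AS p
WHERE NOT EXISTS (
    SELECT 1
    FROM votes AS v
    WHERE v.position_id = p.id
      AND v.user_id = ?  -- the known user's id, ideally a bind parameter rather than string interpolation
);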
Also good to note that this is for Sequelize 3.x.x but I think it should more or less work on 4.x.x.
Good luck! :)

Related

Are SQL views the proper solution?

I have a table named 'Customers' which holds all the information about users of different types (user, driver, admin), and I cannot separate this table right now because it's live in production and this is not a proper time to do it.
So say I make 3 views: the first has the user type only, the second has drivers and the third has admins.
My goal is to use 3 models instead of one in the project I'm working on.
Is this a good solution, and what does it cost in terms of performance?
How big is your 'Customers' table? Judging by the name, it doesn't sound like a heavy one.
How often will these views be queried?
Do you have any indexes or PK constraints on the attribute you're going to use in the WHERE clause for the views?
I cannot separate this table right now because it's working on production and this is not a proper time to do this.
From what you said, it sounds like a temporary solution, so it is probably a good one. Later you can replace the views with three tables and it will not affect the interface.
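A minimal sketch of the three views might look like this (assuming the discriminating column is called type and the literal values match yours; the view names are placeholders):
CREATE VIEW customers_users   AS SELECT * FROM Customers WHERE type = 'user';
CREATE VIEW customers_drivers AS SELECT * FROM Customers WHERE type = 'driver';
CREATE VIEW customers_admins  AS SELECT * FROM Customers WHERE type = 'admin';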
I suggest that it is improper to give end users logins directly to the database. Instead, all requests should go through a database-access layer (API) that users must log into. This layer can provide the filtering you require with (perhaps) no impact on the user at all. While constructing the needed SELECT, the layer would tack on, for example, AND type = 'admin' to achieve the goal.
For performance, you might also need to have type at the beginning of some of the INDEXes.
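For example, if queries against the views also routinely filter or sort on another column (created_at here is purely a placeholder), a composite index with type first covers both the view predicate and the secondary condition:
ALTER TABLE Customers
    ADD INDEX idx_type_created (type, created_at);  -- created_at is hypothetical, use whatever you actually filter on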

How to implement custom fields in database

I need to implement custom fields in my database so every user can add any fields he wants to his forms/entities.
The user should be able to filter and/or sort his data by any custom field.
I want to work with MySQL because the rest of my data is very well suited to SQL. So, unless you have a great idea, SQL will be preferred over NoSQL.
We thought about a few solutions:
JSON field - Great for a dynamic schema. Can be filtered and sorted. The problem is that it is slower than regular columns.
Dynamic indexes could solve that, but it seems too risky to add indexes dynamically.
Key-value table - A simple solution but a really slow one. You can't index it properly and the queries are awful (see the sketch after this list).
Static placeholder columns - Create N columns and hold a map of each field to its placeholder. - A good solution in terms of performance, but it makes the DB hard to read and the number of columns is limited.
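To illustrate the key-value point, here is roughly what such a table and a two-field filter look like (all names are made up):
-- Generic key-value (EAV) table; every value ends up stored as a string
CREATE TABLE entity_custom_fields (
    entity_id   INT NOT NULL,
    field_name  VARCHAR(64) NOT NULL,
    field_value VARCHAR(255),
    PRIMARY KEY (entity_id, field_name)
);

-- Filtering on just two custom fields already needs one join per field,
-- and the comparisons lose their native types
SELECT e.*
FROM entities AS e
JOIN entity_custom_fields AS f1
    ON f1.entity_id = e.id AND f1.field_name = 'color'  AND f1.field_value = 'red'
JOIN entity_custom_fields AS f2
    ON f2.entity_id = e.id AND f2.field_name = 'weight' AND f2.field_value = '10';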
Any thoughts how to improve any of the solutions or any idea for a new solution?
As many of the commenters have remarked, there is no easy answer to this question. Depending on which trade-offs you're willing to make, I think the JSON solution is neatest - it's "native" to MySQL, so easiest to explain and understand.
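As a rough sketch of that approach (MySQL 5.7+; the entities table and the priority field are hypothetical), a generated column lets you index a frequently filtered key after the fact:
-- Store the per-user custom fields as JSON
ALTER TABLE entities ADD COLUMN custom_fields JSON;

-- Promote one hot key to a stored generated column and index it
ALTER TABLE entities
    ADD COLUMN cf_priority INT
        GENERATED ALWAYS AS (CAST(custom_fields->>'$.priority' AS SIGNED)) STORED,
    ADD INDEX idx_cf_priority (cf_priority);

-- Filtering and sorting on that field can now use the index
SELECT * FROM entities WHERE cf_priority = 3 ORDER BY cf_priority DESC;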
However, given that you write that the columns are specified only at set up time, by technically proficient people, you could, of course, have the set-up process include an "alter table" statement to add new columns. Your database access code and all the associated view logic would then need to be configurable too; it's definitely non-trivial.
However...it's a proven solution. Magento and Drupal, for instance, have admin screens for adding attributes to the business entities, which in turn adds columns to the relational database.
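Under that approach the set-up step boils down to something like this (again with made-up names), plus whatever index the new field needs:
ALTER TABLE entities ADD COLUMN favourite_color VARCHAR(64) NULL;
CREATE INDEX idx_entities_favourite_color ON entities (favourite_color);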

Complex filtering in a Rails app. Not sure complex SQL is the answer?

I have an application that allows users to filter applicants based on a very large set of criteria. The criteria are each represented by boolean columns spanning multiple tables in the database. Instead of using ActiveRecord models I thought it was best to use pure SQL and put the bulk of the work in the database. In order to do this I have to construct a rather complex SQL query based on the criteria the users selected and then run it through AR on the db. Is there a better way to do this? I want to maximize performance while also having maintainable, non-brittle code. Any help would be greatly appreciated.
As @hazzit said, it is difficult to answer without more details, but here are my two cents on this. Raw SQL is usually needed to perform complex operations like aggregates, calculations, etc. However, when it comes to search / filtering features, I often find raw SQL to be overkill and not very maintainable.
The key question here is: can you break down your problem into multiple independent filters?
If the answer is yes, then you should leverage the power of ActiveRecord and Arel. I often find myself implementing something like this in my model:
scope :a_scope, ->{ where something: true }
scope :another_scope, ->( option ){ where an_option: option }
scope :using_arel, ->{ joins(:assoc).where Assoc.arel_table[:some_field].not_eq "foo" }
# cue a bunch of scopes
def self.search( options = {} )
  relation = all  # start from the model's base relation
  relation = relation.a_scope if options[:an_option]
  relation = relation.another_scope( options[:another_option] ) unless options[:flag]
  # add logic as you need it
  relation  # return the composed relation so callers can chain further
end
The beauty of this solution is that you declare a clean interface into which you can directly pour all the params from your checkboxes and fields, and that returns a relation. Breaking the query into multiple, reusable scopes helps keep the thing readable and maintainable; using a search class method ties it all together and allows thorough documentation... And all in all, using Arel helps secure the app against injections.
As a side note, this does not prevent you from using raw SQL, as long as the query can be isolated inside a scope.
If this method is not suited to your needs, there's another option: use a full-fledged search / filtering solution like Sunspot. It uses another store, separate from your db, that indexes defined parts of your data for easy and performant search.
It is hard to answer this question fully without knowing more details, but I'll try anyway.
While databases are bad at quite a few things, they are very good at filtering data, especially when it comes to high volumes.
If you do the filtering in Ruby on Rails (or just about any other programming language), the system will have to retrieve all of the unfiltered data from the database, which will cause tons of disk I/O and network (or interprocess) traffic. It then has to go through all those unfiltered results in memory, which may be quite a burden on RAM and CPU.
If you do the filtering in the database, there is a pretty good chance that most of the records will never actually be retrieved from disk, handed over to RoR, or filtered in memory. Indexes exist for the very purpose of avoiding expensive operations like these and speeding things up. (Yes, they also help maintain data integrity.)
To make this work, however, you may need to help the database a bit so it can do its job efficiently. You will have to create indexes matching your filtering criteria, and you may have to look into performance issues with certain types of queries (how to avoid temporary tables and such). However, it is definitely worth it.
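For example, if applicants are most often filtered on a handful of those boolean columns (the column names below are invented), a composite index over the commonly combined ones lets the database answer such queries largely from the index:
CREATE INDEX idx_applicants_common_filters
    ON applicants (is_certified, is_remote, has_degree);

-- A typical filter can then be resolved without scanning the whole table
SELECT *
FROM applicants
WHERE is_certified = 1 AND is_remote = 1 AND has_degree = 1;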
That said, there actually are a few types of queries that a given database is not good at. They are few and far between, but they do exist. In those cases, an implementation in RoR might be the better way to go. Even without knowing more about your scenario, I'd say it's a pretty safe bet that your queries are not among them.

Dynamic LINQ (Select clause) keyword issue:

I have been using an application I made and distributed around my company, which gives the average non-techie user (accountant, marketing type, management) the ability to query any size DB with fast and friendly results. It uses the Dynamic.cs class to:
Select all tables in a given DB
Select some filtered columns/fields in any table
At runtime, figure out what the type is and then choose which operators the user can enter to aid their query
Give the ability to only display the fields the user selects
Finally, give the ability to Order By and Group By
People, especially my superiors, LOVE it as it is incredibly useful. I can put this application on any DB and in 5 minutes they are able to query and export to Excel Worksheets in seconds.
Now, here is my issue: when I generate my select clause, if I have a field named "Object" I get a parse error from Dynamic.cs, "Expecting '(' or '.'". I am quite sure this is a keyword issue; when the parser hits the word Object it gets confused.
One of my developers suggests just writing a partial class to get around the issue, but I think this is a serious enough bug that I would rather fix the Dynamic.cs class itself.
Can anyone help me with this? I have researched but have not found anything pointing me in the right direction. I am pretty sure I could fix this, but time is not on my side.
Thanks in advance!!
First of all, I guess you are making a syntax error; secondly, even if you remove that syntax error it won't work, because you are trying to create a dynamic select statement, which is not as simple as you might think.
The Dynamic LINQ library or regular expressions can help you.

Abstracted JOIN for maintainability?

Does anyone know an ORM that can abstract JOINs? I'm using PHP, but I would take ideas from anywhere. I've used Doctrine ORM, but I'm not sure if it supports this concept.
I would like to be able to specify a relation that is actually a complicated query, and then use that relation in other queries. Mostly this is for maintainability, so I don't have a lot of replicated code that has to change if my schema changes. Is this even possible in theory (at least for some subset of "complicated query")?
Here's an example of what I'm talking about:
ORM.defineRelationship('Message->Unresponded', '
    LEFT JOIN Message_Response
        ON Message.id = Message_Response.Message_id
    LEFT JOIN Message AS Response
        ON Message_Response.Response_id = Response.id
    WHERE Response.id IS NULL
');

ORM.query('
    SELECT * FROM Message
    SUPER_JOIN Unresponded
');
Sorry for the purely invented syntax. I don't know if anything like this exists. It would certainly be complicated if it did.
One possibility would be to write this join as a view in the database. Then you can use any query tools on the view.
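For example, the 'Unresponded' relation from the question could be captured once as a view (table and column names taken from your sketch; the view name is mine):
CREATE VIEW unresponded_message AS
    SELECT Message.*
    FROM Message
    LEFT JOIN Message_Response
        ON Message.id = Message_Response.Message_id
    LEFT JOIN Message AS Response
        ON Message_Response.Response_id = Response.id
    WHERE Response.id IS NULL;

-- Other queries can then treat it like any table
SELECT * FROM unresponded_message;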
Microsoft's Entity Framework also supports very complex mappings between code entities and database tables, even crossing databases. The query you've given as an example would be easily supported in terms of mapping from that join of tables to an entity. You could then execute further queries against the resulting joined data using LINQ. Of course, if you're using PHP this may not be of much use to you.
However I'm not aware of a product that wraps up the join into the syntax of further queries in the way you've shown.