I am currently making an online multiplayer chess game. I was wondering if it would be appropriate to use triggers to log every move.
Then, using Node.js, I would repeatedly check the trigger table and update the visuals for both players and observers.
Since I only need to make changes in the database, the visual side would follow automatically (using the recurring function to check for changed data in the database). The goal of this would be to separate the visuals from the logic involved in making the game.
Would any of you recommend this technique, or is it simply a no-go?
You describe a possible technical solution for your task, but I strongly recommend NOT doing it this way.
This will not scale very well - it adds a lot of overhead and load to both your database and application server.
There are other lightweight possibilities that scale much better:
use a message queue, for example Redis with the node_redis client; it has built-in pub/sub semantics (see the sketch after this list).
abstract your database calls and push all database updates to the message queue, too.
instead of using a "recurring" function (AJAX polling) to get status updates, you could use an HTTP streamer like jquery-stream or jquery-socket, for example. This avoids the overhead of opening a new HTTP connection for each client update.
use the event-driven features of Node.js on the server side to push new events to the client connections.
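To make the first two points concrete, here is a minimal sketch, assuming the node_redis (v4) client; broadcastToClients() is a placeholder for whatever pushes data over the open player/observer connections:

    const { createClient } = require('redis');

    const publisher = createClient();
    const subscriber = createClient(); // pub/sub needs its own connection

    async function watchGame(gameId, broadcastToClients) {
      await subscriber.connect();
      // node-redis v4 takes the message listener directly in subscribe()
      await subscriber.subscribe(`game:${gameId}`, (message) => {
        broadcastToClients(JSON.parse(message)); // push to players & observers
      });
    }

    // Call this right after a validated move has been written to the database
    async function publishMove(gameId, move) {
      if (!publisher.isOpen) await publisher.connect();
      await publisher.publish(`game:${gameId}`, JSON.stringify(move));
    }

This keeps the database out of the notification path entirely: MySQL stores the authoritative game state, while Redis carries the "something changed" events.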
I am trying to build a simple project online. I have a MySQL database where I will store different information, such as fake orders made by fake clients. The application will consist of a frontend made with JavaScript and HTML/CSS, while the backend will be a Node/Express API that should handle all the requests and manage the database.
I wanted to know whether there is a way to have my page refresh each time a new order is made, and see, for example, a new row in a hypothetical table in my HTML with minimum latency, while avoiding making a request from the client every x seconds. That would be quite expensive in terms of bandwidth and also pretty inefficient.
I thought that each time I connect to the site, I could get subscribed to a sort of list on the server that broadcasts a trigger to update the frontend whenever the UPDATE function is triggered on the backend. In other words, every time an update happens on the backend, the server sends a trigger to the clients that it knows are currently connected. Then, the frontend asks for the update directly.
This solution is really complicated to handle and may not be that performant. I was wondering whether there are features of the frontend, the backend, the database, or any framework that would let me do this.
I would like to have everything as real-time as possible, using the least bandwidth possible. This is because I would like to use the free tier of some online service, and I don't want to consume all the bandwidth.
If you have suggestions of frameworks, features, or protocols, you are welcome. Thank you a lot in advance.
You can use websockets. When a user creates an order, and after it has been successfully saved to the database, the backend will push (publish) the data to the clients subscribed to a specific channel. The logic is not complicated at all; it is called the pub/sub pattern, and you should search for it.
Also, https://socket.io/ is a library used on both the backend and the frontend to deal with websockets.
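A minimal sketch of that flow with Socket.IO, assuming an Express backend; saveOrderToDb() and addRowToOrdersTable() are placeholders for your own MySQL insert and DOM update:

    // backend (Node/Express)
    const express = require('express');
    const http = require('http');
    const { Server } = require('socket.io');

    const app = express();
    app.use(express.json());
    const server = http.createServer(app);
    const io = new Server(server);

    app.post('/orders', async (req, res) => {
      const order = await saveOrderToDb(req.body); // your MySQL insert, assumed
      // Only after the save succeeds, push the new order to every
      // connected client - no polling from the frontend needed.
      io.emit('order:created', order);
      res.status(201).json(order);
    });

    server.listen(3000);

    // frontend (loads /socket.io/socket.io.js)
    const socket = io();
    socket.on('order:created', (order) => addRowToOrdersTable(order));

The client subscribes once when the page loads, and each broadcast is a single small message, so the bandwidth cost stays close to the minimum possible.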
Currently I am using MySQL as the main database, which is not a real-time database. To synchronize client-side data with the server and keep data when offline, I am considering adding a real-time database as a slave database in my architecture. Synchronizing data is not easy, so I want to use Cloud Firestore.
From what I have searched so far, it seems there is no practical way to synchronize a non-real-time RDBMS (in my case MySQL) with Cloud Firestore. I cannot migrate the current data to Cloud Firestore because other services depend on it.
If there is no practical solution for this, please suggest the best way. Thanks.
Not sure, but it seems there is no ready-made solution to synchronize these. In a case like this, the synchronization has to be implemented manually on the client side.
I have been thinking about this as well. What I have thought up so far is this.
Use Firestore to create a document.
Write a cloud function that listens to create events on the collection that stores that document.
Send the created document over HTTP to a REST endpoint that stores the data in a relational DB (MySQL, Postgres); if the server is down or the status code is anything other than 200, set a boolean flag called syncFailed on the document to true.
Write a worker that periodically fetches documents where syncFailed is true and resends them to the endpoint until the sync succeeds (see the sketch below).
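A sketch of steps 2-4 as a first-generation Firebase Cloud Function, assuming a Node 18+ runtime (for the global fetch); the collection name and endpoint URL are placeholders:

    const functions = require('firebase-functions');

    exports.syncToSql = functions.firestore
      .document('orders/{orderId}')
      .onCreate(async (snap) => {
        try {
          const res = await fetch('https://example.com/api/orders', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify(snap.data()),
          });
          if (res.status !== 200) throw new Error(`status ${res.status}`);
        } catch (err) {
          // Mark the document so the retry worker can find it later
          await snap.ref.update({ syncFailed: true });
        }
      });

The retry worker from step 4 could then be a scheduled function that queries for documents where syncFailed is true and replays them against the same endpoint.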
THINGS TO CONSIDER
Be careful in adopting this approach with update events. You may accidentally create an infinite loop that will cost you all your wealth. If you do add an update listener on a document, make sure you devise a way to avoid creating an infinite loop: you may update a document from an update-listening cloud function, which re-triggers that cloud function, which updates the document again, and so on. Make sure you use a boolean flag on the document or some field as a sentinel value to break out of such a loop.
I have an Android frontend.
The Android client makes a request to my NodeJS backend server and waits for a reply.
The Node.js server reads a value from a MySQL database record (without sending it back to the client) and waits for that value to change (another Android client changes it with a different request within 20 seconds); when that happens, the Node.js server replies to the client with the new value.
Now, my approach was to create a MySQL trigger that notifies the Node.js server when there is an update in that table, but I don't know how to do it.
To give you an idea, I thought of two easier ways using busy waiting:
the client sends a request every 100 ms and the server replies with the SELECT of that value; when the client gets a different reply, it means the value has changed;
the client sends a request, and the server runs a SELECT query every 100 ms until it gets a different value, then replies to the client with that value.
Both are brute-force approaches, and I would rather not use them, for obvious reasons. Any ideas?
Thank you.
Welcome to StackOverflow. Your question is very broad and I don't think I can give you a very detailed answer here. However, I think I can give you some hints and ideas that may help you along the road.
MySQL has no internal way of running external commands as a trigger action. To my knowledge there exists a workaround in the form of an external plugin (UDF) that allows MySQL to do what you want. See Invoking a PHP script from a MySQL trigger and https://patternbuffer.wordpress.com/2012/09/14/triggering-shell-script-from-mysql/
However, I think going this route is a sign of using the wrong architecture or wrong design patterns for what you want to achieve.
The first idea that pops into my mind is this: would it not be possible to introduce some sort of messaging from the second Node.js request (the one that changes the DB) to the first one (the one that needs an update when the DB value changes)? That way the first Node.js "process" only needs to query the DB upon real changes, when it receives a message.
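For instance, if both requests are handled by the same Node.js process, a plain EventEmitter already gives you that messaging (a sketch, assuming Express-style res objects; the function names are made up):

    const EventEmitter = require('events');
    const changes = new EventEmitter();

    // First request: hold the response open until the value changes (max 20 s)
    function waitForChange(key, res) {
      const timer = setTimeout(() => {
        changes.removeListener(key, onChange);
        res.status(204).end(); // nothing changed in time
      }, 20000);

      function onChange(newValue) {
        clearTimeout(timer);
        res.json({ value: newValue });
      }
      changes.once(key, onChange);
    }

    // Second request: call this right after the UPDATE query succeeds
    function notifyChange(key, newValue) {
      changes.emit(key, newValue);
    }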
Another question would be whether you actually need to use MySQL, or whether some other datastore might be better suited. Redis comes to mind, since with Redis you could implement the messaging to Node.js at the same time...
In general, polling is not always the wrong choice, especially in high-load environments where you expect each poll to collect some data. Polling makes it impossible to overload the processing capacity of the data-retrieving side, since that process controls the maximum throughput. With pushing, you hand that control to the pushing side, and if there are many such pushing sides, control is hard to achieve.
If I were you, I would look into Redis and learn how elegantly its publish/subscribe mechanism can be used as a messaging system in your context. See https://redis.io/topics/pubsub
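If the two requests can land on different processes, the same pattern works across process boundaries via Redis (a node-redis v4 sketch; the channel naming is made up):

    const { createClient } = require('redis');
    const client = createClient();

    // Writer process: publish right after the UPDATE query succeeds
    async function publishChange(key, newValue) {
      if (!client.isOpen) await client.connect();
      await client.publish(`changed:${key}`, String(newValue));
    }

    // Waiting process: subscribe, reply once, then clean up
    async function replyOnChange(key, res) {
      const subscriber = client.duplicate(); // pub/sub needs its own connection
      await subscriber.connect();
      await subscriber.subscribe(`changed:${key}`, async (message) => {
        res.json({ value: message });
        await subscriber.unsubscribe(`changed:${key}`);
        await subscriber.quit();
      });
    }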
I have a requirement to make endpoint calls to multiple Web APIs designed by other companies. These calls will be made on a periodic basis, like once an hour or once a day, to post and retrieve some data (business-to-business transactions). I am working with the .NET Framework and ServiceStack.
I am not sure what would be the best approach to achieve this type of functionality.
Maybe I could have a Windows Service application that scans the relevant config tables in SQL Server, generates cURL commands, and executes them? I am not sure whether this is the correct approach, or whether there is something better you would like to propose.
I have never worked with cURL before; these are just initial thoughts.
To achieve this, your backend needs a data structure to hold all the necessary data for the requests (which can be a database table, as you suggest) and a scheduling mechanism. This could be as simple as a timer that, when triggered, picks up the requests and executes them (using the built-in HttpClient, for instance). IMO you should keep this logic within the application itself; there is no need to complicate things by introducing a system-dependent service that then issues curl commands at the OS level.
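The pattern itself is stack-agnostic; here it is sketched in Node.js, the language used elsewhere on this page (in .NET the equivalents would be a Timer plus HttpClient; loadEndpointConfig() and recordResult() are placeholders for your SQL Server access, and a Node 18+ runtime is assumed for the global fetch):

    const EVERY_HOUR = 60 * 60 * 1000;

    async function runScheduledCalls() {
      const endpoints = await loadEndpointConfig(); // your config table, assumed
      for (const ep of endpoints) {
        try {
          const res = await fetch(ep.url, {
            method: ep.method,
            headers: { 'Content-Type': 'application/json' },
            body: ep.method === 'POST' ? JSON.stringify(ep.payload) : undefined,
          });
          await recordResult(ep, res.status, await res.text());
        } catch (err) {
          await recordResult(ep, 'error', err.message); // retried on the next tick
        }
      }
    }

    // Everything stays inside the application process - no OS-level curl
    setInterval(runScheduledCalls, EVERY_HOUR);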
Well, I don't know if the title for this question is appropriate or not, but I didn't know how to put this in a few words.
I'm currently developing a multiplayer 2D game using Node.js and Socket.IO on the server side and HTML5 on the client side. This game doesn't need to save the players' progress until they finish it. And I must make sure that all information about the players, such as scores and helps, is always valid. So I decided to centralize this information on the server; this way the clients never send a score to the server, instead the server calculates the scores based on the information sent and sends them to all clients. Besides that, I can have different sessions of this game running, with 2 players minimum and 4 players maximum.
At first I decided to implement this with the server always maintaining the game sessions' data in memory, but now I'm asking myself whether I should have used a database to store this data instead. I'm using a database to store session data only when a session is finished, because I have no use for unfinished session data. But should I instead keep the sessions' and players' data in the database while they are playing? My problem here is that the clients communicate very frequently with the server; with this approach I would have to first request their data from the database, make the necessary changes, store it back in the database, and repeat this process on each client request. Maybe the answer to this is obvious, but should I use this approach instead?
This is the first time I'm developing a game, so I have no idea how things usually work. I just know that I want the server to be fast. I chose to keep everything in memory mainly because none of the examples and tutorials I found about multiplayer game development ever mentioned a database...
Thank you!
I am also developing a multiplayer game using Node.js and Socket.IO. You should not access the database on each client request because:
1) I/O operations are expensive.
Reading from or writing to a database is "expensive" (slow). It is much faster to read and write from memory.
2) I/O operations should be asynchronous in Node.js.
read_or_alter_database(input, function (err, result) {
  // this callback runs only once the database operation has finished
  update_the_client(result);
});
This makes the database operation non-blocking: the rest of your application still runs until the operation is done, and then the callback function is executed. But if the player's client relies on the database access to update the game state, it effectively becomes a blocking operation (since the game cannot proceed until the database operation is done), which negates the main purpose of Node.js.
3) There will be a high volume of client requests in multiplayer games.
This plus point 1 results in a big efficiency loss.
Sounds like you already have your answer: Since the state is only important once a session is complete, only store the data upon completion of that session. Storing the intermediate values serves no purpose, and only slows things down.
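A sketch of that shape: session state lives in a plain in-memory Map while the game runs, and only a finished session ever touches the database (applyRules() and persistFinishedSession() are placeholders for your game logic and DB insert; io is the Socket.IO server, with sockets assumed to have joined a room per session):

    const sessions = new Map(); // sessionId -> { players, scores, ... }

    function handleClientEvent(sessionId, playerId, event) {
      const session = sessions.get(sessionId);
      // The server computes scores from the event; clients never send scores
      applyRules(session, playerId, event);
      io.to(sessionId).emit('state', session); // push the new state to all players
    }

    async function finishSession(sessionId) {
      const session = sessions.get(sessionId);
      await persistFinishedSession(session); // the only database write
      sessions.delete(sessionId); // free the memory
    }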
Node.js is a relatively new development platform - it is uncertain whether it would be able to support such a system. As far as what is NORMALLY done: as much information as possible is stored client-side, first off, and then servers typically hold data in memory; at least, I know this is the case for the popular games League of Legends, World of Warcraft, and StarCraft II. Then again, those applications are not HTML/Node.js, are very demanding on the system, and are made by huge development companies - it may be that the DB approach is fine for a smaller game; I've just never heard of it.