I'm developing an app that shows sports scores in real time. I'm using a paid API with a limited number of requests, and to show the score in real time I'm using short polling (hitting the API every 2-3 seconds to see whether the score has changed).
If I place that API URL directly in the application, then every application user would be hitting the API independently. Assuming 10 users are using the application, 10 API calls would be deducted every interval (2-3 seconds), right?
So what strategy (a better way or approach) should I use to prevent these multiple API calls?
What I could come up with is to store the API JSON response in a MySQL database. That way I would serve the data to application users from the database (users would hit my database, not the actual API). Is that the correct way to do it?
Store the API JSON response in the MySQL database
Convert the MySQL data back into JSON format
Have the application users poll that JSON response from the database
I don't know if this is the correct way to do it! That's why I posted this question.
Thank you
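The idea above can be taken one step further: rather than a database round-trip, a single server-side cache in front of the paid API is enough. Below is a minimal sketch of that approach; the `fetchScores` function, the TTL value, and the score payload shape are all assumptions standing in for the real paid API.

```javascript
// Minimal sketch of server-side caching: wrap the paid API behind a
// time-based cache so N concurrent users cost at most one upstream
// call per TTL window. The fetcher is injected, so the real API URL
// and HTTP client are interchangeable (both are assumptions here).

class ScoreCache {
  constructor(fetchScores, ttlMs = 2500) {
    this.fetchScores = fetchScores; // async () => score JSON from the paid API
    this.ttlMs = ttlMs;             // matches the 2-3 second polling interval
    this.value = null;
    this.lastFetched = 0;
    this.pending = null;            // de-duplicates concurrent refreshes
  }

  async get(now = Date.now()) {
    if (this.value !== null && now - this.lastFetched < this.ttlMs) {
      return this.value;            // fresh enough: no upstream call made
    }
    if (!this.pending) {
      // Only the first caller in a window triggers a real fetch; everyone
      // else awaits the same in-flight promise.
      this.pending = this.fetchScores()
        .then((v) => {
          this.value = v;
          this.lastFetched = Date.now();
          return v;
        })
        .finally(() => { this.pending = null; });
    }
    return this.pending;
  }
}
```

An Express route would then just `res.json(await cache.get())`, so ten (or ten thousand) app users polling your server still produce roughly one paid API call per interval.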
I am trying to build a simple project online. I have a MySQL database where I will store different information, such as fake orders made by fake clients. The application will consist of a frontend made with JavaScript and HTML/CSS, while the backend will be a Node/Express API that handles all the requests and the database.
I wanted to know whether there is a way, each time a new order is made, to have my page refresh and see, for example, a new row in a hypothetical table in my HTML with minimum latency, while avoiding making a request from the client every x seconds. Polling like that would be quite expensive in terms of bandwidth and also pretty inefficient.
I thought that each time I connect to the site, I could get subscribed to a sort of list on the server, which broadcasts a trigger to update the frontend whenever the UPDATE function is triggered in the backend. In other words, every time an update happens on the backend, the server sends a trigger to the clients it knows are currently connected, and the frontend then asks for the updated data directly.
This solution seems really complicated to handle and may not be that performant. I was wondering whether there are functionalities of the frontend, the backend, or the database, or any framework, that would let me do this.
I would like everything to be as real-time as possible while using the least bandwidth possible. This is because I would like to stay within the free tier of some online service, and I don't want to consume all the bandwidth.
If you have suggestions for frameworks, functionalities, or protocols, you are welcome. Thank you a lot in advance.
You can use WebSockets. When a user creates an order, and after it is successfully saved to the database, the backend pushes (publishes) the data to every client subscribed to a specific channel. The logic is not complicated at all; it is called the pub/sub pattern, and it is worth searching for.
Also, https://socket.io/ is a library used on both the backend and the frontend to deal with WebSockets.
I'm working on a RESTful API using node and express (with a MySQL database).
I have a table of products, and a page on my front end which is supposed to display those products. The problem is, there are hundreds (sometimes thousands) of them; I can't just send all of them back in one response.
What is the best way (in terms of performance and speed) for transferring all those rows back to the client?
I thought of streaming the data so that the client starts displaying products while the stream is still in progress. But I have no idea how to do that, or whether that's the best way.
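A common alternative to streaming here is keyset pagination: the client fetches rows in pages as the user scrolls, so the first products render almost immediately and deep pages stay fast. This sketch assumes a `products(id, name, price)` table and a mysql2 connection pool; both are assumptions, and the Express handler is shown as a comment since it needs that pool.

```javascript
// Keyset (a.k.a. cursor) pagination: instead of OFFSET, which makes MySQL
// scan and discard all earlier rows, filter on the last id the client saw.
// The primary-key seek keeps page N as cheap as page 1.

function keysetQuery(afterId = 0, pageSize = 50) {
  return {
    sql: 'SELECT id, name, price FROM products WHERE id > ? ORDER BY id LIMIT ?',
    params: [afterId, Math.min(pageSize, 100)], // cap the page size server-side
  };
}

// Express handler sketch (assumes `pool` is a mysql2/promise pool):
// app.get('/products', async (req, res) => {
//   const { sql, params } = keysetQuery(Number(req.query.after) || 0,
//                                       Number(req.query.limit) || 50);
//   const [rows] = await pool.query(sql, params);
//   // The client passes nextAfter back as ?after= to fetch the next page.
//   res.json({ rows, nextAfter: rows.length ? rows[rows.length - 1].id : null });
// });
```

True row streaming is also possible (mysql2 can emit rows as they arrive), but pagination is usually simpler and plays better with HTTP caching and retries.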
I am offering a RESTful API to clients that access it as a web service. Now I would like to be able to count the API calls per month.
What would be the best way to count the calls? Incrementing a DB field would mean one extra DB call per request. Is there a workaround? We are talking about millions of API calls per month.
You can also log to a text file and use log-analytics tools such as Webalizer (http://www.webalizer.org/) to analyze the files.
HTH
You can use a separate in-memory database to track these counts and write them to disk occasionally. Or store the calls in an in-memory collection and batch-write them to the database occasionally.
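The batch-write idea can be sketched as follows: count calls in an in-memory map and flush the aggregated totals every few seconds, turning millions of per-request writes into a handful of batched ones. The `persist` callback and the key format are assumptions standing in for the real database write.

```javascript
// Batched call counter: record() is O(1) in memory on the hot path;
// flush() runs on a timer and hands the aggregated counts to the DB in
// one go (e.g. INSERT ... ON DUPLICATE KEY UPDATE count = count + n).

class CallCounter {
  constructor(persist, flushEveryMs = 5000) {
    this.persist = persist;      // async (Map of 'clientId:month' -> n) => void
    this.counts = new Map();
    this.timer = setInterval(() => this.flush(), flushEveryMs);
  }

  record(clientId, month) {      // called once per API request, no I/O
    const key = `${clientId}:${month}`;
    this.counts.set(key, (this.counts.get(key) || 0) + 1);
  }

  async flush() {
    if (this.counts.size === 0) return;
    const snapshot = this.counts; // swap so calls during the write aren't lost
    this.counts = new Map();
    await this.persist(snapshot);
  }

  stop() { clearInterval(this.timer); }
}
```

The trade-off is that a crash loses at most one flush interval of counts, which is usually acceptable for usage metering; if not, the in-memory-database route (e.g. Redis `INCR`) keeps the counts out-of-process.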
I am developing a GPS application that tracks a user, sort of like the app Runkeeper that tracks where you have been on your run.
In order to do this, should I store the GPS coordinates in an SQLite database on the phone, or on Google App Engine and then, when the user selects the data, send the entire set to the phone?
What would be a better design?
I have a lot of experience in this category, as I am also developing a similar app. I save data on the phone, but periodically check for a connection and then upload the data to App Engine, after which I delete the data from the phone (so it doesn't consume too much phone memory). I store the uploaded file names in the datastore and the files themselves in Cloud Storage (which can accept larger sizes than the datastore). Then I load the data into BigQuery for analysis and as the main storage of all data, and I also track which data was successfully uploaded to BigQuery. So there are really four stores involved just to get the data into BQ; it's quite an effort.
My next phase is doing the analytics using BigQuery and sending info back to the phone app. I also use Tableau, as they have a good solution for reading BigQuery; it recognizes geo data, and you can plot lat/lon on an implementation of OpenStreetMap.
Certainly there are other solutions. The downside of mine is that the upload to BigQuery is slow; sometimes it takes minutes before the data is available. But on the other hand, Google's cloud tools are quite good, and they integrate with third-party analytics/viewers.
Keep me updated on how you progress.
Each day hundreds of thousands of items are inserted, updated, and deleted on our service (a .NET backend with a MySQL database).
Now we are integrating our service with another service using their RESTful API. Each time an item is inserted, updated, or deleted on our service, we also need to connect to their web service and use POST, PUT, or DELETE.
What is a good implementation for this case?
Connecting to their API each time a user inserts an item on our service seems like a bad idea, as it would make the experience quite slow for the user.
Another idea was to update our database as usual, then set up another server that constantly polls our database and pushes the data that needs to be sent to the RESTful API. Is this the way to go?
How would you solve it? Any guides on implementing something like this would be great! Thanks!
It depends on whether a delay in updating the other service is acceptable. If not, create an event and put it on the queue of an event processor that sends it to the second service.
If a delay is acceptable, a background batch job can run periodically and send the data.
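The queue-plus-event-processor option can be sketched like this: the save path only enqueues an event and returns, and a worker drains the queue by calling the partner API, retrying on failure, so a slow or unavailable partner never blocks the user. The in-memory queue, the `send` callback, and the retry limit are assumptions; in production the queue should be durable (a DB outbox table, RabbitMQ, SQS, etc.).

```javascript
// Outbox-style sync queue: enqueue() is called from the normal insert/
// update/delete path and is fire-and-forget; drain() delivers events to
// the partner REST API in order, removing each one only after success.

class SyncQueue {
  constructor(send) {
    this.send = send;     // async (event) => void; the POST/PUT/DELETE call
    this.events = [];
    this.draining = false;
  }

  enqueue(action, item) {
    this.events.push({ action, item, attempts: 0 });
    this.drain();          // not awaited: the user's request isn't blocked
  }

  async drain() {
    if (this.draining) return;   // single consumer at a time
    this.draining = true;
    while (this.events.length > 0) {
      const event = this.events[0];
      try {
        await this.send(event);
        this.events.shift();     // remove only after the partner accepted it
      } catch (err) {
        event.attempts += 1;
        if (event.attempts >= 3) this.events.shift(); // dead-letter it in real life
        else break;              // leave in queue; retry on next drain/timer
      }
    }
    this.draining = false;
  }
}
```

With a durable queue the same shape covers both of the answer's cases: drain immediately when delay is unacceptable, or run `drain()` on a timer as the periodic batch job.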