I am offering a RESTful API that clients access as a web service. Now I would like to be able to count the API calls per month.
What would be the best way to count the calls? Incrementing a DB field would mean one extra DB call per request. Is there a workaround? We are talking about millions of API calls per month.
You can also log each call to a text file and use a log analysis tool such as Webalizer (http://www.webalizer.org/) to analyze the files.
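For illustration, a minimal sketch of that idea on a Node.js backend (the file name and format details here are assumptions; for Webalizer you would emit real Common Log Format, including its date layout):

```typescript
import * as fs from "fs";
import * as http from "http";

const LOG_FILE = "api-access.log"; // hypothetical path

const server = http.createServer((req, res) => {
  res.end("ok"); // ...your actual API handling goes here...
  // One log line per call; Webalizer-style tooling then does the counting.
  // Real Common Log Format uses a [dd/Mon/yyyy:HH:mm:ss +0000] timestamp.
  const line =
    `${req.socket.remoteAddress} - - [${new Date().toISOString()}] ` +
    `"${req.method} ${req.url} HTTP/${req.httpVersion}" ${res.statusCode} -\n`;
  fs.appendFile(LOG_FILE, line, (err) => { if (err) console.error(err); });
});

server.listen(8080);
```

Appending to a file is cheap compared to a DB write, and the analysis happens offline.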
HTH
You can use a separate in-memory database to track these values and write them to disk occasionally. Or store the calls in a collection and batch-write them to the database periodically.
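A minimal sketch of the batching idea, assuming a Node.js service and a hypothetical db.incrementApiCalls(month, count) persistence helper:

```typescript
// Hypothetical persistence helper, e.g. UPDATE stats SET calls = calls + ? WHERE month = ?
declare const db: { incrementApiCalls(month: string, count: number): Promise<void> };

const pending = new Map<string, number>(); // month ("YYYY-MM") -> call count

export function recordApiCall(): void {
  const month = new Date().toISOString().slice(0, 7); // e.g. "2024-05"
  pending.set(month, (pending.get(month) ?? 0) + 1);  // in-memory only, no DB hit
}

// Flush the accumulated counters every 30 seconds, turning millions of
// per-request increments into a handful of batched writes.
setInterval(async () => {
  const batch = [...pending.entries()];
  pending.clear();
  for (const [month, count] of batch) {
    await db.incrementApiCalls(month, count);
  }
}, 30_000);
```

The trade-off is that counts accumulated since the last flush are lost if the process crashes, which is usually acceptable for usage statistics.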
I'm developing an app that shows the scores of sports games in real time. I'm using a paid API with a limited number of requests, and to show the score in real time I'm using a short-polling technique (hitting the API every 2-3 seconds to see whether the score has changed).
If I place that API URL directly in the application, every user would be hitting the API directly. Assuming 10 users are using the application, 10 API calls would be deducted every polling interval (2-3 seconds), right?
So what strategy (better way or approach) should I use to prevent these multiple API calls?
What I could come up with is storing the API JSON response in the MySQL database. This way, I would be serving the data to application users through the database (users would hit the database, not the actual API). Is this the correct way to do it?
Store the API JSON response in the MySQL database
Then convert the stored data back into JSON
and then have the application users poll the database-backed JSON response (a rough sketch of this flow is below)
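A rough sketch of that flow, assuming a Node.js backend (the URL and the db helpers are placeholders). The point is that one server-side poller spends the paid quota, no matter how many users are connected:

```typescript
// Hypothetical cache helpers, e.g. a single-row MySQL table holding the raw JSON.
declare const db: {
  saveScores(json: string): Promise<void>; // UPDATE scores_cache SET payload = ?
  loadScores(): Promise<string>;           // SELECT payload FROM scores_cache
};

const PAID_API_URL = "https://scores.example.com/live"; // placeholder

// Single poller: one paid API call every 3 seconds, regardless of user count.
setInterval(async () => {
  const res = await fetch(PAID_API_URL);
  await db.saveScores(await res.text()); // store the raw JSON as-is
}, 3_000);

// The endpoint your own users poll instead of the paid API.
export async function getScores(): Promise<string> {
  return db.loadScores(); // already JSON, so no reconversion step is needed
}
```

Storing the raw JSON payload as-is also removes the "convert back into JSON" step.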
I don't know if this is the correct way to do it! That's why I posted this question.
Thank you
I am using Google Cloud Datastore to store some data. When trying to extract all entities from my "Kind" through the API in Google Apps Script (GAS) code, I realized that the API returns 300 entities per request. To extract all of the entities, I used the "cursor" option to fetch each next batch where the previous one stopped.
Is there any way to extract all entities (or at least more than 300) at once?
While searching the web, I did not find a specific answer.
The maximum number of entities you can update/upsert in one go via Datastore's Data API is 500, but with a lookup operation you could potentially fetch 1,000 entities (as long as they are collectively under 10 MB for the transaction), as listed under the "Limits" section of Datastore's reference documentation.
However, you might be able to leverage the export/import endpoints of Datastore's Admin API to export/import data in bulk. Check out the guide for more information.
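For reference, the cursor loop from the question might look like the sketch below against Datastore's v1 runQuery REST endpoint (PROJECT_ID and the auth helper are placeholders; in GAS you would use UrlFetchApp instead of fetch, but the paging logic is the same):

```typescript
const PROJECT_ID = "my-project";           // placeholder
declare function getAccessToken(): string; // hypothetical auth helper

async function fetchAllEntities(kind: string): Promise<unknown[]> {
  const url = `https://datastore.googleapis.com/v1/projects/${PROJECT_ID}:runQuery`;
  const entities: unknown[] = [];
  let cursor: string | undefined;

  while (true) {
    const res = await fetch(url, {
      method: "POST",
      headers: {
        Authorization: `Bearer ${getAccessToken()}`,
        "Content-Type": "application/json",
      },
      // startCursor resumes where the previous batch stopped.
      body: JSON.stringify({ query: { kind: [{ name: kind }], startCursor: cursor } }),
    });
    const batch = (await res.json()).batch;
    entities.push(...(batch.entityResults ?? []));
    cursor = batch.endCursor;
    // Stop when the server reports no further results (or returns an empty batch).
    if (!batch.entityResults?.length || batch.moreResults === "NO_MORE_RESULTS") break;
  }
  return entities;
}
```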
I have a requirement to make endpoint calls to multiple Web APIs designed by other companies. These calls will be made on a periodic basis, such as once an hour or once a day, to post and retrieve data (business-to-business transactions). I am working with the .NET Framework and ServiceStack.
I am not sure what the best approach would be to achieve this type of functionality.
Maybe I could have a Windows Service application that scans the relevant config tables in SQL Server, generates cURL commands, and executes them? I am not sure whether this is the correct approach or whether there is something better you would propose.
I have never worked with cURL before; these are just initial thoughts.
To achieve this, your backend needs a data structure to hold all of the necessary data for the requests (which can be a database table, as you suggest) and a scheduling mechanism. This could be as simple as a timer that, when triggered, picks up the pending requests and executes them (using the built-in HttpClient, for instance). IMO you should keep this logic within the application itself; there is no need to complicate things by introducing a system-dependent service that issues curl commands at the OS level.
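A sketch of that timer shape (in TypeScript for brevity; in .NET the equivalent would be a BackgroundService or Timer plus HttpClient, and the table/helper names here are assumptions):

```typescript
interface PendingRequest { url: string; method: string; body?: string; }

// Hypothetical helper that reads due requests from your SQL Server config tables.
declare const db: { duePendingRequests(): Promise<PendingRequest[]> };

setInterval(async () => {
  for (const req of await db.duePendingRequests()) {
    const res = await fetch(req.url, {
      method: req.method,
      body: req.body,
      headers: { "Content-Type": "application/json" },
    });
    if (!res.ok) console.error(`${req.method} ${req.url} failed: ${res.status}`);
  }
}, 60 * 60 * 1000); // hourly tick; per-request schedules can live in the table
```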
I have been working with Angular for some time now. My question is simple: I have a database with multiple tables. There is a clients table and around 7 or 8 other tables that contain information about that client that I need. None of the data in these tables is terribly large. To reduce HTTP calls, my thought was to load all of the tables and store the data from each in an object held in a factory.
So once a particular client is selected, an HTTP request is made for each table and each result is stored inside the factory. Then, when a user needs to access a table, its data is already in memory because the HTTP call was completed at the outset. When the data changes, the app can quickly save the table data and reload it.
Most of the data is financial, containing information about the client's income and asset categories.
The question is: is this wise? Am I missing something?
Thanks in advance
Your use of the term factory is inappropriate as a factory is a creational pattern. What you are describing is a facade. It is reasonable for a facade to aggregate data for a client and present it in a unified manner.
So, a remote client requests some data. The server-side facade makes the many requests on behalf of the client and composes the single response.
You mentioned caching the data. If you choose to do so, you will need to consider how to manage the cached data for staleness, how much memory you will need, etc.
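As a sketch, such a facade in Angular could look like this (the endpoint paths are assumptions; forkJoin fires the 7-8 requests in parallel and the combined result is cached in memory until invalidated):

```typescript
import { Injectable } from "@angular/core";
import { HttpClient } from "@angular/common/http";
import { forkJoin, Observable, of } from "rxjs";
import { tap } from "rxjs/operators";

@Injectable({ providedIn: "root" })
export class ClientFacade {
  private cache = new Map<number, Record<string, unknown>>();

  constructor(private http: HttpClient) {}

  load(clientId: number): Observable<Record<string, unknown>> {
    const cached = this.cache.get(clientId);
    if (cached) return of(cached); // already in memory: no HTTP calls

    return forkJoin({
      client: this.http.get(`/api/clients/${clientId}`),
      income: this.http.get(`/api/clients/${clientId}/income`),
      assets: this.http.get(`/api/clients/${clientId}/assets`),
      // ...the remaining tables...
    }).pipe(tap((data) => this.cache.set(clientId, data)));
  }

  invalidate(clientId: number): void {
    this.cache.delete(clientId); // call after a save to force a fresh reload
  }
}
```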
Each day, hundreds of thousands of items are inserted, updated, and deleted on our service (a backend using .NET and a MySQL database).
Now we are integrating our service with another service using their RESTful API. Each time an item is inserted, updated, or deleted on our service, we also need to connect to their web service and issue a POST, PUT, or DELETE.
What is a good implementation of this case?
It does not seem like a good idea to connect to their API each time a user inserts an item on our service, as that would make for a quite slow user experience.
Another idea was to update our database as usual, then set up another server that constantly polls our database and fetches the data that needs to be posted to the RESTful API. Is this the way to go?
How would you solve it? Any guides on implementing something like this would be great! Thanks!
It depends on whether a delay in updating the other service is acceptable or not. If not, create an event for each change and put it in the queue of an event processor that can send it to the second service.
If a delay is acceptable, a background batch job can run periodically and send the data.
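A sketch of the queue-based option (names and the partner URL are placeholders). Writes enqueue an event and return immediately; a background processor replays each change against the other service:

```typescript
interface ChangeEvent { method: "POST" | "PUT" | "DELETE"; path: string; body?: unknown; }

// In production this would be a durable queue (RabbitMQ, SQS) or an "outbox"
// table in MySQL; a plain in-memory array loses events if the process restarts.
const queue: ChangeEvent[] = [];

export function onItemChanged(e: ChangeEvent): void {
  queue.push(e); // the user's request is not slowed down by the remote API
}

// Event processor: drains the queue in the background; failures stay queued.
setInterval(async () => {
  while (queue.length > 0) {
    const e = queue[0];
    const res = await fetch(`https://partner.example.com${e.path}`, {
      method: e.method,
      headers: { "Content-Type": "application/json" },
      body: e.body ? JSON.stringify(e.body) : undefined,
    });
    if (!res.ok) break; // leave the event queued and retry on the next tick
    queue.shift();      // delivered: remove it from the queue
  }
}, 1_000);
```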