ASP.NET Web API HTTP response data size - JSON

I have an ASP.NET Web API that gets a list of data from the database with a very heavy SQL query (a stored procedure) and then serializes it to JSON. The result can sometimes exceed 100,000 rows, which is beyond the 4 MB limit on my HTTP JSON response. I have tried using pagination to limit the result size, but it hurts performance because every time the user clicks to the next page, the heavy SQL command runs again. Without pagination, however, the result is sometimes larger than 4 MB and my client-side grid won't render properly. I don't currently have a way to check the JSON data size before sending it back to the client from the Web API. So my questions are:
Is there any way to check the data size in ASP.NET Web API before sending it back to the client? For example, if it is more than 4 MB, send a response saying "please modify your date range to have less data"? Would this be a good idea in application design?
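In ASP.NET Web API this would be done in C# (serialize to a string first, measure it, and short-circuit the response if it is too large). As a language-neutral sketch of the idea, here it is in TypeScript; the 4 MB limit and the error message are taken from the question, and `checkPayloadSize` is a hypothetical helper, not a framework API:

```typescript
// Sketch: serialize the result up front and measure the real wire size
// before deciding whether to send it. Limit and message come from the
// question above; adjust both to your actual requirements.
const MAX_BYTES = 4 * 1024 * 1024; // 4 MB

interface SizeCheckResult {
  ok: boolean;
  bytes: number;
  message?: string;
}

function checkPayloadSize(data: unknown): SizeCheckResult {
  const json = JSON.stringify(data);
  // Byte length, not string length: multi-byte characters count fully.
  const bytes = Buffer.byteLength(json, "utf8");
  if (bytes > MAX_BYTES) {
    return {
      ok: false,
      bytes,
      message: "Please modify your date range to have less data",
    };
  }
  return { ok: true, bytes };
}
```

The cost of this check is one extra serialization pass, so it only makes sense if rejecting oversized responses is genuinely cheaper than streaming them.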
Is there any way to save the entire result in a cache (or somewhere else) with ASP.NET Web API, so that every time the user paginates, the result comes from the cache rather than from another database query?
Is there any way to store the entire result in a cache or a temp file on the client side (using Angular 5), so that when the user paginates, no additional HTTP call to the Web API is needed?
I would be more than happy to hear about any experience or suggestions. Thank you very much!

Related

Large API call with no pagination

I need to retrieve data from an API source that has a massive number of entries (1800+). The problem is that the source has no pagination or any way to group the entries. We then save the entries as posts on the site, run through a cron job daily.
We are using curl_init() to retrieve the data from the API source, but we keep getting a 503 error and timing out. When it works, it retrieves the data as JSON, saving the important fields as metadata and the rest as raw JSON.
Is there a more reliable way to retrieve the data? On other sites I have worked on, we have been able to run through an API page by page programmatically in the backend.
You might try saving the JSON to a file first, then running the post creation against the JSON in the file rather than over the direct cURL connection. I ran into similar issues in the past, even with an API that had pagination.
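The suggestion above (snapshot first, process later) can be sketched like this; the original context is PHP/cURL, so treat this TypeScript version as an outline only, and note that `createPost` is a hypothetical stand-in for whatever creates posts on the site:

```typescript
// Sketch: one network round trip to persist the payload, then all post
// creation works from the local file. If processing fails midway, it can
// be retried from disk without hitting the flaky API again.
import { writeFileSync, readFileSync } from "node:fs";

function saveSnapshot(path: string, entries: unknown[]): void {
  writeFileSync(path, JSON.stringify(entries));
}

function processSnapshot(
  path: string,
  createPost: (entry: unknown) => void
): number {
  const entries = JSON.parse(readFileSync(path, "utf8")) as unknown[];
  for (const entry of entries) createPost(entry); // replayed from disk
  return entries.length;
}
```

Decoupling download from processing also shortens the window in which a 503 or timeout can ruin the whole run: the slow API is touched exactly once.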

Mobile number already exists - validation pattern in HTML forms

Is there a validation pattern that shows an error message in the form if the mobile number already exists?
If existing mobile numbers are present in the database, then you should make an API request to your backend using Ajax (to avoid a page reload), sending the mobile number as a parameter in the request.
On the backend, check in the database whether this number already exists, and return an appropriate response based on the result.
On the frontend, inspect the response returned from the API and act on it, but all validation against the database should be done on the backend/server side.
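The frontend half of that flow might look like the sketch below. The `/api/check-mobile` endpoint and its `{ exists: boolean }` response shape are assumptions for illustration; wire it to whatever your backend actually exposes:

```typescript
// Sketch: ask the backend whether a mobile number is already taken.
// The frontend only displays the result; the database check itself
// stays on the server, as the answer above recommends.
async function mobileNumberExists(
  mobile: string,
  fetchFn: typeof fetch = fetch // injectable for testing
): Promise<boolean> {
  const res = await fetchFn(
    `/api/check-mobile?mobile=${encodeURIComponent(mobile)}`
  );
  const body = (await res.json()) as { exists: boolean };
  return body.exists;
}
```

In the form, you would call this on blur or on submit and show the "number already exists" error when it resolves to `true`.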

Pagination when the data source supports multi-page requesting

Does Google Data Studio Community connector support pagination?
I work with an external data service. The service returns data page by page: it requires start and next parameters, and it is limited to 2 requests per second. Can I override a method like getData, or extend the request argument, to implement this feature?
If not, is there a best practice for getting data of this kind?
Community Connectors do not support pagination for web APIs at present.
The best practice depends on your use case. If you want to get the full dataset for the user, you can make multiple UrlFetch calls to retrieve it, merge the pages, and return the merged set as the getData() response. It might also make sense to cache this result to avoid making a large number of requests in a short time; you can cache using the Apps Script cache, a Sheet, or even BigQuery. Keep in mind that Apps Script has a 6 min/execution limit.
However, if you want to return only specific pages, the only way to configure that is through getConfig, since configParams are passed with the getData() request. An example use case would be returning only the first n pages, where n is selected by the user in the config.
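The fetch-all-pages-and-merge pattern described above can be sketched generically. Apps Script connectors would use UrlFetchApp rather than the hypothetical `getPage` below; the `start`/`next` parameter names and the 2 req/sec limit come from the question:

```typescript
// Sketch: walk a start/next paged API, throttled to respect a rate
// limit, and merge all rows into one result set.
type Page = { rows: unknown[]; next: number | null };

async function fetchAllPages(
  getPage: (start: number) => Promise<Page>,
  delayMs = 500 // 2 requests per second
): Promise<unknown[]> {
  const merged: unknown[] = [];
  let start: number | null = 0;
  while (start !== null) {
    const page = await getPage(start);
    merged.push(...page.rows);
    start = page.next; // null signals the last page
    if (start !== null) {
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  return merged;
}
```

With a 6-minute execution cap and a 2 req/sec throttle, roughly 700 pages is the hard ceiling per run, which is another reason the answer suggests caching the merged result.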

Client get new data from server

I am designing a web app where the server generates batches of data, and the client periodically checks whether new batches of data are available for download. The way I am doing this is that whenever the server generates a new batch of data, it is available at a particular URL. The client periodically checks the URL to see whether a new batch is available for it. (I am currently not using web sockets.) This batch of data is in the format of a JSON object.
Since I have very little web experience, I'm a bit confused about what to do when the client visits the URL. How should the client know whether the batches of data at the URL are new (in which case the client should download them) or old (in which case the client should ignore them, since it has already downloaded them in the past)?
Also, there may be multiple clients working with the same server, so the solution should work regardless of the number of clients.
Include a timestamp property (set by a server-side script) in the JSON returned by the server, and change its value every time you update the data on the server. It will then be easy for each client to detect a change by checking the modification timestamp.
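The client side of that comparison is just "remember the newest timestamp I've seen." A minimal sketch, assuming the batch JSON has the shape `{ timestamp, rows }` (that shape is an assumption, not part of the question):

```typescript
// Sketch: each client keeps its own last-seen timestamp, so the scheme
// works regardless of how many clients poll the same server.
interface Batch {
  timestamp: number; // bumped by the server on every new batch
  rows: unknown[];
}

class BatchPoller {
  private lastSeen = 0;

  // Returns true when the batch is new and should be downloaded/processed.
  isNew(batch: Batch): boolean {
    if (batch.timestamp <= this.lastSeen) return false; // already have it
    this.lastSeen = batch.timestamp;
    return true;
  }
}
```

A cheaper variant of the same idea is an HTTP conditional GET: the server sets `Last-Modified` or `ETag`, the client sends `If-Modified-Since`/`If-None-Match`, and old batches come back as a body-less 304.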

JSON response data size issue

I am using the Google Analytics platform to generate queries in order to make callouts to GA from inside a Salesforce application. Upon creating the custom report, an API query URI is generated which presents the data from the report in JSON format.
One example uri looks like the following:
https://www.googleapis.com/analytics/v3/data/ga?ids=[my gi id]&start-date=[start date]&end-date=[end date]&metrics=ga%3Asessions%2Cga%3AsessionDuration&dimensions=ga%3AdaysSinceLastSession%2Cga%3Acampaign%2Cga%3AsourceMedium%2Cga%3AsocialNetwork%2Cga%3Adimension2&filters=ga%3AsessionDuration%3E1&access_token=[my access token]
The issue is that the presented data is limited to 1,000 rows max, and I am not sure how I can get past this limit.
The Google Analytics API has a parameter you can send called max-results. If you add
&max-results=10000
to your request, you will get pages of 10,000 rows. That is the maximum you can set it to; if there are more results, a nextLink will be returned in the response, which you can use to make further requests for the additional data.
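Following nextLink until it disappears can be sketched as a loop. The `rows`/`nextLink` field names below match the Core Reporting API v3 response, but `fetchJson` is a hypothetical stand-in for your actual HTTP callout (in Salesforce this would be an Apex HTTP callout), so treat this as an outline:

```typescript
// Sketch: keep requesting the nextLink URL until the API stops
// returning one, accumulating rows from every page.
interface GaResponse {
  rows?: unknown[][];   // absent when a page has no data
  nextLink?: string;    // absent on the last page
}

async function fetchAllRows(
  firstUrl: string,
  fetchJson: (url: string) => Promise<GaResponse>
): Promise<unknown[][]> {
  const all: unknown[][] = [];
  let url: string | undefined = firstUrl;
  while (url) {
    const res = await fetchJson(url);
    all.push(...(res.rows ?? []));
    url = res.nextLink;
  }
  return all;
}
```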