Pagination when the data source supports multi-page requests - google-apps-script

Does the Google Data Studio Community Connector support pagination?
I work with an external data service. The service returns data page by page: it takes start and next parameters, and it allows at most 2 requests per second. Can I override a method like getData, or extend the request argument, to implement this feature?
If not, is there a best practice for getting data of this kind?

Community Connectors do not support pagination for web APIs at present.
The best practice depends on your use case. If you want to get the full dataset for the user, you can make multiple UrlFetch calls to retrieve every page, merge the results, and return the merged set as the getData() response. It may also make sense to cache this result to avoid making a large number of requests in a short period. You can cache using the Apps Script CacheService, a Sheet, or even BigQuery. Keep in mind that Apps Script has a 6 min/execution limit.
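As a minimal sketch of that first approach, assuming a hypothetical endpoint whose start and next parameters and response fields are placeholders:

    // Sketch: fetch every page while respecting the 2 req/sec limit,
    // then cache the merged result. The endpoint and field names are hypothetical.
    function fetchAllPages() {
      var cache = CacheService.getScriptCache();
      var cached = cache.get('dataset');
      if (cached !== null) {
        return JSON.parse(cached);
      }

      var rows = [];
      var start = 0;
      while (start !== null) {
        var response = UrlFetchApp.fetch('https://example.com/api/data?start=' + start);
        var page = JSON.parse(response.getContentText());
        rows = rows.concat(page.items);
        start = page.next; // assume the service returns null on the last page
        Utilities.sleep(500); // stay under 2 requests per second
      }

      // Cache for 10 minutes; note the Apps Script cache holds roughly 100 KB per key,
      // so very large merged datasets belong in a Sheet or BigQuery instead.
      cache.put('dataset', JSON.stringify(rows), 600);
      return rows;
    }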
However, if you want to return only specific pages, the only way to configure that would be through getConfig, since configParams are passed with the getData() request. An example use case would be returning only the first n pages, where n is selected by the user in the config.
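For that second approach, a sketch of a getConfig() that exposes the page count as a user option, using the DataStudioApp connector service (the pageCount field id is illustrative):

    // Sketch: let the user choose n; getData() can then read it from
    // request.configParams.pageCount and fetch only the first n pages.
    function getConfig(request) {
      var cc = DataStudioApp.createCommunityConnector();
      var config = cc.getConfig();

      config.newTextInput()
        .setId('pageCount')
        .setName('Number of pages to fetch')
        .setHelpText('Only the first n pages of the API response are returned.')
        .setPlaceholder('5');

      return config.build();
    }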

Related

Google Drive API /files slow response

I want to ask for help/ideas on the issue I will describe below.
Our iOS app allows users to access their Google Drive files.
We use the Changes API (https://developers.google.com/drive/api/v3/reference/changes). The main precondition to using this API is building a local DB that holds a snapshot of the user's Drive file tree plus the page token. To initially fill the DB we must request the list of all files from the user's Drive. Getting the list of all files (with metadata) takes too long for many of our users. This is the issue I want to address.
We request files with a series of files.list requests (https://developers.google.com/drive/api/v3/reference/files/list). Most requests are a plain files?q=trashed%20%3D%20false.
For example, on my own private Google Drive:
69K files
the initial request for all files takes 5+ minutes at my current network speed (Download 527 Mbps, Upload 417 Mbps; ping www.googleapis.com: 40–45 ms)
~150 requests
each request brings information about ~460 files
each request takes around 2–2.5 seconds
Sometimes I observed requests taking up to 6 seconds, which means that getting the full file list took 15 minutes on my account.
If I look at the Developer Console, the latency is below 0.1 s.
Many of our users have Drives far bigger than mine. A standard iOS app user session is not long enough to complete the initial request. We save every intermediate page token so that data received during a single app session is not lost if the user leaves the app; in the next session we keep downloading from the last saved token. But there are still cases where our app needs the DB to be filled with data before starting some operations. In those cases our users see a "Pending..." progress indicator, and they complain that our app is slow.
So, questions:
Is it possible to improve the described request speed/latency?
Maybe there's some quota that we are missing that can be changed?
Maybe someone can advise a more effective way of getting the full file list?
P.S. We could potentially reduce the number of requests. We have to perform some double checks for Shared with Me folders, as we observed that the all-files request sometimes doesn't list all files from shared folders. That's a bit of a side story, and I don't think fixing it would dramatically improve the situation for us. I can provide more details on the actual set of requests we perform if necessary.
Are you returning all the fields? I would assume so, since the only query param provided is trashed=false. Do you need all the fields? Can you try reducing the query to return only the fields you really care about (using a field mask) and see if that improves your performance?
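For illustration, this is roughly what a trimmed files.list request looks like against the Drive v3 REST endpoint, sketched here in Apps Script with UrlFetchApp (the chosen fields are examples, and the script is assumed to be authorized with a Drive scope):

    // Sketch: list files with a field mask so each page carries only needed metadata.
    function listFilesWithFieldMask(pageToken) {
      var params = [
        'q=' + encodeURIComponent('trashed = false'),
        'pageSize=1000', // the v3 maximum
        'fields=' + encodeURIComponent('nextPageToken, files(id, name, mimeType, parents, modifiedTime)')
      ];
      if (pageToken) {
        params.push('pageToken=' + encodeURIComponent(pageToken));
      }
      var response = UrlFetchApp.fetch(
        'https://www.googleapis.com/drive/v3/files?' + params.join('&'),
        { headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() } }
      );
      return JSON.parse(response.getContentText()); // { files: [...], nextPageToken: '...' }
    }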

ASP.NET Web API HTTP response data size

I have an ASP.NET Web API that gets a list of data from the database with a very heavy SQL query (using a stored procedure) and then serializes it to JSON. The result can sometimes be more than 100,000 rows, which is beyond the 4 MB max limit of the HTTP JSON response. I've been trying to use pagination to limit my result size, but it hurts performance, because every time the user clicks to the next page a heavy SQL command is triggered. If I don't use pagination, the result is sometimes more than 4 MB and my client-side grid won't render properly. Since I don't have a way to check the JSON data size before sending it back to the client from the Web API, my questions are:
Is there any way to check the data size in ASP.NET Web API before sending it back to the client? For example, if it's more than 4 MB, send a response saying "please narrow your date range to get less data"? Would this be a good idea in application design?
Is there any way to save the entire result in a cache somewhere with ASP.NET Web API, so that when the user pages through the data, it is served from the cache instead of being fetched from the database again?
Is there any way to store the entire result in a cache or a temp file on the client side (using Angular 5), so that when the user pages through the data, no additional HTTP call to the Web API is needed? (A sketch of this idea follows below.)
I would be more than happy to hear about any experience or suggestions from anyone. Thank you very much!
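On the third point, here is a minimal client-side sketch in plain JavaScript (the /api/data route and page parameter are placeholders for your actual Web API) of caching already-fetched pages so paging back does not hit the API again:

    // Sketch: keep fetched pages in an in-memory map; only unseen pages hit the API.
    var pageCache = {};

    function getPage(pageNumber) {
      if (pageCache[pageNumber]) {
        return Promise.resolve(pageCache[pageNumber]);
      }
      return fetch('/api/data?page=' + pageNumber)
        .then(function (response) { return response.json(); })
        .then(function (rows) {
          pageCache[pageNumber] = rows; // remember this page for later visits
          return rows;
        });
    }

If the cache should survive a page reload, sessionStorage works the same way, but browsers typically cap it at around 5 MB, so a 100,000-row result may not fit.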

GAS - What's the max size of the payload I can send through the Execution API

I'm building a relatively big Google Docs file using Google Apps Script, and I basically need to inject a lot of data in order to build it programmatically.
I'm thinking of executing a function init() and passing the JSON string as its value through the Execution API. I'm worried about the max size of the string that I can pass. What's the max size?
I checked the Execution API documentation, but there is no mention of such a limit. If it follows the standard protocol RFC 2616, as mentioned in this thread, then your big payload may go through. The only thing left to do is actually try.
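If you want to test it, here is a hedged sketch of the scripts.run call the Execution API expects (the script ID, the target init function, and how you obtain the OAuth token are your own; shown with UrlFetchApp for consistency with Apps Script):

    // Sketch: invoke init(jsonString) through the Execution API (scripts.run).
    function runInit(scriptId, accessToken, jsonString) {
      var response = UrlFetchApp.fetch(
        'https://script.googleapis.com/v1/scripts/' + scriptId + ':run',
        {
          method: 'post',
          contentType: 'application/json',
          headers: { Authorization: 'Bearer ' + accessToken },
          payload: JSON.stringify({
            'function': 'init',       // the function to execute in the script project
            parameters: [jsonString]  // the large JSON string under test
          })
        }
      );
      return JSON.parse(response.getContentText());
    }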

How to make basic REST API calls using a browser

I am trying to get started with REST API calls by seeing how to format the API calls using a browser. Most examples I have found online use SDKs or just return all fields for a request.
For example, I am trying to use the Soundcloud API to view track information.
To start, I've made a simple request in the browser: http://api.soundcloud.com/tracks/13158665.json?client_id=31a9f4a3314c219bd5c79393a8a569ec
This returns a bunch of info about the track in JSON format (e.g. {"kind":"track","id":13158665,"created_at":"2011/04/06 15:37:43 ...}).
Is it possible to get back only the "created_at" value using the browser? I apologize if this question is basic, but I don't know what keywords to search for online. Links to basic guides would be nice, although I would prefer to avoid using a specific SDK for the time being.
In fact, it's really hard to answer such a question since it depends on the Web API. If the API supports returning only a subset of fields, you can; if not, you will receive all the content. From what I saw in the documentation, it's not possible here. The filters only allow you to get a subset of elements, not to control the list of fields returned within each element.
Note that there is a great application for executing HTTP requests (including REST) in Chrome: Postman. It allows you to execute all HTTP methods, not just GET, to control the headers and the content sent, and to see what is received back.
If you use Firefox, Firebug provides a similar tool.
To finish, you could have a look at this link to find out hints about the way Web APIs work and are designed: https://templth.wordpress.com/2014/12/15/designing-a-web-api/.
Hope this helps and answers your question,
Thierry
Straight from the browser bar you can use REST endpoints that respond to a GET message. That is what you are doing when you hit that URI: you are sending an HTTP GET message to the server, and it is sending back JSON.
You are not always guaranteed JSON, or anything in particular, when hitting a known REST endpoint. What each endpoint returns when hit with a GET is specific to how it was built. In this case, it is built to return JSON, but some endpoints may return an HTML page. In my personal experience, most endpoints that return JSON expect you to process that object programmatically and don't give you many options to get a specific field of the JSON. Here is a good link on how to process JSON using JavaScript.
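For example, a few lines of JavaScript in the browser console can pull out just that one field (the URL and client_id are the ones from the question; fetch requires a reasonably modern browser):

    // Fetch the track JSON and log only the created_at field.
    fetch('http://api.soundcloud.com/tracks/13158665.json?client_id=31a9f4a3314c219bd5c79393a8a569ec')
      .then(function (response) { return response.json(); })
      .then(function (track) { console.log(track.created_at); }); // "2011/04/06 15:37:43 ..."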
You can use REST clients (such as the Advanced REST Client for Chrome) to craft HTTP POST and PUT requests if a specific REST endpoint has the functionality built in to receive data and do something with it. For example, a lot of wiki-style REST endpoints will allow you to create a page with a specifically crafted HTTP POST containing specific header information, URI parameters, or a JSON body.
You can install the DHC client app in Chrome and send requests like PUT or GET.

Add analytics to a desktop application

I have developed a desktop application using HTML5 and node-webkit.
I would like to track parts of the app, such as how long it's used, clicks, etc.
I would like the analytics system to work both online and offline (storing data until it's online).
Is there anything that I could use to do this?
The Google Measurement Protocol allows you to track anything that can send an HTTP request. You need to generate a unique client id to group pageviews into sessions (this part is usually done by the JavaScript tracker, which does not help you here), and you can then choose between various interaction types and their related data, added as parameters in a request to the Google Analytics server.
As far as offline capabilities go, there is a "queue time" parameter that allows you to send delayed calls to GA. However, per the documentation that delay is 4 hours at most (intended for smartphones and tablets that temporarily lose their connection, rather than for working permanently offline).
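A minimal sketch of such a hit in JavaScript (UA-XXXXXX-Y is a placeholder tracking ID; v, tid, cid, t, dp, and qt are documented Measurement Protocol parameters; shown with the browser fetch API, though any HTTP client in node-webkit works the same way):

    // Sketch: replay one offline pageview to GA, using queue time (qt) for the delay.
    function sendHit(clientId, pagePath, msSinceEvent) {
      var payload = [
        'v=1',                                 // protocol version
        'tid=UA-XXXXXX-Y',                     // your tracking ID (placeholder)
        'cid=' + encodeURIComponent(clientId), // the generated unique client id
        't=pageview',                          // hit type
        'dp=' + encodeURIComponent(pagePath),  // page path
        'qt=' + msSinceEvent                   // how long ago the hit actually happened
      ].join('&');

      fetch('https://www.google-analytics.com/collect', { method: 'POST', body: payload });
    }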
In the end it depends on what data you need. You might just as well send calls to your own server, log them in a CSV file, and feed that to Klipfolio or some other dashboard solution (or even use Excel if you expect a low data volume).