Create a large number of contacts with Google Contacts API v3 - google-contacts-api

I need to create 1000+ contacts with Google Contacts API v3. For now, I am sending a request for every single user, and for 1000 contacts it takes almost 10 min.
Is there any better/faster way to do this?

A better way is to use batch operations. As the documentation explains:
With batch requests you can have the server perform multiple operations with a single HTTP request. The basic idea is that you create a contacts or contact groups feed and add an entry for each operation you want to perform.
To send a batch request for operations on contacts, send an authorized POST request to the contacts batch feed URL with the batch feed data in the body:
https://www.google.com/m8/feeds/contacts/{userEmail}/full/batch
However, please note that batch requests are limited to 100 operations at a time.
Please find more information about batch operations in the Google Data APIs Batch Processing documentation.
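Since batch requests are capped at 100 operations, creating 1000+ contacts means splitting them into chunks and sending one batch feed per chunk. Here is a minimal sketch of that chunking and feed-building logic; the `contact` dict fields and the exact entry markup are simplified assumptions, not the full GData schema:

```python
# Sketch: split contacts into batches of at most 100 operations, the
# documented per-request limit for the Contacts v3 batch feed.
# The feed entries below are a simplified illustration of the Atom/GData
# batch format; the `name`/`email` keys are assumed, not a real schema.

BATCH_LIMIT = 100

def chunk(items, size=BATCH_LIMIT):
    """Yield successive slices of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def build_batch_feed(contacts):
    """Build a minimal Atom batch feed with one insert entry per contact."""
    entries = []
    for n, c in enumerate(contacts):
        entries.append(
            '<entry>'
            f'<batch:id>create-{n}</batch:id>'
            '<batch:operation type="insert"/>'
            f'<title>{c["name"]}</title>'
            f'<gd:email address="{c["email"]}" '
            'rel="http://schemas.google.com/g/2005#home"/>'
            '</entry>'
        )
    return (
        '<feed xmlns="http://www.w3.org/2005/Atom" '
        'xmlns:batch="http://schemas.google.com/gdata/batch" '
        'xmlns:gd="http://schemas.google.com/g/2005">'
        + ''.join(entries) + '</feed>'
    )

contacts = [{"name": f"User {i}", "email": f"user{i}@example.com"}
            for i in range(1050)]
batches = list(chunk(contacts))   # 1050 contacts -> 10 batches of 100 + 1 of 50
feeds = [build_batch_feed(b) for b in batches]
# Each feed would then be POSTed (with an auth header) to
# https://www.google.com/m8/feeds/contacts/{userEmail}/full/batch
```

With this approach, 1000 contacts become 10 HTTP round trips instead of 1000, which is where the time savings come from.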

Related

With what frequency is my script calling the API? I don't want to get cut off

I'm using a JSON script from another Stack Overflow post and don't know how frequently I'm calling the external data (from SeatGeek).
I don't want to get cut off from the SeatGeek API, so I want to make sure I'm not going to bog down the system - I really only need the data to refresh twice a day.
I wouldn't consider myself a developer, so I'm not sure where to look. Can someone please help by taking a look at the script?
According to the Google Sheets API documentation:
Google Sheets API has a limit of 500 requests per 100 seconds per project, and 100 requests per 100 seconds per user. Limits for reads and writes are tracked separately. There is no daily usage limit.
Also, if you want to find the usage and more statistics regarding the API, you can check the Google API console and find all the information there.
For more information regarding your issue, you could check the following documentation:
[1] Google Sheets API;
[2] Google Console API Quota
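If you want to be defensive about the per-user quota anyway, a simple client-side throttle is enough: 100 requests per 100 seconds works out to roughly one request per second sustained. Below is a minimal sketch of such a throttle; the class and its interface are my own illustration, not part of any Google client library:

```python
# Sketch: a minimal client-side throttle for staying under the Sheets API
# per-user quota of 100 requests per 100 seconds (~1 request/second
# sustained). How you wire it into your fetch code is up to you.
import time

QUOTA_REQUESTS = 100
QUOTA_WINDOW_S = 100
MIN_INTERVAL = QUOTA_WINDOW_S / QUOTA_REQUESTS  # 1.0 second between calls

class Throttle:
    def __init__(self, min_interval=MIN_INTERVAL):
        self.min_interval = min_interval
        self._last = 0.0

    def wait(self):
        # Sleep just long enough that consecutive calls are spaced at
        # least `min_interval` seconds apart.
        elapsed = time.monotonic() - self._last
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self._last = time.monotonic()
```

For a refresh that only happens twice a day, you are orders of magnitude below these limits, so a throttle like this is purely a safety net.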

Bulk API request for Google geo-coding API

I have a data set of 16 million records for which I will send the address etc. to the Google Geocoding API and get the lat/longs. Right now we are making an API request POI by POI; I'm wondering if we can do a bulk/batch request where I send around 500 POIs or so, so that we can save some round-trip time. Please advise.
You can use the web-service Geocoding API and issue the requests via cURL; you'll just need to add an interval between batches to avoid hitting the requests/100s limit. If the maximum requests/100s limit is not enough for your use case, you may file a support case via https://console.cloud.google.com/google/maps-apis/support so the support team can help give your account the capability to send more requests than the limit.
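The batching described above can be sketched as a loop that processes a fixed-size slice, then pauses before the next slice. This is an illustration only: `geocode_one` is a placeholder for your actual call to the Geocoding web service, and the batch size and pause are assumptions you would tune against your project's quota:

```python
# Sketch: process addresses in fixed-size batches with a pause between
# batches to stay under the Geocoding API rate limit. `geocode_one` is a
# placeholder; a real implementation would call the web service at
# https://maps.googleapis.com/maps/api/geocode/json with your API key.
import time

BATCH_SIZE = 500   # assumed batch size from the question
PAUSE_S = 100      # assumed pause; tune to your quota window

def geocode_bulk(addresses, geocode_one,
                 batch_size=BATCH_SIZE, pause_s=PAUSE_S):
    results = []
    for i in range(0, len(addresses), batch_size):
        batch = addresses[i:i + batch_size]
        results.extend(geocode_one(a) for a in batch)
        if i + batch_size < len(addresses):
            time.sleep(pause_s)  # back off before the next batch
    return results
```

Note that this still sends one HTTP request per address under the hood; the batching only controls pacing, which is what the rate limit cares about.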

Batch process with google geocoding api webservices (looking for updates)

Google references this:
https://developers.google.com/maps/documentation/javascript/geocoding
And states:
The per-session rate limit prevents the use of client-side services for batch requests, such as batch geocoding. For batch requests, use the Geocoding API web service.
However, when you go to the Geocoding API web services page, I see no reference to batch processing, even though the sentence above implies that you can do it. I need to send a large number of addresses to get latitudes and longitudes, but doing individual calls for each address takes an extremely long time, and I need a more efficient method - ideally a single batch call to send all the addresses.
Any ideas of how to batch process addresses on google to get lat and longitude?
I have seen this Google Batch Geocoding API
However, it states that you cannot, which is not what the Google statement above implies.
Per the Geocoding API web service documentation:
Other Usage Limits
While you are no longer limited to a maximum number of requests per day (QPD), the following usage limits are still in place for the Geocoding API:
50 requests per second (QPS), calculated as the sum of client-side and server-side queries.
You are simply limited to 50 requests per second, and you have to pay for them (after you use up the $200 credit).
The best way I found to solve this problem is to use the Directions API with up to 27 destinations (origin, destination and 25 waypoints) and get your geolocation for the response legs. The position accuracy is slightly lower than in the geocode case from what I observed, but it is still a great tradeoff.
In the worst case you will have to call the Directions API twice when one or more addresses are not found in your call. The good thing in this case is that the Directions API will give you a response with the geocoded_waypoints which will specify the NOT_FOUND locations with a geocoder_status. After that, you can eliminate the bad ones and call again.
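The workflow above - pack up to 27 addresses per Directions request, then retry the ones flagged `NOT_FOUND` - can be sketched as follows. The chunking and the `geocoded_waypoints` filtering are the real logic; the response structure shown in the test is a mock of the relevant fields, not a full Directions API response:

```python
# Sketch of the Directions API trick: pack up to 27 addresses per request
# (origin + destination + 25 waypoints) and read coordinates back from
# the response legs. The helpers below only cover chunking and the
# NOT_FOUND retry split; the actual HTTP call is not shown.

MAX_PER_REQUEST = 27  # origin + destination + 25 waypoints

def chunk_addresses(addresses, size=MAX_PER_REQUEST):
    """Split the address list into Directions-request-sized groups."""
    return [addresses[i:i + size] for i in range(0, len(addresses), size)]

def split_by_geocoder_status(addresses, geocoded_waypoints):
    """Separate successfully geocoded addresses from NOT_FOUND ones.

    `geocoded_waypoints` is the list of per-waypoint status dicts the
    Directions API returns, in the same order as the request addresses.
    The NOT_FOUND ones can then be dropped or corrected and retried.
    """
    found, not_found = [], []
    for addr, wp in zip(addresses, geocoded_waypoints):
        if wp.get("geocoder_status") == "NOT_FOUND":
            not_found.append(addr)
        else:
            found.append(addr)
    return found, not_found
```

With 27 addresses per call, a list of, say, 1000 addresses needs only 38 Directions requests in the best case, versus 1000 individual Geocoding requests.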
There is currently no feature in the Geocoding API to handle multiple addresses in a single call. However, you can implement batch processing via cURL, which lets you issue multiple requests at once; the exact implementation will depend on your use case.

Google Maps API - matching a request with the asynchronous response

I'm using the Google Maps Distance Matrix JavaScript API to query the travel time from one point to about 50 destinations.
In order to do this, I need to break my query into multiple chunks, since the Google Maps API allows a maximum of 25 destinations per request.
The problem I'm having is synchronizing the requests and the results.
Other APIs offer the option to include a key that lets the developer match a request to an asynchronous result.
However, since such a parameter is missing from the request specification, I have no idea how to make sure I'm matching the right data to my incoming results.
I would be glad for a "clean" workaround - currently the only idea I have is to use a unique number of destinations for each request, for instance the first with 25, the next with 24, and so on - but I would not consider this a satisfactory approach.
Thank you.
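One clean workaround that needs no key in the request itself is to capture a request identifier in a closure when you register each callback: the closure remembers which request it belongs to, so the response never has to carry a correlation key. Here is a language-agnostic sketch of the pattern (shown in Python; `send_async` stands in for the Distance Matrix service call, which in JavaScript would be `service.getDistanceMatrix(request, callback)`):

```python
# Sketch: match asynchronous responses to their originating requests by
# capturing a request id in a closure, instead of encoding the id in the
# request payload. `send_async` is a placeholder for the real API call.

def make_callback(request_id, results):
    # The closure remembers which request this callback belongs to,
    # so no key needs to travel through the API itself.
    def on_response(response):
        results[request_id] = response
    return on_response

def dispatch(chunks, send_async):
    """Send one async request per chunk; collect responses keyed by id."""
    results = {}
    for request_id, chunk in enumerate(chunks):
        send_async(chunk, make_callback(request_id, results))
    return results
```

Because each callback closes over its own `request_id`, responses can arrive in any order and still land under the right key, with no need for tricks like giving each request a unique destination count.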

Maximum limit on number of people can access a public Google spreadsheet?

I am developing an application which fetches data from a Google Spreadsheet using a query.
The data comes back in JSON format. I want to know whether there are any restrictions on the number of requests that can be sent to Google's servers for fetching the JSON from the spreadsheets.
I mean, is there a restriction like N requests per hour or per day?
I have never seen a limit and suspect there is no such limit, but I might be wrong.
Any idea of rough volumes? I have had about 10 people at once use the service; it slowed down a little, but was OK.
If it helps, the GData-style spreadsheet API takes about 1 second per read and 2-10 seconds per write (large sheets are slower).