How to get load balancer data from Google Compute Engine? - google-compute-engine

I have searched the various APIs, including Stackdriver and https://cloud.google.com/compute/docs/apis, but cannot find any that return load-balancing data.
This metric returns the data I need, but it seems to be an internal API used by Google: network.googleapis.com/loadbalancer/requests
Any suggestions?
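One suggestion: load balancer metrics are queryable through the Cloud Monitoring (Stackdriver) timeSeries.list API rather than the Compute Engine API. Below is a minimal sketch using the google-cloud-monitoring Python client; it assumes the public HTTP(S) load balancer metric loadbalancing.googleapis.com/https/request_count stands in for the internal metric above (an assumption on my part), and "my-project" is a placeholder project ID.

```python
# Sketch: query load balancer request counts via the Cloud Monitoring
# (Stackdriver) timeSeries.list API. The metric type and "my-project"
# are assumptions/placeholders, not confirmed for your setup.
import time

from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
now = int(time.time())
interval = monitoring_v3.TimeInterval(
    {
        "end_time": {"seconds": now},
        "start_time": {"seconds": now - 3600},  # the last hour
    }
)

results = client.list_time_series(
    request={
        "name": "projects/my-project",  # placeholder project ID
        "filter": 'metric.type = "loadbalancing.googleapis.com/https/request_count"',
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)

for series in results:
    for point in series.points:
        print(dict(series.resource.labels), point.value.int64_value)
```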

Related

How do you store data to Google Drive using console.cloud.google

In my Google console, it says here (Cloud Storage pricing) that the price for standard storage is $0.026 per gigabyte-month, which I think means that 500 GB stored for one month will cost $13, since 500 * 0.026 = 13. But this article, The Google Drive Price Cut Changes The Game For Personal Cloud Storage, says:
Google is making a terabyte of cloud storage available for just $10
I don't see where to upload data to Google Drive in the Google Cloud console.
My second question: I want to make sure that I can create a virtual instance, connect it to Google Drive or Cloud Storage, read the data from it, and put that data into the RAM of the virtual instance.
Google Drive
is a web application which works as a file store, allowing users to store files. Communication with it is normally done through the web application itself; however, developers can use the Google Drive API to interact with Google Drive programmatically.
You may want to go through the Google Drive API documentation to understand what it's capable of.
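For programmatic uploads, the Drive v3 API works roughly as follows. A minimal sketch, assuming the google-api-python-client and google-auth-oauthlib packages, a client_secret.json from your Cloud project, and a placeholder file name:

```python
# Sketch: upload a local file to Google Drive via the Drive v3 API.
# client_secret.json and data.csv are placeholders.
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

SCOPES = ["https://www.googleapis.com/auth/drive.file"]

# Run the installed-app OAuth flow to obtain user credentials.
flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
creds = flow.run_local_server(port=0)

service = build("drive", "v3", credentials=creds)
media = MediaFileUpload("data.csv", mimetype="text/csv")
created = (
    service.files()
    .create(body={"name": "data.csv"}, media_body=media, fields="id")
    .execute()
)
print("Uploaded file id:", created["id"])
```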
Google Cloud Storage
is designed as unified object storage for developers and enterprises. Cloud Storage allows world-wide storage and retrieval of any amount of data at any time. You can use Cloud Storage for a range of scenarios, including serving website content, storing data for archival and disaster recovery, or distributing large data objects to users via direct download.
Interaction with it is done primarily through the Cloud console and command-line tools.
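For example, a minimal upload with the google-cloud-storage Python client (bucket and file names are placeholders, and application default credentials are assumed):

```python
# Sketch: upload a local file to a Cloud Storage bucket.
from google.cloud import storage

client = storage.Client()  # uses application default credentials
bucket = client.bucket("my-bucket")        # placeholder bucket name
blob = bucket.blob("backups/data.csv")     # destination object name
blob.upload_from_filename("data.csv")      # local file to upload
print("Uploaded to gs://my-bucket/backups/data.csv")
```

The equivalent command-line call is gsutil cp data.csv gs://my-bucket/backups/.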
I don't see where to upload data to Google Drive in the Google Cloud console.
You don't; the Cloud console won't help you upload to Google Drive.
My second question: I want to make sure that I can create a virtual instance, connect it to Google Drive or Cloud Storage, read the data from it, and put that data into the RAM of the virtual instance.
Google Drive is a web application; you can't connect a virtual instance to it as if it were a drive. Cloud Storage, on the other hand, can be read from a Compute Engine instance over the network, with the data held in the instance's RAM, as sketched below.
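A minimal sketch of that Cloud Storage read, assuming the instance's service account has read access to the bucket (names are placeholders):

```python
# Sketch: on a Compute Engine instance, read a Cloud Storage object
# directly into memory.
from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-bucket").blob("backups/data.csv")  # placeholders

data = blob.download_as_bytes()  # the object's contents now sit in RAM
print(f"Loaded {len(data)} bytes into memory")
```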
You might want to go through a few of the quickstarts to understand how the Google Cloud console and the command-line tool work: Quick Starts

Get google maps api usage info

We are using some Google Maps APIs, and we want to add API usage/quota metrics to our monitoring system. Does the console have its own API for retrieving those metrics?
Or are there other solutions?
There is no current API endpoint to get the quota usage information for your API. The only way to check the quota used by your application is by checking your own Developer Console. There you can also see the traffic (number of requests per second) and the error ratio of each enabled API in your project.
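That said, if you later want to pull request counts programmatically, Cloud Monitoring exposes per-API traffic as time series. A hedged sketch, assuming the serviceruntime.googleapis.com/api/request_count metric is populated for your project; this supplements, not replaces, the console view described above:

```python
# Hedged sketch: per-API request counts from Cloud Monitoring.
# "my-project" is a placeholder; the metric type is an assumption.
import time

from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
now = int(time.time())
interval = monitoring_v3.TimeInterval(
    {"end_time": {"seconds": now}, "start_time": {"seconds": now - 86400}}
)

series = client.list_time_series(
    request={
        "name": "projects/my-project",
        "filter": 'metric.type = "serviceruntime.googleapis.com/api/request_count"',
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)
for s in series:
    total = sum(p.value.int64_value for p in s.points)
    print(s.resource.labels.get("service", "?"), total)
```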

Google Maps v3 API - hitting geocoding limits

We have 2 websites hosted on a Cloud Server. Our provider has assured us that we have our own IP for the server.
The first does an address lookup.
The other is a store-finder function to find your local lawyer.
Every few days we get the message.
You are over your query limit for the Google's Geocoding services. If you are on a shared IP, it's advised you upgrade to a dedicated IP so you dont have to share your requests with other sites. For more information on Geocoding limits, go to https://developers.google.com/maps/documentation/geocoding/#Limits
The API stops working [I assume it is blocked by Google].
Some notes:
We have checked the API log on Google and only have around 60 requests in a month
The lookup form uses a honeypot and validation to restrict submissions
The lat/lon for the store finder come from the database. The lookup only occurs when the form is used.
We have sites running the same sort of lookup on non-cloud hosting that work fine.
The pages with the maps are only being loaded around 1000 times a month.
We're out of ideas - anyone know what else we could try?
Thanks
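One way to narrow this down is to call the geocoding web service directly from the affected server and inspect the status field in the response. A minimal diagnostic sketch using the Python requests library (the address is a placeholder; note that current versions of the API also require a key parameter, which keyless setups of that era did not):

```python
# Diagnostic sketch: hit the Geocoding web service from the suspect
# server and log the status, to see whether Google is returning
# OVER_QUERY_LIMIT for this IP.
import requests

resp = requests.get(
    "https://maps.googleapis.com/maps/api/geocode/json",
    params={"address": "1600 Amphitheatre Parkway, Mountain View, CA"},
    timeout=10,
)
body = resp.json()
print(body["status"])  # e.g. OK, OVER_QUERY_LIMIT, REQUEST_DENIED
if body["status"] == "OVER_QUERY_LIMIT":
    print("Per-IP limit hit; another site on this IP may be geocoding too.")
```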

Google Maps Engine with a local MySql Server

I want to use Maps Engine to show data on a map. The problem is that my data (KMZ, CSV, MySQL) is on a local server, and because of internal politics I can't upload it all to the cloud. I have seen that the Google Maps Engine API documentation talks about authentication for installed applications (https://developers.google.com/maps-engine/documentation/oauth/installedapplication). But does this mean that I can use Google Maps Engine locally? Can I use my local data in Google Maps Engine without uploading it to the cloud?
Thanks
Google Maps Engine is a cloud-based application. You must upload your data to GME in order to make use of it. The link you reference is for OAuth, an authentication mechanism to provide access to GME maps that require a user account. An installed application is, for example, a Windows app that uses the Maps Engine API.
If you can get past your cloud restriction, you could use the Maps Engine API to write a connector from MySQL to Maps Engine relatively easily, along the lines of the sketch below.
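A hypothetical sketch of such a connector, assuming the mysql-connector-python package and the (since retired) Maps Engine API v1 surface; the mapsengine service name, the batchInsert call, the OAuth scope, and all IDs and credentials here are assumptions, not verified:

```python
# Hypothetical sketch: push rows from a local MySQL server into a
# Maps Engine table. Service name, batchInsert call, scope, and IDs
# are assumptions based on the Maps Engine API v1 docs.
import mysql.connector
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

creds = InstalledAppFlow.from_client_secrets_file(
    "client_secret.json", ["https://www.googleapis.com/auth/mapsengine"]
).run_local_server(port=0)

db = mysql.connector.connect(
    host="localhost", user="gis", password="secret", database="stores"
)
cursor = db.cursor()
cursor.execute("SELECT id, name, lat, lng FROM locations")

features = [
    {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lng, lat]},
        # gx_id: GME required a unique ID per feature (assumption).
        "properties": {"gx_id": str(row_id), "name": name},
    }
    for (row_id, name, lat, lng) in cursor.fetchall()
]

service = build("mapsengine", "v1", credentials=creds)
service.tables().features().batchInsert(
    id="your-table-id", body={"features": features}  # placeholder table ID
).execute()
```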
Otherwise, in your situation you should probably look at GeoServer.

How do Google Geocoding API usage limits apply to [Sales]force.com?

The following are the usage limits as specified by Google on their Developer Guide pages:
Google Maps JavaScript API v3 => For-profit web sites are permitted to generate up to 25,000 map loads per day
Google Geocoding API => subject to a query limit of 2,500 geolocation requests per day
Google Maps API for Business => may perform up to 100,000 requests per day
I am trying to evaluate using any one of the above with Visualforce on the Salesforce.com (SFDC) platform [*]
I understand that for a public website the requests are counted per IP. Now for SFDC, there could be many different organizations on a particular server (say, NA1). So two different companies using SFDC and the Google Maps API could each have a URL at https://na1.salesforce.com/something_here, and their requests should be counted separately.
Will that be the case? What happens with each API?
[*] SFDC is a SaaS cloud for the purposes of our discussion. All users log in through the same page, but they could be logged into different "orgs"/"organizations", and their URLs might look similar.
It's important to differentiate between the server-side and client-side limits here. The server-side Geocoding API would have the 2,500-request limit enforced across the shared Salesforce instance, based on which machines the requests come from (I assume NA1 isn't one huge server). Multiple organizations using the free Geocoding API would all share the same server-side geocoding limit. I've actually run into the same limits using Google's own App Engine platform, where a bunch of applications share the same outbound IP address.
For any sort of guaranteed performance, you'll need to send the queries from your own server or go the Maps for Business route, which lets you authenticate your queries to get those higher limits (see the sketch below).
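For reference, Business-tier authentication signs each request URL with your client ID and shared secret (HMAC-SHA1 over the path and query). A minimal sketch with placeholder credentials:

```python
# Sketch: sign a Geocoding request URL the Maps for Business way.
# Client ID, key, and address are placeholders.
import base64
import hashlib
import hmac
from urllib.parse import urlencode, urlparse

def sign_url(url: str, private_key: str) -> str:
    """Append an HMAC-SHA1 signature to a Maps API request URL."""
    parsed = urlparse(url)
    to_sign = f"{parsed.path}?{parsed.query}".encode()
    key = base64.urlsafe_b64decode(private_key)
    sig = base64.urlsafe_b64encode(hmac.new(key, to_sign, hashlib.sha1).digest())
    return f"{url}&signature={sig.decode()}"

params = urlencode(
    {"address": "1 Market St, San Francisco", "client": "gme-yourclientid"}
)
url = f"https://maps.googleapis.com/maps/api/geocode/json?{params}"
print(sign_url(url, "c2VjcmV0LWtleQ=="))  # placeholder base64 key
```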
Client-side geocoding via the JavaScript API doesn't have these server-side limits, so if users perform some action that triggers a geocode or two, the JS API is the best route.
You can already create your own "bucket" to track your 25K map loads per day by signing up for an API key.
This question on SO addresses the Geocoding API specifically being run directly from a Visualforce page: Salesforce: Google maps query status 620 G_GEO_TOO_MANY_QUERIES. It does seem to mean that without a key the limits are shared. I would suspect that unless you plan on giving away the app you are working on, you will pretty much be forced to pick up an upgraded API key. One workaround you may want to look at is hosting the maps portion in another location and iframing it into Salesforce.