Google Cloud SQL instance creation - authorized GAE - MySQL

My task is to create a MySQL database inside Google Cloud SQL. Following the instructions, I tried to set up an instance, without luck. The problem is the message
"Authorized GAE applications must be in the same region as the database instance"
even though I have checked the region setting on both the instance and the application, and they match. I also don't know what to put in the "Authorized networks" box. Thanks in advance.

That message means you chose a region for your Cloud SQL instance (EU, for example) that is different from the region of the App Engine application (US, for example) under which you created the Cloud SQL instance.
From the documentation:
Note: An App Engine application must be in the same region (either
European Union or United States) as a Google Cloud SQL instance to be
authorized to access that Google Cloud SQL instance.
Since the GAE location can't be changed, and the region of an existing Cloud SQL instance can't be changed either, you'd need to create a new instance in the exact region of your app.
The Authorized networks setting is exactly what Paul said: the IPs or subnetworks you want to whitelist for access to your instance. You only need it if you plan to connect to your instance with a mysql client.
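To make the whitelist idea concrete, here is a minimal Python sketch (standard library only) of what an Authorized networks entry means: each entry is a CIDR block, and a client may connect only if its IP falls inside one of the listed blocks. The addresses below are documentation-range examples, not real recommendations.

```python
# An "authorized network" is a CIDR block; a client is allowed in only
# if its IP address falls inside one of the configured blocks.
import ipaddress

# Example entries, as you would type them into the "Authorized networks"
# box (203.0.113.0/24 is a reserved documentation range).
authorized_networks = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.42/32"),  # a single host
]

def is_authorized(client_ip):
    """Return True if client_ip is inside any authorized network."""
    ip = ipaddress.ip_address(client_ip)
    return any(ip in net for net in authorized_networks)

print(is_authorized("203.0.113.45"))  # True: inside the /24 block
print(is_authorized("192.0.2.7"))     # False: not whitelisted
```

A `/32` entry whitelists exactly one address, which is the common case when you connect from a fixed workstation with the mysql client.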

Related

Oracle Cloud Infrastructure - Replicate Vault Across Regions

I have created a vault/key under a compartment.
As the Vault service is regional, it is only available in the region where I created it.
Even if the tenancy subscribes to multiple regions, the compartment shows up, but the Vault is still not available in those regions. Is there a way to replicate the Vault/Key/secrets when the tenancy subscribes to multiple regions?
I have not done this myself, but you could try this approach and see if the following steps will work for you:
Step 1. Use the BackupKey/BackupVault API (from the Vault Service) in the SOURCE region to create the relevant encrypted key/vault backup file(s).
Step 2. Use the CopyObject API (from the Object Storage Service) to copy the file(s) created in Step 1 from your SOURCE region to all DESTINATION regions.
Step 3. Use the RestoreKey/RestoreVault API (from the Vault Service) to restore the key/vault in the DESTINATION regions.
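I have not tried this end to end either, but the fan-out of those three steps can be sketched as follows. The helper functions are hypothetical placeholders for the real BackupVault/BackupKey, CopyObject, and RestoreVault/RestoreKey calls; here they only record what would be invoked, so the orchestration logic is self-contained.

```python
# Sketch of the three-step replication flow. The helpers below are
# hypothetical stand-ins for the real Vault Service and Object Storage
# API calls; they just record each call for illustration.
calls = []

def backup_vault(region):
    """Placeholder for the BackupVault/BackupKey API call (Step 1)."""
    calls.append(("backup", region))
    return "vault-backup-object"  # the encrypted backup file

def copy_object(obj, src, dst):
    """Placeholder for the Object Storage CopyObject API call (Step 2)."""
    calls.append(("copy", src, dst))

def restore_vault(obj, region):
    """Placeholder for the RestoreVault/RestoreKey API call (Step 3)."""
    calls.append(("restore", region))

def replicate(source, destinations):
    """Back up once in the source region, then copy and restore
    into every destination region."""
    backup = backup_vault(source)
    for dst in destinations:
        copy_object(backup, source, dst)
        restore_vault(backup, dst)

# Region names are examples only.
replicate("us-ashburn-1", ["eu-frankfurt-1", "uk-london-1"])
```

Note that the restore in each destination region produces an independent copy; later changes to the source vault are not propagated unless you repeat the cycle.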

IBM Watson Assistant: How to read data from an AWS MySQL table and determine dialogue based on the data (Yes or No)?

We have an AWS MySQL users table that has a column (flag) to indicate if the user is a paid member or not (Yes or No).
Can IBM Watson Assistant on IBM Cloud read the data, given the user ID, and depending on whether it's Yes or No, proceed to a different dialogue sequence?
Do you have an example of how to do it?
It is possible to reach out to database systems from within an IBM Watson Assistant dialog. For an example including code see this IBM Cloud solution tutorial on building a database-driven Slackbot.
Watson Assistant supports so-called programmatic calls from within a dialog node. These allow the dialog either to signal the calling application to perform some action, or to invoke an IBM Cloud Functions action. In the mentioned tutorial, Cloud Functions is used to reach out to a database system to retrieve or insert data.
In your case, you would write an IBM Cloud Functions action, bind the database credentials to it, and then invoke that action from the chatbot dialog to check the member status.
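As a rough illustration (not the tutorial's actual code), such a Python action could look like the sketch below. The real action would query the AWS MySQL users table with a MySQL driver; an in-memory dict stands in for that table here, and the `user_id` parameter and the returned field names are assumptions you would adapt to your schema.

```python
# Sketch of an IBM Cloud Functions action (Python) that Watson Assistant
# could invoke to check a user's paid-member flag. An in-memory dict
# stands in for the AWS MySQL `users` table so the branching logic is
# self-contained; in the real action you would run a SELECT instead.
_USERS = {"u100": "Yes", "u200": "No"}  # hypothetical sample rows

def lookup_flag(user_id):
    """Return the paid-member flag for a user, defaulting to 'No'."""
    return _USERS.get(user_id, "No")

def main(params):
    """Entry point of the Cloud Functions action.

    Watson Assistant passes `params` (including the user ID) when the
    dialog node invokes the action; the returned dict becomes available
    in the dialog context, so a node condition can branch on it.
    """
    flag = lookup_flag(params.get("user_id", ""))
    return {
        "paid_member": flag,
        # The dialog routes on this, e.g. a premium vs. a free branch.
        "dialog_branch": "premium_flow" if flag == "Yes" else "free_flow",
    }
```

In the dialog, a child node condition such as `$dialog_branch == "premium_flow"` (after storing the action result in context) would then select the appropriate dialogue sequence.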

How to assign multiple service account credentials to Google Cloud Functions?

I have three service accounts:
App engine default service account
Datastore service account
Alert Center API service account
My Cloud Function uses Firestore in Datastore mode for bookkeeping and invokes the Alert Center API.
One can assign only one service account when deploying a Cloud Function.
Is there a way, similar to AWS, to create multiple inline policies and assign them to the default service account?
P.S. I tried creating a custom service account, but Datastore roles are not supported. Also, I do not want to store credentials in environment variables or upload a credentials file with the source code.
You're looking at service accounts a bit backwards.
Granted, I see how the naming can lead you in this direction. "Service" in this case doesn't refer to the service being offered, but rather to the non-human entities (i.e. apps, machines, etc - called services in this case) trying to access that offered service. From Understanding service accounts:
A service account is a special type of Google account that belongs to
your application or a virtual machine (VM), instead of to an
individual end user. Your application assumes the identity of the
service account to call Google APIs, so that the users aren't
directly involved.
So you shouldn't be looking at service accounts from the offered service perspective - i.e. Datastore or Alert Center API, but rather from their "users" perspective - your CF in this case.
That single service account assigned to a particular CF is simply identifying that CF (as opposed to some other CF, app, machine, user, etc) when accessing a certain service.
If you want that CF to be able to access a certain Google service you need to give that CF's service account the proper role(s) and/or permissions to do that.
For accessing Datastore you'd be looking at these Permissions and Roles. If the datastore that your CFs need to access is in the same GCP project, the default CF service account (which is the same as the GAE app's account in that project) already has access to Datastore - assuming, of course, you're OK with using the default service account.
I haven't used the Alert Center API, but apparently it uses OAuth 2.0, so you should probably go through Service accounts.
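To illustrate the point that one identity can carry several role bindings at once, here is a toy Python model of IAM-style bindings. The service-account name and project are placeholders, and the role strings mirror real IAM role names, but the check itself is illustrative - it is not the IAM API.

```python
# Toy model of IAM bindings: a single service-account identity (the one
# attached to the Cloud Function) can be bound to several roles at once,
# which is the GCP analogue of attaching multiple policies in AWS.
cf_service_account = "my-function@my-project.iam.gserviceaccount.com"

# role -> set of members bound to it (names are placeholders).
bindings = {
    "roles/datastore.user": {cf_service_account},
    "roles/logging.logWriter": {cf_service_account},
}

def has_role(member, role):
    """Return True if the member is bound to the given role."""
    return member in bindings.get(role, set())

# The one account covers both services its function needs.
print(has_role(cf_service_account, "roles/datastore.user"))    # True
print(has_role(cf_service_account, "roles/logging.logWriter")) # True
```

In real life you would add such bindings with IAM (e.g. in the Cloud Console or via `gcloud projects add-iam-policy-binding`), not in code; the model above only shows why a single service account per function is not a limitation.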

Google Cloud function send call to app hosted on GKE

I would like to load data into my DB hosted on GKE using a Cloud Function (small ETL needs; Cloud Functions would be great for that case).
I'm working in the same region. My GKE cluster has an internal load balancer exposing an internal IP.
The method called works perfectly when invoked from App Engine, but when invoking it from a Cloud Function I get a connection error: "can't find client at IP".
I would like to know if this is possible, and if so, what the procedure would be.
Many thanks!!
Gab
We just released this feature to Beta. You can get started by following our docs:
https://cloud.google.com/functions/docs/connecting-vpc
https://cloud.google.com/appengine/docs/standard/python/connecting-vpc
https://cloud.google.com/vpc/docs/configure-serverless-vpc-access
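Once such a Serverless VPC Access connector is attached to the function, the function's code reaches the internal load balancer like any HTTP endpoint. The sketch below stands up a local HTTP server as a stand-in for the GKE service (the real internal IP, e.g. 10.8.0.5, is only reachable inside the VPC); the `/load` path and the JSON response are assumptions.

```python
# What the Cloud Function's code would do against the GKE internal load
# balancer once a VPC connector is attached. A local HTTP server stands
# in for the GKE service so the calling code is self-contained.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class _FakeGkeService(BaseHTTPRequestHandler):
    """Stand-in for the service behind the internal load balancer."""

    def do_GET(self):
        body = json.dumps({"status": "ok"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

def call_internal_service(base_url):
    """Call the ETL endpoint, e.g. base_url = "http://10.8.0.5"."""
    with urlopen(base_url + "/load") as resp:
        return json.loads(resp.read())

# Local stand-in for the internal load balancer.
server = HTTPServer(("127.0.0.1", 0), _FakeGkeService)
threading.Thread(target=server.serve_forever, daemon=True).start()
result = call_internal_service("http://127.0.0.1:%d" % server.server_port)
server.shutdown()
print(result)  # prints {'status': 'ok'}
```

The only change from a normal HTTP client is operational, not in code: the function must be deployed with the `--vpc-connector` flag so its egress traffic can reach the internal IP.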
This is not currently possible as of today.
https://issuetracker.google.com/issues/36859738
Thanks for your feedback.
You are totally right. At the moment the instances are only able to receive such requests via the external IP [1].
I have filed a feature request on your behalf so that this functionality might be considered for future deployments. I cannot guarantee this will be implemented or provide an ETA. Nevertheless, rest assured that your feedback is always taken seriously.
We also reached out to our Google Cloud representative, who confirmed this is a highly requested feature that is being looked at, but was unable to provide an ETA for when it would be released.

Whitelist IBM Cloud function location

Hi, does anyone know what I can use to whitelist IBM Cloud Functions locations? I wrote a function that makes REST API calls to a server, but the server needs to whitelist incoming requests. E.g., if I select "US South" as the location for my IBM Cloud function, what IP/domain/hostname does it appear as, so that I can whitelist it?
Thank you.
I recommend having a look at IBM Cloud's Statica service, which allows you to access restricted resources behind firewalls and whitelisted services using a static IP, regardless of where your app is running or the number of instances.
https://console.bluemix.net/catalog/services/statica
Does this help?