IBM Cloud Functions Service can't retrieve a list of databases when trying to create a new binding to a Cloudant action - ibm-cloud-functions

I'm trying to create a sequence for uploading documents. I created the Cloudant instance, a database in it, and both IAM and legacy credentials. However, when I try to bind a Cloudant action to the Cloudant instance, I receive an error retrieving the list of databases. What should I check to resolve this issue?
Failed to retrieve a list of databases

I had the same problem.
You can choose Instance > Input your own credentials instead.
Fill in your username and password. The host can be found under Cloudant > Service credentials.

Related

How to integrate Own Database in WSO2

I have installed WSO2 IS 6.0.0 and it is working. I have even configured Service Providers, which work fine.
Now I have my own MySQL database with its own table (local_users) containing around 10 users.
I want those users to be able to log in to the WSO2 My Account portal using a username and password from the local_users table, not from the WSO2 user store.
I have read the documentation here, which seems to only create the WSO2 tables in the MySQL database. It doesn't pick up my table (local_users) from the same database, and I didn't find any option to specify my own table or column names.
So my question is: is it possible to incorporate your own MySQL DB table with WSO2 Identity Server?
You can plug your own database holding user data into WSO2 IS as a userstore.
For that, you have to write a custom userstore manager, because your DB schema is different from the schema used by the WSO2 default userstores.
Refer to this guide for more information on writing a custom userstore manager and plugging your own userstore into WSO2 IS: https://nishothan-17.medium.com/custom-user-store-manager-for-wso2-identity-server-5-11-0-6e23a4ddf1bb

Cloud Functions - PrismaClientInitializationError: Can't reach database server at `my.cloudsql.ip`:`3306`

I'm trying to deploy my Node.js app to Google Cloud Functions, connected to a Cloud SQL instance (MySQL) and using the Prisma ORM.
The deployment was successful, but whenever I access an API route with a connection to the database, I get the following error as a response:
PrismaClientInitializationError: Can't reach database server at `my.cloudsql.ip`:`3306`. Please make sure your database server is running at `my.cloudsql.ip`:`3306`.
My database string looks like this: "mysql://user:password@cloud-sql-ip/database?host=/cloudsql/instance-connection-name"
I already tried adding ?connect_timeout=300 to the database connection string, as mentioned here, but it didn't help.
It also took me a while to figure this out. Here's the URL that worked for me:
DATABASE_URL=mysql://{USER}:{PASSWORD}@{INSTANCE_PUBLIC_IP}:3306/{DB_NAME}?socket={INSTANCE_CONNECTION_NAME}
Then, in your schema.prisma:
datasource db {
  provider = "mysql"
  url      = env("DATABASE_URL")
}
ALSO, remember to update the IAM permissions on your Cloud Function/Cloud SQL instance to authorize the Cloud Function to read from the DB:
https://cloud.google.com/sql/docs/mysql/connect-functions
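To avoid quoting mistakes, the connection string can also be assembled programmatically. This is only a sketch under the assumptions above; the function name and all inputs are placeholders, and encodeURIComponent guards against special characters in a generated password:

```javascript
// Sketch: build a Prisma DATABASE_URL for Cloud SQL (MySQL).
// All inputs are placeholders for your own values.
function cloudSqlUrl({ user, password, host, db, socketPath }) {
  // encodeURIComponent protects against special characters in a
  // generated password (e.g. '#' or '@'), which would break URL parsing.
  const cred = `${user}:${encodeURIComponent(password)}`;
  if (socketPath) {
    // Unix-socket variant, e.g. socketPath = "/cloudsql/<instance-connection-name>"
    return `mysql://${cred}@localhost/${db}?socket=${socketPath}`;
  }
  // TCP variant using the instance's public IP
  return `mysql://${cred}@${host}:3306/${db}`;
}
```

The resulting string would then be set as the DATABASE_URL environment variable of the Cloud Function before deployment.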
In my case, it was because I used the generated password from Cloud SQL. I URL-encoded the password, and my connection string looks like:
DATABASE_URL=mysql://{USER}:{PASSWORD}@{INSTANCE_PUBLIC_IP}:3306/{DB_NAME}

Sync user data from Amazon Cognito to my MySQL database from Laravel after a user registers

I need to use the AWS Cognito user authentication service with my React Native app, and also store user data in my MySQL database through my Laravel backend.
Is there a way, after a user registers in AWS Cognito, to send their data (email, name, cellphone) to my MySQL database and store it in my users table?
That way I can use my MySQL database for queries inside my app.
Thanks in advance.
You can use a Cognito trigger for this. For example:
Documentation
Example

Google Cloud SQL - Catch bad logins

I have an existing MySQL (version 5.7) instance hosted (managed) by Google Cloud SQL. I want to get a notification when someone tries to connect to my database with a bad username/password.
My idea was to look for it in the Google Stackdriver logs, but it's not there.
Is there an option to collect this information?
UPDATE 1:
I tried to connect to the instance with gcloud, but unfortunately it didn't work.
$ gcloud sql connect mydb
Whitelisting your IP for incoming connection for 5 minutes...done.
ERROR: (gcloud.sql.connect) It seems your client does not have ipv6 connectivity and the database instance does not have an ipv4 address. Please request an ipv4 address for this database instance.
That's because the database is accessible only inside the internal network. I searched for a flag like --internal-ip but didn't find one.
However, I guessed that it wouldn't make any difference if I tried to access the database from my DB editor (Workbench) instead. So I did, and searched for the query that @Christopher advised - but it's not there.
What did I miss?
UPDATE 2:
Screenshot of my Stackdrive:
Even if I remove the (resource.labels.database_id="***") condition, the result is the same.
Is there an option to collect this information?
One of the best options to collect information about who is trying to connect to your Google Cloud SQL instance with wrong credentials is Stackdriver Logging.
Before beginning
To reproduce these steps, I connected to the Cloud SQL instance using the gcloud command:
gcloud sql connect [CLOUD_SQL_INSTANCE]
I am not entirely sure whether anything changes if you use the mysql command line instead, but if it does, you only need to look for the new log message and update the last boolean entry (from point 4 on).
How to collect this information from Stackdriver Logging
Go to the Stackdriver → Logging section.
To get the information we are looking for, we will use advanced log queries. Advanced log queries are expressions that can specify a set of log entries from any number of logs. They can be used in the Logs Viewer, the Logging API, or the gcloud command-line tool, and they are a powerful way to get information from logs.
Here you will find how to get and enable advanced log queries in your logs.
Advanced log queries are just boolean expressions that specify a subset of all the log entries in your project. To find out who has entered wrong credentials into your Cloud SQL instance running MySQL, we will use the following query:
resource.type="cloudsql_database"
resource.labels.database_id="[PROJECT_ID]:[CLOUD_SQL_INSTANCE]"
textPayload:"Access denied for user"
Where [PROJECT_ID] corresponds to your project ID and [CLOUD_SQL_INSTANCE] corresponds to the name of the Cloud SQL instance you would like to supervise.
Notice that the last boolean expression, the one on textPayload, uses the : operator.
As described here, the : operator matches any substring of the log entry field, so every log entry containing the specified string - in this case "Access denied for user" - will match.
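If the same filter is also needed outside the Logs Viewer (for the Logging API, or to pass to gcloud logging read), it can be kept in one place. A small sketch, with the project and instance names as placeholders and the helper name my own invention:

```javascript
// Sketch: assemble the advanced log filter described above so the same
// string can be reused with the Logging API or `gcloud logging read`.
function badLoginFilter(projectId, instance) {
  return [
    'resource.type="cloudsql_database"',
    `resource.labels.database_id="${projectId}:${instance}"`,
    'textPayload:"Access denied for user"', // ':' = substring match
  ].join("\n");
}
```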
If a user now enters wrong credentials, you should see a message like the following appear in your logs:
[TIMEFRAME][Note] Access denied for user 'USERNAME'@'[IP]' (using password: YES)
From here, it is a matter of using one of the GCP products to send you a notification when a user enters wrong credentials.
I hope it helps.
As said in the GCP documentation:
Cloud Shell doesn't currently support connecting to a Cloud SQL instance that has only a private IP address.

AWS S3 error with MySQL dump import in Drupal 7

I'm getting this error:
Aws\Common\Exception\InstanceProfileCredentialsException: Error retrieving credentials from the instance profile metadata server. When you are not running inside of Amazon EC2, you must provide your AWS access key ID and secret access key in the "key" and "secret" options when creating a client or provide an instantiated Aws\Common\Credentials\CredentialsInterface object. ([curl] 28: Connection timed out after 5010 milliseconds [url] http://169.254.169.254/latest/meta-data/iam/security-credentials/) in Aws\Common\InstanceMetadata\InstanceMetadataClient->getInstanceProfileCredentials() (line 85 of /var/www/public/sites/all/libraries/awssdk2/Aws/Common/InstanceMetadata/InstanceMetadataClient.php).
In this scenario:
I'm using the AWS S3 module in Drupal 7. One S3 bucket is used for development, and the other one for staging.
After the client adds some content in staging, I try to import the new DB dump from staging into development, and the error appears. Maybe it is something with the image paths in the new dump; the AWS credentials work fine for development, so the problem is with the new DB dump.
Can someone help me, please?
It looks like your instance is not able to authenticate to the specified S3 bucket.
1) Set AWS credentials in your environment path (not recommended). Instead, you should assign an AWS IAM role to the instance to grant it access to the S3 resource.
2) Change the S3 bucket permissions so that they allow the present server to access the bucket.
I am not sure if this is your real case. Please elaborate on your question so that we can give a precise solution.
I'm not allowed to comment, so I had to post an answer.