How to connect to Amazon MySQL Aurora from a Google Cloud Function? - google-cloud-functions

I need to fetch data from Amazon MySQL Aurora inside a Google Cloud Function. How can I connect to MySQL Aurora from a Cloud Function to do this?
I am able to connect to MySQL from my local system using the MySQL connector in Python, as below.
import mysql.connector
cnx = mysql.connector.connect(user='user', password='password',
                              host='host', database='db')
c_list = cnx.cursor()
c_list.execute('select id, type from table ch limit 1')
But it is not working inside the Cloud Function.
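For reference, a minimal sketch of what the same query could look like inside an HTTP-triggered Cloud Function. The function name, endpoint, credentials, and table name below are all hypothetical placeholders, and the sketch assumes the Aurora endpoint is publicly accessible with a security group that allows inbound MySQL traffic from the function's egress IPs (a common reason such connections hang or time out):

import mysql.connector

def fetch_rows(request):
    # Assumes the Aurora cluster endpoint is publicly reachable and its
    # security group allows inbound traffic on port 3306 from the function.
    cnx = mysql.connector.connect(
        user='user',
        password='password',
        host='mycluster.cluster-xxxx.us-east-1.rds.amazonaws.com',  # placeholder
        database='db',
        connection_timeout=10,  # fail fast instead of hanging on network issues
    )
    cursor = cnx.cursor()
    cursor.execute('select id, type from ch limit 1')  # table name assumed
    rows = cursor.fetchall()
    cursor.close()
    cnx.close()
    return str(rows)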

Related

Laravel: No such file or directory for table in AWS RDS laravel

I recently deployed my Laravel app to AWS Elastic Beanstalk. I created an RDS MySQL database, imported my SQL dump from my local project, and deployed it to the AWS EC2 server. But I'm getting this error:
SQLSTATE[HY000] [2002] No such file or directory (SQL: select * from settings where code = company-name limit 1).
Is there any way to check whether there is data in my DB? Just to note, I'm still learning my way around AWS. Thanks.
EDIT:
I can also see the correct environment details (RDS_PORT, RDS_HOST, ...) on the error page where the error is displayed.
You can follow this tutorial to connect to a MySQL RDS database on AWS.
So you can check if all your tables have been created successfully.
AWS Connect to Database Tutorial
Make sure that you open the port of the RDS database in the security group; otherwise your local machine can't connect to the database.
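As a quick sanity check (a sketch in Python rather than PHP, with a placeholder endpoint and credentials), you can connect to the RDS endpoint over TCP and list the tables, assuming port 3306 is open in the security group:

import pymysql

# Placeholder endpoint and credentials; use the values from the RDS console.
connection = pymysql.connect(
    host='mydb.xxxxxxxx.us-east-1.rds.amazonaws.com',
    port=3306,
    user='admin',
    password='secret',
    db='mydatabase',
)
with connection.cursor() as cursor:
    cursor.execute('SHOW TABLES')  # verify the imported tables actually exist
    for (table_name,) in cursor.fetchall():
        print(table_name)
connection.close()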

Cloud Run: Connecting to Cloud SQL instances

I am unable to connect to a Cloud SQL instance when running an image on Cloud Run. Is this feature working yet?
I have successfully connected to the same SQL instance with Compute Engine.
Tried to connect to the Cloud SQL instance using a simple shell command:
mysql --host=$MYSQL_IP --user=$MYSQL_ROOT --password=$MYSQL_PASS -e "SHOW DATABASES"
Result is logged as such:
ERROR 2003 (HY000): Can't connect to MySQL server on '.*..' (110)
This question was asked several months before Cloud Run reached GA (General Availability).
Cloud Run (fully-managed) now supports connecting to Cloud SQL instances (both MySQL and PostgreSQL) using the public IP address of the database.
gcloud run services update run-mysql \
--add-cloudsql-instances [INSTANCE_CONNECTION_NAME] \
--set-env-vars CLOUD_SQL_CONNECTION_NAME=[INSTANCE_CONNECTION_NAME],\
DB_NAME=[MY_DB],DB_USER=[MY_DB_USER],DB_PASS=[MY_DB_PASS]
...where INSTANCE_CONNECTION_NAME is of the form PROJECT_ID:REGION:INSTANCE_ID (as returned by gcloud sql instances describe <INSTANCE-ID> | grep connectionName or available in the Overview section of the Cloud SQL instance in Cloud Console).
Note that the service account used by Cloud Run to authorize your connections to Cloud SQL must have the correct IAM permissions, which will require some configuration if the DB instance and the Cloud Run services are not part of the same project.
The above takes care of connectivity between Cloud Run and Cloud SQL.
Having your application actually talk to the Cloud SQL instance requires connecting from your Cloud Run service using the Unix domain socket located at /cloudsql/INSTANCE_CONNECTION_NAME (these connections are automatically encrypted).
Different languages and different database drivers will use different ways to connect to the Cloud SQL instance. This link has more details.
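For example, with PyMySQL in Python, connecting over that Unix domain socket could look roughly like this (a sketch; the environment variable names follow the gcloud command above, everything else is a placeholder):

import os
import pymysql

# Cloud Run exposes the Cloud SQL connection as a Unix socket under /cloudsql/.
# The env var names match the --set-env-vars flags in the gcloud command above.
connection = pymysql.connect(
    unix_socket='/cloudsql/' + os.environ['CLOUD_SQL_CONNECTION_NAME'],
    user=os.environ['DB_USER'],
    password=os.environ['DB_PASS'],
    db=os.environ['DB_NAME'],
)
with connection.cursor() as cursor:
    cursor.execute('SELECT NOW()')  # trivial query to verify connectivity
    print(cursor.fetchone())
connection.close()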
See also How to securely connect to Cloud SQL from Cloud Run?

How to call or import a table from Google Cloud SQL into a Spark dataframe?

I have created an instance in Google Dataproc and I am running pyspark on it. I am trying to import data from a table into pyspark, so I created a table in Google Cloud SQL. But I don't know how to call or import this table from pyspark; I don't have any URL-like connection string to point to this table. Could you please help in this regard?
Normally, you could use spark.read.jdbc(): How to work with MySQL and Apache Spark?
The challenge with Cloud SQL is networking -- figuring out how to connect to the instance. There are two main ways to do this:
1) Install the Cloud SQL proxy
You can use this initialization action to do that for you. Follow the instructions under "without configuring Hive metastore", since you don't need to do that:
gcloud dataproc clusters create <CLUSTER_NAME> \
--scopes sql-admin \
--initialization-actions gs://dataproc-initialization-actions/cloud-sql-proxy/cloud-sql-proxy.sh \
--metadata "enable-cloud-sql-hive-metastore=false"
The proxy is a local daemon that you can connect to on localhost:3306, and it proxies connections to the Cloud SQL instance. You'd need to use localhost:3306 in your JDBC connection URI in spark.read.jdbc() (a PySpark sketch follows below).
2) If you're instead willing to add to your driver classpath, you can consider installing the Cloud SQL Socket factory.
There's some discussion about how to do this here: https://groups.google.com/forum/#!topic/cloud-dataproc-discuss/Ns6umF_FX9g and here: Spark - Adding JDBC Driver JAR to Google Dataproc.
It sounds like you can either package it into a shaded application jar in pom.xml, or just provide it at runtime by adding it via --jars.
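To illustrate option 1: with the proxy running on the cluster nodes, the read could look roughly like this in PySpark (a sketch; the database name, table, credentials, and driver class are placeholders, and spark is the session the pyspark shell provides):

# The Cloud SQL proxy listens on localhost:3306 on each node, so the JDBC
# URI points at localhost rather than at the instance's IP.
df = spark.read.jdbc(
    url='jdbc:mysql://localhost:3306/mydb',
    table='mytable',
    properties={
        'user': 'user',
        'password': 'password',
        'driver': 'com.mysql.jdbc.Driver',  # assumes the Connector/J jar is on the classpath
    },
)
df.show()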

Connect to AWS RDS MySQL database with Python

I am currently trying to connect to my MySQL database created on AWS with a Python program using the PyMySQL library.
import pymysql.cursors
connection = pymysql.connect(host='.blablablablabla.us-east.rds.amazonaws.com',
                             user='Username',
                             password='Password',
                             db='nameofmydb')
When I run the above code I get the following error:
pymysql.err.InternalError: (1049, "Unknown database 'nameofmydb'")
What am I doing wrong? The arguments I gave to the function are correct. Could it be a problem with the MySQL driver?
Thank you in advance
The message is a standard one returned by the MySQL server. It means what it says: it cannot find the database. This doesn't look like a connectivity problem or a driver issue, since the request reaches the server and the server returns a message.
Just make sure you have created the database using the CREATE DATABASE command. Use the mysql CLI client to connect to the database and type SHOW DATABASES to see all the available databases.
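To illustrate, a minimal sketch with PyMySQL (placeholder endpoint and credentials) that connects without selecting a database, lists what exists, and creates the missing database:

import pymysql

# Omit the db= argument so the connection succeeds even if the database
# does not exist yet (endpoint and credentials are placeholders).
connection = pymysql.connect(
    host='mydb.xxxxxxxx.us-east-1.rds.amazonaws.com',
    user='Username',
    password='Password',
)
with connection.cursor() as cursor:
    cursor.execute('SHOW DATABASES')
    print([row[0] for row in cursor.fetchall()])
    cursor.execute('CREATE DATABASE IF NOT EXISTS nameofmydb')
connection.close()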

AWS connect to database

I created a database on AWS RDS and created a Node.js app.
In MySQL Workbench on localhost I connected to the database using the endpoint; the connection succeeded and I got data on localhost/users.
I deployed the Node.js app server to AWS, but I did not get any data when I tried to call https://xalynj2ul4.execute-api.us-west-2.amazonaws.com/staging/users.
Try white-listing your Node server in the MySQL host configuration by entering your server's IP in it.
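For the MySQL-level part, a hypothetical sketch in Python (the admin credentials, endpoint, IP, and database name are all placeholders): create or adjust the application user so it is only allowed to connect from the app server's IP. Note that for RDS you typically also need to allow that IP on port 3306 in the instance's security group.

import pymysql

# Placeholder admin credentials and endpoint; 203.0.113.10 stands in for
# the app server's public IP.
connection = pymysql.connect(
    host='mydb.xxxxxxxx.us-west-2.rds.amazonaws.com',
    user='admin',
    password='secret',
)
with connection.cursor() as cursor:
    # Restrict the app user to the app server's IP (MySQL 5.7+ syntax).
    cursor.execute("CREATE USER IF NOT EXISTS 'app'@'203.0.113.10' IDENTIFIED BY 'app-password'")
    cursor.execute("GRANT SELECT ON mydatabase.* TO 'app'@'203.0.113.10'")
connection.close()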