Shift from phpmyadmin as Server to Google Cloud platform as server? - mysql

Currently I am using phpMyAdmin as my server. Now I want to shift to Google Cloud as my server.
I have a few PHP files which are called as URLs from an Android app; these PHP files connect to a MySQL database and return results. Relative paths of images are stored in the database, and the images themselves are stored on the server.
I am not able to figure out how to do the same thing with Google Cloud. I tried uploading the same PHP files to Google Cloud and the database to Google Cloud SQL, but I don't know how to access the relative-path images or the individual PHP files, and I am lost as I am new to Google Cloud.
Can someone please help? Thank you!

You'll probably want to use Google Cloud Storage:
https://cloud.google.com/appengine/docs/php/googlestorage/
Your site users can't upload their files directly to App Engine's filesystem, so you can let them upload to Cloud Storage and then keep the metadata in your Cloud SQL database.
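A minimal sketch of that pattern in Python (the same flow exists in the PHP client library); the bucket name `my-app-images` and the `object_path` naming scheme are assumptions, not anything from your setup:

```python
import posixpath


def object_path(user_id, filename):
    # Hypothetical naming scheme: one "folder" per user inside the bucket.
    return posixpath.join("uploads", str(user_id), filename)


def upload_image(local_file, user_id, filename, bucket_name="my-app-images"):
    """Upload a file to Cloud Storage and return the path to store in Cloud SQL."""
    # Third-party client: pip install google-cloud-storage
    # (requires application-default credentials to actually run).
    from google.cloud import storage

    client = storage.Client()
    blob = client.bucket(bucket_name).blob(object_path(user_id, filename))
    blob.upload_from_filename(local_file)
    # Store only this relative path in your images table, as before;
    # the file itself now lives in the bucket, not on the server.
    return object_path(user_id, filename)
```

The database row keeps the same shape as on your old server; only the place the bytes live changes.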

Related

Building cloud service using python google drive API

Hi, I'm trying to build my own cloud web service that uses the Python Google Drive API.
What I'm trying to make is basically a cloud service that can also interact with Google Drive.
To do that, my web service's users each have to have their own 'storage.json' file in a virtual directory on the server (the file really is stored on the server).
When I obtain my own 'storage.json' file (I followed https://developers.google.com/drive/api/v3/manage-uploads) and upload it to my directory, e.g. User1, uploading and downloading with Google Drive work just fine.
But my final goal is this: when a user does not have a 'storage.json' file in their directory, e.g. User2, I want a browser to pop up on User2's computer so that User2 can follow Google's authentication flow and a 'storage.json' file finally gets saved into the server's ~/User2/ directory.
Do you think this is possible? Or if there is a better way, can you let me know? Thank you!
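The per-user check described above can be sketched roughly as follows, assuming the `google-auth` and `google-auth-oauthlib` libraries and a `client_secrets.json` file (both assumptions; the `storage_path` layout just mirrors the question's `~/User2/storage.json`). One caveat built into the sketch: `run_local_server` opens the browser on the machine the code runs on, so for a remote User2 you would need the redirect-based web-server OAuth flow instead:

```python
import os


def storage_path(user, base="~"):
    # Hypothetical layout: ~/User2/storage.json, as in the question.
    return os.path.join(os.path.expanduser(base), user, "storage.json")


def get_credentials(user):
    """Load saved Drive credentials for a user, running the OAuth flow if missing."""
    path = storage_path(user)
    if os.path.exists(path):
        # Third-party: pip install google-auth
        from google.oauth2.credentials import Credentials
        return Credentials.from_authorized_user_file(path)

    # Third-party: pip install google-auth-oauthlib
    from google_auth_oauthlib.flow import InstalledAppFlow
    flow = InstalledAppFlow.from_client_secrets_file(
        "client_secrets.json",
        scopes=["https://www.googleapis.com/auth/drive.file"],
    )
    # Opens a browser *on this machine* -- fine when testing as User1
    # locally, but a remote User2 needs the redirect-based web flow.
    creds = flow.run_local_server(port=0)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as f:
        f.write(creds.to_json())
    return creds
```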

How to upload a CSV file in a microservice deployed in cloud foundry

I am new to Cloud Foundry. I am currently working on a requirement where I have to upload a CSV file (via a JSP UI) to a service deployed in Cloud Foundry and persist its data in that service.
The issue is that from the UI I only get the local path of the CSV file, and when I try to parse the CSV via this path the file is not found. I guess the reason is that the service is deployed in CF, so it cannot see a path on my local machine.
Can you please let me know how and where I should parse this CSV file?
Thanks in advance!
There is nothing specific to Cloud Foundry about how you would receive an uploaded file in a web application. Since you mentioned using Java, I would suggest checking out this post.
How to upload files to server using JSP/Servlet?
The only Cloud Foundry-specific thing you need to keep in mind is that its filesystem is ephemeral. It behaves like a normal filesystem and you can write files to it; however, the lifetime of the filesystem is equal to the lifetime of your application instance. That means restarts, restages or anything else that causes the application container to be recreated will destroy the filesystem.
In short, you can use the file system for caching or temporary files but anything that you want to retain should be stored elsewhere.
https://docs.cloudfoundry.org/devguide/deploy-apps/prepare-to-deploy.html#filesystem
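Though the question uses Java/JSP, the key point is language-agnostic: parse the uploaded stream the server receives, never a client-side path. A sketch of that pattern in Python (the function name and sample data are made up for illustration):

```python
import csv
import io


def parse_uploaded_csv(file_stream):
    """Parse a CSV from an uploaded file's byte stream (not from a client-side path)."""
    text = io.TextIOWrapper(file_stream, encoding="utf-8")
    return list(csv.DictReader(text))


# In a real web handler you would pass the request's uploaded-file stream here,
# then persist the rows to a bound database service -- not to the local disk,
# which is ephemeral on Cloud Foundry.
rows = parse_uploaded_csv(io.BytesIO(b"name,qty\nwidget,3\n"))
# rows == [{"name": "widget", "qty": "3"}]
```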

How to upload user defined function to Google cloud MySQL instance

I am running a MySQL database instance on Google Cloud, and I am currently trying to upload a shared library file (my_library.so) to the plugin directory so that I can declare it as a user-defined function to call in MySQL queries.
Does someone know how to do it?
It seems that the Google Cloud console does not give me access to the files in my instance, and I have not been able to upload the library using MySQL Workbench either.
Thank you
Cloud SQL is a managed service and does not give you access to the instance's filesystem, so uploading compiled code such as UDF plugins is not supported.

Google Drive upload in Service Account

We have a requirement to let users upload files of up to 100 GB. The current flow moves the file from the client's local system to the application server, and the application server then pushes the file to a service account on Google Drive. I would like to know if there is a way to push the file from the local system directly to the service account in Google Drive, so that we don't have to store such big files on the application server. Does anyone have any suggestions?
Service accounts are not meant to be used like this. They have limited quota: Google Drive Service Accounts Additional Storage
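You can verify the limit before attempting an upload. A rough sketch, assuming `google-api-python-client` with valid service-account credentials (the helper function and the 15 GB figure in the example are illustrative, not guarantees about your account):

```python
def fits_in_quota(limit_bytes, usage_bytes, upload_bytes):
    # Pure arithmetic: would the upload exceed the account's storage limit?
    return usage_bytes + upload_bytes <= limit_bytes


def check_drive_quota(service, upload_bytes):
    """Query a Drive service account's storage quota before a large upload."""
    # `service` is a Drive v3 resource built with service-account credentials
    # (pip install google-api-python-client). Accounts without a limit
    # omit the "limit" field; this sketch assumes it is present.
    about = service.about().get(fields="storageQuota").execute()
    quota = about["storageQuota"]
    return fits_in_quota(int(quota["limit"]), int(quota["usage"]), upload_bytes)


# A 100 GB file will not fit in a typical 15 GB quota:
assert not fits_in_quota(15 * 1024**3, 0, 100 * 1024**3)
```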

Google Cloud SQL: keeping dev and live databases updated

I have a local MySQL dev instance that I've been developing with and now I want my local MySQL table changes to be made in the live Google Cloud SQL instance that I have connected in my Google Plugin for Eclipse. I can't figure out how to do this except for a full dump and import. Is there another/better way? Ideally, database versioning would be great, but I don't think that exists with Google Cloud SQL.