Insert data into MySQL from a Google Sheet - mysql

I need to insert data from a Google spreadsheet into a MySQL table automatically every 2 hours. Can someone help me, please?
I have already used the import tool in phpMyAdmin, but that is a manual process, and I need it to run automatically.
Maybe it can be done with mysqlimport or a LOAD DATA INFILE script, but I need some guidelines. I'm not a programmer, but I can learn the basics to implement it.

Based on this answer from the question Google Apps Script to Export Spreadsheets to mySQL execute on multiple files:
If your MySQL instance is a public-facing instance, accessible from outside your local network, you can use the Google Apps Script JDBC Service to connect to your MySQL instance and insert/update data from your Google Sheets. Please read the Setup for other databases section of the JDBC guide for details on setting up your database for connection from Google Apps Script.
You can have one script that loops through all spreadsheets in a given folder and inserts data from each into your MySQL database.
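For example, here is a minimal Apps Script sketch of that approach, combining a JDBC insert with a time-driven trigger that runs every 2 hours as asked above. The host, database, table, column names, credentials and spreadsheet ID are placeholders you would replace with your own values, and the sheet's columns are assumed to match the target table:

    // Placeholders - replace with your own MySQL host, credentials and sheet ID.
    var DB_URL   = 'jdbc:mysql://your-host:3306/your_database';
    var DB_USER  = 'your_user';
    var DB_PASS  = 'your_password';
    var SHEET_ID = 'your-spreadsheet-id';

    function exportSheetToMySql() {
      var sheet = SpreadsheetApp.openById(SHEET_ID).getSheets()[0];
      var rows = sheet.getDataRange().getValues();   // first row assumed to be headers

      var conn = Jdbc.getConnection(DB_URL, DB_USER, DB_PASS);
      conn.setAutoCommit(false);
      var stmt = conn.prepareStatement(
          'INSERT INTO your_table (col_a, col_b, col_c) VALUES (?, ?, ?)');
      for (var i = 1; i < rows.length; i++) {
        stmt.setString(1, String(rows[i][0]));
        stmt.setString(2, String(rows[i][1]));
        stmt.setString(3, String(rows[i][2]));
        stmt.addBatch();
      }
      stmt.executeBatch();
      conn.commit();
      conn.close();
    }

    // Run this once to schedule the export every 2 hours, as asked in the question.
    function createTwoHourTrigger() {
      ScriptApp.newTrigger('exportSheetToMySql')
          .timeBased()
          .everyHours(2)
          .create();
    }

Depending on whether the sheet holds only new rows or a full snapshot, you may want the script to truncate the table or use upserts instead of plain inserts.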

Related

How to access BigQuery generally from Google Apps Script

I have a Google Apps Script application that currently accesses a GCP SQL database with JDBC.
Using the normal SQL database doesn't cut it, so I decided to try BigQuery.
Is there a way to access BigQuery from Google Apps Script without connecting with an account connected to the GCP project? I want guests who use my script to be able to get data.
I'm looking for either a general way (as in: IP, database, username, password and I manage the connection) or a client library way I can use with Apps Script.
Note: the BigQuery Apps Script plugin seems to allow access only to my own datasets, so guests will be denied access.
As per this link, you can share your BigQuery dataset with specific users even if they're not part of the GCP project. Since they need to be able to retrieve data, they will need "Viewer" permission on the dataset. The steps to do this are described here, and this link shows, with an example, how you can query your BigQuery table from Apps Script.
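For instance, a minimal sketch of querying a shared table through the BigQuery advanced service (which must be enabled for the script project). The project ID, dataset and table names are placeholders, and it assumes the effective user has the dataset access described above:

    function queryShared() {
      // Placeholder: the GCP project the query job runs under.
      var projectId = 'your-gcp-project-id';
      var request = {
        // Placeholder dataset and table names.
        query: 'SELECT name, value FROM `your-gcp-project-id.your_dataset.your_table` LIMIT 10',
        useLegacySql: false
      };
      // Requires the BigQuery advanced service to be enabled for this script.
      var queryResults = BigQuery.Jobs.query(request, projectId);
      var rows = queryResults.rows || [];
      rows.forEach(function (row) {
        // Each row is an object with an array f of {v: value} cells.
        Logger.log(row.f.map(function (cell) { return cell.v; }).join(', '));
      });
    }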

Google Form output to remote MySQL database

I have a Google Form I created. I have a website with a remote MySQL database. I would like to embed the Google Form into my site (this I've figured out); however, instead of the form-submission data being stored in a Google Spreadsheet, I'd like it sent to my MySQL database, to a predefined table designed to accept the data types being collected and validated for in the Google Form.
I have researched and come across Google Apps Script (https://developers.google.com/apps-script/guides/jdbc). My issue is that I'm not experienced with Google Apps Script and am seeking guidance in setting this up. I have the Google Form, access to my Google Apps account, and the connection string to my remote MySQL database with administrator privileges. I'm seeking step-by-step guidance, as I have not found any tutorials online yet. If you can guide me to a tutorial, that would be appreciated as well.
You can now use a database from Google Apps Script.
Apps Script supports Google Cloud SQL, MySQL, Microsoft SQL Server, and Oracle databases via the JDBC class.
Documentation: https://developers.google.com/apps-script/guides/jdbc
You can add a script to a Google Form.
It's possible to create a form that does not send responses to a linked spreadsheet. In the "RESPONSE" menu, choose "CHOOSE RESPONSE DESTINATION". You can create an installable 'Form submit' trigger to run some code. In the code editor, choose RESOURCES, CURRENT PROJECT TRIGGERS, add a trigger, and set it to run when the form is submitted. Then you'll need to use getItemResponses():
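As an illustration, a minimal sketch of such a form-submit handler for a script bound to the form itself. The connection string, table, column names and question titles are placeholders, not values from the question:

    // Handler for an installable "On form submit" trigger in a form-bound script.
    function onFormSubmit(e) {
      // e.response is the FormResponse for the submission that fired the trigger.
      var itemResponses = e.response.getItemResponses();
      var answers = {};
      itemResponses.forEach(function (ir) {
        // Map each question's title to the submitted answer.
        answers[ir.getItem().getTitle()] = ir.getResponse();
      });

      // Placeholder connection details and table/column names.
      var conn = Jdbc.getConnection(
          'jdbc:mysql://your-host:3306/your_database', 'your_user', 'your_password');
      var stmt = conn.prepareStatement(
          'INSERT INTO form_responses (name, email) VALUES (?, ?)');
      stmt.setString(1, String(answers['Name']));    // 'Name' and 'Email' are example
      stmt.setString(2, String(answers['Email']));   // question titles, not real ones
      stmt.execute();
      conn.close();
    }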
Google documentation - Forms
Bruce McPherson offers some step-by-step guides on his site if you use his cDbAbstraction library to access any external DBs.
If you don't want to roll your own solution to this, check out the Form integration in SeekWell. You just write a SQL snippet and can map fields in the Form to parameters in the SQL.
Disclaimer: I made this.

Google Apps Script to Export Spreadsheets to mySQL execute on multiple files

Ahoy!
How can I export all of my Google Spreadsheets' data to MySQL? I have the basics of an export script, but all of my spreadsheets have 1,500+ rows and there are 41 of them. My next question: can I execute these scripts on all of the spreadsheet files at once, perhaps in a folder? I don't fancy trawling through all 41 and assigning a script to each.
Thanks in advance :)
How can I export all of my Google Spreadsheets' data to MySQL?
There are several ways you can do this. Which one to use depends on how your MySQL instance is configured.
If your MySQL instance is a closed, local-network-only instance, then you can't connect to it from outside your local network, so Google Apps Script will not be able to connect to it. In this case your only option is to export your Google Sheets data as CSV files (i.e. using the File -> Download as -> Comma-separated values menu), then import those into your MySQL db table. See the LOAD DATA INFILE MySQL statement syntax for details.
If your MySQL instance is a public-facing instance, accessible from outside your local network, you can use the Google Apps Script JDBC Service to connect to your MySQL instance and insert/update data from your Google Sheets. Please read the Setup for other databases section of the JDBC guide for details on setting up your database for connection from Google Apps Script.
Can I execute these scripts on all of the spreadsheet files at once, perhaps in a folder?
In the second case (public-facing MySQL instance) you can definitely automate this with a bit of scripting. You can have one script that loops through all spreadsheets in a given folder (or a list of spreadsheet IDs, if they are in different folders) and inserts data from each into your MySQL database. The Drive Service and Spreadsheet Service will be your friends here. However, keep in mind that the maximum execution time for a Google Apps Script run is limited (6 minutes per execution for consumer accounts), so if your sheets contain a lot of data and/or your connection to your db instance is slow, such a script may run into a timeout. You may have to implement some back-off/resume functionality in your script so it knows where it finished on the previous run and picks up from there on the next run.
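For example, a rough sketch of such a folder loop, assuming every spreadsheet in the folder has the same column layout in its first sheet; the folder ID, connection details and table/column names are placeholders:

    function exportFolderToMySql() {
      // Placeholder folder ID: the Drive folder containing the 41 spreadsheets.
      var folder = DriveApp.getFolderById('your-folder-id');
      var files = folder.getFilesByType(MimeType.GOOGLE_SHEETS);

      // Placeholder connection details and table/column names.
      var conn = Jdbc.getConnection(
          'jdbc:mysql://your-host:3306/your_database', 'your_user', 'your_password');
      conn.setAutoCommit(false);
      var stmt = conn.prepareStatement(
          'INSERT INTO your_table (source_file, col_a, col_b) VALUES (?, ?, ?)');

      while (files.hasNext()) {
        var file = files.next();
        var rows = SpreadsheetApp.openById(file.getId()).getSheets()[0]
            .getDataRange().getValues();
        for (var i = 1; i < rows.length; i++) {        // skip the header row
          stmt.setString(1, file.getName());
          stmt.setString(2, String(rows[i][0]));
          stmt.setString(3, String(rows[i][1]));
          stmt.addBatch();
        }
        stmt.executeBatch();   // one batch per file
        conn.commit();
      }
      conn.close();
    }

Committing one batch per file means a timeout mid-run only loses the file currently being processed, which makes the back-off/resume idea above easier to add.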

Connect Google Drive to Cloud SQL

I am looking to combine Google's Cloud SQL service with Google Drive. Essentially I want to use Google Forms for the user to easily input data, and then have that data feed into the Cloud SQL environment (from which I can do reporting and analysis).
My question is, has anyone done this already, or does anyone have ideas of how this might be accomplished? I already have Google Forms writing to spreadsheets, and that works fine. I am familiar with SQL, so creating reports and pulling data from the Cloud SQL environment shouldn't be hard... but I don't know how to connect the two.
Ideally I would like something to run on a schedule (maybe on a nightly basis) to pull the data from a Google spreadsheet, update the Cloud SQL database, and clear the old data from the spreadsheet.
I think the best way to do this would be to use Google Apps Script.
Apps Script allows you, for a given form, to define a function that runs on an installable "on form submit" trigger, called whenever someone submits a response to the form.
Then you can use the JDBC service to connect to Cloud SQL.
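If you prefer the nightly batch approach described in the question over per-submission inserts, a time-driven trigger can do the sync instead. A minimal sketch, assuming the form responses land in the first sheet of the linked spreadsheet; the instance connection name, database, table and column mapping are placeholders:

    function nightlySync() {
      // Placeholder: ID of the spreadsheet linked to the form.
      var sheet = SpreadsheetApp.openById('linked-response-spreadsheet-id').getSheets()[0];
      var rows = sheet.getDataRange().getValues();
      if (rows.length < 2) return;                     // nothing but the header row

      // Placeholder Cloud SQL instance connection name, database and credentials.
      var conn = Jdbc.getCloudSqlConnection(
          'jdbc:google:mysql://your-project:your-region:your-instance/your_database',
          'your_user', 'your_password');
      var stmt = conn.prepareStatement(
          'INSERT INTO responses (submitted_at, answer) VALUES (?, ?)');
      for (var i = 1; i < rows.length; i++) {
        stmt.setString(1, String(rows[i][0]));
        stmt.setString(2, String(rows[i][1]));
        stmt.addBatch();
      }
      stmt.executeBatch();
      conn.close();

      sheet.deleteRows(2, rows.length - 1);            // clear the rows just imported
    }

    // Run this once to schedule the sync nightly.
    function createNightlyTrigger() {
      ScriptApp.newTrigger('nightlySync').timeBased().everyDays(1).atHour(2).create();
    }

The per-submission trigger from the answer above avoids having to clear the sheet, while the nightly batch keeps database traffic to one connection per day; either way the JDBC call is the same.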

Sync data from Google spreadsheet to Cloud SQL using Google Apps Script

I am creating an application where I will be inserting spreadsheet data into Cloud SQL using the JDBC service. I would like to know the answers to the following:
Q1. What will happen if the insert statement is executing and someone closes the spreadsheet/script?
Q2. Is there any possibility of inserting the data into Cloud SQL using Google Apps Script in offline mode?
The answer to both questions is actually a similar one: since both Cloud SQL and Apps Script are entirely cloud-based, always-online systems, it doesn't matter if the file is closed or if a user goes offline. The script should run to completion.