Is there any database for a Google Apps Script / Workspace add-on?

I am working on a Google Workspace add-on for Gmail. As part of it, I want to store and index some data so that it can later be aggregated/enriched for insights.
I don't want to use an external (outside Google Workspace) database, as that would have data privacy and security implications. I want to keep the data within the safe precincts of the user's workspace.
Is there any database service within Google Workspace that meets the above requirements?
PS: I am already aware of the Properties Service, which can store configuration values. The data I want to persist is definitely more than that.

There is no such database.
The closest alternatives, as already mentioned, are the Properties Service, which has obvious limitations, and Google Cloud SQL databases, which live in Google Cloud but are not part of Google Workspace.
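If the data set is genuinely small, the Properties Service can serve as a crude key/value store for JSON blobs. Below is a minimal sketch of that idea; note that in Apps Script the store would be `PropertiesService.getScriptProperties()` (or the user/document variants), which is passed in as a parameter here purely so the logic can be exercised outside Apps Script. Keep in mind the per-value and total-storage quotas on properties, so this only suits small data.

```javascript
// Hedged sketch: treating Script Properties as a tiny key/value "database".
// In Apps Script, `store` would be PropertiesService.getScriptProperties();
// here it is a parameter so the functions run in any JavaScript environment.
function saveRecords(store, key, records) {
  // Properties hold strings only, so serialize to JSON.
  store.setProperty(key, JSON.stringify(records));
}

function loadRecords(store, key) {
  var raw = store.getProperty(key);
  return raw ? JSON.parse(raw) : [];
}

function appendRecord(store, key, record) {
  // Read-modify-write; fine for small, low-contention data sets only.
  var records = loadRecords(store, key);
  records.push(record);
  saveRecords(store, key, records);
  return records.length;
}
```

Once the data outgrows the properties quotas, moving to Cloud SQL via the JDBC service (or out of Apps Script entirely) is the usual next step.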

Related

Apps Script: will a linked GCP project affect quotas?

I'm confused about one point regarding GCP. I have different Apps Script projects on different Google accounts, and I want to link them to a single GCP project (at the organization level) to be able to see the logs in a single place (I followed this description). Everything works fine, but I don't understand whether it will affect quotas somehow. From my understanding, the quotas described here are per user, but I worry that I missed something and that, because of the linked GCP project, the quota will be split between all the Apps Script projects.
I would appreciate any help and advice related to this question.
I don't think there's any limit on read/write operations for Sheets using Apps Script; there's nothing about it in the quotas, and it wouldn't make much sense.
There are quotas if you use the Sheets API over HTTP:
https://developers.google.com/sheets/api/limits
but that's an entirely different story, and you only need the API to access Sheets from other environments (Node.js, PHP, Python, etc.).
If you're using Apps Script, I believe you're only limited by script runtime and total trigger runtime, which are counted per script.

Best Practice to Store Keys in Google Apps Script

I would like to ask about best practices for storing keys within the Google Apps Script environment.
Currently, while I am prototyping, I just store the keys as local variables in the file where they are used. This key is used by all users of my Apps Script project, regardless of their domain.
Moving forward, however, I would like to keep this key in more reliable storage, and would like to ask for advice on how best to safeguard it.
Currently, I am thinking of:
Using PropertiesService.getScriptProperties().setProperty(key, value), since script properties are shared by all users.
Storing it as part of the manifest? Is there a way to add userData in the contextual and homepage triggers?
Or just keep using local variables, since the code is not visible to the users anyway?
Thank you all in advance.
I understand that you're asking about the best way to store a static key that would be retrieved by anybody who runs your Apps Script project, regardless of their domain. I also assume that the script runs as the owner (you) and that end users shouldn't be able to read the key; please leave a comment if that isn't the case. In this situation you have the following approaches:
The most straightforward approach would be to use the Properties Service. In this particular scenario the key should be accessible to anyone executing the script, therefore PropertiesService.getScriptProperties() is the way to go (you can learn more about other scenarios here).
As an alternative, you could store the key in your own database and use the JDBC Service to access it. This method is compatible with Google Cloud SQL, MySQL, Microsoft SQL Server and Oracle databases. Here you can learn more about reading from a database.
Another possible choice is the Drive API. You could take advantage of the application data folder, since it is intended to store files that the user shouldn't directly interact with. For this option you would need to use the Advanced Drive Service in Apps Script.
Please be aware that an admin of your domain could access (or gain access to) the stored key. Also, please check the quota limits to see whether they fit your usage.
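To make the first approach concrete, here is a minimal sketch of reading a shared secret from script properties. In Apps Script you would call it as `getApiKey(PropertiesService.getScriptProperties())`; the `store` parameter and the `API_KEY` property name are illustrative choices, not anything the platform mandates.

```javascript
// Hedged sketch: fetching a shared key from Script Properties.
// `store` stands in for PropertiesService.getScriptProperties() so the
// function can be exercised outside the Apps Script runtime.
function getApiKey(store) {
  // The key is set once, out of band, via setProperty() or the
  // project settings UI, so it never appears in source code.
  var key = store.getProperty('API_KEY');
  if (!key) {
    throw new Error('API_KEY is not configured in Script Properties');
  }
  return key;
}
```

Failing fast when the property is missing makes misconfiguration obvious, instead of letting an undefined key propagate into API calls.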
As you may have noticed, the PropertiesService provides several methods for storing key/value at the document's level, user's level or script level:
getDocumentProperties()
getUserProperties()
getScriptProperties()
I'd recommend storing a property based on who needs access to it. If only the authenticated user should have access to a property (for example, a setting relevant only to their account, like their locale), go with the user properties. Conversely, if the property is relevant to a document (Google Docs, Google Sheets, etc.), go with the document properties.
With that said, I wouldn't recommend using the script properties in general. The main reason is that a quota applies to the PropertiesService (see the table below). This means that as your add-on gets more and more users, you will hit the quota limit quite rapidly.
| Service | Consumer accounts (e.g. @gmail.com) | Google Workspace accounts |
|--|--|--|
|Properties read/write | 50,000 / day | 500,000 / day |
Source: https://developers.google.com/apps-script/guides/services/quotas
Depending on your use case, you might also be tempted by alternatives to the PropertiesService:
using local variables in your code, as you mentioned;
using the CacheService, which stores data for a limited period of time;
making requests to a remote server where you can query your own database.
We rely heavily on the last option at my company, thanks to the UrlFetchApp service. The main reason is that it allows us to pull a user profile from our database without making updates to the codebase.
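A rough sketch of that "query your own server" pattern is below. The endpoint URL is a made-up placeholder, and `fetchFn` stands in for a thin wrapper around `UrlFetchApp.fetch(url).getContentText()`; it is a parameter here only so the parsing logic can be tested outside Apps Script.

```javascript
// Hedged sketch: pulling a user profile from your own backend.
// In Apps Script, fetchFn would be:
//   function (url) { return UrlFetchApp.fetch(url).getContentText(); }
// The https://example.com endpoint is a hypothetical placeholder.
function fetchUserProfile(fetchFn, email) {
  var url = 'https://example.com/api/profile?email=' + encodeURIComponent(email);
  return JSON.parse(fetchFn(url));
}
```

Encoding the email with `encodeURIComponent` keeps addresses containing `@` or `+` from corrupting the query string.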

Running python script on database file located in google drive

I have a database file which is located on my own Google Drive (private), and it is updated on a daily basis.
I wrote a Python script to track the data in the database file; however, I need to download the DB file to my PC and then run the script locally, every single day.
I am wondering if there are any better options, so I wouldn't have to download the DB file and move it manually.
From some searching on the web, I found that there is no way to run the script in the Google Drive folder (obviously due to security issues), and using Google Cloud Platform is not a real option for me since it's not a free service (and, as I understood it, there is no free trial).
Anyway, any method that would make my life easier would be happily accepted.
Sorry
That's not possible, AFAIK. At least not in the way you have asked the question.
It may be that you are looking for a database hosting service, which, as a rule, is not free. I remember seeing a SQL viewer around; I don't know if it is still available, and I don't think it was accessible via a local Python script.
Google Cloud Platform, like other services, does however offer a free tier, though whether it fits depends on how much use you need and also on your location. It can get quite involved quite quickly, and there are various options to choose from. For example, BigQuery may be something you are interested in.
A Possible Workaround
I assume that the reason you don't want to download it is that doing so is tedious, and not because your network connection or hard drive can't handle it. If so:
The simple workaround may just be to have your Python script automatically download the database via the [Google Drive API](https://developers.google.com/drive), modify or read it, and then upload it again if you need to. This can all be done within one script using the API, so all you would need to do is run the Python script; you wouldn't need to manually download the file and move it into a folder. You would also not need to worry about running over the free tier and getting billed for it.
You could even use the Google Drive desktop software to keep the database file synced locally. Since the file is local, I doubt the software would be able to block a Python script from accessing it.
If you don't even want to think about it, you could set up the Python script with a cron job, or a scheduled task if you are on Windows.
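The download-query-reupload loop can be sketched as below. Assuming the file is SQLite, the actual Drive transfer steps are left as injectable functions (`download_fn` / `upload_fn`), because the real versions would use the Drive API via `googleapiclient` (e.g. `files().get_media` with `MediaIoBaseDownload`, and `files().update` with `MediaFileUpload`), which needs OAuth credentials; the surrounding workflow is the part shown here.

```python
# Hedged sketch of the daily "download, query, re-upload" loop.
# download_fn(path) and upload_fn(path) stand in for Google Drive API
# calls (which require credentials); here they just move bytes so the
# workflow itself can run anywhere. Assumes the DB file is SQLite.
import os
import sqlite3
import tempfile


def process_database(download_fn, upload_fn, query):
    # Work on a throwaway temp copy so nothing is left behind on failure.
    fd, local_path = tempfile.mkstemp(suffix=".db")
    os.close(fd)
    try:
        download_fn(local_path)          # e.g. MediaIoBaseDownload from Drive
        conn = sqlite3.connect(local_path)
        rows = conn.execute(query).fetchall()
        conn.close()
        upload_fn(local_path)            # e.g. files().update if you modified it
        return rows
    finally:
        os.remove(local_path)
```

Wrapped in a cron job (or a Windows scheduled task), this removes every manual step except keeping the credentials valid.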

Executing SSH commands from Google Apps script

I'm trying to create a Google Apps Script that adds a new user to an Ubuntu VM I've created whenever a form is submitted. I'm wondering if there is some way to initiate an SSH connection from a Google Apps Script that would allow me to log in to the VM and create a new user. I have the IP and login credentials for the VM. I've set it up so that the script will run whenever a form is submitted, but I'm not sure where to go from there. I apologize in advance if there is a better way to do this; I could just manually create the accounts based on form submissions, but I really need the automation. If there is a solution to this, even one that doesn't involve SSH, I would really appreciate the help!
This is not trivial.
Google Apps Script does not support SSH out of the box, so you have to work around that.
The user Perhaps you see this name has given you a great idea; I'll explain below how to do what they suggested:
What you will need on the linux machine
A web service on the machine, callable from Google's IPs (you can whitelist them, or leave the service open to the public, which is dangerous and should be done only as a last resort).
An account with user-creation permission on Linux.
A script to create the new users from the data received by the web service.
For the first part, you can use any technology you want. I recommend Node.js + Express, as it is easy to build what you want with child processes.
I'll assume you already have a user account that is able to create users; you probably want to use that.
The last part is just another Linux command. You can Google it and you'll find lots of examples.
There is one catch: while real-time user creation via an API might look enticing, I would strongly advise against leaving a public service exposed for something like creating users, as it could become a security risk.
What you might want to do instead is have a machine with nothing of value on it (i.e. a cheap machine you planned to throw away, with no important data and no confidential information) host your web service, and then have a script on your Ubuntu VM fetch the data from said service in an encrypted, secure way.
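The server-side handler described above might look roughly like this. It is a sketch, not a hardened service: `runCommand` stands in for Node's `child_process.execFile` so the validation logic can be tested without actually creating system users, and the username pattern is one reasonable choice, not a standard.

```javascript
// Hedged sketch of the web-service handler that creates a Linux user
// from a form submission. runCommand stands in for child_process.execFile
// (command + argument array, so no shell interpolation of user input).
function handleCreateUser(body, runCommand) {
  var username = body && body.username;
  // Whitelist the characters allowed in a username; reject everything else.
  // Never build a shell string from raw form input.
  if (!username || !/^[a-z][a-z0-9_-]{0,31}$/.test(username)) {
    return { status: 400, message: 'invalid username' };
  }
  runCommand('useradd', ['-m', username]);
  return { status: 200, message: 'created ' + username };
}
```

Passing the command and its arguments as separate values (rather than one concatenated string) is what keeps a malicious form answer like `x; rm -rf /` from ever reaching a shell.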

Saving one record using Google Cloud Functions and then overwriting it

I'm working with the Gmail API and need to save the historyId to determine the changes that happened in the mailbox based on Pub/Sub events.
However, I don't need to store all the historyIds; I just need to pull the old historyId, use it in my function, and overwrite it with the new one.
I'm wondering what kind of architecture would be best for this. I can't use the temporary storage of Google Cloud Functions because it is not persistent.
Using Google Sheets requires extra authorization within the Cloud Function. Do I really need to make a new Cloud Storage bucket for one text file?
It seems that Cloud Datastore would be your best alternative to Cloud Storage if your use case is to persist, retrieve and update the historyId as log data at low cost. Using Google Cloud Functions with Cloud Datastore would give you a serverless log system.
Datastore is a NoSQL document database built for automatic scaling, high performance, and ease of application development. It can handle large amounts of non-relational data at a relatively low price, and it also has a user-friendly web console.
I found a very useful web tutorial which you can use to help you architect a Cloud Functions with Cloud Datastore solution like so:
Create Cloud Datastore
Obtain Datastore Credential
Update your Code
Deploy your Cloud Function
Send a Request to Cloud Function
Check the Log on Datastore
Take a look at the full tutorial here. Hope this helps you.
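The core read-then-overwrite step inside the Cloud Function reduces to a few lines. In this sketch, `store` stands in for a Datastore client (a real one, e.g. `@google-cloud/datastore`, exposes asynchronous `get`/`save` calls on keys, and the `'gmail-history'` key name is just a placeholder); it is reduced to synchronous `get`/`set` here so the rotation logic itself is visible and testable without credentials.

```javascript
// Hedged sketch: keep exactly one historyId, returning the previous
// value and overwriting it with the new one on every Pub/Sub event.
// `store` stands in for a Datastore client reduced to get/set;
// a real client's calls are async and take Datastore keys.
function rotateHistoryId(store, newId) {
  var oldId = store.get('gmail-history'); // undefined on the very first event
  store.set('gmail-history', newId);      // single entity, overwritten each time
  return oldId;
}
```

Because only one entity is ever written, this stays comfortably inside Datastore's free tier, which is the main point of choosing it over a bucket for a single value.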