Executing SSH commands from Google Apps Script

I'm trying to create a Google Apps Script that adds a new user to an Ubuntu VM I've created whenever a form is submitted. I'm wondering if there is some way to initiate an SSH connection from a Google Apps Script that would let me log in to the VM and create a new user. I have the IP and login credentials for the VM, and I've set the script up to run whenever a form is submitted, but I'm not sure where to go from there. I apologize in advance if there is a better way to do this; I could just manually create the accounts based off form submissions, but I really need the automation. If there is a solution to this, even one that doesn't involve SSH, I would really appreciate the help!

This is not trivial.
Google Apps Script does not support SSH out of the box, so you have to work around that.
The user Perhaps you see this name has given you a great idea. I'll explain below how to do what he suggested:
What you will need on the Linux machine:
- A web service, callable from Google's IPs (you can whitelist them, or leave the service open to the public, which is dangerous and should be done only as a last resort).
- An account with user-creation permission on Linux.
- A script to create the new users from the data received by the web service.
For the first part, you can use any technology you want. I recommend Node.js + Express.js, as it is easy to build what you want with child processes.
I'll assume you already have a user account able to create users. You probably want to use that.
The last part is just another Linux command. You can Google it and you'll find lots of examples.
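Here is a minimal sketch of both halves, assuming Node.js + Express on the VM and an installable on-form-submit trigger on the Apps Script side. The endpoint path, port, header name, and form field title are illustrative placeholders, not a prescribed API:

```js
// server.js -- runs on the Ubuntu VM (npm install express).
const express = require('express');
const { execFile } = require('child_process');

const app = express();
app.use(express.json());

// Shared secret so random visitors can't create accounts; set it in the environment.
const SECRET = process.env.CREATE_USER_SECRET;

app.post('/create-user', (req, res) => {
  if (req.get('X-Auth-Token') !== SECRET) return res.status(403).send('Forbidden');

  const username = req.body.username;
  // Accept only safe usernames; never pass raw form input to a shell.
  if (!/^[a-z][a-z0-9_-]{0,31}$/.test(username || '')) {
    return res.status(400).send('Invalid username');
  }
  // execFile avoids shell interpolation; sudo must allow this command for the service account.
  execFile('sudo', ['useradd', '-m', username], (err) => {
    err ? res.status(500).send('useradd failed') : res.send('User created');
  });
});

app.listen(8443);
```

And the Apps Script side that calls it when the form is submitted (the 'Username' field title is an assumption):

```js
// Bound to the form's response spreadsheet, with an installable on-form-submit trigger.
function onFormSubmit(e) {
  const username = e.namedValues['Username'][0];
  UrlFetchApp.fetch('https://your-vm.example.com/create-user', {
    method: 'post',
    contentType: 'application/json',
    headers: { 'X-Auth-Token': 'the-shared-secret' },
    payload: JSON.stringify({ username: username }),
  });
}
```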
There is one catch: while the real-time user creation option with APIs might look enticing, I would strongly advise against leaving a public service exposed for something like creating users, as it could become a security risk.
What you might want to do instead is have a machine with no value (i.e. a cheap machine you planned to throw away, with no important data and no confidential information) host the web service, and then have a script on your Ubuntu VM fetch the data from that service in an encrypted, secure way.

Related

Best Practice to Store Keys in Google Apps Script

I would like to ask about best practices for storing keys within the Google Apps Script environment.
Currently, while I am prototyping, I just store the keys as local variables in the file where they are used. This key is used by all users of my Apps Script regardless of their domain.
Moving forward, however, I would like to keep this key in more reliable storage, and would like advice on how best to safeguard it.
Currently, I am thinking of:
- Using PropertiesService.getScriptProperties().setProperty(key, value), as this is shared by all users.
- Storing it as part of the manifest? Is there a way to add userData in the contextual and homepage triggers?
- Or just keeping the local variables, as the code is not visible to the users anyway?
Thank you all in advance.
I understand that you are asking about the best way to store a static key that will be retrieved by anybody who runs your Apps Script project, regardless of their domain. I also assume that the script is being run as the owner (you) and that the final users shouldn't be able to read the key; please leave a comment if that isn't the case. With this situation you have the following approaches:
The most straightforward approach would be to use the Properties Service. In this particular scenario the key should be accessible to anyone executing the script, therefore PropertiesService.getScriptProperties() is the way to go (you can learn more about other scenarios here).
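For example, a minimal sketch: the owner stores the key once (say, by running a setup function manually), and the rest of the project only ever reads it. The property name API_KEY is illustrative:

```js
// Run once, manually, as the owner.
function storeKey() {
  PropertiesService.getScriptProperties().setProperty('API_KEY', 'your-key-here');
}

// Called from the rest of the project at runtime.
function getKey() {
  return PropertiesService.getScriptProperties().getProperty('API_KEY');
}
```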
As an alternative you could store the key in your own database and use the JDBC Service to access it. This method works with Google Cloud SQL, MySQL, Microsoft SQL Server and Oracle databases. Here you can learn more about reading from a database.
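A hedged sketch of that read, assuming a MySQL instance; the connection string, credentials, table and column names are all placeholders:

```js
function getKeyFromDb() {
  const conn = Jdbc.getConnection('jdbc:mysql://203.0.113.10:3306/secrets', 'dbUser', 'dbPassword');
  const stmt = conn.createStatement();
  const results = stmt.executeQuery("SELECT value FROM api_keys WHERE name = 'default'");
  let key = null;
  if (results.next()) {
    key = results.getString(1);
  }
  // Always release JDBC resources when done.
  results.close();
  stmt.close();
  conn.close();
  return key;
}
```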
Another possible choice is the Drive API. You could take advantage of the application data folder since it is intended to store any files that the user shouldn't directly interact with. For this option you would need to use the Advanced Drive Service on Apps Script.
Please be aware that an admin from your domain could access (or gain access) to the stored key. Also please check the quota limits to see if it fits your usage.
As you may have noticed, the PropertiesService provides several methods for storing key/value pairs at the document level, user level, or script level:
getDocumentProperties()
getUserProperties()
getScriptProperties()
I'd recommend storing a property based on who needs access to it. If only the authenticated user should have access to a property (for example a setting relevant only to their account, such as their locale), go with the UserProperties. Conversely, if the property is relevant to a document (Google Docs, Google Sheets, etc.), go with the DocumentProperties.
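As a quick illustration of that scoping (the property names are made up):

```js
// Per-user setting: each user sees only their own value.
function saveUserLocale(locale) {
  PropertiesService.getUserProperties().setProperty('LOCALE', locale);
}

// Per-document setting: shared by everyone working on this document.
function saveDocumentTheme(theme) {
  PropertiesService.getDocumentProperties().setProperty('THEME', theme);
}
```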
With this said, I wouldn't recommend using the ScriptProperties in general. The main reason is that quotas apply to the PropertiesService (see the table below). This means that as your add-on gets more and more users, you will hit the quota limit quite rapidly.
| Service | Consumer accounts (e.g. @gmail.com) | Google Workspace accounts |
|--|--|--|
| Properties read/write | 50,000 / day | 500,000 / day |
Source: https://developers.google.com/apps-script/guides/services/quotas
Depending on your use case, you might also be tempted by alternatives to the PropertiesService:
- using local variables in your code, as you mentioned
- using the CacheService, which stores data for a limited period of time
- making requests to a remote server where you can query your own database
We rely heavily on the last one at my company, thanks to the UrlFetchApp service. The main reason is that it allows us to pull a user profile from our database without making updates to the codebase.
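A sketch combining the last two alternatives: fetch the key from your own server and cache it so you don't refetch on every execution. The endpoint URL and JSON shape are assumptions for illustration:

```js
function getRemoteKey() {
  const cache = CacheService.getScriptCache();
  const cached = cache.get('API_KEY');
  if (cached) return cached;

  const response = UrlFetchApp.fetch('https://api.example.com/my-key');
  const key = JSON.parse(response.getContentText()).key;
  cache.put('API_KEY', key, 21600); // cache for 21600 seconds (6 hours, the maximum TTL)
  return key;
}
```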

Running python script on database file located in google drive

I have a database file located on my own Google Drive (private), and it's updated on a daily basis.
I wrote a Python script to track the data in the database file; however, I need to download the DB file to my PC and then run my script locally, every single day.
I am wondering if there are any better options, so I wouldn't have to download the DB file and move it manually.
From a little searching on the web I found that there is no way to run the script in the Google Drive folder itself (obviously due to security issues), and using Google Cloud Platform is not a real option for me since it's not a free service (and as I understood it, there is no free trial).
Anyways, any method that would make my life easier would be happily accepted.
Sorry
That's not possible AFAIK, at least not in the way you have asked the question.
It may be that you are looking for a database hosting service, which, as a rule, is not free. I remember seeing a SQL viewer around; I don't know if it is still available, and I don't think it was accessible from a local Python script.
Google Cloud Platform, like other services, does however offer a free tier, though what it covers depends on how much usage you need and on your location. It can also get quite involved quite quickly, and there are various options to choose from. For example, BigQuery may be something you are interested in.
A Possible Workaround
I assume that the reason you don't want to download it is that it is tedious, not that your network connection or hard drive can't handle it. If so:
The simple workaround may just be to have your Python script download the database automatically via the [Google Drive API](https://developers.google.com/drive), modify or read it, and then upload it again if you need to. This can all be done within one script using the API, so all you would need to do is run the Python script, and you wouldn't need to manually download the file and move it into a folder. You would also not need to worry about going over a free tier and getting billed for it.
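A minimal sketch of that download, modify, upload loop, shown here with the Node.js Drive v3 client (the google-api-python-client follows the same files get/update pattern if you stay in Python). The file ID, credentials file, and local path are placeholders:

```js
// npm install googleapis
const { google } = require('googleapis');
const fs = require('fs');

async function syncDatabase() {
  const auth = new google.auth.GoogleAuth({
    keyFile: 'credentials.json', // service account with access to the file
    scopes: ['https://www.googleapis.com/auth/drive'],
  });
  const drive = google.drive({ version: 'v3', auth });
  const fileId = 'YOUR_FILE_ID';

  // 1. Download the database file to disk.
  const res = await drive.files.get(
    { fileId, alt: 'media' },
    { responseType: 'stream' }
  );
  await new Promise((resolve, reject) =>
    res.data.pipe(fs.createWriteStream('local.db')).on('finish', resolve).on('error', reject)
  );

  // 2. ...read or modify local.db here...

  // 3. Upload the modified file back, overwriting the original.
  await drive.files.update({
    fileId,
    media: { body: fs.createReadStream('local.db') },
  });
}

syncDatabase().catch(console.error);
```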
You could even use the Google Drive desktop client to keep the database file synced locally. Since the copy is local, I doubt the software would be able to block a Python script from reading it on the file system.
If you don't even want to think about it, you could set the Python script up with a cron job, or a Scheduled Task if you are on Windows.
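For instance, a crontab entry like this (the path and script name are placeholders) would run the sync every day at 06:00:

```
0 6 * * * /usr/bin/python3 /home/me/sync_database.py
```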

Preparing to switch to google compute engine for web hosting

I'm currently in the process of switching to Google Compute Engine for my web hosting, because my current provider's performance has been deteriorating over time and Compute Engine gives me more flexibility to upgrade as I need to.
I've got my website set up and working on the engine, but the next steps need to go smoothly to ensure my customers don't experience any downtime.
I have a few things I need to work out:
- Does Google have a way of managing email addresses at your own domain, so I can just send and receive from gmail.com or another email client on my domain? Or do I have to set up an email server on my VM? If so, is there any way to set up cPanel-like management software on it?
- To my understanding, I should just have to call my current provider, ask them for my SSL certificates, and have them switch my domain over to Google and then point it to my VM? Or is there something I'm missing here?
- Are there any simple ways to ensure my server stays secure when I'm managing it myself, other than just updating packages manually? Like a website I can use to track known security problems with the packages I have installed?
Edit:
Please read Dan Cornilescu's comment on this question about setting up your own custom domain email. He said it can possibly be managed using Google Apps.
On the topic of SSL/domains: I called my current provider and they said they would help me switch over if that's what I decided. They also upgraded my hosting plan, and things seem better now, comparable to the performance I was getting on my Google VM, so I'll be trying that for now.

Connecting MySQL Database with Google Chrome Extension?

Is it possible to do this?
I'm familiar with MySQL when it comes to web apps, but I'm learning how to create a Google Chrome extension, and I'd like to somehow connect to the database from my existing web app. I have created a form in the popup.html from Google's example, and would like to send the input values to a script that inserts the data into MySQL when the user submits. How would I do that?
Right now I'm in the development stage, and loading the Chrome extension is easily done without uploading it to my server. So I'm just wondering how I can connect to the MySQL database if it's not on localhost?
Thanks!
What you don't want is to give your Chrome extension direct access to the database. That's a security nightmare of biblical proportions.
What you want to do is create a remote web service that the extension can talk to, which then handles your database operations.
This can be as simple as a PHP script that you call and that returns a quick bit of JSON, or something vastly more complex.
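On the extension side, a hedged sketch of what that call might look like from popup.js; the endpoint URL and element IDs are placeholders, and the manifest needs host permissions for the domain:

```js
// popup.js -- send the form values to your own web service instead of
// talking to MySQL directly; the service performs the INSERT server-side.
document.getElementById('myForm').addEventListener('submit', async (event) => {
  event.preventDefault();
  const payload = { name: document.getElementById('name').value };

  const response = await fetch('https://example.com/api/submit.php', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
  const result = await response.json(); // e.g. {"ok": true}
  console.log(result);
});
```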

Linux web front-end best practices

I want to build a web-based front-end to manage/administer my Linux box. E.g. I want to be able to add users, manage the file system and all those sorts of things. Think of it as a cPanel clone, but more for system admin rather than web admin.
I was thinking about creating a service that runs on my box and performs all the system-level tasks. This way I can have a clear separation between my web-based front-end and the actual logic. The server pages can then make calls to my specialized service, or queue tasks that way. However, I'm not sure if this would be the best way to go about it.
I guess another important question would be: how would I deal with security when building something like this?
PS: This is just a pet project and learning experience, so I'm not interested in existing solutions that do a similar thing.
Have the specialized service daemon running as a distinct user -- let's call it 'managerd'. Set up your /etc/sudoers file so that 'managerd' can execute the various commands you want it to be able to run, as root, without a password.
Have the web server drop "trigger" files containing the commands to run in a directory that is mode '770' with a group that only the web server user and 'managerd' are members of. Make sure that 'managerd' verifies that the files have the correct ownership before executing the command.
Make sure that the web interface side is locked down -- run it over HTTPS only, require authentication, and if at all possible, put in IP-specific ACLs so that you can only access it from locations known in advance.
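A sketch of the privilege plumbing described above, assuming the web server runs as www-data and the only allowed action is useradd (names and paths are illustrative):

```
# /etc/sudoers.d/managerd (edit with visudo): allow exactly one command
# as root, with no password prompt.
managerd ALL=(root) NOPASSWD: /usr/sbin/useradd
```

And the shared drop directory for the trigger files, writable by the web server's group and readable by 'managerd' only:

```
install -d -m 770 -o managerd -g www-data /var/spool/managerd
```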
Your approach seems like a very sensible solution to the 'root' issue.
A couple of suggestions:
Binding the 'specialised service' to localhost would also help guarantee that requests can't be made externally.
Have the requests call functions that perform specific actions, rather than giving the service full, unrestricted access. So call a function like "addToGroup(user, group)" instead of a generic "performAction(command)".
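A minimal sketch of both suggestions together in Node.js (the route, names, and port are illustrative): one route per allowed action, bound to the loopback interface only.

```js
const express = require('express');
const { execFile } = require('child_process');

const app = express();
app.use(express.json());

// Narrow interface: a dedicated route per action, no generic command runner.
app.post('/add-to-group', (req, res) => {
  const { user, group } = req.body;
  // Validate both names before touching the system.
  const ok = /^[a-z][a-z0-9_-]*$/;
  if (!ok.test(user || '') || !ok.test(group || '')) {
    return res.status(400).send('Invalid name');
  }
  execFile('sudo', ['usermod', '-aG', group, user], (err) => {
    err ? res.status(500).send('failed') : res.send('ok');
  });
});

// Bind to localhost so the service is unreachable from outside the box.
app.listen(8080, '127.0.0.1');
```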