Is there a way to recover Google Cloud Function source code when the bucket does not exist, but it can still run? - google-cloud-functions

I have 13 Google Cloud Functions running in my Google Cloud project. Several of them are triggered by Google Cloud Scheduler. I went to make an edit to one of the functions the other day and got an error when trying to view its "SOURCE" tab.
Looking into this issue, the culprit seems to be that the archive bucket has been deleted or renamed. Going through Container Registry as well as the buckets, I can confirm that they are no longer there. I think another developer on the project may have accidentally deleted them while cleaning up images from an old project.
That said, the logs show that the functions triggered by Cloud Scheduler are still running successfully on their schedules even though I can no longer access the source code. The functions are still logging the data that I console.log within the source code, even though the code itself is no longer accessible.
My question is whether the source code is recoverable and what it would take to find it.
I have tried looking for the images and the buckets to see if they were renamed and I could not find them. I also created a new function in the same region and it spun up a new image and bucket, but it did not contain any reference to the other missing functions.

It is not possible to retrieve the source code of a Google Cloud Function once the bucket that contained it has been deleted. The source archive lives in that bucket, so deleting the bucket deletes the code. The function keeps running because the code was already built and deployed, and execution happens from that deployment rather than from the bucket. You can view the function's logs to see what the code is doing, but you will not be able to retrieve the original source.
If you need the source again, you will need to redeploy the code, which creates a new archive bucket. You can do this by creating a new bucket and uploading the code yourself, or by deploying with the Cloud Functions API or the gcloud command-line tool.
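For example, with the gcloud CLI (the function name, region, runtime, and trigger below are illustrative; match them to what gcloud functions describe reports for your existing function):

# Inspect the still-running function's configuration; for 1st gen functions
# deployed from a zip, sourceArchiveUrl points at the (now deleted) bucket.
gcloud functions describe my-function --region=us-central1

# Redeploy from a local copy of the source; this uploads a fresh archive.
gcloud functions deploy my-function \
  --region=us-central1 \
  --runtime=nodejs20 \
  --trigger-topic=my-schedule-topic \
  --source=.

Note that redeploying still requires a local copy of the source; the CLI cannot reconstruct what was in the deleted bucket.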

Related

Google Apps Script does not save logs

I created a script a few years ago and recently revisited it to try to get the logging working. I added various console.info calls and tested it. It worked fine and I was able to see all the logs during execution.
However, the logs never appear in the Google Apps Script dashboard.
More than 3 hours had passed since the execution by the time I checked.
When consulting the GCP console (by clicking View in Cloud Logging), even when setting a generic query, no logs are found.
I have been reading the documentation and trying to understand what the problem may be. From what I understand, some logging features only work for standard GCP projects, mainly accessing info in the GCP console. My script does not belong to a standard GCP project; it belongs to a default GCP project (they're different things, from what I understood).
But according to the documentation, a default GCP project still allows logs to be accessed through the Apps Script dashboard, so the logs should be visible in the executions. I'm probably missing something, but I'm having a hard time figuring out what. Any suggestions would be welcome.
Minimal Reproducible Example
Here is a code example whose logs appear during execution but not afterwards.
function main() {
  // Fetch the IMDb release calendar and keep only the markup between
  // the #main and #sidebar divs before parsing it as XML.
  const xml = UrlFetchApp.fetch('https://www.imdb.com/calendar/?region=pt').getContentText();
  const xmlSliced = xml.slice(xml.search('<div id="main">'), xml.search('<div id="sidebar">') - 1);
  const document = XmlService.parse(xmlSliced);
  const dates = document.getRootElement().getChildren('h4');
  console.info(`Parsed ${dates.length} release dates`);
}
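For reference, the two logging channels can be compared with a probe like this (a sketch; console output is routed to Cloud Logging, while Logger.log goes to the legacy per-execution Apps Script log):

// Sketch: write the same message through both logging channels to see
// which surface retains it after the execution finishes.
function logProbe() {
  const stamp = new Date().toISOString();
  console.info('console.info probe at %s', stamp);
  Logger.log('Logger.log probe at %s', stamp);
}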

How to link a Project Number to a Script dynamically?

I'm creating forms on demand and I need to be able to set up their triggers. If I do it in the main Apps Script project that handles form creation, I'll hit the 20-trigger quota pretty soon. So I decided to attach a dedicated script to every form (associated via parentId) so I won't hit the quota.
My problem is that when I try to add the triggers I get permission denied, since the scripts created for each form use a default GCP project, and a standard GCP project is required because of OAuth.
I can set the project number manually, but that rather defeats the idea of generating Google Forms on demand.
I want to link each created script to the standard GCP project that is already configured, but I can't find anything in Google's documentation. I know it's a long shot, but I'm posting here in the hope that someone who had the same problem managed to solve it.
As requested in the comments, the triggers I'm using are (a sketch of how they are created follows the list):
onFormSubmit - gets the response and sends it to my endpoint on the backend server
onOpen - ensures the editor hasn't removed any defaults (this only works when an editor opens the form in edit mode)
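For reference, this is roughly how the dedicated scripts create those installable triggers (a sketch; the handler function names are illustrative):

// Sketch: install the two triggers from a form's container-bound script.
// 'sendToBackend' and 'restoreDefaults' are illustrative handler names.
function installTriggers() {
  const form = FormApp.getActiveForm();
  ScriptApp.newTrigger('sendToBackend')
      .forForm(form)
      .onFormSubmit()
      .create();
  ScriptApp.newTrigger('restoreDefaults')
      .forForm(form)
      .onOpen()
      .create();
}

The create() calls here are where the permission-denied error described above appears under a default GCP project.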
For clarity, the project number I'm referring to is the GCP project number shown in the script's project settings.
Thanks

Apps Script Activity Reporting/Visualization

I've been developing an Apps Script project for my company that tracks our time/expenses. I've structured the project like so:
The company has a paid G Suite account that owns all the spreadsheets hosted on the company's Google Drive.
Each employee has their own "user" spreadsheet, which is shared from the company G Suite account with the employee's personal Gmail account.
Each of the user spreadsheets has a container-bound script that accesses a central library script.
The library script allows us to update the script centrally and the effects are immediate for each user. It also prevents users from seeing the central script and meddling with it.
Each of the user container-bound scripts has installable triggers that are authorized by the company account, so the code being run has full authority to do what it needs to do to the spreadsheets.
This setup has been working quite well for us with about 40 users. The drawback to this setup is that since all the script activity is run by the company account via the triggers, the activity of all our users is logged under the single company account and therefore capped by the apps script server quotas for a single user. This hasn't been much of an issue for us yet as long as our script is efficient in how it runs. I have looked into deploying this project as a web-app for our company, but there doesn't seem to be a good way to control/limit user access to the central files. In other words, if this project was running as a web app installed by each user, each user would need to have access to all the central spreadsheets that the project uses behind the scenes. And we don't want that.
So, with that background, here is my question: how do I efficiently track Apps Script activity to see how close we are to hitting our server quota, and identify which of my functions need to be optimized?
I started doing this by writing an entry into an "activity log" spreadsheet every time the script was called. It tracked which function was called and who the user was, and it had a start-time entry and an end-time entry so I could see how long unique executions took and which ones failed. This was great because I had a live view into the project activity and could graph it using the spreadsheet's graphing tools. Where this began to break down was that every execution of the script required two write actions: one for initialization and another for completion. Since the script executes every time a user edits their spreadsheet, during times of high traffic the activity log spreadsheet became inaccessible and errors were thrown all over the place.
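For what it's worth, the usual mitigation for that kind of write contention (a side note, not an answer to the quota question) is to serialize the writes with LockService; a sketch, with the spreadsheet ID and sheet name as placeholders:

// Sketch: serialize appends to a shared log sheet so concurrent
// executions don't collide. ID and sheet name are illustrative.
function logActivity(functionName, user, status) {
  const lock = LockService.getScriptLock();
  lock.waitLock(30 * 1000); // wait up to 30s for other executions
  try {
    SpreadsheetApp.openById('ACTIVITY_LOG_SPREADSHEET_ID')
        .getSheetByName('Log')
        .appendRow([new Date(), functionName, user, status]);
  } finally {
    lock.releaseLock();
  }
}

This reduces collisions but does not remove the two-writes-per-execution overhead described above.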
So I have since transitioned to tracking activity by connecting each script file to a single Google Cloud Platform (GCP) project and using the Logger API. Writing logs is a lot more efficient than writing an entry to a spreadsheet, so the high traffic errors are all but gone. The problem now is that the GCP log browser isn't as easy to use as a spreadsheet and I can't graph the logs or sum up the activity to see where we stand with our server quota.
I've spent some time now trying to figure out how to automatically export the logs from the GCP so I can process the logs in real-time. I see how to download the logs as csv files, which I can then import into a google spreadsheet and do the calcs and graphing I need, but this is a manual process, and doesn't show live data.
I have also figured out how to stream the logs from GCP by setting up a "sink" that transfers the logs to a "bucket" which can theoretically be read by other services. This got me excited to try out Google Data Studio, which I saw is able to use Google Cloud Storage "buckets" as a data source. Unfortunately though, Google Data Studio can only read csv files in cloud storage, and not the json files that my "sink" is generating in my "bucket" for the logs.
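For reference, that kind of sink can be created with the gcloud CLI along these lines (the sink and bucket names are illustrative, and the resource type shown is my assumption about how Apps Script logs are labeled in Cloud Logging):

# Sketch: route logs matching a filter into a Cloud Storage bucket.
gcloud logging sinks create apps-script-activity-sink \
  storage.googleapis.com/my-activity-logs-bucket \
  --log-filter='resource.type="app_script_function"'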
So I've hit a wall. Am I missing something here? I'm just trying to get live data showing current activity on our apps script project so I can identify failed executions, see total processing time, and sort the logs by user or function so I can quickly identify where I need to optimize my script.
You've already wired your Apps Script project up to the GCP side.
Have a look at Metrics Explorer; it lets you see quota usage per resource and auto-generates graphs for you.
But long term I think re-building your solution may be a better idea. At a minimum, switching to submitting data via Google Forms will save you operations.

Permission denied when trying to create a spreadsheet using Google script API

In the corporate environment with a corporate account I have a Google Document. Think of it as a master document that is being maintained.
I want to add a script to it to create a derived spreadsheet that can be used by people who need a subset of data from the master document.
In the master document I created a script file so the script is bound to the document.
Inside the file I defined a function. The function does some searching and processing in the master file without problems, but once I try to create a spreadsheet, or open an existing one to extract data from the master document into a derived document, the call fails.
I tried several approaches:
SpreadsheetApp.openByUrl(url);
SpreadsheetApp.create("My test sheet");
as well as the approach covered in the solution.
I also tried executing from an installed menu trigger and from the debugger - same result.
In all cases I get a pop-up with:
Authorization required
{doc name} needs your permission to access your data on Google.
with two buttons to review and cancel.
Clicking review leads to a message "An unexpected error occurred" at the top of the screen.
I tried different machines and browsers and switched V8 engine on and off.
I also made sure that the document is open to all in my org if I try to open an existing doc by URL.
What am I doing wrong?
It turned out that there were two things.
Adding scopes solved the problem.
I had an invalid scope that was already there (I do not know how it got there, probably from an example) that I preserved in the manifest and might have misspelled. That was a mistake, and it caused all sorts of instability. Only when I tried to execute the script in Firefox instead of Chrome was I able to see a popup hinting that one of the scopes was invalid and needed fixing.
Once I fixed it, everything started to load as expected.
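For reference, explicit scopes are declared in the appsscript.json manifest; a minimal sketch (the scopes listed are illustrative, use the ones your script actually needs):

{
  "oauthScopes": [
    "https://www.googleapis.com/auth/documents",
    "https://www.googleapis.com/auth/spreadsheets"
  ]
}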

ScriptDb in Library accessed by WebApp

I created a Google Apps Script project that is published as a WebApp visible for anyone, running as the accessing user.
This WebApp includes a Library project, which has methods to access ScriptDb as follows:
// Library code: expose the library's own ScriptDb instance plus a save helper.
function getDb() {
  return ScriptDb.getMyDb();
}

function save(props) {
  return getDb().save(props);
}
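The WebApp then calls into the library along these lines (a sketch; it assumes the library identifier is MyDBLibrary, as in the error below):

// WebApp code, sketch: data saved through the library lands in the
// library's ScriptDb regardless of which user runs the WebApp.
function doGet(e) {
  MyDBLibrary.save({
    user: Session.getActiveUser().getEmail(),
    when: new Date().toISOString()
  });
  return ContentService.createTextOutput('saved');
}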
I intend to have one WebApp running per accessing user (using time-based triggers to access Gmail); however, I would like to use ScriptDb to store data independently of the user running the WebApp.
So I thought the chapter Centralizing a Place to Get a Database Instance from the Google Apps Script Documentation applies here, however I am getting the following error when accessing the WebApp:
You do not have access to library MyDBLibrary, used by your script, or it has been deleted.
Am I doing anything wrong? Was ScriptDb not intended to run independently of the user, when being accessed from a WebApp?
Joscha - you have the setup right, and I reproduced everything you've reported. The last thing you were missing was to share the Library project itself with the public.
The way to do that is to:
1. Open the Library project in the script editor.
2. Go to File -> Share.
3. In the "Who has access" section, set it to "anyone with the link can view" (edit access is not needed).
Without this, Apps Script will not know it's OK to serve the library code to the public. Hope this helps.
You may need to turn on the development mode flag to make it work.
Here's what I did:
1. User A: Created a master spreadsheet, added an Apps Script project to it, and saved it as version 1.0.
2. User A: Gave user B access to this sheet so they could use the common Apps Script.
3. User A: Gave the project access key to user B.
4. User B: Created a spreadsheet and added the library as a resource using the project key.
5. User B: Selected version 1.0.
6. User B: Turned on development mode.
Note: if I follow only steps 1 through 5, I still see the same error saying I don't have permission or the script has been deleted. However, also doing step 6 resolves the error and allows you to use the shared script.