Is there any way to make Google Apps Script call functions asynchronously? My scenario is that information is entered into a main spreadsheet, and a script then passes the relevant information on to other spreadsheets.
There are then other functions that manipulate the data in those other spreadsheets. Unfortunately, because of the high volume of data, calling all the functions on one action causes the script to hit the 6-minute timeout.
I tried using the onEdit trigger in the other spreadsheets, but it doesn't seem to work unless the sheets are opened by a user.
As it stands, the user would have to hit 4 different buttons to trigger the various functions without getting a timeout.
Thanks for any help
Blair
Depending on how realtime the updates need to be, you could consider creating a queue that contains all of the updates to be made (perhaps stored in the PropertiesService as a stringified JSON object).
Then your update code could be triggered regularly, say every 5 minutes, to read the next element of the queue and execute that update before removing the entry from the queue. This would mean each individual update fits within the 6-minute window, but it would also mean that if there were 4 additional updates for every update to the main sheet, it might be up to 24 minutes before all of them had been made.
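A minimal sketch of that queue pattern, assuming the queue lives under a script property named QUEUE and that processQueue is attached to a 5-minute time-driven trigger (the property key, function names, and the shape of each queued item are illustrative, and applyUpdate stands in for your existing update logic):

```javascript
// Called from the main-sheet script: push a pending update onto the queue.
function enqueueUpdate(update) {
  var props = PropertiesService.getScriptProperties();
  var queue = JSON.parse(props.getProperty('QUEUE') || '[]');
  queue.push(update); // e.g. { targetSpreadsheetId: '...', values: [...] }
  props.setProperty('QUEUE', JSON.stringify(queue));
}

// Attach this to a time-driven trigger (e.g. every 5 minutes).
function processQueue() {
  var props = PropertiesService.getScriptProperties();
  var queue = JSON.parse(props.getProperty('QUEUE') || '[]');
  if (queue.length === 0) return;

  var next = queue.shift();                          // oldest entry first
  applyUpdate(next);                                 // your existing update logic
  props.setProperty('QUEUE', JSON.stringify(queue)); // drop it only after it succeeds
}
```

Bear in mind that a single script property only holds a few kilobytes, so for a very large queue you may want to keep the pending entries in a hidden sheet instead, and wrap the read-modify-write of the queue in LockService if the main script and the trigger can run at the same time.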
I've been trying to use importXML in Google Sheets to import specific data (in this case, only player name) from several players via the Steam Web API.
I encountered what seems to be a limit with the number of importXML calls I can make in my sheet, because I get loading errors:
Loading data may take a while because of the large number of requests. Try to reduce the amount of IMPORTHTML, IMPORTDATA, IMPORTFEED or IMPORTXML functions across spreadsheets you've created.
This list will likely grow (it's currently at about 170 rows), and I need a way for the sheet to handle that many calls. I don't need the data to update very frequently (even 2-3 times a day is sufficient).
I've tried the code I found in another SO post, but it seems to refresh all the importXML calls at once, so I still got loading errors.
From what I've researched so far, it seems like I'll need to use an Apps Script to optimize my sheet by creating intervals for the calls. Is there a way I could have a script do the following:
Call 25 rows (or whichever limit is optimal)
Wait some amount of time
Call next 25 rows
Continue till the end of the sheet, then restart loop
I'm not too savvy with writing functions, so I don't know how to edit the code to achieve that. Any help would be appreciated.
If you'd like to take a look at the spreadsheet I'm working with, here it is. For now, only Column B has the importXML calls, and the URLs are concatenated using cells in Column H, so there's one importXML call per row.
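One rough sketch of the batching idea described above, assuming the IMPORTXML formulas are in column B starting at row 2 on a sheet named 'Sheet1' (the sheet name, batch size, wait time, and the clear-and-rewrite trick used to force a refresh are all assumptions rather than a tested solution):

```javascript
// Re-writes the IMPORTXML formulas in column B in batches of 25 rows,
// pausing between batches so only a few requests load at a time.
// Run from a time-driven trigger (e.g. 2-3 times a day).
function refreshImportsInBatches() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Sheet1');
  var lastRow = sheet.getLastRow();
  var batchSize = 25;

  for (var row = 2; row <= lastRow; row += batchSize) {
    var numRows = Math.min(batchSize, lastRow - row + 1);
    var range = sheet.getRange(row, 2, numRows, 1); // column B

    var formulas = range.getFormulas(); // remember this batch's formulas
    range.clearContent();
    SpreadsheetApp.flush();
    range.setFormulas(formulas);        // re-setting them forces the batch to reload
    SpreadsheetApp.flush();

    Utilities.sleep(30 * 1000);         // wait before touching the next batch
  }
}
```

Note that the sleeps count toward the 6-minute execution limit, so with a much longer list you would probably want to store the current batch index in PropertiesService and refresh just one batch per trigger run.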
I have created a script in Google Sheets Apps Script that takes a form submission and puts it into a new sheet. It does this by finding the last row with data, but when two forms are submitted at the same time, only one makes it in because the first hasn't had enough time to go through. Is there a way to keep the Google Form open but delay the responses from updating the spreadsheet in intervals of 5 or so seconds? I don't want data to be missed if two forms submit at the same time. Any help is appreciated as I am a COMPLETE beginner.
Casey, I think what you need is LockService. You can set the lock prior to any edits and release it once you've updated the sheet. If any concurrent updates need to happen, they will wait until the lock is released. Full example here.
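A minimal sketch of that pattern in an installable form-submit handler (the handler name and target sheet are illustrative):

```javascript
// Installable onFormSubmit handler; the lock serializes concurrent submissions.
function onFormSubmitHandler(e) {
  var lock = LockService.getScriptLock();
  lock.waitLock(30 * 1000); // wait up to 30 seconds for any other submission to finish
  try {
    var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Processed');
    sheet.appendRow(e.values); // appendRow writes to the next free row in one call
  } finally {
    lock.releaseLock();
  }
}
```

Using appendRow also sidesteps the "find the last row" race, since the row is appended in a single call rather than read-then-write.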
I am familiar with the Lock Service but that is only for locking scripts.
I have some code that will "process" a large Google Sheet. My script needs to re-order the rows. I need/want to make it so while the script is running nobody else can change the order. However, I still need another script to be able to append rows.
We use a Google Form for our team's intake. It appends rows to a sheet. I have an hourly job that will go through all the rows/records and "process them". I have a column that stores the last time a record/row was "processed". I want to sort on that column such that the "oldest" records are on top and then start processing from the top down. If the script fails or times out then the next iteration will just start over...
I know I could use getValues or getDisplayValues to get an array and then write the array back, but I worry about what would happen if someone sorted the rows in the meantime, as it would muck things up when writing the array back.
Is there some way to accomplish my goal? I want to be able to process the records, and maintain row order to avoid breaking my processing.
The way to block a spreadsheet "completely" is by changing the spreadsheet sharing settings. Remove all editors or change them to viewers, and once your script finishes, change them back to editors. In an extreme case, use a second account to act as the owner of the critical files / spreadsheets and use it only for this purpose, so you can block your regular account from making changes to the spreadsheet.
NOTE: A Google Form editResponseUrl could be used to edit the linked spreadsheet.
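A hedged sketch of that sharing-settings approach (the function and property names are illustrative; the removed editors are saved to script properties so they can be restored when the script finishes):

```javascript
var EDITORS_KEY = 'SAVED_EDITORS'; // illustrative property key

// Demote all editors to viewers before the long-running script starts.
function lockSpreadsheet(ss) {
  var emails = ss.getEditors().map(function (user) { return user.getEmail(); });
  PropertiesService.getScriptProperties().setProperty(EDITORS_KEY, JSON.stringify(emails));
  emails.forEach(function (email) {
    ss.removeEditor(email);
    ss.addViewer(email);
  });
}

// Restore the original editors once the script has finished.
function unlockSpreadsheet(ss) {
  var props = PropertiesService.getScriptProperties();
  var emails = JSON.parse(props.getProperty(EDITORS_KEY) || '[]');
  emails.forEach(function (email) { ss.addEditor(email); });
  props.deleteProperty(EDITORS_KEY);
}
```

The file's owner cannot be demoted this way, which is why the answer suggests keeping a separate account as the owner of the critical spreadsheets.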
I'm facing a similar situation but I took a different approach: I'm using an index/key column (you could use the timestamp column) and using the index/key to save each edited row back to the right position, then writing the whole resulting array in a single operation (using setValues()). In my case this is simple because I only need the values; I'm not worried about notes, data validation, conditional formatting, comments, etc., and there isn't a Google Form linked to my spreadsheet.
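A minimal sketch of that key-column idea, assuming the key (or timestamp) is in column A under a header row, and that a hypothetical processRow() does the per-row work:

```javascript
function processAllRows() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Intake'); // assumed sheet name
  var data = sheet.getDataRange().getValues();

  // Process each row and remember the result by its key in column A (skip the header).
  var processedByKey = {};
  for (var i = 1; i < data.length; i++) {
    processedByKey[data[i][0]] = processRow(data[i]); // hypothetical per-row logic
  }

  // Re-read the sheet and write each processed row back to wherever its key lives now,
  // so a re-sort (or appended rows) between the read and the write doesn't scramble anything.
  var current = sheet.getDataRange().getValues();
  for (var j = 1; j < current.length; j++) {
    var key = current[j][0];
    if (processedByKey[key]) {
      current[j] = processedByKey[key];
    }
  }
  sheet.getRange(1, 1, current.length, current[0].length).setValues(current);
}
```

As noted above, this only preserves values; notes, validation, and formatting on re-sorted rows are not carried along with them.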
I work in an area with about 200 school-based collaborators, and we all have access to the same Google domain. Using Google Apps Script (which pulls from a Google Spreadsheet generated by a Python cron job), I am programmatically creating a pre-conference form that is custom-developed for each of the schools, yielding about 200 forms. This all works fine.
I need to attach a trigger to the submission of each of these forms. I have written and tested the handler function. All works fine.
The problem is I cannot propagate the submission logic to all of the forms because of the 20 triggers / user / script quota limit. Each of these forms is only going to have 1 response, and it would make sense to me that I should be able to have 1 trigger / form (I will definitely be under the 6 hrs/day processing limit for triggers).
So, this issue occurs because the Apps Script trigger limit is counted against the script the trigger calls, not the Form that fires it. Is there another way to set up the Google services to avoid this quota? I can be flexible, and I've already created a batch service to create the forms without violating the 6-minute maximum execution time, but it seems like any attempt to have one script create more than 20 forms, each with 1 trigger, will lead to failure.
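For reference, the per-form triggers are created along these lines, and every call to create() below counts against the same 20-trigger quota for the script, regardless of which form it is attached to (the function and handler names are illustrative):

```javascript
// Creates one form per school and attaches a submit trigger to each.
// Each trigger counts against the per-user, per-script quota.
function createFormsWithTriggers(schoolNames) {
  schoolNames.forEach(function (name) {
    var form = FormApp.create('Pre-conference form: ' + name);
    ScriptApp.newTrigger('handleSubmission') // same handler for every form
      .forForm(form)
      .onFormSubmit()
      .create();                             // fails once the quota is exhausted
  });
}
```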
I have multiple scripts that I want to run on data in my spreadsheet. Some of the data is populated by a random number function, so it changes every time the spreadsheet is updated/edited.
It seems that my scripts cause the spreadsheet to update even when they don't explicitly edit any cells. This causes all of the data to change mid-script and messes things up.
Is there any way to stop the spreadsheet from updating while a script is running?
No, I don't think so. You'll need to come up with a way of structuring the retrieval and updating of data, and the timing of the flow of events. You could check for certain conditions, and halt the script if need be. You could set key cell values from a script, rather than a function in the cell.
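For the last suggestion, a small sketch: instead of =RAND() formulas in the cells, you could generate the random numbers inside the script and write them once, so they stay fixed while the rest of the script runs (the target range is illustrative):

```javascript
// Writes fixed random values into C2:C101 instead of relying on volatile =RAND() formulas.
function writeStableRandoms() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getActiveSheet();
  var values = [];
  for (var i = 0; i < 100; i++) {
    values.push([Math.random()]); // one column of random numbers
  }
  sheet.getRange(2, 3, values.length, 1).setValues(values); // column C, assumed location
}
```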