I am totally new to Google Apps Script, so this may be a very poor question.
I am building a basic setup using Google Forms and Google Apps Script.
Based on the form responses, I change the content of a Google Spreadsheet accordingly.
For example, one query from the form needed 10,000 records to be selected and written into another spreadsheet.
I just wanted to know: is there some kind of delay introduced when I set and get cell values on such a large scale? If so, what does it depend on, and how can I, as a programmer, remove or optimize it?
Thanks in advance!
The Best Practices article by Google is the primary reference for this. The most important advice is to minimize the number of calls to spreadsheet methods by batching operations. So, select a range that contains all the records you need, get them all at once with getValues, process without further interaction with the spreadsheet, and output the result using setValues.
If you follow this advice, 10,000 records is still a reasonable amount of data for a script to process.
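As a hedged sketch of that batching pattern (the sheet name and the transformation are placeholders, not from the question):

function processRecords() {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Records'); // placeholder sheet name
  var range = sheet.getDataRange();
  // One read for all records instead of one call per cell.
  var values = range.getValues();
  // Process entirely in memory; no spreadsheet calls inside the loop.
  for (var i = 0; i < values.length; i++) {
    values[i][0] = String(values[i][0]).trim(); // example transformation
  }
  // One write for all records.
  range.setValues(values);
}

Each getValue()/setValue() pair is a round trip to the spreadsheet, so the delay grows with the number of calls rather than the amount of data; the two batched calls above stay fast even for 10,000 rows.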
I have 50 Google Sheets files for 50 students. They need to key in their answers in their own file when they are asked to do so. I have a main Google Sheet that consolidates their data using the IMPORTRANGE formula. This is my formula:
=QUERY({IMPORTRANGE(...);IMPORTRANGE(...);IMPORTRANGE(...);...},"Select * where Col1 is not null")
I will have 50 IMPORTRANGE calls in the formula. As expected, the main Google Sheet is very laggy when the 50 students start to key in their answers at the same time. Sometimes the formula shows #VALUE! once all the students start answering in their own files. I need to keep refreshing the main sheet for the data to appear, but it disappears again shortly afterwards and I have to refresh once more (although it settles down once most of the students have finished answering).
I know that using IMPORTRANGE is not an efficient way to consolidate their answers into the main sheet, but I don't have a better way.
I tried writing a script so that they can send in their data by clicking a button assigned to the script. However, every student has to go through the authorization process the first time they run it, and they don't know how to proceed when they see the authorization prompt (they are not very comfortable with computers).
Are there any ways or tricks I can use to solve the IMPORTRANGE issue? Or is there some way to write the script so that authorization is not required the first time it runs?
I hope to get some advice on this, as I couldn't find a better way from Google. Any help will be greatly appreciated!
If I understood correctly, what you are looking for is for your spreadsheet to show, in real time, the data being entered simultaneously on 50 different spreadsheets. I'm afraid Google Sheets is not the right tool for what you are trying to do, at least not the way you are trying to do it. Basically you have two options: change your approach or use a different tool.
It's not a good idea to have an array of multiple IMPORTRANGE functions whose sources are being edited simultaneously. While the official docs say that IMPORTRANGE results are refreshed every 30 minutes, when the source and the spreadsheet containing the formula are open at the same time the import happens practically immediately, and it can fire multiple times during recalculation, causing the recalculation to start over and over again.
Replacing the above array with a script might help, but only if you are open to not having the destination spreadsheet updated in real time, as scripts are slow.
Replacing the above array with a program that uses the Google Sheets API might also help, again only if you are open to not having the destination spreadsheet updated in real time as the source spreadsheets change.
Regarding running a script without requiring authorization: that is only possible when using simple triggers and/or removing all the scopes that require authorization. Bear in mind that you can create installable triggers that run on behalf of others using the authorization of the user who creates them.
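As a rough sketch of that last point (the handler name is hypothetical, not from the question): an installable trigger created from the teacher's account runs with that account's authorization, so the students never see the prompt.

function createCollectionTrigger() {
  // Run this once from the account that owns the main spreadsheet; the
  // trigger will then execute under that account's authorization.
  ScriptApp.newTrigger('collectAnswers') // 'collectAnswers' is a hypothetical handler function
    .forSpreadsheet(SpreadsheetApp.getActive())
    .onEdit()
    .create();
}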
Related
Combining multiple spreadsheets in one using IMPORTRANGE
Why do two users sometimes see different values from importrange?
Multiple IMPORTRANGE
Using that many IMPORTRANGE formulas is definitely a bad idea. What I'd suggest you do:
keep a list of all your student spreadsheets in your main document
write a script that browses through all of the spreadsheets on that list and copies/pastes their values into your main document (see the sketch after this list)
create a time-based trigger that runs the script every X minutes (or hours), depending on how fresh you need the results to be
This is a simple but efficient solution. Depending on the amount of data and the number of students/spreadsheets you may consider other solutions (like writing a cloud function that does the same as the script), but I think this will work for your use case.
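A minimal sketch of such a script, under the assumption that the student spreadsheet IDs are listed in column A of a sheet named 'Students', their answers live in a sheet named 'Answers' with 4 answer columns, and the results go to a sheet named 'Consolidated' (all of these names and sizes are placeholders, not from the question):

function consolidateAnswers() {
  var main = SpreadsheetApp.getActiveSpreadsheet();
  // Read the list of student spreadsheet IDs kept in the main document.
  var ids = main.getSheetByName('Students').getRange('A2:A').getValues()
    .map(function (r) { return r[0]; })
    .filter(String);
  var combined = [];
  ids.forEach(function (id) {
    var source = SpreadsheetApp.openById(id).getSheetByName('Answers');
    if (source.getLastRow() < 2) return; // nothing but headers yet
    var rows = source.getRange(2, 1, source.getLastRow() - 1, 4).getValues();
    combined = combined.concat(rows.filter(function (r) { return r[0] !== ''; }));
  });
  // One clear and one write on the main document.
  var target = main.getSheetByName('Consolidated');
  target.clearContents();
  if (combined.length) {
    target.getRange(1, 1, combined.length, combined[0].length).setValues(combined);
  }
}

You would then attach a time-driven trigger to this function so it runs on whatever schedule you choose.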
I am familiar with the Lock Service but that is only for locking scripts.
I have some code that will "process" a large Google Sheet. My script needs to re-order the rows. I need/want to make it so while the script is running nobody else can change the order. However, I still need another script to be able to append rows.
We use a Google Form for our team's intake. It appends rows to a sheet. I have an hourly job that will go through all the rows/records and "process them". I have a column that stores the last time a record/row was "processed". I want to sort on that column such that the "oldest" records are on top and then start processing from the top down. If the script fails or times out then the next iteration will just start over...
I know I could use getValues or getDisplayValues to get an array and then write the array back, but I worry about what would happen if someone sorted the rows in the meantime, as that would muck things up when writing the array back.
Is there some way to accomplish my goal? I want to be able to process the records, and maintain row order to avoid breaking my processing.
The way to block a spreadsheet "completely" is by changing its sharing settings: remove all editors or change them to viewers, and once your script finishes, change them back to editors. In an extreme case, use a second account as the owner of the critical files/spreadsheets and use it only for this purpose, so you can even block your regular account from making changes to the spreadsheet.
NOTE: A Google Form editResponseUrl could be used to edit the linked spreadsheet.
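A hedged sketch of that permissions toggle (my illustration, not the answer's code; it assumes the script runs as the file owner):

function sortWithEditorsLockedOut() {
  var file = DriveApp.getFileById(SpreadsheetApp.getActiveSpreadsheet().getId());
  var editors = file.getEditors(); // remember who to restore afterwards
  editors.forEach(function (user) { file.removeEditor(user); });
  try {
    // ... re-order the rows here while nobody else can edit ...
    SpreadsheetApp.flush();
  } finally {
    editors.forEach(function (user) { file.addEditor(user); });
  }
}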
I'm facing a similar situation but took a different approach: I'm using an index/key column (you could use the timestamp column) and using the key to write each edited row back to its correct position, then writing the whole resulting array in a single operation (using setValues()). In my case this is simple because I only need the values; I'm not worried about notes, data validation, conditional formatting, comments, etc., and there isn't a Google Form linked to my spreadsheet.
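A minimal sketch of that key-based write-back (the function and parameter names are illustrative, not from the answer):

function writeRowsByKey(sheet, editedRows, keyColumnIndex) {
  var range = sheet.getDataRange();
  var values = range.getValues();
  // Map each key to its current row position.
  var indexByKey = {};
  for (var i = 0; i < values.length; i++) {
    indexByKey[values[i][keyColumnIndex]] = i;
  }
  // Drop each edited row into the position its key points at.
  editedRows.forEach(function (row) {
    var i = indexByKey[row[keyColumnIndex]];
    if (i !== undefined) {
      values[i] = row;
    }
  });
  // Write the whole resulting array in one operation.
  range.setValues(values);
}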
Related
Google Spreadsheet -- get sharing permissions by script
Any way to share google docs programmatically?
I have a Google Apps Script that copies the values of specific ranges from sheet B to sheet A on a button press.
The code works, but it takes about 40 seconds, I guess due to the large number of getValue/setValue calls.
The code below is only a snippet; it continues like this about four more times.
I already have a nice solution for copying values from one large range (say A1:Z30) via loops, but I cannot figure out a solution for this case.
Your support is very much appreciated. Thank you in advance.
ratenprogrammmain.getRange("E1:E18").setValues(vorlage.getRange("E13:E30").getValues());
ratenprogrammmain.getRange("B2").setValue(vorlage.getRange("B14").getValue());
ratenprogrammmain.getRange("B5").setValue(vorlage.getRange("B17").getValue());
ratenprogrammmain.getRange("A21").setValue(vorlage.getRange("A33").getValue());
ratenprogrammmain.getRange("B25").setValue(vorlage.getRange("B37").getValue());
ratenprogrammmain.getRange("A28:G33").setValues(vorlage.getRange("A40:G45").getValues());
ratenprogrammmain.getRange("H35").setValue(vorlage.getRange("H47").getValue());
Three ideas on how to make your code more efficient:
If your sheets are located in the same spreadsheet, you can use the copyTo() method on ranges (see the sketch after this list).
The Apps Script Best Practices guide provides samples of how to use batch operations to make your code faster and more efficient.
The Advanced Sheets service allows you to use the Sheets API batchUpdate request with a CopyPasteRequest.
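A hedged sketch of the copyTo() idea applied to the first lines of the snippet above (my illustration; the sheet names are placeholders and both sheets must live in the same spreadsheet):

function copyRanges() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var vorlage = ss.getSheetByName('Vorlage');                 // placeholder sheet name
  var ratenprogrammmain = ss.getSheetByName('Ratenprogramm'); // placeholder sheet name
  // Each copyTo() call is a single operation that pastes values only.
  vorlage.getRange('E13:E30').copyTo(
      ratenprogrammmain.getRange('E1'), SpreadsheetApp.CopyPasteType.PASTE_VALUES, false);
  vorlage.getRange('B14').copyTo(
      ratenprogrammmain.getRange('B2'), SpreadsheetApp.CopyPasteType.PASTE_VALUES, false);
  vorlage.getRange('A40:G45').copyTo(
      ratenprogrammmain.getRange('A28'), SpreadsheetApp.CopyPasteType.PASTE_VALUES, false);
  // ...repeat for the remaining ranges, or group them via the Sheets API batchUpdate.
}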
I would like to synchronize a Google Spreadsheet with a map so that I don't have to upload everything every day.
I found that it's possible to sync a Google Form to a Google Map using Google Fusion Tables.
See YouTube: Syncing Google Forms with Google Fusion Tables for Crowdsourced Maps.
But I couldn't replicate the process for my situation (I guess maybe because the spreadsheet content does not originate from a Google Form, and the script may take that into account).
I don't know much about coding scripts, but automating this process would be a blast for me!
I hope someone will be able to help me out on this.
Thanks a lot and have a good day!
The only thing to account for in your situation is the lack of a form submit. The person in the video sets up two triggers: one for onFormSubmit, and an hourly trigger that syncs whenever manual changes are made.
I haven't looked directly over the code, but all you should have to do is modify the onFormSubmit code and trigger. Change the code to look for new rows in your spreadsheet and update the Fusion Table with them. Then change the trigger to suit your needs; a timer would probably be the best option, so every hour, or every day, or run it manually after you're done adding rows (see the trigger sketch below).
Now, if you were to edit rows of data after they've already been uploaded, the hourly sync will take care of those changes.
I could imagine the hourly sync method being changed to look for rows that need to be added; it could be as simple as calling the submit function.
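As a hedged sketch of switching to a timer (the handler name sync comes from the tutorial; the interval is just an example):

function createHourlySyncTrigger() {
  // Time-driven replacement for the onFormSubmit trigger: push new or
  // edited rows to the Fusion Table every hour.
  ScriptApp.newTrigger('sync')
    .timeBased()
    .everyHours(1)
    .create();
}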
I had the same problem but was able to solve it.
A time trigger is not needed if you call the sync function at the end of the onFormSubmit function (i.e. put "sync();" right below "insertRowId(rowId, row);"). Syncing then takes place automatically after each form submission.
For larger forms I found that you should not create a special Location column in the Fusion Table; the address column itself should be marked as Location in the Fusion Table. In the spreadsheet's script properties, set addressColumn to the title of the address column and leave the third property unchanged ("latlng" > Location). What happens is that the address value gets overwritten by "latlng". So if you don't want to lose the original addresses, add a new column, copy the address into it with Apps Script (put that piece of script right at the beginning of the onFormSubmit function), and (after syncing) set addressColumn to that column's title; in the Fusion Table, mark the original address column as Text and the new column as Location.
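A rough, hypothetical illustration of that "copy the address into an extra column" step (the column indexes are placeholders; the rest of the tutorial's onFormSubmit logic, ending with insertRowId and sync, is omitted):

function copyAddressToExtraColumn(e) {
  // Call this at the very beginning of onFormSubmit, passing the event object.
  var sheet = e.range.getSheet();
  var row = e.range.getRow();
  var address = sheet.getRange(row, 2).getValue(); // original address column (placeholder index)
  sheet.getRange(row, 6).setValue(address);        // extra column marked as Location in the Fusion Table
}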
I have a spreadsheet that collects users' feedback data. According to the Google documentation, the size limit for a Google Spreadsheet is 400,000 cells. I have written Apps Script code that checks the total number of consumed cells in the spreadsheet; when the count gets close to the limit, the script creates a duplicate copy and clears the data from the current spreadsheet.
Now the spreadsheet has suddenly stopped collecting submissions since Jan 28, 2014. I checked the total cells consumed and found it was still roughly 2,500 cells short of 400,000.
I looked through the Google documentation again and found:
"Spreadsheets also have overall storage limits. Some spreadsheets may reach these before hitting the 400,000 cell limit, particularly when individual cells have large amounts of text. In such cases, the spreadsheet will go into read-only mode to prevent data loss."
I tried looking for a method in Google Apps Script to check a spreadsheet's data size with respect to its storage limit, but I didn't find any such reference.
Can you please help with this? Is there any method in Google Apps Script to find out:
1. the current data size of a Google Spreadsheet?
2. the storage limit for a Google Spreadsheet?
Thanks in advance.
No, it's not possible via the APIs.
It's not just byte size; other things, like the total number of formulas, also affect it. Converting to .xls will only be an approximation.
My test is to download the spreadsheet as an Excel file. I have one that's close to 5.5 MB; anything bigger than that usually stops working. I haven't tried to automate that check, but perhaps it's possible to (a sketch follows this list):
Use DriveApp to get the file as .xlsx
Measure the size of the file's blob
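A hedged sketch of automating that check. Note that I use the Drive export URL via UrlFetchApp rather than DriveApp's getAs(), since getAs() conversion for native Sheets files is typically limited to PDF; the 5.5 MB figure is only the anecdotal threshold above.

function checkExportedSize() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var url = 'https://docs.google.com/spreadsheets/d/' + ss.getId() + '/export?format=xlsx';
  // Fetch the .xlsx export of the current spreadsheet using the script's own OAuth token.
  var response = UrlFetchApp.fetch(url, {
    headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() }
  });
  var sizeMb = response.getBlob().getBytes().length / (1024 * 1024);
  Logger.log('Exported .xlsx size: ' + sizeMb.toFixed(2) + ' MB');
  return sizeMb; // compare against the ~5.5 MB threshold mentioned above
}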