I have 50 Google Sheets files for 50 students. They need to key in their answers in their own Google Sheets file when they are asked to do so. I have a main Google Sheet that consolidates their data using an IMPORTRANGE formula. This is my formula:
=QUERY({IMPORTRANGE(...);IMPORTRANGE(...);IMPORTRANGE(...);...},"Select * where Col1 is not null")
I will have 50 IMPORTRANGE calls in the formula, so as expected the main Google Sheet becomes very laggy when the 50 students start to key in their answers at the same time. Sometimes the formula shows a #VALUE! error once all the students have started answering the questions in their own files. I need to keep refreshing the main Google Sheet for the data to appear, but it disappears again a short while later and I have to refresh it again (although it settles down once most of the students have finished answering).
I know that using IMPORTRANGE is really not an efficient way to consolidate their answers in the main Google Sheet, but I don't have a better way.
I tried to write a script so that they can send in their data by clicking a button assigned to the script. However, all the students have to go through the authorization process when they run the script for the first time, and they don't know how to proceed when they see the authorization prompt (they are not very comfortable with computers).
May I know whether there are any ways or tricks I can use to solve the IMPORTRANGE issue? Or is there some way to write the script so that the authorization process is not required the first time it runs?
I hope to get some advice and help on this, as I couldn't find a better way from Google. Any help will be greatly appreciated!
If I understood correctly, what you are looking for is for your spreadsheet to show, in real time, the data being entered simultaneously in 50 different spreadsheets. I'm afraid that Google Sheets is not the right tool for what you are trying to do, the way you are trying to do it. Basically you have two options: change the approach or use a different tool.
It's not a good idea to have an array of multiple IMPORTRANGE functions whose sources are being edited simultaneously. While the official docs say that IMPORTRANGE results are updated every 30 minutes, when a source and the spreadsheet holding the formula are open at the same time the import happens practically immediately, and it can happen multiple times during a recalculation, causing the recalculation to start over and over again.
Replacing the above array with a script might help, but only if you accept that the destination spreadsheet will not be updated in real time, as scripts are slow.
Replacing the above array with a program that uses the Google Sheets API might also help, again only if you accept that the destination spreadsheet will not be updated in real time.
Regarding running a script without requiring authorization: that is only possible when using simple triggers and/or removing all the scopes that require authorization. Also bear in mind that installable triggers run under the authorization of the user who creates them, so you could create them yourself rather than asking each student to authorize.
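For illustration only, a minimal simple-trigger sketch: onEdit(e) runs without any authorization prompt, but it can only touch the spreadsheet it is bound to (the sheet name 'Answers' below is an assumption), so it cannot push data into another file by itself.
function onEdit(e) {
  // e.source is the spreadsheet that was edited; simple triggers may modify it
  // without showing an authorization prompt to the user.
  var sheet = e.source.getSheetByName('Answers'); // assumed sheet name
  if (!sheet) return;
  // For example, note when the student last edited their file.
  sheet.getRange('Z1').setValue(new Date());
}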
Related
Combining multiple spreadsheets in one using IMPORTRANGE
Why do two users sometimes see different values from importrange?
Multiple IMPORTRANGE
Using that many IMPORTRANGE formulas is definitely a bad idea. What I'd suggest you do:
keep a list of all your student spreadsheets in your main document
write a script that will browse through all of the spreadsheets on that list and copy/paste their values into your main document
create a time-based trigger that will run the script every X minutes (or hours), depending on how up to date you need the results to be
This is a simple but efficient solution. Depending on the amount of data and the number of students/spreadsheets you may consider other options (like writing a cloud function that does the same as the script), but I think this will work for your use case.
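A rough sketch of such a script, assuming the main spreadsheet has a sheet named "Students" with the 50 file IDs in column A, a sheet named "Consolidated" for the merged answers, and that each student's answers live in columns A:E of their first tab (all of these names and ranges are assumptions to adapt):
function consolidateAnswers() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  // Column A of the "Students" sheet holds the file IDs (assumed layout).
  var ids = ss.getSheetByName('Students')
      .getRange('A2:A')
      .getValues()
      .map(function (r) { return r[0]; })
      .filter(String);
  var rows = [];
  ids.forEach(function (id) {
    // One read per student file: columns A:E of the first tab (assumed range).
    var data = SpreadsheetApp.openById(id)
        .getSheets()[0]
        .getRange('A2:E')
        .getValues()
        .filter(function (r) { return r.join('') !== ''; }); // skip blank rows
    rows = rows.concat(data);
  });
  var target = ss.getSheetByName('Consolidated');
  target.clearContents();
  if (rows.length) {
    // One write for everything.
    target.getRange(1, 1, rows.length, rows[0].length).setValues(rows);
  }
}

// Run this once to refresh the consolidation every 10 minutes.
function createTrigger() {
  ScriptApp.newTrigger('consolidateAnswers')
      .timeBased()
      .everyMinutes(10)
      .create();
}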
Related
I have a Google Sheet that uses an IMPORTRANGE query to combine data from multiple other sheets. This combined import sheet is read by Google AppSheet. We have realized that the data AppSheet is reading is always outdated. It only reads the data as of the last time the sheet was manually opened.
I followed the steps in this post to try to fix this issue by creating this function: function refresh() {SpreadsheetApp.flush()}. I then set up a timed trigger to activate it once an hour. Logs show the function is running, but the data is still not updating until I manually open the sheet.
This is my first time using Apps Script. Any tips/ideas? Is there a different or better way to have the formulas update without opening the file?
Thank you for reading.
SpreadsheetApp.flush() only works within the script execution that calls it. If you need to refresh the results of a formula, it's uncertain exactly how the spreadsheet will respond, because most of the formula calculations are done on the client side. You can verify this yourself using your web browser's developer tools.
Anyway, spreadsheet formulas have several caveats, so it would not be strange if at some point you had to rethink your solution. Assuming that you want to keep using AppSheet:
Use AppSheet for your front end and some no-code / low-code automation. Keep your app small; if you need many forms / views, consider distributing them among several apps.
Use Google Sheets only as data storage for your AppSheet app. Bear in mind that a spreadsheet has a limit of 10 million cells in total, so you might want to delete unused sheets and delete the unused columns and rows on each sheet.
You might use Google Apps Script for the data import and transformation tasks. If you need something to be updated based on actions done in the AppSheet app, you could use an installable change trigger, or use a webhook on the AppSheet side together with a "simple" web application built with Google Apps Script (GET / POST HTTP requests can be used to trigger Google Apps Script functions); a sketch of such a web app follows this list.
You might also use other programming platforms for the data import / transformation tasks and keep using Google Sheets as your AppSheet database through the Google Sheets API, or use automation tools like Zapier, IFTTT and Integromat, among many others.
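A minimal sketch of the web-app idea, assuming a function named importAndTransform() (a placeholder) does the actual import; deploy the script as a web app and point an AppSheet webhook at its URL:
function doPost(e) {
  importAndTransform(); // placeholder for the actual import / transformation
  return ContentService
      .createTextOutput(JSON.stringify({ status: 'ok' }))
      .setMimeType(ContentService.MimeType.JSON);
}

function importAndTransform() {
  // Placeholder: read the source spreadsheets, reshape the data and write it
  // to the sheets that AppSheet uses as storage.
}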
solution #1
You can try this solution:
define a checkbox (for instance in cell A1 of the tab Sheet1)
set this script:
function myFunction() {
  // Toggle the checkbox in Sheet1!A1 off and back on to force the
  // IMPORTRANGE formula below to recalculate.
  var chk = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Sheet1').getRange('A1');
  chk.setValue(false);
  SpreadsheetApp.flush();   // make sure the unchecked state is applied first
  Utilities.sleep(500);
  chk.setValue(true);
}
define a trigger on it
define the formula as follows
=if(A1,importrange("1n-rjSYb63Z2jySS3-M0BQ78vu8DTPOjG-SZM4i8IxXI","A:Z"),"")
when A1 is unchecked the result will be empty; check A1 again to fill in the result as expected
solution #2
By script, try for instance:
function myFunction() {
  var sh = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Sheet9');
  // Read all values from the first sheet of the source spreadsheet in one call...
  var data = SpreadsheetApp.openById('1n-rjSYb63Z2jySS3-M0BQ78vu8DTPOjG-SZM4i8IxXI').getSheets()[0].getDataRange().getValues();
  // ...and write them back in a single setValues() call.
  sh.getRange(1, 1, data.length, data[0].length).setValues(data);
}
set a daily trigger as needed
I am familiar with the Lock Service but that is only for locking scripts.
I have some code that will "process" a large Google Sheet. My script needs to re-order the rows, and I need/want to make it so that nobody else can change the order while the script is running. However, I still need another script to be able to append rows.
We use a Google Form for our team's intake. It appends rows to a sheet. I have an hourly job that goes through all the rows/records and "processes" them. I have a column that stores the last time a record/row was "processed". I want to sort on that column so that the "oldest" records are on top and then start processing from the top down. If the script fails or times out, the next iteration will just start over...
I know I could use getValues or getDisplayValues to get an array and then write the array back, but I worry about what would happen if someone sorted the rows in the meantime, as it would muck things up when writing the array back.
Is there some way to accomplish my goal? I want to be able to process the records, and maintain row order to avoid breaking my processing.
The way to block a spreadsheet "completely" is by changing the spreadsheet sharing settings: remove all editors or change them to viewers, and once your script finishes, change them back to editors. In an extreme case, use a second account to act as the owner of the critical files / spreadsheets and only use it for this purpose, so you can block even your regular account from making changes to the spreadsheet.
NOTE: A Google Form editResponseUrl could be used to edit the linked spreadsheet.
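A rough sketch of the sharing-settings idea, using the standard getEditors(), removeEditor() and addEditor() methods; reorderRows() is a placeholder for your processing routine:
function processWithLockedSheet() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var owner = ss.getOwner() ? ss.getOwner().getEmail() : '';
  // Everyone who can currently edit, except the owner (who cannot be removed).
  var editors = ss.getEditors()
      .map(function (u) { return u.getEmail(); })
      .filter(function (email) { return email && email !== owner; });

  editors.forEach(function (email) { ss.removeEditor(email); });
  try {
    reorderRows(); // placeholder for the sorting / processing routine
  } finally {
    // Restore edit access even if the processing throws.
    editors.forEach(function (email) { ss.addEditor(email); });
  }
}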
I'm facing a similar situation but I took a different approach: I'm using an index/key column (you could use the timestamp column) and using that index/key to save each edited row back to the right position, then writing the whole resulting array in a single operation (using setValues()). In my case this is simple because I only need the values; I'm not worried about notes, data validation, conditional formatting, comments, etc., and there isn't a Google Form linked to my spreadsheet.
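A minimal sketch of that key-column approach, assuming column A holds the unique key (for example the form timestamp), the data lives in a sheet named "Responses" (an assumed name), and processedByKey is an object mapping each key to its already-processed row:
function writeProcessedRows(processedByKey) {
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Responses'); // assumed name
  var data = sheet.getDataRange().getValues();

  // Whatever order the rows are in right now, each processed row is matched
  // back by its key in column A, so a concurrent sort cannot misalign it.
  for (var i = 1; i < data.length; i++) { // row 0 is the header
    var key = data[i][0];
    if (processedByKey[key] !== undefined) {
      data[i] = processedByKey[key];
    }
  }
  // Single write for the whole sheet.
  sheet.getRange(1, 1, data.length, data[0].length).setValues(data);
}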
Related
Google Spreadsheet -- get sharing permissions by script
Any way to share google docs programmatically?
I have a Google Apps Script that sets values of specific ranges from sheet B to sheet A on a button press.
The code I have works, but it takes about 40 seconds, which I guess is due to the high number of getValue/setValue calls.
The code seen below is only a snippet; it goes on like this about four more times.
I already have a nice solution for copying values from one large range (say A1:Z30) via loops, but I cannot figure out a solution for this case here.
Your support is very much appreciated. Thank you in advance.
ratenprogrammmain.getRange("E1:E18").setValues(vorlage.getRange("E13:E30").getValues());
ratenprogrammmain.getRange("B2").setValue(vorlage.getRange("B14").getValue());
ratenprogrammmain.getRange("B5").setValue(vorlage.getRange("B17").getValue());
ratenprogrammmain.getRange("A21").setValue(vorlage.getRange("A33").getValue());
ratenprogrammmain.getRange("B25").setValue(vorlage.getRange("B37").getValue());
ratenprogrammmain.getRange("A28:G33").setValues(vorlage.getRange("A40:G45").getValues());
ratenprogrammmain.getRange("H35").setValue(vorlage.getRange("H47").getValue());
Three ideas on how to make your code more efficient:
If your sheets are located in the same spreadsheet you can use the copyTo() method for ranges (see the sketch after this list).
Apps Script Best Practices provides samples of how to use batch operations to make your code faster and more efficient.
The Advanced Sheets Service allows you to use the Sheets API batchUpdate request with a CopyPasteRequest.
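For idea 1, a minimal sketch, assuming 'Vorlage' and 'Ratenprogrammmain' are sheets of the same spreadsheet (the sheet names are assumptions taken from the variable names in the question):
function copyBlocks() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var vorlage = ss.getSheetByName('Vorlage');             // assumed sheet name
  var main = ss.getSheetByName('Ratenprogrammmain');      // assumed sheet name

  // [source range, destination range] pairs taken from the snippet above.
  var pairs = [
    ['E13:E30', 'E1:E18'],
    ['B14', 'B2'],
    ['B17', 'B5'],
    ['A33', 'A21'],
    ['B37', 'B25'],
    ['A40:G45', 'A28:G33'],
    ['H47', 'H35']
  ];
  pairs.forEach(function (p) {
    // copyTo() does the read and the write in a single call per pair.
    vorlage.getRange(p[0]).copyTo(
        main.getRange(p[1]),
        SpreadsheetApp.CopyPasteType.PASTE_VALUES,
        false);
  });
}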
Google sheets changed the size of data that can be imported using the 'importrange' function in late March. We want to write a formula that can search through all our sheets and identify any that will be affected by Google's latest change.
We know there has been a change, as IMPORTRANGEs we had previously set up are now returning a "... too large" type of error.
It appears to be a limit on data size, not on the number of cells, as we experienced the issue on an IMPORTRANGE of only 2 columns, but one of them contained a large amount of HTML code in each cell.
Does anyone know what the data size limit is?
If not, does anyone have an idea of the best way to write a script that can find the limit?
Otherwise, the next step will be to write a script that can search through our 'network analysis' sheet (a sheet/tool that shows all the Google Sheets that are connected by IMPORTRANGE) and identify those with the IMPORTRANGE issue.
FYI - Google appears to have changed quite a bit relating to Google Sheets and Apps Script.
a) We lost the ability to save changes in a sheet (making it basically unusable), which we think is related to having more than 50 IMPORTRANGEs referencing that sheet (Google support advised avoiding more than 50 IMPORTRANGEs pointing to one sheet).
b) The other change we noticed is that Google Apps Script can now definitely run for longer than the previous limit of 5 minutes. We previously saw some scripts sometimes running for more than 5 minutes, all the way up to a maximum of 30 minutes, but now we see it consistently on some scripts. We built a tool that can automatically get a script to "run again" if it did not complete last time. We have had to tweak that "run again" script to ensure it does not try to run the script again within 30 minutes of the first run, so that the first run is definitely no longer executing.
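A guard like the one described could be sketched with script properties; the property name LAST_START and the function longRunningJob() are placeholders:
function runAgainIfNeeded() {
  var props = PropertiesService.getScriptProperties();
  var lastStart = Number(props.getProperty('LAST_START') || 0);

  // Skip if the previous run started less than 30 minutes ago, since it may
  // still be executing.
  if (Date.now() - lastStart < 30 * 60 * 1000) {
    return;
  }
  props.setProperty('LAST_START', String(Date.now()));
  longRunningJob(); // placeholder for the script that sometimes needs re-running
}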
I am totally new to Google Apps Script and this may be a very poor question.
I am making a basic setup using Google Forms and Google Apps Script.
Based on the responses from the form I change my content in a Google Spreadsheet accordingly.
For example, one query from the form needed 10,000 records to be selected and produced in a whole other spreadsheet.
I just wanted to know whether there is some kind of delay introduced when I set and get values of cells of a spreadsheet on such a large scale. If so, what does it depend on, and how, as a programmer, can I remove or optimize it?
Thanks in advance!
The Best Practices article by Google is the primary reference for this. The most important advice is to minimize the number of calls to spreadsheet methods by batching operations. So, select a range that contains all the records you need, get them all at once with getValues, process without further interaction with the spreadsheet, and output the result using setValues.
If you follow this advice, 10000 records is still a reasonable amount of data to process by a script.
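A minimal sketch of that pattern, assuming the records live in a sheet named "Responses" and the selection goes to a sheet named "Output" (both names, and the selection criterion, are assumptions):
function processRecords() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var source = ss.getSheetByName('Responses'); // assumed sheet names
  var target = ss.getSheetByName('Output');

  // One read for all records...
  var rows = source.getDataRange().getValues();

  // ...all processing happens in memory, without touching the spreadsheet...
  var selected = rows.filter(function (row) {
    return row[2] === 'APPROVED'; // hypothetical selection criterion
  });

  // ...and one write for the whole result.
  if (selected.length) {
    target.getRange(1, 1, selected.length, selected[0].length).setValues(selected);
  }
}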