Google Apps Script cannot write back a database with getDataRange().setValues()

I have a database loaded from a Google spreadsheet:
mydatabase = sheet.getDataRange().getValues()
which I then extend with a new record:
mydatabase.push(mydatabase[x])
At the end of the script I want to write the entire database back to the spreadsheet, but
sheet.getDataRange().setValues(mydatabase)
throws an error, since the new database is one record longer than the original was when it was loaded.
Is there any way to force getDataRange() to write the database back into the sheet? The spreadsheet itself has enough rows to accommodate the larger dataset.

In general, for .setValues(obj[][]) to work as expected, the Range it acts on must be the same size as the obj[][].
Commonly, this is ensured by acquiring a new Range from the desired Sheet:
var ss = SpreadsheetApp.openById("some id");
var sheet = ss.getSheetByName("some name");
var db = sheet.getDataRange().getValues();
/*
 * Do some operations on the obj[][] that is db.
 * These operations can include adding / removing rows or columns.
 */
// If db is smaller than the existing "written" data, the existing data should be cleared first.
if (db.length < sheet.getLastRow() || db[0].length < sheet.getLastColumn()) {
  sheet.clearContents();
}
// Write the current db back to the sheet, after acquiring a range with
// exactly the number of rows and columns needed to hold the values.
sheet.getRange(1, 1, db.length, db[0].length).setValues(db);
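For the single-record case in the question, a simpler alternative is to append just the new row rather than rewriting the whole table. A minimal sketch using the question's own variables:

// Append only the new record; the sheet grows by one row automatically,
// so there is no size mismatch to worry about.
sheet.appendRow(mydatabase[x]);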

Related

Re-Insert deleted rows Google Sheet

I would like to prevent the deletion of rows on a Google Sheet, but this cannot be set through permissions, especially since the same user who can modify the sheet can accidentally delete one or more rows and/or insert new ones. For insertion of a new row I have a script that works correctly: it simply catches the "INSERT_ROW" event in onChange(), alerts the user, and deletes the newly inserted row. For deletion, however, the user can delete a single row or even several rows at the same time, and I don't know how to restore them in the same range from which they were deleted (and therefore also recover the values).
Here is the first working point:
/** RESTORE ROW OR COLUMN */
function onChange(e) {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getActiveSheet();
  var range = SpreadsheetApp.getActiveRange();
  if (e.changeType === 'INSERT_ROW') {
    var row = range.getRow();
    var lastRow = range.getLastRow();
    var numRows = 1 + lastRow - row;
    sheet.deleteRows(row, numRows);
    SpreadsheetApp.getUi().alert("Warning for Row");
  } else if (e.changeType === 'INSERT_COLUMN') {
    var col = range.getColumn();
    var lastCol = range.getLastColumn();
    var numCols = 1 + lastCol - col;
    sheet.deleteColumns(col, numCols);
    SpreadsheetApp.getUi().alert("Warning for Column");
  }
  GmailApp.sendEmail('email@company.com',
    'Warning: ' + e.changeType + " row: " + row,
    "",
    { name: 'WARNING' });
}
There is no built-in / direct way to prevent editors from deleting rows in Google Sheets while still allowing them to use other features like filtering, sorting, etc.
In certain scenarios it might work to protect the sheet and set some protection exceptions to allow some users to edit some cells.
If the above doesn't work for you, you might have to implement a way to back up the rows, but that might not be scalable or very reliable.
The most reliable approach might be to have a "backend database" (it could be another spreadsheet or another service, e.g. BigQuery), then import the data into the "frontend" spreadsheet and push the changes to the backend database after they have been validated.
Yes, I have a backend database, in the sense that I have a Google Form that fills in the answer sheet. Then I have 4 more sheets which simply read the values from the answer sheet (but on which other users can take actions). So for example in Sheet2 I have something like (='Form Answers'!A:A) and next to it a cell with a checkbox like "OPEN" / "CLOSED". The point is that if the user in Sheet2 inserts/deletes a row, the answers become misaligned from the checkboxes, so the values of the checkboxes no longer correspond to the information taken from the 'Form Answers' sheet. I wanted to prevent him from inserting/deleting a row by mistake. So with the above procedure I can undo a new row; it would have been nice to be able to restore any deleted row as well.
It would take something like "if you delete row 3 of Sheet2, then reinsert row 3 of sheet 'Form Answers', exactly in the position of row 3 and not appended" (ideally also recovering the "old value" of the checkbox that was lost when the entire row was deleted).
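A rough sketch of that restore idea, under stated assumptions: a hidden sheet named "backup" holds a snapshot of Sheet2, the snapshot is refreshed on every ordinary edit, and the restore function is installed as an installable onChange trigger. All names here are illustrative, not from the original code:

// Simple trigger: refresh the backup snapshot after every normal edit.
function onEdit(e) {
  var ss = e.source;
  var data = ss.getSheetByName('Sheet2').getDataRange().getValues();
  var backup = ss.getSheetByName('backup');
  backup.clearContents();
  backup.getRange(1, 1, data.length, data[0].length).setValues(data);
}

// Installable onChange trigger: when rows are removed, rewrite the whole
// sheet from the snapshot so every row lands back in its original position.
function restoreDeletedRows(e) {
  if (e.changeType !== 'REMOVE_ROW') return;
  var ss = e.source;
  var data = ss.getSheetByName('backup').getDataRange().getValues();
  var sheet = ss.getSheetByName('Sheet2');
  // Make sure the sheet has enough rows before writing back.
  if (sheet.getMaxRows() < data.length) {
    sheet.insertRowsAfter(sheet.getMaxRows(), data.length - sheet.getMaxRows());
  }
  sheet.getRange(1, 1, data.length, data[0].length).setValues(data);
}

Note that this restores cell values only; the checkboxes' data validations and any formulas deleted along with the rows would need to be snapshotted and restored separately.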

Query Import Range not updating when script runs automatically - error loading

I have a script that pastes the raw data from a CSV received by email. When the raw data is pasted into the sheet, I expect another sheet with a QUERY + IMPORTRANGE formula to update automatically with the new data.
I have a second script that reads data from a pivot table built on the sheet with those formulas. However, when it tries to read the data from the pivot table I get the error Exception: The number of rows in the range must be at least 1. This happens because my variable numRows is equal to zero.
When I open the spreadsheet manually I see an error on the sheet with the formulas mentioned: error loading.
However, a few seconds after I open the spreadsheet, the range updates almost instantaneously without any problem, and if I run the script manually after this happens it runs without any problem.
How can I make sure that after updating the raw data the formulas don't get stuck on error loading? I would like to run the script automatically, not manually. Any tip is more than welcome.
Notes:
I've already tried every type of recalculation setting, but it didn't work (on change, on change and every hour, on change and every minute).
The raw data has around 2,300 rows.
The formula I am using is the following: =QUERY(IMPORTRANGE("1OpF8gcrV1Yj8bYP1j5PsHM4VRw2pKZOUmJf6VxGeFdY","raw_data!A2:G"), "select Col1,Col2,Col3,Col4,Col5,Col6,Col7 where Col2 is not null order by Col4 asc, Col1 asc, Col5 asc",0)
function sending_emails() {
  var ss = SpreadsheetApp.openById("1OpF8gcrV1Yj8bYP1j5PsHM4VRw2pKZOUmJf6VxGeFdY");
  var today = new Date();
  if (today.getDay() != 6 && today.getDay() != 0) {
    // Sending emails to reps:
    var data_sheet = ss.getSheetByName("Copy of sending_emails");
    var aux = data_sheet.getRange("B3:B").getValues();
    var startRow = 3; // First row of data to process
    var numRows = aux.filter(String).length;
    Logger.log('numRows' + numRows);
    // Fetch the range of cells
    var dataRange = data_sheet.getRange(startRow, 1, numRows, 5); // I get the error here because startRow = 3 and I get numRows = 0
    // Fetch values for each row in the Range.
    var data = dataRange.getValues();
    for (var a in data) {
      var row = data[a];
      var message = row[3];
      var emailAddress = row[0];
      Logger.log('emailAddress' + emailAddress);
      MailApp.sendEmail({
        to: emailAddress,
        subject: 'Task Manager',
        htmlBody: message,
        cc: row[4]
      });
    }
  }
}
The issue is likely with IMPORTRANGE.
The class of functions IMPORTHTML, IMPORTRANGE, etc. has been the subject of many questions about auto-updating; this approach generally seems to be quite flaky. I can't find it documented anywhere, but I suspect that these functions stop recalculating when the spreadsheet is closed, or, if a recalculation does happen, it is not authorized because it is no longer linked to a user session.
That said, although I don't use this approach, I have tested it various times and it seems to work for me, though I know there are many people for whom it does not.
Some people have found that removing all protections and making the sheet public removes the errors, though in my experience it's just best to remove formulae from the equation (no pun intended).
Suggested fix
Your current chain is Mail > Apps Script > Sheet > FORMULA > Sheet.
Change it to Mail > Apps Script > Sheet > Apps Script > Sheet.
I don't have your source data to test with, but implementing your query in Apps Script would look something like this:
const ss = SpreadsheetApp.openById("YOUR ID");
const dataRange = ss.getSheetByName("Sheet1").getRange("A2:G");
const data = dataRange.getValues();
// Keep only rows whose second column is non-empty ("where Col2 is not null").
const filteredData = data.filter(row => row[1] !== "");
You could potentially sort the data with formulae once it has been imported with the script.
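Alternatively, the QUERY's "order by Col4 asc, Col1 asc, Col5 asc" clause can be reproduced in the script itself before writing the result out. A sketch; the destination sheet name "query_output" is an assumption:

// Sort by Col4, then Col1, then Col5 (ascending); the (a > b) - (a < b)
// idiom yields -1 / 0 / 1 and works for both strings and numbers.
filteredData.sort((a, b) =>
  (a[3] > b[3]) - (a[3] < b[3]) ||
  (a[0] > b[0]) - (a[0] < b[0]) ||
  (a[4] > b[4]) - (a[4] < b[4]));
// Write the filtered, sorted rows to the destination sheet.
const out = ss.getSheetByName("query_output");
out.getRange(1, 1, filteredData.length, filteredData[0].length).setValues(filteredData);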
TLDR: Chaining IMPORTRANGE may work sometimes, but it doesn't seem very reliable. In my opinion, you are better off moving everything to Apps Script at this point.

Persistently sporadic out-of-memory error in Google Apps Script when copying data validations from a template

I have hundreds of Spreadsheets that were made from a template sheet. They all have the same number/name of sheets, rows, columns, etc...
I have added some data validations to the template. I want to copy the data validations from the template to each of the Spreadsheets. I have the code, and it works, but it throws a memory error.
It always throws the error -- the only thing that changes is how many destination Spreadsheets it has processed before it throws the error. Sometimes it'll process 4 Spreadsheets before it throws the error, sometimes 50, sometimes more/less. I cannot figure out why.
I trimmed my code down to a working sample. I can't share the source files but they are just normal Spreadsheets with 5 sheets/tabs and various data validations. If it matters, the data validations do use named ranges. For example: =REGEXMATCH(LOWER(google_drive_url) , "^https:\/\/drive\.google\.com\/drive\/folders\/[a-z0-9_-]{33}$").
I have commented the code below, but here is a recap:
Get the template Spreadsheet and cache all of the data validations in it.
Go through each destination Spreadsheet:
Clear all of the data validations.
Apply the data validations from the template.
In my real code I have an array of destination file IDs. For testing purposes I am just using one destination file and applying the data validations from the template multiple times.
function myFunction() {
  var sourceFileID = "1rB7Z0C615Kn9ncLykVhVAcjmwkYb5GpYWpzcJRjfcD8";
  var destinationFileID = "1SMrwTuknVa1Xky9NKgqwg16_JNSoHcFTZA6QxzDh7q4";
  // get the source file
  var sourceSpreadsheet = SpreadsheetApp.openById(sourceFileID);
  var sourceDataValidationCache = {};
  // go through each sheet and get a copy of the data validations
  // cache them for quick access later
  sourceSpreadsheet.getSheets().forEach(function(sourceSheet) {
    var sheetName = sourceSheet.getName();
    // save all the data validations for this sheet
    var thisSheetDataValidationCache = [];
    // get the full sheet range
    // start at first row, first column, and end at max rows and max columns
    // get all the data validations in it
    // go through each data validation row
    sourceSheet.getRange(1, 1, sourceSheet.getMaxRows(), sourceSheet.getMaxColumns()).getDataValidations().forEach(function(row, rowIndex) {
      // go through each column
      row.forEach(function(cell, columnIndex) {
        // we only need to save if there is a data validation
        if (cell) {
          // save it
          thisSheetDataValidationCache.push({
            "row": rowIndex + 1,
            "column": columnIndex + 1,
            "dataValidation": cell
          });
        }
      });
    });
    // save to cache for this sheet
    sourceDataValidationCache[sheetName] = thisSheetDataValidationCache;
  });
  // this is just an example
  // so just update the data validations in the same destination numerous times to show the memory leak
  for (var i = 0; i < 100; ++i) {
    // so we can see from the log how many were processed before it threw a memory error
    Logger.log(i);
    // get the destination
    var destinationSpreadsheet = SpreadsheetApp.openById(destinationFileID);
    // go through each sheet
    destinationSpreadsheet.getSheets().forEach(function(destinationSheet) {
      var sheetName = destinationSheet.getName();
      // get the full range and clear existing data validations
      destinationSheet.getRange(1, 1, destinationSheet.getMaxRows(), destinationSheet.getMaxColumns()).clearDataValidations();
      // go through the cached data validations for this sheet
      sourceDataValidationCache[sheetName].forEach(function(dataValidationDetails) {
        // get the cell/column this data validation is for
        // copy it, build it, and set it
        destinationSheet.getRange(dataValidationDetails.row, dataValidationDetails.column).setDataValidation(dataValidationDetails.dataValidation.copy().build());
      });
    });
  }
}
Is there something wrong with the code? Why would it throw an out of memory error? Is there anyway to catch/prevent it?
In order to get a better idea of what's failing I suggest you keep a counter of the iterations, to know how many are going through.
I also just noticed the line
sourceSheet.getRange(1, 1, sourceSheet.getMaxRows(), sourceSheet.getMaxColumns()).getDataValidations().forEach(function(row, rowIndex){
This is not a good idea, because getMaxRows() and getMaxColumns() return the total number of rows and columns in the sheet, not just the ones with data. That means that if your sheet is 100x100 and you only have data in the first 20x20 cells, you'll get a range that covers all 10,000 cells, and calling forEach on it goes through every one of them.
A better approach would be to use getDataRange(), which returns a range that covers the entirety of your data (Documentation). That gives you a much smaller range and considerably fewer cells to process.
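Applied to the caching loop above, the suggestion looks something like this (a sketch; it only changes which range is scanned, the rest of the loop is unchanged):

// Scan only the cells that actually contain data, not the whole grid.
sourceSheet.getDataRange().getDataValidations().forEach(function(row, rowIndex) {
  row.forEach(function(cell, columnIndex) {
    if (cell) {
      thisSheetDataValidationCache.push({
        "row": rowIndex + 1,
        "column": columnIndex + 1,
        "dataValidation": cell
      });
    }
  });
});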

using query() with apps script keeps adding 500 rows to sheet

I'm working on a large sheet, and cells are precious given the Google Sheets quotas.
I have a range, and when the data updates, a script automatically resizes the named range to the full length of the data.
The named range is called "gadatapull". This range is on the tab "datapull".
Two tabs: "datapull" is where fresh data is dumped, and "data_prep" is where I do stuff to the data. After a fresh pull just now, datapull has 2,733 rows of data, including the headers.
I would like data_prep to have the same length as datapull, plus 7 rows for text at the top of data_prep.
When my script to update data runs, I do this:
// clear dataprep sheet for new data
var lastRow = 7;
var maxRows = dataprep.getLastRow();
if (maxRows - lastRow > 0) {
  dataprep.deleteRows(lastRow + 1, maxRows - lastRow);
}
data_prep now has 7 rows (because the script just deleted every row after row 7).
Now, in data_prep cell A7 I have:
=query(indirect("gadatapull"),"select *")
The expected result was that all the fresh data in "gadatapull" would appear in the data_prep tab and that data_prep would expand accordingly.
But what actually happens is that all the data arrives as expected, and then there are an additional 500 blank rows at the bottom. The number 500 is suspiciously round, which makes me think Google Sheets adds it as a default under some condition.
How can I prevent Google Sheets from adding these additional 500 rows?
Instead of letting the Sheets API expand the number of rows (which is your hypothesis, and it might well be true ;-), you can add all the necessary cells before importing the data.
I didn't try this in real conditions, but it should work.
By the way, the script imports the data as well.
Code:
function copyDataToSheet() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var dataprep = ss.getSheetByName('data_prep');
  var datapull = ss.getSheetByName('datapull');
  var lastRow = 7;
  var maxRows = dataprep.getLastRow();
  if (maxRows - lastRow > 0) {
    dataprep.deleteRows(lastRow + 1, maxRows - lastRow);
  }
  var datapullSize = datapull.getLastRow();
  dataprep.insertRows(7, datapullSize); // insert exactly the number of rows you need.
  var dataToCopy = datapull.getDataRange().getValues();
  dataprep.getRange(7, 1, dataToCopy.length, dataToCopy[0].length).setValues(dataToCopy);
}

Replace entire sheet with another in Google Apps Scripts

What I'd like to do is warehouse information from a particular sheet within a spreadsheet and copy it to a second spreadsheet at the end of every day. The second spreadsheet will run complex pivots and reports against the copied information that don't need to run throughout the day.
I can set up a time-driven trigger which will run the job every day within an hour block.
I'm working on the following script, which uses SpreadsheetApp.getActiveSpreadsheet to get the current spreadsheet, then gets the individual sheet to back up with spreadsheet.getSheetByName, and then uses the sheet.copyTo method to add that sheet to a second spreadsheet, which I look up by ID with SpreadsheetApp.openById, like this:
function startBackupJob() {
  var currentSpreadSheet = SpreadsheetApp.getActiveSpreadsheet();
  var masterSheet = currentSpreadSheet.getSheetByName("Sheet1");
  var backupSpreadSheetId = "#######################################";
  var backupSpreadSheet = SpreadsheetApp.openById(backupSpreadSheetId);
  // var backupSheet = backupSpreadSheet.getSheetByName("Sheet1");
  // backupSpreadSheet.deleteSheet(backupSheet);
  masterSheet.copyTo(backupSpreadSheet).setName("Sheet1");
}
The issue I'm having is that copyTo creates a new sheet rather than overwriting an existing one. The point of moving the data to the new workbook is to run pivot tables off it without having to re-wire them to point at a new sheet.
I can delete the previous sheet to make room for the new one, but that kills the pivot table's references as well, so it doesn't help much.
Is there an easy way to transfer the entire contents of one worksheet to another?
This is similar to (but different from) the following questions:
How do I script making a backup copy of a spreadsheet to an archive folder? - However, I don't want to move the whole file, but a specific sheet within the spreadsheet.
How can copy specifics sheet to another spreadsheet using google script & copy one spreadsheet to another spreadsheet with formatting - However copying produces a new sheet, whereas I need to replace the contents of an existing sheet
Scripts, copy cell from one sheet to another sheet EVERYDAY at a specific time - However, I do want to replace the entire sheet, rather than just specific cells within the sheet.
Update
I might be able to do this by calling getRange on each sheet and then using getValues and setValues like this:
var currentValues = masterSheet.getRange(1, 1, 50, 50).getValues()
backupSheet.getRange(1, 1, 50, 50).setValues(currentValues)
But I'm worried about edge cases where the master sheet has a different available range than the backup sheet. I also don't want to hardcode the range; I want it to encompass the entire sheet. If I call .getRange("A:E"), the two sheets have to have exactly the same number of rows, which is unlikely.
Your update has you about 90% of the way there. The trick is to explicitly check the size of the destination sheet before you copy data into it. For example, if I did something like this:
var cromulentDocument = SpreadsheetApp.getActiveSpreadsheet();
var masterSheet = cromulentDocument.getSheetByName('master');
var logSheet = cromulentDocument.getSheetByName('log');
var hugeData = masterSheet.getDataRange().getValues();
var rowsInHugeData = hugeData.length;
var colsInHugeData = hugeData[0].length;
/* cross fingers */
logSheet.getRange(1, 1, rowsInHugeData, colsInHugeData).setValues(hugeData);
...then my success would totally depend upon whether logSheet was at least as big as masterSheet. That's obvious, but what's less so is that if logSheet is bigger, there will be some old junk left over around the edges. Ungood.
Let's try something else. As before, we'll grab the master data, but we'll also resize logSheet. If we don't care about logSheet being too big we could probably just clear() the data in it, but let's be tidy.
var cromulentDocument = SpreadsheetApp.getActiveSpreadsheet();
var masterSheet = cromulentDocument.getSheetByName('master');
var logSheet = cromulentDocument.getSheetByName('log');
var hugeData = masterSheet.getDataRange().getValues();
var rowsInHugeData = hugeData.length;
var colsInHugeData = hugeData[0].length;
/* no finger crossing necessary */
var rowsInLogSheet = logSheet.getMaxRows();
var colsInLogSheet = logSheet.getMaxColumns();
/* adjust logSheet length, but only if we need to... */
if (rowsInLogSheet < rowsInHugeData) {
  logSheet.insertRowsAfter(rowsInLogSheet, rowsInHugeData - rowsInLogSheet);
} else if (rowsInLogSheet > rowsInHugeData) {
  logSheet.deleteRows(rowsInHugeData, rowsInLogSheet - rowsInHugeData);
}
/* likewise, adjust width */
if (colsInLogSheet < colsInHugeData) {
  logSheet.insertColumnsAfter(colsInLogSheet, colsInHugeData - colsInLogSheet);
} else if (colsInLogSheet > colsInHugeData) {
  logSheet.deleteColumns(colsInHugeData, colsInLogSheet - colsInHugeData);
}
/* barring typos, insert data with confidence */
logSheet.getRange(1, 1, rowsInHugeData, colsInHugeData).setValues(hugeData);
What's going on here is pretty straightforward. We figure out how big the log needs to be, and then adjust the destination sheet's size to match that data.
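To adapt this to the original startBackupJob, which copies between two spreadsheets rather than two tabs of the same one, only the way the sheets are obtained changes; the resize-then-write body stays the same. A sketch reusing the question's names (the backup spreadsheet ID is a placeholder):

function startBackupJob() {
  var masterSheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Sheet1');
  var backupSpreadSheet = SpreadsheetApp.openById('YOUR BACKUP SPREADSHEET ID');
  var logSheet = backupSpreadSheet.getSheetByName('Sheet1');
  var hugeData = masterSheet.getDataRange().getValues();
  // ...resize logSheet exactly as shown above, then:
  logSheet.getRange(1, 1, hugeData.length, hugeData[0].length).setValues(hugeData);
}

Because the existing sheet is resized in place instead of being deleted and recreated, pivot tables in the backup spreadsheet keep their references to it.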