Can't see the betterlog output in my spreadsheet - google-apps-script

I understand that it writes to the sheet when the log reaches 50,000 lines. But my script doesn't generate 50,000 lines of log, and I need to see the log once the script's finished. Is there a way to flush the log to the sheet once it finishes, or to reduce the threshold below 50k?

Resolved, and don't I feel stupid. I didn't notice BetterLog created another sheet/tab named Log in the spreadsheet, and it's logging to that.

You can use Logger.getLog() to get the entire text at once.
But you will have to split it into smaller parts, because each cell of the sheet has a limit of 50,000 characters.
https://developers.google.com/apps-script/reference/base/logger#getLog%28%29
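A minimal sketch of that splitting step (the helper name and the Apps Script usage in the comments are my own; 50,000 is the per-cell character limit mentioned above):

```javascript
// Split a long log string into chunks that each fit in one cell
// (Google Sheets caps a cell at 50,000 characters).
function splitForCells(text, maxLen) {
  maxLen = maxLen || 50000;
  var chunks = [];
  for (var i = 0; i < text.length; i += maxLen) {
    chunks.push(text.substring(i, i + maxLen));
  }
  return chunks;
}

// In Apps Script you could then write the chunks down one column:
// var chunks = splitForCells(Logger.getLog());
// sheet.getRange(1, 1, chunks.length, 1)
//      .setValues(chunks.map(function (c) { return [c]; }));
```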

Related

What is the Google Apps Script setValues() maximum batch size?

I am calling setValues() on a sheet with a massive amount of data (20,000 × 50). I built the data in a batch and then used setValues() to update the sheet at once, but the function updates the data in a manner that looks like (x × 50) sequentially, so I wanted to know the maximum number of cells that can be updated at once when using this function.
It seems that when the values are put to the sheet using setValues, the cells are allocated first and then the values are written to them. When I previously tested the maximum number of cells for putting values with setValues, 2,625,000 cells could be put, but when I tested over 2,635,000 cells, an "Exceeded maximum execution time" error occurred. So at that time, I concluded that the maximum was between 2,625,000 and 2,635,000 cells. Incidentally, when 4,000,000 cells were tested, the 4,000,000 cells were allocated in the destination sheet while the values couldn't be put. This is the result in my environment.
So, in the case of 20,000 × 50, I thought that 1,000,000 cells could be put with setValues. But I'm worried that when such data is put with setValues, the processing cost is high. For example, after the script finished and I checked the sheet, I had to wait for the values to finish being written. I thought that this might be the reason for your issue.
In this case, when the Sheets API is used instead of the Spreadsheet service (SpreadsheetApp) for putting the values to the sheet, such an issue can be avoided. Ref
From the above situation, as a workaround, how about putting the values using the Sheets API instead of setValues of the Spreadsheet service?
Reference:
Benchmark: Reading and Writing Spreadsheet using Google Apps Script
There is no direct maximum for setValues(), but you might want to consider the following three points:
There is a (very high) maximum size of an array in JavaScript.
There is a maximum number of columns, cells, and new rows in a Google spreadsheet.
There is a maximum script runtime and your function might time out before setValues() finished executing.
Of those three limits, the number of values you can set with one setValues() call is most likely to be bounded by the maximum script execution time.
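One way to work within that execution-time limit is to write the data in row chunks rather than one giant call. A sketch, assuming a 2-D values array; the chunking helper is plain JavaScript (names are my own), and the Apps Script usage is shown in comments:

```javascript
// Split a 2-D array of sheet values into groups of rows, so each
// group can be written with its own setValues() call.
function chunkRows(values, rowsPerChunk) {
  var chunks = [];
  for (var i = 0; i < values.length; i += rowsPerChunk) {
    chunks.push(values.slice(i, i + rowsPerChunk));
  }
  return chunks;
}

// In Apps Script:
// var chunks = chunkRows(data, 5000);
// var row = 1;
// chunks.forEach(function (chunk) {
//   sheet.getRange(row, 1, chunk.length, chunk[0].length).setValues(chunk);
//   row += chunk.length;
// });
```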

How to index a 1 billion row CSV file with elastic search?

Imagine you had a large CSV file - let's say 1 billion rows.
You want each row in the file to become a document in elastic search.
You can't load the file into memory - it's too large, so it has to be streamed or chunked.
The time taken is not a problem. The priority is making sure ALL data gets indexed, with no missing data.
What do you think of this approach:
Part 1: Prepare the data
Loop over the CSV file in batches of 1k rows
For each batch, transform the rows into JSON and save them into a smaller file
You now have 1m files, each with 1000 lines of nice JSON
The filenames should be incrementing IDs. For example, running from 1.json to 1000000.json
Part 2: Upload the data
Start looping over each JSON file and reading it into memory
Use the bulk API to upload 1k documents at a time
Record the success/failure of the upload in a result array
Loop over the result array and if any upload failed, retry
The steps you've mentioned above look good. A couple of other things that will make sure ES does not get overloaded:
From what I've experienced, you can increase the bulk request size to a larger value as well, say somewhere in the range 4k–7k (start with 7k, and if that causes problems, experiment with smaller batches; going lower than 4k probably won't be needed).
Ensure refresh_interval is set to a large value, so that documents are not refreshed very frequently. IMO the default value will also do. Read more here.
As the above comment suggests, it'd be better if you start with a smaller batch of data. Of course, if you use constants instead of hardcoding the values, your task just got easier.
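The upload-and-retry bookkeeping from Part 2 can be sketched like this; uploadBatch is a placeholder for a real Elasticsearch bulk request (e.g. via an official client), and all names here are my own:

```javascript
// Upload every batch, record success/failure per batch, then retry
// only the failed batches until all succeed or retries run out.
function uploadAll(batchIds, uploadBatch, maxRetries) {
  maxRetries = maxRetries || 3;
  var results = {};                         // batchId -> true/false
  batchIds.forEach(function (id) {
    results[id] = uploadBatch(id);          // e.g. POST _bulk with 1k docs
  });
  for (var attempt = 0; attempt < maxRetries; attempt++) {
    var failed = batchIds.filter(function (id) { return !results[id]; });
    if (failed.length === 0) break;         // everything indexed
    failed.forEach(function (id) {
      results[id] = uploadBatch(id);        // retry just the failures
    });
  }
  return results;
}
```

Keying the results by the incrementing filenames (1.json … 1000000.json) makes the "retry failures" pass a simple lookup, which is the point of naming the files that way.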

Sheets gets stuck Loading... a cell and I have to reload

tl;dr: My custom-formula cells all go Loading... and they start filling in results, but they get stuck. A reload immediately provides the completed results. I don't want to reload every time I change a value.
In my sheet almost all cells are interconnected, so changing one cell triggers a recalc. Each row is a year, each column is an identical formula across all rows. Each cell can refer to earlier columns in its row or anything on the prior row, plus a few absolute locations containing the "inputs". Changing an input triggers a recalc of everything.
The column with my custom function will go Loading..., then one cell at a time it returns the value, almost one second per cell. OK, slow, but fine. But often it just stops completely partway down. Sometimes it starts up again, but often it never does.
But the cells WERE all recalculated. My custom function was called and returned values promptly, execution time of 0.125 secs or less usually. If I reload in the browser, I immediately get the fully recalculated sheet. It looks like some link is being severed between Sheets in my browser and Google's servers, so I stop seeing updates.
This is the first time I've ever used Apps Script -- or JavaScript for that matter -- but I have been programming in other ways for decades. If it matters, the custom function is purely mathematical, calling no services except Math, not even Spreadsheet, getting everything it needs in its arguments, using a few functions in the same file. It's a detailed taxes calculator.
I'm using a recent Chromebook.
To avoid having to reload your spreadsheet, it's very likely that you will have to follow the guidelines for custom function optimization:
Summary: Instead of using one formula to calculate a single value use your formula to calculate multiple values and return them as an array.
The above will reduce the number of formulas and will improve your spreadsheet performance, but bear in mind that custom functions have a maximum execution time of 30 secs.
Also, reducing the size of the data ranges in your spreadsheets, and the number of blank rows and columns at the bottom and right of your data ranges, will help improve the performance of your spreadsheet.
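A minimal sketch of that pattern, assuming a purely mathematical per-cell calculation (MYCALC and doMath are hypothetical names): instead of filling =MYCALC(A2) down a column, you pass the whole range once, e.g. =MYCALC(A2:A100), and return a matching array.

```javascript
// Custom function that accepts either a single value or a whole range.
// A range arrives as a 2-D array, so map over it and return a 2-D
// array of results - one formula, one call, many cells filled.
function MYCALC(input) {
  if (input.map) {
    return input.map(function (row) {
      return row.map(doMath);
    });
  }
  return doMath(input);                  // single-cell case still works
}

// Placeholder for the real per-cell math.
function doMath(x) {
  return x * 2;
}
```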
Related
How to use a custom function with an ArrayFormula
Room For Optimization?

Google Apps Script for counting Matrix

I have a spreadsheet with a matrix set up to count how many times a student has had a lesson with a particular tutor.
The matrix works fine with this formula:
=ARRAYFORMULA(SUM(IF(TERM4!$B$6:$B$2398=B$1,IF(TERM4!$C$6:$C$2398=$A2,1,IF(TERM4!$D$6:$D$2398=$A2,1,FALSE()))),FALSE()))
however due to the number of students/tutors the matrix is 7000 cells, slowing the working sheet down considerably.
Is there a better way to do this? Can I run a Google Apps Script on a trigger (e.g. once a week) to count the matrix, so the formulas are not slowing the sheet down?
I would also like the formula to return a blank rather than a 0 if the result is FALSE.
Thanks for your help!
Yes, it's possible to do it with GAS. The only part that gets a little complex is that if your script runs over the execution time limit (about 5 minutes) it won't process all the rows. To avoid that, process it in chunks (say 100 rows at a time) and use script properties to remember which spreadsheet and row you last processed. Each trigger run will process as much as it can until all spreadsheets and rows are processed.
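A sketch of that chunk-and-resume pattern; the resume arithmetic is plain JavaScript, and the PropertiesService usage is shown in comments (all names are my own):

```javascript
var CHUNK_SIZE = 100;   // rows processed per trigger run

// Given the last row already processed, return the next row range to
// work on, or null when everything is done.
function nextChunk(lastRow, totalRows) {
  var start = lastRow + 1;
  if (start > totalRows) return null;    // all rows processed
  var end = Math.min(start + CHUNK_SIZE - 1, totalRows);
  return { start: start, end: end };
}

// In the time-triggered Apps Script function:
// var props = PropertiesService.getScriptProperties();
// var lastRow = Number(props.getProperty('lastRow') || 0);
// var chunk = nextChunk(lastRow, sheet.getLastRow());
// if (chunk) {
//   countMatrixRows(chunk.start, chunk.end);      // your counting logic
//   props.setProperty('lastRow', String(chunk.end));
// }
```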

Google Apps Script Spreadsheet too much data - But there isn't

Since end of August 2012 scripts that used to work on GAS have stopped working.
I am using JDBC to retrieve records from my MySQL database and show them in a sheet.
It was working, and then I started getting "We're sorry, we were unable to process the operation because it contains too much data."
But the dataset in question is 2,400 rows × 35 columns, which is WAY below the 400,000-cell limit, and way below the 256-columns-per-sheet limit.
Any ideas? Has something changed recently?
I think the conclusion here is that the
"We're sorry, we were unable to process the operation because it contains too much data."
error can be caused not only by the number of rows and columns (cells) in a data set, but also by internal issues with the memory use of the objects being applied to the region.
Try adding a portion of the values at a time, with a SpreadsheetApp.flush() in between each setValues(). That solved my problem.
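A sketch of that batched-write pattern; the sheet operations are passed in as callbacks so the control flow is plain JavaScript, with the real Apps Script calls noted in comments (names are my own):

```javascript
// Write rows in batches, flushing pending spreadsheet changes after
// each batch so no single write accumulates too much data.
function writeInBatches(rows, batchSize, writeBatch, flush) {
  for (var i = 0; i < rows.length; i += batchSize) {
    var batch = rows.slice(i, i + batchSize);
    writeBatch(batch, i);   // e.g. sheet.getRange(i + 1, 1, batch.length,
                            //        batch[0].length).setValues(batch)
    flush();                // e.g. SpreadsheetApp.flush()
  }
}
```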