tl;dr: My custom-function cells all go Loading..., then start filling in results, but they get stuck. A reload immediately shows the completed results. I don't want to reload every time I change a value.
In my sheet almost all cells are interconnected, so changing one cell triggers a recalc. Each row is a year, each column is an identical formula across all rows. Each cell can refer to earlier columns in its row or anything on the prior row, plus a few absolute locations containing the "inputs". Changing an input triggers a recalc of everything.
The column with my custom function will go Loading..., then one cell at a time it returns the value, almost one second per cell. OK, slow, but fine. But often it just stops completely partway down. Sometimes it starts up again, but often it never does.
But the cells WERE all recalculated. My custom function was called and returned values promptly, execution time of 0.125 secs or less usually. If I reload in the browser, I immediately get the fully recalculated sheet. It looks like some link is being severed between Sheets in my browser and Google's servers, so I stop seeing updates.
This is the first time I've ever used Apps Script -- or JavaScript for that matter -- but I have been programming in other ways for decades. If it matters, the custom function is purely mathematical, calling no services except Math, not even Spreadsheet, getting everything it needs in its arguments, using a few functions in the same file. It's a detailed taxes calculator.
I'm using a recent Chromebook.
To avoid having to reload your spreadsheet, you will most likely need to follow the guidelines for custom function optimization:
Summary: instead of using one formula to calculate a single value, use your formula to calculate multiple values and return them as an array.
The above will reduce the number of formulas and improve your spreadsheet's performance, but bear in mind that custom functions have a maximum execution time of 30 seconds.
Also, reducing the size of the data ranges in your spreadsheet, and trimming blank rows and columns at the bottom and right of your data, will help improve performance.
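For example, instead of one custom-function call per cell, a single call can take the whole input range and return a two-dimensional array that Sheets spills down the column. A minimal sketch (the function name and the tax math are hypothetical stand-ins for the asker's real calculation):

```javascript
/**
 * Hypothetical batch version of a per-cell custom function.
 * Called from the sheet as =TAXBATCH(A2:A100); Sheets passes the
 * range in as a 2-D array and spills the returned 2-D array back,
 * so the function runs once instead of once per cell.
 */
function TAXBATCH(incomes) {
  return incomes.map(function (row) {
    return row.map(function (income) {
      // stand-in for the real tax math
      return income * 0.25;
    });
  });
}
```

With this shape, one Loading... placeholder and one round-trip to the server replace one per cell.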
Related
How to use a custom function with an ArrayFormula
Room For Optimization?
I am running setValues() for a sheet with a massive data count (20,000 * 50). I built the data in a batch and then used setValues() to update the sheet all at once, but the values appear in the sheet sequentially, in chunks that look like (x * 50). I would like to know the limit on the maximum number of cells that can be updated at once with this function.
It seems that when values are written to the sheet with setValues, the cells are first reserved and then the values are written into them. When I previously tested the maximum number of cells that can be written with setValues, 2,625,000 cells could be written, but at over 2,635,000 cells an "Exceeded maximum execution time" error occurred. So at that time I concluded that the maximum was between 2,625,000 and 2,635,000 cells. Incidentally, when I tested 4,000,000 cells, the 4,000,000 cells were reserved in the destination sheet but the values could not be written. This is the result in my environment.
So, in the case of 20,000 * 50, I expected that the 1,000,000 cells could be written with setValues. But I suspect that writing that much data with setValues has a high process cost. For example, after the script finished and I checked the sheet, I had to wait for the values to finish appearing. I think this might be the reason for your issue.
In this case, when the Sheets API is used instead of the Spreadsheet service (SpreadsheetApp) for writing the values, this issue can be avoided. Ref
Given the above, as a workaround for your situation, how about writing the values using the Sheets API instead of setValues of the Spreadsheet service?
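A sketch of that workaround, assuming the Advanced Sheets service ("Sheets") is enabled under Services in the Apps Script editor; the sheet name and range below are hypothetical:

```javascript
// Writes a 2-D array of values in one call via the Sheets API
// rather than Range.setValues(). Requires the Advanced Sheets
// service to be enabled for the script.
function putValuesWithSheetsApi(values) {
  var spreadsheetId = SpreadsheetApp.getActiveSpreadsheet().getId();
  Sheets.Spreadsheets.Values.update(
    { values: values },           // request body
    spreadsheetId,
    "Sheet1!A1",                  // top-left cell of the destination
    { valueInputOption: "RAW" }   // write values as-is, no parsing
  );
}

// Pure helper to build a rows x cols block of sample values;
// runnable anywhere, used here only to generate test data.
function buildValues(rows, cols) {
  var out = [];
  for (var r = 0; r < rows; r++) {
    var row = [];
    for (var c = 0; c < cols; c++) row.push(r * cols + c);
    out.push(row);
  }
  return out;
}
```

For the 20,000 * 50 case, putValuesWithSheetsApi(buildValues(20000, 50)) would write the whole block in one API request.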
Reference:
Benchmark: Reading and Writing Spreadsheet using Google Apps Script
There is no direct maximum for setValues(), but you might want to consider the following three points:
There is a (very high) maximum size for an array in JavaScript.
There is a maximum number of columns, cells, and rows in a Google spreadsheet.
There is a maximum script runtime, and your function might time out before setValues() finishes executing.
Of those three limits, the number of values you can set with one setValues() call is most likely to be bounded by the maximum script execution time.
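One common way to stay under the execution-time limit is to split a large write into row chunks. A sketch, with the chunking itself as a pure helper (the chunk size of 5000 rows is an arbitrary illustration, not a documented limit):

```javascript
// Pure helper: split a 2-D array into slices of at most `size` rows.
function chunkRows(values, size) {
  var chunks = [];
  for (var i = 0; i < values.length; i += size) {
    chunks.push(values.slice(i, i + size));
  }
  return chunks;
}

// Apps Script side: write each chunk with its own setValues() call,
// offsetting the target range by the rows already written.
function writeInChunks(sheet, values) {
  var written = 0;
  chunkRows(values, 5000).forEach(function (chunk) {
    sheet.getRange(1 + written, 1, chunk.length, chunk[0].length)
         .setValues(chunk);
    written += chunk.length;
  });
}
```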
I have an Excel spreadsheet that uses a numbered scale to estimate the level of effort associated with each task. I need to calculate a running sum of those items until the sum exceeds a user-defined number, excluding the item that pushes it over. I then need to total the number of rows included in the sum up to that point, identify the last included item (probably by highlighting its row), and then start the operation again for the remaining items below. This should continue until all items in the spreadsheet have been processed.
I know the question is somewhat confusing. The level of effort is used to determine the amount of work that can be completed in a production cycle. That total is used to estimate how many items can be completed in upcoming production cycles, hence the user-defined amount, since this number changes each cycle. The reason I need to identify how many items are included, and to highlight the row, is so we can determine "milestones" and tell our customer: by the next milestone we can get you all of the following items.
I honestly have no idea where to even start with this. I know I will most likely need to do this in VBA, which is fine; I am very comfortable with VBA. I suppose I need to use it as an on-demand function, through a button click or something, that first generates a database of the items and then runs SQL queries to get me where I need to be. Any help anyone can provide would be vastly appreciated, and thanks in advance to anyone with ideas.
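The grouping itself is a single forward scan, which may be easier to see before translating it to VBA. A language-neutral sketch (shown here in JavaScript; the names are made up):

```javascript
// Partition a list of effort scores into milestones: accumulate until
// adding the next item would exceed `capacity`, then close the group.
// The item that would overflow starts the next group instead.
function milestones(efforts, capacity) {
  var groups = [];
  var current = [];
  var sum = 0;
  efforts.forEach(function (effort, i) {
    if (sum + effort > capacity && current.length > 0) {
      groups.push(current);   // close this milestone
      current = [];
      sum = 0;
    }
    current.push(i);          // store the row index
    sum += effort;
  });
  if (current.length > 0) groups.push(current);
  return groups;  // each group's last index = the row to highlight
}
```

Each group's length is the count of items in that milestone, and its last index is the row that would get the highlight.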
I have a spreadsheet with a matrix set up to count how many times a student has had a lesson with a particular tutor.
The matrix works fine with this formula:
=ARRAYFORMULA(SUM(IF(TERM4!$B$6:$B$2398=B$1,IF(TERM4!$C$6:$C$2398=$A2,1,IF(TERM4!$D$6:$D$2398=$A2,1,FALSE()))),FALSE()))
however due to the number of students/tutors the matrix is 7000 cells, slowing the working sheet down considerably.
Is there a better way to do this? Can I run a Google Apps Script on a trigger (e.g. once a week) to count the matrix, so the formulas are not slowing the sheet down?
I would also like the formula to return a blank rather than a 0 if the result is FALSE.
Thanks for your help!
Yes, it's possible to do this with Apps Script. The only part that gets a little complex is that if your script runs over 5 min it won't process all the rows. To avoid that, process the data in chunks (say 100 rows at a time) and use script properties to remember which spreadsheet and row you last processed. Each trigger run will then process as much as it can until all spreadsheets and rows are done.
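A sketch of that chunked, resumable pattern, assuming a time-driven trigger calls countMatrixChunk; the sheet name, column layout, and the counting step itself are placeholders for the asker's real matrix:

```javascript
// Pure helper: how many rows the next run should read.
function nextChunkSize(start, lastRow, chunk) {
  return Math.max(0, Math.min(chunk, lastRow - start));
}

// Processes up to CHUNK rows per trigger run, remembering the last
// processed row in script properties so the next run resumes there.
var CHUNK = 100;

function countMatrixChunk() {
  var props = PropertiesService.getScriptProperties();
  var start = Number(props.getProperty('lastRow') || 0);
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('TERM4');
  var size = nextChunkSize(start, sheet.getLastRow(), CHUNK);
  if (size === 0) { props.deleteProperty('lastRow'); return; }  // all done
  var rows = sheet.getRange(start + 1, 2, size, 3).getValues(); // cols B:D
  // ... tally tutor/student matches from `rows` into the matrix here ...
  props.setProperty('lastRow', String(start + size));
}
```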
We are trying to get the last N changes for a user, and currently do so by getting the largestChangeId, then subtracting a constant from that and getting more changes.
As an example, we typically are making API calls with the changestamp = largestChangeId - 300, with maxResults set to 300.
We've seen anywhere from half a dozen to 180 changes come back across our user base with these parameters.
One issue we're running into is that the number of changes we get back is rather unpredictable, with huge jumps in change stamps for some users, so we have to choose between two rather unpalatable scenarios to get the last N changes:
Request lots of changes, which can lead to slow API calls simply because there are lots of changes.
Request a small set of changes and seek back progressively in smaller batches, which is also slow because it results in multiple API calls.
Our goal is to get the last ~30 or so changes for a user as fast as possible.
As a workaround, we are currently maintaining per-user state in our application to tune the maximum number of changes we request up or down, based on the results we got for that user the last time around. However, this is somewhat fragile because the rate at which changes accrue for a user can vary over time.
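The tuning itself can be as simple as a multiplicative adjustment to the change-stamp window. A purely illustrative sketch of the kind of per-user state described above (the thresholds and names are made up, not anything from the Drive API):

```javascript
// Adjust the change-stamp window for a user based on how many changes
// the previous request actually returned, aiming for roughly `target`.
function tuneWindow(window, returned, target) {
  if (returned < target / 2) return window * 2;             // too few: widen
  if (returned > target * 4) return Math.ceil(window / 2);  // far too many: shrink
  return window;                                            // close enough
}
```

The next request would then use changestamp = largestChangeId - tuneWindow(...).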
So my question is as follows:
Is there a way to efficiently get the last N changes for a user, specifically in one API call?
ID generation is very complex; it's impossible to calculate the ID of the user's nth-latest change :) The changes list actually has no feature appropriate for your use case. In my own personal opinion, the changes list should support reverse chronological order; I'm going to discuss it with the rest of the team.
I'm hoping you can point me in the right direction.
I'm trying to generate a control chart (http://en.wikipedia.org/wiki/Control_chart) using SQL Server 2008. Creating a basic control chart is easy enough. I'd just calculate the mean and standard deviations and then plot them.
The complex bit (for me at least) is that I would like the chart to reset the mean and the control limits when a step change is identified.
Currently I'm only interested in a really simple method of identifying a step change, 5 points appearing consecutively above or below the mean. There are more complex ways of identifying them (http://en.wikipedia.org/wiki/Western_Electric_rules) but I just want to get this off the ground first.
The process I have sort of worked out is:
Aggregate and order by month and year, apply row numbers.
Calculate overall mean
Identify if each data item is higher, lower or the same as the mean, tag with +1, -1 or 0.
Identify when there are 5 consecutive data items above or below the mean (currently using a cursor).
Recalculate the mean if 5 points are above or 5 points are below the mean.
Repeat until end of table.
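The core of steps 3 and 4, tagging each point against the mean and spotting five consecutive same-side points, is easy to validate outside the database first. A sketch in JavaScript (illustrative only; the question targets T-SQL):

```javascript
// Returns the index at which a run of 5 consecutive points on the
// same side of `mean` completes, or -1 if there is no such run.
// Points exactly equal to the mean (tag 0) reset the run.
function findStepChange(values, mean) {
  var run = 0, side = 0;
  for (var i = 0; i < values.length; i++) {
    var s = values[i] > mean ? 1 : values[i] < mean ? -1 : 0;
    run = (s !== 0 && s === side) ? run + 1 : (s !== 0 ? 1 : 0);
    side = s;
    if (run === 5) return i;  // mean would be recalculated from here
  }
  return -1;
}
```

On a step change, the mean is recalculated from the returned index onward and the scan repeats, per steps 5 and 6.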
Is this sort of process possible in SQL Server? It feels like I may need a recursive UDF, but recursion is a bit beyond me!
A nudge in the right direction would be much appreciated!
Cheers
OK, I ended up just using WHILE loops to iterate through. I won't post the full code, but the steps were:
Set up a user defined table data type in order to pass data into a stored procedure parameter.
Wrote an accompanying stored procedure that uses row numbers and WHILE loops to iterate over each data value in the input table, then uses the current row number to do set-based processing on a subset of the input data (to check whether the following 5 points are above/below the mean, and to recalculate the mean and standard deviations when that flag is tripped).
Output a table with the original values, row numbers, months, mean values, and the upper and lower control limits.
I've also got one up and running that works based on full Nelson rules and will also state which test the data has failed.
Currently it's only used by me while I develop it further, so I've set up an Excel sheet with some VBA that dynamically constructs a SQL string and passes it to a pivot table as the command text. That way you can repeatedly ping the USP with different data sets and also change a few of the other parameters controlling how the procedure runs (such as adjusting the control limits and the like).
Ultimately I want to be able to pass the resulting data to Business Objects reports and dashboards that we're working on.