I want to store a large number of key-value pairs in Google Apps Script, which can be accessed by the script later on. The problem is I have the data in a text file (in a JavaScript array format).
words[1]=new Array("String","String");
I have 975 such words.
Any ideas how to proceed? I would be grateful if relevant code (if any) is provided.
Build your array, then use setValues():
// Build the whole array in memory first, then write it in one call.
var cronicaArr = [];
for (var i = 0; i < data.length; i++) {
  cronicaArr.push(["string1", "string2"]); // each row is a [key, value] pair
}
// The range must match the array: starting at row 2, cronicaArr.length rows, 2 columns.
myActiveSheet.getRange(2, 1, cronicaArr.length, 2).setValues(cronicaArr);
The GAS documentation has an entire section dedicated to data storage where your question as well as many more will be answered. Please read through the 'Storing Data' section.
To answer your question specifically, you can use any of ScriptProperties, ScriptDB or a spreadsheet to store your data.
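For example, ScriptProperties handles plain string pairs with a simple get/set interface (a minimal sketch; the key and value here are placeholders):

// Store and retrieve a simple string key-value pair.
ScriptProperties.setProperty('someKey', 'someValue');
var stored = ScriptProperties.getProperty('someKey'); // 'someValue'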
Miturbe's answer only achieves saving the data. I assume you also want to query it (get a value from a key).
For that, store it in ScriptDB. Beware of size limits (which depend on what type of Google account the script runs on), and back up the db to a sheet regularly.
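A minimal sketch of what that could look like (the field names key and value are my own choice):

function saveWord(key, value) {
  var db = ScriptDb.getMyDb();
  db.save({key: key, value: value}); // one record per pair
}

function lookupWord(key) {
  var db = ScriptDb.getMyDb();
  var res = db.query({key: key}); // matches records whose 'key' field equals key
  return res.hasNext() ? res.next().value : null;
}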
Related
I am using a Google Spreadsheet to collaborate on some common data, processing this data using spreadsheet macros and saving it there in cells. This approach is error prone, as the macro functions that process the data depend on inputs apart from what is given as parameters.
Example of how the common data looks:
Content Sheet
Sections Sheet
Pages Sheet
Explanation of this common data
The three sheets are filled by various collaborators.
The Content sheet defines the base elements of a page; they are referenced (by their UUIDs) in the Sections sheet (using macros), and finally all sections add together to make a publishable HTML page.
The output HTML format varies depending upon the destination, of which there are several: static HTML, Google Docs, Google Slides, WordPress, MediaWiki, etc. The formatting is done using GAS macros.
I tried various solutions, but nothing is working properly. Finally I have thought to keep the Google Spreadsheet as a data source only and move the entire computation to my local machine. This means I have to download the data of the three sheets, which is some 1,000 rows but can become voluminous with time.
Earlier, when the entire computation was on Google Spreadsheets, I had to fetch only the final data and use it, which amounted to a lot fewer API calls. Referring to the example above, it means I would fetch only the output HTML of the Pages sheet.
Q1) So my question is, given that I plan to move the entire computation to my local machine: if I make only 3 API calls to bulk-fetch the data of the three sheets and pull the entire data, does it still count as just 3 API calls, or do big requests have a different API cost? Are there any hidden risks of over-running the quota in this approach?
Q2) And let's say I use hooks, both spreadsheet hooks and Drive hooks, to sync with real-time changes; what are my risks of running out of quota? Does each hook call constitute an API call? Any general advice on best practices for doing this?
Thanks
As the documentation says, by default the Google Sheets API has a limit of 500 requests per 100 seconds per project and 100 requests per 100 seconds per user, unless you change it on your project.
A1) If you make a single request, it counts as 1 request no matter how large the data is. For that reason I'd make a single request to GET the entire spreadsheet, rather than making 3 API calls, and then do the whole processing you mentioned before.
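For example, here is a sketch using the Node googleapis client (assuming an already-authorized auth object; the spreadsheet ID is a placeholder and the range names are the sheets from your example). One values.batchGet request pulls all three sheets and counts as a single call:

const {google} = require('googleapis');

async function fetchAllSheets(auth) {
  const sheets = google.sheets({version: 'v4', auth});
  // One batchGet covering all three sheets is a single API request.
  const res = await sheets.spreadsheets.values.batchGet({
    spreadsheetId: 'SPREADSHEET_ID',
    ranges: ['Content', 'Sections', 'Pages'], // whole-sheet ranges
  });
  return res.data.valueRanges.map(function(r) { return r.values; });
}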
A2) If you've considered using push notifications, keep in mind that you will need some extra configuration, such as a domain.
As a workaround, I recommend using Drive File Stream.
Since you want to move the computation to your local machine, this option would be better, but it will need some changes. For example, you could change the output to CSV in order to handle the data more easily (it is still a valid Google Sheets file type), and then host your app on the same machine or make your local machine accessible.
I've been promoted into developing & maintaining a Google Sheets database at work. I know very little about Google Sheets scripting, and from asking around and researching, it looks like GAS is probably the avenue I need to start heading down.
So we have 3 workbooks in Google Sheets; 2 contain large amounts of data, and the other workbook provides a UI for our sales dept. to access the data. I really wish I could share these with you, as describing them is difficult. In the UI workbook, separate pages are paired with sheets in the one database (let's call it database A).
A salesman will go to the UI sheet for the product he's selling a device for; the top section of the sheet allows him to select, essentially, a row from database A. After the selection is made, the rest of the sheet is populated with products we manufacture that work with the choice made in the top section; the products we make are stored in the other database ("B"). We have to have two databases, as we earlier hit the cell limit in Sheets with the two combined.
On average each UI page has about 150 IMPORTRANGEs. Lookups are done with QUERY.
Our problem is that this UI is getting pretty slow; the initial load time makes it worthless for salesmen on the road and annoying to the salesmen here in the office. The delay when making the initial selections (querying database A) is usable, but still much slower than we'd like. And we're not finished building UI pages.
I've seen online that most people recommend using Apps Script to replace IMPORTRANGE, but knowing nothing about Apps Script, I haven't been able to understand what is being done, or mainly how to take the Apps Script and actually put the data in the cells.
So I'd appreciate any help I could get in speeding this up.
First let me say that the Google Apps Script documentation has improved greatly over the years, and I find it pretty easy to use now. If you open up a code editor in Google Sheets, go to the Help menu and select API reference, that links you up to just about everything you need to know. If you go to the Google Apps Script reference for spreadsheets and look at the SpreadsheetApp object, you'll see that there are three commands to open up another spreadsheet (not a sheet but a Spreadsheet): by file, by ID, or by URL.
If you click on the Url command it will take you to an example like this:
// The code below opens a spreadsheet using its URL and logs the name for it.
// Note that the spreadsheet is NOT physically opened on the client side.
// It is opened on the server only (for modification by the script).
var ss = SpreadsheetApp.openByUrl(
'https://docs.google.com/spreadsheets/d/abc1234567/edit');
Logger.log(ss.getName());
As it points out, it doesn't actually open the file on the client side; it just opens it up on the server. So it may be necessary for you to open them up manually at first, just to get an idea of what they look like. Once you know how they are organized, you can use the open command to get a Spreadsheet object, from it select a specific sheet, and then a data range. Once you have a range, you can load an array like this:
var myArray = rng.getValues();
This will load the entire range in one fell swoop into a JavaScript array, and of course it would be nice if you can filter out unwanted data from the array and then put it into your current sheet at a desired range. Note that the range sizes have to be exact matches, and also realize that ranges start from 1 while arrays start from 0, so that can cause you some grief. Also let me add a few caveats that I've run into.
If you're dealing with a one-row or one-column range array, then you have to get the arrays in the correct form. I tried writing them here but the Stack Overflow text converter keeps messing them up, so I'd recommend you go to my reference on that issue here.
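For what it's worth, a quick sketch of the shapes involved (sheet stands for any Sheet object):

// A one-row range comes back as a single inner array:
var row = sheet.getRange(1, 1, 1, 3).getValues(); // [[a, b, c]]
// A one-column range comes back as one inner array per row:
var col = sheet.getRange(1, 1, 3, 1).getValues(); // [[a], [b], [c]]
// So to write a flat array into a column, wrap each element:
var flat = ['a', 'b', 'c'];
var colData = flat.map(function(v) { return [v]; });
sheet.getRange(1, 1, colData.length, 1).setValues(colData);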
If you've coded in JavaScript in the past I'm guessing that you'll have no problem coming up to speed with Google Apps Scripting with the new documentation and an occasional visit to Stack Overflow for a question or two. I've gotten some great answers here from other users. If you need a JavaScript reference here's one that I use.
You are probably best off using a WebApp served from Google Apps Script for the UI, which I'd be happy to help with if you had some sample data. If you wanted to still use the sheets, then you could replace the IMPORTRANGEs with a Google Apps Script function that runs every 10 minutes or so to keep the UI sheet updated. It should speed up load times. Something like this would work for you:
function importSheetA() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  // Open the source workbook and pull its entire data range in one read.
  var database = SpreadsheetApp.openByUrl("DATABASE_A_URL");
  var dataToCopy = database.getSheetByName("DATABASE_A_SHEET_NAME").getDataRange().getValues();
  // Clear the UI sheet and write the data back in one call.
  var copyToSheet = ss.getSheetByName("UI_SHEET_NAME");
  copyToSheet.clearContents();
  copyToSheet.getRange(1, 1, dataToCopy.length, dataToCopy[0].length).setValues(dataToCopy);
}
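To run it every 10 minutes, you could install a time-driven trigger with a one-time setup function like this:

function createImportTrigger() {
  // Run importSheetA every 10 minutes.
  ScriptApp.newTrigger('importSheetA')
      .timeBased()
      .everyMinutes(10)
      .create();
}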
I am totally new to Google Apps Script and this may be a very poor question.
I am making a basic setup using Google Forms and Google Apps Script.
From the responses of the form, I change my content in a Google Spreadsheet accordingly.
For example, my query from the form needed 10,000 records to be selected and produced in a whole other spreadsheet.
I just wanted to know: is there some kind of delay introduced when I set and get values of any cell of a spreadsheet on such a large scale? If so, what does it depend on, and how as a programmer can I remove or optimize it?
Thanks in advance!
The Best Practices article by Google is the primary reference for this. The most important advice is to minimize the number of calls to spreadsheet methods by batching operations. So, select a range that contains all the records you need, get them all at once with getValues, process without further interaction with the spreadsheet, and output the result using setValues.
If you follow this advice, 10000 records is still a reasonable amount of data to process by a script.
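A minimal sketch of that pattern (the sheet names, target spreadsheet ID, and selection criterion are placeholders):

function copySelectedRecords() {
  var source = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Responses');
  var target = SpreadsheetApp.openById('TARGET_SPREADSHEET_ID').getSheetByName('Output');
  var data = source.getDataRange().getValues(); // one bulk read
  var selected = data.filter(function(row) {    // process in memory, no per-cell calls
    return row[0] !== '';
  });
  if (selected.length > 0) {                    // one bulk write
    target.getRange(1, 1, selected.length, selected[0].length).setValues(selected);
  }
}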
When using Google Spreadsheets, if you use validation on a cell based on a range of values, you get a pretty nice autocompletion feature that makes data entry much nicer.
My most common application is an inventory-like situation, where I reference inventory items through some kind of hash code or part number. My frequently used hashes are committed to my brain, but when I need a new part or a variation on an old one, I want a little help making sure I have the correct part# selected.
I always find that I want additional row context with my autocompletion, so now I think I want to make a sidebar add-on that has smarter searching rules and also includes more contextual data to ensure that I have the part# I meant. Once I am sure of the part#, one button can push the selected result over to the currently active row.
This solution is a bit "heavier" than data validation, but it does exactly what I want.
Assuming that my inventory source is another spreadsheet, what is a good way to set up my Addon-Script Project?
I was thinking that my sidebar would call an HtmlService function that utilizes Cache Service to hold my "hash list" and a few bits of context in memory. I don't think I am looking at a heavy jQuery solution (only to build the autocomplete dialog as I type), but that is really the whole purpose of this question!
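Roughly, I imagine the caching side looking something like this (the spreadsheet ID, sheet name, and cache key are placeholders):

function getHashList() {
  var cache = CacheService.getScriptCache();
  var cached = cache.get('hashList');
  if (cached) return JSON.parse(cached); // serve from cache when possible
  var data = SpreadsheetApp.openById('INVENTORY_SPREADSHEET_ID')
      .getSheetByName('Parts')
      .getDataRange()
      .getValues();
  // Cached values are strings (about 100KB max each); 21600s (6 hours) is the longest lifetime.
  cache.put('hashList', JSON.stringify(data), 21600);
  return data;
}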
Any thoughts on a high-level project overview? I am fairly new to Apps Script in general, especially since the newer APIs have been coming out since 2013.
I did exactly that with my budget sheets: I moved from data validation to jQuery's Autocomplete in a sidebar when the number of compositions jumped from 500 to 2,500, and it is a LOT faster; the search is faster than data validation with 100 items. The logic I use:
Database:
Its base data is in a spreadsheet; each time it is updated, there's an onEdit function that will schedule a DB update several minutes later, so that the function won't run unnecessarily several times consecutively for the same edit.
The DB is then stored as plain text in JSON format on Google Drive. It is a 2MB file generated from the spreadsheet data using DriveApp.getFileById(id).setContent(JSON.stringify(myJsonDataFromSpreadsheet)); the file generation and saving takes up to 30 seconds, and reading the file takes around 4 seconds.
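The deferred update can be done by (re)scheduling a one-off time-based trigger from an installable onEdit trigger (a simple onEdit can't create triggers). A sketch, with rebuildDb standing in for the actual update function:

function onEditInstallable(e) {
  // Drop any pending rebuild so repeated edits leave only one scheduled run.
  ScriptApp.getProjectTriggers().forEach(function(t) {
    if (t.getHandlerFunction() === 'rebuildDb') ScriptApp.deleteTrigger(t);
  });
  // Schedule the rebuild a few minutes out.
  ScriptApp.newTrigger('rebuildDb').timeBased().after(5 * 60 * 1000).create();
}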
Sidebar:
Build a normal HTML file (remember to use the IFRAME option) and serve it; this is all in the docs. You'll have to send data from the HTML to Google Script (e.g. the part# to insert) via google.script.run, and get data back (e.g. the file with all the part numbers) with withSuccessHandler in conjunction with google.script.run; see the sketch after the reference list below.
Main function references:
https://developers.google.com/apps-script/guides/html/reference/run
https://developers.google.com/apps-script/guides/dialogs#custom_sidebars
https://developers.google.com/apps-script/reference/drive/file -> first get the file with DriveApp.getFileById(id).
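A sketch of the sidebar side of that round trip (getPartList and insertIntoActiveRow are hypothetical server-side functions):

<script>
  // Ask the server for the part list; the success handler receives the return value.
  google.script.run
      .withSuccessHandler(function(parts) {
        // ...build the autocomplete source from parts...
      })
      .getPartList();

  // Push the chosen part# back to the server.
  function insertPart(partNumber) {
    google.script.run.insertIntoActiveRow(partNumber);
  }
</script>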
If I have a mission-critical db that needs to be regularly backed up, and I store it as a ScriptDb in GAS, is there any way to back up the actual database file? It seems the db is embedded in a way that makes it invisible outside of scripts?
Well, you can always query all your values and JSON.stringify them.
If you ever need to restore a database from this, the only difference I can notice is that each item's id will change.
Here is an example:
function backupDB() {
  var db = ScriptDb.getMyDb();
  var res = db.query({}); // an empty query matches every record
  var array = [];
  while (res.hasNext()) {
    array.push(res.next().toJson()); // serialize each record
  }
  var dbString = JSON.stringify(array);
  Logger.log(dbString); // you'll obviously save this string somewhere else, e.g. as a docs file
}
You may also need to do this in chunks, as your db may have too much data for the script to handle at once, like this.
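Restoring from the string would then be roughly the reverse (a sketch; as noted above, items come back with new ids):

function restoreDB(dbString) {
  var db = ScriptDb.getMyDb();
  var array = JSON.parse(dbString);
  for (var i = 0; i < array.length; i++) {
    // Each entry was produced by toJson(); depending on how it serializes
    // the old id, you may want to strip that field before saving.
    db.save(JSON.parse(array[i]));
  }
}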
I also feel that this "backup" procedure should be taken care of by the API as well. The code above is just an idea I had.
I think I found a decent solution for my question above, in an unexpected place. Rather than use a ScriptDb, I can use Google Fusion Tables: these have SQL-type access, are concrete docs that can be exported, backed up, viewed, etc., and can act as the data store for my app...
The actual answer is: you do not store mission-critical data in a ScriptDb, for many reasons:
Apps Script does not have an SLA. Google has many other storage options that do have guarantees.
Because that db does not support transactions, you cannot guarantee that a batch process won't process the same data twice (in cases where the script fails in the middle of a chunked backup or restore).
It will get complex if you store the ids inside other objects in the db.
Maybe you can copy your critical data from the ScriptDb to a Google Spreadsheet. Given it's an example on Google Developers, I think it is an interesting option.
Here is the link: Copy a Database to a New Sheet in a Spreadsheet.