If I have a mission critical db, that needs to be regularly backed up, and I store it as a scriptdb in GAS, is there any way to back up the actual database file? It seems the db is embedded in a way that makes it invisible outside of scripts?
Well, you can always query all your values and JSON.stringify them.
If you ever need to restore a database from this backup, the only difference I can notice is that each item ID will change.
Here is an example:
function backupDB() {
  var db = ScriptDb.getMyDb();
  var res = db.query({}); // an empty query matches every item in the db
  var array = [];
  while (res.hasNext()) {
    array.push(res.next().toJson());
  }
  var dbString = JSON.stringify(array);
  Logger.log(dbString); // you'll obviously save this string somewhere else, e.g. as a Docs file
}
You may also need to do this in chunks, as your db may have too much data for the script to handle at once, like the sketch below.
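For example, here is a minimal sketch of a chunked backup, assuming each chunk is saved as its own plain-text file in Drive (the chunk size of 500 and the file naming are arbitrary choices of mine, not part of any API):

function backupDBInChunks() {
  var CHUNK_SIZE = 500; // arbitrary; tune to what your script can handle
  var db = ScriptDb.getMyDb();
  var res = db.query({}); // an empty query matches every item
  var chunk = [];
  var part = 0;
  while (res.hasNext()) {
    chunk.push(res.next().toJson());
    if (chunk.length === CHUNK_SIZE || !res.hasNext()) {
      // Save the current chunk as its own text file in Drive.
      DriveApp.createFile('scriptdb-backup-part-' + part + '.json',
                          JSON.stringify(chunk));
      chunk = [];
      part++;
    }
  }
}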
I also feel that this "backup" procedure should be taken care of by the API itself. The code above is just an idea I had.
I think I found a decent solution to my question above, in an unexpected place. Rather than use a ScriptDb, I can use Google Fusion Tables - these offer SQL-style access, are concrete documents that can be exported, backed up, viewed, etc., and can act as the data store for my app...
The actual answer is: you do not store mission-critical data in a ScriptDb, for several reasons:
Apps Script does not have an SLA. Google has many other storage services that do come with guarantees.
Because ScriptDb does not support transactions, you cannot guarantee that a batch process won't process the same data twice (for example, if the script fails in the middle of a chunked backup or restore).
It gets complex if you store item IDs inside other objects in the db.
Maybe you can copy your critical data from the ScriptDb to a Google Spreadsheet. Given that it's an example in the Google Developers documentation, I think it is an interesting option.
Here is the link: Copy a Database to a New Sheet in a Spreadsheet.
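The gist of that example is something like this - a rough sketch, assuming every item in the db shares the same set of keys (the key names and sheet name here are placeholders, not from the linked example):

function copyDbToSheet() {
  var db = ScriptDb.getMyDb();
  var res = db.query({}); // an empty query matches every item
  var keys = ['partNumber', 'description']; // hypothetical; use your own field names
  var rows = [keys]; // header row
  while (res.hasNext()) {
    var item = res.next();
    rows.push(keys.map(function(k) { return item[k]; }));
  }
  var sheet = SpreadsheetApp.getActiveSpreadsheet().insertSheet('DB backup');
  sheet.getRange(1, 1, rows.length, keys.length).setValues(rows);
}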
Related
I'm managing a student registration system based on Google Docs. Students enter their registration data in a Google Form; the results are gathered in a Google spreadsheet ("Préinscriptions") that uses additional sheets to normalize the data. The data is then imported by another spreadsheet ("Inscriptions") to handle the registrations.
In two normalization sheets in the Google spreadsheet, I use custom Apps Script functions to normalize the data from the response sheets. The functions are used in a vertical concatenation array to gather normalized data from several response sheets into the normalization sheets. The formulas look like this:
On tab "Nouveaux":
={normalizeNouveau("FR", 'FR - Nouveau'!$A$1:$AB);normalizeNouveau("EN", 'EN - Nouveau'!$A$1:$AB); normalizeNouveau("ES", 'ES - Nouveau'!$A$1:$AB)}
On tab "Anciens":
={normalizeAncien("FR", 'FR - Ancien'!$A$1:$AB); normalizeAncien("EN", 'EN - Ancien'!$A$1:$AB); normalizeAncien("ES", 'ES - Ancien'!$A$1:$AB)}
normalizeNouveau and normalizeAncien are the custom functions that normalize the data line-wise. "FR - Nouveau", "EN - Nouveau" ... "ES - Ancien" are the response sheets.
Now the problem is that, while this all globally works great, the formulas that gather the normalized data, on both the "Anciens" and "Nouveaux" sheets, sometimes get "disconnected" (for lack of a better word) and display error messages -- either a single one or three, corresponding, I expect, to the three blocks of data produced by the function calls. When that happens (usually several times a day), what I need to do is (simply :-| ) cut the formulas from their original cells, validate the change, then paste them back. The data then reappears normally.
(Screenshots omitted: the formula errors when the problem happens, and the restored data after cut-and-pasting.)
That is very problematic because, as I said, another file ("Inscriptions") uses that data to handle many aspects of the registration process, and when this happens, that whole file becomes empty and useless. It can even lead to errors if the problem occurs while I'm running an Apps Script function, because the data is then effectively absent. So when that happens in "Inscriptions", I need to go back to "Préinscriptions" to perform the little trick I just described. Doing it myself is annoying enough, but other people might need to use that file too, and they would simply get stuck or, worse, break something if the data disappeared without warning.
Sometimes the connection seems to break between "Préinscriptions" and "Inscriptions", which loads the data from yet a third tab where "Nouveaux" and "Anciens" get merged. The same trick applies, although it may take several attempts before the data shows again (screenshot omitted).
Can anyone tell me why this happens and whether there's anything I can do to prevent it? I rely heavily on a stable dataflow between the forms, the response sheets, and the registration spreadsheet for the system to work, and this breaks the workflow far too often to be negligible.
I've been promoted into developing & maintaining a Google Sheets database at work. I know very little about Google Sheets scripting, and from asking around and researching, it's looking like GAS is probably the avenue I need to start heading down.
So we have 3 workbooks in Google Sheets; 2 contain large amounts of data, and the other provides a UI for our sales dept. to access the data. I really wish I could share these with you, as describing them is difficult. In the UI workbook, separate pages are paired with sheets in one of the databases (let's call it database A).
A salesman will go to the UI sheet for the product he's selling a device for; the top section of the sheet allows him to select, essentially, a row from database A. After the selection is made, the rest of the sheet is populated with products we manufacture that work with the choice made in the top section; the products we make are stored in the other database ("B"). We have to have two databases, as we've earlier hit the cell-limit in sheets with the two databases combined.
On average, each UI page has about 150 IMPORTRANGEs. Lookups are done with QUERY.
Our problem is that this UI is getting pretty slow; the initial load time makes it worthless for salesmen on the road, and annoying for the salesmen here in the office. The delay when making the initial selections (querying database A) is usable, but still much slower than we'd like. And we're not finished building UI pages.
I've seen online that most people recommend using Apps Script to replace IMPORTRANGE, but knowing nothing about Apps Script, I haven't been able to understand what is being done, or mainly how to take the script and actually put the data into the cells.
So I'd appreciate any help I could get in speeding this up.
First, let me say that the Google Apps Script documentation has improved greatly over the years, and I find it pretty easy to use now. If you open the code editor in Google Sheets, go to the Help menu and select API reference, that links you to just about everything you need to know. If you look at the SpreadsheetApp object in the spreadsheet reference, you'll see there are three methods to open another spreadsheet (a Spreadsheet, not a sheet): by file, by ID, or by URL.
If you click on the Url command it will take you to an example like this:
// The code below opens a spreadsheet using its URL and logs its name.
// Note that the spreadsheet is NOT physically opened on the client side.
// It is opened on the server only (for modification by the script).
var ss = SpreadsheetApp.openByUrl(
    'https://docs.google.com/spreadsheets/d/abc1234567/edit');
Logger.log(ss.getName());
As it points out, it doesn't actually open the file on the client side; it just opens it on the server. So it may be necessary for you to open the files manually at first, just to get an idea of what they look like. Once you know how they are organized, you can use the open method to get a Spreadsheet object, select a specific sheet from it, and then get a data range. Once you have a range, you can load an array like this:
var myArray = rng.getValues();
This will load the entire range in one fell swoop into a JavaScript array, and of course it would be nice if you can filter out unwanted data from the array before putting it into your current sheet at a desired range. Note that the range size and array size have to match exactly, and also realize that ranges are 1-based while arrays are 0-based, which can cause you some grief (a small illustration follows the caveat below). Also, let me add a few caveats that I've run into.
If you're dealing with a one-row or one-column range, then you have to get the arrays into the correct form. I tried writing them here, but the Stack Overflow text converter keeps messing them up, so I'd recommend you go to my reference on that issue here.
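To make the 1-based range versus 0-based array offset concrete, here's a tiny sketch (the range coordinates are arbitrary):

var sheet = SpreadsheetApp.getActiveSheet();
var rng = sheet.getRange(2, 3, 5, 2); // rows 2-6, columns C-D (1-based)
var myArray = rng.getValues();        // always a 2-D array: 5 rows x 2 columns
// Cell C2 (row 2, column 3 on the sheet) is myArray[0][0] (0-based):
Logger.log(myArray[0][0]);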
If you've coded in JavaScript in the past I'm guessing that you'll have no problem coming up to speed with Google Apps Scripting with the new documentation and an occasional visit to Stack Overflow for a question or two. I've gotten some great answers here from other users. If you need a JavaScript reference here's one that I use.
You are probably best off using a web app served from Google Apps Script for the UI, which I'd be happy to help with if you had some sample data. If you still wanted to use the sheets, you could replace the IMPORTRANGEs with a Google Apps Script function that runs every 10 minutes or so to keep the UI sheet updated. That should speed up load times. Something like this would work for you:
function importSheetA() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var database = SpreadsheetApp.openByUrl("DATABASE_A_URL");
  // Grab everything from the source sheet in one read...
  var dataToCopy = database.getSheetByName("DATABASE_A_SHEET_NAME").getDataRange().getValues();
  // ...then clear the UI sheet and write it all back in one write.
  var copyToSheet = ss.getSheetByName("UI_SHEET_NAME");
  copyToSheet.clearContents()
             .getRange(1, 1, dataToCopy.length, dataToCopy[0].length)
             .setValues(dataToCopy);
}
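To run it on a schedule, you can install a time-driven trigger once from the editor (the 10-minute interval matches the suggestion above):

function installTrigger() {
  // Re-import database A every 10 minutes.
  ScriptApp.newTrigger('importSheetA')
      .timeBased()
      .everyMinutes(10)
      .create();
}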
When using Google Spreadsheets, and you want to use Validation on a cell based on a range of values, you get a pretty nice autocompletion feature that makes data entry much nicer.
My most common application is an inventory-like situation, where I reference inventory items through some kind of hash code or part number. My frequently used hashes are committed to my brain, but when I need a new part or a variation on an old one, I want a little help making sure I have the correct part# selected.
I always find that I want additional row context with my autocompletion, so now I think I want to make a sidebar addon that has smarter searching rules and also includes more contextual data to ensure that I have the part# I meant. Once I am sure of the part#, one button can push the selected result over to the currently active row.
This solution is a bit "heavier" than data validation, but it does exactly what I want.
Assuming that my inventory source is another spreadsheet, what is a good way to set up my Addon-Script Project?
I was thinking that my sidebar would call an HtmlService function that utilizes Cache Service to hold my "hash list" and a few bits of context in memory. I don't think I am looking at a heavy jQuery solution (only to build the autocomplete dialog as I type), but that is really the whole purpose of this question!
Any thoughts on a high-level project overview? I am fairly new to Apps Script in general, especially since newer APIs have been coming out since 2013.
I did exactly that with my budget sheets: I moved from data validation to jQuery's Autocomplete in a sidebar when the number of compositions jumped from 500 to 2,500, and it is a LOT faster - the search over 2,500 items is faster than data validation was with 100 items. The logic I use:
Database:
Its base data is in a spreadsheet. Each time the data is edited, an onEdit function schedules a DB update several minutes later, so that the update won't run unnecessarily several times in a row for the same burst of edits.
The DB is then stored as plain text in JSON format on Google Drive. It is a 2 MB file generated from the spreadsheet data using DriveApp.getFileById(id).setContent(JSON.stringify(myJsonDataFromSpreadsheet)). Generating and saving the file takes up to 30 seconds; reading it takes around 4 seconds.
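A minimal sketch of that debounce pattern (the function names, sheet name, file ID, and 5-minute delay are my assumptions, not the poster's actual code; note it must be an installable on-edit trigger, since simple triggers can't create other triggers):

function onSheetEdit(e) {
  // Remove any pending update so a burst of edits schedules only one rebuild.
  ScriptApp.getProjectTriggers().forEach(function(t) {
    if (t.getHandlerFunction() === 'rebuildDbFile') ScriptApp.deleteTrigger(t);
  });
  // Rebuild the JSON file 5 minutes from now.
  ScriptApp.newTrigger('rebuildDbFile')
      .timeBased()
      .after(5 * 60 * 1000)
      .create();
}

function rebuildDbFile() {
  var data = SpreadsheetApp.getActiveSpreadsheet()
      .getSheetByName('Data') // hypothetical sheet name
      .getDataRange().getValues();
  DriveApp.getFileById('FILE_ID').setContent(JSON.stringify(data));
}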
Sidebar:
Build a normal HTML file - remember to use the IFRAME sandbox option - and serve it; this is all in the docs. You'll have to send data from the HTML to Apps Script (e.g. the part# to insert) via google.script.run, and get data back (e.g. the file with all the part numbers) with a success handler in conjunction with google.script.run. A minimal sketch follows the references below.
Main function references:
https://developers.google.com/apps-script/guides/html/reference/run
https://developers.google.com/apps-script/guides/dialogs#custom_sidebars
https://developers.google.com/apps-script/reference/drive/file -> first get the file with DriveApp.getFileById(id).
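Here's a minimal sketch of that round trip - the function names, file names, and file ID are placeholders, not from the original post:

// Code.gs
function showSidebar() {
  var html = HtmlService.createHtmlOutputFromFile('Sidebar')
      .setSandboxMode(HtmlService.SandboxMode.IFRAME);
  SpreadsheetApp.getUi().showSidebar(html);
}

function getPartNumbers() {
  // Hypothetical: return the JSON part list stored on Drive (see above).
  return DriveApp.getFileById('FILE_ID').getBlob().getDataAsString();
}

function insertPart(partNumber) {
  // Push the chosen part# into the currently active cell.
  SpreadsheetApp.getActiveSheet().getActiveCell().setValue(partNumber);
}

<!-- Sidebar.html -->
<script>
  // Ask the server for the part list, then hand it to the success handler.
  google.script.run.withSuccessHandler(function(json) {
    var parts = JSON.parse(json);
    // ...feed `parts` to your autocomplete widget here...
  }).getPartNumbers();

  function sendPart(partNumber) {
    google.script.run.insertPart(partNumber);
  }
</script>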
I am using the Spreadsheet service in Google Apps Script to retrieve some data from the Internet and then work with it a bit. The problem is that when I set an ImportHtml formula with a data set larger than, say, a few rows, I do not have access to the imported range right away, and thus an error is thrown in the script:
example:
// Create a tmp sheet and import some data.
var sheet = this.createTmpSheet(); // custom method to create a temp sheet
sheet.getRange('A1').setValue('=ImportHtml("someUrl","table",1)');
// At this point I can usually access the range:
var range_to_copy = sheet.getDataRange();
// However, if the data is more than 10-15 rows, I get "invalid dimension for range".
Any ideas how to wait for the "readiness" of the import? None of the usual triggers seemed like an appropriate choice. All I need is flow control that notifies me once the import completes, which usually takes under 10 seconds.
As you noted, there's no trigger that will tell you when a spreadsheet recalculation has completed. (That's what is going on after you update a formula in a cell.)
You can induce your script to wait 10 seconds by using Utilities.sleep(10000);. Generally it's bad programming practice to rely on delays, but it's almost your only option here.
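A slightly more robust variation is to poll rather than sleep a fixed time - a sketch, assuming the imported range stays essentially empty until ImportHtml finishes:

function waitForImport(sheet) {
  for (var i = 0; i < 20; i++) {   // give up after roughly 20 seconds
    SpreadsheetApp.flush();        // apply any pending recalculation
    if (sheet.getDataRange().getNumRows() > 1) {
      return sheet.getDataRange(); // the imported rows have arrived
    }
    Utilities.sleep(1000);
  }
  throw new Error('ImportHtml did not finish in time');
}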
The other option would be to perform the html query yourself, parse the table of data into an array, and write it to the new sheet. If you're interested in that, have a look at html div nesting? using google fetchurl, to see how you could obtain your table of information.
Don't do it like that. Use UrlFetchApp to get the data and write it to the spreadsheet yourself.
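A rough sketch of that approach - the URL is a placeholder, and the naive regex parsing is my own simplification, adequate only for simple, well-formed tables:

function importTableYourself() {
  var html = UrlFetchApp.fetch('http://example.com/page-with-table').getContentText();
  var rows = [];
  // Naive parse: pull the text out of each <tr>/<td>. Real pages may need a proper parser.
  var trMatches = html.match(/<tr[\s\S]*?<\/tr>/gi) || [];
  trMatches.forEach(function(tr) {
    var cells = (tr.match(/<t[dh][^>]*>[\s\S]*?<\/t[dh]>/gi) || [])
        .map(function(td) { return td.replace(/<[^>]+>/g, '').trim(); });
    if (cells.length) rows.push(cells);
  });
  // setValues needs a rectangular array, so this assumes every row has the same cell count.
  if (rows.length) {
    SpreadsheetApp.getActiveSheet()
        .getRange(1, 1, rows.length, rows[0].length)
        .setValues(rows);
  }
}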
I want to store a large number of key-value pairs in Google Apps Script, which can be accessed by the script later on. The problem is that I have them in a text file (in a JavaScript array format).
words[1]=new Array("String","String");
I have 975 such words.
Any ideas how to proceed? I would be grateful if relevant code (if any) is provided.
Build your array, then use setValues():
var cronicaArr = [];
for (var i = 0; i < data.length; i++) {
  cronicaArr.push(["string1", "string2"]); // two columns per row
}
// The target range must match the array's dimensions:
// cronicaArr.length rows by 2 columns, starting at row 2, column A.
myActiveSheet.getRange(2, 1, cronicaArr.length, 2).setValues(cronicaArr);
The GAS documentation has an entire section dedicated to data storage, where your question as well as many more are answered. Please read through the 'Storing Data' section.
To answer your question specifically, you can use any of ScriptProperties, ScriptDB or a spreadsheet to store your data.
Miturbe's answer only achieves saving the data. I assume you also want to query it (get a value from a key).
For that, store it in ScriptDb. Beware of size limits (which depend on what type of Google account the script runs on), and back up the db to a sheet regularly.
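For the key-value use case, something along these lines would work - a rough sketch using the same ScriptDb calls shown earlier on this page (the field names key and value are placeholders I chose):

function saveWords(words) {
  var db = ScriptDb.getMyDb();
  for (var i = 0; i < words.length; i++) {
    // Each pair becomes one queryable item in the db.
    db.save({ key: words[i][0], value: words[i][1] });
  }
}

function lookup(key) {
  var db = ScriptDb.getMyDb();
  var res = db.query({ key: key }); // matches items whose 'key' field equals key
  return res.hasNext() ? res.next().value : null; // null if the key isn't stored
}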