We are a Google Apps for Your Domain enterprise account and a real-estate brokerage. We want to build a web application that ties together several Google Apps services. It would be great if we could do it all in Google Apps Script, but doing so may stretch the limits of what's possible in Google Apps Script. We do not have the time or resources to do full application development using Google Web Toolkit (GWT). Is a framework approach the right solution?
We want to build an application that allows our agents to create a real-estate listing record. Each record is large, with 300 - 400 form fields per record depending on the property type. Many of the fields are 'lookup' fields with specific values in either a select-list format, or multiple checkbox format. (e.g. roof-type = choose one: slate, shingle, rolled-roof; appliances = choose all: refrigerator, stove, dishwasher; etc.)
Each record will also need associated photos in original high-resolution format, plus smaller-resolution sets for display in various contexts. Each record would have 24-50 1MB photos. I'm thinking that we could use and integrate Google Drive for the photos because the process can be simplified for the user to drag and drop a folder from desktop to Google Drive. Having the images stored in Google Drive, and only referenced from the application, would solve part of the implied storage question. I read that there is a 200MB ScriptDb quota in Google Apps Script, so I can see that being a potential deal-breaker just for the 'data' alone. I don't have an exact database storage requirement, but I know we'll have 700 records to start and that number will grow to several thousand.
The users of the application are all internal, so GAFYD auth integration is a nice benefit.
There is no Forms API currently, so how do we create the data entry form in the first place? It appears we would need to create the form manually, or else create a sample spreadsheet to auto-generate the form. But then how do we enhance the form to modify the select lists, attach validation rules, and add dynamic form behaviors like creating/showing/hiding additional elements based on user input (e.g. enter the # of rooms, then enter dimensions for each room)?
Another potential showstopper is the resizing of photos. We want access to the original photos in order to create further marketing materials; however, in the application UI we would need to use images at various smaller dimensions for efficiency (e.g. show a list of properties with one thumbnail to represent each record). I guess there would be methods in the Google Drive API to create sub-folders to store the resized images, but is there access to graphics-manipulation software like gd (maybe through an API to Picasa)?
A record should be viewable in different layouts and views. For example full record detail, summary view, marketing flyer view.
Once a record is created, messages need to be sent to the agent creating the record, plus an internal team who processes the record for workflows including copy writing/review + marketing. That seems to potentially fit with the new Google Groups? Once the record is 'approved' then the application needs to generate a marketing email to several hundred external recipients; selected based on business rules from various pools. So the application would need additional storage, or possibly address-book integration to be able to manage contacts.
Future edits to records (e.g. price changes, photo changes) need to trigger the review/approval workflow.
Is Google Apps Script capable of handling the size, scope and complexity of this type of application? Or, would the recommended route be using a micro framework such as http://bcosca.github.com/fatfree/ to tie together all the Google Apps components using their respective APIs?
There is no Forms API currently, so how do we create the data entry form in the first place?
There are two, actually: both UiApp (and the drag-and-drop GUI Builder for it) and HtmlService can show arbitrarily complex forms.
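For the lookup fields, here is a minimal sketch of the HtmlService route, generating hundreds of select lists from a config object rather than hand-coding them. The helper name and config shape are my own, not part of any API; only the `roof-type` values come from the question.

```javascript
// Build a <select> field from a config entry, so 300-400 lookup fields can be
// generated from data instead of hand-written.
function buildSelect(name, options) {
  var items = options.map(function (o) {
    return '<option value="' + o + '">' + o + '</option>';
  });
  return '<select name="' + name + '">' + items.join('') + '</select>';
}

// Example lookup config, mirroring the question's roof-type field:
var FIELDS = { 'roof-type': ['slate', 'shingle', 'rolled-roof'] };
var html = Object.keys(FIELDS).map(function (name) {
  return buildSelect(name, FIELDS[name]);
}).join('\n');

// In Apps Script, the assembled page would be served from a doGet(e)
// function via HtmlService.createHtmlOutput(html).
```

The same config object can drive both form generation and server-side validation, so the 300-400 fields stay defined in one place.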
I'm thinking that we could use and integrate Google Drive for the
photos because the process can be simplified for the user to drag and
drop a folder from desktop to Google Drive.
Drive is integrated with Apps Script.
I don't have an exact database storage requirement, but I know we'll
have 700 records to start and that number will grow to several
thousand.
You might want to try Google Cloud SQL as your storage, which is natively supported in Apps Script and is a "real" SQL database. That said, several thousand records is tiny if you are storing the photos in Drive; ScriptDb could probably scale to several million records in that case.
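A sketch of what writing a record to Cloud SQL could look like. The statement builder is plain JavaScript; the commented lines show the Apps Script Jdbc service calls. The table and column names are invented for illustration.

```javascript
// Build a parameterized INSERT for a listing record; table/column names are
// placeholders, not from the question.
function buildInsert(table, record) {
  var cols = Object.keys(record);
  var placeholders = cols.map(function () { return '?'; });
  return {
    sql: 'INSERT INTO ' + table + ' (' + cols.join(', ') +
         ') VALUES (' + placeholders.join(', ') + ')',
    values: cols.map(function (c) { return record[c]; })
  };
}

var q = buildInsert('listings', { address: '1 Main St', roof_type: 'slate' });

// In Apps Script this would feed a JDBC prepared statement, e.g.:
//   var conn = Jdbc.getCloudSqlConnection(dbUrl, user, password);
//   var stmt = conn.prepareStatement(q.sql);
//   q.values.forEach(function (v, i) { stmt.setString(i + 1, String(v)); });
//   stmt.execute();
```

Using placeholders rather than string concatenation of values also sidesteps SQL-injection issues with free-text listing fields.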
Google Groups and Contacts are integrated.
Google's documentation can be found here: https://developers.google.com/google-apps/
Google Apps Script is sometimes surprisingly powerful, but it won't be very fast for the amount of data you're implying.
As you said, ScriptDb has a size limit, so it can't store everything. Spreadsheets are limited to 256 columns per sheet and 400,000 cells. My way around this is to split my data across a set of spreadsheets, each with a set of sheets. Since you're in real estate, you could probably split your data by region and neighborhood/area to achieve something similar. If you really want to compact things, you can store a row of data as a stringified JSON object in a single cell; however, it will no longer be human-readable.
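A minimal sketch of the JSON-in-one-cell idea; the record fields are invented for illustration.

```javascript
// Pack a wide record into one cell as a JSON string, and unpack it when
// reading back. Trades human readability for cell count.
function packRow(record) {
  return JSON.stringify(record);
}
function unpackRow(cellValue) {
  return JSON.parse(cellValue);
}

var cell = packRow({ id: 42, roofType: 'slate', rooms: 8 });

// With the Spreadsheet service this would be written/read like:
//   sheet.getRange(row, 1).setValue(packRow(record));
//   var record = unpackRow(sheet.getRange(row, 1).getValue());
```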
Unless you're willing to pay extra for storage, it sounds like your pictures will fill up a Drive account fairly quickly. I'm not familiar with images in gadgets, so I'm not sure if they can be embedded from Drive.
I have no experience with Forms, but you can build a gadget using UiApp and call appendRow on a spreadsheet sheet to add the contents of all your fields. And by building your app like that, you can specify valid values for things (and have those read from a "config" spreadsheet).
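A sketch of the config-driven validation idea mentioned above; the field names and config shape are my own.

```javascript
// Allowed values per field. In Apps Script this object could be built from a
// "config" sheet, e.g. configSheet.getDataRange().getValues().
var ALLOWED = {
  roofType: ['slate', 'shingle', 'rolled-roof']
};

// Check a submitted record against the allowed values; returns a list of
// error messages (empty means valid).
function validateRecord(record, allowed) {
  var errors = [];
  Object.keys(allowed).forEach(function (field) {
    if (allowed[field].indexOf(record[field]) === -1) {
      errors.push(field + ': invalid value "' + record[field] + '"');
    }
  });
  return errors;
}

var errors = validateRecord({ roofType: 'thatch' }, ALLOWED);

// On success, append in field order:
//   sheet.appendRow([record.roofType /*, ... */]);
```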
Related
I'm an artist and have implemented an art show application for our local art group using Google Forms. The application allows an artist to submit a variable number of painting entries (up to a maximum amount which varies from show to show). This presents a classic master (single instance of artist information such as name, phone, email, etc.) / detail (multiple instances of painting information such as title, media, image, etc.) relationship.

It's a classic problem that a relational database solves. However, the ease of creating a Google Form and the ease with which folks can work with spreadsheet data make a compelling case for solving this problem using sheets. As each painting detail is entered into the form (using conditional questions up to max entries), Google adds those detail cells horizontally, extending the row.

To date, I've managed to address the problem with a hodge-podge of very specific macros and other brute-force methods to get this data lined up in columns so that it is workable (i.e., sort, slice/dice). I was about to attempt a crude generic script to try and solve this general problem, but as I look at similar questions I see solutions that are 10 times more compact and efficient than anything I could cook up.

This is by no means my specific problem but rather a general need of Google Forms users who process master/detail information and end up with unmanageable data strung out in rows of endless variable lengths. If Google was smart they would build this master/detail feature in and gain a raft of new form users.

Anyway, here is a view of the simulated captured form data and the desired result:
https://docs.google.com/spreadsheets/d/1Lxuc6uCIkLXyx5evuWIEHULTAOwFwmjT627igC0JbfQ/edit?usp=sharing
Master - Details from Google Forms
The data in columnar format can now be processed with ease. My thinking was to make this generic so I could apply it to any number of form applications that ask for fixed and variable information. To do that I was going to set up variables for the starting cell of the detail data (in this case D, or column 4), the maximum number of detail clusters (in this case 5), and the number of cells in each detail cluster (in this case 3). The master information (name info) gets repeated as rows are inserted for each cell cluster. Ideally, the last cell cluster on a row could be determined on the fly rather than specifying a max. The first cell in a detail cluster would be a required form field, so its absence would indicate the end of the detail clusters on a row.
I get weak in the knees when I think about using arrays and was leaning toward doing this with lots of copying and pasting by way of macros when I thought I might seek some help from those who do this with seeming ease.
I had a similar situation: a form with a start block, followed by repeated blocks of the same questions. I successfully unwrapped it and pushed everything to a BigQuery database using Apps Script.
My guess is that you don't have tons of data, so you can keep everything in Google Sheets. You posted no code, but the strategy should be like this:
Use another sheet in the same spreadsheet for writing your data with the desired structure.
Keep the sheet that form writes into intact, you don't want to mess up GF->GS automation.
Use onFormSubmit event to process new rows that form writes into GS sheet.
Yes, you'll have to play with arrays and use something like DetailsStartColumn(4) and DetailsWidth(3) to process the horizontal blocks, detect filled/empty blocks, and write them into your database on another sheet.
Arrays in Apps Script are fine, they are a great tool once you learn them, they are one of the reasons why I like JavaScript ;-)
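The unwrapping described in the steps above can be sketched as pure array code. The constant names follow the DetailsStartColumn/DetailsWidth idea; the numbers match the example in the question (details start at column D, clusters of 3 cells), and the onFormSubmit wiring is shown only as comments.

```javascript
var DETAILS_START = 3; // zero-based index of column D
var DETAILS_WIDTH = 3; // cells per detail cluster

// Split one wide form-response row into master + repeated detail clusters,
// producing one output row per cluster, with the master columns repeated.
function unwrapRow(row) {
  var master = row.slice(0, DETAILS_START);
  var out = [];
  for (var i = DETAILS_START; i < row.length; i += DETAILS_WIDTH) {
    var cluster = row.slice(i, i + DETAILS_WIDTH);
    if (!cluster[0]) break; // first cell is required => empty means no more clusters
    out.push(master.concat(cluster));
  }
  return out;
}

// Wired to the form via an onFormSubmit trigger, e.g.:
//   var rows = unwrapRow(e.values);
//   rows.forEach(function (r) { targetSheet.appendRow(r); });
```

Because the loop stops at the first empty cluster, the max-clusters variable becomes unnecessary, which matches the "determined on the fly" wish in the question.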
I am looking to export Google Analytics data to a SQL database. Do you know of a tool that could help me?
Also, do you know how I can see, in Google Analytics, the traffic coming from a particular URL?
Thank you all!
You have several options:
UI export: in the top/right corner of your reports you should have an option to download data in various formats (XLS, CSV...)
API: you can use the reporting API to get it out in a programmatic/automated way
One thing you won't be able to do with the free version no matter what you try:
Reconstruct the entire analytics data set: whether with the UI or the API, you're limited to querying a maximum of 7 dimensions at a time (e.g. ga:country, ga:deviceCategory, etc.), and you cannot combine certain dimensions together (no official list is available; it's trial and error to find out), whereas there are dozens of dimensions available.
So the question for you becomes:
How much do I want to invest in partially reverse-engineering Google Analytics, vs. the value it brings me, vs. what alternative analytics solutions would cost?
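For the API route mentioned above, here is a sketch of how a Core Reporting API (v3) query URL is assembled. The view (profile) ID and dates are placeholders, and OAuth authentication is omitted.

```javascript
// Assemble a Core Reporting API v3 query URL from a parameter map.
function buildGaQuery(params) {
  var base = 'https://www.googleapis.com/analytics/v3/data/ga';
  var qs = Object.keys(params).map(function (k) {
    return k + '=' + encodeURIComponent(params[k]);
  });
  return base + '?' + qs.join('&');
}

var url = buildGaQuery({
  'ids': 'ga:12345678',           // placeholder view (profile) ID
  'start-date': '2017-01-01',
  'end-date': '2017-01-31',
  'metrics': 'ga:sessions',
  'dimensions': 'ga:country,ga:deviceCategory' // max 7 dimensions per query
});

// The JSON response rows can then be INSERTed into your SQL table by
// whatever scheduled job fetches this URL (with an OAuth access token).
```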
I found a cloud based solution which exports raw google analytics data to MySQL database. Setup is simple, all you need to do is add your Google Analytics connection and a database to which the data needs to be exported.
MySQL, PostgreSQL, SQL Server, and BigQuery are the supported destinations. It creates a few custom dimensions in your Google Analytics account and a tag in Google Tag Manager to send hits to Google Analytics. Data is exported from Google Analytics to the selected destination every day.
I have been using it for the last three months now. Hope this helps.
Exporting the analytics data is a thorny one.
My understanding is that paid GA usage allows the export of all collected GA data.
But free usage does not.
For free usage, all you are going to be able to do, realistically, is to create a report over your GA data (in Data Studio or Google Sheets) that contains the rows and columns you want, and then collect this information and squirt it into a SQL table. You are also liable to come up against sampling.
Re traffic from particular URL, the news is better: just filter on Hostname and Page.
I'm looking into using Google Sheets as a sort of aggregation solution for different data sources. It's reasonably easy to configure those data sources to output to a common Google Sheet, and it needs to be online for sharing. This sheet would act as my raw, untreated data source. I would then have some dashboards/sub-tables based on that data.
Now, early tests seem to show I'm going to have to be careful about efficiency, as it seems I'm pushing against the maximum of 2 million cells per spreadsheet (we're talking about 15-20k rows of data and 100 or so columns). Handling the data also seems to be pretty slow (regardless of cell limits), at least using formulas, even considering using arrays and avoiding VLOOKUPs, etc.
My plan would be to create other documents (separate documents, not just added tabs) and refer to the source data through IMPORTRANGE using the spreadsheet key. Those would use only the subsets of the data required for each dashboard. This should allow me to create dashboards that run faster than if set up directly off my big raw data file, or at least that's my thinking.
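For reference, the cross-document pull described above would use formulas along these lines (the key and ranges are placeholders):

```
=IMPORTRANGE("spreadsheet-key", "RawData!A1:CV20000")
=QUERY(IMPORTRANGE("spreadsheet-key", "RawData!A1:CV20000"),
       "select Col1, Col2 where Col3 = 'active'")
```

QUERY over IMPORTRANGE addresses columns as Col1, Col2, ..., which lets each dashboard document pull only the subset it needs instead of the full raw range.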
Am I embarking on a fool's errand here? Has anyone looked into similarly large datasets on Google Docs? Basically, I'm trying to see whether what I have in mind is even practical. If you have better ideas in terms of architecture, please do share...
I ran into a similar issue once.
Using a multi layer approach like the one you suggested is indeed one method to work around this.
The spreadsheets themselves have no problem storing those two million cells, it's the displaying of all the data that is problematic, so accessing it via Import or scripts can be worthwhile.
Some other things I would consider:
How up to date does the data need to be? Import range is slow and can make the dashboard you create sluggish, maybe a scheduled import with the aggregation happening in Google Apps Script is a viable option.
At that point you might even want to consider using BigQuery for the data storage (and aggregation): whether you pull the data from another spreadsheet in this project or from a database, something that will not run into issues once you exceed 2 million elements would be future-proof.
Alternatively, you can use Fusion Tables* for the storage, which are Drive-based, although I think you cannot run sophisticated SQL queries on them.
*: You probably need to enable them in Drive via right click > more > Connect more apps
I'm programming a Google Apps Script store for TiddlyWiki (tiddlywiki.com). It receives files and stores them in Google Drive. It never overwrites any file; it just creates a new one on every upload. For performance reasons I want to maintain an index of the latest version of each file and its ID. Currently I'm using the Properties Service to achieve this: when a file is uploaded, I store its Name:ID pair. That way, retrieving a file by name requires neither searching the full folder nor checking which is the latest version. I'm worried about how many entries I can store in the script properties store. I'm thinking about using a spreadsheet to save the index, but I don't know what the difference is in terms of performance compared to the Properties Service.
Here is the question: can I stick with the Properties Service for this task, or should I switch to Google Spreadsheets? Will the performance be much worse? Any alternatives for the index?
Thanks in advance.
EDIT:
Since this will store only a few hundred entries, what about using a JSON file as the index? Will it take much time to load the text and parse it?
It depends on the number of files you're expecting. For a few thousand files, the Properties Service might do, and it is surely easier to use than a spreadsheet, but it has a tighter limitation of 500KB per store.
If you think you'll have more files, then it's probably best not to index at all, and instead do a fast Google Drive search to retrieve your latest file. The search criteria can be very specific and fast (filter by title only, or by any timestamp criteria). I think it'll be much less trouble in your script than trying to build and fit a big index into memory.
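Regarding the JSON-index idea from the question's edit: a few hundred entries amount to only a few KB, so parsing is negligible. Here is a sketch; the Drive persistence is shown only as comments, and the file names are examples.

```javascript
// Update the index (a JSON string mapping file name -> latest Drive file ID).
function addToIndex(indexJson, name, id) {
  var index = indexJson ? JSON.parse(indexJson) : {};
  index[name] = id; // a newer upload simply replaces the previous ID
  return JSON.stringify(index);
}

// Look up the latest file ID for a name, or null if not indexed.
function lookup(indexJson, name) {
  var index = JSON.parse(indexJson);
  return index[name] || null;
}

var json = addToIndex(null, 'wiki.html', 'ID_v1');
json = addToIndex(json, 'wiki.html', 'ID_v2'); // second upload of the same file

// Persisted as a Drive file from Apps Script, e.g.:
//   var current = indexFile.getBlob().getDataAsString();
//   indexFile.setContent(addToIndex(current, name, id));
```

The same two helpers work unchanged if you keep the JSON string in the Properties Service instead, as long as it stays under the 500KB quota.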
I have implemented a time booking system based on spreadsheets which the users fill out and which are then consolidated into one central (and big) spreadsheet.
After having had a few performance issues, the whole application has now run perfectly for several months. However, I will soon run into the size limitation of spreadsheets (400k cells).
In the consolidated spreadsheet I basically do not need more data than the current month. However, for statistical purposes I would appreciate it if I could make the data easily accessible to the domain's users.
Basically, the BigQuery service would be perfect, but I did not find an API to write data to it from a spreadsheet. I hesitate to use the Google-provided MySQL database for cost reasons.
Are there any other ideas around?
There's a built-in Google BigQuery API for Apps Script; you just have to enable it manually under Resources > Use Google APIs. There's also Fusion Tables, which does not have a built-in API but is somewhat simple to use via UrlFetch.
Anyway, if it's for statistical purposes, why don't you just "compile" the data into another spreadsheet? E.g. month - number of entries - total prices - averages, etc.
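The "compile" idea can be sketched with plain array code. The row shape [dateString, hours] is an assumption; substitute whatever your booking rows actually contain.

```javascript
// Aggregate raw booking rows into per-month totals before archiving, so the
// consolidated spreadsheet only needs to keep the current month.
function compileByMonth(rows) {
  var totals = {};
  rows.forEach(function (row) {
    var month = row[0].slice(0, 7); // '2013-04-15' -> '2013-04'
    if (!totals[month]) totals[month] = { entries: 0, hours: 0 };
    totals[month].entries += 1;
    totals[month].hours += row[1];
  });
  return totals;
}

var summary = compileByMonth([
  ['2013-04-01', 8],
  ['2013-04-02', 6],
  ['2013-05-01', 7]
]);

// In Apps Script the summary could be written to another sheet with
// setValues(), one row per month.
```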