Add analytics to a desktop application - html

I have developed a desktop application using HTML5 and node-webkit.
I would like to track parts of the app, such as how long it's used, clicks, etc.
I would like the analytics system to work both online and offline (storing data until it's online).
Is there anything that I could use to do this?

The Google Measurement Protocol allows you to track anything that can send an HTTP request. You need to generate a unique client id to group pageviews into sessions (that part is usually done by the JavaScript tracker, which does not help you here) and can then choose between various interaction types and their related data, added as parameters in a request to the Google Analytics server.
As far as offline capabilities go, there is a "queue time" parameter that allows you to send delayed calls to GA. However, per the documentation that delay is 4 hours at most (it is intended for smartphones and tablets that temporarily lose their connection, not for working permanently offline).
In the end it depends on what data you need - you might just as well send calls to your own server, log them to a CSV file, and feed that into Klipfolio or some other dashboard solution (or even use Excel if you expect a low data volume).
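For illustration, here is a rough TypeScript sketch of that approach from a node-webkit app. The tracking ID, the in-memory queue and the event parameters are placeholders - in a real app you would persist the queue and the client id to disk so they survive restarts:

// Minimal sketch: queue Measurement Protocol hits locally and flush them
// when the app is back online. Assumes Node's https/crypto modules (node-webkit).
import * as https from "https";
import { randomUUID } from "crypto";
import { URLSearchParams } from "url";

const TRACKING_ID = "UA-XXXXX-Y";      // placeholder GA property id
const CLIENT_ID = randomUUID();        // persist this per installation

interface QueuedHit { params: Record<string, string>; timestamp: number; }
const pendingHits: QueuedHit[] = [];   // replace with on-disk storage

// Record an interaction; while offline it simply stays in the queue.
function trackEvent(category: string, action: string): void {
  pendingHits.push({
    params: { v: "1", tid: TRACKING_ID, cid: CLIENT_ID, t: "event", ec: category, ea: action },
    timestamp: Date.now(),
  });
}

// Flush queued hits, adding the queue time (qt) parameter in milliseconds.
// Remember GA only honours qt up to roughly 4 hours, as noted above.
function flushHits(): void {
  while (pendingHits.length > 0) {
    const hit = pendingHits.shift()!;
    const body = new URLSearchParams({
      ...hit.params,
      qt: String(Date.now() - hit.timestamp),
    }).toString();

    const req = https.request({
      hostname: "www.google-analytics.com",
      path: "/collect",
      method: "POST",
      headers: { "Content-Type": "application/x-www-form-urlencoded" },
    });
    req.on("error", () => pendingHits.unshift(hit));  // keep the hit for the next flush
    req.write(body);
    req.end();
  }
}

You would call trackEvent() from your UI handlers and flushHits() whenever the app detects it is online again.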

Related

Google Drive API /files slow response

I want to ask for help/ideas on the issue I will describe below.
Our iOS app allows users to access their Google Drive files.
We use the Changes API (https://developers.google.com/drive/api/v3/reference/changes). The main pre-condition to using this API is to build a local DB that holds a snapshot of the user's Drive file tree and the token. To initially fill the DB we must request the list of all files from the user's Drive. Getting the list of all files (with metadata) takes too long for many of our users. This is the issue I want to address.
We request files with the series of Files requests (https://developers.google.com/drive/api/v3/reference/files/list). Most requests are plain files?q=trashed%20%3D%20false.
For example, at my own private Google Drive:
69K files
initial request of all files takes 5+ minutes with my current network speed (Download 527 Mbps, Upload 417 Mbps; ping www.googleapis.com – 40–45 ms)
~150 requests
each request brings information about ~460 files
each request takes around 2-2.5 seconds
Sometimes I observed requests taking up to 6 seconds, which means that getting the full file list took 15 minutes for my account.
If I look at the Developer Console, the latency is below 0.1s
Many of our users have Drives far bigger than mine. A standard iOS app user session is not long enough to complete the initial request. We save every intermediate page token so that data received during a single app session is not lost if the user leaves the app – in the next session we keep downloading from the last saved token. But there are still cases where our app needs the DB to be filled with data before starting some operations – in those cases our users see a "Pending..." progress indicator and they complain that our app is slow.
So, questions:
is it possible to improve the described request speed/latency?
maybe there's some quota that we are missing and it can be changed?
maybe someone can advise a more effective way of getting the full file list?
P.S. We could potentially reduce the number of requests. We have to perform some double checks for Shared with Me folders, as we observed that sometimes the request for all files doesn't list all files from Shared folders. That's a bit of a side story, and I don't think this will dramatically improve the situation for us. I can provide more details on the actual set of requests we perform if necessary.
Are you returning all the fields? I would assume so, since the only query parameter provided is trashed = false. Do you need all of them? Try reducing the query to return only the fields you really care about (using a field mask) and see if that improves your performance.
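To illustrate, here is a TypeScript sketch of what a trimmed files.list call might look like against the v3 REST endpoint - the exact field list is an assumption, so keep only what your local DB actually needs:

// Sketch of a files.list request with a field mask and explicit page size.
declare function getAccessToken(): string;  // hypothetical: your OAuth layer

async function listPage(pageToken?: string) {
  const params = new URLSearchParams({
    q: "trashed = false",
    pageSize: "1000",  // maximum allowed page size
    fields: "nextPageToken,files(id,name,mimeType,parents,modifiedTime)",
  });
  if (pageToken) params.set("pageToken", pageToken);

  const res = await fetch(`https://www.googleapis.com/drive/v3/files?${params}`, {
    headers: { Authorization: `Bearer ${getAccessToken()}` },
  });
  return res.json();  // files[] plus nextPageToken while more pages remain
}

Persisting nextPageToken between calls is what lets you resume the initial listing across app sessions, as described in the question.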

How do I update my web app's database when a change is made on QuickBooks Online?

I have a web app with a MySQL database we maintain in the cloud that we are trying to integrate with our QuickBooks Online account. We want to sync data between our web app's database and QuickBooks Online, such as customer names and addresses. If they update their address in our web app, it's easy to then update it in QuickBooks Online using the QuickBooks Online API. However, if they tell us their new address over the phone and we change it directly in QuickBooks Online, we have no idea how to have that trigger something so it automatically updates our MySQL web app. How do we go about doing this or learning about this process?
Intuit/QuickBooks has an API that's specifically geared towards this use-case. From the docs:
The change data capture (CDC) operation returns a list of entities that have changed since a specified time. This operation is for an app that periodically polls Data Services and then refreshes its local copy of entity data.
Docs are here:
https://developer.intuit.com/docs/0100_accounting/0300_developer_guides/change_data_capture
Basically you make an OAuth signed HTTP GET request like this:
https://quickbooks.api.intuit.com/v3/company/1234/cdc?entities=Class,Item,Invoice&changedSince=2012-07-20T22:25:51-07:00
And you get back a list of objects that have changed since the given date/time.
Your application can remember the last time you called this, and periodically call this API to get things that have changed since the last time you called it.
You get back something like this:
<IntuitResponse xmlns="http://schema.intuit.com/finance/v3" time="2013-04-03T10:36:19.393Z">
  <CDCResponse>
    <QueryResponse>
      <Customer>...</Customer>
      <Customer>...</Customer>
    </QueryResponse>
    <QueryResponse>
      <Invoice>...</Invoice>
      <Invoice>...</Invoice>
    </QueryResponse>
  </CDCResponse>
</IntuitResponse>
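As a rough TypeScript sketch of that polling loop (the helpers oauthSignedGet and applyChangesToMySQL are hypothetical stand-ins for your OAuth client and database layer, and the 10-minute interval is arbitrary):

declare function oauthSignedGet(url: string): Promise<string>;   // OAuth-signed HTTP GET
declare function applyChangesToMySQL(xml: string): void;         // upsert into your DB

let lastPolledAt = new Date(Date.now() - 24 * 60 * 60 * 1000);   // start 24h back

async function pollQuickBooksChanges(realmId: string): Promise<void> {
  const changedSince = lastPolledAt.toISOString();
  const url =
    `https://quickbooks.api.intuit.com/v3/company/${realmId}/cdc` +
    `?entities=Customer,Invoice&changedSince=${encodeURIComponent(changedSince)}`;

  const pollStartedAt = new Date();
  const response = await oauthSignedGet(url);   // returns the CDC XML shown above
  applyChangesToMySQL(response);                // update changed Customers/Invoices
  lastPolledAt = pollStartedAt;                 // remember where we left off
}

setInterval(() => pollQuickBooksChanges("1234"), 10 * 60 * 1000);  // poll every 10 minutes

In production you would persist lastPolledAt (for example in MySQL) so the poller picks up where it left off after a restart.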

OAuth for Enterprise account

I'm creating a web app for my company that will keep a number of files in sync with the files on Box. This will be done by using a cron job running every hour.
I have the application working by setting a developer token in my account; this was done for testing while I was building the application.
Now that this is working, I want to get the authentication sorted so I can just leave it running. So I'm trying to work out whether there is a way to have an API key for our enterprise account, or whether I will have to implement OAuth and connect one user to the application, which seems a bit overkill.
You should probably use one of the SDKs, which take care of refreshing the tokens for you.
Essentially what you'll need is a keystore to store the tokens. You could store the refresh token only. When your cron wakes up, use the refresh token to get a new access token and refresh token. Store the new refresh token in your keystore. Then make your API calls using the access token, and go back to sleep.
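A minimal TypeScript sketch of that refresh step - the keystore functions are hypothetical placeholders for whatever persistence you use:

// Refresh step performed by the cron job on each run.
declare function loadRefreshToken(): Promise<string>;            // hypothetical keystore read
declare function saveRefreshToken(token: string): Promise<void>; // hypothetical keystore write

async function getFreshAccessToken(clientId: string, clientSecret: string): Promise<string> {
  const refreshToken = await loadRefreshToken();

  const res = await fetch("https://api.box.com/oauth2/token", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({
      grant_type: "refresh_token",
      refresh_token: refreshToken,
      client_id: clientId,
      client_secret: clientSecret,
    }),
  });

  const tokens = (await res.json()) as { access_token: string; refresh_token: string };
  await saveRefreshToken(tokens.refresh_token);  // Box rotates the refresh token on each use
  return tokens.access_token;                    // use this for the sync API calls
}

The official SDKs wrap exactly this exchange, which is why they are the easier option.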

Displaying tile data with a time interval using push notifications in Metro apps?

I have a Metro application in which I implemented push notifications for receiving a single message. If I get more than one notification, my application tile still only shows one notification (message). I am not able to figure out how to display multiple notifications on a time-specific basis. Do I need to write extra code to display multiple notifications on my tile? If so, where should I write it - client-side or server-side?
Thank you.
There are several ways to look at updating, and depending on what your end goal is, you may end up implementing the code either on the client, or the server, or a little of both.
For the scenario you describe, you need to use the Windows Push Notification Services (WNS) to push a notification each time you want a new tile update. Typically, this is done by having a service running in the cloud (a website, a Windows Azure service, or similar) that calls WNS and sends a tile update to the app when something of interest occurs.
If what you want is for multiple notifications to cycle on the tile, that's enabled by calling the enableNotificationQueue method on the TileUpdater class:
http://msdn.microsoft.com/en-us/library/windows/apps/windows.ui.notifications.tileupdater.enablenotificationqueue.aspx
Per the comment below, enableNotificationQueue works for any notification source. But if you want to pull information from a remote service, rather than using push, you can use scheduled polling as a means of updating the tile with remote information, as described here:
http://msdn.microsoft.com/en-us/library/windows/apps/Hh761476.aspx
Combined with the call to enableNotificationQueue, it may also enable the scenario you're looking for.
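For reference, here is a brief sketch of enabling the queue from a JavaScript/WinJS Metro app (shown as TypeScript with the WinRT namespace declared; call it once at app startup):

declare const Windows: any;  // provided by the WinRT runtime inside a Metro app

// Turn on the tile's notification queue so up to five recent
// notifications cycle on the tile instead of only the latest one.
const tileUpdater =
  Windows.UI.Notifications.TileUpdateManager.createTileUpdaterForApplication();
tileUpdater.enableNotificationQueue(true);

// Tip: give each notification a tag so a newer update with the same tag
// replaces the older one rather than taking another slot in the queue.

The notifications themselves still come from your cloud service via WNS; this call only changes how the tile displays them.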

Why is a Push Engine Required

Our architecture uses a Push Engine to send data to the browser.
Could anybody please tell me what the use of a Push Engine is?
(Why is it required, as the same thing can be achieved using normal AJAX programming?)
Please guide me.
Let's say you're visiting a website, and the website is updated continuously. Your browser needs to keep updating the data you're viewing, meaning the browser needs to keep communicating with the server to get the updates.
You can use AJAX to make requests every few seconds, each time fetching more data from the server. The problem is that you need to make a lot of AJAX calls, and you open a connection (a socket) for each one, which ends up being a very slow process. If the interval between requests is large, there will be a delay between the updates on the server and the updates in your browser.
To solve that, we can manipulate the HTTP calls - keep the request (the connection) open and continuously send data. That way, when the server wants to send something to the client (the browser), there is an open connection, and it doesn't need to wait for the next AJAX call from the browser.
HTTP servers have a timeout on requests, so just before the request times out, the browser closes it and makes a new one. A rough sketch of this long-polling loop is shown below.
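This is a browser-side sketch in TypeScript; "/updates" is a placeholder endpoint and the 5-second back-off is arbitrary:

// Long polling: each request stays open until the server has data
// (or it times out), then the client immediately reconnects.
async function longPoll(onMessage: (data: unknown) => void): Promise<void> {
  while (true) {
    try {
      const res = await fetch("/updates");      // server holds this open until it has data
      if (res.ok) onMessage(await res.json());  // deliver the update to the page
    } catch {
      await new Promise((r) => setTimeout(r, 5000));  // back off briefly on errors
    }
  }
}

longPoll((data) => console.log("server pushed:", data));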
Another (better) method is using the XMPP protocol, which is used in chats like Facebook's and MSN's.
AJAX is a pull method - it requires the client to connect to the server. If you have some information that you want to display live - for example a live score in a football game - the AJAX call has to be made at regular intervals - even when there is no data waiting on the server. A Push Engine is the reverse - the client and server maintain a connection and the server pushes data when there is data to be sent.