Good ways to capture website traffic statistics - language-agnostic

I tried AWStats, but it's painfully difficult to configure (nginx logs don't rotate, stats don't get updated), and Google Analytics has a lag of almost 12-14 hours. Are there any other options?

One way you could do this is to make an extra request for a small object, such as a 1x1 px image beacon, served from Amazon S3, and enable server access logging for the bucket. You can then parse that log file to derive your stats. The log format is documented here: http://docs.amazonwebservices.com/AmazonS3/latest/dev/LogFormat.html
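A minimal sketch of that log-parsing step, in Python. The field layout follows the S3 server access log format linked above, but the regex, the beacon object name (`pixel.gif`), and the log file name are placeholders you'd adapt to your own bucket:

```python
import re
from collections import Counter

# Rough parser for S3 server access log lines (see the LogFormat docs above).
# Only the leading fields up to the object key are captured here.
LOG_LINE = re.compile(
    r'^(?P<owner>\S+) (?P<bucket>\S+) \[(?P<time>[^\]]+)\] (?P<ip>\S+) '
    r'(?P<requester>\S+) (?P<request_id>\S+) (?P<operation>\S+) (?P<key>\S+) '
)

def beacon_hits(log_path, beacon_key="pixel.gif"):
    """Count daily GETs of the beacon object from an S3 access log."""
    hits = Counter()
    with open(log_path) as f:
        for line in f:
            m = LOG_LINE.match(line)
            if (m and m.group("key") == beacon_key
                    and m.group("operation").endswith("GET.OBJECT")):
                day = m.group("time").split(":", 1)[0]  # e.g. 06/Feb/2014
                hits[day] += 1
    return hits

if __name__ == "__main__":
    for day, count in sorted(beacon_hits("s3-access.log").items()):
        print(day, count)
```

From there you could break the counts down further (unique IPs, referrers, user agents) by capturing the remaining fields of each line.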

Related

Is the Pingdom uptime monitor increasing my GA bounce rate?

I've been looking at my Google Analytics account and noticed a surge in the bounce rate. Is it possible the Pingdom uptime monitor is causing this increase? There seems to be a correlation between the two.
Many thanks
That is very likely, since Pingdom's uptime monitor makes an HTTP(S) call to page(s) on your website to verify that they're up. You can avoid this by adding their IP addresses to GA's exclusion list so the checks aren't logged as visits, as documented in the Google Analytics help pages.
I don't think it can increase your bounce rate, because Analytics filters out bots. But if you're sure that Pingdom runs analytics.js and isn't being filtered by default, you can add their domain/IP in a custom filter (on the view).
As a first step, you could check whether the increased bounce rate can be attributed to any particular traffic source. Pingdom might already show up as a referrer. You should also check the Pingdom FAQ or support to find out whether they run your scripts during their measurements; e.g. this product or product feature of theirs claims to run JavaScript.
If scripts are executed during measurement, then the Analytics script is also very likely to be executed, and therefore a call will be made to GA's servers for your site. In this case, the solutions mentioned in the other responses can be used, e.g. filtering based on the traffic source's domain or IP.
If scripts are not run, then you'll have to look for other causes. Again, a traffic-source breakdown of bounce rate would be a good place to start.
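As a complement to a GA exclusion filter, one server-side option is to skip emitting the analytics snippet when the request looks like a monitoring probe. A minimal Python sketch of that idea; the user-agent markers below are assumptions, so check your own access logs for the exact strings Pingdom sends before relying on them:

```python
# Hypothetical list of substrings that identify uptime monitors in the
# User-Agent header; verify against your own logs.
MONITOR_UA_MARKERS = ("pingdom", "uptimerobot")

def should_emit_analytics(user_agent: str) -> bool:
    """Return False for requests that appear to come from an uptime monitor."""
    ua = (user_agent or "").lower()
    return not any(marker in ua for marker in MONITOR_UA_MARKERS)

# Example use in whatever framework renders your pages:
# if should_emit_analytics(request.headers.get("User-Agent", "")):
#     include the analytics.js snippet in the rendered page
```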

Efficiently loading ZCTAs from a shape file

I'm working on a Google Maps application that uses a GeoJSON file storing the ZCTAs (ZIP Code Tabulation Areas) of states. These files are fairly large and take some time to load, so I was trying to find ways to reduce the loading time of the GeoJSON file.
I have looked at this question here:
Is there a memory efficient and fast way to load big json files in python?
so I am aware that reducing the loading time isn't easy to do.
But I came across this site, which loads ZCTAs quickly:
http://www.trulia.com/home_prices/
So my question is this: What has the developer done on this site to quickly load the ZCTA data? Can anyone see offhand how it was done?
You are basically asking, "how do I write the user interface for a GIS that will offer high performance?"
The developer of this website is not storing this information in a JSON and then loading it each time in response to each user's clicks. The developer is probably storing all of these objects in memory. To make things even faster, the developer has probably created the various KML files for the Google map in advance. The developer may be interfacing directly with Google, or may be using a commercial GIS that has an output format that employs Google Maps.
For example, you might check out https://gis.stackexchange.com/questions/190709/arcgis-to-google-maps.
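Along the same lines of doing the work in advance, one concrete way to shrink what the client has to download is to simplify the ZCTA polygons offline and serve the smaller file. A rough Python sketch, assuming the shapely library; file names and the tolerance value are placeholders to tune against your data:

```python
import json
from shapely.geometry import shape, mapping  # pip install shapely

def simplify_geojson(src_path, dst_path, tolerance=0.001):
    """Simplify every feature's geometry and write a compact GeoJSON file."""
    with open(src_path) as f:
        fc = json.load(f)
    for feature in fc["features"]:
        geom = shape(feature["geometry"]).simplify(tolerance, preserve_topology=True)
        feature["geometry"] = mapping(geom)
    with open(dst_path, "w") as f:
        # drop whitespace as well to cut file size further
        json.dump(fc, f, separators=(",", ":"))

# e.g. run once as a build step, then serve the .min file to the map
simplify_geojson("zcta_state.geojson", "zcta_state.min.geojson")
```

Splitting the output into per-state or per-viewport files so the map only fetches what it needs is the other half of the trick.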

Alternative for Background Transfer Service to run uploads in background

I've used the Background Transfer Service (BTS) API for Windows Phone in two apps and ran into very bad problems. It became one of the main sources of bugs in both apps: for some reason, downloads often refuse to start regardless of the flags I set (connected to Wi-Fi, not connected, connected to a power outlet, etc.), and the behaviour was random from one user to another. That, plus bad responses from the servers.
Is there a more customizable way to achieve this? Which threads or loops remain alive in my app when I navigate out to the external:// world? I should probably check with counters.
My main question remains: apart from BTS, is there something that allows a 3-4 MB file to keep uploading even if I navigate away from my app to play an MP3 from an external:// app?
Once you exit your app, you are pretty much shut down. You can masquerade as a location-tracking background agent to remain in the background when you get deactivated, though you'll drain the battery, and I believe there can only be one of these active at a time. Generally, strongly not recommended (and you'll probably fail certification).
A better way to do this, if BTS is not to your liking, is to use a ResourceIntensiveTask. This will only be triggered when the user is plugged in and has Wi-Fi, but it will allow you to run whatever you want for as long as the conditions are met (for example, overnight), which should be plenty of time to upload a 3-4 MB file.

Where to host large files without limit?

I'm looking for a way to host MP4, OGG and WebM movies (HTML5 video), each around 5-15 MB, externally.
Can you recommend some free services? Dropbox unfortunately has bandwidth limits, and I'd like to be able to show those movies to at least 100-200 people daily.
YouTube. It's built for this.
(I joke not)
I have had great results using Amazon S3 for this. Just make sure you check the little box in the S3 dashboard that makes the file public.
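If you'd rather script it than click through the S3 dashboard, a minimal Python sketch using boto3; the bucket name, key, and file name are placeholders, and the bucket must allow ACLs for 'public-read' to take effect:

```python
import boto3  # pip install boto3

s3 = boto3.client("s3")

# Upload the video and mark it publicly readable, with the right MIME type
# so browsers stream it as HTML5 video.
s3.upload_file(
    "intro.mp4",
    "my-video-bucket",
    "videos/intro.mp4",
    ExtraArgs={"ACL": "public-read", "ContentType": "video/mp4"},
)

print("https://my-video-bucket.s3.amazonaws.com/videos/intro.mp4")
```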

Salesforce: Google maps query status 620 G_GEO_TOO_MANY_QUERIES

In Salesforce I have created a future method that makes a Google Maps geocode callout. I know there is a limit of 2,500 requests per day but our instance has made no more than 100 requests today. Maybe the same number of requests yesterday. Yet the response code is 620 G_GEO_TOO_MANY_QUERIES.
Could it be that Google is seeing the IP address of the Salesforce instance and aggregating all of these requests as coming from one location, so that other companies sharing the address are causing my instance to hit this limit?
If not can anyone suggest another cause?
This discussion suggests that it is the shared origin from Salesforce that is messing it up.
This only makes sense, though, if you are doing the geocode lookup from the server and not from the client. If you did it from the client, it would use the client's IP, and you would be dealing with each client's local lookup limits (see below).
If you are doing it on the server, you might also want to check whether what you are doing is actually allowed and not breaking Google's ToS. This discussion gives you some background on that, and also a solution if you need to fix this on the server (buy a licence).
To be complete, G_GEO_TOO_MANY_QUERIES can mean one of two things:
You exceeded the daily limit (too many in a day)
You exceeded the speed limit (too many requests in too short a period)
Google has not specified an exact limit as far as I know, but they don't seem to enjoy automated lookups. If you look at various libraries and plugins, you'll see that all of them force a delay between requests, and they often add a little randomness to the delay. You could experiment with this to see if it makes any difference; a rough sketch of the idea is below.
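For illustration only, here is that delay-plus-jitter pattern in Python. It assumes a `geocode(address)` function that returns a status code such as 620 along with the result; it is not the Salesforce/Apex implementation, just the throttling idea:

```python
import random
import time

def geocode_with_backoff(geocode, addresses, base_delay=0.5, max_retries=5):
    """Call geocode() for each address, backing off on 620 responses."""
    results = {}
    for address in addresses:
        for attempt in range(max_retries):
            status, data = geocode(address)
            if status != 620:  # not G_GEO_TOO_MANY_QUERIES
                results[address] = data
                break
            # exponential backoff plus a little randomness
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.5))
        # polite fixed delay (with jitter) between successive lookups
        time.sleep(base_delay + random.uniform(0, 0.25))
    return results
```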
Did this end up working for you? We're hitting a server-side 620 and all configuration looks OK... we have a premier license and upgraded to 250k requests per day.