Google fit aggregate limit of 90 days - google-fit

I have understood that there is a limit of 90 days when trying to fetch aggregated data (Google Fit heart points). I know this because a request spanning more than 90 days fails. But it would be great if someone could point me to documentation somewhere that defines this limit. I have tried and failed to find it.
Thanks & Regards,
Chitra
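Assuming the limit is real (as the failing request suggests), a common workaround is to split a long range into windows of at most 90 days and issue one aggregate request per window. The sketch below is only the chunking logic; `split_range` and `to_millis` are hypothetical helper names, and the actual request (with `startTimeMillis`/`endTimeMillis` in the body) would be issued once per window.

```python
from datetime import datetime, timedelta, timezone

MAX_WINDOW = timedelta(days=90)

def split_range(start, end, max_window=MAX_WINDOW):
    """Split [start, end) into consecutive windows no longer than max_window."""
    windows = []
    cursor = start
    while cursor < end:
        window_end = min(cursor + max_window, end)
        windows.append((cursor, window_end))
        cursor = window_end
    return windows

def to_millis(dt):
    """Convert an aware datetime to epoch milliseconds for the request body."""
    return int(dt.timestamp() * 1000)
```

You would then loop over the windows, send one aggregate request per window, and sum the heart points from the responses.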

Related

importXML is making too many Mapquest transactions

I am trying to use the following formula and a key I obtained from mapquest (for the free service) to initiate their location services and calculate the distances between one variable location and 20 of my plant locations.
=importXML("http://mapquestapi.com/directions/v2/route?key=*****************&outFormat=xml&from=" & $B$3 & "&to=" & 56244,"//response/route/distance")
This is working flawlessly, except that after using it for a short period of time I received an email stating I have used 80% of my allotment (15,000 transactions) for the month.
The variable location has only been changed around 20-25 times this month, so I don't see how I could have used that many transactions. Can someone explain what exactly this formula is doing and how I could make it more efficient, if possible? I feel like it has to be making transactions that are unnecessary. Keep in mind I do not need the actual directions; all I need is the driving mileage.
Thanks in advance.
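A likely culprit is that importXML re-fetches its URL every time the sheet recalculates (on open, on edits elsewhere, etc.), not just when the route changes, so one cell can burn many transactions. One way to see the fix is memoization: cache the answer per (origin, destination) pair and only call the API on a cache miss. The sketch below is a hypothetical illustration with the actual HTTP call injected as a function, since the MapQuest specifics don't matter to the idea.

```python
# Hypothetical sketch: remember distances per (origin, destination) so that
# repeated recalculations reuse a prior answer instead of spending another
# API transaction.
_cache = {}

def get_distance(origin, destination, fetch):
    """fetch(origin, destination) performs the real API call; we memoize it."""
    key = (origin, destination)
    if key not in _cache:
        _cache[key] = fetch(origin, destination)
    return _cache[key]
```

In a spreadsheet, the equivalent trick is to copy the fetched distances into static values (or a script-managed cache) once computed, so the live importXML call only exists while a route is new.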

Alerting framework for incoming traffic

Currently I put hourly traffic (total number of input requests) of my website in a MySQL table. I keep data for the last 90 days.
I want to check every hour (say, at the 6th hour) whether the traffic has increased or decreased beyond some threshold compared to the 6th-hour traffic of the last 7 days or last 30 days. Basically, I see a pattern in the traffic: different hours have different values.
-> Is there some alerting framework which I can use as it is for this purpose?
-> If yes, can someone suggest some open source?
-> If no, I am thinking of keeping a running average of the last 7 days / last 30 days in a MySQL table for every hour, and accordingly writing a script to generate alerts based on those numbers. I am not very sure whether I should use the mean, median, or standard deviation. Can someone enlighten me there?
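For the roll-your-own option, one standard way to turn "beyond some threshold" into a principled rule is a z-score: compare the current hour's count against the mean and standard deviation of the same hour-of-day over the previous N days. A minimal sketch (function name and the threshold of 3 are assumptions, not from the question):

```python
import statistics

def hour_alert(current, history, z_threshold=3.0):
    """history: traffic counts for this same hour-of-day over previous days.

    Returns True if `current` deviates by more than z_threshold standard
    deviations from the historical mean for that hour.
    """
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        # Perfectly flat history: any change at all is an anomaly.
        return current != mean
    z = abs(current - mean) / stdev
    return z > z_threshold
```

On the mean-vs-median question: the mean is fine for stable traffic, but the median is more robust if the history itself contains spikes; the standard deviation is what scales the threshold to how noisy each hour normally is.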

Rails - how to search through a big data collection and display them in a few-second-intervals?

I have a database table of hundreds of thousands of records that I fetch and then compute the geographic distance between. The problem is that this search takes 15-20 seconds, so I am trying to speed it up.
I don't think I can do more with column indexing, as I am grabbing the whole database table. Most of the time is consumed computing the geographic distance (from longitude and latitude). I don't know if there's a way to speed up this computation.
This task is, to my mind, almost the same as searching for flight tickets, where you set a "from city" and a "destination city" and the search engine gradually displays the results it finds to the user, in intervals:
it displays some results
after 2 seconds it adds more computed results
after another 2 seconds, more results
and so on
I think this way of displaying results would be best for my case. However, how does such an engine work? How can one build a search that displays new and new results every 2 seconds or so?
As the application is written in Ruby on Rails, candidates for this kind of search would be:
AJAX
delayed_job
Possibly something else?
Or am I thinking about this problem in the wrong way, and is there a better one to solve it?
Thank you.
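Since most of the time goes into the distance computation, it helps to see the two pieces separately: the per-pair cost (the haversine formula below) and the batching that makes incremental display possible, where each chunk's results are pushed to the client (e.g. via AJAX polling a background job) as soon as the chunk is done. This is a sketch, not Rails code; the function names are mine.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def batched(records, size):
    """Yield records in fixed-size chunks, so each chunk's distances can be
    computed and sent to the client before the next chunk starts."""
    for i in range(0, len(records), size):
        yield records[i:i + size]
```

A cheap extra win: filter by a bounding box on latitude/longitude first (which *can* use indexes) so the expensive trigonometry only runs on nearby candidates.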

MySQL: Arithmetic Operator Multiplication

I am trying to calculate from a field within a table, as part of some date-range work. Currently I have a table with a field named interval which stores minutes, and I would like to convert this field to seconds. I know I could probably store the data as seconds instead, but out of curiosity, why is this not working for me?
SELECT `table`.`interval` * 60 AS interval_seconds FROM `table`;
If the field contained 60, I was expecting 3600 as output, but instead I get 60. If I change the value to 5, instead of 300 I get 360. Any idea what I might be doing wrong?
Glad you found that. I used to pepper my tables with enums, a long time ago, but issues similar to the one you just faced made me come to dislike them very much. In a numeric context MySQL evaluates an ENUM column to its internal index (1, 2, 3, ...) rather than to its string value, which is why the multiplication produced those surprising numbers. For a few more details on the dirty aspects of ENUMs, see this post.
Happy coding!
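As a sanity check that the arithmetic itself was never the problem, here is the same query against a genuinely numeric column, run through SQLite as a stand-in for MySQL (column name `interval_min` is mine, chosen to avoid quoting a reserved word):

```python
import sqlite3

# With a truly numeric column, `interval * 60` behaves exactly as expected,
# which points the finger at the column's type (ENUM), not the expression.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (interval_min INTEGER)")
conn.execute("INSERT INTO t VALUES (60)")
row = conn.execute(
    "SELECT interval_min * 60 AS interval_seconds FROM t"
).fetchone()
print(row[0])  # 3600
```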

What's a good temporal distance to no longer display relative dates?

Relative dates are great for showing the temporal incidence of recent activity, but at what distance does it become an inconvenience for the user to see a relative date rather than an absolute one?
Let's assume the context is a forum.
This is completely dependent on whatever the dates are associated with. If it's an update feed (like your SO recent activity), then it may be a good idea to display relative dates by the hour. If it's articles talking about software updates, then days would be more appropriate...
For me, a week is about the limit. I don't know of any industrial-psychology studies to support it.
I just display today and yesterday as relative dates, like Facebook does.
"3 days ago" makes me wonder which day it was, and I find it confusing.
Not to mention Flickr's "three months ago" without any detailed info :(
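The rule these answers converge on is: relative wording for the very recent past, absolute dates beyond a cutoff. A minimal sketch, with the one-week cutoff as a configurable assumption:

```python
from datetime import date

def display_date(d, today, cutoff_days=7):
    """Relative wording for the recent past, an absolute date beyond the cutoff."""
    delta = (today - d).days
    if delta == 0:
        return "today"
    if delta == 1:
        return "yesterday"
    if delta <= cutoff_days:
        return f"{delta} days ago"
    return d.isoformat()
```

A hybrid that addresses the "which day was it?" complaint is to show the relative form but put the absolute date in a tooltip, so neither reader loses information.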