Converting Decimal Value to Hours and Minutes and also to Hours - sql-server-2008

I store times as decimal values like this:
1.30 == 1 hour 30 minutes
2.70 == 3 hours 10 minutes
I need to convert 2.70 to 3 hours 10 minutes,
and also to convert 3 hours 10 minutes back into hours.
How can I solve this in SQL Server 2008?
Please help me.

If 1.30 = 90 minutes, then 1 unit = 69.23076923076923 minutes.
Then convert 2.70 to minutes:
2.70 * 69.23076923076923 = 186.9230769230769 (minutes)
186.9230769230769 / 60 (minutes in an hour) = 3.115384615384615
From 3.115384615384615 you take the integer part as the hours (3) and multiply the fractional part by 60 to get the minutes (0.115384... * 60 ≈ 6.9).
I don't know exactly how you came up with the formula, but the conversion factor is going to give you decimal places that you will have to take into account (round or eliminate them).
Use the same procedure, but inverted, to get from 3 hours 10 minutes back to 2.70.
BUT, as was mentioned before, I don't understand why you are using SQL to do this formula calculation.
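If the encoding is really "hours before the decimal point, minutes after it" (so 2.70 means 2 h 70 min, i.e. 3 h 10 min, which matches both of the question's examples), then no conversion factor is needed at all. A minimal sketch of that interpretation in Python (the function names are mine):

```python
def decimal_to_hm(value):
    # Treat the two digits after the point as a minute count
    # that may exceed 59, e.g. 2.70 -> 2 h 70 min.
    hours = int(value)
    minutes = round((value - hours) * 100)
    # Carry overflowing minutes into hours: 2 h 70 min -> 3 h 10 min.
    hours += minutes // 60
    minutes %= 60
    return hours, minutes

def hm_to_decimal(hours, minutes):
    # Inverse mapping; always returns the normalized form,
    # so 3 h 10 min -> 3.10 (not 2.70).
    return hours + minutes / 100

print(decimal_to_hm(1.30))   # (1, 30)
print(decimal_to_hm(2.70))   # (3, 10)
```

The same FLOOR / modulo arithmetic translates directly into a SQL expression if the calculation must stay in the database.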

Related

How to sum time interval stored as string in MySQL like 5 days 10:20:00

I need to write a select query to sum the time interval from MySQL table where the time interval is stored as text and in the format similar to following
10 days 3:28:31
In a PostgreSQL query we can simply cast with ::interval, which converts the text above to an interval that SUM can be applied to, but I am unable to do the same in MySQL. Any help would be appreciated.
MySQL does not have an interval data type. It does use the interval keyword -- which is a bit confusing. But that is a syntactic element, rather than a data type.
One thing you can do is use the time data type. This supports times up to 838 hours -- or about 35 days. That is enough for many purposes.
Otherwise, the recommendation is to use a single time unit and to store the value as a numeric quantity. For instance, "5 days 10:20:00" would be:
5.43055555556 days (5 + 10 / 24 + 20 / (24*60))
130.333333333 hours
7820 minutes
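Converting each string to a single numeric unit, as suggested above, has to happen outside of a cast since MySQL lacks an interval type. A sketch of that parsing in Python (the format assumptions, an optional "N days" prefix followed by H:MM:SS, are mine):

```python
import re

def interval_to_minutes(text):
    # Optional "<n> day(s)" prefix, then hours:minutes:seconds.
    m = re.fullmatch(r'(?:(\d+)\s+days?\s+)?(\d+):(\d{2}):(\d{2})', text.strip())
    days, hours, minutes, seconds = (int(g or 0) for g in m.groups())
    return days * 24 * 60 + hours * 60 + minutes + seconds / 60

print(interval_to_minutes('5 days 10:20:00'))   # 7820.0
# Once every value is a plain number of minutes, summing is trivial:
total = sum(interval_to_minutes(s) for s in ['5 days 10:20:00', '10 days 3:28:31'])
```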

Get average day or week values

I have statistical data like this:
time val1
1424166578 51
1424166877 55
1424167178 57
1424167477 57
time is a unix timestamp. There is one record every 5 minutes excluding nights and sundays. This continues over several weeks.
Now I want to get these values for an average day and an average week. The result should still contain one value for every 5 minutes as usual, but with each value averaged over all past days or weeks.
The result should look like this:
time val1
0 43.423
300 46.635
600 51.887
...
So time could be a timestamp with relative time since day or week start. Perhaps it is better to use DATETIME... not sure.
If I use GROUP BY FROM_UNIXTIME(time, '%Y%m%d') for example I get one value for the whole day. But I want all average values for all days.
You seem to be interested in grouping rows into five-minute buckets within the day, rather than by date. This is fairly straightforward:
SELECT
HOUR(FROM_UNIXTIME(time)) AS HH,
(MINUTE(FROM_UNIXTIME(time)) DIV 5) * 5 AS MM,
AVG(val1) AS VAL
FROM your_table
WHERE time > UNIX_TIMESTAMP(CURRENT_TIMESTAMP - INTERVAL 7 DAY)
GROUP BY HH, MM
The following sample shows how each timestamp is clamped to its bucket:
time FROM_UNIXTIME(time) HH MM
1424166578 2015-02-17 14:49:38 14 45
1424166877 2015-02-17 14:54:37 14 50
1424167178 2015-02-17 14:59:38 14 55
1424167477 2015-02-17 15:04:37 15 00
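The same clamping can be expressed as an offset in seconds since the start of the day, which matches the output format the question asks for (0, 300, 600, ...). A sketch in Python, using the question's sample rows and UTC (your server's time zone may place these rows in different clock hours):

```python
from collections import defaultdict
from datetime import datetime, timezone

rows = [(1424166578, 51), (1424166877, 55), (1424167178, 57), (1424167477, 57)]

buckets = defaultdict(list)
for ts, val1 in rows:
    dt = datetime.fromtimestamp(ts, tz=timezone.utc)
    # Seconds since midnight, clamped down to the nearest 5 minutes (300 s).
    offset = (dt.hour * 3600 + dt.minute * 60 + dt.second) // 300 * 300
    buckets[offset].append(val1)

# One average per 5-minute slot, across however many days the data spans.
averages = {off: sum(vals) / len(vals) for off, vals in sorted(buckets.items())}
```

For an average week rather than an average day, add `dt.weekday() * 86400` to the offset so the buckets span Monday 00:00 to Sunday 23:55.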
I would approach this as:
select date(from_unixtime(time)) as day, avg(val1)
from your_table t
group by date(from_unixtime(time))
order by day;
Although you can use the format argument, I think of that more for converting the value to a string than to a date/time.

Calculate time difference from MySQL time

I am using the following call in my query to calculate the amount of time between now and a timestamp:
(NOW() - bu.banusr_expire)
bu.banusr_expire is a TIMESTAMP field.
I'm a little confused about the number it is returning.
e.g. it returns -928 when there is about a 9.5 minute difference.
This makes me think that -928 means -9 minutes and 28 seconds (or 15 seconds; this last pair of digits seems to run from 0-99), but that seems completely wrong.
My question is, how can this value be converted to minutes?
If you can be confident that the difference between the two times will always be less than 839 hours, then you can use TIMEDIFF().
(UNIX_TIMESTAMP() - UNIX_TIMESTAMP(bu.banusr_expire)) / 60
should give you the number of minutes ;)
Use a unix timestamp:
select UNIX_TIMESTAMP(NOW()) - UNIX_TIMESTAMP(fieldname)
This will give you the difference in seconds; divide by 60 for minutes.
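For what it's worth, the confusing -928 in the question comes from MySQL coercing both datetime values to numbers of the form YYYYMMDDHHMMSS and subtracting them as plain integers, so the result is not a count of seconds at all. A sketch of that arithmetic in Python (the two timestamps are made up to reproduce a ~9.5 minute gap):

```python
from datetime import datetime

def as_mysql_number(dt):
    # MySQL's implicit datetime-to-number coercion: YYYYMMDDHHMMSS.
    return int(dt.strftime('%Y%m%d%H%M%S'))

now = datetime(2015, 2, 17, 14, 50, 9)
expire = datetime(2015, 2, 17, 14, 59, 37)   # about 9.5 minutes later

naive = as_mysql_number(now) - as_mysql_number(expire)
print(naive)                                  # -928: digit-wise, not seconds

minutes = (expire - now).total_seconds() / 60
print(minutes)                                # ~9.47 real minutes
```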

mysql: SELECTing rows X hours old to the nearest hour

I have a clause in mysql query to select with a timestamp of 3 days ago, to the nearest day:
WHERE TO_DAYS(wit_matches.created) = TO_DAYS(NOW() - INTERVAL 3 DAY)
I want to change this so that it selects rows with a timestamp of 3 days ago, but to the nearest hour - i.e. 72 hours to the nearest hour (this is a cron job that will run once per hour).
What's the best way of achieving this?
You can try this.
WHERE wit_matches.created BETWEEN (NOW() - INTERVAL 73 HOUR) AND (NOW() - INTERVAL 72 HOUR)
(Strictly speaking, an age of 72 hours "to the nearest hour" is the range 71.5 < age < 72.5 hours, so you may want to shift the window by half an hour.) It is very easy once you get the hang of it.

mysql time comparison

I have job_start and job_end times, and TIMEDIFF will give me the time difference. Now I want to see if that job took more than 2 hrs 30 min. How do I compare it? I am getting errors if I do it like this:
timediff(job_start,job_end)> '2:30:00'
timediff(job_start,job_end)> time(2:30:00)
timediff(job_start,job_end)> time_format(2:30:00)
None of the above syntax works.
From the MySQL docs for the function TIMESTAMPDIFF:
The unit for the result (an integer) is given by the unit argument.
The legal values for unit are the same as those listed in the description of the TIMESTAMPADD() function,
which should be one of the following: MICROSECOND (microseconds), SECOND, MINUTE, HOUR, DAY, WEEK, MONTH, QUARTER, or YEAR.
The result is an integer. I recommend MINUTE:
TIMESTAMPDIFF(MINUTE, job_start, job_end) > 150
(2 * 60 + 30) minutes = 150 minutes = 2.5 hours
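The same comparison, sketched in Python to show the arithmetic (the job times are made-up examples):

```python
from datetime import datetime

job_start = datetime(2021, 6, 1, 9, 0, 0)     # illustrative values
job_end = datetime(2021, 6, 1, 11, 45, 0)

# Equivalent of TIMESTAMPDIFF(MINUTE, job_start, job_end):
# the whole number of minutes between the two datetimes.
minutes = int((job_end - job_start).total_seconds() // 60)

took_too_long = minutes > 150   # 2 h 30 min == 150 min
print(minutes, took_too_long)   # 165 True
```

Comparing whole minutes against an integer threshold avoids the string-vs-time comparison problems the question ran into.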