I have a LINQ 2 SQL query that's getting me a list of results for the month of February 2012. The resulting where clause is
DECLARE @p0 DateTime = '2012-02-01 00:00:00.000'
DECLARE @p1 DateTime = '2012-02-29 23:59:59.999'
....
WHERE (CONVERT(DATE, [t0].[DatePlaced]) >= @p0) AND (CONVERT(DATE, [t0].[DatePlaced]) <= @p1)
When this runs, I'm getting results for 3/1/2012 showing up as well as all the results for February 2012.
If I change the where clause to use BETWEEN then the results only contain dates for February.
WHERE [t0].[DatePlaced] BETWEEN @p0 AND @p1
I'm using .NET 4 and SQL Server 2008 R2, with and without SP1.
Switching to 2011 (with the query's end date set to '2011-02-28 23:59:59.999') yielded the same results: 3/1/2011 showed up as well.
Is there another way to get the results for just 2/2012 aside from using BETWEEN which LINQ 2 SQL doesn't support?
A DATETIME is only accurate to 1/300 of a second, so .999 rounds up to midnight of the next day. You can check this:
DECLARE @p1 DateTime = '2012-02-29 23:59:59.999';
SELECT #p1;
What do you get?
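For reference, a minimal sketch of that check; the commented results assume a DATETIME variable, which only stores increments of 1/300 of a second, versus DATETIME2, which keeps the value as given:
DECLARE @p1 DATETIME  = '2012-02-29 23:59:59.999';
DECLARE @p2 DATETIME2 = '2012-02-29 23:59:59.999';
SELECT @p1 AS as_datetime,   -- 2012-03-01 00:00:00.000 (rounded up to the next day)
       @p2 AS as_datetime2;  -- 2012-02-29 23:59:59.9990000 (stored as given)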
Instead of trying to figure out the last instant of the period (which will differ depending on the data type and its precision), what you want is an open-ended date range:
DECLARE @p0 DATE = '2012-02-01',
@p1 DATE = '2012-03-01';
....
WHERE [t0].[DatePlaced] >= @p0
AND [t0].[DatePlaced] < @p1
Even easier would be to just pass in the starting date and say:
DECLARE @p0 DATE = '2012-02-01';
....
WHERE [t0].[DatePlaced] >= @p0
AND [t0].[DatePlaced] < DATEADD(MONTH, 1, @p0)
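If the month arrives as a year and month number rather than a date, a minimal sketch for SQL Server 2008 (which lacks DATEFROMPARTS) could look like this; @year and @month are hypothetical parameters:
DECLARE @year INT = 2012, @month INT = 2;
-- first day of the requested month: DATEADD(MONTH, n, 0) counts whole months from 1900-01-01
DECLARE @p0 DATE = DATEADD(MONTH, (@year - 1900) * 12 + @month - 1, 0);
....
WHERE [t0].[DatePlaced] >= @p0
AND [t0].[DatePlaced] < DATEADD(MONTH, 1, @p0)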
For some elaborate ideas about datetime best practices:
Bad habits to kick : mis-handling date / range queries
For some info on why BETWEEN (and by extension >= AND <=) is evil:
What do BETWEEN and the devil have in common?
If you need to select by month often, you could consider adding two computed columns to your table - one for the month, one for the year:
ALTER TABLE dbo.YourTable
ADD DatePlacedYear AS YEAR(DatePlaced) PERSISTED
ALTER TABLE dbo.YourTable
ADD DatePlacedMonth AS MONTH(DatePlaced) PERSISTED
Those two new columns are automatically computed by SQL Server, they're persisted (i.e. part of the table's storage), and you can even put an index on them if that makes sense for you.
With those in place, you could now use a query like:
SELECT (columns)
FROM dbo.YourTable
WHERE DatePlacedYear = 2012 AND DatePlacedMonth = 2
to get all data from February 2012.
It's a classic space-vs-speed trade-off: by storing the two extra columns for each row you need more space, but in return querying gets easier, and if you have an index on (DatePlacedYear, DatePlacedMonth), your queries should (ideally) be quite fast.
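A minimal sketch of such an index, reusing the table and column names from above:
CREATE NONCLUSTERED INDEX IX_YourTable_YearMonth
ON dbo.YourTable (DatePlacedYear, DatePlacedMonth);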
Instead of using AddMilliseconds(-1), try using AddMilliseconds(-3).
See this question for how SQL Server handles milliseconds.
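A quick sketch of why -3 milliseconds: DATETIME only stores increments of roughly 3 milliseconds (.000, .003, .007, ...), so .997 is the last representable value before midnight:
SELECT CAST('2012-02-29 23:59:59.997' AS DATETIME),  -- 2012-02-29 23:59:59.997
       CAST('2012-02-29 23:59:59.998' AS DATETIME),  -- rounds down to 23:59:59.997
       CAST('2012-02-29 23:59:59.999' AS DATETIME);  -- rounds up to 2012-03-01 00:00:00.000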
Related
I have a weird problem in MySQL!
My query is:
SELECT * FROM aa WHERE problemTime>= '2016/03/20' AND problemTime<= '2016/04/20'
The result of this query is nothing, but when I change the first date to 2016/03/19 or 2016/03/21 I get results. I mean these queries:
SELECT * FROM aa WHERE problemTime>= '2016/03/21' AND problemTime<= '2016/04/20'
or
SELECT * FROM aa WHERE problemTime>= '2016/03/19' AND problemTime<= '2016/04/20'
The result in both cases (the 19th and the 21st) is
but when I use the 20th the result is nothing.
My main table is
I also changed the date format from 2016/03/20 to 2016-03-20 (i.e. changed / to -), but that made no difference either!
What's the problem?
You should really be running a query like this if your problemTime column is a DATETIME type:
SELECT * FROM aa
WHERE
problemTime>= str_to_date('2016/03/20', '%Y/%m/%d') AND
problemTime <= str_to_date('2016/04/20', '%Y/%m/%d')
Don't rely on implicit conversions between strings and dates. Leave your table data alone and explicitly convert your WHERE clause parameters to the same data type as the column. Also remember that a date "without" a time is actually midnight on the day in question, and midnight is like zero: it's the first thing that happens on any given day. A time of 6am on a given date is after midnight, so a query that asks for dates less than or equal to midnight on a particular date will exclude that 6am row.
This is good general DB practice: avoid converting table data where possible, because it can cause huge performance hits and wrong results.
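A small sketch of the midnight point in MySQL, assuming problemTime is a DATETIME column:
-- a bare date literal is really midnight on that day
SELECT CAST('2016/04/20' AS DATETIME);                                             -- 2016-04-20 00:00:00
-- so a row stamped later that day fails a <= comparison against the bare date
SELECT CAST('2016-04-20 06:00:00' AS DATETIME) <= CAST('2016/04/20' AS DATETIME);  -- 0 (excluded)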
Your "problemTime" column has a date with a time component. Do not convert the table data; change your WHERE clause instead (add the time).
SELECT * FROM aa WHERE problemTime>= '2016/03/20 00:00:00' AND problemTime<= '2016/04/20 23:59:59'
Try this, as per SQL Server:
SELECT * FROM aa
WHERE cast(problemTime AS date) between '2016/03/21' AND '2016/04/20'
I have a week column with week numbers as w0, w1, w2, .... I am trying to get the last six weeks' data. Here's the SQL query I am using:
SELECT * FROM week
WHERE uid = '9df984da-4318-1035-9589-493e89385fad'
AND report_week BETWEEN 'w52' AND 'w5';
'w52' is essentially week 52 in December 2015 and 'w5' is January 2016. The BETWEEN doesn't seem to work. What's the best way to get the data between those two weeks?
Here's the CREATE TABLE statement:
CREATE TABLE `week` (`uid` VARCHAR(255) DEFAULT '' NOT NULL,
`report_week` VARCHAR(7) NOT NULL,
`report_files_active` BIGINT DEFAULT NULL);
Essentially this table is populated from another table which has a date column; it uses the dates from that table and summarizes weekly data into this one.
Any help is appreciated.
Refer to this SO Discussion which details the reasons for a problem similar to yours.
BETWEEN 'a' AND 'b' actually translates to columnValue >= 'a' AND columnValue <= 'b'.
In your case 'w52' is greater than 'w5' due to the lexicographic ordering of strings, which means the BETWEEN clause will never return true (think of it as equivalent to saying BETWEEN 10 AND 1 instead of BETWEEN 1 AND 10).
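A quick sketch of the string comparison in MySQL (1 = true, 0 = false):
SELECT 'w5' < 'w52',                -- 1: 'w5' is a prefix of 'w52', so it sorts first
       'w2' BETWEEN 'w52' AND 'w5'; -- 0: the range is empty, so nothing can match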
Edit to my response:
Refrain from storing the week value as a string. Instead, here are a couple of approaches, in order of preference:
1. Have a timestamp column. You can then easily use MySQL's date functions to extract the week information from it. For a reference see this post.
2. Maintain two columns, YEAR and WEEKNO, where YEAR stores values like 2015, 2016, etc. and WEEKNO stores the week number. This way you can query data for any week in any year (see the sketch below).
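A minimal sketch of the second approach; the extra column names are illustrative, not taken from the question:
ALTER TABLE week
  ADD COLUMN report_year SMALLINT,
  ADD COLUMN report_weekno TINYINT;

-- last week of 2015 plus the first five weeks of 2016
SELECT *
FROM week
WHERE uid = '9df984da-4318-1035-9589-493e89385fad'
  AND ((report_year = 2015 AND report_weekno >= 52)
    OR (report_year = 2016 AND report_weekno <= 5));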
Please show your table structure and which database you're using, because it differs between systems. If the column is a timestamp, then we can use something like BETWEEN 'systemdate-6' AND 'systemdate'.
I'm connecting from MS SQL Server 2014 to a (ServiceNow) MySQL database via OpenQuery(). I would like to filter out records more than 24 hours old.
When I set a static date, it returns the thousands of rows I expect to see. However, when I try to use a calculated field, it runs but returns zero records.
select number, sys_updated_on
from OPENQUERY(ServiceNowUAT,
'Select number, sys_updated_on
FROM DATABASE.[SCHEMA].[TableName]
WHERE sys_updated_on > DATEADD(d, -2, NOW()) ')
I have also used the DATE_SUB() function, and various other forms of syntax. I've tried casting the calculated date as date, datetime, timestamp, varchar, and more. I've tried this in MS Query and SSIS as well. All fail to return results with this query, and other, similar queries once I add the "sys_updated_on > DATEADD(d, -2, NOW())" segment.
If I cast the sys_updated_on field as timestamp, it works, but it cranks the processing time up from about 10 seconds to 30+ minutes, which, of course, is not ideal (there are a few million rows in the table).
The sys_updated_on field is in the format "2015-02-10 10:24:17.000000".
The other relevant part is that I am pulling from a ServiceNow MySQL database using ODBC drivers provided by ServiceNow, not MySQL. I do not have a data map, so I cannot say for sure what the data type is. At this point, I'm guessing it's a string of some sort, and not a true timestamp/datetime, but I can't confirm this.
Does anyone have any ideas how to make this work so that it
a. returns results
b. does not take half an hour to run?
SELECT *
FROM OpenQuery(ServiceNowUAT,
'SELECT name FROM DATABASE.[SCHEMA].[TableName]
WHERE CAST(sys_updated_on as TIMESTAMP) BETWEEN DATEADD(DAY, -2, now()) and now()'
)
I know there is a function called ISDATE to validate DATETIME columns, but it works only for the SMALLDATETIME and DATETIME types.
Is there a similar way to validate the new data type DATETIME2 in SQL Server 2008 and 2012?
In SQL Server 2012, you can use TRY_CONVERT:
SELECT TRY_CONVERT(DATETIME2, '2012-02-02 13:42:55.2323623'),
TRY_CONVERT(DATETIME2, '2012-02-31 13:42:55.2323623');
Results:
2012-02-02 13:42:55.2323623 NULL
Or TRY_PARSE:
SELECT TRY_PARSE('2012-02-02 13:42:55.2323623' AS DATETIME2),
TRY_PARSE('2012-02-31 13:42:55.2323623' AS DATETIME2);
(Same results.)
Sorry that I don't have a clever answer for you for < SQL Server 2012. You could, I guess, say
SELECT ISDATE(LEFT('2012-02-02 13:42:55.2323623', 23));
But that feels dirty.
TRY_CONVERT documentation on Microsoft Docs
TRY_PARSE documentation on Microsoft Docs
Be careful using the LEFT(..., 23) solution on database systems that use a date format other than mdy (and on SQL Server 2008). You can see the date format of the current session using the DBCC USEROPTIONS command.
On a database system using the German date format (dmy), the LEFT(..., 23) solution doesn't work (detected on dates with day > 12). See the following test case:
-- test table using a DATETIME and DATETIME2 column.
CREATE TABLE dt_vs_dt2 (
dt DATETIME,
dt2 DATETIME2
);
-- set a datetime value with a day > 12.
DECLARE @date_value AS DATETIME = DATEADD(DAY, 18 - DAY(GETDATE()), GETDATE());
-- insert this value into both columns.
-- note: this simulates running the insert on a day > 12.
INSERT INTO dt_vs_dt2 VALUES (@date_value, @date_value);
-- let's have a look at the values.
-- the values look the same (the datetime2 is more precise as expected).
SELECT dt, dt2 FROM dt_vs_dt2;
-- now we expect both values are valid date values.
-- to validate the datetime2 value, the LEFT(..., 23) solution is used.
SELECT ISDATE(dt), ISDATE(LEFT(dt2, 23))
FROM dt_vs_dt2;
How to solve that?
You can use a CAST(column_name AS DATETIME) instead of the LEFT(..., 23) to make this work:
-- using a CAST(... AS DATETIME) instead of `LEFT(..., 23)` seems to work.
SELECT dt, CAST(dt2 AS DATETIME) AS dt2
FROM dt_vs_dt2;
-- now both values are valid dates.
SELECT ISDATE(dt) AS dt, ISDATE(CAST(dt2 AS DATETIME)) AS dt2
FROM dt_vs_dt2;
demo on dbfiddle.uk (using dmy) / demo on dbfiddle.uk (using mdy)
On SQL Server 2012 and later you should use the TRY_PARSE / TRY_CONVERT solution described in @Aaron Bertrand's answer. The CAST(... AS DATETIME) solution explained in this answer should also work.
Is there an easy way to do a GROUP BY DATE(timestamp) that includes all days in a period of time, regardless of whether there are any records associated with that date?
Basically, I need to generate a report like this:
24 Dec - 0 orders
23 Dec - 10 orders
22 Dec - 8 orders
21 Dec - 2 orders
20 Dec - 0 orders
Assuming you have more orders than dates, something like this could work:
select date, count(id) as orders
from
(
  SELECT DATE_ADD('2008-01-01', INTERVAL @rn:=@rn+1 DAY) as date
  from (select @rn:=-1) t, `order`
  limit 365
) d left outer join `order` using (date)
group by date
One method is to create a calendar table and join against it.
I would create it permanently, and then create a task that inserts new dates; it could run weekly, daily, monthly, etc.
Note that I am assuming you are converting your timestamp into a date.
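A minimal sketch of that approach in MySQL; the calendar and orders table/column names are illustrative, not taken from the question:
CREATE TABLE calendar (dt DATE PRIMARY KEY);
-- (populate calendar with one row per date, e.g. from the scheduled task mentioned above)

SELECT c.dt, COUNT(o.id) AS orders
FROM calendar c
LEFT JOIN orders o ON DATE(o.created_at) = c.dt
WHERE c.dt BETWEEN '2008-12-20' AND '2008-12-24'
GROUP BY c.dt
ORDER BY c.dt DESC;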
Instead of using GROUP BY, make a table (perhaps a temporary table) which contains the specific dates you want, for example:
24 Dec
23 Dec
22 Dec
21 Dec
20 Dec
Then, join that table to the Orders table.
You need to generate an intermediate result set with all the dates you want included in the output.
If you're doing this in a stored proc, you could create a temp table or table variable (I don't know MySQL's capabilities).
Once you have all the dates in a table or result set of some kind, just join to the real data from it using an outer join.
In SQL Server it would be like this
Declare @Dates Table (aDate DateTime Not Null)
Declare @StartDt DateTime Set @StartDt = 'Dec 1 2008'
Declare @EndDt DateTime Set @EndDt = 'Dec 31 2008'
While @StartDt < @EndDt Begin
Insert @Dates(aDate) Values(@StartDt)
Set @StartDt = DateAdd(Day, 1, @StartDt)
End
Select D.aDate, Count(O.OrderDate) Orders
From @Dates D Left Join
OrderTable O On O.OrderDate = D.aDate
Group By D.aDate
In a data warehouse, the method taken is to create a table that contains all dates and create a foreign key between your data and the date table. I'm not saying that this is the best way to go in your case, just that it is the best practice in cases where large amounts of data need to be rolled up in numerous ways for reporting purposes.
If you are using a reporting layer over SQL Server, you could just write some logic to insert the missing dates within the range of interest after the data returns and before rendering your report.
If you are creating your reports directly from SQL Server and you do not already have a data warehouse and there isn't the time or need to create one right now, I would create a date table and join to it. The formatting necessary to do the join and get the output you want may be a bit wonky, but it will get the job done.
There's a pretty straightforward way to do this… except that I can't remember it. But I adapted this query from this thread:
SELECT
DISTINCT(LEFT(date_field,11)) AS `Date`,
COUNT(LEFT(date_field,11)) AS `Number of events`
FROM events_table
GROUP BY `Date`
It works in MySQL too