stanl
04-24-2008, 04:52 AM
Which is faster is probably not the question - more like a sliding scale. I have large, related SQL Server [2005] tables where a date is stored as varchar, e.g. '4/24/2008',
and I run queries for data from several tables based on a >= comparison against that date.
If run via ADO I use Convert(datetime,[field]) >= 'mm/dd/yyyy', and if the tables are linked into an Access DB, CDate([field]) >= #mm/dd/yyyy#.
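For illustration, with a made-up table Orders and a varchar column OrderDate (both names hypothetical), the two forms look like:

    -- T-SQL sent through ADO; converts the column, then compares
    SELECT *
    FROM dbo.Orders
    WHERE CONVERT(datetime, OrderDate) >= '04/24/2008';

    -- Access/Jet SQL against the linked table; CDate does the conversion
    SELECT *
    FROM Orders
    WHERE CDate(OrderDate) >= #04/24/2008#;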
You would expect the native query to return rows faster, but in fact the Access query is about 4 times faster, even though more overhead appears to be involved. There has to be a threshold somewhere, though, where ADO would surpass it.
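One factor may be that converting the column itself (rather than the literal) keeps SQL Server from using any index on the field, so every row gets scanned. A persisted computed column plus an index might sidestep that - just a sketch with made-up names, untested:

    -- Hypothetical: materialize the conversion once, then index it
    ALTER TABLE dbo.Orders
        ADD OrderDateDT AS CONVERT(datetime, OrderDate, 101) PERSISTED;
    CREATE INDEX IX_Orders_OrderDateDT ON dbo.Orders (OrderDateDT);

    -- The filter is now sargable and can use the index
    SELECT *
    FROM dbo.Orders
    WHERE OrderDateDT >= '20080424';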
Just wondering if anyone has seen similar behavior. Stan