SQL View Linked in Access Times Out

I have a SQL view that takes 4-5 seconds to run with no filters when I run it in SSMS. If I try to open the linked "table" in Access 2010, it times out.
In Options - Client Side Settings, I set the OLE/DDE timeout to 0 and the ODBC timeout to 0. I still get "ODBC--call failed. [Microsoft][ODBC SQL Server Driver]Query timeout expired (#0)". Once I click OK, I get another message: "[current application name] can't open the table in Datasheet view".
I just don't understand how I can't open this linked table in Access when the underlying view only has 88 records right now. There are a lot of columns but few results, and it only takes a few seconds to run in SSMS. Why does it time out and have such a problem as a linked table in Access?
Any help is greatly appreciated.
Thanks!

I was looking at this puzzle with a colleague. Translating this 118-line query, with its 30 table joins, into an Access query would be difficult and would still perform poorly.
Instead, I am breaking the sections of the giant view into separate, smaller views. Each independent view will be joined in an Access query, so each section of the query can be filtered independently, allowing for smaller result sets and improving overall performance.
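A minimal sketch of the idea, assuming hypothetical table and view names (the real view's sections will differ):

-- Each logical section of the giant view becomes its own small view.
CREATE VIEW dbo.vw_OrderHeader AS
SELECT o.OrderID, o.CustomerID, o.OrderDate
FROM dbo.Orders AS o;
GO
CREATE VIEW dbo.vw_OrderTotals AS
SELECT d.OrderID, SUM(d.Quantity * d.UnitPrice) AS OrderTotal
FROM dbo.OrderDetails AS d
GROUP BY d.OrderID;
GO

Each view is then linked separately in Access and joined in an Access query, so each section can be filtered on its own before the join.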

Related

SSRS Report Timing out in Production Server (except after refreshing 3 times)

The report works fine on the DEV and QA servers, but when placed in Production the following error comes up:
An error occurred during client rendering.
An error has occurred during report processing.
Query execution failed for dataset 'Registration_of_Entity'.
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
The strange part is that the Admins have assured me that this report has been set up so there is no timeout at all.
If I refresh the report 3 times every morning, the error message goes away.
What can I do to fix this issue so that the report never receives this error?
There are several steps to resolve this issue correctly.
I advise following them in the following order:
1. Reduce the query execution time
Execute the query of the DataSet Registration_of_Entity in SSMS and see how long it takes to complete.
If your query requires more time to execute than the timeout specified for the DataSet, you should first try to reduce this time, for example:
Change the query structure (rethink joins, use CTEs, ...)
Add indexes
Looking at the execution plan can help.
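A minimal sketch of both ideas, assuming hypothetical table and column names:

-- Index a column the dataset query joins or filters on.
CREATE INDEX IX_Registration_EntityID ON dbo.Registration (EntityID);

-- Pull a repeated or deeply nested subquery into a CTE.
WITH ActiveEntities AS (
    SELECT EntityID, EntityName
    FROM dbo.Entity
    WHERE IsActive = 1
)
SELECT r.RegistrationID, a.EntityName
FROM dbo.Registration AS r
JOIN ActiveEntities AS a ON a.EntityID = r.EntityID;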
2. Reduce the query complexity
Do you need all those rows/columns?
Do you need to have all these calculations on the database side?
Could it be done in the report instead?
You could try to:
Reduce the query complexity
Split the query into smaller queries
Again, looking at the execution plan can help.
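As one hedged sketch of the splitting idea (names are hypothetical): instead of a single dataset that joins and aggregates everything, use one dataset for the detail rows and a second, much smaller one for the totals.

-- Dataset 1: detail rows only, with the aggregation removed.
SELECT r.RegistrationID, r.EntityID, r.RegisteredOn
FROM dbo.Registration AS r
WHERE r.RegisteredOn >= @StartDate;  -- @StartDate supplied as a report parameter

-- Dataset 2: the pre-aggregated summary, computed on the database side.
SELECT r.EntityID, COUNT(*) AS RegistrationCount
FROM dbo.Registration AS r
WHERE r.RegisteredOn >= @StartDate
GROUP BY r.EntityID;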
3. Explore additional optimizations not related to the query itself
You really need this query, but do you need the data in real time?
Are there a lot of other queries being executed on this server?
You could look into:
Caching
Replication / Load Balancing
Note that from SSRS 2008 R2, the new Shared DataSets can be cached. I know it doesn't apply in your case, but who knows, it could help others.
4. Last resort
If all the above steps failed to solve the issue, then you can increase the timeouts.
Here is a link to a blog post explaining the different timeouts and how to increase them.
Do you know if your query is becoming deadlocked? It could be that the report gets blocked on the server during peak times.
Consider optimizing your query or, if the data can be read uncommitted, add WITH (NOLOCK) after each table in the FROM and JOIN clauses. Be sure to read up on WITH (NOLOCK) if you are unfamiliar with it, so you know what read uncommitted can do.
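For illustration, the hint placement looks like this (table names are hypothetical; NOLOCK reads uncommitted data, so the results may include in-flight changes):

SELECT e.EntityID, e.EntityName, r.RegisteredOn
FROM dbo.Entity AS e WITH (NOLOCK)
JOIN dbo.Registration AS r WITH (NOLOCK)
    ON r.EntityID = e.EntityID;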

SQL Server 2008 Reporting Services slow reports

I have a problem in SQL Server 2008 Reporting Services. The report is sometimes too slow to render (it takes more than 30 minutes), although when I take the query and execute it in SQL Server Management Studio, it takes no more than 25 seconds.
The query returns a large table (about 5000 rows) that I use to draw a pie chart in the report. I tried to optimize the query so that it returns only 4 rows, but the report was slow again.
What confuses me is that sometimes the report (with different input) is as fast as the query (about 30 seconds). I thought it might be fast because of a low number of users, so I had some colleagues view it at the same time, but the reports were still fast. I tried changing the configuration, but I had no luck.
I've been searching for a solution for this problem for more than two months, so if anyone could help me on this I will be very thankful.
If you have access to the ReportServer SQL database, execute the following query (or similar) against the ExecutionLog view:
-- TimeDataRetrieval, TimeProcessing and TimeRendering are reported in milliseconds.
SELECT TimeStart, TimeEnd, TimeDataRetrieval, TimeProcessing, TimeRendering, Status, ReportID
FROM ExecutionLog
This will provide you with a good breakdown of your report rendering (with different parameters).
Pay close attention to TimeRendering, TimeProcessing and TimeDataRetrieval.
Large or high values for any of these columns will illustrate where your bottleneck is.
One problem I have experienced in the past: when you return a fairly large dataset to the report (5000 rows is large enough for this scenario) and then use the built-in SSRS filtering, rendering is very slow, and this shows up as a very high TimeRendering value.
As much of that work as possible should be done at the database layer; grouping and filtering do not perform well with large amounts of data when performed in the SSRS report itself.
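For example, instead of returning all the rows and letting an SSRS filter cut them down, push the filter and grouping into the dataset query itself (table and column names here are hypothetical):

-- Filtering and grouping happen on the database side, so the report
-- receives only the handful of rows the pie chart actually needs.
SELECT Region, SUM(SalesAmount) AS TotalSales
FROM dbo.Sales
WHERE SaleDate >= @StartDate  -- report parameter instead of an SSRS filter
GROUP BY Region;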

Vtiger Report generation failed! Too many tables; MySQL can only use 61 tables in a join

Hello, I am using VTiger CRM version 5.1, and when a user is trying to generate the report called "Last Month activities", only the following error is shown in the result area:
Report generation failed!
Too many tables; MySQL can only use 61 tables in a join
The strangest thing about this behaviour is that under the ADMIN user everything works as expected.
Can anyone advise me on this one?
61 is a hard-coded limit on the number of tables used in a single query. There is no way around it short of changing MySQL's source code and recompiling. If the report works for one user but not for another, that probably means VTiger CRM joins more tables when the user is not ADMIN.
You can also increase the number of open tables:
The cache of open tables is kept at a level of table_cache entries. The default value is 64; this can be changed with the --table_cache option to mysqld. Note that MySQL may temporarily open more tables than this to execute queries.
http://dev.mysql.com/doc/refman/5.0/en/server-system-variables.html#sysvar_table_cache
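A minimal sketch of raising that cache, assuming MySQL 5.0 where the variable is still named table_cache (later versions renamed it table_open_cache). Note that this tunes the open-table cache only; it does not lift the 61-table join limit:

SET GLOBAL table_cache = 512;  -- or set table_cache=512 in my.cnf
SHOW VARIABLES LIKE 'table_cache';  -- verify the new value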

Access 2007 First cn.Execute Statement very slow

I am currently working on an application in Access 2007 with a split FE and BE. The FE is local, with the BE on a network share. To eliminate some of the issues found with using linked tables over a network, I am attempting, through VBA using ADO, to load two temp tables with data from two linked tables when the application first loads, using cn.Execute "INSERT INTO TempTable1 SELECT * FROM LinkedTable1" and cn.Execute "INSERT INTO TempTable2 SELECT * FROM LinkedTable2".
LinkedTable1 has 45,552 records in it and LinkedTable2 has 45,697 records in it.
The first Execute statement takes anywhere from 50-85 seconds, yet the second takes no more than 9 seconds. These times are consistent. To see whether there were issues with one of the tables and not the other, I switched the order of the statements in my code, and the times still come out the same (the first Execute is way too long and the second is very fast). (As a side note, I have also tried DAO using the CurrentDb.Execute command, with no different results.) This would make sense to me if the first statement processed more records than the second, but although the difference is small, the second table has more records than the first!
Does anyone have ANY suggestions on why this is happening and/or how to get this first execute statement to speed up?
Thanks in advance!
ww
What indexes and primary keys do you have defined on the two temp tables? Updating the indexes as the data is appended could be one reason one table is slower than the other.
My guess is that there are two sources for the difference:
the initial creation of the remote LDB file when you execute the first INSERT statement. This shows up as overhead on the first SQL command, when it is actually a cost that persists through both.
caching: the file is likely small enough that Jet/ACE pulls large chunks of it across the wire (the header and metadata, plus the requested data pages) during the first operation, so much less of the data is missing from local memory when the second command is issued.
My question is why you are having problems with linked table performance in the first place. Solve that and you then won't have to muck about with temp tables. See Tony Toews's Performance FAQ.

How to increase SQL Server database performance?

I have a table in a SQL Server database with only 900 records and 4 columns.
I am using LINQ to SQL. I am trying to retrieve data from that table, and for this I have written a select query.
It's not returning data from the database; it's showing a timeout error.
Please give me some ideas. First, how can I increase the timeout, and second, how can I increase the performance of the query so the data can be accessed easily?
Thanks
That is a tiny table; there is something very wrong with either your database or your application.
Try seeing what is happening in the database with SQL Profiler.
If you have just 900 records and four columns then unless you are storing many megabytes of data in each field the query should be very fast. I think your problem is that the connection is failing, possibly due to a firewall or other networking problem.
To debug, I'd suggest running a simpler query to see if you can get any data at all. Also try running the same query from SQL Server Management Studio to see if it works there.
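As a minimal sanity check (the table name is hypothetical), something like this in SSMS should return almost instantly for a 900-row table; if it doesn't, the problem lies with the database or the connection rather than with LINQ to SQL:

SELECT COUNT(*) FROM dbo.YourTable;  -- confirms the row count without moving much data
SELECT TOP 10 * FROM dbo.YourTable;  -- confirms rows can actually be read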