My application uses MS SQL Server 2008, hosted on a Windows Server 2003 Enterprise SP2 (32-bit) VM with 2 CPUs and 8 GB of RAM. The application has two or more Windows services, one of which accesses the DB frequently. When the DB load reaches around 65k, CPU usage spikes to 75-95% and does not come back down unless the service is stopped.
We did not face this issue with Oracle 10g, using the same application and the same load.
How can I reduce the CPU usage?
Is there something I need to change in the application code or in SQL Server?
Any help will be appreciated.
Thanks,
Priya.
When it accesses the database, is it logging in, doing its work, and then logging out? If so, see if you can retain the same connection rather than tearing it down each time.
To see if it is an issue with the work it is doing, run SQL Profiler against the server and look for high read counts, high CPU usage, or long-duration queries.
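If running Profiler is awkward, a rough sketch of the same check against the plan-cache DMVs (available from SQL Server 2005 onward; swap the ORDER BY to total_logical_reads or total_elapsed_time to hunt for read-heavy or long-running queries instead) might look like this:

    -- Sketch: top 10 statements by total CPU since the plan cache was last cleared.
    SELECT TOP 10
        qs.total_worker_time / 1000 AS total_cpu_ms,
        qs.execution_count,
        qs.total_logical_reads,
        qs.total_elapsed_time / 1000 AS total_elapsed_ms,
        SUBSTRING(st.text,
                  (qs.statement_start_offset / 2) + 1,
                  ((CASE qs.statement_end_offset
                        WHEN -1 THEN DATALENGTH(st.text)
                        ELSE qs.statement_end_offset
                    END - qs.statement_start_offset) / 2) + 1) AS statement_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY qs.total_worker_time DESC;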
Related
We have an ASP.NET application deployed internally that is used by almost 500 users. At times, when many users are on the site, connection errors occur.
We have tried several options, such as connection pooling and optimizing our code, but we still don't know whether this has something to do with the server. Right now we are on shared hosting (with 1.5 GB of dedicated RAM). Do you think upgrading the server will resolve this issue?
Also, does database performance depend on RAM and the number of cores?
Thanks!
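One thing worth checking when errors appear only under load is whether the connection pool is being exhausted. A hedged sketch (run it on the database server while the site is busy) that counts current connections per client and application:

    -- Sketch: how many connections each host/application is currently holding.
    SELECT
        s.host_name,
        s.program_name,
        COUNT(*) AS connection_count
    FROM sys.dm_exec_sessions AS s
    WHERE s.is_user_process = 1
    GROUP BY s.host_name, s.program_name
    ORDER BY connection_count DESC;

If the count for the web application sits near the pool limit (100 by default in ADO.NET), the errors are more likely connection leaks or pool exhaustion than raw server capacity.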
I am deploying an MVC2 application under IIS7 on a Windows Server 2008 R2 server with plenty of horsepower. It connects to a SQL Server 2008 DB, and the application's performance is dramatically slower than when running in the debugger on a developer workstation (connecting to the same SQL Server DB). I've already checked network connections, and nothing in the event logs indicates an issue with Windows. I've also run Profiler on the DB server, and the queries are firing off fast.
Any help with diagnosing this performance issue would be appreciated. I even built a new 2008 R2 server to test on, hoping the problem was the server itself, but performance was the same.
Thanks
Edit 1:
IIS is running on a Dell R710 server with Windows Server 2008 R2 Standard and 32 GB of RAM. SQL Server 2008 is hosted on a separate R710 running Server 2008 R2 Standard with 12 GB of RAM. Initially I had IIS running on a VM, but I moved it to the physical machine to see if the performance degradation was due to the VM. I'm seeing the same performance on both, so it looks like that wasn't a factor.
Edit 2:
It appears that opening the connection to the database is part of the bottleneck, with some of the subsequent stored procedure calls also taking a considerable amount of time:
- Open DB connection: 5 sec (subsequent connections are cached, so they don't require 5 seconds)
- First sproc: < 1 sec
- Second sproc: 5 sec
- Third sproc: < 1 sec
- Fourth sproc: < 1 sec
- Fifth sproc: 6 sec
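Since Profiler already shows the individual queries completing quickly, a hedged DMV sketch like this one (SQL Server 2008 or later) can help confirm whether those 5-6 second steps are spent inside the stored procedures or outside them, in the connection and application layer:

    -- Sketch: average and maximum elapsed time per cached stored procedure.
    SELECT TOP 10
        OBJECT_NAME(ps.object_id, ps.database_id) AS procedure_name,
        ps.execution_count,
        ps.total_elapsed_time / ps.execution_count / 1000 AS avg_elapsed_ms,
        ps.max_elapsed_time / 1000 AS max_elapsed_ms,
        ps.total_worker_time / ps.execution_count / 1000 AS avg_cpu_ms
    FROM sys.dm_exec_procedure_stats AS ps
    ORDER BY ps.total_elapsed_time DESC;

If the server-side averages stay well under a second while the application still waits 5-6 seconds, the time is being lost between the two (name resolution, authentication, connection setup, or the client code itself).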
First, Windows Server 2008 R2 ships with IIS 7.5, not IIS 7. That probably doesn't matter, but it is a different version.
Second, you need to find out where the bottleneck is. Try removing the queries and just returning dummy data. Is it still slow? If it is, the problem has nothing to do with data access; if it is now fast, you know it's a data access issue.
Here are slides from a presentation about optimizing ASP.NET MVC applications. They were able to improve their app's performance from 8 req/sec to 400 req/sec.
They mention in the slides how they profiled the app and identified bottlenecks (query compilation, many calls to RenderPartial, URL generation, etc.) and give some tips at the end of the presentation.
I have a very odd performance issue with an SSIS package on SQL Server 2008 R2.
Here are the facts:
We recently migrated from SQL Server 2005 (on Windows Server 2003 R2, 32-bit) to SQL Server 2008 R2 (on Windows Server 2008 R2, 64-bit).
Everything seems fine except for a performance issue with one SSIS package.
If I run it from my PC it finishes in a few minutes (around 4-5), and the same seems to happen if I connect to Integration Services through SQL Server Management Studio and start the package from there.
However, when I run it from SQL Server Agent, the execution time ranges from 5 minutes to more than one hour... I had no such problems on the old server! I also tried running the package in 32-bit mode; on some runs it seems faster, but it's pretty random... It never reaches the performance it had on SQL Server 2005.
I have no clue... Initially I thought it was a memory problem, because I had not set a maximum memory limit for SQL Server and another package had trouble running at the same time. So I expanded the RAM on the server (it runs on VMware); the machine now has 8 GB of RAM and SQL Server's maximum server memory is set to 4 GB. The other package no longer crashes, but this one still shows random execution times...
Any guesses?
Below is a table of execution times over several days:
Start Time Execution Time
12/17/2010 06:15 00:49:43
12/16/2010 17:54 01:12:26
12/16/2010 17:18 00:06:29
12/16/2010 16:53 00:05:23
12/16/2010 16:10 00:24:23
12/16/2010 06:15 00:19:26
12/15/2010 06:15 00:07:19
12/14/2010 06:15 00:11:26
12/13/2010 06:15 00:17:30
12/12/2010 06:15 00:44:59
12/11/2010 06:15 00:11:59
12/10/2010 06:15 00:34:19
What else is running? Do you have users running queries (who may lock tables) or other packages touching related tables when this package is scheduled to run? When you run it, do you see blocking locks or anything else like that?
It seems more likely to me that the production batch execution environment isn't as quiet and controlled as your development environment.
A user or related package holding data locks might explain the random distribution of execution times.
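To test that theory while the package is running, a hedged sketch of a blocking check against the DMVs (SQL Server 2005 or later) could be:

    -- Sketch: requests that are currently blocked and the sessions blocking them.
    SELECT
        r.session_id,
        r.blocking_session_id,
        r.wait_type,
        r.wait_time AS wait_time_ms,
        DB_NAME(r.database_id) AS database_name,
        st.text AS running_sql
    FROM sys.dm_exec_requests AS r
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS st
    WHERE r.blocking_session_id <> 0;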
I found a solution to this.
The problem was a memory issue with buffer creation on the server.
I partially solved it by increasing the default buffer size (both in MB and in number of rows), and completely solved it by removing all the Sort and Merge components I was using and replacing them with Lookups against a custom cache built with the Cache Transform component.
I still do not understand why the allocation worked fine on SQL Server 2005 with Windows Server 2003, and in development, but the package is fixed now!
I have SQL Server 2008 in a production environment (Windows 2003, 64-bit), and it is consuming 10 GB of the 20 GB of installed memory. Is this normal behavior, or is there something wrong with the configuration?
P.S. The server hosts one web application which is used by hundreds of users concurrently every day.
SQL Server reserves memory, which is why you are seeing high usage. It might show up as using 10 GB in Task Manager, but the real memory usage can be checked from within Management Studio.
Also, you can set upper and lower limits on the amount of memory (buffer pool) used by the SQL Server database engine with the min server memory and max server memory configuration options.
Check out this article: http://support.microsoft.com/kb/321363
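For illustration, a hedged sketch of setting the max server memory option mentioned above (the 16 GB figure is only an example; pick a limit that leaves room for the OS and anything else on the box), followed by a quick check of what the instance is actually using:

    -- Sketch: cap the buffer pool at 16 GB (example value only).
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory (MB)', 16384;
    RECONFIGURE;

    -- Sketch: memory actually in use by this instance (SQL Server 2008 and later).
    SELECT physical_memory_in_use_kb / 1024 AS physical_memory_in_use_mb
    FROM sys.dm_os_process_memory;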
Microsoft's memory management strategy is that any unused memory is wasted memory. Newer Windows and SQL Server versions will allocate more memory for caching until the system requests it for other purposes.
So, what you are seeing is probably normal.
Much of that allocated memory can be released to other applications as needed. As distressing as that memory usage may seem, it is not as dire a situation as it may appear.
There is nothing wrong with that behavior; SQL Server is just caching your data. If there is something else you'd like to use that memory for, you can configure SQL Server to use less. However, configuring it that way may make queries slower.
I already have a website and am moving it to a new VPS. I have the option of going with either a MySQL database or a SQL Server 2008 Express database (I do not want to pay for the full SQL Server, as I can't afford it on top of my other expenses). My knowledge-base website gets around 10K hits per day.
My questions are:
1. If I go with ASP.NET and MySQL, will it be able to handle the current load as well as future load?
2. If I go with ASP.NET and SQL Server 2008 Express, will SQL Server Express be able to handle this much load? A sample application I ran against a SQL Express database seemed slower. I am aware of the limitations on CPU and memory (1 GB RAM) in SQL Server Express.
3. My current server has Windows Server 2008 R2 and 1 GB of RAM. Will increasing the RAM help with SQL Server Express?
Any suggestions please
1. I don't know MySQL, but 10K hits/day is nothing (unless, of course, they all happen within 10 minutes and the rest of the day is without traffic).
2. Same as 1.
3. SQL Server Express will use up to 1 GB, but RAM is also needed for other things on the system. Upgrading to 2 GB would certainly help; above that is uncertain, but RAM is cheap, so buy 4 GB to be safe.
Edit: 10K hits/day, evenly distributed, is about 0.12 hits/second, or one hit every 8.6 seconds.
I would go for SQL Server Express so you can easily take advantage of the ASP.NET Membership features. It will handle loads higher than you mention, but as Erik says, give it some more RAM.
Kindness,
Dan