Drupal Site Slow, Drupal Tweaks - MySQL

My site is very slow even after I disabled the bulk of the modules, so I think it is my hosting. First I raised the PHP memory limit from 90M to 256M, which helped with the "500 errors", but the site is still very slow. I am now thinking it may be the database settings, which I am changing with the Drupal Tweaks module for convenience.

Define "slow". Is your page slow to appear? If so, is it taking a while to load the page's HTML, or is much of the time spent trying to load CSS, JS, images, etc.? The "Net" panel in Firebug can help here, as can wget.
If the page itself is slow, is the slowness encountered at the network, Web server, PHP, or database level? Network slowness is easy to diagnose; try a ping. Either the network or the web server can show faults through the simple test of just throwing a big, static file somewhere on the server and then downloading it. PHP or DB slowness will show up when you look at the page with the devel module, but DB slowness can generally also be seen as long-running (as opposed to long-sleeping!) processes in the output of SHOW PROCESSLIST.
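If you want to watch for those long-running processes from a script instead of the MySQL console, here is a quick sketch using mysqli (the credentials and the 5-second threshold are placeholders):

```php
// Flag long-running (not long-sleeping) queries via SHOW FULL PROCESSLIST.
$db = new mysqli('localhost', 'db_user', 'db_pass');
$result = $db->query('SHOW FULL PROCESSLIST');
while ($row = $result->fetch_assoc()) {
  // 'Sleep' means an idle connection, so skip those rows.
  if ($row['Command'] !== 'Sleep' && (int) $row['Time'] > 5) {
    printf("%4ds  %s\n", $row['Time'], $row['Info']);
  }
}
```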
Alternately, give New Relic a try. It will automate several of these steps, and give you fancy graphs, to boot.
Lastly: Consider getting a better hosting provider, ideally one with a great deal of Drupal experience (like BlackMesh or Acquia). At a great host, fixing this issue (if it arose in the first place) would be a collaborative effort between the hosting provider and the customer, rather than something you have to figure out on your own. It doesn't take many (billable) hours spent debugging hosting before you've paid for better hosting than you're getting now.

If you have a staging or development copy of the site, try installing the Devel module, which has an option for logging and appending stats about each query to the bottom of each Drupal page. That should help you track down the offending query or queries.
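If you prefer to flip that option on from code rather than through the UI, a sketch (the variable name is from Devel's Drupal 7 branch; older releases used `dev_query`, so check your version):

```php
// Enable the Devel module and its per-page query log programmatically.
module_enable(array('devel'));
variable_set('devel_query_display', 1);  // assumption: Devel for Drupal 7
```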

Here are some optimisation tips you might try.
First, it is possible that your cache tables are messed up. Log in to phpMyAdmin and flush all your cache tables (don't drop the tables, just empty them).
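If you would rather script this than click through phpMyAdmin, here is a minimal sketch; it assumes a full Drupal bootstrap (for example via `drush scr`) and the stock core table names:

```php
// Empty (not drop) the core cache tables. Contrib modules may add more
// cache_* tables; list yours with SHOW TABLES LIKE 'cache%' first.
$tables = array('cache', 'cache_block', 'cache_filter',
                'cache_form', 'cache_menu', 'cache_page');
foreach ($tables as $table) {
  db_query('TRUNCATE {' . $table . '}');
}
```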
Check the MySQL installation folder for the alternative config files such as my-large.ini and my-huge.ini; copying one of them over bin/my.ini (back up the original first) gives you an out-of-the-box optimisation preset for your MySQL server.
You never mentioned which Drupal version you use. Drupal 7 has a brand-new database API built on top of PHP's PDO.
Here is a link that will help you a lot: http://drupal.org/node/310075
Rather than writing a SQL query in raw text and calling a helper function to execute the query, you can create a "query" object, call methods on it to add fields, filters, sorts, and other query elements, then call the query object's "execute" method.
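For example, here is what a typical query looks like when built that way; a sketch against the node table of a stock Drupal 7 install:

```php
// Fetch the ten newest published nodes with the D7 query builder.
$result = db_select('node', 'n')
  ->fields('n', array('nid', 'title', 'created')) // SELECT these columns
  ->condition('n.status', 1)                      // WHERE: published only
  ->orderBy('n.created', 'DESC')                  // newest first
  ->range(0, 10)                                  // LIMIT 10
  ->execute();

foreach ($result as $node) {
  print $node->title . "\n";
}
```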
Other tips:
On my website the PHP memory limit is 512M; 256M, however, should be fine for a production website.
Also, why not enable APC (opcode caching)?
Sometimes massive scripts can slow down your pages. Under Performance, enable the option to aggregate CSS and JS files.
You might also consider Pressflow, a performance-oriented fork, over stock Drupal.
Speed up your website by installing the Boost module: http://drupal.org/project/boost
Cache your Views.
Other advanced techniques include:
CacheRouter (a module that replaces the default Drupal cache).
Using Elysia Cron to reduce the impact of cron runs.
If all else fails, switch to a superior hosting package or change your hosting provider.
Regards...

Drupal sites are often somewhat slow, but if yours is slow to an extreme degree, there are further options to improve performance.
1) Install the memcache module on the server; it is very fast for site caching (see the settings.php sketch after this list). Refer: How to install memcache
2) Tune the php.ini settings or the MySQL configuration.
3) Audit any custom modules you have, because frequently executed queries can create deadlocks.
4) Check the MySQL table storage engine; I recommend InnoDB, which provides row-level locking.
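For point 1, the cache wiring lives in settings.php. Here is a minimal sketch for the Drupal 7 memcache module; the module path and the server address are assumptions, so adjust them to your install:

```php
// Append to settings.php: route Drupal's default cache through memcached.
// Assumes the memcache contrib module sits at the usual contrib path and a
// memcached daemon is listening on localhost:11211.
$conf['cache_backends'][]    = 'sites/all/modules/memcache/memcache.inc';
$conf['cache_default_class'] = 'MemCacheDrupal';
// Keep the form cache in the database; forms can break if evicted early.
$conf['cache_class_cache_form'] = 'DrupalDatabaseCache';
$conf['memcache_servers'] = array('127.0.0.1:11211' => 'default');
```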

Related

Options for speeding up slow SQL queries

We're having issues with a few relatively simple queries that take too long to run: anywhere from 3,000 ms to 30,000 ms. We are using PHP 5.5 and MySQL 5.5.28-29.1.
We have a few options, but I am posting here to see if anyone has any experience on each of them:
Currently we access views to get our data; this was done to move processing load from PHP to MySQL. Would accessing the tables directly improve query speed? I'm thinking not, because it would lead to many more queries, since the views are just collations of data.
If we were to install a cache DB such as SQLite3 to cache data locally, then sync it to the RDBMS, how would we do that? And would the speed improve?
We're thinking about a NodeJS version as well, using Node WebKit. As far as I can tell there are npm packages out there that can act as a cache or a DB connection, which would remove the need for PHP. But how about the speed?
Another option is to set up a dedicated server for this environment (we're on a virtual server at the moment), which would most likely speed some parts up. But if MySQL is still slow on that server, it's kind of wasted.
These are the alternatives I can think of at the moment. Any suggestions are appreciated.
(I can post the slow SQL queries if need be, but would like to see if anyone has anything to say about our options first)

How to speed up slow Joomla MySQL queries?

I installed a Joomla website on a VPS with nginx + PHP-FPM and MySQL, with low traffic.
Everything loads pretty fast except the generated Joomla page, because of slow MySQL queries. Is there a way to cache them or speed them up?
The MySQL query cache doesn't help because traffic is so low (maybe one user an hour).
I've built a lot of sites with Joomla (although admittedly not paired with nginx) and never had to even consider this. How much content does your installation have? It would have to be an awful lot before any serious thought needed to be given to database tuning.
Are you sure the database engine is the cause of the problem? You can verify this by turning on the debug output in Joomla's global configuration (under 'System'), which will show the queries and some timing info in the footer. Don't forget to turn it off when you're done with it!
Have a look at the timing for the Application afterRender entry. Unless you're consistently seeing multi-second values there, I'd be looking elsewhere for your problem.
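If you cannot reach the admin UI, the same switch lives in configuration.php. A sketch of the relevant fragment (the property names come from Joomla's generated configuration class; leave the rest of the file untouched):

```php
// configuration.php (fragment): enable Joomla's debug output, which prints
// the query log and timing info in the site footer.
class JConfig {
  public $debug = 1;       // was 0; remember to set it back afterwards
  public $debug_lang = 0;  // language debugging, leave off
  // ... all other generated settings remain unchanged ...
}
```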

How to benchmark and optimize a really database-intensive Rails action?

There is an action in the admin section of a client's site, say Admin::Analytics (that I did not build but have to maintain) that compiles site usage analytics by performing a couple dozen, rather intensive database queries. This functionality has always been a bottleneck to application performance whenever the analytics report is being compiled. But, the bottleneck has become so bad lately that, when accessed, the site comes to a screeching halt and hangs indefinitely. Until yesterday I never had a reason to run the "top" command on the server, but doing so I realized that Admin::Analytics#index causes mysqld to spin at upwards of 350+% CPU power on the quad-core, production VPS.
I have downloaded fresh copies of the production data and the production log. However, when I access Admin::Analytics#index locally on my development box with the production data, it loads in about 10-12 seconds (and utilizes ~150%+ of my dual-core CPU), which sadly is normal. I suppose there could be a discrepancy in MySQL settings that has suddenly come into play. Also, a mysqldump of the database is now 531 MB, when it was only 336 MB 28 days ago. Anyway, I do not have root access on the VPS, so tweaking mysqld performance would be cumbersome, and I would really like to get to the exact cause of this problem. However, the production logs don't contain info on the queries; they merely report how long these requests took, averaging out to a few minutes apiece (although they seemed to stall mysqld for much longer than that, in one instance prompting me to ask our host to restart mysqld just to get our site back up).
I suppose I can try upping the log level in production to get info on the database queries performed by Admin::Analytics#index, but I'm afraid to replicate this behavior in production because I don't feel like calling our host up to restart mysqld again! This action contains a single database request in its controller, and a couple dozen prepared statements embedded in its view!
How would you proceed to benchmark/diagnose and optimize/fix this action?!
(Aside: Obviously I would like to completely replace this functionality with Google Analytics or a similar solution, but I need to fix this problem before proceeding.)
I'd recommend taking a look at this article:
http://axonflux.com/building-and-scaling-a-startup
Particularly, query_reviewer and newrelic have been life-savers for me.
I appreciate all the help with this, but the fix turned out to be adding a couple of indexes to the analytics table to cater to the queries in this action. A simple Rails migration added the indexes, and the action now loads in less than a second both on my dev box and in production!

How to keep databases synchronized between hosting account and a local testing server?

I have several databases hosted on a shared server, and a local testing server which I use for development.
I would like to keep both set of databases somewhat synchronized (more or less daily).
So far, my ideas to solve the problem seem very clumsy. Anyway, for reference, here is what I have considered so far:
Make a database dump of the online databases, trash the local databases, and recreate them from the dump. It's a lot of work and requires a lot of download time (which guarantees I won't do it as often as it should be done); a scripted version of this is sketched after this list.
Write a small web service to access the new data, and write a small application locally to communicate with said web service, download the newest data, and update the local databases.
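For what it's worth, the dump-and-reload idea is at least easy to script and schedule. A rough sketch in PHP CLI, since the stack is LAMP; the host, users, passwords, and database names are all placeholders, and note that mysqldump locks MyISAM tables while it runs:

```php
// sync_down.php - run locally (e.g. from cron) to mirror the shared host's
// databases onto the local testing server. A ~/.my.cnf file is a safer
// place for the passwords than the command line.
$databases = array('mydb_one', 'mydb_two'); // hypothetical names

foreach ($databases as $db) {
  $dump = sprintf(
    'mysqldump --compress -h shared-host.example.com -u remote_user -pREMOTEPASS %s',
    escapeshellarg($db)
  );
  $load = sprintf('mysql -u local_user -pLOCALPASS %s', escapeshellarg($db));

  // Pipe the remote dump straight into the local server: no temp files.
  system($dump . ' | ' . $load, $status);
  if ($status !== 0) {
    fwrite(STDERR, "Sync of $db failed (exit $status)\n");
  }
}
```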
Both solutions sound like a lot of work for a problem that is probably already solved a zillion times over. Or maybe it's even an existing feature which I completely overlooked.
Is there an easy way to keep databases more or less in synch? Ideally something that I can set up once, schedule and forget about.
I am using MySQL 5 (MyISAM) databases on both servers.
=============
Edit: I had a look at replication, but it seems I can't go that route because the shared hosting does not give me enough control over the server itself (I have most permissions on my databases, but not on the MySQL server itself).
I only need to keep the data synchronized, nothing else. Is there any other solution that doesn't require full control on the server?
Edit 2:
Sorry, I forgot to mention I am running on a LAMP stack on the shared server, so Windows-only solutions won't work.
I am surprised to see that there is no obvious off-the-shelves solution for this problem.
Have you considered replication? It's not to be trifled with but may be what you want. See here for more details... http://dev.mysql.com/doc/refman/5.0/en/replication-configuration.html
Take a look at the Microsoft Sync Framework - you will need to code in .NET, but it can resolve your issues.
http://msdn.microsoft.com/en-in/sync/default(en-us).aspx
Here is a sample for SQL server, but it can be adapted to mysql as well using ado.net provider for Mysql.
http://code.msdn.microsoft.com/sync/Release/ProjectReleases.aspx?ReleaseId=4835
You will need additional tables in your MySQL database for change tracking and anchors (keeping track of the last synchronization) for this to work, but you won't need full control as long as you can access the database.
Replication would have been simpler :), but this might just work in your case.

Magento: server requirements for a quite big shop to run smoothly

I'm working on a quite big Magento setup: it will have 50 different shops (one Magento install, one admin to rule them all) to start, and this number is expected to rise in the future, plus a catalog of more than 1k products. This catalog will be shared by all shops.
I'm concerned about the server requirements I need for this to run smoothly. So far this is what I've found to get the most out of it:
Caching: using Magento's cache with APC, plus the MySQL query cache
use FastCGI instead of mod_php
database clustering: I don't think it will be necessary for 1k products, what do you think?
using Zend Server
Are there other things I can do to improve Magento's performance? I'd like to know all I need from the beginning so I can find the right server.
thanks in advance.
Make sure also to use block-level caching for the sites. Beyond this, one of the suggestions that I've seen implemented is to change dynamic blocks (such as blocks that grab product data dynamically) over to statically defined HTML if they don't change often.
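Block-level caching in Magento 1.x is configured per block. A sketch of the usual pattern; the class name is hypothetical, but the cache_* data keys are standard Mage_Core_Block properties:

```php
// A template block whose rendered HTML is cached for an hour.
class My_Module_Block_Teaser extends Mage_Core_Block_Template
{
    protected function _construct()
    {
        parent::_construct();
        $this->addData(array(
            'cache_lifetime' => 3600, // seconds to keep the rendered HTML
            'cache_tags'     => array(Mage_Core_Model_Store::CACHE_TAG),
            'cache_key'      => 'my_module_teaser_' . Mage::app()->getStore()->getId(),
        ));
    }
}
```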
Once you've coded a site, tune it using YSlow and Firebug to make sure that as many files as possible are cached, and that the page size is minimized. Minimizing the number of HTTP requests to Apache will increase the capacity of your server.
Finally, enable the flat catalog and flat category functions in Magento. This will force Magento to use fewer joins when retrieving catalog data, so your database load will go down and speed will increase considerably.
Hope that helps!
Thanks,
Joe
In testing, I noticed amazing improvements using an Amazon instance running Ubuntu with PHP-FPM and nginx. The only reason I didn't go there with our recent Magento upgrade is that the host I'm on still works OK, and I really don't want to be the sysadmin for my site again.
Also, did you know there is http://magento.stackexchange.com ? :D