How to speed up a Magento 1.9 website? - magento-1.9

I am new to Magento and I am facing this problem. I have cleared log files, cache, etc., but the site is still very slow. I have about 150 simple products. Please guide me on this. The site is not live; I am working on localhost.
Thanks in advance.

There are many points to take care of when optimizing Magento's shop performance.
Enable Gzip compression
Enable Flat Catalog
Enable Magento Cache
Optimize images
Merge CSS and JavaScript files
Clean up the Magento database and logs (see the PHP sketch below)
Remove or disable modules that are not in use, or merge small modules; the more modules you have installed, the slower your site will be.
Use a PHP accelerator like APC, XCache, or eAccelerator.
Apart from that, check gtmetrix.com or Google PageSpeed tools for more recommendations.
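As a concrete example of the log cleanup step, here is a minimal PHP sketch that empties the standard Magento 1.x log tables; the DSN and credentials are placeholders you would replace with your own:

```php
<?php
// Empty the standard Magento 1.x log tables.
// DSN, user, and password are placeholders for this sketch.
$pdo = new PDO('mysql:host=localhost;dbname=magento', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$logTables = array(
    'log_customer', 'log_quote', 'log_summary', 'log_summary_type',
    'log_url', 'log_url_info', 'log_visitor', 'log_visitor_info',
    'log_visitor_online',
);

foreach ($logTables as $table) {
    // TRUNCATE resets the table and frees its space immediately.
    $pdo->exec("TRUNCATE TABLE `{$table}`");
    echo "Truncated {$table}\n";
}
```

Magento 1.9 also ships a built-in cleaner that does roughly the same thing: `php -f shell/log.php clean`.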

Related

How to find web server slowness with a Magento website?

We have a Magento + marketplace website on a good web server with 128 GB RAM and other hardware at that level, about 15,000 products, and not many visitors (about 5,000 per day).
But we have trouble with performance.
So I would like to know where our problem is:
1. The Apache web server? (We could switch to nginx or LiteSpeed/LiteMage.)
2. Too many MySQL queries?
3. Coding problems?
4. Magento is just like this?
Is there any web server monitoring software to see exactly where our problem is?
Thanks.
Yes: New Relic APM, or any other profiler like blackfire.io or tideways.io.
It could be that you are not effectively using the 128 GB.
It could be that you have slow queries.
It could be that Magento is doing a sloppy job of creating SQL, or what you are telling Magento leads to sloppy SQL.
Is that 5,000 unique visitors per day, or 5,000 visits per day?
Try to see which of Apache, Magento, or MySQL is the slow part; the sketch below shows one way to check MySQL.
Adding a cache in front of MySQL sometimes slows things down; don't bother.
For more help from the MySQL point of view, follow the steps in http://mysql.rjweb.org/doc.php/mysql_analysis
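To get a first impression of whether MySQL is the slow part, a minimal sketch like the following can help: it turns on the slow query log at runtime and lists any currently long-running queries. The connection details are placeholders, and on a managed or older server you may need to set these options in my.cnf instead:

```php
<?php
// Connect as a user with PROCESS/SUPER privileges; credentials are placeholders.
$pdo = new PDO('mysql:host=localhost', 'root', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Log every query that takes longer than 2 seconds.
$pdo->exec("SET GLOBAL slow_query_log = 'ON'");
$pdo->exec('SET GLOBAL long_query_time = 2');

// Show queries that are running (not sleeping) right now.
foreach ($pdo->query('SHOW FULL PROCESSLIST') as $row) {
    if ($row['Command'] !== 'Sleep' && $row['Time'] > 2) {
        printf("%ss: %s\n", $row['Time'], $row['Info']);
    }
}
```

Long-running rows here point at slow queries; long-sleeping ones are usually harmless idle connections.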

IDE-like system - Database or Filesystem for Storage

The project is an open-source, IDE-like system for students in school to edit code, alter images, run code, etc.
It supports uploading archives/folders with these characteristics:
An uploaded archive contains up to 30 files (txt, js, xul, xml, php, gif/png/jpg), averaging 500 KB in total
Users can edit and fork
All the files are editable through CodeMirror (as used on JSFiddle)
All the GIF/PNG/JPG images are replaceable (10-40 per archive)
We expect at least 1,000 new uploads/forks daily with an average of 20 files each, a total size of at least 500 MB
Our environment:
PHP
most likely MySQL database
Linux
To consider:
We don't require searching through the folders/files at a global scope
The user-saved data stays as it is; no changes from our side are ever necessary
State of development:
Ready, apart from the storage question and all its dependencies
Would you advise SQL or a simple filesystem?
Before starting the project we were 100% sure about using MySQL, but with the added image-modification feature and a growing database (at the moment 80k files, 2 GB) we are struggling. Reading here made us hesitate too.
The advantages of MySQL are surely easier maintenance and a simpler future restructuring of the system.
Though the database will quickly become huge to search within.
By using a global PHP entry file that reads the filesystem based on URL parameters, the searching can be omitted and we can go straight to displaying the fetched directory with its contents.
We are not experienced in managing large amounts of data and rely on the experience of people who have already faced such a situation.
Rather than just voting for database or filesystem, please share your own tips to make this environment more efficient to run (e.g. indexing, structure, tables) or elaborate on your decision.
Very thankful for any tips.
By the way, the project will be hosted in Git.
Taking your requirements into consideration, I believe the filesystem is the best option to go with. The pros of this choice are:
Better database performance, since the table only holds a link to the location on the filesystem where the file is stored (see the sketch below).
It is easier to back up your database.
You have the possibility of storing the files in different locations by defining rules. This gives you flexibility in managing your storage.
When we faced a similar problem storing attachments in a service desk application, we chose the filesystem. So far everything has worked as expected.
I hope that my answer is helpful to you.
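To make the hybrid approach concrete, here is a minimal sketch of the pattern described above. The `files` table (with `id`, `user_id`, `path`, `name` columns) and the storage root are assumptions for this sketch; the file content goes to disk and only the metadata goes to MySQL. Sharding directories by hash prefix keeps any single directory from growing too large:

```php
<?php
// Store an uploaded file on disk and record its location in MySQL.
// The `files` table and the storage root are assumptions for this sketch.
$pdo = new PDO('mysql:host=localhost;dbname=ide', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

function storeUpload(PDO $pdo, $userId, $tmpPath, $originalName)
{
    $storageRoot = '/var/data/uploads';

    // Shard by the first hex chars of the content hash so no single
    // directory ends up holding hundreds of thousands of entries.
    $hash = sha1_file($tmpPath);
    $dir  = $storageRoot . '/' . substr($hash, 0, 2) . '/' . substr($hash, 2, 2);
    if (!is_dir($dir)) {
        mkdir($dir, 0750, true);
    }

    $path = $dir . '/' . $hash;
    move_uploaded_file($tmpPath, $path);

    // Only the metadata goes into the database.
    $stmt = $pdo->prepare(
        'INSERT INTO files (user_id, path, name) VALUES (?, ?, ?)'
    );
    $stmt->execute(array($userId, $path, $originalName));
    return $pdo->lastInsertId();
}
```

A nice side effect of hashing the content is that identical files across forks can share a single copy on disk.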

Drupal Site Slow, Drupal Tweaks

My site is very slow even after I disabled the bulk of the modules, so I think it is my hosting. First I raised the memory limit from 90M to 256M, which helped with the "500 errors", but the site is still very slow. I am thinking it may be the database settings; I am changing them with the "Drupal Tweaks" module for convenience.
Define "slow". Is your page slow to appear? If so, is it taking a while to load the page's HTML, or is much of the time spent trying to load CSS, JS, images, etc.? The "Net" panel in Firebug can help here, as can wget.
If the page itself is slow, is the slowness encountered at the network, Web server, PHP, or database level? Network slowness is easy to diagnose; try a ping. Either the network or the web server can show faults through the simple test of just throwing a big, static file somewhere on the server and then downloading it. PHP or DB slowness will show up when you look at the page with the devel module, but DB slowness can generally also be seen as long-running (as opposed to long-sleeping!) processes in the output of SHOW PROCESSLIST.
Alternatively, give New Relic a try. It will automate several of these steps, and give you fancy graphs, to boot.
Lastly: Consider getting a better hosting provider, ideally one with a great deal of Drupal experience (like BlackMesh or Acquia). At a great host, fixing this issue (if it arose in the first place) would be a collaborative effort between the hosting provider and the customer, rather than something you have to figure out on your own. It doesn't take many (billable) hours spent debugging hosting before you've paid for better hosting than you're getting now.
If you have a staging or development copy of the site, try installing the Devel module, which has an option for logging and appending stats about each query to the bottom of each Drupal page. That should help you track down the offending query or queries.
Here are some optimisation tips you might try.
First, it is possible that your cache tables are messed up. Log in to phpMyAdmin and flush all your cache tables (don't delete the tables, just empty them).
Check inside the MySQL installation folder for the alternative config files such as my-large.ini and my-huge.ini; try copying one of them to bin/my.ini. They are out-of-the-box optimizations for your MySQL database.
You never mentioned which Drupal version you use. Drupal 7 has a brand new database API built on top of PHP's PDO objects.
Here is a link that will help you a lot: http://drupal.org/node/310075
Rather than writing a SQL query in raw text and calling a helper function to execute the query, you can create a "query" object, call methods on it to add fields, filters, sorts, and other query elements, then call the query object's "execute" method.
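For example, the Drupal 7 query builder described above looks like this; a minimal sketch querying the core node table for the ten newest published nodes:

```php
<?php
// Drupal 7 dynamic query: the ten newest published nodes.
$query = db_select('node', 'n');
$query->fields('n', array('nid', 'title', 'created'))
      ->condition('n.status', 1)          // published nodes only
      ->orderBy('n.created', 'DESC')      // newest first
      ->range(0, 10);                     // LIMIT 10

$result = $query->execute();
foreach ($result as $record) {
    // Each $record is a stdClass with the selected fields as properties.
    print $record->nid . ': ' . $record->title . "\n";
}
```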
Others
On my website the PHP memory limit is 512 MB; 215 MB, however, should be OK for a production website.
Also, why don't you enable APC logging?
Sometimes massive scripts can slow down your pages. Under Performance, check the option to aggregate CSS and JS files.
You might consider Pressflow over stock Drupal.
Boost your website by installing the Boost module: http://drupal.org/project/boost
Cache your Views.
Other advanced techniques could include:
CacheRouter (a module that replaces the default Drupal cache).
Using Elysia Cron to reduce cron impact.
If needed, switch to a better hosting package or change your hosting provider.
Regards...
Drupal site performance is known to be slow, but if it is slow to an extreme level, there are other options that will improve its performance.
1) You can install the memcache module on the server, which is very fast for site caching. Refer to: How to install memcache
2) Tune the PHP ini settings or the MySQL configuration.
3) Check any custom modules you have, because frequent query execution can create deadlocks.
4) Check the MySQL table storage engine; I recommend InnoDB, which provides row-level locking (a conversion sketch follows below).
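As a sketch of point 4, the following finds any MyISAM tables in the Drupal database and converts them to InnoDB. The credentials and database name are placeholders, and you should back up first, since the conversion rewrites each table:

```php
<?php
// Convert remaining MyISAM tables to InnoDB. Back up before running;
// connection settings are placeholders for this sketch.
$dbName = 'drupal';
$pdo = new PDO("mysql:host=localhost;dbname={$dbName}", 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare(
    'SELECT TABLE_NAME FROM information_schema.TABLES
     WHERE TABLE_SCHEMA = ? AND ENGINE = ?'
);
$stmt->execute(array($dbName, 'MyISAM'));

foreach ($stmt->fetchAll(PDO::FETCH_COLUMN) as $table) {
    echo "Converting {$table}...\n";
    $pdo->exec("ALTER TABLE `{$table}` ENGINE=InnoDB");
}
```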

MySQL post archives: what's the best way to handle massive archived posts that need to be kept?

If my WordPress site generates thousands (perhaps millions) of posts a day, what is the best way to keep the site from taking a performance hit from posts that only need to be seen if someone searches old posts, or for legal purposes?
My first thought was to run a cron job during a lull and move the out-of-date posts to an archive database. If anyone wanted to view an older post, the code would automatically look in the archives.
Is there a better way?
Also, any links to tutorials on handling massive site data would be helpful!
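A minimal sketch of that cron-archiving idea, assuming WordPress's standard wp_posts table and a hypothetical wp_posts_archive table created with the same schema (e.g. via CREATE TABLE wp_posts_archive LIKE wp_posts); the cutoff and connection details are illustrative:

```php
<?php
// Move published posts older than 90 days into an archive table, then
// delete them from the live table. wp_posts_archive is hypothetical;
// create it first with: CREATE TABLE wp_posts_archive LIKE wp_posts;
// Transactions only protect you here if both tables use InnoDB.
$pdo = new PDO('mysql:host=localhost;dbname=wordpress', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->beginTransaction();
try {
    $cutoff = date('Y-m-d H:i:s', strtotime('-90 days'));

    $copy = $pdo->prepare(
        'INSERT INTO wp_posts_archive
         SELECT * FROM wp_posts
         WHERE post_date < ? AND post_status = ?'
    );
    $copy->execute(array($cutoff, 'publish'));

    $purge = $pdo->prepare(
        'DELETE FROM wp_posts WHERE post_date < ? AND post_status = ?'
    );
    $purge->execute(array($cutoff, 'publish'));

    $pdo->commit();
} catch (Exception $e) {
    $pdo->rollBack();
    throw $e;
}
```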
You are not alone with this, see the Coding Horror post on the same topic: http://www.codinghorror.com/blog/2008/04/behold-wordpress-destroyer-of-cpus.html
The easiest thing to do is to install the WP-Cache plugin:
WP-Cache is an extremely efficient WordPress page caching system that makes your site much faster and more responsive. It works by caching WordPress pages and storing them in a static file, serving future requests directly from the file rather than loading and compiling the whole PHP code and then building the page from the database.
http://mnm.uib.es/gallir/wp-cache-2/
Perhaps use these recommended DB settings:
http://www.codinghorror.com/blog/files/matt-mullenweg-wordpress-mysql-recommendations.txt
Then perhaps make a few changes to your query cache settings in MySQL (a sketch follows below).
To see how to do this: http://dev.mysql.com/doc/refman/5.0/en/query-cache-configuration.html
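For instance, a minimal sketch of adjusting the query cache at runtime; the values are illustrative, on some MySQL 5.x setups these must go in my.cnf instead, and note that the query cache was removed entirely in MySQL 8.0:

```php
<?php
// Inspect and adjust the MySQL 5.x query cache. Values are illustrative;
// the query cache no longer exists in MySQL 8.0.
$pdo = new PDO('mysql:host=localhost', 'root', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Cache SELECT results, up to 64 MB in total.
$pdo->exec('SET GLOBAL query_cache_type = 1');
$pdo->exec('SET GLOBAL query_cache_size = 67108864');

// Check how well the cache is doing: compare Qcache_hits with Com_select.
foreach ($pdo->query("SHOW STATUS LIKE 'Qcache%'") as $row) {
    echo $row['Variable_name'] . ' = ' . $row['Value'] . "\n";
}
```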

Magento: server requirements for a quite big shop to run smoothly

I'm working on a quite big Magento site: it will start with 50 different shops (1 Magento install, 1 admin to rule them all), a number that is expected to rise in the future, and a catalog of more than 1k products. This catalog will be shared by all the shops.
I'm concerned about the server requirements I need for this to run smoothly. So far this is what I've found to get the most out of it:
Caching: using Magento's cache with APC, plus MySQL's query cache
Use FastCGI instead of mod_php
Database clustering: I don't think it will be necessary for 1k products; what do you think?
Using Zend Server
Are there other things I can do to improve Magento's performance? I'd like to know everything I need from the beginning so I can find the right server.
Thanks in advance.
Make sure also to use block-level caching for the sites (a sketch follows below). Beyond this, one suggestion I've seen implemented is to change dynamic blocks (such as blocks that grab product data dynamically) over to statically defined HTML if they don't change often.
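Block-level caching in Magento 1 comes down to giving each block cache metadata; here is a minimal sketch with a hypothetical custom block (the class name, lifetime, and cache key are illustrative):

```php
<?php
// Hypothetical Magento 1 block with block-level caching enabled.
class My_Module_Block_Promo extends Mage_Core_Block_Template
{
    protected function _construct()
    {
        parent::_construct();
        $this->addData(array(
            // Cache the rendered HTML for two hours.
            'cache_lifetime' => 7200,
            // Invalidate when the store cache tag is flushed.
            'cache_tags'     => array(Mage_Core_Model_Store::CACHE_TAG),
            // Vary the cache entry per store view.
            'cache_key'      => 'my_module_promo_' . Mage::app()->getStore()->getId(),
        ));
    }
}
```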
Once you've coded a site, tune it using YSlow and Firebug to make sure that as many files as possible are cached, and that the page size is minimized. Minimizing the number of HTTP requests to Apache will increase the capacity of your server.
Finally, enable the flat catalog and flat category functions in Magento. This will force Magento to use fewer joins when retrieving catalog data, so your database load will go down and speed will increase considerably.
Hope that helps!
Thanks,
Joe
In testing, I noticed amazing improvements using an Amazon instance running Ubuntu with PHP-FPM and nginx. The only reason I didn't go there with our recent Magento upgrade is that the host I'm on still works OK, and I really don't want to be the sysadmin for my site again.
Also, did you know there is http://magento.stackexchange.com ? :D