I would like to know which queries are making my Joomla website slow. I know it is possible to log all slow queries.
Unfortunately, I only have FTP access to the host where the website lives. Can I see this log over FTP, or do I have to access the server itself?
Is there another way to see this log?
Thanks
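(For reference, slow-query logging is a MySQL server feature rather than something Joomla writes somewhere reachable over FTP; it is typically switched on in my.cnf with settings along these lines. The variable names assume MySQL 5.1 or later, and the log path is only illustrative.)

```
# my.cnf (excerpt) -- requires access to the MySQL server configuration
slow_query_log      = 1
slow_query_log_file = /var/log/mysql/slow.log
long_query_time     = 2        # log anything slower than 2 seconds
```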
As Rinos already said, there can be several reasons for a slow Joomla site.
If you cannot find a DB query responsible for it, check the Network tab in your browser's developer tools for resources that slow down your site.
One possible reason is that you are loading HTTP resources over HTTPS: if you have hard-coded references to images, script files, etc. that load via HTTP while your site runs on HTTPS, the developer tools will warn you about 'mixed content' ;)
Depending on your Joomla version, there may also be modules, components or plugins that are not well designed. Deactivating them one by one and refreshing (yes, that can be a lot of work) may give you a hint. BUT: please be careful, because some plugins, such as the authentication plugins, are required, and deactivating them can lock you out. Core components and plugins should normally not be responsible for this at all.
If you look at the queries in the debug console, some of them may perform a full table scan. Maybe you'll find one among these that is particularly performance-hungry.
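(If you want to confirm a suspicion, you can run such a query through MySQL's EXPLAIN; the table and column below are only illustrative, since real Joomla tables use whatever prefix you configured.)

```
-- "type: ALL" together with a large "rows" estimate usually means a full table scan
EXPLAIN SELECT * FROM jos_content WHERE state = 1;
```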
If not, check your global configuration in the Joomla backend under System -> Global Configuration and go through the following points (the matching configuration.php entries are sketched after this list):
Is caching enabled?
What kind of caching do you use?
Under the Server tab, check whether Gzip page compression is enabled.
If you force HTTPS, do you have any HTTP resources on your site (as mentioned above)?
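For orientation, these settings live in configuration.php in the site root (a file you can already see over FTP); the values below are only examples:

```
// configuration.php (excerpt)
public $caching = '1';            // 0 = off, 1 = conservative, 2 = progressive
public $cache_handler = 'file';
public $cachetime = '15';         // cache lifetime in minutes
public $gzip = '1';               // Gzip page compression (Server tab)
public $force_ssl = '2';          // 0 = off, 1 = administrator only, 2 = entire site
```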
Some of the possibilities here might help you gain some performance, but if you still have problems, my next stop would be the server configuration.
There are a few other things worth checking:
Which PHP version are you running? PHP 7.x brings a remarkable performance boost.
Have a look at your php.ini file: what are memory_limit and the other options set to? (See the recommended technical requirements here: https://downloads.joomla.org/technical-requirements.) A php.ini sketch follows this list.
Back on your site: are there any JavaScript errors? (The developer tools of your browser will tell you.)
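A minimal php.ini sketch of the options usually worth reviewing; the values are only examples, so check them against the requirements page linked above:

```
; php.ini (excerpt)
memory_limit        = 128M
max_execution_time  = 30
post_max_size       = 32M
upload_max_filesize = 32M
```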
Well, these are several possibilities worth paying attention to. Performance issues can have many causes, but hopefully some of the points above will put you on the right track ;)
regards
You can activate debug in Joomla under System -> Global Configuration; then at the bottom of every page on the live site you can see all the queries performed, with the time and memory used, and much more that can help you understand what is slowing down your site.
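If you cannot reach the backend, the same switch lives in configuration.php (a sketch; '1' turns the debug console on):

```
// configuration.php (excerpt)
public $debug = '1';         // System -> Global Configuration -> Debug System
public $debug_lang = '0';
```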
With HTML5's offline capabilities is it possible to create an app that will persist after the connection is lost and the browser is closed? Specifically, here's what I'd like to do:
Connect to the app while online. Download the entire app including a small database it runs on.
Close the browser and disconnect.
Open the browser again while offline and load the app from the local cache.
Thanks to Mark Pilgrim's excellent book, I believe I have an idea of how to accomplish the first step; I'm mainly wondering whether the last step is possible. If it is, I'm guessing it requires some configuration of the browser. Are there any settings I should be aware of that aren't obvious?
Thanks very much for any help offered.
The last step should be possible; it just depends on the extent to which you want to implement it. To my knowledge it shouldn't require any browser settings. You just have to be aware of the limitations of local storage, which I believe is currently capped at about 5 MB in most browsers. Obviously you'd have to perform the permission checks outlined in the Dive Into HTML5 guide you linked.
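For the offline-loading part, the application cache that Dive Into HTML5 covers is driven by a manifest file. A minimal sketch, with made-up file names (the manifest has to be served with the text/cache-manifest MIME type):

```
<!-- index.html: point the page at the cache manifest -->
<html manifest="offline.appcache">

CACHE MANIFEST
# offline.appcache -- "CACHE MANIFEST" must be the very first line of this file
# v1 (change this comment to force clients to re-download the cached files)
CACHE:
app.js
app.css
data/seed.json
NETWORK:
*
```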
The quickest and dirtiest way is to simply issue a GET request to your online app. If it responds correctly, then use the online version. If not, use the local cache. Just disguise the timeout/failed response as a 'loading' screen.
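A rough sketch of that quick-and-dirty check; the /ping URL and the two helper functions are hypothetical placeholders for whatever your app actually does:

```
// Ping the server; fall back to the locally cached copy if it doesn't answer in time.
function checkOnline(callback) {
  var done = false;
  var xhr = new XMLHttpRequest();
  var timer = setTimeout(function () {   // treat a slow answer as "offline"
    done = true;
    xhr.abort();
    callback(false);
  }, 3000);

  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && !done) {
      done = true;
      clearTimeout(timer);
      callback(xhr.status >= 200 && xhr.status < 300);
    }
  };
  xhr.open('GET', '/ping?ts=' + new Date().getTime(), true); // cache-buster
  xhr.send();
}

checkOnline(function (online) {
  if (online) {
    window.location = '/app';   // use the live version (URL is illustrative)
  } else {
    showLoadingScreen();        // hypothetical: mask the fallback as "loading"
    startFromLocalCache();      // hypothetical: boot the cached copy
  }
});
```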
As far as I know, at the moment (late 2011) the max-connections-per-server limit remains 6. Please correct me if I am wrong. It is unfortunate that we cannot change this easily, as we can in Firefox; as far as I know the value is hard-coded.
One solution is to download the Chromium sources and rebuild them. Is there an easier solution?
Is there any trick to work around this without creating a dozen mirror domains?
Why I'm asking: my task is to create an HTML/JavaScript slideshow that runs inside a full-screened browser on a huge wall-mounted monitor. The JavaScript is quite complicated: it preloads photos and makes a lot of AJAX calls to my web services. If the Wi-Fi connection is slow and 6 photos are already loading, the AJAX calls fail and the application misbehaves. I want a quick fix based on HTTP, the browser, an Ubuntu tweak or something similar, because rebuilding the JavaScript app would take days.
Off topic: do you know of anything else that could be tweaked in my particular situation?
IE is even worse, with a limit of 2 connections per domain. But I wouldn't rely on fixing client browsers: even if you have control over them, browsers like Chrome auto-update, and a future release might behave differently than you expect. I'd focus on solving the problem within your system design.
Your choices are to:
Load the images in sequence so that only 1 or 2 XHR calls are active at a time (use the success event from the previous image to check whether there are more images to download and start the next request; see the sketch after this list).
Use sub-domains like serverA.myphotoserver.com and serverB.myphotoserver.com. Each sub-domain has its own pool for connection limits, so you could have 2 requests going to 5 different sub-domains at the same time if you wanted to. The downside is that the photos will be cached per sub-domain. By the way, these don't need to be "mirror" domains; you can simply add additional DNS records pointing at the exact same website/server, so you don't have the headache of administering many servers, just one server with several DNS records.
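A minimal sketch of the first option; photoUrls is a hypothetical array of image URLs and the rest of the slideshow code is omitted:

```
// Load photos strictly one after another, so only a single request is in flight.
function loadSequentially(photoUrls, onDone) {
  var index = 0;
  function next() {
    if (index >= photoUrls.length) { if (onDone) onDone(); return; }
    var img = new Image();
    img.onload = next;    // start the next photo only after this one has finished
    img.onerror = next;   // don't stall the queue on a failed photo
    img.src = photoUrls[index++];
  }
  next();
}

loadSequentially(['photo1.jpg', 'photo2.jpg'], function () {
  // all photos are loaded (or skipped) -- start the slideshow here
});
```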
I don't know that you can do it in Chrome outside of Windows -- some Googling shows that Chrome (and therefore possibly Chromium) might respond well to a certain registry hack.
However, if you're just looking for a simple solution without modifying your code base, have you considered Firefox? In the about:config you can search for "network.http.max" and there are a few values in there that are definitely worth looking at.
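For reference, these are the kind of preferences that show up under that search. They could go into a user.js file in the Firefox profile so the slideshow machine keeps them across restarts; the values are only examples, not recommendations:

```
// user.js (Firefox profile directory) -- applied at every start-up
user_pref("network.http.max-connections", 48);
user_pref("network.http.max-persistent-connections-per-server", 8);
```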
Also, for a device that will not be moving (i.e. it is mounted in a fixed location) you should consider not using Wi-Fi (even a Home-Plug would be a step up as far as latency / stability / dropped connections go).
By the way, the HTTP/1.1 specification (RFC 2616) suggests no more than 2 connections per server:
Clients that use persistent connections SHOULD limit the number of simultaneous connections that they maintain to a given server. A single-user client SHOULD NOT maintain more than 2 connections with any server or proxy. A proxy SHOULD use up to 2*N connections to another server or proxy, where N is the number of simultaneously active users. These guidelines are intended to improve HTTP response times and avoid congestion.
There doesn't appear to be an external way to hack the behaviour of the executables.
You could modify the Chrome(ium) executables as this information is obviously compiled in. That approach brings a lot of problems with support and automatic upgrades so you probably want to avoid doing that. You also need to understand how to make the changes to the binaries which is not something most people can pick up in a few days.
If you compile your own browser you are creating a support issue for yourself as you are stuck with a specific revision. If you want to get new features and bug fixes you will have to recompile. All of this involves tracking Chrome development for bugs and build breakages - not something that a web developer should have to do.
I'd follow @BenSwayne's advice for now, but it might be worth thinking about doing some of the work outside of the client (the web browser) and putting it in a background process running on the same or a different machine. Such a process can handle many more connections, and you are only responsible for getting the data back from it. Since it is local(ish), you'll get results back quickly even with minimal connections. A rough sketch follows.
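For illustration, a rough sketch of that idea, assuming Node.js is available on the slideshow machine; the /fetch?url=... scheme is made up, and a real version would need some caching and error handling:

```
// Tiny local relay: the page asks http://localhost:8080/fetch?url=... for each photo,
// so the browser only ever talks to localhost while this process fetches from the
// remote photo server with as many sockets as it is allowed to open.
var http = require('http');
var urlLib = require('url');

http.globalAgent.maxSockets = 20;   // lift Node's own per-host default

http.createServer(function (req, res) {
  var target = urlLib.parse(req.url, true).query.url;
  if (!target) { res.writeHead(400); res.end('missing url parameter'); return; }

  var opts = urlLib.parse(target);
  http.get({ host: opts.hostname, port: opts.port || 80, path: opts.path }, function (upstream) {
    res.writeHead(upstream.statusCode, upstream.headers);  // pass status and headers through
    upstream.pipe(res);                                    // stream the image body back
  }).on('error', function () {
    res.writeHead(502);
    res.end();
  });
}).listen(8080);
```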
This is a prize winner.
I have Apache on my PC and on my remote server.
I have a page that has been a real nuisance to get looking right in both IE (IE7) and Firefox (FF3.0). At the moment it is close enough in the two browsers to live with.
Here's the crazy part: when I look at the page as served by my localhost server and as served by my production remote host, it looks just a little different. The differences all look like matters of margins and/or placement and/or image size.
It doesn't matter whether it is IE or FF. I get two close-but-not-identical renderings in IE, and the same in FF.
I've done all the basic common-sense troubleshooting I can think of. I've repeated the FTP transfer several times and checked all the file permissions. I've even downloaded the CSS files into the browsers directly just to make sure they still look the same.
By any principles I can think of, what I am seeing SHOULD be impossible. I can't think of anything that would account for this completely bizarre effect.
I'd be very much obliged if you have any thoughts on the subject.
Thanks for reading.
It could be that you are not actually seeing the version you uploaded to the server but a cached one (the server's cache, your local cache, your ISP's cache...).
You can try saving the page and the CSS you get from both the server and your local machine and then comparing them; I would use a diff tool like tkdiff. Then you know whether there are differences in the markup.
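Something as simple as this works too, assuming you have saved the two copies side by side (the file names are illustrative):

```
# unified diff of the two downloads
diff -u local/page.html  remote/page.html
diff -u local/style.css  remote/style.css
```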
When you've downloaded both versions to a local directory you can also check the differences in the browser again. Is it still there?
You can also instruct Apache to send the page with "no cache" headers; then all caching is disabled. The keyword to look for here is "Cache-Control". There are many ways to send this information in the headers; a sketch follows. (Note: you should not leave caching disabled on a live system, as it will slow the whole thing down...)
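One common way, assuming mod_headers is enabled, is a few lines in the virtual host or an .htaccess file; remove them again once you have finished comparing:

```
<IfModule mod_headers.c>
    Header set Cache-Control "no-cache, no-store, must-revalidate"
    Header set Pragma "no-cache"
    Header set Expires "0"
</IfModule>
```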
Tatu Ulmanen & Pekka -- My enthusiastic thanks.
You were right.
a) The content types were, as one would expect, identical -- but I hadn't thought of checking that, so thanks.
b) Pekka gets the cigar. The browser zoom setting had been altered from window to window in the rush of work, and this made a noticeable difference in the layouts -- as one would expect.
Frankly, I doubt I would have thought to check that. Ironic, no?
May luck smile upon you both.
This is an odd freeze. When I switch from source view to design view for an HTML or ASPX file, the client area freezes, but I can still click on other tabs and menus.
What am I missing here? Really don't feel like reinstalling VS2008.
I had the same problem, and found one resolution.
In VS 2008, on a page that used a master page, the IDE would frequently freeze for 10-20 seconds, either while working in source view or when switching to design view.
In my master template I had references to the Google-hosted jQuery, jQuery UI, and one or two more off-site scripts. These were placed directly in my master page's head section.
I downloaded the JS files and deleted the off-site references (see the sketch below), and the IDE was smooth again in both design and source mode.
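For illustration only, the head of the master page then referenced local copies along these lines (paths and versions are made up):

```
<%-- Site.master (excerpt): local copies instead of the Google-hosted URLs,
     so the designer never has to wait on an external download --%>
<head runat="server">
    <script type="text/javascript" src="/Scripts/jquery-1.3.2.min.js"></script>
    <script type="text/javascript" src="/Scripts/jquery-ui.min.js"></script>
</head>
```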
I also discovered I could put the scripts inside my ToolkitScriptManager (I'm using the AjaxControlToolkit), add Mode="Release", and keep the http://www.google.com references for the scripts. The IDE is still working fine for me.
This is often caused by design mode downloading external resources that time out. As @JonK mentioned, for him it was jQuery references. I have seen it when the ConnectionString pointed at a production database that could not be reached from my development machine: even though I wasn't debugging (running) the site, only editing code, it would still try to connect, and because it couldn't, it would stall waiting for the timeout.
VS2008 is mostly single-threaded for UI operations like this, so if it is downloading a slow or non-existent network path it hangs like this.
VS2008 can make all kinds of network requests, so these two examples may not solve it for you. The best way I have found to diagnose the problem is to use the Microsoft tool Process Monitor, filter by the process webdev.exe, and watch for I/O requests that are long-running and/or throwing errors. In my case I could find the problem spot because there was a 20-second gap between the hundreds of I/O entries in Process Monitor; I then back-tracked from where that gap began and eventually found the request that was causing it.
This may not be possible for you, but if you can, an upgrade to VS2010 would help; it does a much better job of running processes on multiple threads in more places, so you don't have to worry about this as much.
Have you tried restarting your computer and then reopening your project?
I am thinking that, to save server load, I could load common JavaScript files (the jQuery source, for example) and maybe certain images from sites like Google (which are almost never down and always pretty fast, perhaps faster than my server).
Will it save much load?
Thanks!
UPDATE: I am not so much worried about saving bandwidth as about reducing server load, because my server struggles when there are a lot of users online, and I think this is because too many images/files are served from my single server.
You might consider putting up another server that does nothing but serve your static files, using an ultra-efficient web server such as lighttpd; a rough sketch follows.
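A minimal lighttpd.conf for such a box might look roughly like this (paths, port and log location are illustrative):

```
# dedicated static-file host
server.document-root = "/var/www/static"
server.port          = 80
server.modules       = ( "mod_accesslog" )
accesslog.filename   = "/var/log/lighttpd/access.log"
```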
This is known as a content delivery network, and it will help, although you should probably make sure you need one before you go about setting it all up. I have heard okay things about Amazon S3 for this (which Twitter, among other sites, uses to host images and the like). You should also consider Google's hosted script libraries (the AJAX Libraries API) if you are using any popular JavaScript libraries.
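Using those hosted libraries is a one-line change; the jQuery version below is just an example:

```
<!-- let Google's servers deliver jQuery instead of your own box -->
<script type="text/javascript"
        src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
```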
Well, there are a couple of things to keep in mind:
Serving static resources (.htm files, image files, etc.) rarely makes a server breathe hard, except under the most demanding circumstances (thousands of requests in a very short period of time).
Google's network is most likely faster than yours, and than most everyone else's. ;)
So if you are truly not experiencing any bandwidth problems, I don't think offloading your images etc. will do much for you. However, as you move things off to Google, it frees your server's bandwidth for more concurrent requests and faster transfers on the existing ones. The only trade-off is that clients will experience a slight (most likely unnoticeable) initial delay while DNS looks up the other servers and the connections to them are initiated.
It really depends on what your server load is like now. Are there lots of small web pages and lots of users? If so, then the 50 KB taken up by jQuery could mean a lot. If all of your pages are fairly large, and/or you have a small user base, caching jQuery with Google might not help much. The same goes for the pictures. That said, I have heard anecdotal reports (here on SO) that loading your scripts from Google does indeed provide a noticeable performance improvement. I have also heard that Google does not necessarily have 100% uptime (though it is close), and when it is down it is damned inconvenient.
If you're suffering from speed problems, putting your scripts at the bottom of the web page can help a lot.
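In practice that just means moving the script tags to the end of the body so the page can render before they download; the path below is illustrative:

```
<body>
  <!-- page content first -->
  ...
  <!-- scripts last -->
  <script type="text/javascript" src="/js/site.js"></script>
</body>
```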
I'm assuming you want to save costs by offloading commonly used resources to the web at large.
What you're suggesting is called hotlinking: directly linking to other people's content. While it can work in most cases, you lose control of the content, which means your website may change without your input. Since images hosted on Google are scoured from other websites, they may be copyrighted, which is a (potential) concern, or the source sites may have anti-hotlinking measures that block the images from showing on your page.
If you're just working on a hobby website, you can consider hosting your resources on a free web account to save bandwidth.