I normally leave a website alone once it's finished and just check on it every now and then. I wasn't paying attention to my Google Analytics, and then on April 18-20 there was a significant drop in the number of users, and the bounce rate jumped to 100%.
I'm a web developer by profession, and I didn't spot anything wrong.
The site doesn't seem to have been hacked, and before I reinstall everything (files and DB), I wanted to ask whether you can find anything wrong with it.
Site in question
http://www.graciesingapore.com/
Regards,
Jairus
The site loads a lot of different JavaScript and CSS files; maybe you can try a caching plugin like WP Rocket or WP Fastest Cache to bundle and cache them.
Some questions to ask...
When you put the site live, was the performance good?
If it was, what changed between then and now?
Was it an automated core/plugin/theme update that slowed down the site?
What happened to traffic after 20th April?
Did the bounce rate decrease and visitors increase?
What do your server logs show for the 18th to the 20th?
etc. etc...
Your site does load a ton of resources, and on the programs page there are two images, each over 1MB, that load in. Try running WP Smush over your media folder.
Related
I have three websites hosted on shared hosting. The websites load fast the first time, but when you try to reload a page, it can take up to 40 seconds. I reinstalled the databases, themes, and plugins, and even deleted a whole site and reinstalled it again, but nothing changed. I also tried different browsers and ran some PageSpeed and GTmetrix analyses, but it's still painfully slow on reload. I opened a ticket with the hosting provider and they said there's nothing wrong with their servers. I'm using a LiteSpeed server with PHP 7.2. There are plenty of system resources, so that's not the problem. As a newbie, what could the problem be?
Here's my website https://yazilimcilarinmolayeri.com
Note: if I made any English mistakes, please forgive me. I'm from Turkey.
I tried disabling Cloudflare and that fixed it. I don't know; maybe Cloudflare's caching system and the plugin's cache overlapped. So my problem is solved. Thank you for all your help, guys.
By the way, I activated Cloudflare again and it's working well for now.
Although your website is working fine right now, this usually happens with cached resources, though I'm not sure exactly why.
The browser recognizes that a resource is cached and checks with the server whether it needs to be updated. This check can sometimes take too long (again, I'm not sure why), and the browser sometimes won't time out the request, meaning the page keeps waiting for a response that never arrives.
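To make that revalidation step concrete: a cached response carries freshness information (most commonly `Cache-Control: max-age`), and only a *stale* entry triggers the check-with-the-server round trip (an `If-None-Match` / `If-Modified-Since` request that can hang the way described above). Here's a simplified sketch of that decision; real browsers consider many more directives, so treat this as an illustration only.

```python
def needs_revalidation(age_seconds, cache_control):
    """Simplified sketch of a browser's freshness check: a cached
    response older than its max-age must be revalidated with the server;
    a fresh one is served straight from cache with no network round trip."""
    for directive in cache_control.split(","):
        directive = directive.strip()
        if directive == "no-cache":
            return True          # always check with the server
        if directive.startswith("max-age="):
            max_age = int(directive.split("=", 1)[1])
            return age_seconds >= max_age
    return True                  # no freshness info: play it safe
```

This is why stacked caching layers (a plugin plus Cloudflare) can interact badly: if one layer serves headers that force revalidation on every hit, every page load waits on that round trip.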
As #Buttered_Toast mentioned, try clearing your cache, and that should fix it.
I'm experiencing the same issue from time to time across a number of WordPress sites - so I figure it's worth starting a conversation as Google doesn't seem to have much to say about this.
On a number of sites, at different times, using ACF Pro, certain data seems to go missing from the front end. Simply resaving the post in the admin and then refreshing the front end resolves the display issue, per post.
As this is happening in hundreds of posts on a brand new site this time, it's worth asking whether anyone else has hit this, and what they've done to resolve it. Finding each affected post among thousands and resaving it from the admin panel is too much effort.
After being in touch with the developer of the plugin, they pointed us to a caching plugin as the issue. I haven't had any problems since disabling that caching plugin.
I would like to know which queries are slowing down my Joomla website. I know it's possible to log all slow queries.
Unfortunately, I only have FTP access to where the website is hosted. Can I see this log over FTP, or do I need access to the server itself?
Is there any other way to see this log?
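For reference, enabling MySQL's slow query log is a server-side setting, which is why FTP access alone usually isn't enough: the log file typically lives outside the web root, and turning it on requires a privileged database connection (e.g. via phpMyAdmin, if your host provides it). A hedged example of what that looks like; the file path is illustrative and will differ per host:

```sql
-- Requires a privileged MySQL connection; path is an example only.
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 2;   -- log queries slower than 2 seconds
SET GLOBAL slow_query_log_file = '/var/log/mysql/slow.log';
```

If your host won't grant that, Joomla's own debug console (mentioned below in this thread) is the FTP-friendly alternative.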
Thanks
As Rinos already said, there can be different reasons for a slow Joomla site.
If you cannot find a DB query responsible for it, check the network tab in your browser's developer tools for resources that slow down your site.
One possible reason is loading HTTP resources on an HTTPS site: if you have hardcoded references to images, script files, etc. that load via http:// while your site runs over https://, the developer tools will warn you about 'mixed content'. ;)
Depending on your Joomla version, there might also be some modules/components/plugins that are not well designed... deactivating them one at a time and refreshing (yes, that can be a lot of work) may give you a hint. BUT: please be careful, since some plugins, like the authentication plugins, are required, and if you deactivate them you might lock yourself out. Core components and plugins shouldn't normally be responsible at all.
If you have a look at the queries in the debug console, some of them perform a full table scan. Maybe you'll find a performance-hungry one among those.
If not, check your global configuration in the Joomla backend under System -> Global Configuration and look at the following:
Is caching enabled?
What kind of caching do you use?
Under the Server tab, check whether gzip compression is enabled for your page
If you force HTTPS, do you have any HTTP resources on your site (as mentioned above)?
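As a quick way to spot those mixed-content resources without clicking through every page, here's a rough sketch that scans a page's HTML for plain-http `src=` / `href=` URLs. It's a regex approximation, not a real HTML parser, so treat its output as a starting point.

```python
import re

def find_mixed_content(html):
    """Return plain-http URLs referenced via src= or href= attributes.
    On an https page these trigger 'mixed content' warnings in the
    browser's developer tools. A rough regex sketch, not a full parser."""
    pattern = re.compile(
        r'''(?:src|href)\s*=\s*["'](http://[^"']+)["']''',
        re.IGNORECASE,
    )
    return pattern.findall(html)
```

Feed it the saved page source (or the body of an HTTP response) and fix or re-protocol whatever it reports.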
Some of the possibilities here might help you gain some performance, but if you still have performance problems, my next look would be the server config.
There are still other things worth a look:
What PHP version are you running? PHP 7.x brings a remarkable performance boost.
Have a look at your php.ini file. What about your memory limit and other options? (See the recommended technical requirements here: https://downloads.joomla.org/technical-requirements)
Back on your site, are there any JavaScript errors? (The developer tools of your browser will tell you.)
Well, these are several possibilities worth some attention... performance issues can have many causes, but hopefully some of the above will lead you onto the right track. ;)
regards
You can activate debug in Joomla from the System Configuration; then at the end of each page on the live site you can see all the queries performed, with time and memory used, and much more that can help you understand what is slowing down your site.
I maintain a local intranet site that among other things, displays movie poster images from IMDB.com. Until recently, I simply had a perl script download the images I needed and save them to the local server. But that became a HUGE space-hog, so I thought I could simply point my site directly to IMDB servers, since my traffic is very minimal.
The result was that some images would display, while others wouldn't. And images that did display would sometimes disappear after a few refreshes. The images existed on the IMDB servers; they just wouldn't display on my page.
It seems unlikely to me that IMDB would somehow block this kind of access, but is that possible? Is there something that needs to be configured on my end?
I'm out of ideas - it just doesn't make sense to me.
I'm serving my pages with mod_perl and HTML::Mason, if that's relevant.
Thanks,
Ryan
Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8l DAV/2 mod_perl/2.0.4 Perl/v5.10.0
Absolutely they would block that kind of access. You're using their bandwidth, which they have to pay for, to serve your web site. Sites will often look at the referrer, see that it's not coming from their own site, and either block or throttle access. You're likely seeing this as an intermittent problem because IMDB is allowing you some limited use of their images.
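The server-side logic behind that kind of blocking is simple; here's a hedged sketch of the idea (the host names are placeholders, and real setups usually also allow an empty Referer, since some browsers and privacy tools strip the header):

```python
from urllib.parse import urlparse

def is_hotlink(referer, allowed_hosts):
    """Sketch of a referrer check: an image request whose Referer header
    points at a foreign site is treated as hotlinking. An empty Referer
    is allowed here, since many clients legitimately omit the header."""
    if not referer:
        return False  # no Referer: give the benefit of the doubt
    host = urlparse(referer).hostname
    return host not in allowed_hosts
```

A site can apply this per request and answer hotlinks with a 403, a throttle, or a substitute image, which is consistent with the intermittent behavior described above.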
To find out more, look at the HTTP traffic on your client, either with a browser plugin or by scripting it. Look at the HTTP response codes and you'll probably see some 4xx or 5xx responses.
I would suggest either caching the images in a cache that expires unused images, which balances accesses against space, or perhaps getting a paid IMDB account. You may be able to get an API key for fetching images that indicates you are a paying customer.
IMDB could well be preventing your 'bandwidth theft' by checking the "referer" header. More info here: http://www.thesitewizard.com/archive/bandwidththeft.shtml
Why is it intermittent? Maybe they only implement the check on some of the servers in their web farm.
Just to add to the existing answers, what you're doing is called "hotlinking", and people who run websites don't like it very much. Google for "hotlink blocking".
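For a sense of what hotlink blocking typically looks like on the other side, here is a common Apache `.htaccess` pattern using mod_rewrite; `example.com` is a placeholder, and this is a generic illustration rather than anything IMDB is known to use:

```apache
# Deny image requests whose Referer is set but points at a foreign site.
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F]
```

The empty-Referer exception in the first condition avoids blocking clients that don't send the header at all.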
I am thinking that, to save server load, I could load common JavaScript files (the jQuery source) and maybe certain images from sites like Google (which are almost never down and usually pretty fast, maybe faster than my server).
Will it save much load?
Thanks!
UPDATE: I am not so much worried about saving bandwidth as about reducing server load, because my server struggles when there are a lot of users online, and I think this is because too many images/files are served from my single server.
You might consider putting up another server that does nothing but serve your static files, using an ultra-efficient web server such as lighttpd.
This is known as a content delivery network, and it will help, although you should probably make sure you need one before you go about setting it all up. I have heard okay things about Amazon S3 for this (which Twitter, among other sites, use to host their images and such). Also, you should consider Google's API cloud if you are using any popular javascript libraries.
Well, there are a couple of things in principle:
Serving up static resources (.htm files, image files, etc.) rarely even makes a server breathe hard, except under the most demanding circumstances (thousands of requests in a very short period of time).
Google's network is most likely faster than yours, and most everyone else's. ;)
So if you are truly not experiencing any bandwidth problems, I don't think offloading your images etc. will do much for you. However, as you move stuff off to Google, it frees your server's bandwidth for more concurrent requests and faster transfers on the existing ones. The only tradeoff is that clients will experience a slight (most likely unnoticeable) initial delay while DNS looks up the other servers and the connections to them are initiated.
It really depends on what your server load is like now. Are there lots of small web pages and lots of users? If so, then the 50K taken up by jQuery could mean a lot. If all of your pages are fairly large, and/or you have a small user base, caching jQuery with Google might not help much. Same with the pictures. That said, I have heard anecdotal reports (here on SO) that loading your scripts from Google does indeed provide noticeable performance improvement. I have also heard that Google is not necessarily 100% uptime (though it is close), and when it is down it is damned inconvenient.
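To cover the "Google is not necessarily 100% uptime" concern, a common pattern is to load jQuery from Google's hosted libraries with a local fallback; the version number and local path below are placeholders, not values from this thread:

```html
<!-- Load jQuery from Google's CDN, falling back to a local copy
     if the CDN is unreachable. Version and path are placeholders. -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script>
  window.jQuery ||
    document.write('<script src="/js/jquery.min.js"><\/script>');
</script>
```

This keeps the CDN's caching and bandwidth benefits while making a CDN outage degrade to your own copy instead of a broken page.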
If you're suffering from speed problems, putting your scripts at the bottom of the web page can help a lot.
I'm assuming you want to save costs by offloading commonly used resources to the web at large.
What you're suggesting is called hotlinking: directly linking to other people's content. While it can work in most cases, you lose control of the content, which means your page may change without your input. Since images hosted by Google are gathered from other websites, they may be copyrighted, raising some (potential) concerns, or the source sites may have anti-hotlinking measures that block the images from displaying on your page.
If you're just working on a hobby website, you can consider hosting your resources on a free web account to save bandwidth.