My webpage takes too much time to reload - html

I have three websites that I host on shared hosting. They load fast the first time, but when you try to reload a page it can take up to 40 seconds. I reinstalled the databases, themes, and plugins, and even deleted a whole site and reinstalled it, but nothing changed. I also tried different browsers and ran some PageSpeed and GTmetrix analyses, but reloads are still terrible. I opened a ticket with the hosting provider and they said there's nothing wrong with their servers. I'm using a LiteSpeed server with PHP 7.2, and there are plenty of system resources, so that shouldn't be the issue. As a newbie, what could the problem be?
Here's my website https://yazilimcilarinmolayeri.com
Note: if I've made any mistakes in English, please forgive me. I'm from Turkey.

I tried disabling Cloudflare and it worked. Maybe Cloudflare's cache and the caching plugin's cache overlapped. So my problem is solved; thank you for all your help, guys.
By the way, I've activated Cloudflare again and it's working well for now.

Although your website is working fine right now, this usually happens when you have cached resources, though I'm not sure exactly why.
The browser recognizes that a resource is cached and checks with the server whether it needs to be updated. That check can sometimes take too long (again, I'm not sure why), and the browser sometimes won't time out the request, so the page keeps waiting for the response.
As @Buttered_Toast mentioned, try clearing your cache, and that should fix it.
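For context, the revalidation step boils down to a conditional request like the sketch below; the URL is just a placeholder, and it assumes the server sent a Last-Modified header with the original response:

    // Rough sketch of the revalidation request a browser makes for a cached
    // resource. The URL is a placeholder; run it in the console on the site
    // itself (or in Node 18+), assuming the server supports If-Modified-Since.
    const url = 'https://example.com/style.css';

    async function revalidate(lastModified) {
      const started = Date.now();
      const response = await fetch(url, {
        headers: { 'If-Modified-Since': lastModified },
      });
      console.log(`status ${response.status} after ${Date.now() - started} ms`);
      // 304 means "not modified", so the browser keeps its cached copy.
      // If this round trip hangs, the reload appears to take forever.
    }

    revalidate(new Date().toUTCString());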

Related

HTML changes are not updating in Chrome only

So there is some kind of caching issue going on with Chrome lately.
I updated an image on a web page, but it will not update in Chrome. Other browsers are fine.
I'll also note that I changed the HTML and gave the image a new name to force browsers to grab the updated version,
i.e. image.jpg > image-v1.jpeg
How on earth does Chrome not grab the updated version of this?
Has anyone seen this going on lately and know of a fix?
Thanks
When you clear browsing data, make sure to check the box to clear cached images, and make sure the duration is set to "All time".
https://i.ibb.co/jzYk6DK/ex.png
If that doesn't work, try renaming the image.
If that also doesn't work, take a look at the web server's expiry headers and check how long images in your format are cached. The configuration could be in .htaccess or elsewhere, depending on whether you're using Apache or Nginx.
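If you want to see what expiry the server is actually sending before digging into the config, a quick check (the image URL here is just a placeholder) is to request the file from the browser console on the site and print its caching headers:

    // Print the caching headers the server sends for an image.
    // The URL is a placeholder; run this in the browser console on the site.
    const imageUrl = 'https://example.com/images/image-v1.jpeg';

    fetch(imageUrl, { cache: 'no-store' }) // bypass the local cache for this check
      .then((res) => {
        ['cache-control', 'expires', 'etag', 'last-modified'].forEach((name) => {
          console.log(`${name}: ${res.headers.get(name)}`);
        });
      });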
For anyone stumped by this kind of issue: if you are using a web farm, check that DFS replication across the servers is working correctly.
It was failing for one server in my case, which is why some users got the updated version while others didn't.
To verify that replication worked, log into the servers via RDP and check the last-modified timestamps of the file(s) in question.

Having trouble clearing cache to refresh webpage

This may sound like a very basic question but I feel like I've tried everything.
This is a follow-up to this post I made earlier, where I resolved the issue, only for it to come back again.
To summarize, I was making some changes to the contact.css file on the contact page of my website when I noticed the changes were working offline but didn't appear online. I narrowed this down to a caching issue in the post above (others could see the changes but I couldn't).
In that example I couldn't get my website to show up with background-color: blue. Eventually it worked and I thought I'd fixed it... So I went to change the color back to normal and, boom, it stopped refreshing the changes again.
So I think it's some sort of caching issue, but for the life of me I can't get my cache to clear properly so that I can refresh and see the changes.
Here are the things I have tried already:
Clearing cache (many times) on Chrome, Firefox, and Opera
Hard refresh on Chrome, Firefox, and Opera
Disabling cache through dev tools in Chrome and Firefox (this worked initially, then stopped working when I re-updated the website)
Checked multiple times that the CSS file uploaded correctly and the file path was correct. This was confirmed because the correct changes were seen by other people.
Flushed my DNS
Changed from my ISP's DNS to Google's 8.8.8.8 and 8.8.4.4
I'm using HostGator to host my website, and at this point I'm wondering whether it's something to do with them? I really just have no idea anymore.
Here's what I see online:
Here's what I should be seeing and what I do see on the offline version of my website:
I noticed you said "I'd really like to get to the bottom of the underlying issue", so I figured I'd write an answer to provide a few options (and if anyone wants me to add others, feel free to comment). Determining the root cause is likely much harder than just solving the overall problem, but let's start with the possible causes I can think of off the top of my head:
Multiple CDN servers taking a while to update, so some are returning the old data (your current session) and some are returning the new data (incognito)
Server-side session caching, so when you reload the page within one HTTP session you get back the same content (I've seen this in product search queries, for example)
The solution is relatively simple though: it's called cache busting. Basically, every time you update your source code, add a unique key to the query string, the file name, or something else to make the URL unique. For example, for your CSS you can link https://path/to.css?v2.0.1 and just keep increasing the version number as you go. If you use webpack for your build output, it has a content-hash variable that you can use as a token in the file names (see the sketch below).
As for the CDNs possibly caching out-of-date content, the content-hash solution solves that problem too, since it produces an entirely different file name, so the CDN will fetch it from the origin if it isn't in its cache. I'm not sure whether the URL version query parameter will do the same; maybe someone else can shed some light on that.
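For the webpack route, a minimal sketch of content-hash file naming (the entry point and output path here are just examples):

    // webpack.config.js - minimal sketch of content-hash based file names.
    // Entry point and output path are examples; adjust to your project.
    const path = require('path');

    module.exports = {
      entry: './src/index.js',
      output: {
        path: path.resolve(__dirname, 'dist'),
        // [contenthash] changes whenever a file's contents change, so every
        // build produces a brand-new URL that browsers and CDNs must fetch.
        filename: '[name].[contenthash].js',
      },
    };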
Have you tried using Incognito in Chrome?

How to solve caching problems in modern browsers?

We are developing a VueJS-based application and we have a huge caching problem.
Team members are constantly updating the site, but we keep getting feedback about problems that were already fixed, such as typos and misplaced elements.
I personally tried to inspect the situation and found that Chrome reads the files from the disk cache or memory cache until the page is refreshed. Sometimes Chrome even loads the old page when we enter the site again (after refreshing with Ctrl + Shift + R).
I'm sorry for my bad English, but I tried my best to explain what I'm encountering. I also found a topic where the OP explained exactly what I was encountering; you can check that out as well:
How to clear cache of service worker?
I created a website on IIS (local machine, Windows 10), published the project, and tried reaching it at a local IP address (127.0.0.1:8093). In the Network tab I can see the .js and .css files being downloaded. Then I restart the browser and try again, and this time the files are served from the disk cache. I tried a couple of times, and sometimes the files are served from the cache and sometimes downloaded.
I tried to add a service worker but came up empty-handed. I also created a base project to test some Vue.js features and added the same service worker code to it. It cached again.
Our server is Windows Server 2012 with IIS 8.
If possible, we want a no-caching approach, or we want to manage what is cached and what is not. Any help would be appreciated.
You can check out the base project:
vue-base project
What I tried
As I said above, I tried to add a service worker following this GitHub commit:
https://github.com/vuejs-templates/pwa/pull/21/files
I also tried deleting the cached data with caches.delete(cacheName), but it did not seem to work.
I don't know if the service workers are related to this problem, but they did not solve it. Maybe I didn't add the code properly. If you can help I would be very appreciative.
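A minimal sketch of how that cache cleanup can be done (deleting every Cache Storage entry instead of guessing a name, and also unregistering the service workers), assuming it runs in the page or the browser console on the site:

    // Minimal sketch: delete every Cache Storage entry and unregister all
    // service workers for this origin. Run from the page or the console.
    async function clearServiceWorkerCaches() {
      // Enumerate every named cache instead of assuming a specific cacheName.
      const cacheNames = await caches.keys();
      await Promise.all(cacheNames.map((name) => caches.delete(name)));

      // Unregister any service workers so they stop serving old responses.
      const registrations = await navigator.serviceWorker.getRegistrations();
      await Promise.all(registrations.map((reg) => reg.unregister()));

      console.log(`Deleted ${cacheNames.length} cache(s), ` +
                  `unregistered ${registrations.length} worker(s).`);
    }

    clearServiceWorkerCaches();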
Thank you for your help.
Edit1: Screen GIF
I don't know what you've been using to bundle your code and assets, but with webpack it's possible to create files with a hash in their names, which means that every time the browser finds a new file reference it will download the file again.
For example, yesterday you deployed code that contained main.34534534534.js.
Today you deploy again, but the file is main.94565342.js. Your browser will automatically invalidate its cache.
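Hashed file names only help if the HTML that references them is itself revalidated, so it's worth pairing them with the right Cache-Control headers. The question uses IIS, but as a hypothetical illustration here is a small Node/Express sketch of the idea (long-lived caching for hashed assets, revalidation for index.html); the paths and port are made up:

    // Hypothetical Node/Express sketch: long cache lifetime for hashed
    // bundles, revalidation for index.html. Paths and port are examples.
    const express = require('express');
    const path = require('path');

    const app = express();

    // Hashed bundles (main.94565342.js, ...) can be cached "forever",
    // because any change produces a new file name.
    app.use('/js', express.static(path.join(__dirname, 'dist/js'), {
      maxAge: '1y',
      immutable: true,
    }));

    // index.html must be revalidated on every load, otherwise the browser
    // keeps referencing the old hashed file names.
    app.get('/', (req, res) => {
      res.setHeader('Cache-Control', 'no-cache');
      res.sendFile(path.join(__dirname, 'dist/index.html'));
    });

    app.listen(8093);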

Joomla slow query log

I would like to know which queries are making my Joomla website slow. I know that it is possible to log all the slow queries.
Unfortunately, I only have FTP access to where the website is hosted. Can I see this log via FTP, or do I have to access the server?
Is there another way to see this log?
Thanks
As Rinos already said, there can be different reasons for a slow Joomla site.
If you cannot find a DB query responsible for it, check the Network tab in your developer tools for resources that slow down your site.
One possible reason can be that you are loading HTTP resources via HTTPS (so if you have hardcoded references to images, script files, etc. that load via HTTP while your site runs on HTTPS, the developer tools will bark at you about 'mixed content' ;) )
Depending on your Joomla version, there might also be some modules/components/plugins that are not well designed... deactivating them one by one and refreshing (yes, it might be a lot of work) may give you a hint. BUT: please be careful, since some plugins, such as the authentication plugins, are required, and if you deactivate them you might lock yourself out. Normally, core components and plugins shouldn't be responsible for this at all.
If you look at the queries in the debug console, some of them perform a full table scan. Maybe you'll find one among them that is performance-hungry.
If not, please check your global configuration in the Joomla backend under System -> Global Configuration and check the following things:
Is caching enabled?
What kind of caching do you use?
Under the Server tab, check for gzip compression of your page
If you force HTTPS, do you have any HTTP resources on your site (as mentioned above)?
Some of the possibilities here might help you gain some performance, but if you still have performance problems, my next look would be at the server config.
There are still other things you might check:
What PHP version are you running? PHP 7.x brings a remarkable performance boost.
Have a look at your php.ini file. What about your memory limit and other options? (See the recommended technical requirements here: https://downloads.joomla.org/technical-requirements)
Back on your site, are there any JavaScript errors? (The developer tools of your browser will tell you.)
Well, these are several possibilities you might pay some attention to... Performance issues can have many reasons, but hopefully some of the points above will lead you onto the right track ;)
Regards
You can activate debugging in Joomla from the system configuration; then, at the end of each page on the live site, you can see all the queries performed, with the time and memory used and much more, which can help you understand what is slowing down your site.

Debugging why the HTML Cache isn't working

I was wondering if anybody could direct me to any tools for debugging the cache.manifest file for offline HTML5 access. I recently downloaded a program called Manifesto which lets me look up the cache manifest when loading a page. Everything seems to be working fine, except that it keeps saying the status is "uncached". So although it is checking that the cache files are there, it apparently isn't caching them on load. What's going on and, more importantly, how do I figure out how to solve it?
I got it to work. To be honest, it might have been working before, but because I had some different PHP scripts running in the background I had to be a little careful when pulling it up.
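For anyone else debugging this, one way to see where the caching stalls (assuming this is the old HTML5 Application Cache, which modern browsers have since deprecated) is to log the applicationCache events and status from the page:

    // Sketch: log Application Cache (cache.manifest) events and status.
    // AppCache is deprecated, so this only applies to browsers that still
    // implement it.
    const appCache = window.applicationCache;

    const statusNames = ['UNCACHED', 'IDLE', 'CHECKING', 'DOWNLOADING',
                         'UPDATEREADY', 'OBSOLETE'];

    ['checking', 'downloading', 'progress', 'cached',
     'noupdate', 'updateready', 'error', 'obsolete'].forEach((event) => {
      appCache.addEventListener(event, () => {
        console.log(`appcache: ${event}, status: ${statusNames[appCache.status]}`);
      });
    });

    // An 'error' event usually means the manifest failed to download or a
    // listed file returned 404, which leaves the status stuck at UNCACHED.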