HTML: loading a CSS file served with a plain text MIME type

I'm trying to load a CSS library straight from GitHub (I'm not uploading it to my server due to limited bandwidth).
The browser keeps refusing it with a strict MIME type checking error. Is there any workaround to get this working without using any bandwidth from my server?

What you are asking for is a CDN. GitHub itself is not a CDN and by design serves raw files with a plain text MIME type, which is why you are getting those errors. On top of that, if GitHub sees frequent raw-file hits on a repository, it takes that as a sign the repository is being used for something other than intended and begins flagging it.
Some CDN networks will let you turn a GitHub repo into a CDN; they fetch the codebase and store it on their own servers behind the scenes.
Example (which I often use):
https://cdn.jsdelivr.net/gh/{username}/{repo}/
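A concrete usage could look like this (the version tag and file path here are placeholders, not from the original answer):

<link rel="stylesheet" href="https://cdn.jsdelivr.net/gh/{username}/{repo}@{version}/dist/library.min.css">

Pinning a version with @{version} is optional, but it keeps the URL stable even when the repo changes.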
It will save your server bandwidth and turn any GitHub repo into a CDN with a corrected MIME type. Do keep in mind, though, that since the files are cached at the CDN, frequent changes to your library may not show up in real time; the lag can be anywhere from 5 minutes to 48 hours, and you don't have control over it.
Secondly, nowadays, if you set your headers right, resources are aggressively cached by browsers (you have to work at it, though), which is why we use a hard refresh to purge/update the cache.

Related

PDF in Object tag does not read latest version

I am hosting a website where a couple of PDF files, currently embedded in object tags, are updated weekly.
The names of these PDF files stay the same, but the data changes.
Currently I'm using:
<object id="men" data="seasons/S2223/Men2023.pdf?" type="application/pdf" width="100%" height="750px">
  <p>The file could not be read in the browser.
    Click here to download
  </p>
</object>
When I update the PDF, I expect the
data="seasons/S2223/Men2023.pdf?"
to load the latest PDF; however, it stays the same as before.
I added the ? at the end of the filename, which is supposed to make the browser check for the latest version, but it doesn't seem to work.
When I clear my browser's cache the PDF is updated, but of course that isn't a workable option for visitors.
All help is appreciated.
Caching, in this context, is where the browser has loaded the data from a URL in the past and still has a local copy of it. To speed things up and save bandwidth, it uses its local copy instead of asking the server for a fresh copy.
If you want the browser to fetch a fresh copy then you need to do something to make it think that the copy it has in the cache is no good.
Cache Busting Query String
You are trying to use this approach, but it isn't really suitable for your needs and your implementation is broken.
This technique is designed for resources that change infrequently and unpredictably, such as the stylesheet for a website. (Since your resources change weekly, this isn't a good option for you.)
It works by changing the URL to the resource whenever the resource changes. This means that the URL doesn't match the one the browser has cached data for. Since the browser doesn't know about the new URL it has to ask for it fresh.
Since you have hardcoded the query string (here just a bare ?), it never changes, which defeats the purpose.
Common approaches are to set the value of the query to a time stamp or a checksum of the file. (This is usually done with the website's build tool as part of the deployment process.)
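A minimal client-side sketch of the idea (assuming the markup from the question; the v parameter name and the weekly schedule are illustrative):

<object id="men" type="application/pdf" width="100%" height="750px"></object>
<script>
  // Sketch: use the number of whole weeks since the Unix epoch as the
  // cache-busting value, so the URL changes once per week, when a new
  // PDF is expected to have been uploaded.
  var week = Math.floor(Date.now() / (7 * 24 * 60 * 60 * 1000));
  document.getElementById('men').data = 'seasons/S2223/Men2023.pdf?v=' + week;
</script>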
Cache Control Headers
HTTP provides mechanisms to tell the browser when it should get a new copy. There are a variety of headers and I encourage you to read this Caching Tutorial for Web Authors and Webmasters as it covers the topic well.
Since your documents expire weekly, I think the best approach for you would be to set an Expires header on the HTTP resource for the PDF's URL.
You could programmatically set it to (for example) one hour after the time a new version is expected to be uploaded.
How you go about this would depend on the HTTP server and/or server-side programming capabilities of the host where you deploy the PDF.
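For example, if the host happened to run Apache with mod_expires enabled (an assumption; your host may differ), an .htaccess sketch could look like this:

# Assumes Apache with mod_expires; adjust to your own host's capabilities.
<IfModule mod_expires.c>
  ExpiresActive On
  # Let cached copies expire one week after the file was last modified,
  # i.e. roughly when the next weekly upload is due.
  ExpiresByType application/pdf "modification plus 1 week"
</IfModule>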

Bootstrap CDN Rendering Delay

I have been working on a site and have used Bootstrap via MaxCDN by putting
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
in the head section.
Google Pagespeed is showing:
Your page has 1 blocking CSS resources. This causes a delay in rendering your page.
None of the above-the-fold content on your page could be rendered without waiting for the following resources to load. Try to defer or asynchronously load blocking resources, or inline the critical portions of those resources directly in the HTML.
Optimize CSS Delivery of the following:
https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css
Is there any way of fixing this?
Thanks
The other answer is incorrect, so I am writing this in case anyone still needs help.
CDNs speed up your production website too, not just your local development environment. Hosting a static file locally on your own host does not solve the render-blocking issue that the PageSpeed tool is pointing at here: a static file hosted locally is still render blocking.
The direct fix is to inline the render-blocking CSS completely. However, I would recommend not inlining the resources, for several reasons:
Resources delivered over a CDN arrive faster. If you inline them, the entire payload has to travel at your own server's slower pace.
Resources delivered separately are cached by the browser, while inlined CSS cannot be reused across requests.
Inlined resources are compressed on every request along with the rest of the dynamic content of the page, so you cannot take advantage of the pre-compression a CDN provides.
A bigger HTML document, due to inlined CSS, is also slower to parse.
What should one do to resolve this issue? You might think differently, but I would suggest you do nothing.
It's not that bad if the browser has to wait a little for resources to download. If you are using a CDN, chances are visitors will not perceive the download, as most CDNs nowadays have less than 50ms average global latency.
Resources are cached by the browser. While the PageSpeed tool loads fresh resources on every request, a real browser may load some of the resources from cache, eliminating the CDN request entirely.
Resources are shared across websites. If you are using Bootstrap from a public CDN, chances are the same Bootstrap file is already in the visitor's browser cache, downloaded when they visited another website that used Bootstrap. That gives 0ms latency and 100% bandwidth saving for that particular resource, even for first-time visitors who have none of your site's other resources cached, and the browser can spend the saved bandwidth elsewhere to speed other things up.
Manually inlining external libraries makes it harder to keep track of all the inlined copies of the library, which makes edits and updates harder.
Dynamic inlining adds a few more disk seeks per request, which just adds to the server load. These IOs are saved if the files are served separately; your OS may cache the files itself if they are hot enough, and using a CDN eliminates this load completely.
Although not exactly a solution that will remove this PageSpeed warning, it is possible to reduce the impact of render-blocking resources for real visitors (as opposed to performance measurement tools):
Serve resources with aggressive compression to reduce the payload size.
Serve resources with an immutable Cache-Control header, telling the browser it can confidently store the file for a long period because it will never change (see the header sketch after this list). If you use the Bootstrap CDN by PageCDN, these two optimizations are enabled by default.
If you know a file is going to be loaded immediately after a page load, you can use HTTP/2 Server Push to deliver the file before the browser even asks for it. If you do this, though, make sure the same files are not aggressively pushed on every request; they should load from the browser cache from the second request onwards.
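As a rough sketch, the response headers for such a long-lived static file could look like this (the max-age value and file path are only examples):

Cache-Control: public, max-age=31536000, immutable
Link: </css/bootstrap.min.css>; rel=preload; as=style

The Link header is how many servers hint at preloading (and, while it was supported, HTTP/2 push) for a resource the page is about to request.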
Either change the CDN (which will most likely do nothing for you) or simply store the file locally. This will improve your Google PageSpeed score.
Download a copy of the Bootstrap file you are using and store it like so: root/css/bootstrap.min.css, where root is your project folder.
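Then reference it with a relative path instead of the CDN URL, e.g.:

<link rel="stylesheet" href="css/bootstrap.min.css">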
CDNs are used mainly for test purposes (instant access to files) or in larger-scale projects with multiple requirements that can't always be met locally.
Read this thread to better understand.
While the warning that Google gives you might not be a serious issue for your project, it is always good practice to keep your resources local, so that your website can load by itself, without referencing external sources that require a separate request.
Static files = better load times = happy Google.

How much slower is it to put .js in an external file rather than directly on the page?

I assume that if you put some JavaScript code in an external file (and reference it with src="") it's a little slower, because the page then has to download another resource, but I'm wondering whether that's inconsequential.
From the testing I've done online (with webpagetest.org) the difference seems quite small (< 5% of total page load time).
But I'm wondering what's happening "under the hood": does the browser spin up another connection to download that file separately, or does downloading it in parallel make it just as fast as having the code arrive from the server with the rest of the page?
Not slow enough to matter.
I think the speed difference question is a red herring. Generally, you should keep your script separate from your html:
Separation of concerns: the html is the structure of the site, whereas the script is its behavior. It mixes concerns to mingle them together, and it's best practice to keep your script in a separate file.
It might seem counter-intuitive that script served separately from the HTML could be just as fast or even faster, but things like caching proxy servers, content delivery networks, and even newer web protocols like SPDY can make the speed question completely moot.
If you test with Firebug in Firefox, you'll see that Firefox downloads multiple files at the same time (the number of concurrent downloads differs per browser). But the main reason to put JS code in external files is that it can be minified and compressed on the server side, and also cached by the browser. An external file can additionally be served from a static, cookie-less domain and delivered through a CDN. So, to answer your question: it is actually slower to put the code in the page, because the browser then has to download it every time it loads the page.
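As a minimal illustration (the file name is arbitrary):

<script src="app.min.js" defer></script>

With defer, the download happens in parallel with HTML parsing and execution is postponed until the document has been parsed, so the extra request costs very little.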

Browsers don't refresh HTML

I'm building a website on a host that supports only HTML and JavaScript. I've corrected some mistakes, but visitors probably won't see the fixes, because their browsers don't show the updated pages; they show the old ones.
I remember the "html expires" meta code, but it's too late for that now, because many visitors have already seen our site.
I'm assuming it's your HTML files that aren't being updated, and not CSS style sheets or JS files (for which the answer would be different).
This is mainly a server-side issue - you would have to check with your hosting provider what caching headers the server emits. Ideally, the server would honor If-Modified-Since requests from browsers, so it can serve updated content when there is any and let the browser use its cached copy when there is none.
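The exchange looks roughly like this (the URL and dates are illustrative):

GET /index.html HTTP/1.1
If-Modified-Since: Mon, 06 Mar 2023 10:00:00 GMT

HTTP/1.1 304 Not Modified   (unchanged: the browser reuses its cached copy)
HTTP/1.1 200 OK             (changed: a fresh copy arrives in the response body)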
To remedy the problem at hand, you may need to rename your existing HTML files or put them in a new directory. That will force browsers to re-fetch them. Of course, the entry point (usually index.html) will still have to be re-fetched by the client - otherwise visitors will never notice the new structure, and hence never re-load anything.
A hosting configuration that indiscriminately dishes out caching instructions that prevent frequent updates is not really useful. Talking to your hosting provider would probably be a good idea.

About HTML5 offline storage

I have a few questions regarding HTML5 offline storage that I could not figure out.
Where exactly are these files stored in Windows? I could not find them here:
C:\Documents and Settings\[User Name]\Application Data\Mozilla\Firefox\Profiles\
Is there an expiration time after which the browser deletes these files automatically, or do the files remain forever?
If I change the contents of the page, is there any way to refresh the data that is stored offline?
Thanks.
I found them in %AppData%/Profiles/<currentprofilename>.default/OfflineCache. I'm using Windows 7.
This depends on the expires headers your web server sends for the files in question. It is recommended that you set the expires header to one week, but it is up to you; you can make the files never expire. Note that the manifest file itself should be set to never be cached.
In order to refresh the data you must actually change the manifest file. It is recommended that somewhere in the manifest file you put a comment with a version number, and that you update it every time you change any of your other files.
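For example, a minimal manifest could look like this (the file names are placeholders):

CACHE MANIFEST
# version 1.0.4 - bump this comment to make browsers re-download everything

CACHE:
index.html
style.css
app.js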
Edit: I had answered these questions thinking you meant the offline application cache, not local storage.
Well, for the sake of accuracy, it should be mentioned that although localStorage was indeed part of the HTML5 specification, it was split off into its own specification after getting slightly too complicated to be included alongside the rest of HTML5.
It really depends on your browser, but it should be found in your AppData folder, under /Profiles/<profile name>/OfflineCache (on Windows 7).
There is generally NO expiration date for localStorage; it can remain forever unless specifically removed by the website.
JavaScript changes the localStorage data (assuming you don't touch the actual file), so the website you are using (or writing) needs to be smart enough to refresh the localStorage along with the page's content.
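A common pattern for that (the key name and version value are just illustrative) is to store a version marker next to the data and reset the storage when it no longer matches:

// Sketch: invalidate stale localStorage data when the site's content changes.
var DATA_VERSION = '2';                        // bump on every content change
if (localStorage.getItem('dataVersion') !== DATA_VERSION) {
  localStorage.clear();                        // throw away the stale entries
  localStorage.setItem('dataVersion', DATA_VERSION);
}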