I have been trying to host my website using Firebase. I have a folder containing an index.html page and another HTML page, which I have linked to from the index through a button. This second page contains a large image. When I deploy these files, two things are noticeably slow:
Firebase takes a long time to deploy the site (around 9-10 minutes).
The image inside the second page takes a very long time to load on my desktop,
but loads quickly on my mobile device.
What should I do to speed things up? For reference, I have attached some images; if you need any more detail, please do mention it.
P.S. The other HTML file is index1.html.
The time it takes to deploy your site to Firebase Hosting is dependent on:
the size of the site assets you're trying to deploy
the bandwidth that you have available
And to a lesser degree:
the time it takes your local machine to compress the site assets
In my experience, deploys typically take seconds, not minutes. But my sites are usually quite small, and I tend to deploy when I'm on a connection with good bandwidth.
To make deploys faster, you'll either have to reduce the total size of what you're trying to deploy, or deploy when you're on a faster connection.
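One way to reduce what gets uploaded is the `ignore` array in `firebase.json`, which tells the Firebase CLI to skip matching files during a deploy. A minimal sketch (the `public` folder name and the `*.psd` pattern are just examples; anything ignored is not hosted, so only exclude files you don't need on the live site or that are served from somewhere else):

```json
{
  "hosting": {
    "public": "public",
    "ignore": [
      "firebase.json",
      "**/.*",
      "**/node_modules/**",
      "**/*.psd"
    ]
  }
}
```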
Related
I have a website that runs just fine on my local server. It's quick, responsive, and overall runs great.
But when I put it on the server of my domain host, it sometimes takes excessively long to load assets. For example, one 1 MB PNG file took 2.31 seconds to load, which Chrome's Network developer tool confirms.
So is this likely due to poor implementation of my code or is it possibly a crappy server? (The company is subscribing at the lowest tier possible to host their content) My internet connection is quick so I doubt it's that.
I think it is probably a problem with your host. An image is an image :) there aren't a hundred ways to implement one!
Oversized images always take longer to load, so you should keep your images as small as possible. To lower the content download time, you can optimize/compress the image without degrading its visual quality. If you are using graphics software to optimize images, use the "Save for Web" option; it reduces the file size and hence the image load time.
Furthermore, you can use a CDN to serve the static assets of your website: images, CSS, JS, videos, etc. A CDN replicates your website files across a geographically distributed network of servers called PoPs (points of presence) and serves each visitor from the nearest geographical location, which means your assets will load faster.
Use an SSD-based host. SSDs have far better read/write rates than traditional HDDs, so a host with solid-state storage will serve your files noticeably faster.
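Before reaching for a CDN, it helps to know which files are actually oversized. A small sketch of a pre-upload check (the 500 KB limit and extension list are arbitrary assumptions, adjust to taste):

```python
import os

def find_oversized(root, limit_bytes=500 * 1024):
    """Walk `root` and return paths of image files larger than `limit_bytes`."""
    exts = {".png", ".jpg", ".jpeg", ".gif", ".webp"}
    oversized = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() in exts:
                path = os.path.join(dirpath, name)
                if os.path.getsize(path) > limit_bytes:
                    oversized.append(path)
    return sorted(oversized)
```

Run it over your site folder before each deploy; anything it reports is a candidate for compression or resizing.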
Here's a question for you: is the image loaded as a CSS background, or via an img tag?
Adding BackgroundTransferRequests to the BackgroundTransferService once the app is in the background succeeds, but the new requests don't actually run until the app comes back to the foreground. Not so great for my scenario of downloading lots of small files that may take a fair amount of time to complete.
I imagine Microsoft has probably implemented this behavior by design(?), but does anyone know a way around this or an alternative approach?
A bit of background to the problem:
I'm developing a Windows Phone 8 map app that allows sections of maps to be downloaded and cached for offline use. This process can lead to thousands of map tiles needing to be downloaded.
I've created a process that spawns the full limit of 25 BackgroundTransferRequests, then adds more to the BackgroundTransferService as requests complete. This all works fine until the app actually goes into the background.
I have considered doing some web-server-side processing to bundle tiles into a zip and download them as a single request, but this adds complexity and will require twice the space on the phone: the package must be downloaded, then extracted, before the original archive is deleted. Ideally, though, I'd like to find a way to force new BackgroundTransferRequests to start running in the background.
It's not clear what your actual question is but I'd definitely recommend bundling them into a zip file and then downloading that. It's almost always easier to work with a single file than thousands.
If disk space is really a genuine issue (not just a theoretical one: I've fit thousands of map tiles in under 20 MB before, but it depends on image complexity and quality), then you could make a few smaller zip files. Then you'd avoid the BTR issue and not take up as much disk space (even temporarily).
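The server-side bundling step mentioned in the question could look something like this. It's a sketch in Python rather than the question's C#/WP8 stack, and the tile naming scheme is an assumption:

```python
import io
import zipfile

def bundle_tiles(tiles):
    """Pack a {filename: bytes} mapping of tile data into one in-memory zip.

    Returns the archive as bytes, ready to be sent as a single HTTP response.
    """
    buf = io.BytesIO()
    # ZIP_DEFLATED gains little on already-compressed PNG tiles, but costs little.
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in tiles.items():
            zf.writestr(name, data)
    return buf.getvalue()
```

Splitting the requested tile set into a few such archives (say, one per zoom level) gives the "few zip files" compromise described above.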
I have single page web-app that currently consists of four files:
index.html
main.js
style.css
sprites.png
This means that every user who loads the site has to request index.html, parse it for the other three files, and then make three more HTTP requests (serially, I believe) to fetch the remaining files.
It seems to me that it might be (a tiny bit) faster to embed the javascript, css and sprite image (base64 encoded) directly in the index.html file.
The main reasons I can think not to do this, along with my reasons why I don't think they apply in this case, are as follows:
Would prevent any of these additional files from being cached separately. This is not an issue for me because they will never be loaded from another page (since there is only one html page)
If the files were on different CDN servers, they could be downloaded in parallel. (Currently this project is not large enough to merit multiple servers)
I should disclose that this site is a small pet project that is nowhere near large enough to merit this kind of meticulous performance tuning, but I like using pet projects as an avenue to explore problems (and their solutions) that I may face in my day job.
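For completeness, the inlining itself is a simple build step. A hedged sketch (the file names match the question; the page skeleton and `inline_assets` helper are my own invention, not an established tool):

```python
import base64

def inline_assets(js, css, png_bytes):
    """Build a single self-contained HTML page from JS, CSS, and a PNG sprite.

    The image is embedded as a base64 data URI, so no extra requests are made.
    """
    sprite_uri = "data:image/png;base64," + base64.b64encode(png_bytes).decode("ascii")
    return (
        "<!DOCTYPE html>\n<html>\n<head>\n"
        f"<style>\n{css}\n</style>\n</head>\n<body>\n"
        f'<img src="{sprite_uri}" alt="sprites">\n'
        f"<script>\n{js}\n</script>\n</body>\n</html>\n"
    )

# Usage: read main.js, style.css, and sprites.png from disk,
# then write inline_assets(...) out as the deployable index.html.
```

Note that base64 inflates the image payload by roughly a third, which feeds directly into the caching trade-off discussed in the answers below.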
This isn't usually done because you increase the size of the entire HTML page. You'll save a couple requests on the first visit, but you'll force the client to reload everything every time they fetch the HTML file.
It would improve performance for users who visit your site once, and only once. For any kind of long-term strategy, it's unsuitable.
When your page is reloaded, the JS, images, and CSS are cached on the client and don't need to be downloaded again. Also, a weak client may well take longer to decode your base64-encoded assets than to simply download the original files.
So in short, don't overthink it.
I've noticed that on mobile Safari, when I deliver my assets via Cloudfront they load noticeably slower than if my assets are served just from my EC2.
Specifically, my site has a main background image that appears noticeably later than the text delivered by EC2. The loading of this background image doesn't noticeably lag behind the text in Chrome on my laptop, presumably because of the greater performance of Chrome vs. mobile Safari.
I'm at a loss about what to do, since the whole point of CloudFront is to serve assets fast and take load off my EC2, but the delay before this background image appears makes for an unacceptably bad UX.
We've done some comparative tests and it seems that the advantage of using cloudfront depends on the size of the requested file.
For a small file (2 KB), CloudFront's response time was greater than requesting directly from EC2.
For a 15 KB file, the response times were almost the same.
For a 57 KB file (jquery-1.3.2.min.js), CloudFront was 4x to 5x faster than EC2.
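A comparison like this is easy to reproduce with a small timing helper; taking the best of several runs reduces noise from DNS lookups and cold caches. The URLs in the usage comment are placeholders for your own endpoints:

```python
import time
import urllib.request

def time_call(fn, repeats=5):
    """Return the best (lowest) wall-clock duration of fn() over `repeats` runs."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - start)
    return best

def fetch(url):
    """Download `url` fully and discard the body."""
    with urllib.request.urlopen(url) as resp:
        resp.read()

# Usage (placeholder URLs; substitute your EC2 and CloudFront endpoints):
# ec2 = time_call(lambda: fetch("https://ec2.example.com/jquery.min.js"))
# cdn = time_call(lambda: fetch("https://dxxxx.cloudfront.net/jquery.min.js"))
```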
I have been searching to see if it is possible to include a script that serves a low-bandwidth version of a website to visitors on dial-up. There are two sites: one with lots of JS, images, code, etc., and one that is stripped down. I would like dial-up visitors to see the light page instead of the slow-loading one.
Is there a script for this?
Thanks.
You could send the client a tiny Ajax script that calls back to the server. The request can be timed, and you can make your decision based on the measured time. Some time is lost in the probe itself, but the time saved on a slow connection could be significant.
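Once you have the probe's roundtrip time, the decision itself is simple. A sketch of the server-side logic (the probe size, the 56 kbit/s threshold, and the function name are all my own assumptions):

```python
def choose_version(probe_seconds, probe_bytes=2048, threshold_bps=56_000 / 8):
    """Pick a site version from the timed download of a small probe file.

    Estimates throughput in bytes/second from the probe and compares it to a
    dial-up-like threshold (56 kbit/s expressed as bytes per second).
    Falls back to the full site if the measurement is unusable.
    """
    if probe_seconds <= 0:
        return "full"
    throughput = probe_bytes / probe_seconds  # bytes per second
    return "lite" if throughput < threshold_bps else "full"
```

The Ajax callback would report `probe_seconds` to the server, which then redirects the visitor to the "lite" or "full" site accordingly.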