I am implementing a very simple download speed tester.
To do that, I simply download a large image file from the web and see if it's received within a reasonable time.
I'm using the following file because I saw it used in some source code somewhere:
https://cdn.pixabay.com/photo/2017/08/30/01/05/milky-way-2695569_1280.jpg
However, I am afraid that the image might go away at some point.
Which image could I use to make sure it will always be there?
Is there perhaps an image that a foundation or similar organization has created specifically for this purpose, with a promise that it will remain available for a long time?
I was even thinking about downloading a really popular JS file, such as https://code.jquery.com/jquery-3.6.0.min.js, on the assumption that it will be around for a long time, but I am not sure about this either.
How could I handle this task in the most reliable way?
Thank you!
I would recommend Wikimedia Commons.
Reasons:
Wikimedia has fairly strict guidelines that allow only high-quality uploads, which are of course larger in size.
It's free and persistent (files stay available for years).
You can go with videos for even bigger sizes.
For images it's https://commons.m.wikimedia.org/wiki/Category:Images
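If it helps, here is a minimal sketch of such a tester in Python, assuming you settle on a Commons-hosted file; the URL below is a placeholder, not a real file:

```python
import time
import urllib.request

# Placeholder: substitute the URL of whichever Commons-hosted file you pick.
TEST_FILE_URL = "https://upload.wikimedia.org/wikipedia/commons/..."

def measure_download_speed(url: str, timeout: float = 30.0) -> float:
    """Download the file once and return the average speed in bytes per second."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        data = response.read()  # read the whole body into memory
    elapsed = time.monotonic() - start
    return len(data) / elapsed

if __name__ == "__main__":
    speed = measure_download_speed(TEST_FILE_URL)
    print(f"Average download speed: {speed / 1_000_000:.2f} MB/s")
```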
I'd like to make an HTML document with all of my own photos saved on my computer, and use it as an offline catalog, as most image viewing software (commercially available) don't satisfy my needs.
There's about 1000 photos and the vast majority of them are PNGs, about 10-15MB each.
I'm suspecting that each time I'd open this HTML document, loading would take an enormous amount of time.
To prevent that, I was told I should create thumbnail versions of those photos (e.g., 200x200px JPEGs) and use them instead, wrapping each one in a link that points to the original locally saved PNG file.
Just asking to verify whether there's any better alternative to this, or should I really go through with this thumbnail solution? If there isn't, how would I mass-generate those thumbnails so I don't have to do it individually?
I'm on a High Sierra Mac and a Windows 7 PC.
Extra question: should I, by any chance, use a document written in a language other than HTML for this purpose? For example, a language that is faster, more efficient, or more cross-platform that I didn't think of myself? I'm open to suggestions.
Thank you.
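For what it's worth, here is a rough sketch of the thumbnail-plus-catalog approach in Python using the Pillow library; the folder names and sizes are assumptions, not anything from your setup:

```python
# Rough sketch: build 200x200 JPEG thumbnails for every PNG and write a
# simple catalog.html whose thumbnails link back to the originals.
# Requires the Pillow library (pip install Pillow); folder names are assumed.
from pathlib import Path
from PIL import Image

PHOTOS = Path("photos")   # the original 10-15 MB PNGs
THUMBS = Path("thumbs")
THUMBS.mkdir(exist_ok=True)

rows = []
for png in sorted(PHOTOS.glob("*.png")):
    thumb = THUMBS / (png.stem + ".jpg")
    with Image.open(png) as im:
        im.thumbnail((200, 200))  # shrinks in place, keeps aspect ratio
        im.convert("RGB").save(thumb, "JPEG", quality=80)
    rows.append(f'<a href="{png.as_posix()}"><img src="{thumb.as_posix()}" alt="{png.stem}"></a>')

Path("catalog.html").write_text(
    "<!doctype html>\n<meta charset='utf-8'>\n<title>Photo catalog</title>\n" + "\n".join(rows),
    encoding="utf-8",
)
```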
So, I've made my first website, but I've run into a sort of a problem. In the website, here, I have photos. Now, the image dimensions are fairly large, and I would like to keep it that way, so they would be resized to fit as many screen sizes as possible. Now, as you can see in the site, I have uploaded my pictures to imgur. That was before I had a server to work with, for testing purposes. Will uploading the pictures to my server improve load time of the photos?
Also, on a side note, what are the advantages of uploading them to a server? (one with limited space, mind you. This is mainly why I am asking.)
Would uploading plugins such as jQuery to the server, instead of using, say, <script src="CDN of choice"> and <link> for CSS, improve initial load time?
My apologies if this is a stupid question, but bear in mind that I am new to this. If you'd like to, any other critique of the site would be appreciated.
The biggest improvement you can get here is by compressing your images. When saving in Photoshop, select "Save for Web"; to further decrease the file size, use something like https://tinyjpg.com/. I was able to compress one of your images from 2.5 MB to 947 KB. Do this for all your images and you won't have to worry about a CDN or load time. Your JavaScript and CSS files are much, much smaller than your images, and I doubt you will get any real load-time improvement by optimizing them.
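If you would rather script the compression than run every file through Photoshop or tinyjpg, a rough Pillow sketch along these lines might do; the folder names and quality setting are assumptions:

```python
# Re-save JPEGs with a lower quality setting and optimization enabled;
# quality 70-80 is usually hard to distinguish from the original on photos.
# Requires Pillow; "images" and "images_compressed" are assumed folder names.
from pathlib import Path
from PIL import Image

OUT = Path("images_compressed")
OUT.mkdir(exist_ok=True)

for src in Path("images").glob("*.jpg"):
    dst = OUT / src.name
    with Image.open(src) as im:
        im.save(dst, "JPEG", quality=75, optimize=True)
    print(src.name, src.stat().st_size, "->", dst.stat().st_size)
```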
Uploading your images (and/or other assets) to your own server makes sense at first, but it will give you the following problems:
The more people visit your website, the more of your allocated bandwidth you burn through.
The speed of your website for your visitors will depend on your server and how far they are from it.
For the second issue, I would always suggest using a CDN such as Cloudflare or CloudFront; it makes the load time much faster and also saves bandwidth.
Also, if bandwidth and space are a real problem, you should either consider a better host or simply use something like Amazon S3 to host your static files.
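If you go the S3 route, the upload can be scripted; here is a minimal sketch with boto3, where the bucket name and folder layout are placeholders:

```python
# Minimal sketch: push a folder of static assets to an S3 bucket with boto3.
# Assumes AWS credentials are already configured; the bucket name is made up.
from pathlib import Path
import mimetypes
import boto3

BUCKET = "my-static-assets"  # placeholder bucket name
s3 = boto3.client("s3")

for path in Path("static").rglob("*"):
    if path.is_file():
        content_type = mimetypes.guess_type(path.name)[0] or "application/octet-stream"
        s3.upload_file(
            str(path),
            BUCKET,
            path.relative_to("static").as_posix(),
            ExtraArgs={"ContentType": content_type},
        )
        print("uploaded", path)
```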
I'm testing a website speed using PageSpeed Insights tool.
In the result page, one of the warnings suggested me to reduce byte size of css, html and js files.
At first I tried to remove comments, but nothing changed.
How can I do that?
Should I remove spaces and tabs?
It seems like a very tedious operation; is it worth it?
The process of removing spaces, tabs, and other unnecessary characters is called minification.
You don't need to do it by hand; there are a lot of services that can minify the files for you,
for example:
http://www.willpeavy.com/minifier/
Be careful if you have jQuery code: sometimes the minifier removes spaces in the wrong places.
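If you would rather not paste files into an online minifier, the same step can be scripted; here is a small sketch using the rcssmin and rjsmin packages (assuming those fit your toolchain):

```python
# Sketch: write a minified copy (file.min.css / file.min.js) next to each
# original. Assumes the rcssmin and rjsmin packages are installed and that
# the files live in an "assets" folder.
from pathlib import Path
import rcssmin
import rjsmin

for css in Path("assets").glob("*.css"):
    css.with_suffix(".min.css").write_text(
        rcssmin.cssmin(css.read_text(encoding="utf-8")), encoding="utf-8")

for js in Path("assets").glob("*.js"):
    js.with_suffix(".min.js").write_text(
        rjsmin.jsmin(js.read_text(encoding="utf-8")), encoding="utf-8")
```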
You have two things to do to reduce page size:
Minify CSS & JS files
On the server side, if you are running your website on Apache, you can install APC for page caching. You'll get better performance.
In addition to the CSS minifier/prettifier tools above, I recommend proCSSor for optimizing CSS files. It offers a variety of advanced options.
I've never found those tools to be much use beyond giving some tips about what might be slowing the site down. Minifying is unlikely to achieve much. If you want to speed up your site, save the page and see which files are the largest. Generally they will be the image files rather than the code; see if you can shrink those.
Also, try testing it on two different servers: is your host slow?
If your html file is massive, that suggests a problem with the site's structure - it is rare that a page needs to be large.
Finally, large JavaScript files are most likely to be things like jQuery. If Google hosts these, use the hosted version; that way, it will probably already be in the user's cache and won't impact your loading time.
EDIT, after further testing and incorporating the issues discussed in the comments below:
PageSpeed Insights is an utterly amateurish tool, and there are much more effective ways to speed up rendering time than minifying the code.
As a matter of standard practice it advises reducing HTML, CSS, and JS file sizes whenever they are not minified. A much, much better tool is the Pingdom Website Speed Test, which compares rendering speed to the average of the sites it has been asked to test and gives the download times of the site's components.
Just test www.gezondezorg.org on both and see the enormous difference in test results, where the Google tool is dead wrong. It advises reducing the CSS and JS files, while its own figures (click the respective headers) show that doing so would reduce their sizes by only 3.8 and 7.9 kB, respectively. That comes down to less than 1 millisecond of download-time difference (1 millisecond = 1/1000 of a second; assuming broadband internet).
Also, it says that I did do a good thing by enabling caching. That is BS as well, because my .htaccess file tells browsers to check for newly updated files on every visit and to refresh cached files whenever they have been updated. Tests confirm that all browsers heed that directive.
Furthermore, that site is not intended to be viewed on mobile phones; there is just too much text on it for that. Nevertheless, PageSpeed Insights defaults to showing the results of testing against mobile-phone criteria.
More effective ways to speed up the rendering
So, minifying hardly does anything to speed up rendering time. What does help is the following:
Put your CSS and your JavaScript in as few files as possible, ideally one file each. That saves browser-to-server (BTS) requests. (Do keep in mind that quite a number of scripts need the DOM to be fully loaded first, so in practice it comes down to putting the scripts in at most two files: a pre-body and a post-body file; see the sketch after this list.)
Optimize large images for the web. Photoshop and the like even have a dedicated function for that, reducing the file size while keeping the quality good enough for use on the web.
In the case of images that serve as full-size backgrounds for containers, use image sprites. That saves BTS requests as well.
Code the HTML and JS files so that there is no rendering dependency on files from external domains, such as from Twitter, Facebook, Google Analytics, advertisement agencies, etc.
Make sure to get a web host that responds swiftly, has sufficient processing capacity, and has (almost) 100% up-time.
Use vanilla/native JS as much as possible, and use jQuery or other libraries only for tasks that would otherwise be too difficult or too time-consuming. jQuery is not only an extra file to download; it is also processed more slowly than native JS.
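As promised in the first point above, here is a rough sketch of a build step that concatenates scripts into a pre-body and a post-body bundle; the file names are made up:

```python
# Rough sketch of a build step that concatenates scripts into the two bundles
# mentioned above (pre-body and post-body). The file lists are assumptions;
# adjust them to your own project layout.
from pathlib import Path

BUNDLES = {
    "js/prebody.bundle.js": ["js/polyfills.js", "js/critical.js"],
    "js/postbody.bundle.js": ["js/menu.js", "js/gallery.js", "js/contact-form.js"],
}

for target, sources in BUNDLES.items():
    combined = "\n;\n".join(Path(src).read_text(encoding="utf-8") for src in sources)
    Path(target).write_text(combined, encoding="utf-8")
    print(f"wrote {target} ({len(sources)} files)")
```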
Lastly, you should realize that:
having the server minify the codes on the fly generally results in a much slower response from the server;
minifying a code makes it unreadable;
de-minifying tools are notorious for their poor performance.
Minifying resources refers to eliminating unnecessary bytes, such as extra spaces, line breaks, and indentation. Compacting HTML, CSS, and JavaScript can speed up downloading, parsing, and execution time. In addition, for CSS and JavaScript, it is possible to further reduce the file size by renaming variable names as long as the HTML is updated appropriately to ensure the selectors continue working.
You can find plenty of online tools for this purpose; a few of them are listed below.
HTML Minify
CSS Minify
JS Minify
Good luck!
I was just wondering: what are the pros and cons of using embedded images instead of dynamic loading? When making games in pure AS3 (without the Flash IDE), it's a pain to manually embed all the assets needed... That makes your code sloppy; besides, you lose the ability to change the HUD, for example, just by changing an external file.
But I heard that some sites only let you upload a single SWF, so you can't have external images. I also heard that some people are worried about users downloading their art... but as far as I know, and please correct me if I'm wrong, users can also download the assets if they open the SWF with a decompiler. If you keep an image external, you can encrypt it and decrypt it in code, so anyone who tries to download it will only get encrypted data.
So... what do you think about embedding images? Please share all your thoughts.
I believe dynamic loading of images is the better approach. I agree with you about the game problem you describe, but when you are talking about Flash/AS3 as a whole, games are just one of the many things you can do. Also, there are a few sites that already accept multiple files, and maybe more will allow it later. For now, hosting sites are just being on the safe side by not allowing multiple files and formats. So if you really have additional files, you could just host them elsewhere and load them from your main SWF.
However, I can't agree that managing images dynamically makes the code sloppy. When you do it through an IDE, the IDE writes the code for you, but as you might realize, letting an IDE decide what to write doesn't always produce the best result. Handling things manually lets you understand all the entry and exit points of your app. Moreover, would you want to open the Flash IDE every time you wish to add an image, make an update, etc.?
I usually like to use the IDE because of the tools it provides to make things more efficient, and I prefer letting the code do all the management/control work. And yes, if you have many small images (as in online Flash games), embedding is the better approach.
As far as security is concerned, even externally loaded files can be accessed if the encryption algorithm you use can be found by decompiling the SWF. So your best bet for security is usually to use a third-party tool to encrypt the SWF, which, let's say, increases your chances of preventing theft of your material. If you do encrypt the SWF using a third-party tool, both approaches would be acceptable.
I believe both approaches are valid, it mainly depends on your assets.
As you know, embedding assets will increase the size of your SWF, so I would only consider doing this for icon-type images where size is hardly an issue.
For bigger images, I would definitely go for dynamic loading which I also find more flexible.
If I have lots of small icons, I embed them. Imagine that number of requests at runtime, any of which may time out. Where is the pain of embedding? A single [Embed] tag in source or CSS per asset. "Constant" assets should be embedded, "variable" ones loaded.
Edit: OK, I get it: the pain of embedding lots of assets. Here is one idea that came to mind... Even if you load something dynamically, you still need some list of all the filenames, right? You can take that list of files and generate a class full of public static const members with [Embed] attributes; that's fairly trivial. Then you use that class in your project and voilà, everything is in place. Maybe this helps.
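To illustrate that idea, here is a sketch of such a generator written in Python (the class name, output path, and asset folder are made up); it scans a folder and emits an AS3 class with one [Embed] constant per file:

```python
# Sketch: generate an AS3 class with one [Embed] constant per file in ./assets.
# The class name, output path, and folder are made up; adjust to your project.
from pathlib import Path
import re

ASSET_DIR = Path("assets")

lines = ["package {", "    public class Assets {"]
for asset in sorted(ASSET_DIR.iterdir()):
    if asset.is_file():
        const_name = re.sub(r"\W+", "_", asset.stem).upper()
        lines.append(f'        [Embed(source="../assets/{asset.name}")]')
        lines.append(f"        public static const {const_name}:Class;")
lines += ["    }", "}"]

Path("Assets.as").write_text("\n".join(lines) + "\n", encoding="utf-8")
```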
At the company I work for, we have a CBT system that we developed. We have to create books out of the content in the system, so I built a program that downloads all of the content and creates an offline version of the different training modules.
I then created a program that produces PDF documents from the offline version of the CBT. It works by using Websites Screenshot to take screenshots of the different pages and then using iTextSharp to build a PDF document from those images.
It seems to be a memory hog and is painfully slow. There are 40 CBT modules that need to be turned into books. Even though I take every step to clear memory after each book is created, after about 2 books it crashes because there is no memory left.
Is there a better way to do this, instead of taking screenshots of the pages, that will still reproduce the look of the web page inside the PDF document?
I have searched and demoed and found that ABCPdf from WebSuperGoo is the best product for .NET. It is the most accurate and doesn't require a printer driver. It uses IE as the rendering engine, so it looks almost exactly like what you get in IE.
PrinceXML is commercial software that generates PDFs from websites.
I've used PDFSharp in the past and have had good success generating PDFs.
It's open source as well, so in the event of troubles like you've mentioned, you're able to hunt and peck to increase performance.
If you control the source, it is probably not too difficult to generate the PDF directly instead of going through a screenshot.
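Purely to illustrate the idea (the asker's stack is .NET, so treat this as a sketch of the concept rather than a drop-in solution): with an HTML-to-PDF library such as WeasyPrint in Python, the offline pages could be rendered straight to PDF without any screenshots; the folder names are assumptions:

```python
# Sketch: render each offline module page directly to a PDF, no screenshots.
# Uses WeasyPrint; the module folder layout is an assumption.
from pathlib import Path
from weasyprint import HTML

Path("pdf").mkdir(exist_ok=True)
for page in sorted(Path("offline_cbt/module_01").glob("*.html")):
    HTML(filename=str(page)).write_pdf(f"pdf/{page.stem}.pdf")
```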
Did you try unloading the DLL?
There are also different ways of getting screenshots:
http://mashable.com/2007/08/24/web-screenshots/