Limitations of Web Workers

Please bear in mind that I have never used Web Workers before and I'm having some trouble wrapping my head around them.
Here's an explanation of a simplified version of what I'm doing.
My page has links to various files - some are text, some are images, etc. Each file has an image showing a generic file icon.
I want the script to replace each generic icon with a preview of the file's contents.
The script will request the file from the server (thereby adding it to the cache, like a preloader), then create a canvas and draw the preview onto it (a thumbnail for images, an excerpt of text for text files, a more specific icon for media files...) and finally replace the generic icon's source with the canvas using a data URL.
I can do this quite easily. However, I would prefer to have it in the background so that it doesn't interfere with the UI while it's working.
Before I dive right into this, I need to know: can Workers work with a canvas, and if so, how would I create one? I don't think document.createElement('canvas') would work, because Workers can't access the DOM - or am I misunderstanding what all the references I've found mean when they say Workers "can't access the DOM"?

You cannot access the DOM from web workers. You cannot load images. You cannot create canvas elements and draw to them from web workers. For now, web workers are pretty much limited to doing Ajax calls and compute-intensive work. See this related question/answer on web workers and canvas objects: Web Workers and Canvas, and this article about using web workers to speed up image manipulation: http://blogs.msdn.com/b/eternalcoding/archive/2012/09/20/using-web-workers-to-improve-performance-of-image-manipulation.aspx
Your simplest bet is to break your work into small chunks (without web workers): do one chunk at a time, call setTimeout(), then process the next chunk. This allows the UI to stay responsive while the work still gets done. If there is any CPU-consuming computation to be done (like image analysis), that part can be farmed out to a web worker and the result sent back to the main thread via a message to be put into the DOM; but if not, just do your work in smaller chunks to keep the UI alive.
Parts of the task, such as loading images and fetching data from servers, can also be done asynchronously, so if done properly they won't interfere with the responsiveness of the UI anyway.
Here's the general idea of chunking:
function doMyWork(items, processItem) {
    // state: index of the next item to process
    var index = 0;

    function doChunk() {
        // do a small batch of work per timeslice, updating the state as we go
        var end = Math.min(index + 10, items.length);
        for (; index < end; index++) {
            processItem(items[index]);   // caller-supplied per-item work
        }
        if (index < items.length) {
            // more work to do: schedule the next chunk so the UI can breathe
            setTimeout(doChunk, 1);
        }
    }

    // start the whole process
    doChunk();
}
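
If part of the work really is CPU-bound, a minimal sketch of the worker hand-off described above might look like this (the file name worker.js and the summing work are just placeholders for illustration):

// main script (runs on the page): hand the heavy computation to a worker
var worker = new Worker('worker.js');        // 'worker.js' is a placeholder file name
worker.onmessage = function (e) {
    // only the main thread may touch the DOM, so apply the result here
    console.log('worker result:', e.data);
};
worker.postMessage([1, 2, 3, 4]);            // the data is structured-cloned to the worker

// worker.js: no DOM access in here, just computation
self.onmessage = function (e) {
    var sum = e.data.reduce(function (a, b) { return a + b; }, 0); // stand-in for real CPU-bound work
    self.postMessage(sum);                   // send the result back to the page
};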

Another (frustrating) limitation of Web Workers is that they can't access geolocation in Chrome.
Just my two cents.

So, as others have stated, you cannot access the DOM or do any manipulations on it from a web worker. However, you can outsource some of the more complex calculations to the web worker. Then, once you get your return message from the web worker in your main JS thread, you can extract the values you need and use them on the DOM there.
This may be unrelated to your question, but you mentioned canvas so I'll share this with you.
If you need to improve the performance of drawing to a canvas, I highly recommend having two canvas objects: one rendered to the UI, the other hidden. That way you can build everything on the hidden canvas, then draw the hidden canvas onto the displayed one. It may not sound like it will do much, if anything, but it increases performance significantly.
See this link for more details about improving canvas performance.
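A minimal sketch of the two-canvas idea described above, assuming the page already contains a visible canvas with id "screen" (the id is an assumption for the example):

var visible = document.getElementById('screen');   // the canvas the user actually sees
var buffer = document.createElement('canvas');     // the hidden, off-DOM canvas
buffer.width = visible.width;
buffer.height = visible.height;

var bctx = buffer.getContext('2d');
// do all of the expensive drawing on the hidden buffer
bctx.fillStyle = '#369';
bctx.fillRect(0, 0, buffer.width, buffer.height);

// then copy the finished frame onto the visible canvas in a single call
visible.getContext('2d').drawImage(buffer, 0, 0);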

Related

How to implement Caching while loading SWCs dynamically

I'm dynamically loading all of my SWCs from my master SWF so that the master SWF itself loads faster. However, I now need to cache all my SWCs on the local machine to speed things up.
private var swcs:Array = [];   // list of {swcPath, className} entries to load
private var swcObj:Object;

private function loadAssets():void
{
    swcObj = new Object();
    swcObj.swcPath = 'assets/swc/1.swc';
    swcObj.className = "Part_0_1";
    swcs.push(swcObj);

    swcObj = new Object();
    swcObj.swcPath = 'assets/swc/2.swc';
    swcObj.className = "0_2";
    swcs.push(swcObj);

    swcObj = new Object();
    swcObj.swcPath = 'assets/swc/3.swc';
    swcObj.className = "0_3";
    swcs.push(swcObj);
}
Then I'm using this array to access all the classes in my project, but I have no idea how to cache these SWCs for faster use. If anyone has an idea, please share.
In fact, the browser does this caching for you; you don't need to make any extra effort. So just load the files normally and don't worry about caching. You can, however, encourage the user to increase their local browser cache so that less time is potentially spent waiting while your assets load, but this won't help should the user watch three tons of YouTube each day.
SWC files are not intended for dynamic loading. They are static libraries that can be linked into a SWF using the -include-libraries and -library-path options of mxmlc, or (since you seem to be using FlashDevelop) the SWC Include Libraries and SWC Libraries settings in Project > Compiler Options. SWCs may hold code (classes), assets (symbols/bitmaps/sounds...) or a combination of the two.
Loading assets dynamically is done through flash.display.Loader. You may use the Loader as a simple DisplayObject instance that you add to the stage (http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/display/Loader.html#includeExamplesSummary), or use its ApplicationDomain as a library of class definitions that will allow you to create instances at will (http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/system/ApplicationDomain.html#includeExamplesSummary).
Caching from the browser will be sufficient in most cases, unless you have VERY specific needs.
In the end, there are different ways to optimize loading times. One is having a small SWF acting as a loader/home menu and loading the rest of the content on demand, as you seem to be trying to do. But you can also create a single SWF with several frames, which will be "streamed" by the Flash player. For example: make the first frame as small as possible, holding just a few KB for a splash screen/logo/loading indicator/whatever you want, so that the initial blank screen is as short as possible, then put the main content in the second frame. You can even extend this system with, for instance:
two levels of preloader: a first tiny frame with just a logo, then a second one with a progress bar and a full background if needed
content split so that, if you are making a game, the home screen/menus are in one frame and gameplay is in another, so gameplay continues loading while you are already displaying the menu

How to stack Sprites rendered inside Iterator?

I am rendering an image to a Sprite inside of an Iterator. I'd like each render (iteration) to remain on the canvas indefinitely, so that each successive render layers on top of the previous ones. How can I do this?
There are no Clears or any other layers in my composition.
In Quartz Composer, you'll almost always want to use a Clear patch — don't assume that you can rely on the prior contents of the framebuffer. So, to accomplish this, you'll need to load all of your images into a structure (probably by using JavaScript to feed an Image Loader patch and build a Queue from that), and then display all of the images each frame using an Iterator.
Check out Apple's "Image TV" sample composition, available in the OS X Developer Library in the Quartz Composer Conceptual Compositions bundle. This example demonstrates how to load a series of images into a structure and then display them.

Custom images for HTMLLoader

There is a powerful HTMLLoader component for AIR, wrapped in mx:HTML for Flex.
I want to supply images manually (ideally from bytes) to mx:HTML, which will display my generated content. The point is to pack all resources into the application without external files. I can pack different HTML pages in the app and switch them when mx:HTML dispatches Event.LOCATION_CHANGE. Now I want the same for images. What do you suggest?
Solved! Went through several stages:
Make the HTMLLoader's background transparent with paintsDefaultBackground="false" and backgroundAlpha="0", get notified of the pictures' locations with JavaScript, and draw them on the HTMLLoader's graphics. This is complex and has problems with resizing (pictures get shifted), but it was almost done...
Next idea: use <canvas> elements to draw the images on, sending the data to JavaScript.
While reading canvas tutorials, I stumbled upon the data URI scheme, which does exactly what I needed in the simplest possible way: images are embedded in the HTML page in base64 encoding.

What is eager loading?

What is eager loading? I code in PHP/JS but a more generalised answer will be just fine.
I saw a lot of questions regarding Java and Ruby, but I don't know either of those languages, and I find it hard to read the code; I don't know what it's supposed to do in the first place.
There are three levels:
Eager loading: you do everything when asked. A classic example is multiplying two matrices: you do all the calculations up front. That's eager loading;
Lazy loading: you only do a calculation when required. In the previous example, you don't do any calculations until you access an element of the result matrix; and
Over-eager loading: this is where you try and anticipate what the user will ask for and preload it.
I hope that makes sense in the context you're seeing it.
Let me give you a "Webby" example.
Imagine a page with rollover images, such as for menu items or navigation. There are three ways the image loading could work on this page:
Load every single image required before you render the page (eager);
Load only the displayed images on page load and load the others if/when they are required (lazy); and
Load only the displayed images on page load. After the page has loaded preload the other images in the background in case you need them (over-eager).
Make sense?
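As a rough sketch of the third (over-eager) strategy above, the remaining images could be preloaded in the background once the page has finished loading (the file names here are placeholders):

// after the page has loaded, quietly warm the cache with the rollover images
window.addEventListener('load', function () {
    var rolloverImages = ['menu-hover.png', 'nav-hover.png'];   // placeholder file names
    rolloverImages.forEach(function (src) {
        var img = new Image();   // never added to the DOM; it only fills the browser cache
        img.src = src;
    });
});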
It's the opposite of lazy loading, which defers initialization of an object until the object is needed. Eager loading initializes an object upon creation.
Imagine you have an object called Person, which has a name, a date of birth, and a number of less critical details, let's say favourite colour and favourite TV programme.
To lazy load this class, you would initialise it by reading in (perhaps from a database) only the core, more frequently used details (say name and date of birth), and only read in the less-used details when/if they are needed. Eager loading is the opposite: you load all the details at the same time.
The benefit of lazy loading is often cited as efficiency; however, if the objects aren't that complex or efficiency isn't a concern, eager loading may be used.
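Here is a minimal JavaScript sketch of that contrast, using a hypothetical person record; loadCoreDetails and loadExtraDetails are stand-ins for database queries:

// stand-ins for database queries (hypothetical)
function loadCoreDetails(id) {
    return { name: 'Ada', dateOfBirth: '1815-12-10' };
}
function loadExtraDetails(id) {
    return { favouriteColour: 'blue', favouriteTvProgram: 'news' };
}

// eager: fetch everything up front, needed or not
function loadPersonEager(id) {
    var person = loadCoreDetails(id);
    person.extras = loadExtraDetails(id);
    return person;
}

// lazy: fetch the rarely used details only on first access
function loadPersonLazy(id) {
    var person = loadCoreDetails(id);
    var extras = null;
    person.getExtras = function () {
        if (extras === null) {
            extras = loadExtraDetails(id);   // happens only if/when someone asks
        }
        return extras;
    };
    return person;
}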
Eager loading is also used in Angular 8. It just means that the instant the application is loaded in the browser, we automatically get all the code inside a particular module; for example, say you just created an AuthModule with Signin and Signup components that gets imported into the AppModule.
In contrast, there is lazy loading, where we tell the AppModule (which has the AuthModule wired into it) to load the AuthModule only at a certain point in time, such as when a user navigates to a certain route.
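As a rough sketch of how the difference shows up in the router configuration (the module path and class names below are assumptions, not taken from the question):

// eager: AuthModule is listed in AppModule's imports, so its code ships with the initial bundle
// imports: [BrowserModule, RouterModule.forRoot(routes), AuthModule]

// lazy: the router only downloads the module's code when the user visits /auth
const routes = [
    {
        path: 'auth',
        // fetched and loaded only when the user navigates to /auth
        loadChildren: () => import('./auth/auth.module').then(m => m.AuthModule)
    }
];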

Does Google Analytics have a major effect on the time to download a static web page?

I understand that by simply adding a script to the end of the body tag of an HTML document, one makes it trackable by Google Analytics. My question is: is this likely to have much effect on performance (download time and server load)? Let's assume a static page of, say, 100k served by IIS. Thanks.
Will my website's appearance or performance be affected by Google Analytics?
The appearance of your website will never be affected by your use of Google Analytics - we don't place any images or text on your pages. Likewise, the performance of your pages won't be impacted, with the possible exception of the very first page-load after you have added the tracking code. This first pageview calls the JavaScript on Google's servers, which may take slightly longer than a regular page load. Subsequent pageviews will use cached data and will not be affected.
It's important to note that many websites on the internet use the same Javascript from the same location on Google's servers, so only rarely will a new user come to your site without already having that file cached locally.
Yes, it does have a performance hit; see http://dotnetperls.com/Content/Google-Analytics-Speed.aspx. To speed things up, it's recommended that you download the ga.js file locally and call that instead, as explained here: http://www.askapache.com/javascript/google-analytics-speed-tips.html.
Edit: Google has released Asynchronous Tracking. I haven't tried it yet, but I guess that it addresses the issues listed below.
I think Google Analytics can make a website slower, because it does happen that ga.js takes noticeably long to load, and this can cause some problems:
If you have a JavaScript that triggers on window.onload (that includes the old-school <body onload=""> syntax), then it won't fire until the web page has downloaded completely. Using something like jQuery's ready event might remedy this, though.
Most browsers do not fill in saved user names and passwords until the web page has loaded completely.
It is not easy for the average user to spot that a web page is simply waiting for the Analytics script to load, so they might be waiting for the little download animation to finish even though the page has essentially finished downloading.
You could follow niallbrowne's suggestion of downloading ga.js and serving from your own web server. But this should be a last resort, since ga.js is cached across web sites and only expires once a week.
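For reference, the asynchronous tracking mentioned in the edit above queues commands and injects ga.js without blocking the page; roughly like this (UA-XXXXX-X is a placeholder account ID):

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']);   // placeholder account ID
_gaq.push(['_trackPageview']);

(function() {
    // create the script element with async set, so parsing never blocks on ga.js
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
        '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
})();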
Yes.
I feel browsing speed is much better since I added Google Analytics to my ad-block filter.
No.
If you put it at the end it will be loaded last, so even if Google's servers are a bit slow your visitors will never notice.
ga.js is 9.58k and a logging call is about 1.2k. The JS will be cached after the first load (I guess even across sites?), so size-wise it's really negligible.
Even if you put the Analytics code at the bottom of your page, from a user's perspective the site hasn't loaded until the little blue bar at the bottom has gone away.
This means that your site will 'feel' slower, depending on (surprise, surprise) how laggy your users' connections are. For dial-up users and users accessing your website from abroad (where request lag is a bigger concern), the extra request will definitely mean a slightly less responsive website.
However, given that every image, every javascript file and any other embedded object is an additional request, if you're already using a rich website layout, this is no reason not to use analytics.
The user's experience is definitely slowed down by GA on a slow connection.
Remember that not every user has fast US based connections.
If you are on a slow connection from a country outside the US, the difference is certainly noticeable.
People running slower computers or browsers outside the norm (i.e. old versions, mobile phones etc) may all be affected by the javascript execution time.
Sometimes I experience lags in pages that use it. I can trace the problem to GA since it's the only script waiting to be loaded. I know this shouldn't happen, but with some page requests it does, rather randomly. Not that it usually matters, since the whole page is already loaded so you can start reading. But it becomes a small problem with pages that use Ajax or generally do stuff on the document ready event. So I add it to my ad-block filters.
Take a look at what the competition says.
Personally I really can't see that there would be much of a difference at all; your browser would cache it after the first request and use it thereafter on every other page.
The script is loaded at the very bottom of the page as well, so everything else should already be loaded.
Regarding server load, the scripts are pulled from Google's servers, not yours, so there will be no noticeable server-side impact. Obviously your pages will all be slightly larger than they were without the code to load the JavaScript, but you'll never notice the difference.
Note too that I've seen GA download a little GIF file with a hash attached to it... but I doubt the size of this will have much of an effect on performance.
If you add the code to the bottom of the page then it probably won't make much of a difference.
If however, you want it to make no difference then I'd take a look at this link:
http://lyncd.com/2009/03/better-google-analytics-javascript/
It describes the approach that Steve Souders took to completely avoid any kind of I/O block.
Although downloading and running the actual ga.js is fast, what I've noticed all across Europe, on different connections/computers/OSes/browsers, is a MAJOR lag (anywhere from 0 to 30 (thirty) seconds) between the last byte of HTTP request and first byte of HTTP response.
This is understandable, given the immense popularity of GA, but it is happening before window.onload fires. So, if your page relies on JS and your users hit this lag, they are not going to analyze which component is responsible - they'll assume your site is horribly slow.
A workaround for this is to register a window.onload function which will add the GA script. Example (using "window.onload=function()" for simplicity):
window.onload = function() {
    var gaJsHost = (("https:" == document.location.protocol)
        ? "https://ssl."
        : "http://www.");
    var s = document.createElement('script');
    s.src = gaJsHost + "google-analytics.com/ga.js";
    s.type = 'text/javascript';
    var bodies = document.getElementsByTagName('body');
    if (bodies.length > 0) {
        bodies[0].appendChild(s);
    } else {
        // this should never happen, but sometimes does (curse you IE6!)
        document.documentElement.appendChild(s);
    }
    // check again after 100ms; if ga.js hasn't loaded yet, _gat won't exist and tracking is skipped
    window.setTimeout(function() {
        if (window['_gat']) {
            var pageTracker = _gat._getTracker("UA-xxxxxx-x");
            pageTracker._trackPageview();
        }
    }, 100);
};