How can I record a screen or region in a browser in good quality without using software or plug-ins? - google-chrome

I already tried RecordRTC, but I had to abandon that approach because I was getting videos with low quality and a low frame rate. I tried changing the options (videoBitsPerSecond, frameInterval) to improve quality, but nothing helped. Maybe I was not competent enough on this issue, but my searches on the topic were fruitless.
If no solution exists, I'll look in the direction of universal plug-ins. Tell me what is best to use in a plug-in to record a screen or region in good quality.

At present, only Firefox supports screen capture through navigator.mediaDevices.getUserMedia(). You can test various constraints of the Firefox implementation for screen, application, and window capture at the getUserMedia Test Page.
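As a rough sketch of what that looks like in code, assuming the Firefox-only mediaSource constraint described above plus MediaRecorder with the videoBitsPerSecond option from your question (the bitrate, frame rate, and 10-second cutoff are just illustrative values):

```javascript
// Sketch: capture the screen in Firefox via the mediaSource constraint,
// then record it with MediaRecorder. Values are illustrative; raise the
// bitrate and frame rate until the quality is acceptable.
async function recordScreen() {
  // Firefox-specific screen-capture constraint ('screen', 'window', 'application').
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { mediaSource: 'screen', frameRate: { ideal: 30 } },
    audio: false
  });

  const chunks = [];
  const recorder = new MediaRecorder(stream, {
    mimeType: 'video/webm',
    videoBitsPerSecond: 8 * 1000 * 1000  // raise this if quality is too low
  });

  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => {
    const blob = new Blob(chunks, { type: 'video/webm' });
    // e.g. offer the blob for download or upload it somewhere
    console.log('Recorded', blob.size, 'bytes');
  };

  recorder.start();
  // Stop after 10 seconds, just for the sake of the example.
  setTimeout(() => recorder.stop(), 10 * 1000);
}
```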

Related

Why do these rendering artifacts happen?

I'm developing a simple mobile game for Android using Cordova. It is a simple hidden-object game where the player has to find the requested objects in a scene. One of my levels had very slow performance; more powerful devices worked fine. I tried forcing GPU rendering using the translateZ hack and got a huge performance boost, but on low-end devices weird rendering artifacts appeared.
This screenshot is from a Meizu U10. The background consists of 3 layers with position: absolute and z-index 1, 2, 3. The objects to find (cow, chicken, etc.) have the same position: absolute, and their z-index depends on the background layer they belong to. If I run the game in a browser on the same device, there are no artifacts and performance is great. Google did not give me any useful clues, so I would appreciate any guesses and tips.
Using SVG can be an option you might need to consider, specifically inline SVGs.
Using inline SVG is beneficial to the performance of a website because it eliminates the HTTP request needed to load an image file. Since no file needs to be downloaded, this results in shorter loading times for a page. This makes your website appear faster to visitors, improving the user experience.
You should really consider using a WebGL rendering engine such as PixiJS if you want good performance in your games. You would just have to convert your images into textures.
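If you go that route, a minimal sketch of the idea might look like this (PixiJS v5/v6-style API; file names, sizes, and positions are placeholders):

```javascript
// Minimal PixiJS sketch: render the background layers and hidden objects
// as sprites instead of absolutely positioned DOM elements.
const app = new PIXI.Application({ width: 1280, height: 720 });
document.body.appendChild(app.view);

// Each image is uploaded to the GPU as a texture once, then drawn by WebGL.
const background = PIXI.Sprite.from('background-layer-1.png');
const cow = PIXI.Sprite.from('cow.png');
cow.position.set(400, 300);
cow.interactive = true;           // make it clickable, like a DOM element
cow.on('pointerdown', () => {
  cow.visible = false;            // the player "found" the object
});

app.stage.addChild(background, cow);
```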

Why not always use 2x images when building web pages? [closed]

I was having a discussion with a designer friend yesterday, the salient points of which I'll detail below:
2x images are larger files, but not the 4x larger you might think. In one example, a 1x image file is 47 KB and its 2x counterpart is only 55 KB.
2x images are only for Retina screens, and despite being phased in on the desktop/laptop side, the truth is that most retina screens are mobile.
While wifi is becoming fairly ubiquitous, desktops (mostly 1x) are the only devices that never have to download data off a cell network.
These all led me to posit the question: Why are we spending energy on providing 2x images, when they are mostly accessed by mobile devices which have the greatest bandwidth limitations?
After sleeping on it, I started to wonder: well fine, if we're going to ignore that last issue, why not just only deal in 2x? CSS can handle scaling down the images in any case (perhaps I'm wrong here?) so why not save the media queries & save the effort to generate and store 2 copies of every image by just using 2x everywhere?
Am I crazy?
The file size issue does bother me. I think things should be as small as possible.
If you're not worried about that, though, the only (temporary) issue I can think of is browser support for background-size. IE8 doesn't support it, and it's still used enough that you have to worry about it (at least on my projects). There's a polyfill for it, but it's not up to snuff with the real thing.
I think the answer depends on where you live in the world. Believe it or not, I live in an area with a lot of country roads that still only have dial-up. Sometimes even that does not work. We are still not nearly as high-speed as we should be on desktops. I can't imagine making those users download the extra data when their screens only display 1x.
So, I think it depends on your target audience, where they are more likely to live or what devices they use. We will get there in time, but for some, not yet.
Let analytics be your guide
I use the 5% rule: once any feature falls more than 2 standard deviations outside the norm, I drop support for it. In browser land, that means IE6 and IE7 are gone for me, but I keep supporting IE8 because a sizeable chunk of the audience is still using it. Yes, the big players like Google have dropped it, but I still see a good chunk of traffic from it on a lot of sites. Why make those users suffer?
Now, how this relates to your question: ask yourself what percentage of your audience is on a 10 Mbps+ LTE connection with a retina screen. Maybe in your case it's 95% retina screens with LTE on mobile, but check your analytics package. My guess is that it's probably under 20%, in which case having fallbacks gives a better UX to 80% of your audience - easily worth the effort.
In my opinion, these are the problems:
Some older phone models (e.g. iPhone 3G) and tablets (e.g. iPad 1) have low memory. A big enough image can cause out-of-memory errors.
To scale an image, the system has to load it at full size and do a relatively expensive scaling operation each time it draws it (sometimes the result is cached).
A scaled-down image doesn't look as good.
You can run into problems with them in older browsers (as mentioned by Bill Criswell).
It increases download size. If we consider a 10 KB increase per picture and 10 pictures per page, then you get an extra 100 KB per page load. If your page has to display a lot of images (think social), the overhead adds up.
You can improve your search engine rankings if your page is smaller and loads faster.
The only major issue is file size. And as you state, in a lot of cases, the file size differences are minimal.
If we're talking mostly icons, the benefit is that a) icons aren't huge to begin with, so the file size increase is minimal and b) icons benefit the most from retina resolution.
On the other hand, if we're talking 'full screen' news photos, those could be quite a bit larger file-size-wise, but they also look perfectly fine when they are not retina (as they are continuous-tone images), so there's a less compelling need to make those retina if you are targeting a mobile device.
A compromise for the latter might be to lazy-load them. Check the screen size. If phone-sized, load the regular image and call it good. If it's larger than phone size, load the regular image, then go back and grab the retina version for those on an iPad 3, for example.
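Something like the following could implement that compromise (a sketch only: the 480px breakpoint, the "@2x" file-naming convention, and the data-has-2x attribute are assumptions for the example):

```javascript
// Sketch of the lazy-load compromise: render regular images first, then
// swap in 2x versions only on larger retina screens.
function upgradeToRetina() {
  const isRetina = window.devicePixelRatio > 1;
  const isPhoneSized = window.screen.width <= 480;

  // Phones (or 1x screens): keep the regular images and call it good.
  if (!isRetina || isPhoneSized) return;

  // Larger retina screens (e.g. an iPad 3): go back and grab the retina
  // versions after the page has already rendered with the regular ones.
  document.querySelectorAll('img[data-has-2x]').forEach((img) => {
    img.src = img.src.replace(/\.(png|jpg)$/, '@2x.$1');
  });
}

window.addEventListener('load', upgradeToRetina);
```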
The only technical problem is IE8 and older. They can't handle the CSS you'd typically use for retina images. There are workarounds, but not for sprites--which you'd commonly use for icons.
Eventually, we'll see more SVG support, which will solve this problem, at least for icons. When I am doing pure iOS work, for example, almost all of my imagery is SVG now. It's smaller and automatically retina-ready.

How to measure complete all-in performance of DOM changes?

I've found lots of information about measuring load time for pages and quite a bit about profiling FPS performance of interactive applications, but this is something slightly different than that.
Say I have a chart rendered in SVG and every click I make causes the chart to render slightly differently. I want to get a sense of the complete time elapsed between the click and the point in time that the pixels on the screen actually change. Is there a way to do this?
Measuring the JavaScript time is straightforward, but that doesn't take into account any of the time the browser spends doing layout, reflow, paint, etc.
I know that Chrome's Timeline view shows a ton of good information about this, which is great for digging into issues but not so great for taking measurements, because the tool itself affects performance and, more importantly, it's Chrome only. I was hoping there was a browser-independent technique that might work, something akin to how the Navigation Timing API works for page load times.
You may consider using HDMI capture hardware (just google for it) or a high-speed camera to create a video, which could be analyzed offline.
http://www.webpagetest.org/ supports software-only capture, but I suspect it would be too slow for what you want to measure.
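If hardware isn't an option, a rough in-browser approximation is to bracket the change with a double requestAnimationFrame: the nested callback fires after the frame that includes your change, so style, layout, and paint time are roughly included. It is not pixel-exact the way a camera is, and the element id and updateChart function below are placeholders:

```javascript
// Approximate click-to-render time for a synchronous DOM/SVG update.
const chartElement = document.querySelector('#chart');   // assumed container id

// Placeholder for whatever synchronously mutates the SVG chart.
function updateChart(event) { /* ... re-render the chart ... */ }

chartElement.addEventListener('click', (event) => {
  const start = performance.now();

  updateChart(event);

  // The first rAF callback runs just before the next paint; the nested one
  // runs after that frame, so layout and paint of the change are included.
  requestAnimationFrame(() => {
    requestAnimationFrame(() => {
      const elapsed = performance.now() - start;
      console.log('click-to-render (approx):', elapsed.toFixed(1), 'ms');
    });
  });
});
```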

Most efficient way to cater for different web devices?

Folks voting to close as not constructive, read the whole thing please. Specific questions at the end. Looking for real world examples and approaches.
Context
With numerous devices like smartphones and tablets increasingly being used to access the web, it's important to plan, design (responsively), and develop (especially your front end) to give these devices a fast and tailored user experience.
There are some amazing sites being built. Have a look at mediaqueri.es (resize your browser).
We see approaches such as
big screen first, then target smaller devices.
mobile first then media queries to spice things up for bigger screens.
device detection with different techniques (including server side)
and serving completely different markup and content to devices.
Question(s)
What are you folks doing out there today? Why did you choose your approach and most importantly if it isn't the most efficient approach to tackle this, then what is?
Things I'm looking for:
Is it a pure CSS / JS / HTML approach, or server side, or a combination - why?
Each device gets only the resources (images etc.) that it needs so it performs well
Maintenance of the site is easier, i.e. adding / changing features is not a huge pain
Some code samples are always useful
Let's leave out old, shitty browsers like IE7 and below.
I think what you are looking for is Responsive Web Design.
See:
http://www.alistapart.com/articles/responsive-web-design/
http://coding.smashingmagazine.com/2011/01/12/guidelines-for-responsive-web-design/
Responsive Web Design (using CSS) does not necessarily address performance issues but is a good starting point. Keeping in mind that premature optimization is the root of all evil, you can profile your users and bandwidth and determine where you need to optimize once you have a working design in place.
For a discussion of some of the downsides, e.g., image resizing in the browser (you can work around this with CSS and/or AJAX, though), see:
http://www.webdesignshock.com/responsive-design-problems/
You will want to use a framework like PhoneGap. HTML evolves slowly, but every day about 10 new devices with new bugs ("features") hit the market and your innocent app; there simply is no way for a small group of people to handle the necessary work.
As for different devices, there are two trouble areas:
Screen sizes
Render bugs
For screen sizes, there are pretty cool solutions today. For example, some frameworks add a CSS class like "640x400" or "480x800" to the body element. That makes it dead simple to style elements depending on the screen size. Or you can use JavaScript to pull in the matching style sheets.
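A bare-bones sketch of that trick, without any framework (the breakpoints and stylesheet paths are made up for the example):

```javascript
// Add a screen-size class to <body> and load a matching stylesheet.
function applyScreenSizeClass() {
  const width = window.screen.width;
  const bucket = width <= 480 ? 'phone'
               : width <= 800 ? 'tablet'
               : 'desktop';

  document.body.classList.add('screen-' + bucket);

  // Pull in the matching stylesheet only for this size bucket.
  const link = document.createElement('link');
  link.rel = 'stylesheet';
  link.href = '/css/' + bucket + '.css';
  document.head.appendChild(link);
}

document.addEventListener('DOMContentLoaded', applyScreenSizeClass);
```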
For render bugs, you'll need workarounds. As I said above, this is nothing a single person or a small group can handle. Every few months, there is a new version of Android with a new render engine and new "features". You fix problems for one of them and break ten others at the same time. Select a framework which plays well with many devices and which is open to changes (read: avoid anything proprietary).
That way, you get solutions for the common problems and you can fix small issues yourself.
I would definitely recommend RedFilter's answer (+1).
I just wanted to add some libraries you might find interesting on this topic. They involve browser feature detection and conditional resource loading.
Check them out:
http://www.modernizr.com/
http://yepnopejs.com/
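For example, Modernizr's yepnope integration (Modernizr.load, available in builds that include it) lets you load resources conditionally based on a feature test. The file names here are placeholders:

```javascript
// Feature-detect touch support, then load only the matching resources.
Modernizr.load({
  test: Modernizr.touch,              // does the browser report touch events?
  yep:  ['touch.css', 'touch.js'],    // loaded only on touch devices
  nope: 'mouse.css',                  // loaded everywhere else
  complete: function () {
    console.log('Device-specific resources loaded');
  }
});
```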

Confused with mobile-friendly websites or related

I read somewhere that if you go to http://skweezer.com/ and enter a website URL, you can see how the site will appear in mobile browsers. I tried it with this URL (http://sachindra149.wordpress.com/) and was baffled by the way it looked.
1) Can anyone please let me know why that is so?
2) Why is there such a major difference?
3) And what needs to be done, keeping in mind that I don't have control over the page's code, as it is a hosted blog?
I hope I am clear enough!
It looks like it just dumps pages to text. That isn't a realistic view of what mobile browsers do these days. Unless you are explicitly developing websites for low-end and old mobile devices, don't worry about it.
If you are developing websites for such devices, then:
Don't use generic blog hosting which doesn't give you lots of control
Avoid tabular data
Avoid chrome (large navigation, anything that isn't the primary content)
Keep content short and to the point
Test on real mobile devices instead of third-party emulators
First of all, it's not as bad as it looks. Mobile devices have only very limited resources, but they are much better today than they used to be.
What you see on skweezer.com is just the raw text without CSS styling, tables, and other complicated HTML. This way, the site loads much (!) faster and you only transfer a fraction of the data: your original site needs 320 KB, the Skweezer version needs 50 KB, less than one sixth. Mobile browsing has become much faster and cheaper, but it's still many times slower and more expensive than on the desktop.
As for what you can do about it: not much. You could select a design that is optimized for mobile devices, but to know how good it looks, you would need all the mobile devices that access your site. I suggest relying on the experience of the designers at WordPress; your site does look much better on most mobile devices.
I prefer checking them on real devices.
A Chrome extension like this will help you view your website at different screen sizes if you only have a limited set of devices.