I'm not sure why (probably an update), but Chrome has lost significant performance while running some things I've made with three.js. I hadn't worked on anything for a month, and now that I've returned to my project I've found things suddenly running much slower than they used to. I used to get a smooth 60+ fps, and now one of my programs is chugging along at 20 fps.
Just to be clear, I've changed absolutely nothing. I simply opened my projects a month later and the performance has dropped by 40+ fps, which is frightening. This is true for anything using three.js.
I'm wondering if anyone knows what the issue is.
EDIT:
http://gamejolt.com/games/arcade/tiny-tank/27522/
This is an application I've made whose performance has significantly degraded, at least on my machine. Strange shading behavior has also appeared on certain objects, possibly due to hidden lights(?).
I'm using the WebGL renderer by the way.
I'm using three.js version r66, since there are no migration instructions on GitHub for moving to any higher version.
Go to chrome://flags and make sure "Override software rendering list" is set to Enabled. This ensures GPU acceleration is enabled even on unsupported system configurations.
There used to be an option in dev tools under Render right next to the FPS Meter that said "Enable continuous page repainting". Now it's not there. Where did it go?
Version 60.0.3112.113 (Official Build) (64-bit)
It's been removed. See Crbug issue #523040.
The broader question here is, what workflow can you use to replace Continuous Painting Mode (CPM)?
CPM was before my time, but it looks like it gave you a real-time estimate of painting cost for the entire page. There's nothing equivalent to that anymore, but Performance recordings can definitely help you gauge how much painting your page is doing. The general idea is to start a recording, interact with your page, and then analyze the results to see how many paint events occurred and how much time each one took. See Get Started With Analyzing Runtime Performance to get familiar with the Performance panel.
2 Nov 2017 Update
There's a new feature coming to DevTools in Chrome 64 (currently in Canary) that's close to CPM, called the Performance Monitor. It shows a real-time view of FPS, layouts per second, and style recalculations per second.
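In the meantime, a requestAnimationFrame loop gives a rough in-page FPS number (a minimal sketch; it only measures how often the browser grants animation frames, not paint cost, so it's a cruder signal than the DevTools meter):

    // Rough in-page FPS counter: counts animation frames per second.
    var frames = 0;
    var last = performance.now();

    function tick(now) {
      frames++;
      if (now - last >= 1000) {
        console.log('FPS: ' + frames);   // logs once per second
        frames = 0;
        last = now;
      }
      requestAnimationFrame(tick);
    }
    requestAnimationFrame(tick);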
The "continuous page repainting" debugging option was removed from Chrome quite a few versions ago. However, you can still get to paint instrumentation in the Performance tab of the developer tools:
Developer tools -> Performance -> Settings -> Enable advanced paint instrumentation
This will not enable continuous repainting, since as far as I can tell Chrome no longer supports it, but it will let you see a profile of how your page actually behaved during the recording, which can be very useful for tracking down performance problems. It is integrated with the other performance profile data as well.
I personally have found this article: https://blog.algolia.com/performant-web-animations/ to be useful if you're working on animations, but I'm not going to summarize it here since it's quite long and I'm not sure you're specifically looking to improve animation performance anyway. (No association with the author; just useful info.)
My MacBook started freezing recently, and I thought it might be due to a shortage of memory. So I opened Activity Monitor for the first time, and I noticed there are some weird things happening with Chrome. Why are there so many Google Chrome Helper processes?
Is there any reason why there are so many processes?
Many thanks for your help
Well, for one, each tab in Chrome runs independently, so you'll see quite a few processes running due to that alone.
Another thing to keep in mind is extensions. Each of them usually runs in its own process as well. If you have a lot of extensions installed, they add up rather quickly.
This is a post for WebGL developers.
Are you getting nv4_disp.dll blue screen errors since you started developing WebGL apps?
I've had 1 or 2 every week over the last month, and I'm doing almost the same thing as before. (That is, there is no big difference in my app.)
I'm using three.js and Chrome, and I promise I get this error while playing with little 3D worlds (a simple plane and some lines, believe me). I have Chrome + DevTools open.
Sometimes I get the error while I'm listening to music on Spotify.
I know this information may be irrelevant, but maybe somebody has had similar problems to mine.
Any ideas? Thanks.
Blue screen errors are an indication of faulty hardware or faulty graphics drivers. Chrome itself cannot be the root cause of the problem, as Chrome cannot write into the kernel space where the problem occurs.
I suggest you change your hardware if you want to get rid of the problem.
You may or may not want to report this to Nvidia, but most likely they don't care about individual reports, especially if the hardware is no longer being sold.
We have an AIR application (native for Windows) that, after being unused or in the background for some hours, gets stuck when you try to use it again, with the Windows "not responding" message.
There is no log, error, or crash; it just gets stuck.
The best behaviour we have managed to get is that after some seconds it unsticks, but then it sticks and unsticks again every 30 seconds or so, which makes it unusable.
We have tried some "ugly" tricks, like bringing the app to the front after it has been out of focus/in the background for some minutes, but this hasn't resolved the problems above.
I understand that the OS (Windows) has dropped the cached memory pages associated with the process. However, this happens to every program, so it is normal that when a program (especially a visually heavy one like ours) comes back into use, it takes some seconds to page back into memory.
However, after that re-caching time, most programs work like a charm.
This is not the case for our AIR app.
Is there any trick or recommendation for fixing this?
Maybe we should combine the "bring to front" trick with some other "dumb processing" to keep the program in memory? Something like the sketch below, perhaps.
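This is only a rough sketch of the idea in JavaScript terms (AIR's ActionScript setInterval looks nearly identical); whether touching memory like this actually stops Windows from paging the app out is our assumption, not something we've verified:

    // Hypothetical keep-alive: keep a block of data alive and touch it
    // periodically so its pages count as recently used. The size and
    // interval are guesses, not tuned values.
    var STRIDE = 1024;                     // gap between touched slots
    var keepAlive = [];
    for (var j = 0; j < 1024 * 1024; j++) {
      keepAlive.push(j);                   // a few MB of live numbers
    }

    setInterval(function () {
      for (var i = 0; i < keepAlive.length; i += STRIDE) {
        keepAlive[i]++;                    // read+write keeps pages warm
      }
    }, 30 * 1000);                         // every 30 seconds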
thanks in advance.
As far as I know, at the current moment (late 2011), the max-connections-per-server limit remains 6. Please correct me if I am wrong. It's bad that we cannot change this easily, the way we can in Firefox. As far as I know, this value is hardcoded.
One solution is to download the Chromium sources and rebuild them. Is there an easier solution?
Is there any tricky way to hack this without creating a dozen mirror domains?
Why I'm asking the question: my task is to create an HTML/JavaScript slideshow that will run inside a fullscreen browser on a huge monitor hanging on a wall. The JavaScript is really complicated: it preloads photos and makes a lot of AJAX calls to my web services. If the Wi-Fi connection is slow and 6 photos are loading, the AJAX calls fail and the application runs badly. I want a fast solution based on HTTP, the browser, an Ubuntu tweak, or something else, because rebuilding the JavaScript app will take days.
Off-topic: do you know of any other things that can be tweaked in my concrete situation?
IE is even worse, with a limit of 2 connections per domain. But I wouldn't rely on fixing client browsers: even if you have control over them, browsers like Chrome auto-update, and a future release might behave differently than you expect. I'd focus on solving the problem within your system design.
Your choices are to:
Load the images in sequence so that only 1 or 2 XHR calls are active at a time (use the success event from the previous image to check whether there are more images to download, then start the next request; see the sketch after this list).
Use sub-domains like serverA.myphotoserver.com and serverB.myphotoserver.com. Each sub-domain has its own pool for connection limits, so you could have 2 requests going to each of 5 different sub-domains if you wanted to. The downside is that the photos will be cached per sub-domain. BTW, these don't need to be "mirror" domains; you can just add additional DNS records pointing to the exact same website/server. That way you don't have the headache of administering many servers, just one server with many DNS records.
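A rough sketch of both options (the URL list, shard names, and myphotoserver.com domain are illustrative, not a real API):

    // Option 1: load photos strictly in sequence, one connection at a time.
    function preloadSequentially(urls, onDone) {
      var i = 0;
      (function next() {
        if (i >= urls.length) { onDone(); return; }
        var img = new Image();
        img.onload = img.onerror = function () {
          i++;      // advance whether this photo loaded or failed
          next();   // only now start the next download
        };
        img.src = urls[i];
      })();
    }

    // Option 2: spread requests across sub-domains so each one gets its
    // own connection pool (serverA/serverB are just extra DNS records
    // pointing at the same server).
    function shardedUrl(path, i) {
      var shards = ['serverA', 'serverB'];
      return 'http://' + shards[i % shards.length] + '.myphotoserver.com' + path;
    }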
I don't know that you can do it in Chrome outside of Windows -- some Googling shows that Chrome (and therefore possibly Chromium) might respond well to a certain registry hack.
However, if you're just looking for a simple solution without modifying your code base, have you considered Firefox? In the about:config you can search for "network.http.max" and there are a few values in there that are definitely worth looking at.
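For example, a user.js snippet along these lines raises the limits (the exact values here are just illustrative):

    // user.js: raise Firefox's HTTP connection limits.
    // max-persistent-connections-per-server is the counterpart of
    // Chrome's hardcoded 6.
    user_pref("network.http.max-persistent-connections-per-server", 12);
    user_pref("network.http.max-connections", 48);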
Also, for a device that will not be moving (i.e. it is mounted in a fixed location) you should consider not using Wi-Fi (even a Home-Plug would be a step up as far as latency / stability / dropped connections go).
BTW, the HTTP/1.1 specification (RFC 2616) suggests no more than 2 connections per server:
Clients that use persistent connections SHOULD limit the number of simultaneous connections that they maintain to a given server. A single-user client SHOULD NOT maintain more than 2 connections with any server or proxy. A proxy SHOULD use up to 2*N connections to another server or proxy, where N is the number of simultaneously active users. These guidelines are intended to improve HTTP response times and avoid congestion.
There doesn't appear to be an external way to hack the behaviour of the executables.
You could modify the Chrome(ium) executables as this information is obviously compiled in. That approach brings a lot of problems with support and automatic upgrades so you probably want to avoid doing that. You also need to understand how to make the changes to the binaries which is not something most people can pick up in a few days.
If you compile your own browser you are creating a support issue for yourself as you are stuck with a specific revision. If you want to get new features and bug fixes you will have to recompile. All of this involves tracking Chrome development for bugs and build breakages - not something that a web developer should have to do.
I'd follow #BenSwayne's advice for now, but it might be worth thinking about doing some of the work outside of the client (the web browser) and putting it in a background process running on the same or a different machine. That process can handle many more connections, and you are just responsible for getting the data back from it. Since it is local(ish), you'll get results back quickly even with minimal connections.
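As a sketch of that idea (hypothetical: the choice of Node.js and the /fetch endpoint are mine, not part of the answer above), the browser makes one request to a local process, which fans out to as many upstream connections as it likes:

    // Minimal local fetch-aggregator in Node.js. The browser asks for
    // e.g. /fetch?urls=http://host/1.jpg,http://host/2.jpg and gets one
    // JSON response with every photo base64-encoded.
    var http = require('http');
    var url = require('url');

    http.createServer(function (req, res) {
      var query = url.parse(req.url, true).query;
      var urls = (query.urls || '').split(',').filter(Boolean);
      var results = {};
      var pending = urls.length;
      if (pending === 0) { res.writeHead(400); res.end(); return; }

      function finishIfDone() {
        if (--pending === 0) {
          res.writeHead(200, { 'Content-Type': 'application/json' });
          res.end(JSON.stringify(results)); // one response, many fetches
        }
      }

      urls.forEach(function (u) {
        http.get(u, function (upstream) {
          var chunks = [];
          upstream.on('data', function (c) { chunks.push(c); });
          upstream.on('end', function () {
            results[u] = Buffer.concat(chunks).toString('base64');
            finishIfDone();
          });
        }).on('error', function () {
          results[u] = null;   // record the failure and keep going
          finishIfDone();
        });
      });
    }).listen(8080);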