PhpStorm choking and freezing while editing code

I have experienced a sudden drop in PhpStorm performance. The IDE, which had been performing very well (without any noticeable delay in response), became sluggish and started freezing for a fraction of a second every time autocomplete ran.
Despite 16 GB of available memory (and raising the IDE's maximum heap to 2 GB in PhpStorm's settings), the IDE does not use any of the additional memory and still freezes every few seconds. None of the suggestions found around the Internet (disabling unnecessary plugins, clearing indexes, changing memory settings) helped. Reducing the highlighting level has no effect on the described issues.
Question: is there anything else I can do to restore normal performance without disabling vital IDE functionality?

I have managed to identify the cause of the sudden drop in performance described in the question. It was triggered by adding Gulp libraries to the project, which turned out to be more than PhpStorm could index in real time.
Solution: exclude the node_modules directory from indexing. Go to Settings > Directories, select node_modules and mark it as Excluded. PhpStorm performance should return to normal.
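For reference, the exclusion ends up recorded in the project's .iml module file inside the .idea directory (the exact file name depends on your project); a node_modules exclusion looks roughly like this sketch:

<module type="WEB_MODULE" version="4">
  <component name="NewModuleRootManager">
    <content url="file://$MODULE_DIR$">
      <!-- node_modules is skipped by the indexer once marked Excluded -->
      <excludeFolder url="file://$MODULE_DIR$/node_modules" />
    </content>
  </component>
</module>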

I was also getting this problem, but node_modules was already excluded (I think PhpStorm does this by default).
I have just excluded the location of the compiled files as well. In my case this was the entire public folder (using Laravel and Elixir).
This solved it for me.

I've experienced these kinds of performance problems in the past. Options I've used are:
Increase heap memory to 4 GB (see the sketch after this list)
Exclude folders from indexing
Disable some plugins (last time the problem was with the PHP Inspections plugin)
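A minimal sketch of that heap increase, assuming a recent PhpStorm where Help | Edit Custom VM Options creates a per-user .vmoptions override file; the only line that matters here is the maximum heap:

-Xmx4096m

Restart the IDE afterwards so the new value takes effect.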

I'm using PhpStorm with Laravel projects, so what worked for me was excluding bootstrap/cache as well as the entire storage folder.

Related

How to stop indexing in PhpStorm?

Today I opened my project in PhpStorm.
Then PhpStorm started indexing files and updating on its own, just like that.
Can anyone please tell me how to stop it?
You can stop the synchronizing/indexing that runs each time you switch back to the IDE; this is quite useful when dealing with big projects and an outside build process that triggers indexing.
Just disable the checkbox System Settings -> Synchronize files on frame or editor tab activation.
Make sure you run indexing manually to update hints when needed, via the Synchronize action (Cmd-Alt-Y on Mac).
Pausing indexing temporarily may be an option, but with indexing disabled your project will not be fully indexed and you will be missing some functionality (which is actually what makes PhpStorm useful).
As the JetBrains support thread linked below puts it: "it is a core feature, all WebStorm functionality (syntax/error highlighting, code completion, code inspections, navigation, refactoring, code coverage, search, etc.) is based on it."
https://intellij-support.jetbrains.com/hc/en-us/community/posts/207671405/comments/208579689
You did not say why you want to disable it. Is it using too much CPU (which sometimes happens)?
What you can do instead is:
Exclude some directories from indexing, e.g. logs, automatically generated test files, build directories, unused sources. Either right-click the directory and choose Mark Directory as | Excluded, or go to Settings | Directories and mark the directories as Excluded there. In my experience this usually fixed high CPU usage for PhpStorm in my projects. Restart PhpStorm after that.
Reduce the number of indexing threads: Help | Find Action | Registry | caches.indexerThreadsCount, and possibly also caches.scanningThreadsCount (see https://intellij-support.jetbrains.com/hc/en-us/community/posts/360010215320/comments/360002619339)
Try invalidating caches and restarting: File > Invalidate Caches (I have never needed it, but it might help); see also https://intellij-support.jetbrains.com/hc/en-us/community/posts/207671405/comments/208579689
If you are using several similar projects, the "Shared Project Indexes" plugin may be useful (I have not tried it)

Changing one CLJC-file causes full recompile

In a fairly large project I am working on, I have a separate source folder with .cljc files that are shared between the frontend (ClojureScript) and backend (Clojure).
I noticed that making a trivial change in a .cljc file (e.g., adding a blank line) causes all or most of the .cljc + ClojureScript files to be rebuilt (as listed by the compiler with :verbose set to true). This causes a delay of about 45 seconds on my machine. The strange thing is that this happens even when the .cljc file does not contain any functions and is not require'd by any other CLJS/CLJC file...
Does anybody know what could cause this behaviour? My project file is quite standard, and I use the latest versions (CLJS 1.9.293, lein-cljsbuild 1.1.5 or even 1.1.6-SNAPSHOT).
Unfortunately it is not possible to even temporarily remove these peculiarities from the project just to test this issue.
To answer my own question:
This problem was not caused by the specifics of my project -- it happens with every project, because a CLJ or CLJC file may contain macros that (when changed) cause CLJS files to be implicitly modified. lein-cljsbuild therefore recompiles all CLJS files.
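A minimal sketch of the mechanism, with hypothetical namespaces: a macro defined in a .cljc file is expanded into ClojureScript namespaces at compile time, so a change to the .cljc file logically changes the compiled output of every namespace that uses it, even though no .cljs source was edited; since the build tool cannot easily know which namespaces those are, everything gets rebuilt:

;; src/shared/util.cljc  (hypothetical shared file)
(ns shared.util)

(defmacro add-twice [x]
  `(+ ~x ~x))

;; src/frontend/core.cljs  (hypothetical frontend namespace)
(ns frontend.core
  (:require-macros [shared.util :refer [add-twice]]))

;; (add-twice 21) is expanded during compilation, so a change to
;; shared.util forces frontend.core to be recompiled as well.
(println (add-twice 21))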
For some background discussion, see https://github.com/emezeske/lein-cljsbuild/issues/345
I have filed a bug report at https://github.com/emezeske/lein-cljsbuild/issues/457

How to build Chromium faster?

Following only the instructions here - https://www.chromium.org/developers/how-tos/get-the-code - I have been able to successfully build a Chromium executable which I can then run.
I have been playing around with the code (adding new buttons to the browser, etc.) for learning purposes. Each time I make a change (like adding a new button to the settings toolbar) and build with the ninja command, it takes over 3 hours to finish before I can run the executable. I guess it builds each and every file again.
I have a decently powerful machine (i7, 8 GB RAM) running 64-bit Ubuntu. Are there ways to speed up the builds? (At the moment I have literally just followed the instructions in the above-mentioned link, with no other optimizations.)
Thank you very very much!
If all you're doing is modifying a few files and rebuilding, ninja will only rebuild the objects that were affected by those files. When you run ninja -C ..., the console displays the number of targets that need to be built. If you're modifying only a few files, that should be ~2000 at the high end (modifying popular header files can touch lots of objects). Modifying a single .cpp would result in rebuilding just that object.
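For example (out/Default is just an illustrative build directory name), after editing a single .cpp you can rebuild only the chrome target and let ninja work out the minimal set of objects to recompile:

$ ninja -C out/Default chrome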
Of course, you still have to relink, which can take a very long time. To make linking faster, try using a component build, which keeps everything in separate shared libraries rather than one big binary that needs to be relinked for any change. If you're using GN, add is_component_build=true to gn args out/${build_dir}. For GYP, see this page.
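With GN, that looks roughly like this (again, out/Default is only an example name):

$ gn args out/Default
# in the editor that opens, add the following line, then save and rebuild:
is_component_build = true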
You can also peruse the faster Linux builds tips and see if any of them apply to you. Unfortunately, Chrome is a massive project, so builds will naturally be long. However, once you've done the initial build, incremental builds should be on the order of minutes rather than hours.
Follow the recently updated instructions here:
https://chromium.googlesource.com/chromium/src/+/HEAD/docs/windows_build_instructions.md#Faster-builds
In addition to using component builds, you can disable NaCl, use jumbo builds, turn off symbols for WebCore, etc. Jumbo builds are still experimental at this point, but they already help build times and they will gradually help more.
Full builds will always take a long time even with jumbo builds, but component builds should let incremental builds be quite fast in many cases.
For building on Linux, you can see how to build faster at: https://chromium.googlesource.com/chromium/src/+/master/docs/linux_build_instructions.md#faster-builds
Most of them require adding build arguments. To edit build arguments, see the GN build configuration page at: https://www.chromium.org/developers/gn-build-configuration
You can edit the build arguments for a build directory with:
$ gn args out/mybuild
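As an illustration of the kind of arguments involved (names were current when these answers were written and may have changed since), the resulting out/mybuild/args.gn might combine the suggestions above like this:

# args.gn sketch combining the suggestions above
is_component_build = true            # faster incremental linking
enable_nacl = false                  # skip Native Client
symbol_level = 1                     # fewer debug symbols overall
remove_webcore_debug_symbols = true  # no Blink/WebCore symbols
use_jumbo_build = true               # experimental jumbo build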

Chrome Click&Clean - Not clean?

After running Chrome Click&Clean (set to clean everything), it turns out that the SQLite3 file .config/google-chrome/Default/Cookies is still about 250 KB in size. This is huge compared with the initial size of Cookies after deleting the file and restarting Chrome.
Am I missing something, or does Click&Clean actually NOT do much cleaning?
It doesn't do a hell of a lot; it clears up a lot, but not everything.
CCleaner should definitely clean that out, but be aware that cleaning up can sometimes leave errors logged in the registry from badly coded software trying to call things that no longer exist.
You would not expect the physical file size to change when data is removed, as that would require a full rewrite of the database to disk, which would be extremely inefficient. The size will not change until the browser decides to do some housekeeping (VACUUM). CCleaner can compact Chrome databases on demand.
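If you want to reclaim the space yourself, you can compact the database with the sqlite3 command-line tool while Chrome is completely closed (path taken from the question):

$ sqlite3 ~/.config/google-chrome/Default/Cookies "VACUUM;"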

How to prevent tclIndex collisions?

If multiple Tcl scripts are running in the same directory, they can crash if one tries to run auto_mkindex at exactly the same time as another.
How can I prevent this properly? I do not want to just wrap auto_mkindex in catch, nor do I want to implement a semaphore system for this simple problem.
Why would you be building the tclIndex files at the same time in the first place? That's a step I would expect as part of installation (i.e., something done once as a special action), not as part of operation (i.e., done many times, potentially in parallel). If it's part of installation, it's entirely your own problem if you try to run the code while you're installing it.
I also wouldn't tend to use tclIndex for anything shared between applications, as that's optimized for simple scripts. Shared components are better off made into packages, especially as they're versioned entities, and they have their own indexing mechanism (the pkgIndex.tcl). (Having the same version of the same package installed twice in such a way that things interfere… well, that wouldn't be sensible, would it?)
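A minimal sketch of that package approach, using a hypothetical mylib component: give the shared code a package declaration, generate its pkgIndex.tcl once at install time with pkg_mkIndex, and have the scripts load it with package require instead of racing over a shared tclIndex:

# mylib/mylib.tcl
namespace eval mylib {}
proc mylib::greet {name} {
    return "hello, $name"
}
package provide mylib 1.0

# Run once, at install time, inside a tclsh to create mylib/pkgIndex.tcl:
#   pkg_mkIndex mylib *.tcl

# Consumers then just do:
#   lappend auto_path /path/to/parent/of/mylib
#   package require mylib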