PhpStorm not synchronizing new local files

I have a large project with multiple repositories. Whenever a new file is created in one of my repositories by a build tool like gulp or webpack, the change doesn't show up unless I manually right-click my root directory and choose "Synchronize selected files".
It doesn't matter how long I wait; it only works if I do this. Is there something I could tweak or change to possibly affect this and help me get rid of the problem?
I am syncing with a virtual machine, but edits are lightning fast and I have caching turned off.

Related

How to only download specific files from remote server instead of whole project?

I created a new project from remote sources and entered my server's data. PhpStorm instantly began to download the whole Magento project, even though I only need specific files for development; e.g. I don't need all the images or cache folders for my purposes. Right now it takes 4-5 hours to download the whole project.
In NetBeans you can choose which folders you want to download to your machine. Is this also possible in PhpStorm?
Yes, it's possible.
You should have marked such unwanted folders as "Excluded from Download" --
see the official help page for the appropriate wizard step.
Other ways of creating a project:
Just create an empty local project and then configure the rest manually (deployment etc.); once done, use "Browse Remote Host" and download the folders/files you need.
Another way -- download all the needed files locally first using your preferred program (e.g. FileZilla) and then just point to the project root folder in the "Open" dialog -- the IDE will create a new project from those files.

Sync to remote diff by content stuck on loading

I've been trying to make the switch from Sublime Text 3 with the SFTP package to PhpStorm. I'm having a big issue with syncing my local repo with the remote server. Sublime allowed me to do that nearly flawlessly with just one press of a button. In PhpStorm, however, when I try to sync with the remote and set it to diff by file content, the diff window gets stuck on scanning (loading the files) after a while. If I remove the remote folder that takes so long to load, it just gets stuck on another one. I would greatly appreciate any advice!

PHPStorm cache on downloaded files?

So I've used PhpStorm before, and have been asked (along with some other coworkers) to evaluate how effective it would be for my current company, as I already had my own private license. However, I'm hitting a bit of a snag that I really don't think should be a show-stopper.
Anyway, the way my company has its development environments set up right now is a bit odd. We check everything into Subversion, into different directories from where it will end up on the client's system, because we ship everything as Debian packages. This makes working with the files directly from Subversion difficult, as PhpStorm has no idea where related files are located.
Because of this, the files on our development virtual machines are not directly under Subversion. Instead, we patch up our virtual machines by installing the updated packages when needed.
This makes life difficult for an IDE, which wants to keep a local copy of the files on your system. The best way I can figure out to handle this is to run a synchronize between the remote server and the local copy (going by timestamp and size should be fine, and it completes in less than a minute). It would be fine to tell developers "after you patch, make sure you sync with PhpStorm".
However, the problem I'm having is that if I modify a file on the remote system and sync (and it says the file downloaded), it takes several minutes after opening the file for the remote changes to be visible in PhpStorm.
I have no idea why this happens, and it could potentially lead to really bad results if someone makes a few quick changes, saves, and overwrites the needed files.
I'm currently running phpstorm on Ubuntu 14.04 64-bit
Any help would be appreciated

PhpStorm and projects with remote files, keeping environments mirrored

I've been using PhpStorm for a while now, but I'm a bit confused about the right way to work on projects with remote files.
At the moment I've created a project (FTP) and downloaded everything from the deployment server. When I save a file it gets uploaded automatically, but there are situations where I'm working from another device and using a different approach to modify files.
Right now I re-download the entire deployment server, but this seems like overkill. Is there a method like 'sync' to download/modify only the files that changed on the remote? I know there is an item in the context menu, 'Synchronize "Project name"', but this doesn't seem to do anything.

Synchronize databases with git deployment

So I own a VPS running CentOS, and decided to use git for deployment. Man! That's fun. Push, done!
I'm really happier than I was with the old FTP approach.
But I wish I could go further. Today it automagically deploys all my files, but it doesn't touch my DB at all, and if I change it in the mods I have to update it manually. So I was thinking about using some git hooks to do this automatically as well.
Right now I'm using one git hook on the server: a post-receive hook that basically copies the files to the production directory when master is pushed.
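To make the setup concrete, here is a minimal sketch of such a post-receive hook, written as a PHP CLI script (a git hook can be any executable); the repository path, work tree and branch name are placeholder assumptions, not the asker's real layout:

    #!/usr/bin/env php
    <?php
    // post-receive: check master out into the production directory on every push.
    // Assumed paths -- adjust to your server layout.
    $gitDir   = '/home/deploy/site.git';  // bare repository receiving the push
    $workTree = '/var/www/production';    // directory the site is served from

    // git feeds "old-sha new-sha ref-name" lines on stdin, one per updated ref.
    while (($line = fgets(STDIN)) !== false) {
        [$old, $new, $ref] = explode(' ', trim($line));
        if ($ref !== 'refs/heads/master') {
            continue; // only deploy pushes to master
        }
        $cmd = sprintf(
            'git --git-dir=%s --work-tree=%s checkout -f master 2>&1',
            escapeshellarg($gitDir),
            escapeshellarg($workTree)
        );
        exec($cmd, $output, $status);
        echo $status === 0 ? "Deployed $new to $workTree\n" : "Deploy failed ($status)\n";
    }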
The prerequisites for the DB deployment are:
It needs to go both ways: if I pull and the DB in the repo differs from my local one, it should update my local DB.
It should be based on modifications and patches, not a dump of the whole DB; this way I can work with the team without clobbering other people's work.
I was thinking about keeping a db.sql under version control and writing a script that analyzes it on post-receive (on the server) and post-merge (locally), so it can pick up the mods and apply them, and I would keep a record of which mods have already been applied (the script should run on both client and server).
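As a rough sketch of what that script could look like (assuming the mods are kept as numbered .sql patch files rather than a single db.sql, and a MySQL database reachable via PDO; the paths, table name and credentials below are made up):

    <?php
    // apply_db_mods.php -- apply any .sql patches that have not been run yet.
    // Intended to be called from post-receive on the server and post-merge locally.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'app_user', 'secret');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    // Remember which patches have already been run against this database.
    $pdo->exec('CREATE TABLE IF NOT EXISTS applied_mods (
        name VARCHAR(255) PRIMARY KEY,
        applied_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    )');
    $applied = $pdo->query('SELECT name FROM applied_mods')->fetchAll(PDO::FETCH_COLUMN);

    // Patches live in db/mods/ and are named so they sort in order, e.g. 0001-add-users.sql.
    // Each file is assumed to hold a single SQL statement; split on ';' if yours hold more.
    foreach (glob(__DIR__ . '/db/mods/*.sql') as $file) {
        $name = basename($file);
        if (in_array($name, $applied, true)) {
            continue; // already applied on this machine
        }
        $pdo->exec(file_get_contents($file));
        $pdo->prepare('INSERT INTO applied_mods (name) VALUES (?)')->execute([$name]);
        echo "Applied $name\n";
    }

The post-receive hook would call this after checking out the work tree, and a post-merge hook on each developer's machine would run the same script against the local DB, so both sides stay in step without ever shipping a full dump.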
Have any of you already done something similar to this? What would you recommend?
Thank you very much in advance!