Get changes from Mercurial to FTP site

I work with a partner on a PHP site for a client. We have a common Mercurial repository (on Bitbucket), local copies for both of us, and the live site. We have only FTP access to the live site (which can't be changed, since it is a hosting package with FTP only).
I want to be able to push changes from the repository to the live site.
Until now I have simply kept track of changed files in the repo and copied them manually with FileZilla - an error-prone and annoying task. My idea is to mount the remote location locally (e.g. using CurlFtpFS) and tell Mercurial to automagically copy changed files to the site. Ideally I want to be able to specify which changes, but that would be a bonus. It would be sufficient if the local state of the files within the repo were synced.
Is there any good way to do this using Linux command-line tools?

My first recommendation is, if at all possible, get a hosting package that allows more access. FTP-only is just brutal.
But since you are looking for a real answer to your question, I have two ideas for you:
I would suggest looking into the Mercurial FTP extension. I personally have never used it, since I haven't gotten myself stuck in an FTP-only situation for a long time, but it looks promising. It looks like if you make sure to tag your production releases, it will work really well for you (make sure to use the -uploaded param).
Also, if you only ever want the tip to be installed on your production environment, then you could look at the suggestion Martin Geisler made on the Bitbucket user group a few days ago. Basically his suggestion is to utilize Bitbucket's "ping URL" functionality. You would have to write a server-side script/URL handler that accepts that ping, fetches the tip from Bitbucket (as a zip) and then unzips/unpacks it. This is a bit complicated, but if you are looking for complete automation and the tip is always what you want deployed, this could work for you.

One notion is to use the hg archive command:
hg archive /path/to/curlftpfs
which will put a snapshot of your repo in that location -- it will however overwrite any file already there.
Another option is to create a Mercurial clone in that same /path/to/curlftpfs and then just do hg pull; hg update in it on your local system with the remote filesystem mounted. Setting that up initially will mean transferring the whole thing, but subsequently you'll only be sending deltas.
Some folks don't like this last option because it also exposes your entire .hg directory, but you can block access to that at the web server.
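If the host runs Apache (an assumption on my part; adjust for other web servers), a one-line .htaccess rule at the document root is usually enough to hide the repository metadata:
# hypothetical .htaccess fragment; answers any request under .hg with a 404
RedirectMatch 404 /\.hg(/|$)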

I came across this problem a while ago after switching from AWS to a local web host that provides only ssh/ftp.
My previous approach of updating a production site on AWS with "hg pull; hg update -C" can no longer be used on the new host, since they don't have Mercurial installed for shared hosts.
So what I did was mount the remote location over FTP onto a local machine (e.g. your laptop), then run the hg pull and hg update commands locally at the path where the remote FTP site is mounted.
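A rough sketch of that workflow, assuming a clone already lives in the site's document root; the hostnames, credentials and paths below are placeholders:
# mount the live site over FTP
curlftpfs ftp://user:password@ftp.example.com /mnt/livesite
# run Mercurial against the clone that lives inside the mount
cd /mnt/livesite/htdocs
hg pull https://bitbucket.org/you/project
hg update            # or: hg update -r <rev> to deploy a specific revision
# unmount when finished
cd ~ && fusermount -u /mnt/livesite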

Windows solution:
Beyond Compare (http://www.scootersoftware.com/) is an awesome piece of software. Apart from being awesome, it can mirror your local folder to the FTP site. It compares files and only transfers what's new.

Related

Web development scheme for staging and production servers using Git Push

I am using git to manage a dynamic website (PHP + MySQL) and I want to send my files from my localhost to my staging and production servers in the most efficient and hassle-free way.
I am currently convinced that the best way for me to approach this problem is to use this git branching model to organize my local git repo. From there, I will use the release branches to push to my staging server for testing. Once I am happy that the release code works on the staging server, I can then merge with my master branch and push that to my production server.
Pushing to Staging Server:
As noted in many introductory git posts, I could run into problems pushing into a non-bare repo, so, as suggested in this response, I plan to push the release branch to a bare repo on the server and have a post-receive hook that clones the bare repo to a non-bare repo that also acts as the web-hosted directory.
Pushing to Production Server:
Here's my newest source of confusion...
In the response that I cited above, I was curious as to why @Paul states that it's a completely different story when pushing to a live production server. I guess I don't see the problem. Would it be safe and hassle-free to follow the same steps as above, but for the master branch? Where are the potential pitfalls?
Config Files:
With respect to configuration files that are unique to each environment (.htaccess, config.php, etc), it seems simplest to .gitignore each of those files in their respective repos on their respective servers. Can you see anything immediately wrong with this? Better solutions?
Accessing Data:
Finally, as I initially stated, the site uses MySQL databases to store data. How would you suggest I access that data (for testing purposes) from the staging server and localhost?
I realize that I may have asked way too many questions for a single post, but since they're all related to the best way to set up this development scheme, I thought it was necessary.
Pushing to the production server
I assume that in the response you quote, the answer refers to pushing to the production server as "a different story", just because one can push any old commit to the staging server for testing, but you would be very careful only to push a thoroughly tested version to the production server.
I think the approach you refer to (deploying by pushing to a bare repository with a post-receive hook that does git checkout -f with an appropriately set GIT_WORK_TREE) is a good one for deploying from git.
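For illustration, a minimal post-receive hook along those lines, assuming the bare repository lives on the server and /var/www/site is the web root (both paths are placeholders):
#!/bin/sh
# hooks/post-receive in the bare repository
# check out the pushed branch into the web root; -f discards any stray edits there
GIT_WORK_TREE=/var/www/site git checkout -f master
Remember to make the hook executable (chmod +x hooks/post-receive), otherwise nothing happens on push.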
Config Files
That is a reasonable plan, but you have to be somewhat careful about using .gitignore to ignore configuration files - you might want to look at this answer for more about this:
How can I have different versions of a file in the local working directory, remote working directory and git ftp target?
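As a minimal sketch of what you propose (this only works cleanly if the files are untracked on each server; see the caveat above for files that are already tracked):
# hypothetical .gitignore entries in each server's repository
.htaccess
config.php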
Accessing data
I think the question about data for your staging server really is a separate issue, since none of that data will be in your version control system - it might be worth adding another question here about that. You could have a script that dumps data on your live server and imports it into the staging server, but I can think of many situations in which that would be undesirable, particularly where customer details and data protection laws have to be considered.
The Git FAQ recommends this post-receive hook script to reset the head of a non-bare repository after it is pushed to. It will save any uncommitted changes on the remote using a stash. Personally, I'd rather it reject the push in that case, but that can be done.
(please note: Lots of answers contain out of date links to the FAQ and the script - hopefully these will remain valid for some time at least)
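For what it's worth, a sketch of the rejecting variant mentioned above, assuming the remote is a non-bare repository whose working tree is the deployment directory (this is illustrative, not the FAQ script itself):
#!/bin/sh
# hooks/pre-receive in the non-bare remote repository
# push hooks run inside .git, so step out to the working tree first
unset GIT_DIR
cd ..
# refuse the push if anything in the working tree is modified or untracked
if [ -n "$(git status --porcelain)" ]; then
    echo "Refusing push: working tree has uncommitted changes" >&2
    exit 1
fi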
I use git flow too. For config files in ExpressionEngine, we use EE Master Config, which basically determines the environment it's in and applies a specific config. I imagine it could easily be modified for whatever you're doing.
For deployments, we use Beanstalk which allows you to add "[deploy:Environment]" to a commit message, which will make it upload (ftp) your specified branch (the one you commit to) to the specified environment, which you configure in their web interface, when you git push.
I've been trying to find an effective solution for .htaccess files that will allow me to htpasswd one of my environments, but not all. It looks like it's possible in Apache 2.3 with something like this:
<if "%{HTTP_HOST} == 'dev.example.com'">
# auth directives
</if>
but sadly, most of the production servers we use are running an earlier version, which doesn't support the directive :(

Workflow for using TextMate/Coda with Transmit and Versions

I use TextMate for my HTML, PHP, JS and other languages, and CSSEdit for my CSS.
I want to integrate TextMate with Transmit better because at the moment I work like this:
TextMate: Edit code
Transmit: Look for folder and drag to online server
Firefox: Refresh page
Rinse, Repeat.
It feels very clunky to me, and I do the same with CSSEdit (although CSSEdit's live preview means I only have to upload once). What I would like is for Transmit to upload the edited document, on save, to the relevant place on the server (given that linked browsing is enabled).
Does anyone have a particular workflow they follow, or macros set up in TextMate, to do such tasks? They would certainly make my life a lot easier. Coda is also an option instead of TextMate if needed.
Being able to have Versions/Git-Tower auto commit on save would be great too.
I recommend @Adam's solution for the uploading part of your question, but why are you using Git and Transmit simultaneously? Why not Git for everything?
My workflow:
On my machine I keep a Git repository where I do all the work. The working directory is served by MAMP so that I can test my code before committing anything.
When I'm satisfied I commit my latest changes until I think the branch I'm working on is stable.
When I'm ready, I push to the server, where a post-receive hook checks out the latest version to the "pre-prod" server.
When everything has been tested to death, branches merged and so on, I manually check out the repository to the "prod" server.
No need to use an FTP client at any point, everything is done from the editor (TextMate before, Vim now).
If you set up a site in Transmit, and open the local directory that holds your files, you can activate the Textmate Transmit bundle by typing ctrl-shift-f. Then hit either 1 or 2. 1 will upload the current directory, 2 will send the current file.
You might consider using Transmit's ability to mount FTP servers as volumes and simply edit the files directly on the server. To TextMate the mounted FTP server will appear to be just another volume. Search the help files for Transmit Disk, their name for this feature.

Why doesn't Mercurial support remote repository creations over HTTP?

I know it is not possible to create Mercurial repositories remotely using HTTP(S), for instance:
$ hg init https://host.org/repos/project
or
$ hg clone /path/to/local/project https://host.org/repos/project
But, what's the reason? Security issues? No need for it? Simply because nobody has implemented it yet?
Rationale for this question: In my company we share most resources via HTTPS, i.e. access permissions are managed by Apache only and regular users cannot log in via SSH on the server. That's just perfect as long as repositories only need to be served (for that purpose we are happy with hgwebdir.cgi). However, we also want to allow remote creation of repos, without the need to maintain additional/patched scripts on the server and extra tools on clients.
To be clear: This question does not ask for solutions to our particular problem but for the reason why Mercurial does not support this feature itself.
UPDATE
Here's a more technical description of the situation I'm thinking of. Suppose hgwebdir.cgi serves a collection of repositories in /path/to/repos at https://.../repos (with pushing enabled). Every user allowed to access this URL (as configured in Apache) may pull and push changesets; effectively this means that hgwebdir.cgi (and thus hg) edits and creates files below /path/to/repos. Now, what's the barrier to letting hgwebdir.cgi also create new repositories below /path/to/repos?
I think the reason is that adding support for creating repositories will bring in a fair amount of baggage:
if you can create repositories, you would expect to be able to delete them. While that might seem simple, it would be a big step away from the safe manner in which Mercurial normally works -- there are no destructive commands in standard Mercurial.
people would also want to edit the .hg/hgrc files to set the description and contact information -- standard Mercurial never changes the config files, so this would again be a new thing.
people would also want to manage users' access to the new repositories -- this means editing .htaccess files or the equivalent for other webservers.
... and so on. Implementing this "little" feature would open the door to a lot of extra feature requests, and we only have a few Mercurial developers who are also savvy web developers.
However, there is now an excellent open source solution: Kallithea gives you a "mini-Bitbucket" that you can deploy on your own server. It will do all of the above. I would install that on my server if I needed something more powerful than plain hgweb.cgi. It supports both Mercurial and Git.
As far as I know, none of the SCM alternatives allow the creation of remote repositories natively. SVN, CVS, Git, et al.
That's usually the job of a hosting provider: SourceForge, Google Code, Bitbucket. All of them implement repository creation on top of their authentication infrastructure.
For example, Debian's Mercurial hosting is limited to Debian Developers, and to create a new repository you need to login via SSH to the server and create the repository on your local home folder, much like Apache's public_html directory.
Various answers (including your own) give some pretty good reasons why the functionality isn't there (mostly separation of concerns), but if you really want to add it you could do so with just a line or two of shell. Here's a hideously unsafe example I gave quite a while ago showing how to add that functionality in high-trust environments: Remote Repository Creation in Mercurial over HTTP
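In the same high-trust spirit, here is a sketch of what such a handler can boil down to: a tiny CGI script next to hgwebdir.cgi. The file name, parameter handling and crude name filter are all illustrative, and there is no authentication here beyond whatever Apache already enforces for the URL.
#!/bin/sh
# hypothetical mkrepo.cgi - create a repository below /path/to/repos
echo "Content-Type: text/plain"
echo
# the repository name arrives as the query string, e.g. ...?newproject
name=$(printf '%s' "$QUERY_STRING" | tr -cd 'A-Za-z0-9_-')
hg init "/path/to/repos/$name" && echo "created $name"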

Is there a good (gitorious-like) server for mercurial?

At the company where I work we are using hg as (d)vcs.
Most of the repositories in use are kept in a centralized space and served via hgweb.
For ease of use and a better user experience (and overview), I'd like to have something like gitorious (GitHub, Bitbucket).
It should allow
hg as backend (or else I'd install gitorious...)
local installation (not per developer, but locally on our site / not hosted)
easy (web-based) repository-creation
personal forking (cloning, but keeping the new repo physically on the same server)
merge requests
A good tool that serves Mercurial is RhodeCode. It looks really good and has user management, grouping, LDAP integration, hook control and some graphing options.
The current release (1.3.x) supports git repositories.
You should make this decision after looking at the PublishingRepositories wiki page.
My preferred solution is to use the hg-ssh script that already comes with your mercurial install. It makes it very easy to give multiple people ssh access without creating a separate system account for each, and without giving them shell access. It's very easily configured in the .ssh/authorized_keys file of the single shared user.
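For example, a single line per developer in the shared user's ~/.ssh/authorized_keys does the trick; the key and repository paths below are placeholders, and the paths listed are the repositories that key may access:
# ~shared/.ssh/authorized_keys
command="hg-ssh main/repo my-personal/repo",no-port-forwarding,no-X11-forwarding,no-agent-forwarding,no-pty ssh-rsa AAAA... dev@example.com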
Repository creation isn't web-based, but it's very easy and personal forking is completely supported:
hg clone ssh://shared@server/main/repo ssh://shared@server/my-personal/repo
I then set up the hgweb script that comes with mercurial to provide a read-only view, and rely on ssh:// for all writes (though hgweb also does writes / push just fine).
If you really think web-based repo creation is easier than one-line ssh-based creation, I've previously written a stupidly simple script to do so:
http://ry4an.org/unblog/UnBlog/2009-09-17
Someone is going to suggest "mercurial server", and I'd recommend against it. It's not current and never added much value over ssh.
BitBucket.
They are the official HG host, and are actually very good.
I'm completely biased, since I'm a developer on it, but Kiln does a very good job helping you create and manage repositories. It also has code reviews and is commercially supported. You can install on your own server, or Fog Creek will host it for you.

Mercurial website VC over HTTP or FTP - no SSH

I would like to have a mercurial repository on my website so that I can push/pull as I make updates to it, but I do not have SSH access, only HTTP or FTP.
Can this be done?
I suspect no, since I would not be able to run hg on the server, so I would only be able to clone it.
Yeah, this can definitely be done. You don't need ssh access to install mercurial or to access it. You compile it on your own system and then FTP up the resulting file. The only real requirement is that the website to which you upload it allow CGI applications to run. Without that you're limited to the static-http repositories, which don't support pushing.
You're correct. If you can't install software, you could use the static-http option, but it only supports cloning/pulling. See the comparison of publishing mechanisms.
You can set up hgweb to allow pushes.
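The relevant knobs live in the repository's .hg/hgrc (or the hgweb configuration); a minimal example, assuming you are willing to accept pushes over plain HTTP:
[web]
# allow pushes from any authenticated user (or list specific usernames)
allow_push = *
# only needed if the host cannot serve the repository over HTTPS
push_ssl = false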
I'm also using only FTP to update my site, but I keep all the files in a local Mercurial repo (you could use a repository on Bitbucket - they have a free plan that includes 1 private repo and 1 GB of available space). When you need to add/update something, apply the changes locally, update the Mercurial repo, and then use FTP to update the website.