Is there a way to commit selected chunks of files non-interactively with Mercurial?

When using mercurial, I know we can use hg commit --interactive to select which chunks of which files to commit. Is there any way to do this non-interactively, perhaps by preparing a patch file then applying it?
I'm thinking of making a vim plugin to handle mercurial commits, and it would be nice if there was a way to commit only specific chunks after the user selects them. For git, I think we can use git apply with a patch file, and there are many plugins which provide this feature for git (but none that I know of for mercurial).
I'm looking for something that exists in Mercurial's public CLI interface, since the Mercurial developers discourage using the Python hg libraries directly (they're not considered a stable API).
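One non-interactive approach, sketched below, is to save everything as a patch, revert the working directory, re-apply only the chunks the user picked, commit, and then restore the rest. The file names and the hunk-splitting step are up to the plugin; only hg diff, hg revert, hg import --no-commit and hg commit are needed from the CLI:
hg diff > full.patch                    # backup of every uncommitted change
# (the plugin writes the selected hunks to selected.patch and the rest to rest.patch)
hg revert --all --no-backup             # clean the working directory; full.patch is the safety net
hg import --no-commit selected.patch    # re-apply only the chosen hunks
hg commit -m "commit only the selected chunks"
hg import --no-commit rest.patch        # put the unselected changes back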

Related

How to visually (side by side) view a patch generated by Mercurial Queues on a repo?

I use Mercurial Queues to create patches while I work on the repository.
So, let's say after I'm done with a patch, I do hg qrefresh and export the patch to some file I want.
So, now I have a patch file.
How do I view this patch visually, side by side, to see the changes I made to the original file(s) in the repository?
I know of one direct way: Keep copies of all the files before I edit and use kdiff3 or meld when I'm done. But this is clearly very time consuming and not straightforward.
If you still have the patch in MQ you can do a side-by-side view using the ExtDiff extension, which you already have installed but probably haven't enabled.
Then you'd:
hg extdiff .... -r revision_before_patch -r revision_including_patch
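For example, assuming kdiff3 is installed and that the patch you care about is the topmost applied MQ patch (so MQ's qparent and qtip tags bracket it), you could enable the extension in your hgrc:
[extensions]
extdiff =
and then run:
hg extdiff -p kdiff3 -r qparent -r qtip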
On the other hand if you want to be able to do it from just original files and a .patch file you'll need to find a diff program that takes an original and a patch instead of an original and a result. My (old!) favorite is xxdiff.

Converting a Mercurial repository to Bazaar

Is there an easy way of converting an existing Mercurial repository to Bazaar without losing any history? If I convert Mercurial to Subversion to Bazaar will I lose any history?
You need to use the bzr-fastimport plugin. It has an hg-fastexport helper that dumps your Mercurial history to a fastimport stream, which can then be imported into a Bazaar branch.
The entire history should be preserved this way. There is one type of information that will be lost, though: file copies, because bzr does not support them.
Another option is to use the bzr-hg plugin, which should be able to work directly with Mercurial repositories. Since you're asking for an easy way, I suggest trying bzr-hg first.
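If bzr-hg behaves like Bazaar's other foreign-branch plugins (an assumption on my part, so check its documentation), the conversion could be as simple as branching straight from the Mercurial repository:
bzr branch /path/to/hg/repo converted-bzr-branch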
According to the help for hg convert ("hg help convert"), it only converts to Mercurial or Subversion as the destination type (Bazaar is only supported as a source repository). If you decide to go Mercurial -> SVN -> Bazaar using "hg convert", the help file says history on branches isn't preserved.

Is it possible to use the Mercurial Fetch extension with the TortoiseHg merge dialog?

I'm trying to introduce Mercurial at the company where I work. Previously we used ClearCase, but for various reasons we decided to switch to Mercurial. The development team is very accustomed to the ClearCase workflow, especially the visual tools, so for our rollout we will be using TortoiseHg.
A Mercurial extension that caught my attention is the Fetch extension, which lets you do an "hg pull -u" followed by "hg merge" and "hg commit" when necessary. The extension basically does what we want and integrates nicely with TortoiseHg when configured to run automatically after a pull ("Post Pull").
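For reference, the configuration behind that is roughly the following; the postpull setting name is what I believe the TortoiseHg settings dialog writes, so double-check it there:
[extensions]
fetch =
[tortoisehg]
postpull = fetch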
The only problem is that the Fetch extension does not let you compile and test the merge result before it commits. Using the TortoiseHg merge dialog ("hgtk merge -r tip") instead gives a visual way to run the merge, with the advantage that I can compile and run the tests first. If all goes well I press the Commit button; if something goes wrong I just press Undo and everything is back the way it was.
TortoiseHg Merge Dialog: http://www.freeimagehosting.net/uploads/a2f43fe5ff.png
So, my question is:
Is it possible to use the Mercurial Fetch extension with the TortoiseHg merge dialog?
If it's not possible, how would you recommend implementing this workflow? Is there a way to assign an alias to this sequence:
hg pull -u
* if merge is needed *
hgtk merge -r tip
I am not sure whether you can have an "if" in an alias, but I think not.
To answer your last question: no, there is no way to make such an alias with the command-line version of Mercurial or with TortoiseHg.
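The closest approximation I can think of is an external shell script rather than an alias; a rough sketch (the heads check is my own idea, not a TortoiseHg feature):
#!/bin/sh
hg pull -u
# if the current branch now has more than one head, the update above could not
# be performed and a merge is needed, so open the TortoiseHg merge dialog
if [ "$(hg heads --template 'x' .)" != "x" ]; then
    hgtk merge -r tip
fi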

What's the best way to get a copy of the tip of a mercurial repository?

I want to do the equivalent of svn export REMOTE_URL with a mercurial repository. What I want at the end is an unversioned snapshot of the repository at the remote URL, but without cloning all of the changesets over to my local machine.
Also, I want to be able to specify a tag in the remote repository to pick this from. If it's not obvious, I'm building a release management tool that pulls from a canonical mercurial repository to build a release file, and it's slow right now because some projects have large, multiple-version binary files committed.
Is this possible? How would one go about it?
It's usually easier (if the remote Mercurial is served through the hgweb interface) to just visit the repo in your browser and download a .tgz / .zip / .bz2 of the tip revision. You'll see the links if the remote server supports this.
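For example, if the remote hgweb has archiving enabled (the allow_archive setting), the archive URLs usually follow this pattern, and a tag name works in place of tip (the host and repo name here are placeholders):
wget http://hg.example.com/repo/archive/tip.tar.bz2
wget http://hg.example.com/repo/archive/sometag.tar.gz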
If you want the repository, you need all of the revisions that went into the current tip for it to be at all functional.
There are options to hg clone that allow you to fetch a repository up to a certain revision, but none (that I could find) that allow you to get just the tip revision. What you are essentially asking for is a snapshot of the repo.
Edit: To Get A Snapshot
hg clone http[s]://url.to.repo repo.hg
cd repo.hg
hg archive ../repo-snapshot
cd ..
rm -rf repo.hg
The snapshot is now in repo-snapshot.
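If you need a specific tag rather than the tip (which was part of the question), hg archive also takes a revision, so the archive step becomes something like this, with the tag name as a placeholder:
hg archive -r sometag ../repo-sometag-snapshot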
Yes, this does entail cloning the repo first, which is why I suggested seeing if the remote hgweb supports on the fly downloads of any particular revision. If it does, your problem is solved with something like curl or wget instead of HG.
If not, it's good to let the original repo 'live', since you can update it again later via hg pull and then create another snapshot of a future release. This saves having to start over from scratch when cloning, especially for large repositories with lots of changes.
Also, this is Linux-centric, but you get the gist. Of course, replace http[s] with the desired protocol as needed.
Is there any reason you can't maintain a mirror of the remote repository on your local machine (updated in the background however often you want), then have the release management tool run hg archive out of that local clone as necessary? If your concern is user responsiveness, and not total bandwidth/storage consumed, this offsets the "slow" part to where you won't see it.
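Something along these lines, with the paths and the tag as placeholders:
hg pull -R /path/to/local/mirror                      # refresh the mirror (e.g. from cron)
hg archive -R /path/to/local/mirror -r sometag /tmp/release-sometag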
Tim Post noted that if you do have the hgweb CGI interface available, you can configure it to serve compressed archives that you pull down and unpack (and the interface is consistent enough that you could script that via wget), but if you don't, core Mercurial doesn't have a lot of tools to help you, and the developers have expressed opposition to trying to turn Mercurial into a general rsync-type client.
If you aren't afraid of playing with unofficial add-ons, you could have a look at the FTP Extension. That will force you to push from the server, however.

Is it possible to checkout a single directory from a Mercurial (HG) repository?

So, I'm trying to check out just the TestNG plugin from the NetBeans contrib repository. (Or is it a module? I'm new to Mercurial, so I don't really know the lingo yet.)
When I run the following command...
hg clone http://hg.netbeans.org/main/contrib/
...I get the entire repository, which contains all of the contrib plug-ins. Is it possible to just pull this location?
http://hg.netbeans.org/main/contrib/file/tip/testng/
Thanks!
This concept is called "narrow cloning" and no, it's not possible at the moment in Mercurial.
It's on the radar of some of us who contribute to Mercurial, but it's a hard problem to solve. For example:
How do you calculate the hash of any new commits you make if you don't have all of the files in the repo?
What happens if you try to view the history of a file in contrib/testng if that file was moved from another folder?
I'm not sure, but I think the answer in the general case is "probably not".
If the repository is local (it doesn't sound like it is in your case), you can do something like:
hg archive -R /path/to/my/repo -I /path/to/my/repo/folder/i/want export-folder-name
(The command would need to be something that exports non-VC'd files, rather than creating a partial repo, since the .hg stuff is stored once at the toplevel, rather than in pieces in each folder as SVN does.)
It doesn't work on remote repositories, though. Neither does "hg log", and the hg folks explained why:
Imagine I send a log -p command to http://www.kernel.org/hg/linux-2.6, which is
approaching 100k changesets. At one diff per second (lots of seeking), this will
take about 3 hours of CPU/disk time on the server, nevermind metric tons of
bandwidth. It would be faster and simpler for everyone just to clone the repo
and do the log locally.
I suspect hg archive can't work remotely for the same reason.
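So in practice the closest you can get is probably a full clone followed by a filtered archive. Applied to the NetBeans example it would look something like this (the path: pattern, which is rooted at the repository, is my choice for selecting just that folder):
hg clone http://hg.netbeans.org/main/contrib/ contrib
hg archive -R contrib -I 'path:testng' testng-export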