Re-syncing Mercurial repositories after a machine re-installation - mercurial

I just had to re-install the system drive on my PC.
As a result I've a load of Mercurial repositories on another drive that have lost their association with the repository server.
All the local repositories are up to date as I'm the only one working on them and everything was synched before the PC's re-installation.
I'm using TortoiseHg and Bitbucket.
I don't want to have to delete the repositories and re-clone them to get it all working again, as this strikes me as a long way round and a needless use of bandwidth.
Is there a simple way to tell a local repository to re-connect to the repo on the server?
Thanks, Matt

Yeah, you just edit the default entry in the [paths] section of the repo's .hg/hgrc file. You can see what it's currently set to with the hg paths command. Correct it and you're good to go.
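For example, a minimal .hg/hgrc for a Bitbucket-hosted repo might look like this (the URL below is just a placeholder for your actual repository address):
[paths]
default = https://bitbucket.org/yourname/yourrepo
Running hg paths from inside the repository afterwards should print the corrected URL, confirming the change took.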

Related

How to automatically keep remote mercurial repository at tip after pushes

I created a mercurial repository on some file server's net share.
Is it possible to automatically get the remote repository updated to tip when somebody pushes their changes?
Because some other people (purely users) may copy the repository's content (rather than cloning, because of the lack of .hg) and I want them to get the newest version.
Since it is a share on a simple NAS it would be good if the pushing client could invoke this update.
It seems that a hook on the changegroup event can solve this.
Add the following lines to the repository's configuration file (repo/.hg/hgrc)
[hooks]
changegroup = hg update
This solution was suggested on a slightly different question:
Cloning mercurial repo to the remote host
At least under Windows this seems to work only on local repositories. The reason is that hg tries to run a command on the remote path, which fails because cmd does not support UNC paths as the current directory.
Adding the repository URL explicitly fixes this, but then it's not client-independent anymore.
[hooks]
changegroup = hg update -R %HG_URL%
You could treat the server repository as your "local working directory" and then PULL from your own PC to that location. If you use hg pull --update then it will automatically update the working folder to the latest.
One way to do this is to login to your NAS and physically run the hg command line program there. If not, you could also mount the NAS folder on your local PC and then chdir to its mapped local folder and use your local hg client to do so.
This might seem like an odd thing to do but Mercurial doesn't care which is the "clone" and which is the "server", you can swap them interchangeably in your workflow.
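As a minimal sketch of that approach, assuming the share is mounted locally as Z:\shared\project and your working repository lives at C:\work\project (both paths are placeholders):
hg --cwd Z:\shared\project pull --update C:\work\project
This pulls the new changesets into the copy on the share and updates its working directory in one step, so anyone copying the folder's contents always gets the latest files.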

Mercurial repository corruption repair says it's not a mercurial repository?

So I have managed to corrupt my Mercurial repo, and I am following the steps from the repository corruption page on the wiki to repair it.
When I run the convert command:
hg convert --config convert.hg.ignoreerrors=True REPO REPOFIX
It gives me the following output:
initializing destination REPOFIX repository
REPO does not look like a CVS checkout
REPO does not look like a Git repository
REPO does not look like a Subversion repository
REPO is not a local Mercurial repository
REPO does not look like a darcs repository
REPO does not look like a monotone repository
REPO does not look like a GNU Arch repository
REPO does not look like a Bazaar repository
cannot find required "p4" tool
Why on earth would it say that? And how can I go about fixing it?
It definitely is a Mercurial repository, it's hosted on Bitbucket, and I am using TortoiseHg to manage it.
Edit:
I think maybe I can't do this against a remote repository? How can I go about fixing this then?
You probably did not corrupt the remote repository at Bitbucket, did you?
It's more likely you corrupted your local copy, and so you can just clone it from Bitbucket again or try the hg convert … trick on your local copy (i.e. the folder you manage with TortoiseHG).
A bit late, but I faced the same issue. The mistake was running that command inside the project folder. You have to run the command from outside the folder that contains the .hg directory. I could not find a way to move up a directory in the TortoiseHg console, so I used the Windows terminal.
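For illustration, assuming the damaged repository lives at C:\projects\REPO (a placeholder path), you would run the convert from the parent folder, like so:
cd C:\projects
hg convert --config convert.hg.ignoreerrors=True REPO REPOFIX
The repaired history then ends up in the new REPOFIX repository next to the original.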

Mercurial and online sharing - how to proceed

A noob question... I think.
I use Mercurial for my project on my laptop. How do I submit the project to an online server like codeplex?
I'm using TortoiseHg and I can't find the upload interface for submitting the project online...
From the command line, the command is:
hg push <url>
to push changes to a remote repository.
In TortoiseHg, this is accessed through the "Synchronize" function, which seems to show up if you right-click in a Windows Explorer window but not on any file. It's also available in the workbench; the icon is 2 arrows pointing in a circle.
For these things, I find the best way to go is to use the command line interface - TortoiseHG is OK if you need to perform some common operations from the file browser, and it's a nice tool to visualize some aspects of your repository, but it doesn't implement all of mercurial's features in full detail, and it renames and bundles some operations for no apparent reason.
I don't know how things work at codeplex, but I assume it is similar to bitbucket or github, in which case here's what you'd do:
Create an empty repository on the remote end (codeplex / bitbucket / ...).
Find the remote repository's URL - for bitbucket, it is https://bitbucket.org/yourname/project, or ssh://hg@bitbucket.org/yourname/project.
From your local repository, commit all pending changes, then issue the command: hg push {remote_url}, where {remote_url} is the URL of the remote repository. This will push all committed changes from your local repository to the remote repository.
Since the remote's head revision (an empty project) is the same as the first revision in your local copy (because all hg repositories start out empty), mercurial should consider the two repositories related and accept the push.
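Put together, a minimal session might look like this (the URL is a placeholder for whatever push address your hosting provider gives you):
hg commit -m "Initial import"
hg push https://hg.example.com/yourname/project
Once the push succeeds, the remote repository holds your full history and later pushes only send new changesets.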
For an introductory guide to command-line mercurial, I recommend http://hginit.com/

Cloning/Converting Local Perforce Workspace to Mercurial Repo

I'm new to Perforce and Mercurial, so bear with me. I would like to use Mercurial to interface with Perforce in the following way:
I check out a local Perforce workspace using the P4V client. I then clone a Mercurial repo of that workspace, and use this cloned repo for all my work. When I need updated files, I would first update the local Perforce workspace, and then have the Mercurial repo pull from that. When I'm ready to commit, I push my changes to the local Perforce workspace. Then I use the P4V client to commit my changes in the Perforce workspace to the Perforce depot. Essentially, the local Perforce workspace is a proxy for the Perforce depot.
The reason behind this set-up (versus the common scenario of directly pulling from and pushing to the Perforce depot) is that there is some configuration I need to do via the P4V client (such as mapping/renaming files and directories).
I've looked at the convert and perfarce extensions, but I'm not quite sure they do what I want. They seem to do a one-time conversion, and thereafter they talk directly to the Perforce depot. Any help would be appreciated.
Convert does an incremental conversion: on each run it converts only the new changes, but it's unidirectional (Perforce -> Mercurial).
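As a rough sketch of the convert route (this assumes the p4 command-line client is on your path; the depot path and target name here are just placeholders):
hg convert //depot/yourproject yourproject-hg
Re-running the same command later should convert only the changelists submitted since the previous run.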
I've not looked at the perfarce extension, but it's my understanding that it's built for a bi-directional, continuous process -- you might want to look at it again.
Alternatively, the non-extension options on the Working with Subversion page in the Mercurial wiki detail a process for using Mercurial alongside/atop Subversion without the two interacting in any way except through the files in the working directory. That's probably very similar to what you're looking to do.
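Transplanted to Perforce, that approach boils down to initialising a Mercurial repository directly inside the P4 workspace and keeping the two tools unaware of each other. A rough sketch, assuming the workspace lives at ~/p4work/project (a placeholder path):
cd ~/p4work/project
hg init
hg add
hg commit -m "Baseline imported from Perforce workspace"
After that you sync the workspace with P4V, commit the incoming changes in Mercurial, do your own work in Mercurial as usual, and submit back to the depot through P4V when you're ready.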
The Perfarce extension should do what you want. I'm also experimenting with a similar setup, and I can pull & push to Perforce quite happily.
I must admit I am having issues with local config files and how they operate in this environment, but there are a couple of other answers here on SO that appear to address this.
I would recommend you give Perfarce a go first, before reverting to anything more manual.

What's the best way to get a copy of the tip of a mercurial repository?

I want to do the equivalent of svn export REMOTE_URL with a mercurial repository. What I want at the end is an unversioned snapshot of the repository at the remote URL, but without cloning all of the changesets over to my local machine.
Also, I want to be able to specify a tag in the remote repository to pick this from. If it's not obvious, I'm building a release management tool that pulls from a canonical mercurial repository to build a release file, and it's slow right now because some projects have large, multiple-version binary files committed.
Is this possible? How would one go about it?
It's usually easier (if the remote HG is using the hgweb interface) to just visit the repo in your browser and download a .tgz / .zip / .bz2 of the tip revision. You'll see the links if the remote HG supports this.
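Where those archive links are enabled, the URL is predictable enough to script the download; a sketch (the host, repository name, and exact path are placeholders and vary between hgweb setups and hosting sites):
wget https://hg.example.com/yourrepo/archive/tip.tar.gz
Swapping tip for a tag or changeset ID fetches that specific revision instead.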
If you want the repository, you need all of the revisions that went into the current tip for it to be at all functional.
There are options to hg clone that allow you to fetch a repository up to a certain revision, but none (that I could find) that allow you to get just the tip revision. What you are essentially asking for is a snapshot of the repo.
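For reference, a revision-limited clone looks like this (the URL and tag are placeholders); note that it still transfers every changeset leading up to that revision, not a bare snapshot:
hg clone -r some-release-tag https://hg.example.com/yourrepo yourrepo-partial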
Edit: To Get A Snapshot
hg clone http[s]://url.to.repo repo.hg
cd repo.hg
hg archive ../repo-snapshot
cd ..
rm -rf repo.hg
The snapshot is now in repo-snapshot.
Yes, this does entail cloning the repo first, which is why I suggested seeing if the remote hgweb supports on the fly downloads of any particular revision. If it does, your problem is solved with something like curl or wget instead of HG.
If not, it's good to let the original repo 'live', since you can update it again later via hg pull and then create another snapshot of a future release. This saves having to start over from scratch when cloning, especially for large repositories with lots of changes.
Also, the commands above are Linux-centric, but you get the gist. Of course, replace http[s] with the desired protocol as needed.
Is there any reason you can't maintain a mirror (updated in the background however often you want) of the remote repository on your local machine, then have the release management tool on your local machine run hg archive out of the local clone as necessary? If your concern is user-responsiveness, and not total bandwidth/storage consumed, this offsets the "slow" part to where you won't see it.
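A sketch of what the release tool might run against such a mirror (the paths and tag name are placeholders):
hg pull -R /srv/mirrors/project
hg archive -R /srv/mirrors/project -r release-1.2 /tmp/project-release-1.2
hg archive writes an unversioned snapshot of exactly the revision named by -r, which matches the svn export behaviour the question asks about.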
Tim Post noted that if you do have the hgweb CGI interface available, you can configure it to pull compressed archives down and unpack them (and the interface is consistent enough that you could script that via wget), but if you don't, core Mercurial doesn't have a lot of tools to help you, and the developers have expressed an opposition to trying to turn Mercurial into a general rsync-type client.
If you aren't afraid of playing with unofficial add-ons, you could have a look at the FTP Extension. That will force you to push from the server, however.