Cloning/Converting Local Perforce Workspace to Mercurial Repo - mercurial

I'm new to Perforce and Mercurial, so bear with me. I would like to use Mercurial to interface with Perforce in the following way:
I check out a local Perforce workspace using the P4V client. I then clone a Mercurial repo of that workspace, and use this cloned repo for all my work. When I need updated files, I would first update the local Perforce workspace, and then have the Mercurial repo pull from that. When I'm ready to commit, I push my changes to the local Perforce workspace. Then I use the P4V client to submit my changes in the Perforce workspace to the Perforce depot. Essentially, the local Perforce workspace is a proxy for the Perforce depot.
The reason behind this set-up (versus the common scenario of directly pulling from and pushing to the Perforce depot) is that there is some configuration I need to do via the P4V client (such as mapping/renaming files and directories).
I've looked at the convert and perfarce extensions, but I'm not quite sure they do what I want. They seem to do a one-time conversion, and then thereafter they talk directly to the Perforce depot. Any help would be appreciated.

Convert does an incremental conversion, converting only the changes that are new since the last run, but it is unidirectional (Perforce -> Mercurial).
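As a rough sketch (the depot path and destination directory below are placeholders, and your P4PORT/P4CLIENT/P4USER environment has to be set up already), you would enable the convert extension in your ~/.hgrc and run something like:
hg convert //depot/project/... project-hg
Re-running the same command later should only convert the changelists that arrived since the previous run.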
I've not looked at the perfarce extension, but it's my understanding that it's built for a bi-directional, continuous process -- you might want to look at it again.
Alternatively, the non-extension options on the Working with Subversion page in the Mercurial wiki detail a process for using Mercurial alongside/atop Subversion without the two interacting in any way except through the working directory. That's probably very similar to what you're looking to do.

The Perfarce extension should do what you want. I'm also experimenting with a similar setup, and I can pull & push to Perforce quite happily.
I must admit I am having issues with local config files and how they operate in this environment, but there are a couple of other answers here on SO that appear to address this.
I would recommend you give Perfarce a go first, before reverting to anything more manual.
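If you do try Perfarce, my understanding of the basic flow (the server, port and client-spec names here are placeholders; check the extension's documentation for the details of how submitting back to the depot is handled) is roughly:
hg clone p4://perforce-server:1666/my-client-spec work
cd work
hg pull    # bring new Perforce changelists into Mercurial
hg push    # send your Mercurial changesets back towards Perforce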

Related

Is it possible to convert a googlecode hg repository to a largefile repository?

I have a remote hg repository hosted on googlecode. Thus I don't have admin access to run e.g. lfconvert on it (as far as I know), and of course lfconvert can only be used on local repositories.
So, is there any way to convert a googlecode hg repository to a largefile repository?
(one idea is to convert a local clone of the repo to a largefile repo and then push the changes to the "central" googlecode repo, but I fear trying that without knowing if it is a valid approach).
Using your idea to do a local conversion and push, you can take advantage of the 'reset' feature for your repositories:
Do a local clone.
Convert to largefiles: `hg lfconvert normal_repo largefiles_repo`. Do NOT delete the original clone until you are sure everything works.
Reset the hosted repository (See https://code.google.com/p/support/wiki/MercurialFAQ#Mercurial_FAQ).
Push the largefiles repository.
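A command-line sketch of that sequence (the project URL is a placeholder, and the largefiles extension must be enabled in your ~/.hgrc):
hg clone https://code.google.com/p/yourproject/ normal_repo
hg lfconvert normal_repo largefiles_repo
# reset the hosted repository via the project admin pages, then:
cd largefiles_repo
hg push https://code.google.com/p/yourproject/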
Pushing the largefiles repository without resetting seems problematic, because the largefiles repository is essentially a fork of the original one starting at the point the first largefile was committed.
If the push fails*, you can push the original clone and you'll be back where you started without any data loss. (One of the many advantages of DVCS. :-))
The big downside of course is that everybody who has ever cloned your project will now be working from a different fork of the repository. This is always a danger when you do anything involving changing history and is the motivation for Mercurial phases. If you want to be 'kinder', you can start a second project for the largefiles version and place a link at the original project site describing the move.
[*] I can't figure out from Google Code's documentation whether the largefiles extension is supported. There is a reviewed feature request, but I couldn't find any mention of the request actually being implemented. The push failing would probably be a good indication that largefiles isn't supported though...

Mercurial and online sharing - how to proceed

A noob question... I think
I use Mercurial for my project on my laptop. How do I submit the project to an online server like CodePlex?
I'm using TortoiseHg and I can't find the upload interface for submitting the project online...
From the command line, the command is:
hg push <url>
to push changes to a remote repository.
In TortoiseHg, this is accessed through the "Synchronize" function, which seems to show up if you right-click in a Windows Explorer window but not on any file. It's also available in the workbench; the icon is 2 arrows pointing in a circle.
For these things, I find the best way to go is the command-line interface - TortoiseHg is OK if you need to perform some common operations from the file browser, and it's a nice tool for visualizing some aspects of your repository, but it doesn't implement all of Mercurial's features in full detail, and it renames and bundles some operations for no apparent reason.
I don't know how things work at codeplex, but I assume it is similar to bitbucket or github, in which case here's what you'd do:
Create an empty repository on the remote end (codeplex / bitbucket / ...).
Find the remote repository's URL - for bitbucket, it is https://bitbucket.org/yourname/project, or ssh://hg@bitbucket.org/yourname/project.
From your local repository, commit all pending changes, then issue the command: hg push {remote_url}, where {remote_url} is the URL of the remote repository. This will push all committed changes from your local repository to the remote repository.
Since the remote's head revision (an empty project) is the same as the first revision in your local copy (because all hg repositories start out empty), mercurial should consider the two repositories related and accept the push.
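A minimal sketch of those steps, assuming a Bitbucket-style remote (the URL and project name are placeholders):
cd myproject
hg commit -m "describe your changes"
hg push https://bitbucket.org/yourname/project
If you record the remote once in the repository's .hg/hgrc, a plain hg push will do from then on:
[paths]
default = https://bitbucket.org/yourname/project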
For an introductory guide to command-line mercurial, I recommend http://hginit.com/

Automatic deployment of files from a BitBucket repository

Is there a tool out there (preferably web-based) which would automatically detect commits to a BitBucket repository, and at that time, copy all files in the repository to a web-server via FTP?
I basically want a quick and painless way (if one exists) to set up continuous integration between my BitBucket repository and my website.
No build/compilation step would be necessary, since these are only front-end (HTML/CSS/Javascript) files.
The changegroup hook is the way to do this. See Hooks for info about what to do with it.
I've used changegroup hooks on my own hg repositories, but not on BitBucket; it's possible that the BitBucket servers restrict what you can do, I'm not sure. I do know that a wget/curl attempt to rebuild a manual on my server whenever its contents were updated in a repository on SourceForge failed for me, because they've locked their servers down too tightly (sending an email from the hook would work, but not HTTP access). I would expect BitBucket to be set up better; a quick search for "bitbucket changegroup hook" doesn't seem to indicate that there are any problems with it. Try it and see!
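For comparison, on a server you control the hook is just an entry in the repository's .hg/hgrc on that server; the lftp invocation and target directory below are assumptions you would adapt to your own setup:
[hooks]
# after every push into this repo: update the working copy, then mirror it to the web host via FTP
changegroup = hg update -C && lftp -u user,password -e "mirror -R . /public_html; quit" ftp.example.com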

What's the best way to get a copy of the tip of a mercurial repository?

I want to do the equivalent of svn export REMOTE_URL with a mercurial repository. What I want at the end is an unversioned snapshot of the repository at the remote URL, but without cloning all of the changesets over to my local machine.
Also, I want to be able to specify a tag in the remote repository to pick this from. If it's not obvious, I'm building a release management tool that pulls from a canonical mercurial repository to build a release file, and it's slow right now because some projects have large, multiple-version binary files committed.
Is this possible? How would one go about it?
It's usually easier (if the remote HG is using the hgweb interface) to just visit the repo in your browser and download a .tgz / .zip / .bz2 of the tip revision. You'll see the links if the remote HG supports this.
If you want the repository, you need all of the revisions that went into the current tip for it to be at all functional.
There are options to hg clone that allow you to fetch a repository up to a certain revision, but none (that I could find) that allow you to get just the tip revision. What you are essentially asking for is a snapshot of the repo.
Edit: To Get A Snapshot
hg clone http[s]://url.to.repo repo.hg
cd repo.hg
hg archive ../repo-snapshot
cd ..
rm -rf repo.hg
The snapshot is now in repo-snapshot.
Yes, this does entail cloning the repo first, which is why I suggested seeing if the remote hgweb supports on the fly downloads of any particular revision. If it does, your problem is solved with something like curl or wget instead of HG.
If not, it's good to let the original repo 'live', since you can update it again later via hg pull and then create another snapshot of a future release. This saves having to start over from scratch when cloning, especially for large repositories with lots of changes.
Also, Linux centric, but you get the gist. Of course, replace http[s] with the desired protocol as needed.
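If the remote hgweb does allow archive downloads, the URLs follow a predictable pattern, so you can script the download; the hostname and repo path here are placeholders:
wget https://hg.example.com/repo/archive/tip.tar.gz
wget https://hg.example.com/repo/archive/some-release-tag.tar.gz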
Is there any reason you can't maintain a mirror (updated in the background however often you want) of the remote repository on your local machine, then have the release management tool on your local machine run hg archive out of the local clone as necessary? If your concern is user-responsiveness, and not total bandwidth/storage consumed, this offsets the "slow" part to where you won't see it.
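A sketch of that mirror-then-archive approach (the paths and tag name are placeholders):
hg clone https://canonical.example.com/project /srv/mirrors/project     # one-time setup
hg pull -R /srv/mirrors/project                                         # refresh in the background as often as you like
hg archive -R /srv/mirrors/project -r 1.2.0 /tmp/project-1.2.0          # unversioned snapshot at that tag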
Tim Post noted that if you do have the hgweb CGI interface available, you can configure it to pull compressed archives down and unpack them (and the interface is consistent enough that you could script that via wget), but if you don't, core Mercurial doesn't have a lot of tools to help you, and the developers have expressed an opposition to trying to turn Mercurial into a general rsync-type client.
If you aren't afraid of playing with unofficial add-ons, you could have a look at the FTP Extension. That will force you to push from the server, however.

How do I push a new project to a shared Mercurial multi-repository?

I have a local machine ("laptop") and a shared Mercurial repository on another machine ("server").
The shared repository is set up as a multi-repository as described in the Mercurial documentation using Apache, the hgwebdir.cgi script and Mercurial 1.4.
The setup works in the sense that I can browse the projects (repositories) in the web browser, I can clone and pull from the server, and I can push from the laptop when the project/repository already exists on the server.
But I cannot create a new project on the laptop (hg init, do stuff, hg commit) and push it to the shared multi-repository (hg push http://server/hg/my-new-project-name) - I get "abort: HTTP Error 404: Not Found", presumably because the directory/project repository does not exist yet.
How can I push a new project/directory structure to a Mercurial running elsewhere? I couldn't find anything in the documentation, how do you guys do it?
You cannot create new remote repositories over http with the built-in functionality. Your options are to either:
create it with an ssh clone: `hg clone local-repo ssh://you@remote//path/to/repo`
log in to the remote server and do an hg init where you want the repo. After that you can push to the new empty repo (see the sketch after this list).
Use a cheesy http-creation CGI like the one I wrote here: http://ry4an.org/unblog/UnBlog/2009-09-17
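For the second option, the sequence is roughly as follows (host, paths and repository name are placeholders, and the new directory must also be covered by your hgwebdir configuration, e.g. via a collections entry):
ssh you@server "hg init /path/to/hg/my-new-project-name"
hg push http://server/hg/my-new-project-name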
Update
I tried using Dropbox as described below, but couldn't make it sufficiently reliable, so I'm not recommending that option.
Original answer below, kept for context.
/update
I found one more option: Skipping both http and ssh altogether and using Dropbox for shared repos.
For the one-person-multiple-computers scenario, it looks like the simplest option of the lot, and you get backups as a nice side effect.
Here is a discussion on Hacker News