tl;dr:
hg clone ssh://hg@bitbucket.org/team/repo ~/prod/ fails with "destination is not empty" if ~/prod/ is not empty. Can I force cloning?
I am trying to write my first Ansible playbook that should deploy my code from a Bitbucket Mercurial repository to my server. There is a deployment path, ~/prod, which contains all code files as well as the data in ~/prod/media and ~/prod/db.db. To make sure the playbook works even if the ~/prod directory is empty or doesn't exist, this is what I have so far:
- name: create directory
  file: path=/home/user/prod state=directory
- name: clone repo
  hg:
    repo: ssh://hg@bitbucket.org/team/repo
    dest: /home/user/prod
    force: yes
In my understanding, it ensures that the deployment directory exists and then clones the repo there. It works beautifully if the directory doesn't exist or is empty. However, as soon as I've cloned the repo once, this playbook fails with "destination is not empty".
I can move media and db.db out first, then delete all other files, then clone, then move the data back. But it looks cumbersome.
I simply want to force cloning, but I cannot find a way to do it. Presumably this is so wrong that Mercurial won't let me do it. Why, and what's a better way to go?
Though I haven't yet read it anywhere explicitly, it looks like force-cloning is impossible. The two alternatives then are, as explained in another thread on the same topic:
indeed, clone to another directory and then move the .hg folder into the target directory
or, hg init /home/user/prod, then hg pull -R /home/user/prod ssh://hg@bitbucket.org/team/repo, then hg update -C -R /home/user/prod (spelled out below).
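As a shell session, the second alternative would look like this (a minimal sketch, assuming the paths from the question and that ~/prod already contains media/ and db.db):
hg init /home/user/prod                                       # create .hg without touching existing files
hg pull -R /home/user/prod ssh://hg@bitbucket.org/team/repo   # fetch the history into the new repo
hg update -C -R /home/user/prod                               # check out tip, discarding changes to tracked files
Untracked files such as media/ and db.db are left alone, since the update only touches files the repository tracks.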
With the second one, the Ansible tasks can be optimised to perform the init step only if the target directory doesn't already contain .hg:
- name: recreate repo
  command: hg init /home/user/prod
  args:
    creates: /home/user/prod/.hg  # <-- only execute command if .hg does not exist
- name: update files
  hg:
    repo: ssh://hg@bitbucket.org/team/repo
    dest: /home/user/prod
    clone: no
    update: yes  # optional, for readability
    force: yes
  notify: "restart web services"
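The notify line assumes a handler named "restart web services" is defined elsewhere in the playbook. A hypothetical minimal definition (the service name is made up) could look like:
handlers:
  - name: restart web services
    service: name=nginx state=restarted  # hypothetical; substitute your actual services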
Related
I have often used this approach for dot-file management in Git, where I create a bare Git repo ~/.dotfiles and use $HOME as a work tree. With the shell alias config I can then add dot files from the home dir quickly (as in config add, config commit):
alias config='git --git-dir=$HOME/.dotfiles/ --work-tree=$HOME'
I wonder if a similar setup is possible in Mercurial.
You can use a regular repository for that[^bare] and clone it with the share extension. Creating a new home dir as one-liner:
hg --config extensions.share= share $HOME/.dotfiles $HOME
For more information see hg help share. For information on how to ignore changes to untracked files, see hg help hgignore.
[^bare]: If it is important to you to have no files in .dotfiles, just run hg update null in ~/.dotfiles. null is the root of the repository (before anything got added); Mercurial needs no special bare state.
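A hypothetical end-to-end walkthrough (paths and file names are made up; note that hg share requires the destination directory to be new, hence "creating a new home dir"):
hg init ~/.dotfiles                                             # a regular repo; no bare state needed
hg --config extensions.share= share ~/.dotfiles /home/newuser   # /home/newuser must not exist yet
cd /home/newuser
echo 'export EDITOR=vi' > .profile
hg add .profile
hg commit -m 'track .profile'                                   # history is stored in ~/.dotfiles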
Over time a number of the developers have committed files that were later added to the .hgignore. From what I hear there is no way to remove items from Mercurial's history, which is OK. But I also heard that there is a way, I think using the convert extension, to clone/export a repo while specifying which files not to include in the conversion.
I can't help but think that someone out there has a script that does this export/filter/convert using the patterns from the .hgignore file.
Has anyone created such a beast?
You could create a filemap from .hgignore by doing something like this:
hg clone -U yourrepo temprepo   # create a temp repo with no files in the working dir
cd temprepo
hg revert --all -r tip          # put the files into the working dir without updating
hg forget "glob:**"             # un-add the files so the ignore rules apply to them
hg status --ignored --no-status | sed 's/^/exclude /' > ../filemap
That will get you a filemap you can pass to hg convert, which removes all the tracked files that would be ignored given your .hgignore.
Do understand though, that running convert creates a whole new repo that is unrelated to your previous repo. All existing clones will be unusable with the new one. It's not normally worth it.
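For completeness, the conversion itself would then be run with the generated filemap (repo names assumed from the snippet above; hg convert needs the convert extension enabled in your hgrc):
cd ..                                                # back to the directory holding yourrepo and filemap
hg convert --filemap filemap yourrepo filteredrepo   # filteredrepo is a brand-new, unrelated repo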
hg convert is indeed the thing you want to use.
You will want to create a file map (just a text file) which will list all of the things you either want to include, exclude, or rename:
include subfolder
exclude subfolder/supersub
etc...
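The rename directive mentioned above uses the same one-per-line format; a hypothetical example, per the wiki page linked below:
rename subfolder new-name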
Read the following for a more concrete example:
https://www.mercurial-scm.org/wiki/ConvertExtension#A--filemap
Once you have created this file you will just use the following command:
$ hg convert --filemap my_file_map /path/to/source/repo /path/to/dest/repo
The source repo will not be modified and a dest repo will be created. I don't want to just copy verbatim what the documentation already says, so here is the link:
How to keep just a subdirectory (when run on a Mercurial repo):
https://www.mercurial-scm.org/wiki/ConvertExtension#Converting_from_Mercurial
I created a repository on a remote machine using:
hg init
hg add
hg commit
The repository was created.
I cloned the repository on a local machine with no errors reported; the files seem to be there.
Now I'm trying to make a clone of the clone (as a working copy) using:
hg clone "path to original clone"
It returns:
destination directory: "name of repository"
abort: No such file or directory: "path to original clone"/.hg/store/lock
What am I doing wrong?
Thanks
What filesystem is used on the partition where the main repository is?
Actually, when Mercurial is doing some operations, it needs to lock the repository. To do this it creates, in the .hg directory, a symbolic link to a nonexistent file, when the filesystem supports it, telling every other process that the repository can't be modified at this time. When symbolic links aren't supported by the filesystem, a normal file is created.
However, there are problems with some FUSE filesystems, typically SSHFS with the follow_symlinks option activated. FUSE reports that it supports symbolic links, but since SSHFS follows the symbolic link and the target file doesn't exist, the "state" of the link is marked as unknown. Mercurial therefore thinks the repository isn't correctly locked and aborts the operation.
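If SSHFS is indeed the culprit, a possible workaround is to remount without the problematic option (mount point and host below are made up):
fusermount -u ~/mnt/repohost              # unmount the existing SSHFS mount
sshfs user@repohost:/dir ~/mnt/repohost   # remount without -o follow_symlinks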
I see you're using Cygwin, so maybe it's the same kind of problem with tools designed for UNIX on a Windows filesystem. It's strange, though: coworkers of mine are using Mercurial via Cygwin just fine.
I don't know if this is the case for you, but I lost nearly half a day on this problem. Maybe this answer can help some people in the future.
Please paste in the actual command that's failing and the output, including the actual path to the clone that you're cloning. When you do the clone, use --debug and --traceback too.
As a workaround you can always try hg init newclone followed by hg pull -R newclone pathtooriginalclone, which is effectively equivalent except that it doesn't use local hardlinks when possible.
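Spelled out, and with the working-directory checkout that a bare pull does not perform:
hg init newclone
hg pull -R newclone pathtooriginalclone
hg update -R newclone        # pull alone does not populate the working directory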
Following is the scenario: I have a remote Mercurial repository at ssh://remotehost//dir/repo and I am able to clone it to a local host "pandora" in directory /home/user/localrepo/.
Now, I have a superset of this remote repository, where I add my own testing framework, but I do not want to merge into the main depot until I am certain it works. So I clone this "local" repo to /home/user/workingdir/, but when I issue the command to do so
$ hg clone /home/user/localrepo/
only the repository folder gets copied; none of the files get copied.
I'm not sure what you mean when you say that "only the repo folder gets copied", so there are two things you can try:
Try to do a hg update in your new clone.
List the directory in /home/user/workingdir; if there is a directory named localrepo in it, that is actually your repository. To clone into the current directory, you must do hg clone /home/user/localrepo . (note the trailing dot), as in the example below.
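A short illustration of the second point (assuming /home/user/workingdir is empty; paths taken from the question):
cd /home/user/workingdir
hg clone /home/user/localrepo .   # the trailing dot clones into the current directory
hg status -A                      # the tracked files should now be listed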
This sounds odd, but try a few things:
First, in the local repo that you cloned from, do a
hg status -A
Are all the files that you think should be in there listed? If not, are you at the tip of the repo?
You can see what revision you are at with
hg parents
If you want to just go to the tip, do hg update.
If there still aren't any files listed in the repo, do the same checks on the one on the server.
If there aren't any files on the server, you will need to add all of the files you want Mercurial to track; Mercurial doesn't automagically start tracking files in the repo location.
(Run hg add with no arguments to add all of the files in the entire directory tree under the repo location.)
If there are files in the local repo, check the testing area and make sure that it is on the proper changeset.
I am working on a system that performs continuous integration and I am looking for a method I can use to get the most recent changeset from a Mercurial repository without creating a repository locally.
I have considered using clone, but this method will only work if you create a working directory locally (since this will be occurring on a build server, I would prefer not to do this because of the inclusion of the .hg directory and the diffs). All I want is essentially an export of the files at the tip revision and nothing more.
This request may not even be possible, and it's very likely that I just do not understand DVCS very well. However, if I cannot do what I want to do, is there a workaround?
It's possible using hg archive, depending on how your remote repository is set up.
If it's available over HTTP using hgweb.cgi or hg serve, you can hit the archive link programmatically to get the files you want. Example:
wget https://www.mercurial-scm.org/repo/hg/archive/tip.tar.gz --output-document=- | tar xz   # use a tarball; unzip cannot read a zip from stdin
Or if it's available over ssh:
ssh you@there.com hg archive --type=tgz - | tar xz   # stream a tarball over ssh for the same reason
You can use:
$ hg clone http://your_repo temp_clone   # clone into a throwaway directory
$ cd temp_clone
$ hg archive ../export/
$ cd ..
$ rm -rf temp_clone
$ cd export
From Mercurial's help files:
$ hg help archive
hg archive [OPTION]... DEST
create an unversioned archive of a repository revision
You can use:
http://merc/raw-file
to retrieve a list of files in the repository, or
http://merc/raw-file/filename
to get a specific file (where merc is the host serving the repository via hgweb).
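For example, from a CI script (host and file name are hypothetical; hgweb also accepts an explicit revision in the raw-file path):
wget http://merc/raw-file/tip/setup.py   # fetch one file at the tip revision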