Has anyone taken a local repo and imported it into Bitbucket? When I go to do this, the Import page asks for a URL, but I'm working on a local computer that does not have port 8000 open to the outside world.
Can I just use some special form of a file path?
First you need to create a repository in Bitbucket: go to Repositories -> Create repository. Then you can choose between HTTPS and SSH.
You can customize your hgrc file like this:
[ui]
username = Your Name <youremail@example.com>
[paths]
myproject = https://.. # The one provided by Bitbucket
Now you can just push your changes to the repository:
$ hg commit -m "my changes"
$ hg push myproject
Or pull changes:
$ hg pull -u myproject
The -u option also updates your working directory after pulling the changes, so you can use it instead of pulling and then updating manually. Using -u is the same as doing:
$ hg pull myproject
$ hg update
You may also want to take a look at the .hgignore file.
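For example, a minimal .hgignore placed at the root of the repository could look like this (the patterns are only illustrative):
syntax: glob
*.pyc
*.log
build/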
I would like to export files from a repository, ignoring changes in the working tree. Furthermore, rather than exporting everything, I would like to export only a subset of it. The destination directory might already contain some files, and those must be overwritten.
Given:
project/some/sub/dir/
I would like to export it to:
output/dir/
In git, I can use:
git archive --prefix=dir/ HEAD -- some/sub/dir/ | tar -xv -C output
What is the equivalent command in hg? If I use hg archive -t files -I some/sub/dir output/, then I get output/some/sub/dir. I could pipe the result through tar, but then I have to manually calculate the prefix that should be dropped:
hg archive -t tar -I some/sub/dir/ - |
tar -xv -C output --strip-components=3
(In reality, I have some other tar patterns that should be ignored, such as --exclude='.*'.) Any ideas? This export will also be done for three other directories located in the repository.
Current situation:
srcdir=some/sub/dir
dstdir=output/dir
# hg archive auto-adds a 'proj-version' prefix. Given the srcdir,
# proj-version/some/sub/dir/X should become dstdir/X, so strip 4 components
prefixlength=$(grep -o / <<<"/${srcdir%%/}/" | wc -l)  # count path components by counting slashes
hg archive -t tar -I "$srcdir" - |
tar -xv -C "$dstdir" --strip-components=$prefixlength
You can do hg archive ... && cd output/some/sub/dir && tar ..., can't you?
Build an intermediate repository with the Convert extension, in which some/sub/dir/ becomes the root of the repository (see also the sample in the Converting from Mercurial topic), and get a tarred archive directly from hg archive on that intermediate repository.
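A rough sketch of that approach (the filemap file name is made up, and the convert extension must be enabled with "[extensions] convert =" in your hgrc):
# filemap.txt: keep only some/sub/dir and make it the new repository root
include some/sub/dir
rename some/sub/dir .
$ hg convert --filemap filemap.txt . ../subdir-repo
$ hg archive -R ../subdir-repo -t files output/dir
With -t files there is no archive prefix to strip, so the files land directly under output/dir.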
I create a local repo with
local-host $ hg init ~/test/
and then on the remote host I do a clone
remote-host $ hg clone ssh://user@local-host/test
without any issues.
When I try to check whether there are outgoing changes in the remote repo, I get this error:
remote-host $ cd test
remote-host $ hg --verbose out
comparing with ssh://user@local-host/test
running ssh user@local-host 'hg -R test serve --stdio'
searching for changes
no changes found
remote: abort: no repository found in '/home/user' (.hg not found)!
remote: abort: no repository found in '/home/user' (.hg not found)!
If I commit any change in the remote/local repository and push it, I get the error, but the change gets pushed.
Both hosts have the same Mercurial version.
Any ideas?
You need to let it know which repository you mean; the easiest way is:
remote-host $ cd ~/test/
remote-host $ hg --verbose out
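Alternatively, you can name the repository explicitly instead of changing into it (a sketch, assuming the clone lives in ~/test):
remote-host $ hg -R ~/test --verbose out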
I am using Mercurial over a LAN via a normal shared folder instead of HTTP, and I'm having a problem getting the auto-update hook to run.
I have entered this hook as detailed in the FAQ. This installs the hook, but when I push something to the remote repository, I get an error:
added 1 changesets with 1 changes to 1 files
running hook changegroup: hg update >&2
warning: changegroup hook exited with status -1
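For reference, the hook entry named in that output corresponds to a line like this in the remote repository's .hg/hgrc (reconstructed from the hook command shown above):
[hooks]
changegroup = hg update >&2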
There is another Stack Overflow question similar to this, but it offers no solution other than that it may be a permissions error somewhere.
Has anyone else had this problem and can anyone else shed any more light on this or give me a heads up on where to start fixing this? Thanks.
Is hg in your standard search PATH?
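If it is not, one quick thing to try (assuming hg is installed at /usr/bin/hg; adjust to whatever "which hg" reports on your system) is to spell out the absolute path in the hook:
[hooks]
changegroup = /usr/bin/hg update >&2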
Replace the hook command in your .hgrc with a call to a custom script, e.g.:
[hooks]
changegroup = /var/tmp/myscript.sh
[ui]
debug = true
(Unix) In /var/tmp/myscript.sh, write something like this:
#!/bin/sh
set -e
echo ---------- >>/tmp/myscript.log
set >>/tmp/myscript.log
echo --- >>/tmp/myscript.log
pwd >>/tmp/myscript.log
hg update >>/tmp/myscript.log 2>&1
(Unix) Do not forget to mark it as executable: chmod a+x /var/tmp/myscript.sh
(Windows) The corresponding batch file myscript.bat is:
@echo off
echo ------ >>%TEMP%\myscript.log
set >>%TEMP%\myscript.log
echo --- >>%TEMP%\myscript.log
cd >>%TEMP%\myscript.log
hg update >>%TEMP%\myscript.log 2>&1
(Windows) Of course, the line in .hgrc is changegroup = \your\directory\myscript.bat.
Run your hg push command to reproduce the problem.
Check the content of the /tmp/myscript.log file.
I'm trying to install mercurial-server. After adding my keys to keys/root and refreshing auth, I tried to clone the hgadmin repo, but I get the following error:
$ hg clone ssh://hg@<domain>/hgadmin
remote: mercurial-server: no such repository hgadmin
abort: no suitable response from remote hg!
Does anyone know what the problem is?
I had this same problem, and for me it was a problem with the installation of the hgadmin repository. When I installed the package, I got errors from Python saying the mercurial package wasn't installed. I assume that happened when mercurial-server tried to initialize the hgadmin repository. So when I went to check out the hgadmin repository, there was no .hg directory:
root@myshost:/var/lib/mercurial-server/repos# cd hgadmin/
root@myshost:/var/lib/mercurial-server/repos/hgadmin# ls -a
. ..
In order to resolve this, I did:
easy_install mercurial
sudo apt-get purge mercurial-server
sudo rm -rf /var/lib/mercurial-server
sudo apt-get install mercurial-server
And then continued on with the directions here:
http://kurtgrandis.com/blog/2010/03/20/gitosis-for-mercurial/
Thanks a lot, Randy, for exposing the exact issue here.
I struggled with the same problem and found an alternative approach to solving it (without the need to purge and re-install).
You can initialize the hgadmin repo manually and install the hooks, achieving the same effect as a normal installation. You need to do it as the 'hg' user, though.
Procedure
These commands worked for my environment (Ubuntu 10.04.4 / Hg 1.4.3).
First, initialise a Mercurial repository in /var/lib/mercurial-server/repos/hgadmin:
$ sudo su hg
$ cd ~/repos/hgadmin/
$ hg init
Then the only difference I found from a normally initialized hgadmin repo (which I deployed in a VM for comparison) was the hooks in the .hg/hgrc file. So open the file:
$ vim .hg/hgrc
and paste this exact content:
# WARNING: when these hooks run they will entirely destroy and rewrite
# ~/.ssh/authorized_keys
[extensions]
hgext.purge =
[hooks]
changegroup.aaaab_update = hg update -C default > /dev/null
changegroup.aaaac_purge = hg purge --all > /dev/null
changegroup.refreshauth = python:mercurialserver.refreshauth.hook
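Once hgadmin is working again, keys are normally managed through that repository itself; a rough sketch of the workflow (the key path and user name are illustrative):
$ hg clone ssh://hg@<domain>/hgadmin
$ cd hgadmin
$ cp /path/to/alice.pub keys/users/alice
$ hg add keys/users/alice
$ hg commit -m "Add key for alice"
$ hg push   # the refreshauth hook rewrites ~hg/.ssh/authorized_keys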
Are you sure your clone command syntax is correct? I see at least two errors in it:
You must specify the repository you're cloning (not just the destination).
Just as for push, you must use two slashes before an absolute path:
Example FAILING (missing the source repo and using only one '/' before 'home'):
$ hg clone ssh://John@127.0.0.1/home/John/delme
Example FAILING (using only one '/' before 'home'):
$ hg clone . ssh://John@127.0.0.1/home/John/delme
Example SUCCEEDING:
$ hg clone . ssh://John@127.0.0.1//home/John/delme
Is there any way to archive a Mercurial repository to a remote directory over SSH? For example, it would be nice if one could do the following:
hg archive ssh://user@example.com/path/to/archive
However, that does not appear to work. It instead creates a directory called ssh: in the current directory.
I made the following quick-and-dirty script that emulates the desired behavior by creating a temporary ZIP archive, copying it over SSH, and unzipping it into the destination directory. However, I would like to know if there is a better way.
if [[ $# != 1 ]]; then
    echo "Usage: $0 [user@]hostname:remote_dir"
    exit 1
fi
arg=$1
arg=${arg%/} # remove trailing slash
host=${arg%%:*}
remote_dir=${arg##*:}
# zip named to match lowest directory in $remote_dir
zip=${remote_dir##*/}.zip
# root of archive will match zip name
hg archive -t zip $zip
# make $remote_dir if it doesn't exist
ssh $host mkdir --parents $remote_dir
# copy zip over ssh into destination
scp $zip $host:$remote_dir
# unzip into containing directory (will prompt for overwrite)
ssh $host unzip $remote_dir/$zip -d $remote_dir/..
# clean up zips
ssh $host rm $remote_dir/$zip
rm $zip
Edit: clone-and-push would be ideal, but unfortunately the remote server does not have Mercurial installed.
Nope, this is not possible -- we always assume that there is a functioning Mercurial installation on the remote host.
I definitely agree with you that this functionality would be nice, but I think it would have to be made in an extension. Mercurial is not a general SCP/FTP/rsync file-copying program, so don't expect to see this functionality in the core.
This reminds me... perhaps you can build on the FTP extension to make it do what you want. Good luck! :-)
Have you considered simply having a clone on the remote and doing hg push to archive?
Could you use an SSH tunnel to mount a remote directory on your local machine and then just do standard hg clone and hg push operations 'locally' (as far as hg knows), but where they actually write to a filesystem that is on the remote computer?
It looks like there are several Stack Overflow questions about doing this:
How do I mount a remote Linux folder in Windows through SSH?
Map SSH drive in Windows
How can I mount a remote directory on my computer?
I am often in a similar situation. The way I get around it is with sshfs.
sshfs me#somewhere-else:path/to/repo local/path/to/somewhere-else
hg archive local/path/to/somewhere-else
fusermount -u local/path/to/somewhere-else
The only disadvantage is that sshfs is slower than NFS, Samba, or rsync. Generally I don't notice, as I only rarely need to do anything in the remote filesystem.
You could also simply execute hg on the remote host:
ssh user@example.com "cd /path/to/repo; hg archive -r 123 /path/to/archive"
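If the remote host has no Mercurial installed (as in the question's edit), a rough alternative is to run hg archive locally and stream the tarball over SSH; the host and paths are illustrative, and --strip-components assumes GNU tar on the remote side:
$ hg archive -t tgz - | ssh user@example.com \
    "mkdir -p /path/to/archive && tar -xzf - -C /path/to/archive --strip-components=1"
The --strip-components=1 drops the prefix directory that hg archive adds by default, so the files land directly in /path/to/archive.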