Creating new remote repository for existing project with Mercurial

I have a project at version 2 and I need to start work on version 3.
I want to create a new repository on a remote server (i.e. a mercurial-server) so that my team members can access it. I have the project files on my local machine.
I have two related questions:
How can I create the repo in /home/hg/repositories/private/project3 (let's say the new repo name would be project3) on the remote mercurial-server, seeded with my project files? What steps should I follow to do this?
How can I set up access permissions (username/password) so that my team can reach this repo at http://dev.myproject.com/private/project3?
Note: /home/hg/repositories/ is the default for http://dev.myproject.com/, and I have no repo of version 2 (so cloning is not possible, I guess!)

Without installing additional server-side software, your team will need ssh accounts on that box. I'm assuming you have one and that you can create them for your friends. If you don't have that set up, you're better off just using Bitbucket, which is free and provides both ssh and https access.
Also, you don't say whether your project2 is already under Mercurial control, so I'm assuming it's not.
To create the remote repo you'd do something like this on your local machine:
hg init project3 # <-- creates a new empty respository
cp ALL_THE_PROJECT3_FILES_YOU_WANT project3 # <--- put the files you want into project3
cd project3 # <-- go into your local project3 repository
hg addremove # <-- LOCALLY add the files you copied in
hg commit -m "initial commit copied in project2" # <-- LOCALLY commit the files
cd .. # <---- go up a directory
hg clone project3 ssh://yourusername@dev.myproject.com//home/hg/repos/project3 # clone the repo over to the server
Your teammates can then clone down using:
hg clone ssh://theirusername@dev.myproject.com//home/hg/repos/project3
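Once the clone exists on the server, day-to-day work for everyone is the usual pull/push cycle; a minimal sketch:
cd project3
hg pull -u # fetch new changesets from the server and update the working copy
hg push # publish your local commits back to the server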
Here are some things you could accidentally mess up on the way to getting this working:
Your friends need ssh accounts
Your friends' accounts need read/write access to /home/hg/repos
Notice that all cloning is happening over ssh. Setting up HTTP is harder and probably not something you need to do.
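If you do end up needing the http://dev.myproject.com/private/project3 URL with username/password access, the usual route is hgweb behind a web server with basic auth. A rough sketch, assuming Apache and htpasswd (the paths and user names below are examples only):
htpasswd -c /home/hg/htpasswd teammate1 # create the password file and a first user
<Location /private>
    AuthType Basic
    AuthName "Private repos"
    AuthUserFile /home/hg/htpasswd
    Require valid-user
</Location>
Then, in the repo's .hg/hgrc, list who may push:
[web]
allow_push = teammate1, teammate2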
Seriously, just use bitbucket.

Related

mercurial + bitbucket + windows 7, how to setup?

Thanks to the help of Stack Overflow I was able to set up an account and repository on Bitbucket and manually push my local repo to the cloud using a password.
I was unable to find a proper tutorial on how to set up SSH between Mercurial and Bitbucket on Windows 7, and I was also unable to find a proper tutorial on how to automate the push command so that I don't have to type the full path of each repository every time.
Can anyone help with achieving those two things?
to find a proper tutorial on how to setup SSH between mercurial and bitbucket
Keywords: plink, pageant
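A minimal Windows 7 setup along those lines: load your private key into Pageant, then point Mercurial at a plink-style ssh client in mercurial.ini (the TortoisePlink path below is an assumption; adjust it to your install):
[ui]
ssh = "C:\Program Files\TortoiseHg\lib\TortoisePlink.exe" -ssh -2 -batch -C
With the key loaded in Pageant, hg push ssh://hg@bitbucket.org/yourname/yourrepo authenticates without a password prompt.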
proper tutorial on how to automatize the push command to avoid writing the full path all the time of each of the repositories
"Full path" to local or remote repo?
In case of:
Local, using -R "path/to/local/repo": just cd into the repo before running hg.
Remote: add all the needed repositories to the repository's hgrc (.hg\hgrc in the root of the repo dir) under [paths]:
[paths]
default = git+ssh://git@github.com/lazybadger/Fiver-l10n.git
sf = ssh://bigbadger@hg.code.sf.net/u/bigbadger/code
With these names I can pull/push from/to default or sf instead of spelling out the URLs: hg push sf; "default" as the default target can be omitted entirely.

How to configure mercurial to deploy website folder only

I have a website that I want to deploy to a client's DEV and UAT environments. The site is part of a Mercurial repo; it is in the Website folder, at the same level as the .hg folder. I know I can push the entire repository, but I would rather push only the Website folder so the client does not get the other files and folders.
The repo looks like this:
Project root
.hg
Database (SQL Source Control uses this)
Documentation (All specs, pdfs, art work etc.)
Lib (pre-Nuget 3rd party dlls)
packages (Nuget stuff)
Website (this is the only area I want to deploy)
.hgignore
Project.sln
Edit:
The client's servers are not connected directly to the internet; my access to them is over a VPN and then RDP. Currently, to deploy any changes, I need to zip the site up, put it on a shared ftp server, then wait up to 3 days for the files to be copied to the servers. Rules have been configured so that I can use Mercurial over this connection.
Edit 2
I have managed to create a subrepo from the Website folder by forgetting the Website folder and all its contents, committing the change, then putting the files back, creating a repo, and then echoing out the .hgsub file. Locally this works for me; I can clone from the Website repo without getting any of the additional folders. However, I have not been able to use this version of the repo, even if I repeat the process on our repo server. When I try to clone the hosted version down to my local working copy I get 404 errors, but I can clone the hosted version on the hosting server itself.
I would appreciate some step-by-step instructions (a guide for dummies, if you like) on how to achieve my goal, which is to be able to push only the Website folder to the client's servers. The master copy of the repo is on our repo server; I have a local clone and need to be able to push out versions from my copy.
Edit 3
It turns out that the problem I was having converting a folder to a subrepo, as described in http://mercurial.aragost.com/kick-start/en/subrepositories/#converting-folder-into-a-subrepository, was that the convert command is broken in versions after 2.1.0 and is still broken in 2.3.1. After I figured that out and rolled back to that version of TortoiseHg, I was able to convert the folder to a subrepo; in the root of the repo I have .hgsub, which says Website = Website. I was able to work with that locally: commit to the whole repo or the subrepo, and clone either the full repo or the subrepo (which is what I want). However, I can't get this to work from our master repo server.
I zipped the whole thing up and ftp'd it to our remote master repo server, then set it up so I could clone from it. Directly on the server this works fine (hg clone --verbose -- C:\Repositories\EM .), however when I try to clone from the server to my local development machine with (hg clone --verbose -- https://myserver.com/hg/EM/ .) it fails with "HTTP Error: 404 (Not Found)".
requesting all changes
adding changesets
adding manifests
adding file changes
added 628 changesets with 6002 changes to 4326 files
updating to branch default
resolving manifests
calling hook preupdate.eol: <function preupdate at 0x00000000035204A8>
getting .hgignore
getting .hgsub
getting .hgsubstate
HTTP Error: 404 (Not Found)
[command returned code 255 Fri Apr 20 10:51:23 2012]
I don't know what the problem is, the files are there so why the 404?
In my opinion Mercurial shouldn't be used for this purpose. This is particularly true if that website is a web application because you shouldn't have the DLLs in Mercurial.
You should look at the web deployment tool built into Visual Studio. Have a look at this page to see if it suits your purpose.
If you can't install the required services on the destination server then it can be configured to use FTP instead.
You cannot push part of a repo tree.
If the DEV and UAT environments are unversioned targets, you can use any other way of distributing the Mercurial content.
You can separate Website into a subrepo and then you will be able to push that repo on its own (see the sketch below).
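A rough sketch of that subrepo registration, run from the root of the main repo, assuming Website already contains its own Mercurial repository (an hg init inside it plus an initial commit of its files):
echo Website = Website > .hgsub
hg add .hgsub
hg commit -m "register Website as a subrepository"
From then on the Website repo can be cloned and pushed on its own, independently of the parent.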
As others have pointed out, you can't use push for this. Just do an rsync from your server to theirs. You could even automate that in a hook, where you push to a local repository and it auto-deploys to their site. Something like:
[hooks]
changegroup.deploy = $HG update ; rsync -a Website/ account@theirserver:/path/to/docroot/
I have a working solution to this. I created a batch file that creates an outgoing repo and starts the built in server so I can pull from it on the client machines. First it clears out the previous folder, then clones from my local working copy (there's a parameter to determine which tag it should clone from). Next it creates a map file and converts the Website folder to a new Website2 folder in order to preserve the history then gets rid of the original folder and renames the new one. Finally it spins up the built in server.
cd c:\inetpub\wwwroot
REM remove the previous outgoing repo
rd /S /Q _ProjectName
REM clone the local working repo at the tag passed as %1
hg clone -- C:\inetpub\wwwroot\ProjectName#%1 C:\inetpub\wwwroot\_ProjectName
cd c:\inetpub\wwwroot\_ProjectName
REM filemap: keep only the Website folder and move it to the repo root
echo include Website > map.txt
echo rename Website . >> map.txt
REM convert this clone into a new Website2 repo containing only the Website history
hg --config extensions.hgext.convert= convert --filemap map.txt . Website2
cd Website2
hg update
cd ..
REM drop the original Website folder from the outgoing clone, then swap in the converted repo
hg remove Website/*
hg commit -m "Removed Website"
rename Website2 Website
REM serve the result so the client machines can pull from it
hg serve
So it isn't pretty, but now I just need to call the batch file, pass the tag I want to build the outgoing website from (uat, dev etc.), and give it a minute to create my Website folder, with history, that I can pull from or push from. I don't need to call hg serve because I know the names of the client servers, so I can push the changesets out by creating aliased remote repositories. But I included that step so the client machines can pull. I haven't fully explored this option, so I'm not sure whether it's got any particular advantage. It's fine for the case when it's just me working on the project, but if any other developer needs to work on this then the URI for their local project server will obviously be different (http://SIMON-PC:8000/ won't be the case for everyone), in which case pushing to the client might be best.
By using this approach my local working repo doesn't need to change, so I don't get any issues communicating with our central repo (the 404 errors mentioned in Edit 3). I keep the entire history of the repo with the convert process, so the next time I need to send changes I'm not starting at revision 1. In other words it isn't destructive of the Website, and although I am deleting the entire outgoing repo (_ProjectName) each time, I am retaining the history and am still in a position to pull/push ONLY the Website directory, because it is created each time as a 'standalone' repo.
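For reference, the aliased remote repositories are just [paths] entries in the outgoing repo's hgrc; the user, host names and paths here are made up, and the target repos need to exist (hg init) on the client servers first:
[paths]
client-dev = ssh://deployuser@client-dev-box//inetpub/wwwroot/site
client-uat = ssh://deployuser@client-uat-box//inetpub/wwwroot/site
After that, hg push client-uat from the outgoing Website repo sends only the Website history.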

Mercurial - compare local and remote repositories?

In Git, there is the command
git remote show <remote>
When properly configured, this will show you the status of the remote compared to your local repository, including whether there are pending changes in either. I can't find a similar command in Mercurial. Am I missing something or does it just not exist?
Perhaps hg summary --remote?
To compare local and remote repositories follow these steps:
go to local repo folder (use cd path_to_local_repo)
run "hg outgoing -p path_to_remote_repo" (without quotes)
See GenerateDiffBetweenRepositories
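A quick sketch of the comparison in both directions (replace path_to_remote_repo with your remote URL or a path alias):
hg incoming path_to_remote_repo # changesets in the remote that you don't have yet
hg outgoing path_to_remote_repo # changesets you have that the remote doesn't
hg summary --remote # short summary of both, against the default path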

how do i setup a local working directory to work with a local repo using Mercurial

Following is the scenario: I have a remote Mercurial repository at ssh://remotehost//dir/repo and I am able to clone it to a local host "pandora" in the directory /home/user/localrepo/.
Now, I have a superset of this remote repository, where I add my own testing framework, but I do not want to merge it into the main depot until I am certain it works. So I clone this "local" repo to /home/user/workingdir/, but when I issue the command to do so
$ hg clone /home/user/localrepo/
only the repository folder gets copied; none of the files get copied.
I'm not sure what you mean when you say that "only the repository folder gets copied". There are two things you can try:
Try to do an hg update in your new clone.
List the directory /home/user/workingdir and, if there is a directory named localrepo in it, that is actually your repository. To clone into the current directory, you must do hg clone /home/user/localrepo .
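To make the difference concrete (paths as in the question):
hg clone /home/user/localrepo # creates ./localrepo containing the repository and a checked-out working copy
hg clone /home/user/localrepo . # clones into the current (empty) directory instead
If you already see a localrepo directory under /home/user/workingdir, the first form is what happened and your files are inside it.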
This sounds odd but try a few things:
First in the local repo that you cloned from do a
hg status -A
Are all the files that you think should be in there listed? If not, are you at the tip of the repo?
You can see what revision you are at with
hg parent
If you want to just go to the tip do hg update
If there still aren't any files listed in the repo, do the same to check the one on the server.
If there aren't any files on the server, you will need to add all of the files you want Mercurial to track; Mercurial doesn't automagically start tracking files in the repo location.
(Run hg add with no arguments to add all of the files in the entire directory tree under the repo location.)
If there are files in the local repo, check the testing area and make sure that it is on the proper changeset.

How to clone repository to a remote server/repository with Mercurial

Found myself quite confused today about this.
I created a blank repository locally (hg init), cloned it to a working copy, added some code, committed and pushed it (to the local repo, obviously).
Now I need to share that repository with others. There is a server that has mercurial on it, how do I clone my repository to a remote one such that other developers can access it and pull/push code from/to it?
You'll want to check out the publishing repositories wiki page to get into web interfaces and access controls, but at its most basic you can do something like this:
hg clone yourlocalrepo ssh://you@server//home/you/repo
That clones your local repo to a remote location of your choosing. Note that there are two double slashes in that URL.
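The slash count matters because paths in ssh:// URLs are relative to your home directory on the server by default; the extra slash makes the path absolute. For example (host and paths are illustrative):
hg clone yourlocalrepo ssh://you@server/projects/repo # ends up in ~/projects/repo on the server
hg clone yourlocalrepo ssh://you@server//home/you/repo # absolute path /home/you/repo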
You can't create a remote repo like that using http://, only ssh://. If all you have is http to hgweb.cgi you can 'hg init' an empty repo on the server and then hg push to it.
If your "official" repositories are served up by an HTTP server, and you want to create a repo in the central location based on a local machine's repo, here's one way. You need admin rights on the central server to do this.
e.g. I'm developing on windows, and my central repository is running on linux and served by lighttpd per the official guide. The server's central repo directory is /var/hg/repos/, owned by the user/group www-data. My local machine's IP is 10.1.10.100, and the repository I want to clone is named foo.
On the local machine, open a command prompt in the repository directory and type hg serve. This runs the local hg web server (on port 8000 by default), which will allow the central server to pull from it.
ssh into the central repo server, logging in as a user with sudo rights to www-data.
cd /var/hg/repos
sudo -u www-data hg clone http://10.1.10.100:8000 foo
For those who come later and don't want to bother with the hassle of ssh for pushing changes to a server built to host repos: you can just init on the server, and then push as you do to every other repo.
# on server:
cd repos/
mkdir myrepo
cd myrepo
hg init
cd ..
chown -R apache:apache myrepo
cd ..
vim hgweb.config
# change [paths]
[paths]
myrepo = /path/to/myrepo
# on your machine
# make sure you've configured hgrc correctly
[paths]
default = http://server/hg/repos/myrepo
hg push
# ???
# profit
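One extra detail when pushing over plain HTTP: the served repo normally has to allow it. A minimal sketch for the server side, added to myrepo/.hg/hgrc (whether you need push_ssl = false and how wide to open allow_push depend on your setup):
[web]
allow_push = * # or a comma-separated list of allowed user names
push_ssl = false # only if the server is serving plain http rather than https
Without something like this, hg push is usually rejected with an authorization error.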