I have just started working on a web project that uses the Mercurial version control system with a Bitbucket account.
The web project is hosted on a third-party server (WebFaction).
I have followed all the tutorials on the Mercurial site.
The tutorials state that a repository should be created on the local PC, changes made to the code in that local repository, and then the changes added, committed, and pushed to the Bitbucket account.
But my project is hosted on a server (WebFaction), so all the code changes need to happen on the server, where I can see that they work.
I cannot find a reference to changing the code on the WebFaction server (rather than on the local PC) and then committing and pushing it from the WebFaction server to the Bitbucket account. I simply don't know how to do this (or even if it can be done!).
Can someone give me the steps and syntax (as much as possible) to do this? Could you also keep the answers as simple as possible as there are huge parts of Mercurial I don't yet understand.
Thanks.
Assuming you have full SSH access to the WebFaction server (you should, according to the WebFaction features page), I suggest you try following the detailed instructions found here. If you get stuck on any step, then you can ask a more specific question (probably better to ask on serverfault, though).
The fact that the repository is on a remote server does not really change anything. You connect through SSH to the remote server (WebFaction) and you follow the steps as if it were a local machine.
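For example, the whole cycle on the server might look like this (the account name, host, project path, and Bitbucket URL below are placeholders; substitute your own):

    ssh myuser@web123.webfaction.com                  # log in to the WebFaction server
    cd ~/webapps/myproject                            # go to the directory that holds the project code
    hg init                                           # turn it into a Mercurial repository (skip if it already is one)
    hg add                                            # track the existing files
    hg commit -m "First commit from the server"
    hg push https://bitbucket.org/myuser/myproject    # push to the (already created) repository on Bitbucket

After that, the normal loop is: edit files on the server, hg add any new files, hg commit, and hg push.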
Related
We are looking for a way to use GitHub on an internal system that we are developing at work. We have developed it in PHP and MySQL, with a fair bit of jQuery/Ajax, on a Windows Server VM running IIS. Other staff can access the frontend over the network using the IP address.
There are currently three people working on it, and at the moment we edit the files directly on the VM because we need it to keep communicating with the database so we can check that our changes have worked. There is no option to install anything like WAMP on our individual machines, and the usual group policy restrictions mean the only access we have to a database is via the VM. We have been working with copies of files/folders and the database, but there is always the risk that merging these later would be a massive task.
I do use GitHub at home (mainly the desktop client, but I can just about get by with the command line as long as I have a list of the commands in front of me) to sync between my PC and laptop via GitHub.com, and I believe that the issues we get with several people needing to update the same file would be eradicated by using it here at work.
However, there are some queries we need to ensure we have straight in our heads before putting forward a request.
Is what we are asking for viable? Can several branches on the same server be worked on at the same time, or would this only work on an individual machine?
Given that our network is fairly restricted, is there any way that we can work on the files on our own machines and connect to the VM-hosted database? I believe that an IDE will allow us to run PHP files on a standard machine (although a request for Eclipse is now around six weeks old and there is still no confirmation that we will get it any time soon), but will this also allow us to reach the database on the VM?
The stuff we do is not overly sensitive, but the company would certainly not want what we do out there in a public repository (and would also not be likely to pay for a premium GitHub account), so we would need to branch/pull/merge directly from our machines to the VM.
Does anyone have any advice/suggestions/solutions on this? Although GitHub would be the preferred option as I already use it, we are open to any suggestion that will allow three people, on different machines, to work simultaneously on a central system while ensuring that we do not overwrite or affect each other's changes.
Setting up a Git repo on Windows is not trivial and may require a fair bit of work. You could try SVN instead: it is fairly straightforward to install on Windows and has a gentler learning curve than Git. I am not saying SVN is better or worse than Git, just that it may be better suited to your needs. We have a similar setup and we use TortoiseSVN (https://subversion.apache.org/) as a client. SVN also has branches and the other features you would expect.
SVN for server side repository https://subversion.apache.org/
If you would still prefer Git on windows, check this out - https://www.linkedin.com/pulse/step-guide-setup-secure-git-remote-repository-windows-nivedan-bamal
1) It is possible to work on many branches and then merge them into a single branch. That's the preferred way of working in Git. You can do the same in SVN.
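As a minimal sketch of that branch-and-merge workflow in Git (the branch name feature-login is just an example, and this assumes your main branch is called master):

    git checkout -b feature-login      # create and switch to a topic branch
    # ...edit files...
    git add .
    git commit -m "Add login form"
    git checkout master                # go back to the main branch
    git merge feature-login            # fold the topic branch into it

The SVN equivalents are svn copy to create a branch and svn merge to bring it back.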
Forgive me if this question sounds ridiculously naive, but I don't seem to be able to find a straightforward answer to this anywhere.
I am considering using Mercurial for source control on a small (2-3 developers) project. I like the idea of not having to subscribe to a central repository, and I like that everyone effectively has a complete copy of the project. What I don't understand is how the Mercurial clients communicate changes to each other. Does it require opening a specific port or something similar?
Any pointers to help on Mercurial or comments from people who have used it would be gratefully received.
Mercurial can access remote repositories in a number of ways. The main ones are:
File access (i.e. look at the repo at this path)
HTTP / HTTPS (i.e. look at the repo being served at this web address)
SSH (i.e. log into this machine / user, and look at the repo at this path)
The first only really works if your team is all working on a shared file-system, but is also useful if you personally have multiple clones that you want to move information between.
HTTP is how you would access something like Bitbucket, or some other "available to all" server. You can set up a temporary server on a machine with the command hg serve, and that's useful if you need to share with a colleague quickly and don't care about security, but normally HTTP Mercurial servers sit behind Apache or some other web server software.
An SSH-based setup is just a machine running an SSH server which has Mercurial installed. Mercurial logs on to the machine and invokes hg remotely. The local hg and the remote hg then talk over the link. Very easy to set up, in my opinion. Credentials are all handled in the normal SSH ways.
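To make that concrete, the three kinds of remote look like this on the command line (the hosts and paths here are only examples):

    hg clone /shared/projects/repo                        # file access: a path on a shared file-system
    hg clone https://bitbucket.org/someuser/somerepo      # HTTP/HTTPS: a repo served over the web
    hg serve -p 8000                                      # quick, throw-away HTTP server for the current repo
    hg clone ssh://someuser@server.example.com//home/someuser/repo   # SSH: remote hg invoked over the link

hg push and hg pull accept the same kinds of URL.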
Further reading: Publishing Repositories
I've read similar questions here and elsewhere. This is not intended to be a duplicate, but I haven't found the answer.
I'm trying to ask a very particular question, so please don't mark this as duplicate unless you can point me somewhere with a very specific correct answer.
I'm running CentOS 6 and I have Mercurial 1.9 installed as our Mercurial Server.
I can add repositories, and I can clone, commit changes, and push back to the server with no problems as long as I don't try to use SSL.
The apache website is configured with a self signed SSL cert (I am aware of the pros and cons around self signed SSL certs, but we have made the decision to use one unless it is technically impossible).
Our client machines are Windows 7 with TortoiseHG 2.1.4 installed. In Visual Studio 2010 I'm using "Mercurial Source Control Package".
What I would like to do is make a configuration change on the server, at either the server level or the repository level, that would allow a self-signed certificate.
Per-client machine changes are burdensome because even after I update everyone's machine, the next time I have to set up a new client I have to have these changes documented and remember to go back through the steps.
I've tried the hostfingerprints option but I haven't been able to get it to work. I'm not sure if this is supposed to work as a server configuration or if I'm putting the setting in the correct file or what.
As a side note, I finally found how to turn on --insecure through the TortoiseHG UI (clicking the lock icon), but it looks like the visual studio source control provider doesn't have an option (at least that I can find).
I'm not a Linux expert (but I have access to experts if needed) so please be verbose in your explanations.
Everyone in our organization is an HG novice.
As a last resort, we may just get an SSL cert.
Jamie F is correct, but I'll put it down here since s/he didn't. There is nothing a server can do to tell a client to trust it -- there would be little point in that. You need to either configure your clients or use a certificate signed by a CA that your client systems already trust.
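If you go the client-configuration route, the [hostfingerprints] setting the question mentions is the usual place for this. A rough sketch (hg.example.com stands in for your server, and the fingerprint is whatever openssl reports for your certificate):

    # get the certificate's SHA-1 fingerprint, from any machine with openssl:
    openssl s_client -connect hg.example.com:443 < /dev/null 2>/dev/null | openssl x509 -noout -fingerprint -sha1

    # then add it to each client's mercurial.ini (or ~/.hgrc):
    [hostfingerprints]
    hg.example.com = <paste the colon-separated fingerprint reported above>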
I am learning how to install Mercurial for our team, but I am not experienced enough to make some of these decisions.
For our team, we have a server machine used as a repository. Every team member also has her/his own machine with Red Hat Linux installed. However, we do not do anything on our local terminals; we do everything on the server. Every member has a user directory on the server, such as /home/Cassie, /home/john, ..., and we save all our code and work there. When we turn on the local terminals, the GNOME system shows our personal files from the server, not the local machine. Whenever anyone clicks the terminal application on the desktop, it connects to her own home directory on the server, so we do not need to use an SSH command to connect. It is like a school multi-user system: everyone has a user account and logs into her own account to do her own work. I hope I can set up a shared repository on that server so that everyone can run push, pull, and other commands there.
1) Since we use a shared environment, does it mean that I need to install Mercurial on only the server and that is enough for everyone to do "commit", "push", "pull", etc. commands?
2) By installing only system-wide Mercurial, does it eliminate the ability to do local commit? If I would like to let everyone still have the "local commit" ability, how should I do it?
3) I have searched online. Some people mentioned that for a shared network server, it is impossible to have locks for any two users if they are trying to access the same file at the same time. Does this apply to my situation?
In sum, we do all our work on the server. I am hoping to set up Mercurial so that a repository shared by everyone lives on the server, everyone still has the ability to commit locally, and the repository has some lock protection if two users try to access a file at the same time. If this scenario is feasible, can I just install Mercurial on the server, or do I need to install it on both the server and the users' machines? If the scenario is impossible, could someone please suggest a plan for version control on our system?
1) Since we use a shared environment, does it mean that I just need to install Mercurial on the server and it is enough for everyone to do "commit", "push", "pull", etc. commands?
If your users are logging into a shell on the server in order to do their work, then yes it is sufficient to have Mercurial installed only on the server.
2) By installing only system-wide Mercurial, does it eliminate the ability to do local commit? If I would like to let everyone still have the "local commit" ability, how should I do it?
Your users will presumably check out from a shared "root" repository into their own home directories in order to work on the code. Each of them will have a "local" copy of the repo in their home directory and will push into the shared root repository.
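A rough sketch of that layout (the path of the shared root repository is just an example):

    # one-time setup of the shared root repository on the server:
    hg init /home/shared/project

    # each user, logged into the server, works in her own home directory:
    hg clone /home/shared/project ~/project
    cd ~/project
    # ...edit code, committing locally as often as wanted...
    hg commit -m "Describe the change"
    # ...and publish to the shared root repository when ready:
    hg push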
3) I have searched online. Some people mentioned that for a shared network server, it is impossible to have locks for any two users if they are trying to access the same file at the same time. Does this apply to my situation?
As long as your users are working within their own local copies of the repo, they will not interfere with one another. The only time a conflict may arise is when committing back to the shared root repository -- in which case the user will need to merge their changes and resolve any conflicts.
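In practice that looks something like this:

    hg push                  # may be refused if a colleague has pushed new changesets first
    hg pull                  # fetch their changesets
    hg merge                 # combine them with yours, resolving any file conflicts
    hg commit -m "Merge"
    hg push                  # now succeeds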
I would recommend reading carefully through Joel Spolsky's excellent Hg Init tutorial for a better understanding of how Mercurial handles "central" and "local" copies.
I would like to host my own version control on the server I already pay for. I don't have shell access, but I can use ftp (obviously) and mysql. Are there any version control solutions that can run with only these?
I don't think Subversion or CVS will work without software installation. Can you mount your account as a file share/network drive? If so, Mercurial would work. You could keep your master repo in that folder and clone it onto your local hard drive for real work. If you're just looking for a remote repository solution, you might be able to use Mercurial or Git with DropBox in the same way.
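As a sketch, assuming the hosting account can be mounted at /mnt/host (the mount point and repository names here are made up):

    hg init /mnt/host/project              # master repository on the mounted share
    hg clone /mnt/host/project ~/project   # working clone on the local hard drive
    cd ~/project
    hg commit -m "Describe the change"     # work and commit locally
    hg push                                # push back to the master repo on the share

The same pattern works with a Dropbox folder in place of the mounted share.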
This is just my opinion, but if any provider ever told me they don't support SSH because of security I would immediately cancel my account and get a new provider. Let me guess: they only do Windows server hosting?
If you need to keep your project private you could always use a paid GitHub or BitBucket account as well, but that doesn't solve the problem of hosting on your existing account.
Straight out of git's man page:
Git natively supports ssh, git, http, https, ftp, ftps, and rsync protocols.
Rsync could handle deployment: keep a local folder that represents what you want on the host and use rsync to copy it there. Note that rsync needs a local path, SSH, or an rsync daemon on the receiving end, so with FTP-only access you would have to mount the account as a local folder first.
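For example, assuming the FTP account can be mounted as a local folder (curlftpfs is one way to do that; the credentials and paths below are placeholders):

    curlftpfs ftp://user:password@ftp.example.com /mnt/host   # mount the FTP account locally
    rsync -av --exclude '.git' ./ /mnt/host/site/             # mirror the working folder to the host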