Is it possible to use Jenkins with Mercurial in a way that a job will have the mercurial repository URL as a parameter?
This will allow us to use a single job that can clone and build different repositories. This build is for testing only. An official build and release will be from a constant repository.
On the Job Configuration page, check the 'This build is parameterized' option.
You should be able to define a named parameter for your repository and then reference that in the build configuration further down. In my setup, I do this to allow named branches in Mercurial to be built.
You'll have to manually specify the parameter when you trigger the build.
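As a minimal sketch of a shell build step, assuming a string parameter named REPO_URL (the name is hypothetical); Jenkins exposes build parameters as environment variables, so the step can use it directly:
hg clone "$REPO_URL" source
cd source
# run the test build commands here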
Docker Hub's Automated Build documentation talks about setting up builds based on branches and tags, but does not mention Mercurial bookmarks or how they interact with the build configuration.
Can you configure automated builds based on Mercurial bookmarks, and have bookmark pushes trigger the builds?
Short answer: Configure the bookmark names as if they were tags, and it should work.
Longer answer: From some experimentation, as of Sep. 2016, it seems that bookmark names actually match against both tag and branch builds configured in Docker Hub. However, when pushing a bookmark update, only tag builds matching the name will get triggered automatically; branch builds won't get triggered automatically, but can still be triggered manually.
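For instance, assuming a bookmark named release (a hypothetical name), pushing it explicitly looks like this; if Docker Hub has a tag build configured for the name release, that push should trigger it:
hg bookmark release
hg push -B release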
I want to use Redmine to project-manage a Mercurial repository and track issues.
The workflow I want to use for the Mercurial repository is branch based, i.e. people are welcome to create branches and push their branches to the designated server. However, I want to restrict access to the default (trunk) branch, so that all changes are merged by me into the default (trunk) branch.
This will allow control over things like code review for each branch before merging it into the default (trunk) branch.
Is there any way to use Redmine to manage permissions and access to Mercurial repositories?
I think I'm looking to do something like gitflow.
Redmine only allows you to work on a local clone of the repository, as explained in their official guide (Redmine Official guide#Mercurial-repository).
So I think you can't restrict access to the repository on the web via Redmine.
How would one make a job in Jenkins that polls source control (i.e. Mercurial) so that new changesets trigger it to execute the job, but without actually cloning/pulling the monitored repo?
If it already has a local clone and you just don't want to update it, you can run hg incoming, whose exit code lets you know if there's new stuff. If you don't have a local clone, you'll need to run something like hgweb on the box that's serving the repo and then poll the raw version of the latest commit and watch for changes: http://hg.intevation.org/mercurial/crew/raw-rev/tip
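A rough sketch of both approaches (the repository URL is hypothetical):
# with a local clone: hg incoming exits 0 when there are new changesets, 1 when there are none
hg incoming --quiet > /dev/null && echo "new changesets"
# without a local clone: poll the raw tip revision over hgweb and compare it to the last poll
curl -s http://hg.example.com/repo/raw-rev/tip > tip.new
cmp -s tip.new tip.last || echo "new changesets"
mv -f tip.new tip.last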
This link Is there a way to keep Hudson / Jenkins configuration files in source control? shows how to save Hudson configuration changes to an SCM (i.e. a "backup with history")
My question is: can the Hudson configuration be pulled from an SCM? In other words, to change a job configuration, you add a changeset to the SCM repository first. Hudson, at the start of a build, pulls the configuration from the SCM and runs as usual.
Of course, it would also be ideal to make the entire job configuration screen read-only (or as minimal as possible).
Why would I want this?
I want the SCM to be where a configuration change is begun. Why? So the changesets in the SCM reflect when the configuration change was done in the flow of changesets for the project, i.e. it imposes a chronological ordering on the project's changes.
I don't want to use the security feature (i.e. no need for a login, etc.).
I searched and could only find plugins for backing up or saving the configuration, but none that "pulled" the .xml files.
Thanks,
John
I haven't tried it myself, but you might be able to do this with a custom build that does the following on a schedule:
Sync all of the job configuration files from your SCM into the Hudson jobs directory
Do an HTTP GET to [Your Hudson URL]/reload - this is the equivalent of clicking the "Reload Configuration from Disk" link on the "Manage Hudson" page.
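A rough sketch of those two steps as a scheduled script (the clone path, jobs directory, and Hudson URL are all hypothetical):
# 1. sync job configuration files from an SCM clone into the Hudson jobs directory
hg pull -u -R /opt/hudson-config
cp -r /opt/hudson-config/jobs/. /var/lib/hudson/jobs/
# 2. ask Hudson to reload its configuration from disk
curl http://localhost:8080/reload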
I don't think you could have each job update its own configuration from SCM every time it runs, because the configuration will have already been loaded by the time the job polls the SCM for changes.
I use the on-demand (hosted) version of FogBugz. I would like to start using Mercurial for source control. I would like to integrate FogBugz and a BitBucket repository.
I gave it a bit of a try but things weren't going very well.
FogBugz requires that you hook up your Mercurial client to a fogbugz.py Python script. TortoiseHg doesn't seem to have the hgext directory that they refer to in the instructions.
So has anyone successfully done something similar?
Post-mortem:
Bitbucket now has native FogBugz support, as well as other post-back services.
http://www.bitbucket.org/help/service-integration/
From the sounds of it, you want to run the hook on your local machine. The hook and directions are intended for use on the central server.
If you are the only one working in your repository, or don't mind commits not showing up in FB until after you do a pull, then you can add the hook locally to your primary clone. If you are using your primary clone, then you need to do something slightly different from what they say here:
http://bugs.movabletype.org/help/topics/sourcecontrol/setup/Mercurial.html
You can put your fogbugz.py anywhere you want; just add a path line to the [fogbugz] section of that repository's hgrc file:
[fogbugz]
path=C:\Program Files\TortoiseHg\scripts\fogbugz.py
Just make sure you have Python installed. You may also wish to add a commit hook so that local commits to the repository also get into FB.
[hooks]
commit=python:hgext.fogbugz.hook
incoming=python:hgext.fogbugz.hook
On the FogBugz install, you will want to put the following in for your logs URL:
^REPO/log/^R2/^FILE
and the following for your diff URL:
^REPO/diff/^R2/^FILE
When the hook script runs, it connects to your FB install and sends it a few parameters. These parameters are stored in the DB and used to generate URLs for diffs and log information. The script sends the URL of the repo, which is the baseurl setting in the [web] section. You want this URL to be the URL of your Bitbucket repository; it will be used to replace ^REPO in the URL templates above. The hook script also passes the revision ID and the file name to FB, which replace ^R2 and ^FILE. So, in summary, this is the stuff you want to add to the hgrc file in your .hg directory:
[extensions]
hgext.fogbugz=
[fogbugz]
path=C:\Program Files\TortoiseHg\scripts\fogbugz.py
host=https://<YOURACCOUNT>.fogbugz.com/
script=cvsSubmit.asp
[hooks]
commit=python:hgext.fogbugz.hook
incoming=python:hgext.fogbugz.hook
[web]
baseurl=http://www.bitbucket.org/<YOURBITBUCKETACCOUNT>/<YOURPROJECT>/
One thing to remember is that FB may get notified of a checkin before you actually push those changes to Bitbucket. If this is the case, do a push and things will work.
EDIT: added section about the FB server and the summary.
Just a heads-up: Fog Creek has released Kiln which provides Mercurial hosting that's tightly integrated with FogBugz and doesn't require any configuration.
I normally wouldn't "advertise" on Stack Overflow (disclaimer: I'm one of the Kiln devs), but I feel that this directly answers the original question.
It is possible to integrate your Git Bitbucket repository with the FogBugz issue tracker, but unfortunately it is not properly documented.
You have to follow the steps described at https://confluence.atlassian.com/display/BITBUCKET/FogBugz+Service+Management, but beware:
In the CVSSubmit URL you need to put the URL WITHOUT the "?ixBug=bugID&sFile=file&sPrev=x&sNew=y&ixRepository=" parameters.
It should just be "https://your_repo.fogbugz.com/cvsSubmit.asp".
You will need to mention your FogBugz case ID in the git commit message by putting a "BugzID: ID" string in it (this is not documented anywhere :-( ), similar to this:
git commit -m "This is a superb commit which solves case BugzID: 42"
Of course, commit info will be sent to FogBugz after you push your commit to the Bitbucket server, not after you do a local commit.
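So after the commit above, the case won't update until you push, for example (the remote and branch names are hypothetical):
git push origin master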