I would like to be able to present a view of Jenkins builds similar to the Buildbot console view. With Jenkins out of the box, there appears to be no good way to associate a commit with a build. You have to access the specific build to determine what commit it was building.
I would like to be able to show status on what commits have been tested in a particular branch, so we know if a commit was skipped or if the latest commit has not yet been tested.
I tried using the Jenkins API for this, but I found that I could only see the SHA1 hash for a Git commit via the build itself, i.e. via http://server/job/job-name/388/api/json. So the only way I can see to take a commit and find builds for it is to iterate through every build in a job and retrieve its associated build info. That is certainly not going to be efficient or fast. Is there another way to do it?
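For reference, a rough sketch of that brute-force iteration over the JSON API (the job URL is an example, and the use of the tree filter and the SHA1 field recorded by the Git plugin are assumptions, not part of the original question):

```sh
# List build numbers via the tree filter, then fetch each build's JSON
# and grep out the SHA1 recorded by the Git plugin.
JOB_URL="http://server/job/job-name"
for n in $(curl -s "$JOB_URL/api/json?tree=builds[number]" \
             | grep -o '"number":[0-9]*' | cut -d: -f2); do
  sha=$(curl -s "$JOB_URL/$n/api/json" \
          | grep -o '"SHA1":"[0-9a-f]*"' | head -n1 | cut -d'"' -f4)
  echo "build $n -> $sha"
done
```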
Imperfect Answer: put the "revision number" you care about in the package name of all related artifacts, and use the "fingerprint" feature.
For example: my "product package" artifacts have a revision number, and if I carried that through to the "test package" artifact (which includes the unpacked product artifact) you would be able to track that revision number via the "artifact/fingerprint" feature, and show which test jobs used it. Below, you can't tell with a single click which test used which "commit."
Related
We use Jenkins for doing incremental builds of our project on each commit to the SCM. We would like to get a separate build for every single commit. However, the naive approach (set up SCM and use post-commit hooks to trigger a build) exhibits a problem in the following scenario:
Build is triggered.
While build takes place (it can take up to several minutes) two separate commits to the SCM are made by two developers.
One new build is triggered. It receives changes from both of the commits made during the previous build.
This "race condition" complicates finding which one of the commits has broken the build/introduced warnings.
The currently employed solution is checking for changes in one job ("scheduler job") and triggering another job to do the actual checkout and build.
Are there any proper solutions to this problem?
Not yet, there's a Feature Request covering this kind of build, but it's still open: Issue 673
Maybe it misses the point, but we have a pretty nice build process running here.
We use Git as our source control system
We use Gerrit as our review tool
We use the Gerrit trigger to run builds on our Jenkins server
We check for changes on the develop branch, so Jenkins runs when a changeset is merged
In short, the ideal developer day looks like this:
Developer 1 starts a new branch for his changes, based on our main develop branch
Developer 1 commits as often as he likes
When developer 1 thinks he has finished his job, he combines his changes into one change and pushes it to Gerrit
A new Gerrit change is created and Jenkins tries to build exactly this change
When there are no errors during the build, a review is made on this change
When the review is submitted, the changeset is merged into the develop branch of the main repository (no change is merged into the develop branch without review)
Jenkins builds the merged version to be sure that there are no merge errors
Now developer 2 joins the party and tries to do some work
The process is exactly the same: both start working in their branches. Developer 1 is faster and his changes are merged into the develop branch. Now, before developer 2 can publish his changes, he has to rebase them on top of the changes made by developer 1 (typical commands are sketched below).
So we are sure that the build process is triggered for every change made to our codebase.
We use this for our C# development, on Windows, not on Linux.
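For readers unfamiliar with Gerrit, the push and rebase steps above usually boil down to a few Git commands (branch and commit names here are examples, not part of the original answer):

```sh
git checkout -b my-feature develop       # developer 1 starts a branch off develop
git commit -a -m "work in progress"      # commit as often as you like
git rebase -i develop                    # squash the work into one change for review
git push origin HEAD:refs/for/develop    # push the change to Gerrit for review
# once developer 1's change is merged, developer 2 rebases before pushing:
git fetch origin && git rebase origin/develop
```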
I don't believe what you'd like to do is possible. The "quiet period" mentioned by Daniel Kutik is actually used to tell Hudson/Jenkins how much time to wait, in order to allow other commits to the same project to be picked up. Meaning -- if you set this value to 60 seconds and you've made a commit, it will wait for a minute before starting a new build, allowing time for other commits to be picked up as well (during that one minute).
If you use the rule "NO COMMIT on a broken build" and take it to its logical conclusion, you actually end up with "no commit on a broken build or a build in progress", in which case the problem you describe goes away.
Let me explain. If you have two developers working on the same project and both of them try to commit (or push if you're using a DVCS), one of them is going to succeed and the other will fail and need to update before committing.
The developer who had to do the update knows from the commit history that the other commit was recent, and thus a build is in progress (even if it hasn't been checked out yet). They don't know whether that build is broken or not, so the only safe option is to wait and see.
The only thing that would stop you from using the above approach is if the build takes too long, in which case you might find that your developers never get a chance to commit (it's always building). That is then a driver to split your build into a pipeline of multiple steps, so that the post-commit job takes no more than 5 minutes, ideally 1 minute.
I think what might help is to set the Quiet Period (Jenkins > Manage Jenkins > Configure System) to 0 and the SCM polling interval to a very short time. But even during that short interval there could be two commits. As of now, Jenkins does not have a feature to split a build into single builds for multiple SVN commits.
Here is a tutorial about that topic: Quiet Period Feature.
As pointed out by someone in Issue 673, you could try starting a parametrized build with the parameter being the actual git commit you want to build, in combination with a VCS commit hook.
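A minimal sketch of such a hook, assuming a parameterized Jenkins job with a string parameter named COMMIT, remote triggering enabled with a token, and the job's Git branch specifier set to that parameter (none of these names come from the original answer):

```sh
#!/bin/sh
# git post-receive hook: trigger one parameterized build per pushed commit
while read oldrev newrev refname; do
  for rev in $(git rev-list "$oldrev..$newrev"); do
    curl -s "http://server/job/job-name/buildWithParameters?token=TOKEN&COMMIT=$rev"
  done
done
```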
How do we track older builds that were triggered because of a check-in, in Hudson?
For example, I build a certain job periodically and that is fine, but I want to look only at older builds that had an SCM change.
Example: currently I am on the 200th build and I want to see whether an older build, say 76, had a change in SCM. I should not have to open the 76th build's console and look manually for changes. Is there any way to determine, just from a particular build number, whether that build had any change?
If you want to browse the changes associated with recent builds, there's a link on the project's sidebar to Changes. The URL should be http://my-hudson-server/job/[job name]/changes. It will show the list of builds that Hudson has a record of and the change summary for each build.
If you know the build number (and Hudson is keeping track of enough previous builds), then you can use the URL http://my-hudson-server/job/[job name]/[build number]/changes.
If you try to access a build that's too old and Hudson has gotten rid of, you'll get a 404 (not found) error.
I'm working on setting up a Hudson/Mercurial stack for development. One of the use cases I have is "As a developer, I want to update my local sandbox to a particular build number from Hudson, so I can [fix a bug, debug issues, create a branched version of code, etc.]."
So, if I see build #49 on Hudson, how do I update my local Mercurial repo to the same source code that was used to build #49?
Note: I have looked at Mercurial tags, however they don't seem quite appropriate. They require a commit, so it seems the commits will dirty up the history (each commit by a developer will show a subsequent commit from the tag operation). If this is the best there is, I guess I will have to live with it, but hoping for something better. Would probably still use tags for releases.
Ok, here's the solution I ended up with:
Using the Description Setter Plugin, I set both the success and failing build description to "Mercurial ${MERCURIAL_REVISION}". Turns out the current Mercurial SCM plugin sets this environment variable to the parent changeset id.
I can then look at a build on Hudson and, if so desired, grab the changeset id and do an "hg update <changeset id>" on my local repo to get that revision of the code.
Note that in the Mercurial plugin issue tracker there is some talk of changing this to HG_REVISION instead and adding other environment variables, so this may break at some point in the future, but works for me for now.
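In practice that last step is just the following (the changeset id below is a hypothetical example standing in for whatever the build description shows):

```sh
# changeset id copied from the Hudson build description "Mercurial ${MERCURIAL_REVISION}"
CHANGESET=2dbf7575fa46        # hypothetical example value
hg update -r "$CHANGESET"     # bring the local sandbox to the code used for that build
```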
You can use the keywords extension on the Hudson system to get the node id into some aspect of the build, possibly including the artifact names. If the Hudson job's output artifacts are named like myproject-2010-02-17-2dbf7575fa46.tar.gz, you certainly know how to 'hg update' to that point in time.
The keywords extension and maybe a little ant-fu make that easy to do.
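As a simplified sketch of the same idea (using hg id in the build script rather than the keywords extension; file and directory names are examples):

```sh
node=$(hg id -i)                               # short changeset id of the checkout
stamp=$(date +%Y-%m-%d)
tar czf "myproject-$stamp-$node.tar.gz" dist/  # e.g. myproject-2010-02-17-2dbf7575fa46.tar.gz
# later, to reproduce exactly that source tree:
hg update -r "$node"
```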
We have the following requirements for our Hudson setup:
We would like to directly link to all builds that have been executed
The effective number of artifacts should be limited
It is possible to limit the maximum number of builds per job in Hudson (see this question). This option effectively removes old artifacts. The problem is that it also removes all other information related to the build.
Is there a way to retain linking directly to completed builds via http://${hudson}/job/${jobname}/${buildnumber}, even if artifacts were removed? Sometimes it may be good to commit fixes and link to the corresponding build error.
There's a checkbox under the 'advanced' button when configuring 'archive the artifacts' that allows you to delete all but the most recent artifacts. The build history is retained, but the artifacts are deleted.
There is an open issue for keeping the artifacts from the last N builds - see issue 834
I have a project that is combining multiple hg repositories (different components) to build a single application. I'm looking for a cross-platform tool to support performing an operation on multiple repos at the same time (e.g. tag, pull, push, commit etc...) Essentially, I'm looking for the 'repo' script that Google wrote for Android, but for hg instead of git:
http://source.android.com/download/using-repo
I searched on stack overflow and found this:
mercurial windows batch file for pulling changes to multiple repositories
But it's still a bit manual and Windows-only. I know it's not that hard to write a script that either passes a command to the repos or encapsulates everything, but I thought it might be a common enough problem that others already have a solution. I suppose one approach would be to port the repo script to hg (find-and-replace git with hg would probably get pretty far for simple operations).
What do other people do in this situation?
Definitely look at the new (in version 1.3) subrepositories feature in Mercurial. It lets you have an overarching repository that contains other repositories. The state of the top level includes a file that specifies the hash of the tip of the sub-repos, so you can effectively specify a single hash node id that encompasses the state of all the subordinate repos.
https://www.mercurial-scm.org/wiki/subrepos
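A minimal sketch of setting up such a top-level repository (repository names and URLs here are made up for illustration):

```sh
hg init superproject && cd superproject
hg clone http://hg.example.com/component-a component-a
hg clone http://hg.example.com/component-b component-b
cat > .hgsub <<'EOF'
component-a = http://hg.example.com/component-a
component-b = http://hg.example.com/component-b
EOF
hg add .hgsub
hg commit -m "track component-a and component-b as subrepos"
# the commit also records each subrepo's current changeset in .hgsubstate,
# so one changeset of the super repo pins the state of every component
```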
I have been in a similar scenario for the past year, working daily with a project spread over 6 repositories, and being the one responsible for most branching/merging stuff etc.
At some point I found a simple bash script for checking multiple SVN repos. I adapted it to Mercurial and have extended it over time. Right now I can't recall or google the original source of the script, so I unfortunately can't credit the original author (I will credit them in my script if I find the info later).
My "hgall" script is published on GitHub at https://github.com/JKrag/hgall
It is made for my own use, and thus directly references a few extensions that I have been using, but this can easily be edited out (or just don't use the commands that call extensions you haven't installed).
The script is documented in the README, including my personal "tweak" of having it (by default) ignore repos whose names start with an underscore, as I use these as temporary clones to test messy operations (e.g. large merges).
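The script itself lives at the GitHub link above; just to illustrate the idea, a stripped-down loop of the same kind (not the actual hgall code) might look like this:

```sh
#!/bin/sh
# run an hg command (e.g. "incoming", "status", "pull -u") in every
# repository below the current directory, skipping names starting with "_"
for d in */; do
  case "$d" in _*) continue ;; esac     # ignore temporary clones
  [ -d "$d/.hg" ] || continue           # only real Mercurial repositories
  echo "=== ${d%/} ==="
  (cd "$d" && hg "$@")                  # usage: ./hgall.sh incoming
done
```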
I hope this script is useful to some of you - it has been of great help to me over the last year. Our company has just moved to Git, so I will probably not be making many updates in the near future, but might possibly be porting it to Git some day....
Additionally, Iain Lowe has an extension https://bitbucket.org/ilowe/multirepo/src which can manage multiple repositories.