Jenkins build outside of workspace - hudson

I am new to Jenkins/Hudson and am trying to migrate a C make-based project from buildbot. For legacy reasons, the build system is hard-coded to build outside of the versioned source tree (git), one directory above, in a separate directory. E.g.:
workspace/
    .git/
    foo/
    bar/
build/
    artifacts/
Besides the fact that it ends up creating a directory outside the workspace, Jenkins won't recognize items in the build/ directory above to archive as artifacts.
How can I make this kind of build system work with Hudson? Building in-source-tree is not a short-term option. The only option I found was "use custom workspace," but all this does is hard-code the workspace directory to some other directory.

To answer my own question: there is indeed an option in Jenkins git plugin to check out to a local subdirectory instead of the root of the workspace. With the git plugin, click on the Advanced button and fill in the field "Local subdirectory for repo (optional)".
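For Pipeline jobs, a rough equivalent is the RelativeTargetDirectory extension of the git checkout step; a minimal sketch, where the repository URL, branch, and subdirectory name are placeholders:

checkout([$class: 'GitSCM',
          branches: [[name: '*/master']],
          userRemoteConfigs: [[url: 'https://example.com/your-repo.git']],
          // Check out into a subdirectory so that ../build still lands inside the workspace
          extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'src']]])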

I can't find the option that djs mentioned, but you can specify a different work directory:
Configure job
Extended Project settings
Use custom work space
This can be set to anywhere you want, including the workspace of a different job.
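For Pipeline jobs, a roughly equivalent sketch uses customWorkspace on the agent; the label, path, and build command below are placeholders:

pipeline {
    agent {
        node {
            label 'linux'
            // Build in an explicit directory instead of the auto-allocated workspace
            customWorkspace '/data/jenkins/custom-workspace/my-project'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'make'
            }
        }
    }
}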

Can I use opam to make a package out of a local file and install it?

I'm new to opam and trying to figure out how to use it properly. For a class, I want to set up students with an environment that has some custom packages installed. (The package will consist of some raw .ml files that I got from a colleague at another school; the files are on their github but there's no .opam file that I can see, and as far as I know they're not in any official package release.)
Can I somehow call these local .ml files a package and ask opam to install it? Do the files have to be on github first, and if so can I use my colleague's existing repository as the source? I don't want to make any of this public, since it is not my own work; I just want to configure my local environment so that the code in the files can be included easily as a package. Basically I don't know the best way to proceed so I'm happy for any advice.
You can add a custom opam file in the base directory of the project. See the documentation for how to create that file.
Then you can enter opam pin add . in the base directory and your project will be installed as if it was an opam package. Check opam pin --help for more info (you can also pin to a remote git project for instance).
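As a minimal sketch, assuming the .ml files are wrapped in a dune project (the package name, e-mail, and version bounds below are placeholders), a file named my-course-lib.opam in the project root could look like this; opam pin add my-course-lib . then builds and installs it:

opam-version: "2.0"
synopsis: "Helper modules handed out for the course"
maintainer: "you@example.edu"
authors: ["Original author"]
depends: [
  "ocaml" {>= "4.08"}
  "dune"  {>= "2.0"}
]
build: [
  ["dune" "build" "-p" name "-j" jobs]
]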
Note that though the default repository is hosted on GitHub, this is in no way a requirement for opam. Opam depends on git, but you can absolutely use it with a private git repository. If you want to use your colleague's repository as the source, that is totally doable, though it is often preferable to have the opam file at the root of the directory (you can open a PR on their repository or make your own fork of it on GitHub; the site makes it clear you copied the code).
If pinning is not to your taste, you can also create your own repository though this is probably a bit too heavyweight for your needs.
Good luck!

yii2 - All file and folder permissions are messed up. What should the permissions be for the yii2 framework's directory hierarchy?

I moved the complete yii2 installation from one server to another with the help of FileZilla. Sadly, FileZilla doesn't keep the file permissions by default, and now I'm facing issues with file/directory permissions. I would like to know what the permissions should be for the different directories and files in the yii2 directory hierarchy.
You should not transfer the project this way.
Currently it's the era of version control (especially Git) and Composer.
Once you've created your project locally and put it under version control, you push it to your main repository and then deploy it to the production server.
No need to use Filezilla or something like that.
If your hosting provider limits you in that regard, it's better to switch to another one.
In your current situation, comparing and setting permissions manually can be very tedious; some of the permissions are set during the init command.
So I recommend to deploy it again using version control and Composer instead of struggling with manual permissions setting.
But just in case, I checked the production server: most folder permissions are 0755 and file permissions are 0644. Folders like runtime and assets have 0777 permissions, which are set by the init command as I mentioned above.
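If you do need to repair permissions in place, a rough sketch run from the project root (treat the exact writable directories, e.g. runtime and web/assets, as assumptions that depend on your template):

find . -type d -exec chmod 0755 {} \;
find . -type f -exec chmod 0644 {} \;
chmod -R 0777 runtime web/assets
chmod 0755 yii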
Locally I use Vagrant and pretty much everything here has 0777 permission.

Jenkins projects pointing to same Mercurial repo do not share source

I am using Jenkins for our build server. I have multiple projects using the same Mercurial (Hg) repository and want to avoid each project cloning its own local repo to build from (since the repo is rather large). This is supposed to be possible via Jenkins and the Mercurial plugin.
In my Mercurial plugin configuration I have checked both "Use Repository Caches" and "Use Repository Sharing". In each project, the same repository location (a network location specified via IP address) is listed.
However, each project still seems to want to create a clone of the repository. Any ideas?
In our setup (using Jenkins 1.506), I've defined a custom workspace under the Advanced Project Options for each of my builds, typically at [project]\repo and then build from there into a \build\ folder.
If you define the custom workspace for each Jenkins project to point to the same shared custom workspace using the same source for the repo it will reuse what is already there.
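In Pipeline syntax, a hedged sketch of the same idea is pointing every job's ws() step at one shared path; the path and build command are placeholders:

node {
    // Every job that uses this same path reuses the clone already on disk
    ws('C:\\jenkins\\shared\\myproject\\repo') {
        checkout scm      // the Mercurial checkout configured for this job
        bat 'build.cmd'   // placeholder build step
    }
}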
I've not tested this, but I would assume that under this setup, it is important to prevent concurrent builds from occurring in the same working directory. Bad things would follow.
As a followup question: What is your rationale for not wanting each build to have its own source code?

Can Jenkins store artifacts outside the job directory?

I currently have Jenkins set up with a number of jobs, but it's proving difficult to back up because the artifacts are stored within the job directory. I'd like to back up the job configurations and artifacts separately. I'm sure I remember reading somewhere that Jenkins now has an option to store them outside the job, but I can't find this.
Is there any configuration option that does this while still making the artifacts visible from within the job on the Jenkins interface? (i.e. rather than merely an add-in that copies the artifacts elsewhere)
Go to your jenkins configuration page, e.g.
http://mybuildserver.acme.com/configure
At the top of the configuration page there is a "home directory" setting. Click the "advanced..." button below it.
Now set the "Workspace Root Directory" to e:\jenkins-workspaces\${ITEM_FULL_NAME}, and "Build Record Root Directory" to e:\jenkins-builds\${ITEM_FULL_NAME} or something similar.
Warning: I run Jenkins 2.7.2 and noticed that certain features don't work properly after configuring Jenkins like that. I saw problems with folders and problems with the multi-branch project plugin. Check the status of those issues if you rely on these features.
As you can see here, there are many plugins to deploy artifacts anywhere you want/need, on FTP, CIFS, Confluence, Artifactory.... especially the ArtifactsDeployer that will allow you to make a copy of the artifacts in the Jenkins Home.
Thank you Sam, for your post, which pointed me in the right direction to solve my problem.
I had been searching for a way to create a symlink to the job archive of a build for multibranch projects. Up to now, we used to manually search for the correct folder basename in the filesystem and add it to the Jenkinsfile.
Now, I can simply use
jobOutputFolder = currentBuild.rawBuild.artifactsDir.path
and use that in my script.
If security is a concern, I could implement that as a shared library additionally.
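For context, a minimal sketch of that usage in a Jenkinsfile (accessing rawBuild needs script-security approval, and note the path refers to the Jenkins controller's filesystem):

// Archive directory of the current build on the controller
def jobOutputFolder = currentBuild.rawBuild.artifactsDir.path
echo "Job archive for this build: ${jobOutputFolder}"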
Try the Use Custom Workspace build option. From the Jenkins popup help:
For each job on Jenkins, Jenkins allocates a unique "workspace directory." This is the directory where the code is checked out and builds happen. Normally you should let Jenkins allocate and clean up workspace directories, but in several situations this is problematic, and in such case, this option lets you specify the workspace location manually.
This option is also available under advanced project properties of multi-configuration project builds.
A groovy script under "Prepare an environment for the run" will always run on the master, and this script can create a symlink from the job's default artifacts directory to wherever you really want archiving to go (archive_to below), which SHOULD include the job name and build number:
import java.nio.file.Files
import java.nio.file.Paths
try {
    // createSymbolicLink throws on failure rather than returning a status
    Files.createSymbolicLink(Paths.get(currentBuild.artifactsDir.path),
                             Paths.get(archive_to.getCanonicalPath()))
} catch (IOException e) {
    throw new RuntimeException("Can't create symlink to archive dir", e)
}
Of course (sadly), when old builds are purged by Jenkins, the old artifacts are left behind, because Jenkins will not follow a symlink when purging, even if Jenkins owns both the symlink and the target (shame).
A workaround may be to point a symlink back from the new archive dir; then, when Jenkins purges its archive dir, the new symlink will dangle and a cron job can later delete the new job archive dir.
The Copy Artifact Plugin (https://wiki.jenkins-ci.org/display/JENKINS/Copy+Artifact+Plugin) adds a build step for retrieving files from another project into the current workspace so you can work from there.
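For reference, a hedged sketch of that step in Pipeline form; the job name, filter, and target directory are placeholders:

// Copy the last successful build's artifacts from another job into this workspace
copyArtifacts(projectName: 'upstream-job',
              selector: lastSuccessful(),
              filter: 'build/**',
              target: 'upstream')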

Different Hudson folders for wars and jobs

Is there any way to have the war files of Hudson in a different directory or drive than the job files?
We want to have all executables in c:\programme\hudson and all jobs in f:\data\hudson.
I've already played around with the settings in hudson.xml, but this redirects not only the job directory but also copies the whole war directory to the new destination folder.
Is there any way to configure Hudson (on a windows server) to have a separation of the executable and the data/job directories?
Setting HUDSON_HOME to f:\data\hudson should do the trick.
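Assuming Hudson runs as a Windows service through the usual winsw-based hudson.xml wrapper, a sketch of what that could look like (the id, name, and Java arguments are simply whatever your existing file already contains):

<service>
  <id>hudson</id>
  <name>Hudson</name>
  <!-- Executables stay under c:\programme\hudson; jobs and config go to f:\data\hudson -->
  <env name="HUDSON_HOME" value="f:\data\hudson"/>
  <executable>java</executable>
  <arguments>-Xrs -Xmx256m -jar "c:\programme\hudson\hudson.war" --httpPort=8080</arguments>
</service>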
I don't think this problem has an easy solution. Besides deploying to an app server, I can come up with two options:
Configure the workspace explicitly in every job to point to F:\data\hudson
Create a file system link from c:\programme\hudson\jobs to f:\data\hudson. I have never used this myself, so have fun reading through the following links: hard links and junctions, symbolic links.
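For example, a junction for the paths from the question could be created like this (the jobs directory must not already exist at the link location):

mklink /J c:\programme\hudson\jobs f:\data\hudson\jobs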
I'm not sure if this is what you want, but I run hudson simply via java -jar, and then I can specify freely where the hudson war is. It seems the war unpacks into HUDSON_HOME when starting up, but I still have a separate directory where I keep the wars and download upgrades, and I can just change the shortcut when I want to run a newer war.
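A sketch of that kind of launch on Windows, reusing the paths from the question (the port is an assumption):

set HUDSON_HOME=f:\data\hudson
java -jar c:\programme\hudson\hudson.war --httpPort=8080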
We run Hudson on a Windows server and use Tomcat as our container.
In this setup, you can set HUDSON_HOME to whatever you want, which holds the job configuration, and then the HUDSON.WAR file lives in C:\Program Files\Apache Software Foundation\Tomcat 6.0\webapps.