How to deal with overwhelming Jenkins updates to core and plugins?

I love Jenkins and appreciate that it is an active project. Still, I'm not sure what the correct approach is for maintaining a Jenkins installation: updates to Jenkins core and its plugins appear almost daily, and applying them all takes too much time.
Is there a way to automate this, or an LTS version that I can use instead?

The Jenkins team does have a concept of LTS releases, so take a look at this wiki page: https://wiki.jenkins-ci.org/display/JENKINS/LTS+Release+Line
As for automating updates, you can do it if you installed Jenkins through the OS package manager. For instance, on Ubuntu you could have a cron job that calls apt-get update and apt-get install jenkins at midnight (a sketch of such an entry is below). I'm not sure about automating it if you installed Jenkins manually.
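For example, a minimal cron entry along those lines might look like the following; the /etc/cron.d file name is my own placeholder, not anything Jenkins ships:

    # /etc/cron.d/jenkins-autoupdate  -- hypothetical file name
    # At midnight: refresh the package lists, then pull in the newest jenkins package.
    0 0 * * * root apt-get update -qq && apt-get install -y -qq jenkins

Note that this only covers Jenkins core; plugins are updated through the Jenkins update center, not through apt.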
However, automatic updates have a downside: essential plugins could stop working after an update, or bugs that have slipped through the net could cause problems.
Having said that, the quality of Jenkins seems consistently good, so it might be worth the risk for you.

As far as I know there isn't a way to automate the update.
However, given that an update could in theory break something in your build process (Jenkins has been completely stable in my experience), I don't think automating it would be an appropriate solution.
What works well for me is to periodically do the updates manually and then re-run all of the build jobs to verify that nothing has broken. I do this as part of our regular maintenance, when we do VM backups and operating system updates on our CI environment. (The updates are done after the backups so that, if something does go wrong, we have an up-to-date fallback point.)

Related

What is the best approach towards updating a very old MediaWiki installation?

I've got a client that's running a pretty ancient version of MediaWiki (1.17.0) on an internal corporate network. It's running on PHP 5, but they're looking to upgrade to PHP 7. This version has served its purpose well, and whoever was in charge never bothered with upgrading it. (I know.) But it won't run under PHP 7.4, and the company is upgrading its PHP systems.
Obviously there have been a TON of revisions to MediaWiki in this time. I've got a few questions...
First, what's the earliest version of MediaWiki that is PHP7 compatible?
Second, what's the best approach to updating this system? It's got over 1000 pages and 1000 images. It's not huge, but it's obviously not small, and if there's any way to automate bringing it up to the most current version, I'm curious what my options are.
I'm trying to avoid incremental updates because, from what I've seen, there are a ton of them, and not all of the interim versions seem to be available.
Is it possible to install a new, fresh version and copy the database/images over?
Any advice is most appreciated!
As noted in https://www.mediawiki.org/wiki/Compatibility#Upgrade, you need to upgrade first to 1.35 and then to 1.38 (or 1.39 if you wait a little; 1.39 is an LTS release, so probably worth it). See the manual on upgrading for how to do it: basically, update the files of MediaWiki core and all extensions, possibly run composer update depending on your distribution (tarballs typically already include the files that Composer would download; git checkouts don't), and run the upgrade script, which will migrate the database.
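For a rough idea, one hop of that procedure could look like the sketch below; the paths and the tarball name are placeholders, and LocalSettings.php, the images/ directory and any custom extensions or skins have to be carried over by hand:

    cd /var/www                           # assumed web root
    tar -xzf mediawiki-1.35.x.tar.gz      # unpack the new release next to the old wiki
    # copy LocalSettings.php, images/ and custom extensions/skins into the new tree, then:
    cd mediawiki-1.35.x
    composer update --no-dev              # only needed for git checkouts / some distributions
    php maintenance/update.php            # the upgrade script that migrates the DB

Repeat the same steps for the 1.38/1.39 hop, and take a database dump before each run of update.php.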

Removing mysql-libs using yum removes many critical applications

I'm trying to remove MySQL 5.1 and install MySQL 5.7 on CentOS 6.9. However, when I enter
    yum remove mysql-libs-5.1.73-8.el6_8.x86_64
it shows a whole slew of applications, such as crontab, chrome, redhat-lsb-core, postfix and many others, that will be deleted because they depend on it.
From googling, I saw there is a yum "swap" option, but would I have to do this for each application? And do both the old and the new MySQL libraries have to be installed?
Another option is to leave the old dependency in place and install the new one alongside it. But it seems like this could throw off the MySQL install: it might somehow find the old dependency, or the existing apps might find the new one.
A third option is to reinstall all the deleted applications, but figuring out how to install MySQL was tricky enough. For one thing, how do I know which ones are critical? I definitely need crontab and chrome, the other "redhat-lsb" ones look important, and so does postfix.
Any ideas on how to approach this?
To avoid issues like this, CentOS/Red Hat provide another way to get the latest packages of MySQL, PHP, Python, etc.: Software Collections (SCL).
Using SCL you can have multiple versions of MySQL running on the same server without any conflicts or dependency headaches.
https://wiki.centos.org/AdditionalResources/Repositories/SCL
https://www.softwarecollections.org/en/scls/rhscl/rh-mysql57/
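A sketch of the SCL route is below; the exact package and service names come from the links above and may differ slightly between CentOS 6 and 7, so treat them as assumptions to verify:

    yum install centos-release-scl       # enable the Software Collections repositories
    yum install rh-mysql57               # install MySQL 5.7 alongside the stock mysql-libs
    service rh-mysql57-mysqld start      # start the SCL MySQL server (CentOS 6 service name)
    scl enable rh-mysql57 bash           # open a shell with the 5.7 client tools on PATH

The stock mysql-libs package, and everything that depends on it, stays untouched, which is the whole point.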

Running another MSI through visual studio installer

I want to install the MySQL installer MSI along with my setup. (The MySQL installer has to install silently; I am using a batch file in a custom action to do this.)
However, the problem is that the MySQL MSI cannot be run from within the main setup.msi: it fails with error 1618 (another installation is already in progress). I would like to know an easy way around this.
Background info: the MySQL installer unpacks a manifest which contains MySQLInstallerConsole.exe; that executable is then called through another cmd command to install MySQL.
So all I am looking to do is to execute the MySQL installer so that it unpacks the manifest. Later I would call MySQLInstallerConsole.exe from a custom action to install MySQL.
Just to mention even more possibilities beyond the ones some of my colleagues named (the VS bootstrapper, Burn):
Write a batch file or script that simply calls the two MSIs one after the other (see the sketch after this list). That is always a good starting point if you have no experience with MSI.
Write your own mini setup.exe bootstrapper with a few lines of code that does the same thing.
Buy InstallShield, Advanced Installer or InstallAware; these are the tools with ready-made GUIs for such simple bootstrapping. I would recommend the second of them: starting another MSI is only two clicks, and it's similar with the others. But there are BIG differences between the three; InstallAware in particular is special. Note that the "mini bootstrappers" of those tools are not as powerful as Burn or the options that follow.
Buy the ready-made setup suite SetupFactory, which can be used as a bootstrapper for MSIs.
Use the InstallShield "Suite" project type, if you buy the Premium Edition of InstallShield. It costs big bucks but has a friendly user interface. I used it successfully some years ago, although I had to work around a handful of bugs, as always with InstallShield (but I guess you will discover bugs with most tools; that's the way it is).
Here, again, Burn would come in handy, since you could fix potential bugs or behaviours on your own; it just might take you more time in the beginning.
Of course there may be more options.
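To make the first option concrete, a minimal wrapper script might look like the following; the file names are placeholders, and both packages are installed silently one after the other so the 1618 conflict never arises:

    @echo off
    rem Hedged sketch of a chaining batch file -- run this instead of launching setup.msi directly.
    start /wait "" msiexec /i "mysql-installer-community.msi" /qn /norestart
    if errorlevel 1 exit /b %errorlevel%
    start /wait "" msiexec /i "mysetup.msi" /qn /norestart
    exit /b %errorlevel%

Note that msiexec can also return 3010 (success, reboot required), which a stricter script would want to treat as success.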
There isn't an easy workaround. Windows Installer enforces a one-installation-at-a-time rule through the use of a mutex. You need to create a bootstrapper/chainer to serialize the installation of your packages. Visual Studio Deployment Projects don't support this. I'd suggest looking at Windows Installer XML (WiX) and its Burn bootstrapper/chainer engine.
The documentation is a bit sparse, but in the Visual Studio world the tool for customizing the bootstrapper is the Bootstrapper Manifest Generator. The docs start here:
https://msdn.microsoft.com/en-us/library/ms165429.aspx
and there is an old article here:
https://msdn.microsoft.com/en-us/magazine/cc163899.aspx
but it's not clear how much information and support is still available, since setup projects were removed from Visual Studio and then restored.

Unified technology for software update

It is cumbersome to update different kinds of software one by one. In my case,
For the Ubuntu Linux OS, I periodically do apt-get update and apt-get upgrade or apt-get dist-upgrade (this part can be automated).
For the Ruby language, I have to follow its own particular steps to upgrade.
For Ruby libraries (gems), I have to periodically do gem update or gem update --system.
For the jQuery library, I have to periodically check its website and download the latest version.
And so on.
I can write a script to automatically do these things (a sketch of such a script appears at the end of this thread), but it would be nice if there were a cross-platform, unified, established way where a user could simply register all the software they want kept up to date, and a tool would periodically check the relevant sites, notify the user, and/or perform the update when one is available.
Does such technology exist? If not, what would be a promising direction for it in the future? I have a feeling it could be done with something like an RSS feed, where software developers publish a feed whenever they release an update and a dedicated reader takes care of the update process.
No. And it never will. This kind of luxury comes with a price called "gated community". But then it does not apply to ALL software...
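For what it's worth, the per-tool script mentioned in the question can be as small as the sketch below (it assumes an Ubuntu box with apt and RubyGems; manually downloaded libraries such as jQuery still need their own check):

    #!/bin/sh
    # Sketch: run each tool's own update command in sequence.
    set -e
    apt-get update && apt-get -y dist-upgrade    # OS packages
    gem update && gem update --system            # Ruby gems and RubyGems itself
    # jQuery and other manually downloaded libraries are not covered here;
    # each needs a per-library version check against its project website.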

Does Jenkins support clustering?

I am a user of Hudson and recently moved to Jenkins. I know Hudson does not support clustering of servers; does Jenkins provide that? Also, please elaborate a little, as I am new to this. Thanks in advance.
If by clustering you mean having a single web interface with many workers behind it, then yes, Jenkins (like Hudson, from which Jenkins was forked) supports it; the feature is called Distributed Builds. It allows you to run jobs on different workers, called slaves (a sketch of attaching one is below).
See the Distributed Builds page on the Jenkins wiki.
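As an illustration, attaching a slave over JNLP looks roughly like this; the hostname and node name are placeholders, and the node has to be created in the master's web UI first (secured masters also require a connection secret):

    # on the slave machine:
    wget http://jenkins.example.com/jnlpJars/slave.jar
    java -jar slave.jar -jnlpUrl http://jenkins.example.com/computer/linux-slave-01/slave-agent.jnlp

Jobs can then be tied to that node (or to a label on it), so the master only schedules and reports while the slaves do the building.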
Open-source Jenkins does not support clustering.
CloudBees Jenkins Enterprise has HA support using active and stand-by Jenkins masters:
http://jenkins-enterprise.cloudbees.com/docs/user-guide-bundle/ha.html
Jenkins is fairly close to Hudson feature-wise. The Jenkins project forked off Hudson around 18 months ago, and the basic architecture is still the same. So, even without knowing exactly what you mean by clustering, I am confident Jenkins does not support it if you say Hudson does not.
I have heard rumors that there is work going on to give Jenkins some high-availability features, but that is all I know. No idea what exactly that means or how it is implemented.