I need to define a runtime environment for my development. The first idea is, of course, not to reinvent the wheel. I downloaded MacPorts, used easy_install, tried Fink. I always had problems. Right now, for example, I am not able to compile scipy, because the MacPorts installer wants to download and install gcc43, which does not compile on Snow Leopard. A bug is open for this issue, but I am basically tied to them for my runtime to be usable.
A technique I learned some time ago was to write a makefile that downloads and builds the runtime/libs with clearly specified versions of libraries and utilities. This predates the MacPorts/Fink/apt approach, but it gives you much more control, although you have to do everything by hand. Of course, this can become a nightmare of its own if the runtime grows, but if you find a problem, you can patch the downloaded package to fix the issue, then build it.
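For illustration, here is a minimal sketch of the kind of makefile I mean (the download URL and the patch file are placeholders, and recipe lines need real tabs):

    # Pin exact versions of everything the runtime needs.
    PREFIX   := $(CURDIR)/runtime
    ZLIB_VER := 1.2.3

    all: $(PREFIX)/.zlib-done

    # Fetch the exact tarball we decided on (URL is a placeholder).
    zlib-$(ZLIB_VER).tar.gz:
        curl -L -o $@ http://example.org/zlib-$(ZLIB_VER).tar.gz

    # Unpack, apply our local fix, build and install into the prefix.
    $(PREFIX)/.zlib-done: zlib-$(ZLIB_VER).tar.gz
        tar xzf $<
        cd zlib-$(ZLIB_VER) && \
          patch -p1 < ../patches/zlib-fix.patch && \
          ./configure --prefix=$(PREFIX) && \
          make && make install
        touch $@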
I have multiple questions:
What is your technique to prepare a well-defined runtime/library collection for your development?
Does MacPorts/Fink/whatever allow me the same flexibility to re-hack things if something goes wrong?
Considering my makefile solution: when my software is finally out for download, what are your suggestions for dealing with the potential differences between my development environment and the actual platform on my users' machines?
Edit: What I don't understand in particular is that other projects don't give me hints. For example, I just downloaded scipy, a complex library with lots of dependencies. Developers must have all the deps set up before working on it. Despite this, there's nothing in the SVN repository that creates this environment.
Edit: Added a bounty to the question. I think this is an important issue and it deserves more answers. I will consider best those answers with real-world examples, with particular attention to any issues that arose and how they were solved.
Additional questions to inspire for the Bounty:
Do you perform testing on your environment (to check proper installation, e.g. on an integration machine)?
How do you include your environment at shipping time? If it's C, do you statically link it, or ship the dynamic library, tweaking LD_LIBRARY_PATH before running the executable (see the wrapper-script sketch after these questions)? What about the same issue for Python, Perl, and other languages?
Do you stick with the same runtime, or update it as time passes? Do you download "trunk" packages of your dependency libraries, or fixed versions?
How do you deal with situations like: library foo needs Python 2.5, but you need to develop in Python 2.4 because library bar does not work with Python 2.5?
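To make the dynamic-library case concrete, the kind of launcher script I have in mind is something like this (the paths and binary name are hypothetical):

    #!/bin/sh
    # Point the loader at the libraries we ship, then start the real binary.
    HERE="$(cd "$(dirname "$0")" && pwd)"
    LD_LIBRARY_PATH="$HERE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
    export LD_LIBRARY_PATH
    exec "$HERE/bin/myapp-real" "$@"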
We use a CMake script that generates Makefiles that download (mainly through SVN), configure, and build all our dependencies. Why CMake? It's multiplatform. This works quite well, and our meta-build can invoke each dependency's own build system, whether that is scons, autotools ("autopain"), or cmake. As we build on several platforms (Windows, Mac OS X, a bunch of Linux variants), we also support different compile flags etc. based on the operating system. Typically a library has a default configuration, and if we encounter a system that needs special treatment, we replace the default with a specialized configuration. This works quite well. We did not really find any ready-made solution that would fit our purpose.
That being said, it is a PITA to get up and running; there are a lot of knobs to turn when you need to support several operating systems. I don't think it will become a maintenance nightmare, as the dependencies are quite fixed (libraries are upgraded regularly, but we rarely introduce new ones).
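A stripped-down sketch of the approach (not our real script) using CMake's ExternalProject module, with a per-platform flag and a pinned SVN tag (the URL and flag are placeholders):

    include(ExternalProject)

    # Per-platform configuration; specialized as needed.
    if(WIN32)
      set(FOO_CFLAGS "/MD")
    else()
      set(FOO_CFLAGS "-fPIC")
    endif()

    # Download (from SVN), configure and build one dependency.
    ExternalProject_Add(foo
      SVN_REPOSITORY http://example.org/svn/foo/tags/1.2
      CONFIGURE_COMMAND <SOURCE_DIR>/configure --prefix=<INSTALL_DIR> CFLAGS=${FOO_CFLAGS}
      BUILD_COMMAND make
      INSTALL_COMMAND make install
    )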
virtualenv is good, but it can't do magic: e.g. if you want to use a library that just MUST have Python 2.4 and another one that absolutely NEEDS 2.5 instead, you're out of luck. Nor can virtualenv (or any other tool) help when there's a brand-new release of an OS and half the tools &c just don't support it yet, as you mentioned for Snow Leopard: some problems are just impossible to solve (two libraries with absolutely conflicting needs within the same build), others just require patience (until all the tools you need are ported to the new OS release, you just have to stick with the previous OS release).
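For what it's worth, within those limits the usual incantation gets you a long way (package versions here are just examples):

    # one isolated environment per project, with exact versions pinned
    virtualenv --no-site-packages myproject-env
    . myproject-env/bin/activate
    pip install 'numpy==1.3.0' 'scipy==0.7.1'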
I am working on a J2EE web application. I would like to upgrade some of the major open-source libraries from older versions to newer ones, based on the availability of the libraries in the Maven repository.
Examples of such libraries: Google-Sitebricks, Jetty, JSON, Jackson, JUnit, HttpClient, etc.
What would be the best way to ensure that, after the upgrades, no side effects or leakages have crept into the application?
So far, I have tested the application's basic functionality as a sanity check, to see whether things are working or broken.
I am wondering what standard process should be followed in such scenarios. Please suggest effective approaches and best practices for this.
Test, test, and test. And then test some more.
Hopefully you have automated tests which will shake out problems. If not, you will find yourself spending a lot of cycles manually testing. Sanity checks alone may not be enough; version upgrades can break functionality in subtle ways.
Even before you get to the testing, you may find that your application fails to compile after upgrading the libraries.
In any case, I would not upgrade many libraries all at once, except when upgrading a library causes a cascade of upgrades in transitive dependencies. Upgrading them one by one is the safest thing to do.
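For instance, the versions-maven-plugin tells you what is available before you change anything, and dependency:tree helps you spot conflicting transitive versions after each single upgrade:

    # list newer versions of your dependencies, without changing anything
    mvn versions:display-dependency-updates

    # after bumping ONE library in the pom, rebuild and run the full test suite
    mvn clean verify

    # check that the upgrade did not drag in conflicting transitive versions
    mvn dependency:tree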
Given that the Tcl Wiki has a page titled "Why isn't there a Comprehensive Tcl Archive Network like Perl's CPAN?", it seems the short answer to this is "no". Is there, however, someplace with an approximation: websites or FTP sites with tarballs of multiple packages, ideally with older versions too?
I'm an infrequent user of Tcl, and am undertaking a new x86_64 install of ActiveTcl, attempting to reproduce the package set of an i386 installation on older machines.
Using ActiveState's teacup package installer, it seems I'm able to find and install equivalents to two-thirds of the packages that were installed under the old tree. For the remaining set of packages, searching finds package-specific webpages across a variety of sites, some with only their latest version of source available.
As I'd like to avoid up-revving the dependencies that some of these latest-version Tcl packages demand, I'd like to be able to install older versions of the packages.
Is there a repository that maintains broad and deep sets of Tcl add-on packages?
ActiveState's “teapot” repository is pretty good (especially for 32-bit builds) since they include nearly everything that is a sane (and reusable outside that one application) package with a sane build process. I use their stuff (and sometimes submit bug reports where the packaging has gone wrong).
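From memory, the teacup workflow looks roughly like this; I believe install accepts an explicit (older) version, though the exact syntax may vary between teacup releases:

    # search the ActiveState repository for a package
    teacup search tcllib

    # show what the repository knows about it, including versions
    teacup list tcllib

    # install a specific (older) version rather than the newest
    teacup install tcllib 1.11.1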
There's also tcllib (as @glennjackman mentions in the comments). The large majority (maybe even all) of it is available from the ActiveState repo; what isn't is either not yet finished or seriously broken in some other way. The main downside of tcllib is that it only has Tcl (and Tk, for tklib) as a binary dependency, which restricts what can be in there quite a bit; there are no database drivers in there, for example.
For binary packages, where you're not getting them via the teapot, the best place to start looking is Joe English's Gutter (Great Unified Tcl/Tk Extension Repository). It doesn't include builds but it does let you find things and then build them yourself. Most packages will also have a page (at least) on the Tcler's Wiki; that tends to be a reasonable place to look for usage samples and collected tips. (Also remember you can ask here if you've got a question specific enough.)
If you're working with tclkits, the space of libraries is a little different (as you're looking for a .kit-ted version). However I don't know that area so well.
I am experimenting with JRuby on Rails and need some guidance/tips in choosing a server. I have used nginx (and Passenger) for my RoR in the past. The JRuby wiki page on Servers lists a lot of options, and I have no experience with any of them.
What I am looking for is a server that is simple to install and easy to scale, or one that is widely used (so that I can find solutions if I get stuck).
I do realize this might be a broad question and the answer would be "it depends..." but would appreciate some pointers.
I might be biased, being on the Trinidad team, but I still think it's the top choice, especially as it handles high concurrency well while being much lighter than TorqueBox (which I would also recommend if you want something beyond just a web server, e.g. built-in jobs, but be aware that some of it is TB/JRuby specific). Most of the other options mentioned are Java web servers, which you can rule out if you do not want to install a Java server, warble your application, and deploy it as a .war file.
P.S. In the (next) version 1.5.0, re-deploys are to be revisited, and one should be able to do (memory-leak-free) zero-downtime deployments.
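Getting started is about as simple as it gets; from memory, something like:

    # install the gem, then start the server from your Rails app's root
    jruby -S gem install trinidad
    cd myapp
    jruby -S trinidad -p 3000 -e production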
Disclaimer: I am relatively unfamiliar with the flash build processes, so some/all of this may be misinformed nonsense. Please feel free to suggest alternative approaches.
We're currently developing a Flex web app and our build situation is far from ideal. At present we're (as in individual developers) just building using FlashBuilder and deploying manually. The programmers are currently screaming bloody murder, for two reasons:
The lack of CI is like going back to the stone age
We don't much care for FlashBuilder
(Note: We're only using FlashBuilder because it was the easiest way to set up a flex project in conjunction with Away3d and get it building / rendering correctly -- it's a stopgap solution).
As a predominately .NET development shop, we're used to doing continuous integration as well as continuous deployment. Ideally, we'd like to get something comparable to this for our flash projects without tying ourselves to a particular IDE.
Requirements:
The build process must be:
.. runnable via the command line
.. runnable on both developer and CI build machines (and certainly not requiring an IDE!)
.. preferably as IDE-independent as possible (pragmatism will kick in though; if this causes a lot of friction we'll just pick one).
.. able to run on Windows (we develop using Windows)
We don't mind a touch of duplication or a few manual steps (e.g. tarting up the build scripts if we add a new project via an IDE, or generating one configuration from another if tools exist), but the less duplication / maintenance required the better.
I've read quite a few articles/blog posts and watched some short screencasts, but most of them say very little about how the build system sits alongside IDEs. Most articles/screencasts follow the same formula: how to create a "Hello World" build using a single file and text editors (no IDE).
I've not seen the topic of multiple libraries/projects etc. broached, either.
After reading around the issue for a while, I'm considering investigating the following options:
Project Sprouts
Flexmojos
Maven Flex Plugin
buildr as3
Does anyone have any experience of the above solutions (or others I'm unaware of) and, if so, what do you make of them? Any help / pointers appreciated.
I recently started building with Gradle and the GradleFx plugin and I immediately fell in love with its power and ease of use.
Gradle is Ant + Maven + Ivy, evolved, and is primarily used from the command line. You can:
write scripts in Groovy (a powerful Ruby-like language that runs in the Java Virtual Machine)
access all existing Maven and Ivy repositories as well as your own repos
use existing ANT tasks
integrate with CI (in Jenkins you just tick a checkbox to activate Gradle support)
although it originally grew out of the Java/Groovy community, it is in fact language-agnostic; you add language-specific plugins for added functionality. GradleFx is such a plugin, providing additional ActionScript/Flex build tasks.
do easy multi-project builds: e.g. you can compile, unit test, package, and deploy both your .NET service layer and your Flex client application with just one command.
use convention over configuration: if you stick to the conventions, your build scripts will be extremely terse
generate all kinds of reports: unit testing, checkstyles, codenarc, ...
generate Eclipse, IDEA or other IDE projects
all the things I haven't discovered yet
And best of all: it's very easy to learn. I had no knowledge of Maven before I started with Gradle and could get a multi-project build with some customizations working quite quickly.
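To give you an idea of the "terse" part, a complete build script for a simple Flex application looks roughly like this (from memory; the plugin version and property names may differ between GradleFx releases):

    // build.gradle -- assumes FLEX_HOME points at your Flex SDK
    buildscript {
        repositories { mavenCentral() }
        dependencies {
            classpath group: 'org.gradlefx', name: 'gradlefx', version: '0.5'
        }
    }
    apply plugin: 'gradlefx'

    type      = 'swf'                  // or 'swc' for a library project
    mainClass = 'Main.mxml'
    srcDirs   = ['src/main/flex']

Building is then a single gradle invocation from the command line or from Jenkins.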
Edit (comparison to Buildr AS3 and Maven)
I can compare this only to one of the projects you mentioned: Buildr AS3. It seems to start from a philosophy similar to Gradle's, but when I tried to use it about half a year ago, I couldn't even get a simple 'Hello World' app to work. I e-mailed the developer for help: no response.
Compared to GradleFx: I had a small forum discussion with the developer (on a rather philosophical topic, since I didn't really need any help because it just worked right away). He answered within minutes.
As for Maven: (for what it's worth) I've only glanced at some configurations and they seem overly complicated when I compare them to a Gradle script.
There is one thing Maven does that you can't do with GradleFx (yet): it can pull the right Flex SDK from a Maven repo and build against that. With GradleFx you have to have your SDKs available locally.
I'm quite familiar with using Maven as the main build tool, together with the flexmojos plugin from Sonatype. My experience with flexmojos has been a bit of a roller coaster: Maven itself is completely solid and works all the time without issue; the only trouble is the flexmojos plugin, which has fluctuated a lot between versions. If you choose to go this route, make sure to grab the source for flexmojos so you can see what your configuration options are actually doing to the command-line parameters, etc.

For Flex 3.x, flexmojos 3.x up to around 3.9 is good and works fine with regard to the goal for generating the .project Eclipse files; I believe there's also a mojo (a Maven plugin) for generating IntelliJ IDEA project files, as well as others. If you're using Flex 4, you can compile with the latest flexmojos 4.0RC2, but it appears to me that the goal for generating Flex/FlashBuilder project properties is now gone (I'm not sure whether it's been replaced by another plugin altogether or what the deal is).

Building with Maven and flexmojos does fulfill all of your goals above (we also use it for building our service layer, so with a single mvn clean install we get a jar packed in a war packed in an ear, with everything configured, plus a swf; that part is really nice). You can also do continuous integration using Bamboo (or simply write your own script triggered from a cron job, or, on Windows, a batch file executed as a scheduled task if you don't have a *nix server around). Let me know if you'd like more details or if I missed something major.
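To make that concrete, the core of such a pom looks roughly like this (version numbers are examples only; adjust to the flexmojos/Flex combination you settle on):

    <packaging>swf</packaging>
    <build>
      <sourceDirectory>src/main/flex</sourceDirectory>
      <plugins>
        <plugin>
          <groupId>org.sonatype.flexmojos</groupId>
          <artifactId>flexmojos-maven-plugin</artifactId>
          <version>3.9</version>
          <extensions>true</extensions>
        </plugin>
      </plugins>
    </build>
    <dependencies>
      <!-- the Flex framework itself, resolved from a Maven repository -->
      <dependency>
        <groupId>com.adobe.flex.framework</groupId>
        <artifactId>flex-framework</artifactId>
        <version>3.2.0.3958</version>
        <type>pom</type>
      </dependency>
    </dependencies>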
I have been using Hudson, now Jenkins, with Ant for automated Flex builds and FlexUnit testing. Jenkins has some really useful plugins for integration with Eclipse (and hence FDT or FlashBuilder), Jira, SVN, Git, etc., and it's free. Also, you can integrate the Ant build into Maven scripts, so I've found this to be a good and flexible solution for all purposes I've come across so far.
The Flex SDK comes with Ant tasks, and writing even elaborate Ant build scripts is quite easy - in fact, I'd been using Ant locally before, and I could reuse my existing scripts with only a few added extra compiler options for FlexUnit tasks.
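A minimal build file is enough to get a SWF out (the project layout here is an example; FLEX_HOME must point at your Flex SDK):

    <project name="myapp" default="compile">
      <property environment="env"/>
      <property name="FLEX_HOME" value="${env.FLEX_HOME}"/>

      <!-- make the SDK's mxmlc task available -->
      <taskdef resource="flexTasks.tasks"
               classpath="${FLEX_HOME}/ant/lib/flexTasks.jar"/>

      <target name="compile">
        <mxmlc file="src/Main.mxml" output="bin/Main.swf">
          <load-config filename="${FLEX_HOME}/frameworks/flex-config.xml"/>
          <source-path path-element="src"/>
        </mxmlc>
      </target>
    </project>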
However, it took a while to set up the system correctly for unit testing, because I'm running a headless server on Linux, and that means a rather complicated environment for ActionScript tests, which run only in the Flash Player. This, of course, is true for all CI scenarios using FlexUnit, regardless of which server you use.
Here's what I've learned:
FlexUnit needs a standalone debugger version of Flash Player installed, but Adobe only distributes binaries for the standard version on Linux. Therefore, compiling from source was necessary, and since my server system is stripped down to the bare necessities, it took some effort to install all the correct dependencies and get them to work.
The Flash Player needs hardware to run correctly: It uses graphics, therefore it needs a graphics card, and sounds, therefore it needs a sound card. On my headless server, this meant I had to install a VNC host to get it to run at all, and I had to eliminate any tests using sounds (those will now only run on local machines). If anyone ever comes across a working sound card emulation for openSuSE that I could use with the VNC client - you'll be my hero forever!
If you've set asynchronous timeouts in your unit tests, and/or you need to use setTimeout() to send delayed procedure calls, make sure the intervals aren't too short - I've had problems with tests that ran fine on any local machine, but broke the build on the CI server, because the Flash Player is considerably slower on the VNC client than on an actual graphics card.
I've also found this last issue to be a healthy lesson: Criteria for unit tests should not be based on assumptions about the system's performance, or at least be tolerant enough to succeed even on a slow machine.
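For reference, the core of my headless setup boils down to something like this (the display number, geometry, and Ant target name are specific to my machine):

    # host an X display for the debugger Flash Player to render into
    vncserver :1 -geometry 1024x768 -depth 24

    # make the build (and the player it spawns) use that display
    export DISPLAY=:1
    ant test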
What are best practices for recording the setup of build/test machines over the lifetime of a project? If you need to provide a patch for a previous version of your product, you likely need to reload the same compiler and support tools to re-issue the patched release. What do you record, and how? The obvious things are: the OS version and patch level, the compiler/IDE version and patch level, and 3rd-party tools/libraries.
My first thought is to keep a log file of all the requirements. This log file would go into your VCS.
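For example, a tiny manifest like the following (the contents are obviously just an example) is cheap to maintain and easy to diff between releases:

    # toolchain.txt -- checked in next to the sources
    OS:        Windows Server 2003 SP2
    Compiler:  Visual Studio 2005 SP1 (VC8)
    SDK:       Platform SDK 6.0a
    Libs:      boost 1.34.1, zlib 1.2.3
    Build box: build-02 (VMware image build-02-vc8.vmdk)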
VMware virtualization (or other similar products) is ideal for this type of thing. Build an entire development/build/test environment and leave it set up just for that purpose. You can take the image offline, back it up to a DVD, and simply turn it back on when you need it.
I'm using Maven for Java with the Enforcer plugin, so all of these things are stored in my project object model, even the required version of Maven itself. As long as I manage to get the proper version from version control, I'm home free.
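A sketch of the relevant pom fragment (the rule values are examples); the Enforcer plugin fails the build outright if the environment doesn't match:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-enforcer-plugin</artifactId>
      <executions>
        <execution>
          <id>enforce-versions</id>
          <goals><goal>enforce</goal></goals>
          <configuration>
            <rules>
              <requireMavenVersion><version>[2.0.9,)</version></requireMavenVersion>
              <requireJavaVersion><version>1.6</version></requireJavaVersion>
            </rules>
          </configuration>
        </execution>
      </executions>
    </plugin>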
3rd-party tools and libraries go in version control along with everything else; we have a libs tree that sits under our VCS trunk right next to our app tree, so it gets included in any branches or tags we create. The one wrinkle I haven't yet solved is Windows tools and libraries that require their own installers instead of running out of whatever directory the VCS gives them.
For OS and compiler, I'd recommend creating a VM for each release if you can't install multiple compiler versions in parallel. Then your project wiki can document which VM and which compiler version to use for a given build. This isn't automatic like your log file would be, but it provides a ready-to-go environment (instead of potentially having to reinstall a machine to match your log file). Some projects check their entire compiler into version control, but this seems overkill to me (and doesn't play well with IDEs and compilers that need their own installers).
We don't track patch levels for the OS and compiler. I realize that it's possible that a patch would break or change something, but the chance seems so low that the cost-benefit ratio just isn't there.