While working on a number of Aurelia applications, we reached the point where we needed to share code between them.
The code to be shared is not something we (or rather, our customer) want to be open sourced, so we are thinking of creating private jspm packages so we can easily reuse our modules when needed. This idea was inspired by this blogpost
jspm also supports linking packages, which is something we would really like to use during development, as the code to be reused is not yet stable.
Workflow with linked jspm packages
So our intended workflow is to create a number of projects/modules containing ES2016 code and to create jspm packages from these projects. These packages will then be installed and linked in our applications.
Transpiling
As far as we know, jspm packages need to be transpiled before you install them into your project, so we would need to re-transpile the shared code after every change.
According to this systemjs/babel-plugin documentation, on-the-fly transpiling will be supported in version 0.17 (which is in beta at this point). We are currently on jspm 0.16.32, which does not support on-the-fly transpiling with Babel 6.
This workflow feels over-engineered, so we are considering on-the-fly transpiling in our applications instead, though we fear a performance penalty.
How would you approach sharing es2016 code between different Aurelia applications?
Is there any way to achieve code reuse without using jspm packages?
I personally would publish the packages to an internal npm feed of some sort. Set up the internal feed to mirror the public npm registry, then point your .npmrc at the private feed. Then just:
jspm install npm:my-private-package
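For instance, a minimal .npmrc pointing at a private feed might look like this (the registry URL is a placeholder for whatever your internal feed exposes, and the auth line assumes a token in the NPM_TOKEN environment variable):

registry=https://npm.internal.example.com/
//npm.internal.example.com/:_authToken=${NPM_TOKEN}

The premise here is that jspm's npm endpoint honours your npm configuration, so the install command above then resolves against the private feed.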
Related
I am using Polymer core and paper components to build my app, and it's working great! But when I want to deploy my app to production, I don't have a clear way of knowing which common Polymer dependencies I have to package with my app, or how to share those common elements across multiple apps.
For example:
To use https://github.com/Polymer/paper-button, I include a link to ../paper-button/paper-button.html, which may have dependencies of its own. I tried vulcanize, but it seems to inline all components, which is not ideal (in terms of size) for sharing via a CDN.
Any suggestions on how to deal with this?
I suggest using Bower, or another dependency management framework. Using bower, you can declare what the dependencies of your app are:
{
  "dependencies": {
    "polymer": "Polymer/polymer#master",
    "paper-button": "Polymer/paper-button#master"
  }
}
and then use bower install to add those dependencies to your application. The benefit there is that bower will then install the dependencies of your dependencies, and their dependencies, and so on.
With bower you could run bower list and it will list the dependency hierarchy.
If you don't want to use bower, or have trouble setting it up, the Polymer Elements Catalog does a pretty good job of providing an archive of the components you want with all the appropriate dependencies.
I am performing Android app reverse engineering (decompilation and further analysis). I wish to analyse the dependencies between different packages inside the same apk file. Is there a tool that does this? If not, how can I deduce package dependencies in apk files?
A tool called Dexter is available that finds all the packages present in an apk. If there is a program dependency between any two packages, Dexter shows it as an edge. Note, however, that the edges carry no weight, so they give no indication of how strong a dependency is.
The only dependencies I have run into with apk files are on the framework from the device. If you are dealing with a standard app available from the Play Store, it generally contains everything it needs inside its own package.
Some of the custom builds, like the Amazon Kindle Fire, the Barnes & Noble Nook, etc., have custom launchers and other services which offer additional functions through their frameworks.
The framework-res.apk is typically stored in /system/framework. You can extract it from your device and place it in the apktool folder under your user profile directory (%USERPROFILE% on Windows, ~ on Unix-like systems). Apktool (I am assuming that is one of the tools you are using) will then reference it when it does its work.
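For example, assuming apktool is on your PATH, pulling and registering the framework looks like this (device paths can vary by ROM):

adb pull /system/framework/framework-res.apk
apktool if framework-res.apk

apktool if (install framework) copies the file into apktool's framework directory so subsequent decodes can resolve the framework resources.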
Disclaimer: I am relatively unfamiliar with the flash build processes, so some/all of this may be misinformed nonsense. Please feel free to suggest alternative approaches.
We're currently developing a Flex web app, and our build situation is far from ideal. At present we (as in individual developers) just build using FlashBuilder and deploy manually. The programmers are currently screaming bloody murder for two reasons, though:
The lack of CI is like going back to the stone age
We don't much care for FlashBuilder
(Note: We're only using FlashBuilder because it was the easiest way to set up a flex project in conjunction with Away3d and get it building / rendering correctly -- it's a stopgap solution).
As a predominantly .NET development shop, we're used to doing continuous integration as well as continuous deployment. Ideally, we'd like something comparable for our Flash projects without tying ourselves to a particular IDE.
Requirements:
The build process must be:
.. runnable via the commandline
.. runnable on both developer and CI build machines (and certainly not requiring an IDE!)
.. preferably as IDE-independent as possible (pragmatism will kick in though; if this causes a lot of friction we'll just pick one).
.. able to run on Windows (we develop using Windows)
We don't mind a touch of duplication or a few manual steps (e.g. tarting up the build scripts if we add a new project via an IDE, or generating one configuration from another if tools exist), but the less duplication / maintenance required the better.
I've read quite a few articles / blog posts and watched some short screencasts, but most of them are very thin on how the build system sits alongside IDEs. Most articles/screencasts follow the same formula: how to create a "Hello World" build for a single file using only text editors (no IDE).
I've not seen the topic of multiple libraries/projects etc. being broached, either.
After reading around the issue for a while, I'm considering investigating the following options:
Project Sprouts
Flexmojos
Maven Flex Plugin
buildr as3
Does anyone have any experience of the above solutions (or others I'm unaware of) and, if so, what do you make of them? Any help / pointers appreciated.
I recently started building with Gradle and the GradleFx plugin and I immediately fell in love with its power and ease of use.
Gradle is ANT + Maven + Ivy evolved and is primarily used from the command-line. You can:
write scripts in Groovy (a powerful Ruby-like language that runs in the Java Virtual Machine)
access all existing Maven and Ivy repositories as well as your own repos
use existing ANT tasks
integrate with CI (in Jenkins you just tick a checkbox to activate Gradle support)
although it originally grew out of the Java/Groovy community, it is in fact language agnostic. You add language-specific plugins for extra functionality. GradleFx is such a plugin, providing additional ActionScript/Flex build tasks.
do easy multi-project builds. e.g. you can compile, unit test, package and deploy both your .NET service layer and your Flex client application with just one command.
use convention over configuration: if you stick to the conventions, your build scripts will be extremely terse
generate all kinds of reports: unit testing, Checkstyle, CodeNarc, ...
generate Eclipse, IDEA or other IDE projects
all the things I haven't discovered yet
And best of all: it's very easy to learn. I had no knowledge of Maven before I started with Gradle and could get a multi-project build with some customizations working quite quickly.
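To give a feel for the conventions, a minimal GradleFx build script looks something like this; treat it as a sketch, with the plugin version as a placeholder and the source directory just following the default convention:

buildscript {
    repositories { mavenCentral() }
    dependencies {
        // placeholder version; use the current GradleFx release
        classpath group: 'org.gradlefx', name: 'gradlefx', version: '0.6.1'
    }
}

apply plugin: 'gradlefx'

type = 'swf'                          // build a Flex application
srcDirs = ['src/main/actionscript']   // conventional source location
flexHome = System.getenv('FLEX_HOME') // GradleFx builds against a local SDK

Everything is then driven from the command line, with no IDE involved.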
Edit (comparison to Buildr AS3 and Maven)
I can compare this to only one of the projects you mentioned: Buildr AS3. It seems to start from a philosophy similar to Gradle's, but when I tried it about half a year ago I couldn't even get a simple 'Hello World' app to work. I e-mailed the developer for help: no response.
Compare that to GradleFx: I had a small forum discussion with its developer (on a rather philosophical topic, since I didn't really need any help; it just worked right away). He answered within minutes.
As for Maven: (for what it's worth) I've only glanced at some configurations and they seem overly complicated when I compare them to a Gradle script.
There is one thing Maven does that you can't do with GradleFx (yet): it can pull the right Flex SDK from a Maven repo and build against it. With GradleFx you have to have your SDKs available locally.
I'm quite familiar with using Maven as the main build tool together with the flexmojos plugin from Sonatype. My experience with flexmojos has been a bit of a roller coaster: Maven itself is completely solid and works without issue; the only trouble is the flexmojos plugin, which has fluctuated a lot between versions. If you choose to go this route, make sure to grab the source for flexmojos so you can see what your configuration options are actually doing to the command-line parameters, etc.

For Flex 3.x, flexmojos 3.x up to around 3.9 is good and works fine with regard to the goal for generating the .project Eclipse files; I believe there's also a mojo (a Maven plugin) for generating IntelliJ IDEA project files, as well as others. If you're using Flex 4, you can compile with the latest flexmojos 4.0RC2, but it appears the goal for generating Flex/FlashBuilder project properties is now gone (I'm not sure whether it's been replaced by another plugin altogether or what the deal is).

Building with Maven and flexmojos does fulfill all of your goals above. We also use it for building our service layer, so a single mvn clean install gives us a jar packed in a war packed in an ear, with everything configured, plus a swf; that part is really nice. You can also do continuous integration using Bamboo (or simply write your own script triggered from a cron job, or on Windows a batch file run by a scheduled task, if you don't have a *nix server around). Let me know if you'd like more details or if I missed something major.
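For a sense of scale, a minimal flexmojos pom.xml looks roughly like this (the coordinates are placeholders and the plugin version is illustrative; pick the flexmojos version matching your Flex SDK, as discussed above):

<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>          <!-- placeholder coordinates -->
  <artifactId>my-flex-app</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>swf</packaging>              <!-- flexmojos provides swf packaging -->
  <build>
    <sourceDirectory>src/main/flex</sourceDirectory>
    <plugins>
      <plugin>
        <groupId>org.sonatype.flexmojos</groupId>
        <artifactId>flexmojos-maven-plugin</artifactId>
        <version>3.9</version>            <!-- illustrative; see version notes above -->
        <extensions>true</extensions>
      </plugin>
    </plugins>
  </build>
</project>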
Shaun
I have been using Hudson, now Jenkins, with Ant for automated Flex builds and FlexUnit testing. Jenkins has some really useful plugins for integration with Eclipse (and hence FDT or FlashBuilder), Jira, SVN, Git, etc., and it's free. Also, you can integrate the Ant build into Maven scripts, so I've found this to be a good, flexible solution for every purpose I've come across so far.
The Flex SDK comes with Ant tasks, and writing even elaborate Ant build scripts is quite easy; in fact, I'd been using Ant locally before, and I could reuse my existing scripts with only a few extra compiler options added for the FlexUnit tasks.
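As a bare-bones illustration, a build file using the SDK's Ant tasks can be as small as this (the FLEX_HOME environment variable and the src/bin layout are assumptions about your setup):

<project name="myapp" default="compile">
    <property environment="env"/>
    <!-- the Flex SDK ships its task definitions in ant/lib/flexTasks.jar -->
    <taskdef resource="flexTasks.tasks"
             classpath="${env.FLEX_HOME}/ant/lib/flexTasks.jar"/>

    <target name="compile">
        <!-- mxmlc is the Flex compiler task -->
        <mxmlc file="src/Main.mxml" output="bin/Main.swf"/>
    </target>
</project>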
However, it took a while to set up the system correctly for unit testing, because I'm running a headless server on Linux, and that implies a rather complicated environment for ActionScript tests, which run only in the Flash Player. This, of course, is true for all CI scenarios using FlexUnit, regardless of which server you use.
Here's what I've learned:
FlexUnit needs a standalone debugger version of the Flash Player installed, but Adobe only distributes Linux binaries for the standard version. Compiling from source was therefore necessary, and since my server system is stripped down to the bare necessities, it took some effort to install all the correct dependencies and get them to work.
The Flash Player needs hardware to run correctly: it uses graphics, so it needs a graphics card, and sound, so it needs a sound card. On my headless server this meant I had to install a VNC host to get it to run at all, and I had to drop any tests using sound (those now only run on local machines). If anyone ever comes across a working sound-card emulation for openSUSE that I could use with the VNC client, you'll be my hero forever!
If you've set asynchronous timeouts in your unit tests, and/or you need setTimeout() for delayed procedure calls, make sure the intervals aren't too short. I've had tests that ran fine on any local machine but broke the build on the CI server, because the Flash Player is considerably slower on the VNC client than on an actual graphics card.
I've also found this last issue to be a healthy lesson: Criteria for unit tests should not be based on assumptions about the system's performance, or at least be tolerant enough to succeed even on a slow machine.
We are developing a closed-source project, versioned with Mercurial. We are using two libraries in our project:
One of those libraries is being developed by a third-party. They are using git, and we usually just pull from their repo once in a week to get the latest changes.
The other library is being developed by ourselves, and is under active development. It must live in its own public mercurial repository, as it is licensed under LGPL. (It's a fork of a third-party LGPL component, ported to our platform)
So my question is: How should I organize the source to ensure that:
A developer from our team can get all the source (main project + libraries) with a single "clone" command
We can easily pull the latest changes from the libraries, even though one of them is managed with git
Should we use Mercurial's sub-repository functionality, with hg-git to access the library kept under git? Is it well supported by TortoiseHg and Bitbucket? (pros: easy to pull library changes / cons: does it work well?)
Or should we keep only snapshots of the libraries in our project, so that when there are new upstream changes we pull them to a separate place and then copy the whole source into our project? (pros: will work / cons: a pain, especially for the library developed by ourselves, which sees a lot of daily changes)
Yeah, use subrepos with hg-git. It's easy, well supported, and effective. Your .hgsubstate file will contain pointers to snapshots of the subrepos, and that file is version-controlled, so at any point in time you can answer the question: what version of library X was this working against? It's good stuff.
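For concreteness, the .hgsub file mapping both libraries could look like this (the paths and URLs are placeholders; the [git] prefix is what tells Mercurial to drive that subrepo through hg-git):

lib/ourlib = https://bitbucket.org/ourteam/ourlib
lib/thirdparty = [git]git://example.com/thirdparty.git

A single hg clone of the main repo then pulls both libraries at the revisions recorded in .hgsubstate.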
Alternatively, you can use a dependency manager like Ivy or Maven, as appropriate for your language, but don't include their libraries in your repo if you can avoid it. Pointers to versions of their code are better, and a dependency manager or subrepos are the cleanest ways to do that.
I need to define a runtime environment for my development. The first idea, of course, is not to reinvent the wheel, so I downloaded MacPorts, used easy_install, and tried Fink. I always had problems. Right now, for example, I cannot compile scipy because the MacPorts installer wants to download and install gcc43, which does not compile on Snow Leopard. A bug is open for this issue, but it basically leaves me tied to them before my runtime is usable.
A technique I learned some time ago is to write a makefile that downloads and builds the runtime/libs with clearly specified versions of the libraries and utilities. This predates the MacPorts/Fink/apt approach, but you have much more control over it, although you have to do everything by hand. Of course, this can become a nightmare of its own if the runtime grows, but if you find a problem you can patch the downloaded package and then build it.
I have multiple questions:
What is your technique to prepare a well-defined runtime/library collection for your development?
Does MacPorts/Fink/whatever allow me the same flexibility of re-hacking if something goes wrong?
Considering my makefile solution: when my software is finally out for download, what are your suggestions for resolving potential trouble between my development environment and the actual platform on my users' machines?
Edit: What I don't understand in particular is that other projects give no hints about this. For example, I just downloaded scipy, a complex library with lots of dependencies. The developers must have had all the deps set up before working on it. Despite this, there's nothing in the svn that creates this environment.
Edit: Added a bounty to the question. I think this is an important issue and it deserves more answers. I will consider best the answers with real-world examples, with particular attention to any issues that arose and their solutions.
Additional questions to inspire for the Bounty:
Do you perform testing on your environment (to check proper installation, e.g. on an integration machine)?
How do you include your environment at shipping time? If it's C, do you statically link it, or ship the dynamic library and adjust LD_LIBRARY_PATH before running the executable? What about the same issue for Python, Perl, and others?
Do you stick to the same runtime, or update it as time passes? Do you download "trunk" packages of your dependency libraries, or fixed versions?
How do you deal with situations like: library foo needs Python 2.5, but you need to develop in Python 2.4 because library bar does not work with Python 2.5?
We use a CMake script that generates Makefiles which download (mainly through SVN), configure, and build all our dependencies. Why CMake? It's multiplatform. This works quite well, and we support invocation of scons/autopain/cmake. As we build on several platforms (Windows, Mac OS X, a bunch of Linux variants), we also support different compile flags etc. based on the operating system. Typically a library has a default configuration, and if we encounter a system that needs special configuration, the default is replaced with a specialized one. This works quite well. We never really found a ready-made solution that fit our purpose.
That being said, it is a PITA to get up and running; there are a lot of knobs to turn when you need to support several operating systems. I don't think it will become a maintenance nightmare, though, since the set of dependencies is quite stable (libraries are upgraded regularly, but we rarely introduce new ones).
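The answer doesn't include the script itself, but with CMake's stock ExternalProject module the general shape is something like this (the repository URL and configure flags are invented for illustration):

include(ExternalProject)

# Download, configure, build and install one pinned dependency.
ExternalProject_Add(zlib_dep
    SVN_REPOSITORY    http://svn.example.com/zlib/tags/1.2.3  # placeholder repo
    PREFIX            ${CMAKE_BINARY_DIR}/deps/zlib
    CONFIGURE_COMMAND <SOURCE_DIR>/configure --prefix=<INSTALL_DIR>
    BUILD_COMMAND     make
    INSTALL_COMMAND   make install
)

Per-platform tweaks then amount to selecting different CONFIGURE_COMMAND/BUILD_COMMAND values based on CMAKE_SYSTEM_NAME.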
virtualenv is good, but it can't do magic: e.g., if you want to use a library that just MUST have Python 2.4 and another one that absolutely NEEDS 2.5 instead, you're out of luck. Nor can virtualenv (or any other tool) help when there's a brand-new release of an OS and half the tools just don't support it yet, as you mentioned for Snow Leopard. Some problems are simply impossible to solve (two libraries with absolutely conflicting needs within the same build); others just require patience (until all the tools you need are ported to the new OS release, you have to stick with the previous release).
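For the conflicts virtualenv does handle, per-project environments with different interpreters are simple to set up (the interpreter paths below are just examples):

virtualenv -p /usr/bin/python2.4 env-bar   # project constrained to Python 2.4
virtualenv -p /usr/bin/python2.5 env-foo   # project that needs Python 2.5
. env-foo/bin/activate                     # switch environment per project

The point above stands, though: this isolates projects from each other; it cannot reconcile two conflicting requirements inside one build.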