How should I install custom packages in an lxc container?

I would like to start a container with the basic ubuntu template - but I'd like it to automatically install a couple of extra packages - or ideally run a bash script.
It seems like I should be using hooks, and when I create a container pass in a configuration file which sets a particular hook as my bash script. But I can't help but think there must be an easier way?

Recent versions of the lxc-ubuntu template support a --packages option which lets you install extra packages into the container at creation time.
Otherwise, you can indeed use a start hook to run stuff inside the container.
If using the ubuntu-cloud template, you could also pass it a cloud-init config file which can do that kind of stuff for you.
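As a sketch, the cloud-config you would pass as user data to the ubuntu-cloud template could look something like this (the package names and command are just placeholders):

    #cloud-config
    packages:
      - vim
      - git
    runcmd:
      - echo "custom setup done" > /root/provisioned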
Or if you just want to always apply the same configuration, simply create an ubuntu container, start it, customize it to your liking, and from that point on use lxc-clone instead of lxc-create to create new containers based on the one you customized.
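As a rough sketch (the template option syntax varies between lxc releases, so check lxc-create -t ubuntu -h on your system; the package list is just an example):

    # create a base container with a few extra packages baked in
    lxc-create -t ubuntu -n base -- --packages "vim,openssh-server"

    # customize 'base' to taste, then stamp out copies of it instead of re-creating from scratch
    lxc-clone -o base -n web01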

Make: Redo some targets if configuration changes

I want to reexecute some targets when the configuration changes.
Consider this example:
I have a configuration variable (that is either read from environment variables or a config.local file):
CONF:=...
Based on this variable CONF, I assemble a header file conf.hpp like this:
conf.hpp:
    buildConfHeader $(CONF)
Now, of course, I want to rebuild this header if the configuration variable changes, because otherwise the header would not reflect the new configuration. But how can I track this with make? The configuration variable is not tied to a file, as it may be read from environment variables.
Is there any way to achieve this?
I have figured it out. Hopefully this will help anyone having the same problem:
I build a file name from the configuration itself, so if we have
CONF:=a b c d e
then I create a configuration identifier by replacing the spaces with underscores, i.e.,
null:=
space:= $(null) #
CONFID:= $(subst $(space),_,$(strip $(CONF))).conf
which will result in CONFID=a_b_c_d_e.conf
Now, I use this $(CONFID) as a dependency for the conf.hpp target. In addition, I add a rule for $(CONFID) to delete old .conf files and create a new one:
$(CONFID):
    rm -f *.conf    # remove old .conf files; -f so there is no error when none are found
    touch $(CONFID) # create a new file with the right name
conf.hpp: $(CONFID)
    buildConfHeader $(CONF)
Now everything works fine. The file named $(CONFID) tracks the configuration used to build the current conf.hpp. If the configuration changes, $(CONFID) will refer to a non-existent .conf file, so the first rule will be executed, the old .conf file will be deleted, and a new one will be created. The header will be updated. Exactly what I want :)
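Putting the pieces together, a minimal sketch of the whole Makefile looks like this (recipe lines are shown indented with spaces but must start with a real tab; buildConfHeader stands in for whatever actually generates the header):

    # CONF may instead come from the environment or a config.local include
    CONF := a b c d e

    null  :=
    space := $(null) #
    CONFID := $(subst $(space),_,$(strip $(CONF))).conf

    conf.hpp: $(CONFID)
        buildConfHeader $(CONF)

    $(CONFID):
        rm -f *.conf     # drop the marker for the previous configuration
        touch $(CONFID)  # record the configuration used for this build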
There is no way for make to know what to rebuild if the configuration changed via a macro or environment variable.
You can, however, use a target that simply updates the timestamp of conf.hpp, which will force it to always be rebuilt:
conf.hpp: confupdate
    buildConfHeader $(CONF)

confupdate:
    @touch conf.hpp
However, as I said, conf.hpp will always be rebuilt, meaning any targets that depend upon it will need to be rebuilt as well. A much friendlier solution is to generate the makefile itself. CMake or the GNU Autotools are good for this, though you sacrifice a lot of control over the makefile. You could also use a build script that creates the makefile, but I'd advise against this since there exist tools that make it much easier, as the sketch below illustrates.
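As a tiny illustration of the generator approach (the file names here are made up), CMake's configure_file can regenerate the header whenever the cached CONF value changes:

    # CMakeLists.txt
    cmake_minimum_required(VERSION 3.0)
    project(confexample)
    set(CONF "a b c d e" CACHE STRING "build configuration")
    configure_file(conf.hpp.in conf.hpp)

with conf.hpp.in containing placeholders such as #define CONF "@CONF@". Re-running cmake with -DCONF=... rewrites conf.hpp only when its contents actually change, so dependents rebuild only when needed.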

How do I assign Sublime key mappings for commands in the sidebar context menu?

When browsing files in the Sublime sidebar, I would like to quickly access the commands available in the context menu via shortcuts. E.g. Delete file, rename file/folder, new file/folder.
(FYI: super+N is not an ideal solution for creating new files - it isn't context aware and will usually choose an inappropriate default location).
You can enable command logging by entering sublime.log_commands(True) in the console. This will show you the commands and arguments being executed. You can then create a key binding with the appropriate command. My guess is the commands will take some sort of path argument, so you may need to write a small plugin to inject the correct paths for the various commands.
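As a sketch, a user key binding entry (in your .sublime-keymap file) has this shape; the command name and args below are just placeholders for whatever the command log reports:

    [
        // hypothetical binding - substitute the command/args shown by sublime.log_commands(True)
        { "keys": ["ctrl+alt+r"], "command": "rename_path", "args": {"paths": []} }
    ]

Note that sidebar commands typically take explicit paths as arguments, which is where a small plugin may be needed to fill them in.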
For new file creation specifically, you may want to take a look at AdvancedNewFile. Disclaimer: I'm the current maintainer of the plugin. I've tried to make it more flexible than it originally was with regard to specifying where to create the file. If you do decide to use it and have a feature request, feel free to create an issue about it.

Msbuild running in Jenkins target calling HgPull fails with HgProcessException: The command <hg.exe> is not available

I am porting over an MSBuild script from CCNet to run in Jenkins. The MSBuild project is used to create a deployment package. I would rather have Jenkins drive this process itself, but that's a longer-term aim.
The problem I am having (as in the title) is that when we try to use the HgPull target from the MSBuild Mercurial task (http://msbuildhg.codeplex.com/) we get the error message
HgProcessException: The command hg.exe is not available [Path to project]
I have seen on the project web site that someone solved this by adding the LibraryLocation property to the target, but that seems to have made no difference. My target currently looks like this:
<Target Name="UpdateSources">
  <HgPull
    LibraryLocation="C:\Program Files\TortoiseHg\hg.exe"
    Force="true"
    Update="true"
    LocalPath="$(SourcePath)"
  />
  <HgUpdate
    LibraryLocation="C:\Program Files\TortoiseHg\hg.exe"
    Clean="true"
    LocalPath="$(SourcePath)"
  />
</Target>
I'm rather at a loss. Please let me know if you need any more information added to this post to solve this issue. I'm really quite new to MSBuild so really not sure where to start investigating this.
EDIT:
One thing I forgot to mention is that I have tried running the MSBuild command in a console window on the build server and still get the same result. This is really odd given that it works fine in CCNet; what magic is CCNet doing to make this command work?
This is now resolved, though unfortunately I'm not sure which change corrected it. I believe it may have been down to path separators and whether or not they were trailing in another part of the config file. The lack of resilience/consistency between applications over whether paths need trailing slashes really does annoy me.
Just a thought, but try adding 'C:\Program Files\TortoiseHg' to your system path. Maybe CCNet has it specified somewhere that Jenkins doesn't have access to.
Also, just for sanity's sake, verify that hg.exe actually exists at that location.
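A quick sanity check from a console on the build server (plain Windows commands, nothing Jenkins-specific; adjust the TortoiseHg path if yours differs) would be something like:

    rem confirm hg.exe exists and whether it is already on the PATH
    dir "C:\Program Files\TortoiseHg\hg.exe"
    where hg.exe

    rem append TortoiseHg to the machine-wide PATH (note: setx can truncate very long PATH values)
    setx /M PATH "%PATH%;C:\Program Files\TortoiseHg"

Remember that the Jenkins service typically only picks up PATH changes after it is restarted.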

SSIS Deployment: Dev Stage Live AppSettings

The main problem is: how do I incorporate an appSettings.Config file with a particular build (dev, stage, live)? My appSettings.Config changes the connection strings for data sources based on which server the package is being deployed to. I am able to go through Package Configurations and add my appSettings.Config; however, I can only add one specific file: dev, stage, or live. What I need is to build the solution and, based on the build type, pick up the dev/stage/live appSettings. How can I do this?
You could include all of the configuration files in the install and then just point to the correct one through an environment variable. I know you're wanting to switch the configuration file based on the solution build configuration, but you'll be looking at a complex solution when a simpler alternative exists.
It's quite straightforward to add registry information during the package install that sets the machine's environment variable under the key:
HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Environment\MyVariable
...to the path of the .dtsConfig for the current environment.
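For example (MyVariable and the .dtsConfig path below are placeholders for your own variable name and per-environment config location), the equivalent command-line step on a given server would be:

    reg add "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Environment" ^
        /v MyVariable /t REG_SZ /d "D:\SSISConfig\stage\appSettings.dtsConfig" /f

Services such as SQL Server Agent typically need a restart before they see the new environment variable.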

WIX: Using a temporary file during install

I am writing a WIX installer and I have a following requirement:
During installation, I need to pass an absolute path to a file (let's call it A) included in my installer to a COM component which already exists on the hard drive and is part of another program. I have already written an appropriate Custom Action which expects a path to the file A. I don't want to include A as a file installed in the Program Files folder and removed during the uninstallation process. Instead, I would like to put A on the hard drive only temporarily, call my Custom Action (which will cause the COM component to use the content of A), and then remove A from disk. Is there an easy way to accomplish this goal?
I have tried to utilize the Binary table and store A there, however I don't know how to reference A using an absolute path. I know I could put A outside of the MSI file, but I would like to keep every file the installer needs in a single MSI.
Any help would be appreciated.
Deleting a file that MSI installed means that MSI will consider it "broken" and try to auto-repair it if called on to do so. That happens automatically in several cases (e.g., advertised shortcuts and COM registration) so I'd recommend against it. Leave the file there instead -- it's done its job and there's no harm in leaving it there.
I would take this approach.
Install the file "A" into any directory. Run your custom action needed to update the COM component. Then run another custom action or modify the currently written one to remove the file after it is no longer in use. This would leave no trace of the file "A" and if you schedule the custom action to only run during the install you won't have to worry about it on uninstall.
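A rough WiX 3.x sketch of that approach (the IDs, directory, binary name, and DLL entry point below are placeholders, and [#filA] is the MSI formatted-string syntax for the full installed path of the file):

    <DirectoryRef Id="INSTALLFOLDER">
      <Component Id="cmpFileA" Guid="PUT-GUID-HERE">
        <File Id="filA" Source="A.dat" KeyPath="yes" />
      </Component>
    </DirectoryRef>

    <!-- hand the installed path of A to the deferred custom action via CustomActionData -->
    <SetProperty Id="caUseFileA" Value="[#filA]" After="CostFinalize" Sequence="execute" />
    <CustomAction Id="caUseFileA" BinaryKey="MyCustomActions" DllEntry="UseFileA"
                  Execute="deferred" Return="check" />

    <InstallExecuteSequence>
      <Custom Action="caUseFileA" After="InstallFiles">NOT Installed</Custom>
    </InstallExecuteSequence>

A second custom action scheduled after caUseFileA (also conditioned on NOT Installed) can then delete the file, as described above.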