Bolt extension with extra dependencies in composer.json? - bolt-cms

I'm trying to create a local Bolt extension that uses extra libraries. Here is a snippet of my composer.json:
"type": "bolt-extension",
"require": {
"bolt/bolt": ">=2.0.0,<3.0.0",
"oyejorge/less.php": "~1.7"
}
First, is it even possible/advisable to manage dependencies this way in local Bolt extensions? Or, do I need to manually include the library and autoload the files?
Secondly, what is the mechanism by which I should update the composer.json file in my extension? Should I browse to the directory and run composer update, or is there a more Bolt-y way of doing it?

At present, we disable the Packagist repo in your extensions/composer.json by default, for performance reasons.
However, on the CLI, if you change to your extension's installed directory, running composer update will pull in or update the dependency for you.
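For example (the path here is only illustrative; change into wherever your extension actually lives under extensions/):
cd extensions/local/yourname/yourextension
composer update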
That said, this might change: I am reworking the Composer code at present, so keep an eye on the changelog.

Can I use opam to make a package out of a local file and install it?

I'm new to opam and trying to figure out how to use it properly. For a class, I want to set up students with an environment that has some custom packages installed. (The package will consist of some raw .ml files that I got from a colleague at another school; the files are on their github but there's no .opam file that I can see, and as far as I know they're not in any official package release.)
Can I somehow call these local .ml files a package and ask opam to install it? Do the files have to be on github first, and if so can I use my colleague's existing repository as the source? I don't want to make any of this public, since it is not my own work; I just want to configure my local environment so that the code in the files can be included easily as a package. Basically I don't know the best way to proceed so I'm happy for any advice.
You can add a custom opam file in the base directory of the project. See the documentation for how to create that file.
Then you can enter opam pin add . in the base directory and your project will be installed as if it was an opam package. Check opam pin --help for more info (you can also pin to a remote git project for instance).
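For a rough idea, a minimal opam file might look like the sketch below. Everything in it is a placeholder, and it assumes the sources are built with dune; adjust the build instructions to however the .ml files actually compile:
opam-version: "2.0"
synopsis: "Course materials packaged for opam"
maintainer: "you@example.edu"
authors: ["Your Colleague"]
depends: [
  "ocaml" {>= "4.08"}
  "dune" {>= "2.0"}
]
build: [
  ["dune" "build" "-p" name "-j" jobs]
]
With that saved as mypackage.opam (so opam can infer the package name), opam pin add . in that directory installs it; pinning straight to a remote repository would look like opam pin add mypackage https://github.com/colleague/course-code.git.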
Note that though the default repository is hosted on GitHub, this is in no way a requirement for opam. Opam depends on git, but you can absolutely use it with a private git repository. If you want to use your colleague's repository as the source, that is totally doable, though it is often preferable to have the opam file at the root of the directory (you can open a PR on their repository, or make your own fork of it on GitHub; the site makes it clear you copied the code).
If pinning is not to your taste, you can also create your own repository though this is probably a bit too heavyweight for your needs.
Good luck!

Yii 2 - Overriding files of mdm extension in vendor to extension folder using aliases in config

I have installed the mdm admin extension in the basic version of Yii2; it is located in the vendor directory. I want to override some of its files in an extensions directory to make UI changes. I referred to this link and added some code to the web and console config files as:
'aliases' => [
    '@mdm/admin' => '@app/extensions/mdm/yii2-admin',
],
But no changes are reflected after doing this.
Most likely the version installed by Composer conflicts with the version you unpacked manually. The instructions in the documentation are for installing the extension without Composer, so they definitely do not consider the case where you have the same extension installed in two different places.
If you want to edit this extension, you should uninstall the Composer version and copy the whole extension into extensions/mdm/yii2-admin.
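Removing the Composer copy is one command (assuming the same package name used in the require example below):
composer remove mdmsoft/yii2-admin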
But probably the better option would be to fork it, and add a repository to your composer.json which points to your fork:
"repositories": [
{
"type": "vcs",
"url": "https://github.com/yourname/yii2-admin.git"
}
],
And change the dependency constraint to use the master branch:
"require": {
"mdmsoft/yii2-admin": "dev-master as 2.8.0",
// ...
},
Then you perform all necessary changes in your fork (at https://github.com/yourname/yii2-admin.git) and fetch them into the main project using composer update mdmsoft/yii2-admin. This also simplifies syncing changes from upstream: you just need to merge changes from upstream whenever you want to update your fork with the latest changes from the original extension repository.

Managing composer and deployment

So, I'm enjoying using composer, but I'm struggling to understand how others use it in relation to a deployment service. Currently I'm using deployhq, and yes, I can set it to deploy and run composer when there is an update to the repo, but this doesn't make sense to me now.
My main composer repo, containing just the json file of all of the packages I want to include in my build, only gets updated when I add a new package to the list.
When I update my theme, or custom extension (which is referenced in the json file), there is no "hook" to update my deployment service. So I have to log in to my server and manually run composer (which takes the site down until it's finished).
So how do others manage this? Should I only run composer locally and include the vendor folder in my repo?
Any answers would be greatly appreciated.
James
There will always be arguments as to the best way to do things such as this and there are different answers and different options - the trick is to find the one that works best for you.
Firstly
I would first take a step back and look at how you are managing your composer.json.
I would recommend that all of your packages in composer.json be locked down to the exact version number of the item in Packagist. If you are using GitHub repos for any of the packages (or they are set to dev-master), then I would ensure that these packages are locked to a specific commit hash! It sounds like you are basically there with this, as you say nothing updates out of the packages when you run it.
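As a sketch of what that looks like (the package names, version, and commit hash below are placeholders; Composer's "#reference" suffix pins a dev branch to an exact commit):
"require": {
    "vendor/package": "1.2.3",
    "vendor/github-package": "dev-master#5a4e3c1f"
}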
Why?
This is to ensure that when you run composer update on the server, these packages are taken from the cache if they exist, and that you don't accidentally deploy untested code if one of the modules happens to get updated between your testing and your deployment.
Actual deployments
Possible Method 1
My opinion is slightly controversial in that, for many of my projects that don't go through a CI system, I will commit the entire vendor directory to version control. This is quite simply to ensure that I have a completely deployable branch at any stage; it also makes deployments incredibly quick and easy (git pull).
There will already be people saying that this is unnecessary, that locking down the version numbers is enough to handle any remote system failures, that it clogs up the VCS tree, etc. I won't go into these now; there are arguments for and against (a lot of them opinion based). But since you mentioned it in your question, I thought I would let you know that it has served me well on a lot of projects in the past, and it is a viable option.
Possible Method 2
By pointing your document root at a symlink on the server, you can build each release into its own directory and only switch the symlink over to the new directory once you have confirmed the build completed.
This is the path of least resistance towards a safe deployment for a basic code set using composer update on the server. I actually use this method in conjunction with most of my deployments (including the ones above and below).
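A rough sketch of that flow (every path and name here is an assumption; adjust to your own layout):
cd /var/www/mysite
git clone git@example.com:mysite.git releases/20160101
(cd releases/20160101 && composer update)
# repoint the document root only after the build has succeeded
ln -sfn releases/20160101 current
with the web server's document root set to /var/www/mysite/current.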
Possible Method 3
Composer can use "artifacts" rather than a remote server. This means you will basically be creating a "repository folder" of your vendor files. It is an alternative to adding the entire vendor folder into your VCS, but it also protects you against GitHub/Packagist outages, files being removed, and various other potential issues. The files are retrieved from the artifacts folder and installed directly from the zip file rather than being retrieved from a server. This folder can be stored remotely: think of it as a poor man's private Packagist (another option, by the way).
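Pointing Composer at such a folder is a repositories entry of type "artifact" in composer.json (the path is a placeholder; the folder should contain the zipped packages):
"repositories": [
    {
        "type": "artifact",
        "url": "path/to/artifact/files/"
    }
]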
IMO - The best method overall
Set up a CI system (like Jenkins), create some tests for your application, and have them respond to push webhooks on your VCS so it builds each time something is pushed. In this build you will set up the system to (see the sketch after this list):
run tests on your application (if they exist)
run composer update
generate an artifact of these files (if the above items succeed)
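As a sketch, those build steps might boil down to a shell script along these lines (PHPUnit as the test runner and a tarball as the artifact format are both assumptions; BUILD_NUMBER is supplied by Jenkins):
#!/bin/sh
set -e                    # abort the build if any step fails
composer update           # run first so the test runner below is available
if [ -d tests ]; then
    vendor/bin/phpunit    # run tests on your application (if they exist)
fi
tar -czf "../build-${BUILD_NUMBER}.tar.gz" --exclude=.git .   # generate the artifact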
Jenkins can also do an actual deployment for you if you wish (and the build process doesn't fail), it can:
push the artifact to the server via SSH
deploy the artifact using a script
But if you already have a deployment system in place, having a tested artifact to be deployed will probably be one of its deployment scenarios.
Hope this helps :)

Composer could not find a composer.json file even when the file exists

I have installed Laravel and I am playing around with it.
I am following this tutorial:
http://geekanddummy.com/how-to-laravel-4-tutorial-part-3-using-external-libraries/
I need a new library and have added that in composer.json.
But when I run
php composer.phar update
I get the following message:
Composer could not find a composer.json file in
What do I need to do to get this corrected?
I'm the author of the tutorial you link to (hope our Laravel tutorial series is of some use, by the way). I can't quite tell from your answer whether you restored all the content of the original composer.json file when you created a new file of the same name...? You definitely shouldn't need to nuke the composer.json file and start again. As you probably know, that file contains all the Composer-based dependencies for your Laravel project, so you would of course lose significant functionality if you were to wipe it and leave anything out when you start again.
The error message looks to me like you're either in the wrong directory or it's a permissions problem with the composer.json file. It's too late to be certain now, but if you come across the problem again, try running:
chmod a+r composer.json
at the console/ssh shell before running composer update. This restores read permissions (for all users). You may also wish to check file ownership. (Both commands assume we're in a Unix/Linux system, so YMMV.)
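To check ownership (and, if needed, hand the file back to your user; the user and group names below are placeholders):
ls -l composer.json
sudo chown youruser:yourgroup composer.json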
Watch out also for instances where you're running a Composer command with the global parameter. This will cause the change to apply to wherever you've installed Composer, rather than the directory you're currently working in.
Note that you can also update using the --working-dir parameter, thus:
php composer.phar --working-dir=/path/to/laravel/project update
Hope this helps.
Rob
Deleting the existing composer.json (after copying its content) and then saving a new file with the same name works.
If you're on Windows (at least in my experience), you just use 'composer', not composer.phar or any extension.
composer update
composer dump-autoload
This is all I need to type to be able to access Composer in my Laravel directory.
I had the same issue and tried many solutions that worked for others, but not in my case.
Then someone told me to drop the 'php' and '.phar' from my command, i.e. directly use
composer update
composer create-project
etc., and to my surprise it worked.
Occasionally, and for reasons that remain a bit vague, restarting the server seems to solve the issue. That step might be worth trying before making changes to file permissions.
Go to https://git-scm.com/download/win
Then download the relevant version and run your composer command inside Git Bash.
Alternatively, you can use the VS Code bash terminal.

Can Jenkins store artifacts outside the job directory?

I currently have Jenkins set up with a number of jobs, but it's proving difficult to back up because the artifacts are stored within the job directory. I'd like to back up the job configurations and artifacts separately. I'm sure I remember reading somewhere that Jenkins now has an option to store them outside the job, but I can't find this.
Is there any configuration option that does this while still making the artifacts visible from within the job on the Jenkins interface? (i.e. rather than merely an add-in that copies the artifacts elsewhere)
Go to your jenkins configuration page, e.g.
http://mybuildserver.acme.com/configure
At the top of the configuration page there is a "home directory" setting. Click the "advanced..." button below it.
Now set the "Workspace Root Directory" to e:\jenkins-workspaces\${ITEM_FULL_NAME}, and "Build Record Root Directory" to e:\jenkins-builds\${ITEM_FULL_NAME} or something similar.
Warning: I run Jenkins 2.7.2 and noticed that certain features don't work properly after configuring Jenkins like that. I saw problems with folders and with the multi-branch project plugin. Check the status of those issues if you rely on these features.
As you can see here, there are many plugins to deploy artifacts anywhere you want or need them: FTP, CIFS, Confluence, Artifactory, and so on. In particular, the ArtifactDeployer plugin will allow you to make a copy of the artifacts in the Jenkins home.
Thank you Sam, for your post, which pointed me in the right direction to solve my problem.
I had been searching for a way to make a symlink to the job archive of a build for multibranch projects. Up to now, we used to manually search for the correct folder basename in the filesystem and add it to the Jenkinsfile.
Now, I can simply use
jobOutputFolder = currentBuild.rawBuild.artifactsDir.path
and use that in my script.
If security is a concern, I could implement that as a shared library additionally.
Try the Use Custom Workspace build option. From the Jenkins popup help:
For each job on Jenkins, Jenkins allocates a unique "workspace directory." This is the directory where the code is checked out and builds happen. Normally you should let Jenkins allocate and clean up workspace directories, but in several situations this is problematic, and in such case, this option lets you specify the workspace location manually.
This option is also available under advanced project properties of multi-configuration project builds.
A Groovy script under "Prepare an environment for the run" will always run on the master, and this script can create a symlink from the build's artifacts directory to wherever you really want the artifacts archived (archive_to below), which SHOULD include the job name and build number:
import java.nio.file.Files
import java.nio.file.Paths

// createSymbolicLink throws an IOException on failure rather than
// returning a boolean, so there is no result to test with if (!...)
Files.createSymbolicLink(Paths.get(currentBuild.artifactsDir.path),
        Paths.get(archive_to.getCanonicalPath()))
Of course (sadly), when old builds are purged by Jenkins, the old artifacts are left behind, because Jenkins will not follow a symlink when purging, even if Jenkins owns both the symlink and the target (shame).
A workaround may be to point a symlink back from the new archive dir; then, when Jenkins purges its archive dir, the new symlink will dangle, and a cron job can later delete the new job archive dir.
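A sketch of that cleanup script (entirely illustrative: the archive root and the back-link name are assumptions, and it is meant to run nightly from cron):
#!/bin/sh
# Each job archive dir is assumed to contain a 'build-link' symlink pointing
# back at the Jenkins build dir. Once Jenkins purges that build, the link
# dangles (-xtype l) and the whole archive dir can be removed.
find /mnt/archive -mindepth 2 -maxdepth 2 -name build-link -xtype l -printf '%h\0' \
  | xargs -0 -r rm -rf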
The Copy Artifact Plugin (https://wiki.jenkins-ci.org/display/JENKINS/Copy+Artifact+Plugin) adds a build step for retrieving files from another project's workspace into the current one, so you can work from there.