Programmatically create gitlab-ci.yml file?

Is there any tool to generate a .gitlab-ci.yml file, like Jenkins has the job-dsl-plugin to create jobs?
The Jenkins DSL plugin allows me to generate jobs using Groovy, which outputs an XML file that describes a job for Jenkins.
I can use the DSL and a JSON file to generate jobs in Jenkins. What I'm looking for is a tool to help me generate .gitlab-ci.yml based on a specification.

The main question I have to ask is: what is your goal?
If you just want to reduce maintenance effort for repeated job snippets:
Sometimes .gitlab-ci.yml files are pretty similar across a lot of projects, and you want to manage them centrally. Then I recommend taking a look at Having Gitlab Projects calling the same gitlab-ci.yml stored in a central location, which shows multiple ways of centralizing your build; a minimal sketch of such an include follows.
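For illustration, this is roughly what such a central include could look like in each consuming project; the project path, ref and file name here are hypothetical:

# .gitlab-ci.yml in each consuming project
include:
  - project: 'my-group/ci-templates'       # hypothetical central repository
    ref: main
    file: '/templates/common-build.yml'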
If you need to generate the pipeline configuration because the build is highly flexible:
Actually this is more of a templating task, and it can be achieved in nearly every scripting language you like - plain bash, Groovy, Python, Go, ... you name it. In the end the question is what kind of flexibility you strive for, and what kind of logic you need for the generation. I will not go into detail on how to generate the .gitlab-ci.yml file, but rather on how to use it in your next step, because in my opinion this is the most crucial part. You can simply generate and commit the file, but you can also use GitLab CI to generate it for you and use it in the next job of your pipeline.
setup:
  script:
    - echo ".." # generate your YAML file here, maybe use a custom image
  artifacts:
    paths:
      - generated.gitlab-ci.yml

trigger:
  needs:
    - setup
  trigger:
    include:
      - artifact: generated.gitlab-ci.yml
        job: setup
    strategy: depend
This allows you to generate a child pipeline and execute it - we use this for highly generic builds in monorepos.
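As a minimal sketch of the generation step itself (the directory layout, job names and make targets here are hypothetical, not from the original answer), the setup job could loop over the packages of a monorepo and emit one job per package:

setup:
  script:
    # hypothetical: emit one build job per top-level directory under packages/
    - |
      for pkg in packages/*/; do
        name=$(basename "$pkg")
        printf 'build-%s:\n  script:\n    - make -C %s\n' "$name" "$pkg"
      done > generated.gitlab-ci.yml
  artifacts:
    paths:
      - generated.gitlab-ci.yml

The trigger job above then picks up generated.gitlab-ci.yml as an artifact and runs it as a child pipeline.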
See for further reading:
GitLab Jsonnet Example - documentation example for generated YAML files within a pipeline
Dynamic Child Pipelines - documentation for dynamically created pipelines

Related

Does Terraform support CloudFormation templates with minor manipulations?

Assumption: Terraform installed on MS Visual Studio Code.
Since CloudFormation supports JSON templates and Terraform supports JSON, this seems like a yes. However, when I load a CloudFormation template into MS Visual Studio Code and change the name from test.json to test.tf, VS Code doesn't recognize the formatting - visually, as the name implies.
I also tried to just run the test.json and test.tf files, and VS Code says it doesn't know how to debug JSON. VS Code also can't find a JSON debugger in the marketplace (which seems a little hard to imagine).
Anyone else have experience with this?
Since CloudFormation supports JSON templates and Terraform supports JSON, this seems like a yes.
This is far from being true.
Although both Terraform and CloudFormation support JSON files, this does not mean that the syntax of those JSON files is understood by both of them. They are entirely different products, developed by different maintainers, with different ways of defining and managing the resources you want to provision.
Terraform's AWS provider does have support for creating CloudFormation stacks; there is more info in the documentation. If you really want to, you might be able to provision resources from CFN files this way, but it is certainly not accomplished just by renaming test.json to test.tf; a minimal sketch follows.
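For illustration, this is roughly what wrapping an existing CloudFormation template in Terraform via the aws_cloudformation_stack resource could look like (the region, stack name and template path are assumptions):

provider "aws" {
  region = "us-east-1"                              # assumption: your region
}

resource "aws_cloudformation_stack" "example" {
  name          = "my-legacy-stack"                 # hypothetical stack name
  template_body = file("${path.module}/test.json")  # the existing CFN template
}

Terraform then deploys and tracks the stack as a single resource, while the template itself stays in CloudFormation syntax.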
It seems that you have misunderstood some things:
Both CloudFormation files (JSON or YAML) and Terraform files (TF or JSON) are written in a declarative language. This is true for JSON and YAML in general. You can't debug or run these files, as they only describe an infrastructure (or, more generally, an object) and don't implement any logic.
For Terraform you need to install the HashiCorp.terraform extension. This will give you syntax highlighting.
For CloudFormation I recommend the cf-lint extension.
Both in CloudFormation and in Terraform you edit the files (JSON, YAML, TF) and use a CLI to deploy your code. Terraform works at a much higher level than CloudFormation. CloudFormation can only be used to deploy stacks, i.e. sets of resources. If you also need to deploy application code, you must have uploaded it to an S3 bucket beforehand and reference it from there. Also, in many situations you need to create a stack and then update it, or combine two or more stacks in a stack set.
Terraform manages all this complexity for you and offers a lot of features on top of CloudFormation. As already said, the JSON file formats aren't the same and can't be used interchangeably. You can, however, provision a CloudFormation stack with Terraform, as sketched above.

Using HTML in job description for Jenkins job generated by DSL

I'm migrating some Jenkins jobs to DSL code from the current manual configurations. Some of these jobs have descriptions which contain HTML, but I can't find a way to enter this HTML in the seed job so that the generated job contains the same description. In one example, the current job has this description:
Multi-Platform Build <br/><br/>
Builds nightly but only if there has been SCM revisions against the application Core Trunk. <br/><br/>
This is being replaced by application-multi-platform-new
This results in a nicely formatted job description, with line breaks and a hyperlink as well.
I want to replicate this when I generate the same job from a DSL script but there doesn't seem to be a way to do this.
It should be possible by just specifying the HTML tags that you need. What is your output?
description("""
Multi-Platform Build <br/><br/>
Builds nightly but only if there has been SCM revisions against the application Core Trunk. <br/><br/>
This is being replaced by application-multi-platform-new
""")
I've managed to find a workaround but I'd prefer to do this directly.
It's possible to use the below snippet:
job('multi-platform-build') {
    description(readFileFromWorkspace('description.html'))
}
This allows you to keep a separate file in the workspace of the seed job, which is read to provide the description.
This works, but it's far from ideal, as it means the configuration is stored in two separate locations.

How to allow web-component-tester to run tests stored with my components

I am experimenting with the framework to build an SPA using Polymer. This will include a large number of custom elements at various levels in the overall application hierarchy. I would like to use web-component-tester to run the module tests on them.
web-component-tester seems to be opinionated about where the tests are stored - in a separate test directory, where it will run all the files it finds.
I am of the opposite opinion. I would like to store tests in the same directory as the element definition and differentiate them by naming them xxx.test.html (or possibly xxx.test.js). I also want to run different "sets" of tests controlled by gulp, some of which will be watching for changes and then running the tests (for the app side of my project), and some of which will be elements that use core-ajax to unit test my server-side scripts. These will more than likely be in a totally different directory hierarchy (my dist directory) and will be served by a proper web server.
I "think" the "suite" config option in a wct.conf.js file in my project root might be how I can define this, or alternatively a wct command with some file globs. Unfortunately, web-component-tester's README is somewhat confusing on the details, and when you have your own web server it says "You'll need to save WCT's browser.js in order to go this route." What does that mean?
Can someone enlighten me on how I can get WCT to run each of the elements/**/*.test.html files as its own "suite"? (I actually intend to use the describe/it format, but I assume I still need to use the term suite.)
Can someone also explain what I need to do with browser.js when I have my own web server?
I ran some experiments and did a bit of debugging with node-inspector. Firstly, the command line overrides the suites parameter in the config file:
wct app/elements/**/*.test.html
does find all my module tests if I have them stored with the elements, and ignores the contents of the suites parameter in the wct.conf.js file.
Putting the same value (i.e. app/elements/**/*.test.html) in the wct.conf.js file for the suites parameter does the same job; in fact, in this mode, gulp test:local also works correctly.
So to run different tests for module and distribution, I just need to set up wct.conf.js for my module tests and set up gulp to run a command line with the correct location of my test files; a sketch of such a config follows.
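For illustration, a minimal wct.conf.js along those lines (the glob is from the experiments above, while the plugin settings are assumptions, not from the original post):

module.exports = {
  // run every co-located module test found next to the elements
  suites: ['app/elements/**/*.test.html'],
  plugins: {
    local: {
      browsers: ['chrome']  // assumption: restrict local runs to Chrome
    }
  }
};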
I still haven't understood the instructions for running with your own web server.

Is this a job for Yeoman?

I use Yeoman, and I dig it.
However, recently I have been wanting more complex code-generation tools. Now, I know I can build custom generators, but I am wondering if people think this is the role/job/whatever that Yeoman is built to play.
Examples are:
Generating a base REST API (in Node) from a JSON schema
Generating MySQL DB Schema from JSON schema etc.
Although I could bend Yeoman to do this - do people think this is a realistic direction?
Is there a better tool for the job?
(Currently I have a bunch of custom Node scripts that suffice).
My humble opinion:
Yeoman is first and foremost a front-end tool to create webapps.
Your task seems to be backend-related.
You can still use Grunt to scaffold your project, though:
http://gruntjs.com/project-scaffolding
Cheers

Checkstyle and Findbugs for changed files only on Jenkins (and/or Hudson)

We work with a lot of legacy code, and we are thinking about introducing some metrics for new code. Is it possible to let FindBugs and Checkstyle run on changed files only, instead of on the complete project?
It would be nice to ensure that only files with a minimum of quality are checked in, while the existing code base itself is not (yet) touched and evaluated, so as not to confuse people with thousands of issues.
In theory it is possible. You would use a shell script to parse the SVN (or whatever SCM) change logs after a given start date, identify the .java files from these change sets, and build two patterns from them:
The FindBugs Maven Plugin expects a comma-separated list of class (or package) names for the onlyAnalyze parameter, so you'll have to translate file names to fully qualified class names (this will get tricky when you're dealing with inner classes).
The Maven Checkstyle Plugin is even worse: it expects a configuration file for its packageNamesLocation parameter. Unfortunately, only packages are allowed, not individual files, so you'll have to translate file names to packages.
In the above examples I assume that you are using Maven. I am pretty sure that similar things can be done with Ant, but I wouldn't know the details.
I myself would probably use a Groovy script instead of a shell script to achieve the above; a sketch follows.
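As a minimal sketch of such a Groovy script (the source root, input file and output format are assumptions; inner classes are not handled):

// Translate a list of changed .java files into a value for FindBugs' onlyAnalyze.
// Assumption: changed-files.txt holds one path per line, e.g. produced from
// the SCM change logs as described above.
def srcRoot = 'src/main/java/'   // hypothetical source root

def classNames = new File('changed-files.txt').readLines()
    .findAll { it.endsWith('.java') && it.contains(srcRoot) }
    .collect { path ->
        // strip everything up to the source root, drop '.java', map '/' to '.'
        def rel = path.substring(path.indexOf(srcRoot) + srcRoot.size())
        rel[0..-6].replace('/', '.')
    }

println "onlyAnalyze=${classNames.join(',')}"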
FindBugs also has Ant tasks that can diff different FindBugs result files to see just the deltas, i.e. report only new bugs; see
http://findbugs.sourceforge.net/manual/datamining.html