Is it possible to configure in the Maven plugin for which classes (e.g. include/exclude by name pattern) JUnit tests should be generated?
I don't need tests for every class.
I managed to do it from the command line, but I need to do it in Maven.
It isn't possible to do that, but you can use the -Dcuts and -DcutsFile command line arguments to generate test classes for 1..N classes.
Writing a small class that generates the class names for a given package structure is pretty trivial. You can then use the results to create a comma-delimited string that can be pasted into the file you associate with the -DcutsFile argument, e.g.:
mvn evosuite:generate evosuite:export -DcutsFile=c:\temp\cutsFile.txt
Where the contents of cutsFile.txt are:
com.foo.A, com.foo.B
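If you'd rather use a quick script than write that small class, a rough sketch along these lines (Python, assuming the compiled classes end up under target/classes) can produce the comma-delimited list:
# list_cuts.py - walk a compiled-classes directory and print the fully
# qualified class names as a comma-delimited string.
import os
import sys

def list_classes(classes_dir):
    names = []
    for root, _, files in os.walk(classes_dir):
        for file_name in files:
            # Skip inner/anonymous classes, keep only top-level .class files.
            if file_name.endswith(".class") and "$" not in file_name:
                rel = os.path.relpath(os.path.join(root, file_name), classes_dir)
                names.append(rel[:-len(".class")].replace(os.sep, "."))
    return names

if __name__ == "__main__":
    classes_dir = sys.argv[1] if len(sys.argv) > 1 else "target/classes"
    print(", ".join(sorted(list_classes(classes_dir))))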
Another way to generate the list of testable classes is to run the EvoSuite jar from the command line with the -listClasses and -target parameters and pipe the output to a file (Windows example below):
java -jar c:\evosuite\evosuite-master-1.0.5.jar -listClasses -target target/classes > c:\temp\testableClasses.txt
From there you can pick and choose the classes you want to add to the CUTs file, or pass them directly on the command line with the -Dcuts argument.
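If you only need a handful of classes, you can skip the file entirely and name them inline, e.g. (same placeholder class names as above):
mvn evosuite:generate evosuite:export -Dcuts=com.foo.A,com.foo.B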
I have a KNIME workflow, and in the middle of it I must execute an external program that creates an Excel file.
Is there a node that allows me to achieve this? I don't need to pass any input or output, only execute the program and wait for it to generate the Excel file (I need to use this Excel file in the next nodes).
There are (at least) two “External Tool” nodes which allow running executables on the command line:
External Tool
External Tool (Labs)
In case that should not be enough, you can always go for a Java Snippet node. The java.lang.Runtime class should be your entry point.
You could use the External Tool node. The node requires inputs and outputs, but you can use a Table Creator node for the input:
This creates an empty table.
In the External Tool node you must include an input file and an output file; depending on your use case this configuration may be meaningless, but it is required for the node to work.
In this case the external app writes a text file with the result of the execution, so that file can afterwards be read back to get the information into KNIME.
I'm still pretty new to running anything in PyCharm more advanced than a simple script. I'm writing a test in pytest right now and I want the test results output to a JUnit XML file. I'm thinking the best naming convention will be based on the current date/time, so I am trying to pipe in the current date, using the date shell command, as an environment variable as seen below:
Current Configuration:
However, when I run the configuration as-is, it just names the .xml file based on the command without actually executing it. Any ideas what I'm missing, or if this is even possible?
Thanks!
Yes, it is possible with a workaround, but I don't think what you are trying to achieve is possible using a single configuration. The value you set in Environment variables is substituted as-is and is not executed in bash first.
The workaround would be to use multiple configurations.
Store the following line in a bash file.
export PYTEST_EXEC_TIME=$(date '+%Y-%m-%d%H:%M:%S')
Add a bash configuration which executes this file.
Add that configuration to the pytest configuration as a "Before Launch" configuration and use the $PYTEST_EXEC_TIME in the additional parameters.
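For example, the additional pytest argument could then be something like (the report name pattern is just an illustration):
--junitxml=report_$PYTEST_EXEC_TIME.xml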
Note: Here is a detailed answer showing step by step process of setting up a "Before Launch" configuration.
I'm using Google Cloud Deployment Manager and I am trying to get external input into my template. Namely, I want to set a metadata variable on my instance (when creating the instance) but provide this value at execution time.
I've tried:
gcloud deployment-manager deployments create test-api-backend --config test-api-backend.yaml --properties 'my_value=hello'
Which fails (The properties flag should only be used when passing in a template as your config file.)
I've tried:
my_value=hello gcloud deployment-manager deployments create test-api-backend --config test-api-backend.yaml
And used {{env['my_value']}}, but the value isn't picked up.
I guess I could add the property in a .jinja file and rewrite this file before I run everything, but it feels like a hack. That, or my idea of passing a variable from the shell into Deployment Manager is a hack. I'm honestly not sure.
As the error message indicates, the command line properties can only be used with a template. They are essentially meant to replace the config yaml file.
The easiest thing to do is to just rename your yaml file to a .py or .jinja file. Then use that template as the file in the gcloud command instead of the yaml file.
In that new template file, add any defaults you would like if you don't pass them in on the command line.
For Python, something like:
if 'myparam' in context.properties:
    value_to_use = context.properties['myparam']
else:
    value_to_use = my_default_value
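To make that concrete, here is a minimal sketch of what the top-level Python template could look like (the empty resource list and the default value are placeholders, not taken from the question):
# test-api-backend.py - minimal sketch of a top-level Deployment Manager template.
def generate_config(context):
    # Use the command-line property if it was provided, otherwise fall back to a default.
    my_value = (context.properties or {}).get('my_value', 'my-default-value')
    return {
        'resources': [],  # the instance definition that consumes my_value would go here
        'outputs': [{'name': 'my_value', 'value': my_value}],
    }
The deployment is then created from the template instead of the yaml config, e.g. (if I remember correctly, --properties takes colon-separated key:value pairs when used with a template):
gcloud deployment-manager deployments create test-api-backend --template test-api-backend.py --properties my_value:hello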
If the template uses another template, then you'll also need to create a schema file for the new top-level template so you can do the imports there instead of in the yaml file.
See the schema file in this GitHub example:
https://github.com/GoogleCloudPlatform/deploymentmanager-samples/blob/master/examples/v2/igm-updater/ha-service.py.schema
If you want, you can ignore all the properties and just do the imports section.
Calling a file resulting from the concatenation (bash: cat ... >> app.js) of the following three files:
/usr/share/ceylon/1.2.0/repo/ceylon/language/1.2.0/ceylon.language-1.2.0.js
modules/com/example/helloworld/1.0.0/com.example.helloworld-1.0.0-model.js
modules/com/example/helloworld/1.0.0/com.example.helloworld-1.0.0.js
with the command nodejs app.js does nothing. The same happens when it is used in a web page. How do I have to call that JavaScript program so that it runs without using require.js?
Please give the rules for how Ceylon modules, the run function, and the other functions contained within them translate to JavaScript and how they are to be called.
How can I get one JavaScript file from the compilation of several Ceylon modules without concatenating them manually or with require.js?
The above is without using the Google Closure Compiler.
Given that the language module is 1.6 MB, it makes no sense to run ceylon-js without the Google Closure Compiler.
Compiling "ceylon.language-1.2.0.js" alone with the Google Closure Compiler results in a lot of warnings:
java -jar compiler.jar --compilation_level ADVANCED_OPTIMIZATIONS --js /usr/share/ceylon/1.2.0/repo/ceylon/language/1.2.0/ceylon.language-1.2.0.js --js_output_file lib-compiled.js
How can I get rid of those warnings?
In what order do I have to chain together the files resulting from ceylon-js with the model file and the language file to compile them in advanced mode with the Google Closure Compiler for dead-code elimination?
These are 3 questions, really.
A Ceylon module is compiled to a CommonJS module. Concatenating the resulting files won't work because each file is in CommonJS format, which is a big function that returns an object with the exported declarations.
You can compile the modules with the --no-module option to get just the generated code, without it being wrapped in CommonJS format. For the language module, you can copy the file and just delete the first line and the last 5 lines.
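For example, for the module from the question (a sketch; check ceylon compile-js --help for the exact option name):
ceylon compile-js --no-module com.example.helloworld/1.0.0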
I do not yet know how to get rid of the warnings you mention in the second question.
And as for the third question, I would recommend putting the language module first, then the rest of the files. If you have any toplevel declarations with the same name in different modules, you'll have conflicts (only the last declaration will remain), even if they're not shared, since they're all in the same module/unit.
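With the files from the question (after unwrapping them as described above), the concatenation with the language module first would then look like:
cat /usr/share/ceylon/1.2.0/repo/ceylon/language/1.2.0/ceylon.language-1.2.0.js modules/com/example/helloworld/1.0.0/com.example.helloworld-1.0.0-model.js modules/com/example/helloworld/1.0.0/com.example.helloworld-1.0.0.js > app.js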
Well, I think the require.js optimizer can combine the modules into one file and then run the Google Closure Compiler; see: http://www.requirejs.org/docs/optimization.html
I have two scripts in the pre-build step of a Jenkins job: the first one is a Perl script, the second a system Groovy script using the Groovy plugin. I need information from the first Perl script in my second Groovy script. I think the best way would be to set some environment variable, and I was wondering how that can be realized.
Or is there any other, better way?
Thanks for your time.
The way to propagate environment variables among build steps is via the EnvInject Plugin.
Here are some previous answers that show how to do it:
How to set environment variables in Jenkins?
Jenkins : Report results of intermediate [windows batch] build steps in email body
In your case, however, it may be simpler just to write to a file in one build step and read that file in another. To make sure you do not accidentally read from a previous version of the file you can incorporate BUILD_ID in the file name.
Using the EnvInject Plugin, in the job configuration you should use Inject environment variables to the build process / Evaluated Groovy script.
Depending on the setup, you may execute a Groovy or shell command and save the result in a map containing environment variables:
Example
Either by getting the command result with the execute method:
return [DATE: 'date'.execute().text.trim()]
or with the Groovy equivalent, if one exists:
return [DATE: new Date()]
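Once injected, the DATE variable is available to later build steps like any other environment variable, e.g. in a shell build step:
echo "Build started at ${DATE}"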