Changes of Code Coverage in SonarQube 6.2 - junit

I have a problem: I used to use SonarQube 5.1, where I had code coverage set up for both JUnit and integration tests. These were represented by three metrics in SonarQube (coverage, it_coverage, and overall_coverage, the last one being the first two merged).
Since version 6.2, coverage and it_coverage are no longer supposed to be available; there is supposed to be just one metric called coverage, which holds the value that overall_coverage had in previous versions. If the report paths to the JaCoCo reports for the JUnit tests and the integration tests are set up correctly, everything should work fine.
My question is: how can I recognize that the integration tests are included in the coverage? I don't have any indicator telling me that these tests were part of the analysis, because I still see only the number of unit tests, and the Jameleon ones are not listed there.

There is no visual way in SonarQube to be sure that your IT coverage has been taken into account.
You can probably look at the analysis logs to verify that the reports are found (if not, you will get a warning message).
If you want to be sure, you can also run an analysis without specifying the IT report and see if there's a difference or not.

Related

Sonarqube php duplication in new code

I am trying to get the SonarQube code coverage and duplication metrics displayed for newly added PHP code. I can see the code issues/code smells for the new code, but there is no count displayed for code duplication (I have even added a few duplicate lines of code to make sure there are duplicated lines for the metric to pick up).
Below is my working environment
PHP Symfony Framework: 2.7
Sonarqube version: 6.3.1
sonar-php-plugin-2.10.0.2087
Sonarscanner version: 3.0.1.733-windows
Operating System: Windows 7, 64 bit
Steps followed
Executed the PHP test cases to generate an XML report (which I reference in the sonar properties file)
Ran the SonarQube analysis on the initial code (say "version1"), which produced values for Bugs & Vulnerabilities, Code Smells, Coverage (based on the sonar.php.coverage.reportPaths value in the sonar properties file) and Code Duplications
Added new code and PHP unit test cases
Executed the PHP unit test cases and added the new XML report to the sonar properties file alongside the version1 result XML (comma separated)
Updated the version of the project (say "version2") in the sonar properties file
Reran the SonarQube analysis
I see a new column added in the report, "Leak Period: since version1". This column shows new Bugs & Vulnerabilities and new Code Smells (indicating the issues in the newly added code), but it does not display data for Coverage and Duplications.
I did not find much documentation on how to get these metrics for new code additions. Any help on this is very much appreciated.
Thank you
Duplication detection happens automatically unless you turn it off. It is likely that the duplicated blocks you added are too small to be detected: SonarQube only reports a duplication once it reaches a minimum size (for most languages, on the order of 100 duplicated tokens spread over at least 10 lines), so a handful of copied lines will not show up.

How to take screenshot on test failure with junit 5

Can someone please tell me how to take a screenshot when a test method fails (JUnit 5)? I have a base test class with @BeforeEach and @AfterEach methods. All other classes with @Test methods extend the base class.
Well, it is possible to write java code that takes screenshots, see here for example.
But I am very much wondering about the real problem you are trying to solve this way. I am not sure if you figured that yet, but the main intention of JUnit is to provide you a framework that runs your tests in various environments.
Of course it is nice that you can run JUnit within your IDE, and maybe you would find it helpful to get a screenshot. But: "normally" unit tests also run during nightly builds and such - in environments where "taking a screenshot" might not make any sense!
Beyond that: screenshots are an extremely ineffective way of collecting information! When a test fails, you should be looking at textual log files, HTML/XML reports, whatever. You want failing tests to generate information that can be easily digested.
So, the real answer here is: step back from what you are doing right now, and re-consider non-screenshot solutions to the problem you actually want to solve!
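That said, if a failure screenshot is genuinely useful (for example for Selenium-style UI tests), a minimal sketch using a JUnit 5 extension could look like the one below. It assumes a desktop session where java.awt.Robot can capture the screen; the class name and output directory are illustrative.
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;
import org.junit.jupiter.api.extension.ExtensionContext;
import org.junit.jupiter.api.extension.TestWatcher;

// Captures the whole screen whenever a test fails.
public class ScreenshotOnFailureExtension implements TestWatcher {

    @Override
    public void testFailed(ExtensionContext context, Throwable cause) {
        try {
            Rectangle screen = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
            BufferedImage capture = new Robot().createScreenCapture(screen);
            String name = context.getDisplayName().replaceAll("[^A-Za-z0-9._-]", "_");
            File target = new File("build/screenshots", name + ".png");
            target.getParentFile().mkdirs();
            ImageIO.write(capture, "png", target);
        } catch (Exception e) {
            // Never let the screenshot attempt hide the original test failure.
            System.err.println("Could not capture screenshot: " + e.getMessage());
        }
    }
}
Registering it on the base test class with @ExtendWith(ScreenshotOnFailureExtension.class) makes every subclass inherit the behaviour, which matches the base-class setup described in the question.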
You don't need to take screenshots for JUnit test failures/passes; the recommended way is rather to generate various reports (tests passed/failed report, code coverage report, code complexity report, etc.) automatically using the tools/plugins below.
You can use the Cobertura Maven plugin or the SonarQube code quality tool, and these will automatically generate the reports for you.
You can look here for the Cobertura Maven plugin and here for SonarQube for more details.
You need to integrate these tools with your CI (Continuous Integration) environment and ensure that if the code does NOT meet a certain quality bar (in terms of test coverage, code complexity, etc.), then the project build (war/ear) fails automatically.

How about an Application Centralized Configuration Management System?

We have a build pipeline to manage the artifacts' life cycle. The pipeline consists of four stages:
1. commit (running unit/integration tests)
2. at (deploy the artifact to the AT environment and run automated acceptance tests)
3. uat (deploy the artifact to the UAT environment and run manual acceptance tests)
4. pt (deploy to the PT environment and run performance tests)
5. // TODO: we're trying to support the production environment.
The pipeline supports environment variables, so we can deploy artifacts with different configurations by triggering it with options. The problem is that sometimes there are so many configuration items that the deploy script ends up containing too many replacement tasks.
I have an idea of building a centralized configuration management system (CCM for short), so we can maintain the configuration items there and leave only a URL (pointing to the CCM) replacement task (handling the different stages) in the deploy script. The artifact would then not hold the configuration values itself; it would ask the CCM for them.
Is this feasible, or a bad idea in the first place?
My concern is that the potential mismatch between a configuration key (defined in the artifact) and its value (set in the CCM) is not solved by this solution and may even get worse.
Configuration files should remain with the project, or the values should be set as configuration variables in the environment where they run. The reasoning is that you would be adding a new point of failure to your architecture: you have to take into account that your configuration server could go down, breaking everything that depends on it.
I would advise against putting yourself in this situation.
There is no problem in having a long list of environment variables defined for a project; in fact, that could even mean you're doing things properly.
If for some reason you find yourself changing configuration values a lot (for example database connection strings, API endpoints, etc.), then the real problem may be that need to keep changing settings which should almost always stay the same.
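As a small illustration of the "keep configuration in the environment" approach recommended above, the artifact can read its settings from environment variables at startup and fail fast when a key is missing, which also limits the key/value mismatch the question worries about. The sketch below is hypothetical; the variable names are just examples.
// Reads per-stage settings from environment variables set by the pipeline (at/uat/pt).
public final class AppConfig {

    private AppConfig() {
    }

    // A required setting: failing fast makes a missing or misnamed key visible
    // at deploy time instead of somewhere later in the pipeline.
    public static String required(String key) {
        String value = System.getenv(key);
        if (value == null || value.isEmpty()) {
            throw new IllegalStateException("Missing required configuration: " + key);
        }
        return value;
    }

    // An optional setting with a default value.
    public static String optional(String key, String defaultValue) {
        String value = System.getenv(key);
        return (value == null || value.isEmpty()) ? defaultValue : value;
    }
}

// Example usage (DB_URL and LOG_LEVEL are made-up names):
//   String dbUrl = AppConfig.required("DB_URL");
//   String logLevel = AppConfig.optional("LOG_LEVEL", "INFO");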

Redeploying SSIS packages - Cache?

We have noticed an issue recently where redeployed SSIS packages sometimes don't seem to include the latest changes... When I search the .dtsx in Notepad I see the amended script in the code, so the changes are definitely there.
My assumption was that script components of SSIS packages are eventually compiled into an assembly somewhere in the process - this is quite likely since I would imagine C# code cannot run without something compiling it first. So in theory if these assemblies would then end up being cached and not immediately overwritten (for some reason) that would explain this issue.
The only "evidence" that makes me think that my theory is correct is if I keep running the package at some point it suddenly shifts to the new code.
However, so far I haven't found why and how this is happening, if it is... Can anybody help?
UPDATE:
MSDN says: "Unlike earlier versions where you could indicate whether the scripts were precompiled, all scripts are precompiled in SQL Server 2008 Integration Services (SSIS) and later versions." - If by pre-compiled they mean that a pre-compiled version runs instead of the actual package (I think this because the package itself does not seem to be compiled, since the code is visible in Notepad), there must be a way to force the engine to overwrite the pre-compiled assembly... but how?
UPDATE:
One of the four core components of SSIS is the SQL Server Integration Services service, which is a Windows service. Apparently this service caches component/task metadata so that the SSIS runtime engine can poll the cache to see what is installed, which may help speed up package load times. However, if the packages are stored in the file system (not in SQL Server Integration Services) and executed by Agent jobs, the Agent job will use the 64-bit version of DTEXEC to execute the packages. I haven't yet found evidence that any caching is involved there, but there are certainly options to check a number of parameters in the validation phase of the execution, such as version numbers - maybe for a reason.
Have you looked at sysssispackages to compare the version build number of the package in msdb to your build number in Visual Studio / SSIS?
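-- Compare verbuild here with the VersionBuild property of the package in Visual Studio.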
SELECT name, verbuild
FROM msdb.dbo.sysssispackages
WHERE name LIKE '%bla%'
(Adjust WHERE-clause as necessary to find your package. Do NOT ever "SELECT * FROM msdb.dbo.sysssispackages" as it contains the package XML in one of the columns.)
And in Visual Studio, open the package, then right-click at the background of the package and select "Properties" from the context menu. Look at the field VersionBuild. It should match the number from the SELECT above!
I know this is not an actual solution to your problem but it may help locate where the cause of the problem is. If the number is older, it means that your package deployment did not work.
This sounds somewhat familiar to something I ran into a while back. Unfortunately, I don't remember exactly when I ran into this (so I can't check for sure), but I believe the fix I found was to make sure that I explicitly invoked the Build | Build st_5bd541c294054c25b9e7eb55b92bd0e2 command from the script editor (VSTA) menu before closing the window. (The specific project name will be different for each script, obviously, since it's based on a GUID; however, there will only be one possible submenu under Build.)
Explicitly invoking the Build command ensures that the compiled binary for the script gets Base64-encoded and saved in the XML of the resulting .dtsx file. I'd gotten used to SSIS 2005 always building for me whenever I closed the script editor. Apparently, there are bizarre edge cases where SSIS 2008 doesn't always build the script project when the editor closes.
BTW, the precompiled binaries appear to be stored in a tag of the source XML called BinaryItem:
<DTS:Executable DTS:ExecutableType="Microsoft.SqlServer.Dts.Tasks.ScriptTask.ScriptTask, Microsoft.SqlServer.ScriptTask, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91" DTS:ThreadHint="0">
<DTS:Property DTS:Name="ObjectName">SCR_StepOne</DTS:Property>
<DTS:ObjectData>
<ScriptProject Name="ST_5bd541c294054c25b9e7eb55b92bd0e2" VSTAMajorVersion="2" VSTAMinorVersion="1" Language="CSharp" EntryPoint="Main" ReadOnlyVariables="User::FileOneName,User::OutputFolder" ReadWriteVariables="">
<BinaryItem Name="\bin\release\st_5bd541c294054c25b9e7eb55b92bd0e2.csproj.dll">
TVqQAAMAAAAEAAAA//8AALgAAAAAAAAAQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAgAAAAA4fug4AtAnNIbgBTM0hVGhpcyBwcm9ncmFtIGNhbm5vdCBiZSBydW4gaW4gRE9TIG1v
ZGUuDQ0KJAAAAAAAAABQRQAATAEDADuOb04AAAAAAAAAAOAAAiELAQgAABAAAAAIAAAAAAAAPi8A
AAAgAAAAQAAAAABAAAAgAAAAAgAABAAAAAAAAAAEAAAAAAAAAACAAAAAAgAAAAAAAAMAQIUAABAA
It might be worth checking your source code control system history to see if that was getting updated for some of those screwy errors.
Caveat: I haven't found official Microsoft documentation on this.
This doesn't specifically solve the mystery you have, but if you are running file system-based packages and want to verify that the package that is running is the package you deployed, there is a way to do that.
Build your package.
Open the properties on your package and note down the "Version Build" property (alternatively, open the .dtsx in notepad and find the DTS:VersionBuild attribute.)
Deploy your package.
In your SQL Agent job step, go to the Verification tab.
Enter the Version Build in the "Verify package build" input box.
Execute the job step.
I don't know if this will force SSIS to throw out its cache and pick up the newly deployed package, but I do know that if you modify the .dtsx package's build number by hand and then try to re-run the job step, it fails because the package build doesn't match what it's looking for - so it is definitely doing a run-time check of that value.

Hudson - save artifacts only when less than 90% passes

I am new at this and I was wondering how I can set things up so that the artifacts are saved only if less than 90% of the tests have passed.
Any idea how I can do this?
thanks
This is not currently possible with Hudson. What is the motivation to avoid archiving artifacts on every build?
How about a rather simple workaround: you create a post-build step (or an additional build step) that calls your tests from the command line. Be sure to capture all errors so Hudson doesn't count them as a failure. Then you evaluate your condition and set the error level accordingly. In addition, you need to save the reports (probably outside Hudson) before you set the error level, so they are available even (or only) when the build fails.
My assumption here is that it is OK not to run the tests when building the app fails. However, you can also separate building and testing into two jobs. See here.
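A minimal sketch of the "evaluate your condition and set the error level" step from the workaround above, assuming a single Surefire-style JUnit XML report whose root <testsuite> element carries tests/failures/errors attributes; the report path and the 90% threshold are illustrative.
import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Element;

public class PassRateCheck {

    public static void main(String[] args) throws Exception {
        // Path to the JUnit/Surefire XML report produced by the test run (example path).
        File report = new File(args.length > 0 ? args[0] : "target/surefire-reports/TEST-summary.xml");
        Element suite = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(report)
                .getDocumentElement();

        int tests = Integer.parseInt(suite.getAttribute("tests"));
        int failures = Integer.parseInt(suite.getAttribute("failures"));
        int errors = Integer.parseInt(suite.getAttribute("errors"));
        double passRate = tests == 0 ? 0.0 : 100.0 * (tests - failures - errors) / tests;

        System.out.printf("Pass rate: %.1f%%%n", passRate);
        // A non-zero exit code signals the "less than 90% passed" condition to the
        // calling build step, which can then decide whether to archive the artifacts.
        System.exit(passRate < 90.0 ? 1 : 0);
    }
}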