SonarQube PHP duplication in new code - duplicates

I am trying to get the SonarQube code coverage and duplication metrics displayed for newly added PHP code. I can see the code issues/code smells for the new code. But there is no count displayed for code duplication (I have even added a few duplicate lines of code, to make sure there are duplicated lines for the metrics to pick up).
Below is my working environment:
PHP Symfony framework: 2.7
SonarQube version: 6.3.1
SonarQube PHP plugin: sonar-php-plugin-2.10.0.2087
SonarScanner version: 3.0.1.733-windows
Operating system: Windows 7, 64-bit
Steps followed:
1. Execute the PHP test cases to generate an XML report (which I reference in the sonar properties file).
2. Run the SonarQube analysis on the initial code (say "version1"), which produced values for Bugs & Vulnerabilities, Code Smells, Coverage (based on the sonar.php.coverage.reportPaths value in the sonar properties file) and Code Duplications.
3. Add new code and PHP unit test cases.
4. Execute the PHP unit test cases and add the resulting XML report to the sonar properties file alongside the version1 report (comma separated).
5. Update the project version (say "version2") in the sonar properties file; a sketch of this file follows these steps.
6. Rerun the SonarQube analysis.
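A sketch of the relevant parts of the sonar properties file used in these steps; the project key and paths are placeholders, and only sonar.projectVersion and sonar.php.coverage.reportPaths are the properties actually discussed here:

# sonar-project.properties (sketch; key and paths are placeholders)
sonar.projectKey=org.example:php-app
sonar.projectName=PHP App
# bumped from "version1" to "version2" for the second analysis
sonar.projectVersion=version2
sonar.sources=src
# comma-separated PHPUnit coverage reports (version1 run + version2 run)
sonar.php.coverage.reportPaths=reports/coverage-v1.xml,reports/coverage-v2.xml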
I see a new column added in the report, "Leak Period: since version1". This column shows new Bugs & Vulnerabilities and new Code Smells (indicating the issues in the newly added code), but it does not display data for Coverage and Duplications.
I did not find much documentation on how to get these metrics for new code additions. Any help on this is very much appreciated.
Thank you

Duplication detection happens automatically unless you turn it off. It is likely that the duplicated blocks you added are too small to be detected: if I remember the documentation correctly, for most languages a block only counts as a duplicate once there are at least 100 duplicated tokens in a row, spread across at least 10 lines of code.
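For a quick sanity check, duplicating a block of roughly this size verbatim in two places should be enough for the detector to flag it; the names below are made up for illustration:

<?php
// Hypothetical helper; paste it unchanged into two files (or twice into
// one file) so the copy detector has a large enough block to match.
function buildReportRow($order)
{
    $row = array();
    $row['id'] = $order->getId();
    $row['customer'] = $order->getCustomerName();
    $row['total'] = $order->getTotal();
    $row['tax'] = $order->getTotal() * 0.2;
    $row['shipped'] = $order->isShipped() ? 'yes' : 'no';
    $row['notes'] = $order->getNotes();
    $row['created'] = $order->getCreatedAt()->format('Y-m-d');
    $row['updated'] = $order->getUpdatedAt()->format('Y-m-d');
    return $row;
}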

Related

How could I go about loading functions from NTDLL without linking against it or any other DLLs?

I've been experimenting with loading functions from the Windows system DLLs using only the loader functions exported by NTDLL. This works as expected. For the sake of curiosity and getting an even better understanding of the process structure in NT-based systems, I've started trying to load functions from NTDLL by doing the following steps:
1. Load the PEB of the process from gs:[60h]
2. Iterate over the modules loaded into the process according to the loader to find NTDLL's base address
3. Parse the PE headers of NTDLL
4. Try to parse the export table to find LdrLoadDll, LdrGetDllHandle, and LdrGetProcedureAddress
This fails at step 4. After stepping through it in a debugger (both VS2019 and WinDbg Preview), it seems as though the offsets I've tried yield an invalid structure that leads to an access violation when my code compares the current function name to one of the ones I'm searching for. My code is being compiled and run on a 64-bit copy of Windows 10 Pro build 21364. Note that I'm using my own header that contains definitions for the structures used for this (these definitions are from winnt.h and here) because the Windows headers don't really play nice with the rest of my code. The function trying to do this is here. For the record, this is part of an attempt to implement my own libc (again, for the sake of curiosity). The code that calls the functions is here. Any help with this is tremendously appreciated.
Never mind, it turns out I had outdated, verbose definitions of the structures I was using. I found better (more up-to-date) definitions at https://vergiliusproject.com.
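For anyone who lands here later, this is roughly the walk that steps 1-4 describe, as a hedged C sketch. For brevity it leans on the trimmed-down structures from winternl.h/winnt.h (the very headers I avoided in my project), ignores forwarded exports, and assumes 64-bit Windows; "the second in-memory-order module is ntdll" is a long-standing loader detail, not a documented contract.

#include <windows.h>
#include <winternl.h>
#include <intrin.h>
#include <string.h>

static void *find_ntdll_export(const char *wanted)
{
    /* Step 1: the PEB lives at gs:[60h] on x64. */
    PEB *peb = (PEB *)__readgsqword(0x60);

    /* Step 2: the first InMemoryOrder entry is the EXE itself; the
       second is ntdll.dll. */
    LIST_ENTRY *head = &peb->Ldr->InMemoryOrderModuleList;
    LDR_DATA_TABLE_ENTRY *ntdll = CONTAINING_RECORD(
        head->Flink->Flink, LDR_DATA_TABLE_ENTRY, InMemoryOrderLinks);
    BYTE *base = (BYTE *)ntdll->DllBase;

    /* Step 3: DOS header -> NT headers -> export data directory. */
    IMAGE_DOS_HEADER *dos = (IMAGE_DOS_HEADER *)base;
    IMAGE_NT_HEADERS64 *nt = (IMAGE_NT_HEADERS64 *)(base + dos->e_lfanew);
    DWORD rva = nt->OptionalHeader
                    .DataDirectory[IMAGE_DIRECTORY_ENTRY_EXPORT]
                    .VirtualAddress;
    IMAGE_EXPORT_DIRECTORY *exp = (IMAGE_EXPORT_DIRECTORY *)(base + rva);

    /* Step 4: all three export arrays hold RVAs from the module base;
       AddressOfNameOrdinals maps a name index to a function index. */
    DWORD *names     = (DWORD *)(base + exp->AddressOfNames);
    WORD  *ordinals  = (WORD  *)(base + exp->AddressOfNameOrdinals);
    DWORD *functions = (DWORD *)(base + exp->AddressOfFunctions);

    for (DWORD i = 0; i < exp->NumberOfNames; i++) {
        if (strcmp((const char *)(base + names[i]), wanted) == 0)
            return base + functions[ordinals[i]];
    }
    return NULL;  /* e.g. find_ntdll_export("LdrLoadDll") on success */
}

With the up-to-date structure definitions from the Vergilius Project substituted for the winternl.h ones, the same walk works without pulling in any Windows headers.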

Changes of Code Coverage in SonarQube 6.2

I have a problem. I used to use SonarQube 5.1, where I had code coverage set up for both JUnit and integration tests. Together these were represented by three metrics in SonarQube (coverage, it_coverage, and overall_coverage, the previous two merged).
Since version 6.2, coverage and it_coverage are not supposed to be available anymore; there is supposed to be just one metric called coverage, which holds what overall_coverage held in previous versions. If the report paths to JaCoCo are set up correctly for both the JUnit and the integration test reports, everything should work well.
My question is: how can I recognize that the integration tests are included in the coverage? I don't have any indicator that these tests were part of the analysis, because I still see just the number of unit tests, and the Jameleon ones are not listed there.
There is no visual way in SonarQube to be sure that your IT coverage has been taken into account.
You can probably look at the analysis logs to verify that the reports are found (if not, you will get a warning message).
If you want to be sure, you can also run an analysis without specifying the IT report and see if there's a difference or not.
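If it helps as a sanity check, both reports are fed to the analysis through a single comma-separated property in the 6.2-era setup, if I remember the property name correctly; the paths below are hypothetical:

# sonar-project.properties
# unit test report first, integration test (e.g. Jameleon) report second
sonar.jacoco.reportPaths=target/jacoco.exec,target/jacoco-it.exec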

Issue with creating JSON output for Busted Lua unit testing

So I have some simple unit tests set up in Busted. I am a little new to Lua, so I may be missing something obvious.
When I run:
lua test.lua
I get the expected results (7 succeeded, 1 failed on purpose to try out Busted) in the nice terminal output.
My ultimate goal however is to output JSON results, and have a script that consumes JSON from multiple tests to make some summary pages for my fellow WoW addon developers.
When I run:
lua test.lua -o json
my terminal pauses for a brief second, and I am returned to the command line.
There is no terminal output, nor is any file created.
I am relatively new to Lua and Busted in general; could you give me any pointers?
The issue in question was caused by the dkjson module not handling functions in tables properly. The bug was fixed in pull request #449, so you should wait for the fix to reach the next release candidate (>2.0.rc10-0) of Busted, or just download and build a recent version from here. BTW, the relevant bug report is #448.
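Once you are on a fixed build, a sketch of the kind of summary script you described might look like this. dkjson is assumed to be installed, the file names are hypothetical, and the result field names (successes/failures/errors) are an assumption based on Busted's terminal output rather than a documented schema:

-- summarize.lua: aggregate Busted JSON results from several test runs,
-- e.g. produced with `busted -o json test.lua > core_results.json`.
local json = require("dkjson")

local files = { "core_results.json", "ui_results.json" }  -- hypothetical names
local passed, failed = 0, 0

for _, path in ipairs(files) do
  local f = assert(io.open(path, "r"))
  local data, _, err = json.decode(f:read("*a"))
  f:close()
  assert(data, err)
  -- field names assumed from how Busted groups its results
  passed = passed + #(data.successes or {})
  failed = failed + #(data.failures or {}) + #(data.errors or {})
end

print(("passed: %d, failed: %d"):format(passed, failed))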

Xtext project creation concerns

Before I begin I must admit that I am new to Xtext and the designing of DSLs. Some of my questions on this matter may be somewhat "less than intelligent".
I have created an Xtext project using the IDE, and I am simultaneously using one of the sample projects provided with Xtext as a guide to what I need to do in my language. I am seeing a lot of warnings that are making me nervous.
Apparently, when the development environment creates a new project, it somehow configures that project to use the Java 5 libraries. I am using Java 6, and as a result I get warnings saying that my project is configured for Java 5 and there is no Java 5 on my system (which there isn't!).
I have tried altering the build path so that it uses the Java 6 libraries, but this generates a number of other warnings, including warnings that the Java 6 referenced in my MANIFEST.MF file is invalid!
Then there are the "plugin.xml" warnings. Apparently, the build.properties file references a file called "plugin.xml" which is not created when the IDE creates the project. I have no idea whether or not this file is important enough to create, and I have no idea what should go into it.
Frankly, I hate warnings. Warnings tend to lead to future problems in what I produce. I like clean compiles and clean deployments. I would like to eliminate these warnings before they start screwing me up down the road (like putting in Java 6 classes that would break in a Java 5 library).
Has anyone been able to eliminate these warnings reliably? Please advise.
For the JDK warning, you simply switch the execution environment in the MANIFEST.MF to one matching your preferred JDK ('JavaSE-1.6' in your case).
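Concretely, that is the Bundle-RequiredExecutionEnvironment header. A minimal sketch, with all project-specific headers omitted:

Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-RequiredExecutionEnvironment: JavaSE-1.6

The build path warning usually goes away once the JRE System Library container on the project's build path is set to the same execution environment.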
The warning regarding the missing plugin.xml will be gone as soon as you have run the grammar generator for the first time, as it will produce such a file.

Redeploying SSIS packages - Cache?

We have noticed an issue recently: redeployed SSIS packages sometimes don't seem to include the latest changes... When I search the .dtsx using Notepad, I see the amended script in the code, so the changes are definitely there.
My assumption was that the script components of SSIS packages are eventually compiled into an assembly somewhere in the process. This is quite likely, since I would imagine C# code cannot run without something compiling it first. So in theory, if these assemblies ended up being cached and not immediately overwritten (for some reason), that would explain this issue.
The only "evidence" that makes me think my theory is correct is that if I keep running the package, at some point it suddenly shifts to the new code.
However, so far I haven't found why and how this is happening, if it is... Can anybody help?
UPDATE:
MSDN says: "Unlike earlier versions where you could indicate whether the scripts were precompiled, all scripts are precompiled in SQL Server 2008 Integration Services (SSIS) and later versions." If by precompiled they mean that a precompiled version runs instead of the actual package (I suspect this because the package itself does not seem to be compiled, since the code is visible in Notepad), there must be a way to force the engine to overwrite the precompiled assembly... but how?
UPDATE:
One of the four core components of SSIS is the SQL Server Integration Services service, which is a Windows service. Apparently this service caches component/task metadata so that the SSIS runtime engine can poll the cache to see what is installed, which may help speed up package load times. However, if the packages are stored in the file system (not in SQL Server Integration Services) and executed by Agent jobs, the Agent job will use the 64-bit version of DTEXEC to execute the packages. I haven't yet found evidence that any caching is involved there, but there are certainly options to check a number of parameters in the validation phase of the execution, such as version numbers... maybe for a reason.
Have you looked at sysssispackages to compare the version build number of the package in msdb to your build number in Visual Studio / SSIS?
SELECT name, verbuild
FROM msdb.dbo.sysssispackages
WHERE name LIKE '%bla%'
(Adjust the WHERE clause as necessary to find your package. Do NOT ever "SELECT * FROM msdb.dbo.sysssispackages", as it contains the package XML in one of the columns.)
And in Visual Studio, open the package, then right-click at the background of the package and select "Properties" from the context menu. Look at the field VersionBuild. It should match the number from the SELECT above!
I know this is not an actual solution to your problem but it may help locate where the cause of the problem is. If the number is older, it means that your package deployment did not work.
This sounds somewhat familiar to something I ran into a while back. Unfortunately, I don't remember exactly when I ran into this (so I can't check for sure), but I believe the fix I found was to make sure that I explicitly invoked the Build | Build st_5bd541c294054c25b9e7eb55b92bd0e2 command from the script editor (VSTA) menu before closing the window. (The specific project name will be different for each script, obviously, since it's based on a GUID; however, there will only be one possible submenu under Build.)
Explicitly invoking the Build command ensures that the binary code for the script gets ASCII-encoded and saved in the XML of the resulting .dtsx file. I'd gotten used to SSIS 2005 always building for me whenever I closed the script editor. Apparently, there are bizarre edge cases where SSIS 2008 doesn't always build the script project when the editor closes.
BTW, the precompiled binaries appear to be stored in a tag of the source XML called BinaryItem:
<DTS:Executable DTS:ExecutableType="Microsoft.SqlServer.Dts.Tasks.ScriptTask.ScriptTask, Microsoft.SqlServer.ScriptTask, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91" DTS:ThreadHint="0">
<DTS:Property DTS:Name="ObjectName">SCR_StepOne</DTS:Property>
<DTS:ObjectData>
<ScriptProject Name="ST_5bd541c294054c25b9e7eb55b92bd0e2" VSTAMajorVersion="2" VSTAMinorVersion="1" Language="CSharp" EntryPoint="Main" ReadOnlyVariables="User::FileOneName,User::OutputFolder" ReadWriteVariables="">
<BinaryItem Name="\bin\release\st_5bd541c294054c25b9e7eb55b92bd0e2.csproj.dll">
TVqQAAMAAAAEAAAA//8AALgAAAAAAAAAQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
AAAAgAAAAA4fug4AtAnNIbgBTM0hVGhpcyBwcm9ncmFtIGNhbm5vdCBiZSBydW4gaW4gRE9TIG1v
ZGUuDQ0KJAAAAAAAAABQRQAATAEDADuOb04AAAAAAAAAAOAAAiELAQgAABAAAAAIAAAAAAAAPi8A
AAAgAAAAQAAAAABAAAAgAAAAAgAABAAAAAAAAAAEAAAAAAAAAACAAAAAAgAAAAAAAAMAQIUAABAA
It might be worth checking your source code control system history to see if that was getting updated for some of those screwy errors.
Caveat: I haven't found official Microsoft documentation on this.
This doesn't specifically solve the mystery you have, but if you are running file system-based packages and want to verify that the package that is running is the package you deployed, there is a way to do that.
1. Build your package.
2. Open the properties on your package and note down the "Version Build" property (alternatively, open the .dtsx in Notepad and find the DTS:VersionBuild attribute).
3. Deploy your package.
4. In your SQL Agent job step, go to the Verification tab.
5. Enter the Version Build in the "Verify package build" input box.
6. Execute the job step.
I don't know if this will force SSIS to throw out its cache and fetch the newly deployed package, but I do know that if you modify the .dtsx package's build number by hand and then try to re-run the job step, it fails because the package build doesn't match what it's looking for. So it is definitely doing a run-time check of that value.
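For completeness: the same check exists when executing a package directly, since DTEXEC has a /VerifyBuild option that takes major;minor;build and fails the run on a mismatch. A sketch with a hypothetical path and version numbers:

dtexec /F "C:\Packages\MyPackage.dtsx" /VerifyBuild 1;0;42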