I have a project built with Hudson CBS, and I am using Cobertura for test coverage. Reports are generated and I am happy with them.
However, I cannot find the delta of the coverage percentage.
For example:
check-in #1 - code coverage is 90%
check-in #2 - code coverage is 75%, i.e. down by 15 percentage points.
Can I achieve this with the Hudson Cobertura plug-in? Is there any alternative?
I solved this by parsing the Cobertura XML files and pushing the individual build data into a database. You can do the same with other metrics, like the number of tests and complexity.
Placing the results into a database gives you a wide range of display options. We use Excel and SharePoint to display our most important metrics. A simple web page with charts and graphs (is it still simple?) will also do the trick.
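A minimal sketch of the idea (the paths, table name, and schema here are illustrative, not the actual script): read the overall line-rate from a Cobertura coverage.xml and append it to a database keyed by build number; the delta is then a simple query between consecutive builds.

import sqlite3
import xml.etree.ElementTree as ET

def record_coverage(build_number, xml_path="coverage.xml", db_path="metrics.db"):
    # The root element of a Cobertura report looks like <coverage line-rate="0.85" ...>
    pct = float(ET.parse(xml_path).getroot().get("line-rate")) * 100
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS coverage (build INTEGER, pct REAL)")
    con.execute("INSERT INTO coverage VALUES (?, ?)", (build_number, pct))
    con.commit()
    con.close()
    return pct

Comparing the two most recent rows then gives you exactly the "down by 15 points" figure the question asks for.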
Related
I have an assignment to display the difference in coverage data between two coverage reports. For example, suppose I already have a GCOV and a corresponding LCOV file, and then I made some changes and generated new GCOV and LCOV files. Now I want to find out what the delta between these reports is, i.e., whether my latest code change covered more code or not. Is there any tool which can find that out? Or what steps should I proceed with?
I tried searching the internet for tools that could generate this difference, but could not find any.
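Lacking a ready-made tool, one approach is a small script: an LCOV .info file records, per source file, the lines found (LF:) and lines hit (LH:), so summing those totals for each report yields an overall percentage you can diff. A minimal sketch (the file names are placeholders):

def lcov_totals(path):
    # Sum the LF: (lines found) and LH: (lines hit) records across all files
    found = hit = 0
    with open(path) as f:
        for line in f:
            if line.startswith("LF:"):
                found += int(line[3:])
            elif line.startswith("LH:"):
                hit += int(line[3:])
    return 100.0 * hit / found if found else 0.0

old_pct = lcov_totals("old.info")
new_pct = lcov_totals("new.info")
print(f"coverage: {old_pct:.1f}% -> {new_pct:.1f}% ({new_pct - old_pct:+.1f} points)")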
I've got a collection of Jenkins jobs which are all essentially test packs - running lots of JUnit tests.
I keep the results for 7 days and, with the aid of the Global Build Stats and Build Metrics plugins, I can get the percentage of builds (test packs) that had at least one failure in the last week.
What I'm now interested in is the percentage of all test failures over one week, to get a better idea of how badly the set of builds failed - was it just one test that caused each build to fail, or all of them? Is this possible with an existing plugin?
I know the data is there, because the home page of any of my jobs has a graph on the right where the green area represents test passes and the red area failures, for all of the previous builds. This gives me some idea, but I'd like a figure to report with.
You may want to take a look at the Unit Test History Generator or Test Results Analyzer plugins.
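If neither plugin fits, a do-it-yourself sketch under some assumptions: the JUnit publisher attaches a test result action (with failCount and totalCount fields) to each build, and the Jenkins JSON API exposes it, so you can aggregate across jobs yourself. The server URL and job names below are placeholders:

import json
from urllib.request import urlopen

JENKINS = "http://jenkins.example.com"   # placeholder server
JOBS = ["test-pack-a", "test-pack-b"]    # placeholder job names

failed = total = 0
for job in JOBS:
    url = (f"{JENKINS}/job/{job}/api/json"
           "?tree=builds[number,actions[failCount,totalCount]]")
    for build in json.load(urlopen(url))["builds"]:
        for action in build["actions"]:
            if "totalCount" in action:   # only the test result action has these fields
                failed += action.get("failCount", 0)
                total += action["totalCount"]

if total:
    print(f"{100.0 * failed / total:.1f}% of all test runs failed")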
I'm using Hudson CI to automate the integration of my FPGA projects. In one of the build steps, I run a logic synthesis tool which produces a plain-text report file. The report contains a few metrics, such as the maximum frequency, which I would like to monitor over time. Here's how the maximum frequency appears in the report:
Minimum period: 5.720ns (Maximum Frequency: 174.821MHz)
How can I extract and monitor/chart such metrics in Hudson?
This question has been answered on the Hudson forum: http://www.eclipse.org/forums/index.php/mv/msg/452719/1007740/#msg_1007740
The solution is to use the Plot plugin.
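For completeness, a minimal sketch of the glue this usually needs, assuming the Plot plugin's properties-file input format (a file containing a YVALUE=<number> entry; the file names here are illustrative): a build step extracts the metric from the report with a regular expression and writes it where the plugin can pick it up.

import re

with open("synthesis.rpt") as f:          # the synthesis report (illustrative name)
    report = f.read()

# Matches e.g. "Minimum period: 5.720ns (Maximum Frequency: 174.821MHz)"
match = re.search(r"Maximum Frequency:\s*([\d.]+)\s*MHz", report)
if match:
    with open("max_frequency.properties", "w") as out:
        out.write(f"YVALUE={match.group(1)}\n")   # one plotted point per build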
I have three questions:
What is code coverage?
What is it good for?
What tools are used for analyzing code coverage?
You can get very good information from these Stack Overflow questions:
Free code coverage tools
What is Code Coverage and how do YOU measure it?
Code coverage is a measurement of how many lines/blocks/arcs of your code are executed while the automated tests are running. Coverage is collected by using a specialized tool to instrument the binaries to add tracing calls, and then running a full set of automated tests against the instrumented product. A good coverage tool will give you not only the percentage of the code that is executed, but will also allow you to drill into the data and see exactly which lines of code were executed during a particular test.
Code coverage algorithms were first created to address the problem of assessing source code by looking at it directly. Code coverage belongs to the structural testing category because of the assertions made on the internal parts of the program rather than on system outputs. Therefore, code coverage aims at finding parts of the code that are not exercised by the tests.
http://www.stickyminds.com/sitewide.asp?Function=edetail&ObjectType=ART&ObjectId=7580
It is good for:
Functional coverage, which finds how many functions or procedures were executed.
Statement or line coverage, which identifies the number of lines in the source code that have been executed.
Condition or decision coverage, which answers how many of the conditions in the program were executed.
Path coverage, which focuses on whether all possible paths from a given starting point in the code have been executed.
Entry and exit coverage, which finds how many functions (C/C++, Java) or procedures (Pascal) were executed from beginning to end.
TOOLS
http://www.codecoveragetools.com/
http://java-source.net/open-source/code-coverage
http://www.codecoveragetools.com/index.php/coverage-process/code-coverage-tools-java.html
http://open-tube.com/10-code-coverage-tools-c-c/
http://csharp-source.net/open-source/code-coverage
http://www.kdedevelopers.org/node/3190
From the Wikipedia article:
Code coverage is a measure used in software testing. It describes the degree to which the source code of a program has been tested. It is a form of testing that inspects the code directly and is therefore a form of white box testing. Currently, the use of code coverage is extended to the field of digital hardware, the contemporary design methodology of which relies on hardware description languages (HDLs).
Advocating the use of code coverage
A code coverage tool simply keeps track of which parts of your code get executed and which parts do not. Usually, the results are granular down to the level of each line of code. So in a typical situation, you launch your application with a code coverage tool configured to monitor it. When you exit the application, the tool will produce a code coverage report which shows which lines of code were executed and which ones were not. If you count the total number of lines which were executed and divide by the total number of lines which could have been executed, you get a percentage. If you believe in code coverage, the higher the percentage, the better. In practice, reaching 100% is extremely rare.
The use of a code coverage tool is usually combined with the use of some kind of automated test suite. Without automated testing, a code coverage tool merely tells you which features a human user remembered to use. Such a tool is far more useful when it is measuring how complete your test suite is with respect to the code you have written.
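To make the mechanism concrete, here is a toy illustration (not a real coverage tool) of "keeping track of which lines get executed": a trace hook records the executed lines of one function, and dividing by the function's executable lines gives the percentage described above.

import sys

executed = set()

def tracer(frame, event, arg):
    # Record every line executed inside classify()
    if event == "line" and frame.f_code.co_name == "classify":
        executed.add(frame.f_lineno)
    return tracer

def classify(n):
    if n < 0:
        return "negative"
    return "non-negative"

sys.settrace(tracer)
classify(5)              # exercises the if-test and the second return only
sys.settrace(None)

total_lines = 3          # executable lines in classify: the if and two returns
print(f"coverage: {100 * len(executed) / total_lines:.0f}%")   # -> 67%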
Related articles
The Future of Code-Coverage Tools
The effectiveness of code coverage tools in software testing
Tools
Open Source Code Coverage Tools in Java
Code coverage is a metric showing how "well" the source code is tested. There are several types of code coverage: line coverage, function coverage, and branch coverage.
In order to measure the coverage, you run the application either manually or through automated tests.
Tools can be divided into two categories:
- the ones that run the compiled code in a modified environment (like a debugger), counting the required points (functions, lines, etc.);
- the ones that require special compilation - in this case the resulting binary already contains the code which actually does the counting.
There are several tools for measuring and visualizing the result; they depend on the platform and on the source code's language.
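A small hypothetical example of why the coverage type matters: a single test can reach 100% line coverage of this function while taking only one of its two branches, so branch coverage would report only 50%.

def describe(x):
    result = "small"
    if x > 10:
        result = "big"
    return result

describe(20)   # every line executes (100% line coverage),
               # but the False side of the if is never taken (50% branch coverage)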
Please read the article on Wikipedia.
To suggest tools, please state which OS and language you use.
Code coverage is a measure used in software testing. It describes the degree to which the source code of a program has been tested.
http://en.wikipedia.org/wiki/Code_coverage
The Wikipedia definition is pretty good, but in my own words, code coverage tells you how much of your code your automated testing accounts for. 100% would mean that every single line of code in your application is covered by a unit test.
NCover is an application for .NET
The term refers to how well your program is covered by your tests. See the following wikipedia article for more info:
http://en.wikipedia.org/wiki/Code_coverage
The other answers already cover what code coverage is. The thing I'd like to stress is that you need to be careful not to treat high coverage as implicitly meaning you've tested all scenarios. It doesn't necessarily say how well you've tested the code or the quality of your tests, just that you've hit a certain percentage of code as part of the tests running.
High Code Coverage does not necessarily mean High Test Quality, but High Test Quality does mean High Code Coverage
In practice, I usually aim for 90-95% code coverage which is often achievable. The last few % are often too expensive to be worth trying to hit.
There are many ways to develop applications. One of them is "Extreme Programming" with "Test-Driven Development (TDD)". It states that all code should be tested. Code coverage is a means of measuring how much is tested.
I'd like to make a small remark about this: I don't think all code should be tested, nor that one should set a specific code coverage percentage. Neither do I think that code shouldn't be tested with unit tests (code testing code). I do think one should decide what makes sense to test. For this reason, I generally don't use code coverage.
One thing that some tools provide is highlighting the parts that are tested. This way you might run into some code that isn't tested but actually should be, which is the only thing I use it for.
Good answers.
My two cents is that there is no method of testing that catches all errors, but less testing will never catch more errors, so any testing is good. To my mind, coverage testing is not to show what code has been exercised, but to show what code has not been exercised, because that is where bugs love to lurk.
If you combine it with single-stepping, it is a very good way to review code and catch bugs. Here's an example.
Another useful tool for ensuring code quality (which encompasses code coverage) that I recently used is Sonar.
Here is the link: http://www.sonarqube.org/
We currently use ActiveReports (by Data Dynamics, now Grape City) for canned reports, but are considering moving up to their Reports package. If you've used it, I'd love to hear your take on:
Performance - do you feel it will scale well for a web-based app (particularly compared with ActiveReports)?
Export to Excel - it appears to provide a much cleaner export to Excel (ActiveReports' Excel export is awful, our biggest reason for considering a switch)
Other pros/cons (my company is pretty small, the $3,000 for 2 licenses is a lot for us)
Here is some additional information for you to consider about ActiveReports and Data Dynamics Reports:
ActiveReports Licensing:
Their license is per developer. There are no royalties. You can write as many applications as you want and deploy your application to as many users or as many servers as you want without any additional costs. Read the ActiveReports license agreement here.
Reporting to Excel:
First of all, schooner is absolutely correct that all the other reporting tools do poorly when exporting to Excel. We recognized the same after many years of experience with ActiveReports. Frankly, it is a very hard problem to take reports designed to be paginated or deployed on the web and fit them into the cell-based layout of a spreadsheet.
However, with Data Dynamics Reports, we took a completely different approach. Instead of creating just another "export to Excel", where we look at "paginated" report output and try to fit it into a spreadsheet somehow, we generate the Excel output based on two things: a template and the actual data in the report.
By using a template, which is actually a specially formatted Excel sheet (cells have special placeholders in them), the reporting engine can output the report's content to an Excel sheet completely independently of how the report is laid out when paginated. We call this concept a "Transformation Extension" for Excel, since it takes the report's content and transforms it to Excel based on a template.
By default, DDReports will generate a default template that, more often than not, produces pretty good output. However, if the Excel output is not what you want, you can instruct DDReports to save the template so you can customize the output in Excel.
The best way to get an introduction to this is to watch the screencast for the Excel Transformation Extension in Data Dynamics Reports here. Jump to about 1:20 in the screencast if you get impatient and want to see an example of a simple template. Keep in mind this is a very simple template and the possibilities are much more sophisticated. Unfortunately, we haven't yet published very good documentation on the Excel Transformation Extension template syntax, but let me know if you have questions and I'll help you out! Just comment on this post or send an email to our support team.
Scott Willeke
Data Dynamics / GrapeCity
I've used it and it rocks! It has a Report Designer control that allows your users to build their own reports on the fly, and it supports multiple data sources in a single report. Best reporting tool on the market, bar none.
We use both products and they are quite different from each other. I have been a long-time user of ActiveReports and have loved it. But when it came time to select a .NET reporting tool, we did not want to spend a bunch of money, so we decided to get their DDR product. It took me a couple of weeks to get used to it, as I kept trying to use it like ActiveReports. Not a good idea. Anyway, once you get used to it, it does a decent job. There are some things they need to do to improve the product. Here are the ones that stand out.
You cannot access the control collection in the code area. This is a huge problem if you want to change anything like data binding inside the report.
The database connection has to be refreshed if you reopen the report in the designer. This took a while to figure out; we wondered why our fields would not show up in preview mode when we reloaded the report.
Their new tech support is terrible. They were bought out recently, and now when you call tech support you get someone with no knowledge who always tells you that someone will call you back. 80% of the time you get no call back. The other 20% of the time you get a sample emailed to you that has nothing to do with your issue. This is across the board with both products. They used to have great tech support. I hope they fix this.
Those are the main problems, and I know they are working to solve the issues. Like I said, we use both DDR and ActiveReports. If you need to do complicated reports, stick with ActiveReports. If they are simple and you do not want to spend a lot of money, then DDR works fine. I see DDR getting better with each release, but it will take a while to get the kinks worked out.
Just my opinion
I've only used ActiveReports as well, but their web licensing model is a bit expensive in general in my view, especially if you need to develop multiple apps on multiple servers. Then there are the per-developer costs as well.
I use DevExpress XtraReports and have been fairly happy with it so far; it has some fairly decent export functionality and a much better licensing model.
Regarding export to Excel, I've not seen any reporting tool do it well, mainly due to the formatting issues with the report itself. What we typically do is provide the formatted report to the user, along with an additional link for an Excel export which is a similar but different query with the raw data the report uses.
Another option over formatted printable reports is using grids such as Infragistics which allow you to do sorting, grouping, summaries, and which has excellent Excel export features.
This is to give more information to Bill's response in this thread. I tried to post a comment, but ran out of room :)
Bill, thanks for your honest assessment. Let me give you some comments from the inside on the issues you mentioned:
1: Admittedly, it is not quite as intuitive to access the controls collection as it was with AR, but you /can/ do it. You need to do it outside of the report (not in the script/code embedded into the report). To do it, you can load the rdlx file into a ReportDefinition object. For example:
// Load the report definition from the rdlx file
var rpt = new DataDynamics.Reports.ReportDefinition(new FileInfo("myfile...rdlx"));
// Look up the list, and the textbox inside it, by name
var list = (DataDynamics.Reports.ReportObjectModel.List)rpt.Report.Body.ReportItems["myList"];
var txt = (DataDynamics.Reports.ReportObjectModel.TextBox)list.ReportItems["myTextBox"];
// Re-point the textbox at a different field
txt.Value = "=Fields!MyField.Value";
However, depending on the scenario you're after there may be a better way to handle this than changing the binding on the control/reportItem itself. It is difficult to say more without knowing more about your particular scenario/goals.
2: There was recently some discussion I was involved in on how to improve this in the very near future. The dev team was gathering use cases and doing some investigation on various caching strategies to keep hits against the database to an absolute minimum in the designer. So look for improvements in this area in an upcoming build.
3: Unfortunately, we're working through some challenges with our new technical support team. However, we are improving constantly, and we're working hard to bring up the new guys as quickly as possible. If you have a problem with one of your support incidents, feel free to email me personally with your case number and I'll work to get your case escalated or help out in any way I can (scott dot willeke at grapecity dot com).
Thanks again for your feedback, my next letter is an internal one based on your feedback to help us improve!
Scott Willeke
Program Manager
Data Dynamics / GrapeCity inc.
I have used this product since 2004. Great performance, and the licensing was great. The migration from earlier versions was great. It had its flaws, like ghost images in high-speed, high-volume production environments, missing some of the goodies you get with Crystal, and barcode issues. But the engine was fast. Then came version 7. What a mess! Rendering a 4 x 4 label went from 320 ms to 800 ms. Try getting a patch... good luck with that. Getting someone on the phone suddenly became like winning the lottery. If performance is not a factor and you need just simple reports, go for it. Otherwise, think twice. As for us, this is the last version, if our QA can pass it. We're shopping for a replacement product.
They are good, and I am not trying to frighten you, but below are the facts, from my perspective:
Pros
Active community - you can expect responses overnight.
Good stuff to get you started - walk-throughs, tutorials, examples, videos, etc.
Internal builds - just like Linux kernel patches, you can get "hot fixes" for the problems their developer team was able to solve.
A web report viewer is available and also works within Visual Studio - just like other reporting tools.
Cons
Weak rendering engine - you cannot expect reports to export to Word/Excel without issues if you use a sub-report in a table row.
Slow bug fixes - it can take over a year to fix a bug. I have been following one since 11-11-2011, and they still keep saying "we will let you know as soon as we fix this bug".
Infrequent stable releases - it sometimes takes a year for them to release the next stable version.
Low control over rendering - you cannot use events if you wish to embed some code, but Data Dynamics does provide VB.NET (and only VB.NET!) custom code support, which you can use for validation-type stuff.
I am sharing some links for your reference:
forums | How to section | Walkthrough(s) | Useful resources | drill throughs | videos | Convert Crystal reports (Remember: vice versa is not possible) | online help / Documentation - User Guide | Web Report Viewer