What makes up the "standard jmock libraries"? - junit

I'm following this guide http://javaeenotes.blogspot.com/2011/06/short-introduction-to-jmock.html
I've received the error
java.lang.SecurityException: class "org.hamcrest.TypeSafeMatcher"'s signer information does not match signer information of other classes in the same package.
In the guide the author says:
The solution is make sure the jMock libraries are included before the
standard jUnit libraries in the build path.
What makes up the "standard jmock libraries" and the "junit libraries"?
JUnit only has one jar, so that's easy, but jMock comes with over 10 different jars.
I've been using JUnit 4.10, jMock 2.5, hamcrest-core, and hamcrest-library.
What are the hamcrest-core and hamcrest-library classes for?

I'm a committer on both libraries. jMock depends on Hamcrest to help it decide whether a call to an object is expected. I suggest just using the hamcrest-all jar. The split between hamcrest-core and hamcrest-library was to separate the fundamental behaviour of matching and reporting differences from convenient implementations of the most common cases.
Finally, if you're using Hamcrest, I suggest you use the junit-dep jar to avoid clashes with the parts of Hamcrest that are bundled inside junit.jar.
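To make the dependency concrete, here is a minimal sketch of where Hamcrest shows up in a jMock 2 test: the matcher passed to with(...) is a Hamcrest Matcher, which is why the two sets of jars have to be compatible and consistently ordered. The GreeterTest and Translator names are invented for the example.

    import static org.hamcrest.Matchers.startsWith;
    import static org.junit.Assert.assertEquals;

    import org.jmock.Expectations;
    import org.jmock.Mockery;
    import org.junit.Test;

    public class GreeterTest {

        public interface Translator {
            String translate(String phrase);
        }

        @Test
        public void greetsInTranslation() {
            Mockery context = new Mockery();
            final Translator translator = context.mock(Translator.class);

            context.checking(new Expectations() {{
                // a Hamcrest matcher decides whether this call counts as the expected one
                oneOf(translator).translate(with(startsWith("Hello")));
                will(returnValue("Bonjour"));
            }});

            assertEquals("Bonjour", translator.translate("Hello, world"));
            context.assertIsSatisfied();
        }
    }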

JUnit is used to write unit tests for your methods. jMock is used to test your program against a context: you have to know what you expect to send to the context (the environment) and what the context will answer.
jMock builds on JUnit, which is why, to avoid dependency conflicts, you need to include it before JUnit.
The ten or so jMock jars are essentially add-ons for when you need JMock Script or other functionality that is not available in the jMock core.
You don't need to know about the hamcrest-core library to use jMock. Just follow the guide on the website (don't use version 1 of jMock) and organize your libraries in the correct order (JUnit should come last in order to avoid your error).

Mock frameworks like jMock do some black magic behind the scenes (including, but not limited to, runtime bytecode manipulation) to provide mock methods, classes, and so on. To be able to do this, some tweaks to basic JUnit classes are necessary, and the only way to do that is to register itself as a Java agent before the JUnit classes are loaded.
Also, put your mock framework before JUnit in the classpath.

Is it possible to write a dual pass checkstyle check?

I have two situations I need a checkstyle check for. Let's say I have a bunch of objects with the annotation @BusinessLogic. I want to do a first pass through all *.java files, creating a Set with the full class names of these objects. Let's say ONE of the classes here is MyBusinessLogic. NEXT, as part of a custom checkstyle checker, I want to go through and fail the build if there are any lines of code that say "new MyBusinessLogic()" anywhere in the code. We want to force DI when objects are annotated with @BusinessLogic. Is this possible with checkstyle? I am not sure checkstyle does a dual pass.
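To make the rule concrete, here is a small made-up illustration of what should pass and what should fail the build (the annotation, class, and consumer names are just placeholders):

    // placeholder annotation standing in for the real @BusinessLogic
    @interface BusinessLogic {}

    @BusinessLogic
    class MyBusinessLogic {
        void run() { }
    }

    class GoodConsumer {
        private final MyBusinessLogic logic;

        // allowed: constructor injection, the DI container creates the instance
        GoodConsumer(MyBusinessLogic logic) {
            this.logic = logic;
        }
    }

    class BadConsumer {
        void doWork() {
            new MyBusinessLogic().run();   // should be flagged by the custom check
        }
    }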
Another option I am considering is a gradle plugin that scans all Java files and writes the list of classes annotated with @BusinessLogic to a file, and then running checkstyle after that, with my checker reading in that file.
My next situation is that I have a library delivered as a jar, so in that jar I also have classes annotated with @BusinessLogic, and I need to make sure those are also added to my list of classes that should not be newed up manually and should only be created with dependency injection.
Follow-up question from the previous question here, after reading through the checkstyle docs:
How to enforce this pattern via gradle plugins?
thanks,
Dean
Is it possible to write a dual pass checkstyle check?
Possible, yes, but not officially supported. Support is discussed at https://github.com/checkstyle/checkstyle/issues/3540 but it hasn't been agreed on.
Multi-file validation is possible with FileSets (still not officially supported), but it becomes harder with TreeWalker checks. This is because TreeWalker doesn't chain finishProcessing to the checks. You can implement your own TreeWalker that chains finishProcessing to the AbstractCheck implementations.
You will have to do everything in one pass with this method: log all new XXX expressions and all classes annotated with @YYY, then, in the finishProcessing method, correlate the two sets of information and print a violation when you have a match.
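A rough sketch of what that single pass could look like, assuming an AbstractCheck that keeps shared state plus a correlate() hook that your custom TreeWalker would call from finishProcessing; the class name, the static-state approach, and the hook are illustrative, not part of Checkstyle's official API:

    import java.util.ArrayList;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Set;

    import com.puppycrawl.tools.checkstyle.api.AbstractCheck;
    import com.puppycrawl.tools.checkstyle.api.DetailAST;
    import com.puppycrawl.tools.checkstyle.api.TokenTypes;

    public class BusinessLogicInstantiationCheck extends AbstractCheck {

        // shared across files because the stock TreeWalker gives checks no cross-file hook
        private static final Set<String> annotatedClasses = new HashSet<>();
        private static final List<String> instantiatedClasses = new ArrayList<>();

        @Override
        public int[] getDefaultTokens() {
            return new int[] { TokenTypes.CLASS_DEF, TokenTypes.LITERAL_NEW };
        }

        @Override
        public int[] getAcceptableTokens() {
            return getDefaultTokens();
        }

        @Override
        public int[] getRequiredTokens() {
            return getDefaultTokens();
        }

        @Override
        public void visitToken(DetailAST ast) {
            if (ast.getType() == TokenTypes.CLASS_DEF && isAnnotatedBusinessLogic(ast)) {
                annotatedClasses.add(ast.findFirstToken(TokenTypes.IDENT).getText());
            }
            else if (ast.getType() == TokenTypes.LITERAL_NEW) {
                DetailAST ident = ast.findFirstToken(TokenTypes.IDENT);
                if (ident != null) {
                    instantiatedClasses.add(ident.getText());
                }
            }
        }

        private static boolean isAnnotatedBusinessLogic(DetailAST classDef) {
            DetailAST modifiers = classDef.findFirstToken(TokenTypes.MODIFIERS);
            for (DetailAST child = modifiers.getFirstChild(); child != null; child = child.getNextSibling()) {
                if (child.getType() == TokenTypes.ANNOTATION) {
                    DetailAST name = child.findFirstToken(TokenTypes.IDENT);
                    if (name != null && "BusinessLogic".equals(name.getText())) {
                        return true;
                    }
                }
            }
            return false;
        }

        // to be called once all files have been walked, e.g. by a custom TreeWalker
        public static List<String> correlate() {
            List<String> violations = new ArrayList<>();
            for (String name : instantiatedClasses) {
                if (annotatedClasses.contains(name)) {
                    violations.add("Direct instantiation of @BusinessLogic class " + name);
                }
            }
            return violations;
        }
    }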
I have a library delivered as a jar
Checkstyle does not support reading JARs or bytecode. You can always create a hard-coded list as an alternative. The only other way is to build your own reader into Checkstyle.

Specifying the helmet interface with Vorto

How do I do a simple smoke test with Vorto?
I use Vorto to integrate Ditto and devices.
In the topology device-Hono-Vorto-Ditto, we want the simplest way to test the workflow. Is there any method?
I assume you are using the Vorto Semantic Middleware between Eclipse Hono and Eclipse Ditto, similar to https://github.com/eclipse/vorto-examples/tree/master/vorto-dashboard/docs/AssetTracking.md
The following tutorial describes how to set up the pipeline as well as how to test it: https://github.com/eclipse/vorto/blob/development/docs/tutorials/create_mapping_pipeline.md
This tutorial does not mention Eclipse Ditto as a consumer of the pipeline, because it is important that the basic (Device-Hono-Vorto) pipeline works first before adding Eclipse Ditto into the equation.
Does the mapped / semantic data of the helmet appear correctly in the middleware logs?

Which kind of test should I use for a library?

I'm developing a PHP library that I'd like to use in different projects. The library uses a REST-like service in the background. I don't want to write tests for the service API, but for the library.
Would I need to write unit tests? Or functional tests? Since it is a library, I won't write acceptance tests - I hope this is correct.
I don't know if this is important for the issue, but the library needs to log into the service API and uses an API key for the subsequent operations. Also, when the library gets tested, the preceding operations are important. It is a designer tool and I have operations like 'move rectangle', 'rotate rectangle' and so on, and I would like to test several operations in a sequence that should produce a certain result.
I think that this is a kind of functional test. Or do I need both? Can unit tests work with a service in the background?

How does c3p0's JdbcProxyGenerator work (metaprogramming in Java‽)?

I've been using c3p0 with hibernate for a couple of years. When looking at exception stack traces, I see classes such as com.mchange.v2.c3p0.impl.NewProxyPreparedStatement in the stack. I went looking for the source code for these classes and came across the curious com.mchange.v2.c3p0.codegen package.
In particular, it looks like JdbcProxyGenerator is metaprogramming in Java. I'm having a hard time understanding the codegen mechanism and why it is used. The built jar contains these generated classes, so I'm assuming they are produced during the build, perhaps as part of a two-phase build. The codegen package does not appear to be in the generated jar.
Any insight would be appreciated, just for my own curiosity. Thanks!
Yes, you are absolutely right.
c3p0 uses code generation to generate non-reflective proxy implementations of the large JDBC interfaces, "java bean" classes with lots of properties, and some classes containing debug and logging flags (to set up conditional compilation within the build).
You can always see the generated classes by typing ant codegen in the source distribution, and then looking at the build/codebase directory. The latest binary distribution of c3p0 (0.9.2-pre2) includes the generated sources in a src.jar file, which you can also find as a maven artifact at http://repo1.maven.org/maven2/com/mchange/c3p0/0.9.2-pre2-RELEASE/c3p0-0.9.2-pre2-RELEASE-sources.jar
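To give a feel for what a non-reflective proxy looks like, here is a rough, made-up sketch of the shape of such a generated delegating class (not c3p0's actual generated source; the listener type is invented). Every JDBC interface method gets a plain delegating body like this, with no reflection at call time, which would be far too tedious to write by hand for interfaces the size of Connection or ResultSet:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    final class GeneratedStyleProxyConnection {

        interface PoolEventListener {
            void connectionErrorOccurred(SQLException e);
        }

        private final Connection inner;            // the physical connection being wrapped
        private final PoolEventListener listener;  // invented hook back into the pool

        GeneratedStyleProxyConnection(Connection inner, PoolEventListener listener) {
            this.inner = inner;
            this.listener = listener;
        }

        // one generated delegating method per interface method: straight delegation
        // plus a try/catch so the pool can react to broken connections
        public PreparedStatement prepareStatement(String sql) throws SQLException {
            try {
                return inner.prepareStatement(sql);
            } catch (SQLException e) {
                listener.connectionErrorOccurred(e);
                throw e;
            }
        }
    }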
I hope this helps!

Do any "major" frameworks make use of monkey-patching/open classes

I am curious about the usage of the feature known as open classes or monkey-patching in languages such as Ruby, Python, and Groovy. This feature allows you to make modifications (like adding or replacing methods) to existing classes or objects at runtime.
Does anyone know if major frameworks (such as Rails/Grails/Zope) make (extensive) use of this opportunity in order to provide services to the developer? If so, please provide examples.
Rails does this to a (IMHO) ridiculous extent.
.NET allows it via extension methods.
LINQ, specifically, relies heavily on extension methods monkey-patched onto the IEnumerable interface.
An example of its use on the Java platform (since you mentioned Groovy) is load-time weaving with something like AspectJ and JVM instrumentation. In this particular case, however, you have the option of using compile-time weaving instead. Interestingly, one of my recent SO questions was related to problems with using this load-time weaving, with some recommending compile-time as the only reliable option.
An example of AspectJ using load-time (run-time) weaving to provide a helpful service to the developer is Spring's @Configurable annotation, which allows you to use dependency injection on objects not instantiated by Spring's BeanFactory.
You specifically mentioned modifying a method (or how it works), and an example of that in use is an aspect which intercepts an HTTP request before it is sent to the handler (either some Controller method or doPost, etc.) and checks whether the user is authorized to access that resource. Your aspect could then decide to return early with a response that redirects to login. While not modifying the contents of the method per se, you are still modifying the way the method works by changing the return value it would otherwise give.
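For illustration, here is a hedged sketch of such an authorization aspect in AspectJ's annotation style, usable with either load-time or compile-time weaving; the pointcut expression, package names, and redirect URL are invented rather than taken from any particular project:

    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.servlet.http.HttpSession;

    import org.aspectj.lang.ProceedingJoinPoint;
    import org.aspectj.lang.annotation.Around;
    import org.aspectj.lang.annotation.Aspect;

    @Aspect
    public class AuthorizationAspect {

        // wrap every controller handler method that receives the request and response
        @Around("execution(* com.example.web..*Controller.*(..)) && args(request, response, ..)")
        public Object checkAccess(ProceedingJoinPoint pjp,
                                  HttpServletRequest request,
                                  HttpServletResponse response) throws Throwable {
            HttpSession session = request.getSession(false);
            if (session == null || session.getAttribute("user") == null) {
                // return early with a redirect instead of running the handler:
                // the handler's observable behaviour changes without its source changing
                response.sendRedirect("/login");
                return null;
            }
            return pjp.proceed();
        }
    }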