Generate code from spec in Springdoc Open API 3 - springdoc

I'm moving from Swagger (OpenAPI 2) to springdoc (OpenAPI 3), but today in some cases I use the swagger-codegen-maven-plugin to generate code (for client and provider) from YAML, following the Contract First strategy. Below is an example configuration:
<plugin>
    <groupId>io.swagger</groupId>
    <artifactId>swagger-codegen-maven-plugin</artifactId>
    <version>2.4.9</version>
    <executions>
        <execution>
            <id>generate-provider-v1</id>
            <phase>generate-resources</phase>
            <goals>
                <goal>generate</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <inputSpec>${project.basedir}/src/main/resources/swagger/my-api.yaml</inputSpec>
        <output>${project.build.directory}/generated-sources/swagger</output>
        <language>spring</language>
        <library>spring-boot</library>
        <modelPackage>br.com.sample.representation</modelPackage>
        <apiPackage>br.com.sample.adapter.controller.v1</apiPackage>
        <generateSupportingFiles>true</generateSupportingFiles>
        <configOptions>
            <interfaceOnly>true</interfaceOnly>
            <delegatePattern>true</delegatePattern>
            <dateLibrary>java8</dateLibrary>
        </configOptions>
        <modelNameSuffix>Representation</modelNameSuffix>
        <generateSupportingFiles>false</generateSupportingFiles>
    </configuration>
</plugin>
Is there any equivalent option to generate code with the springdoc-openapi-maven-plugin?

As described in the documentation:
The aim of springdoc-openapi-maven-plugin is to generate the JSON and YAML OpenAPI descriptions at build time. The plugin works during the integration-test phase and generates the OpenAPI description.
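In other words, the springdoc plugin only produces the spec from a running application; it does not generate code. A rough sketch of its documented usage looks like this (the version number and execution ids here are illustrative, not taken from the question):
<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <executions>
        <!-- Start the application before the integration-test phase so the spec endpoint is reachable -->
        <execution>
            <id>pre-integration-test</id>
            <goals>
                <goal>start</goal>
            </goals>
        </execution>
        <!-- Stop it afterwards -->
        <execution>
            <id>post-integration-test</id>
            <goals>
                <goal>stop</goal>
            </goals>
        </execution>
    </executions>
</plugin>
<plugin>
    <groupId>org.springdoc</groupId>
    <artifactId>springdoc-openapi-maven-plugin</artifactId>
    <version>1.4</version>
    <executions>
        <execution>
            <id>integration-test</id>
            <goals>
                <!-- Downloads the generated OpenAPI description from the running app -->
                <goal>generate</goal>
            </goals>
        </execution>
    </executions>
</plugin>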
You can have a look at the openapi-generator-maven-plugin for code generation from the spec:
https://openapi-generator.tech/docs/plugins
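As a rough equivalent of the swagger-codegen configuration above, an openapi-generator-maven-plugin setup could look like the following. This is only a sketch: the version number and spec path are placeholders, and the packages and config options are carried over from the question.
<plugin>
    <groupId>org.openapitools</groupId>
    <artifactId>openapi-generator-maven-plugin</artifactId>
    <version>6.6.0</version>
    <executions>
        <execution>
            <id>generate-provider-v1</id>
            <goals>
                <goal>generate</goal>
            </goals>
            <configuration>
                <inputSpec>${project.basedir}/src/main/resources/openapi/my-api.yaml</inputSpec>
                <output>${project.build.directory}/generated-sources/openapi</output>
                <!-- generatorName replaces the old <language> element; "spring" targets Spring MVC / Spring Boot -->
                <generatorName>spring</generatorName>
                <library>spring-boot</library>
                <modelPackage>br.com.sample.representation</modelPackage>
                <apiPackage>br.com.sample.adapter.controller.v1</apiPackage>
                <modelNameSuffix>Representation</modelNameSuffix>
                <configOptions>
                    <interfaceOnly>true</interfaceOnly>
                    <delegatePattern>true</delegatePattern>
                    <dateLibrary>java8</dateLibrary>
                </configOptions>
            </configuration>
        </execution>
    </executions>
</plugin>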

Related

RDF4J RIO UnsupportedRDFormatException when adding data to an HTTPRepository using a stand-alone application

I have an HTTPRepository initialised with the URL to the repository. I use a RepositoryConnection to retrieve and add (weather) data to the repository. The data is retrieved from a web service, transformed into RDF statements, and added to the repository. This is done periodically by a stand-alone application.
When I run this application within IntelliJ, everything works fine.
To run this application on a server I created a jar file (containing all dependencies). The application starts as expected and is able to retrieve data from the repository.
However, when the application tries to write data to the repository I get an UnsupportedRDFormatException:
org.eclipse.rdf4j.rio.UnsupportedRDFormatException: Did not recognise RDF format object BinaryRDF (mimeTypes=application/x-binary-rdf; ext=brf)
at org.eclipse.rdf4j.rio.Rio.lambda$unsupportedFormat$0(Rio.java:568) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at java.util.Optional.orElseThrow(Optional.java:290) ~[na:1.8.0_111]
at org.eclipse.rdf4j.rio.Rio.createWriter(Rio.java:134) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at org.eclipse.rdf4j.rio.Rio.write(Rio.java:371) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at org.eclipse.rdf4j.rio.Rio.write(Rio.java:324) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at org.eclipse.rdf4j.repository.http.HTTPRepositoryConnection.addModel(HTTPRepositoryConnection.java:588) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at org.eclipse.rdf4j.repository.http.HTTPRepositoryConnection.flushTransactionState(HTTPRepositoryConnection.java:662) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at org.eclipse.rdf4j.repository.http.HTTPRepositoryConnection.commit(HTTPRepositoryConnection.java:326) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at org.eclipse.rdf4j.repository.base.AbstractRepositoryConnection.conditionalCommit(AbstractRepositoryConnection.java:366) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at org.eclipse.rdf4j.repository.base.AbstractRepositoryConnection.add(AbstractRepositoryConnection.java:431) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at nl.wur.fbr.data.weather.WeatherApp.retrieveData(WeatherApp.java:122) ~[weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at nl.wur.fbr.data.weather.WeatherData$WeatherTask.run(WeatherData.java:105) [weatherData-1.0-SNAPSHOT-jar-with-dependencies.jar:na]
at java.util.TimerThread.mainLoop(Timer.java:555) [na:1.8.0_111]
at java.util.TimerThread.run(Timer.java:505) [na:1.8.0_111]
The source code in which the error occurs is:
public void retrieveData() {
    logger.info("Retrieving data for weather for app: " + ID + " ");
    RepositoryConnection connection = null;
    ValueFactory vf = SimpleValueFactory.getInstance();
    try {
        connection = repository.getConnection();
        // Retrieving the locations from the repository (no problem here).
        List<Location> locations = this.retrieveLocations(connection);
        List<Statement> statements = new ArrayList<>();
        // Retrieving weather data for each location and transforming it into statements.
        for (Location location : locations) {
            List<Weather> retrievedWeather = weatherService.retrieveWeatherData(location.name, location.latitude, location.longitude);
            for (Weather weather : retrievedWeather) {
                BNode phenomenon = vf.createBNode();
                statements.add(vf.createStatement(location.ID, WEATHER.HAS_WEATHER, phenomenon, rdfStoreGraph));
                statements.addAll(weather.getStatements(phenomenon, vf, rdfStoreGraph));
                statements = this.correctOMIRIs(statements, vf);
            }
        }
        // Adding the data retrieved from the weather API.
        // This is where the exception happens.
        connection.add(statements, rdfStoreGraph);
    } catch (Exception e) {
        logger.error("Could not retrieve data for weather app: '" + ID + "' because no monitor locations could be found.", e);
    } finally {
        if (connection != null) {
            connection.close();
        }
    }
}
The HTTPRepository is initialised as follows:
repository = new HTTPRepository(rdfStore.toString());
((HTTPRepository)repository).setPreferredRDFFormat(RDFFormat.BINARY);
((HTTPRepository)repository).setPreferredTupleQueryResultFormat(TupleQueryResultFormat.BINARY);
I've tried changing the formats to TURTLE, but it makes no difference.
Can you tell me how to solve this?
NB: Both the RDF4J server and the library are version 2.0.1 (rdf4j).
To run this application on a server I created a jar file (containing all dependencies).
There's your problem: you created a "fat jar" and probably haven't properly merged SPI registry files.
RDF4J's Rio parsers (and several other modules as well) use Java's Service Provider Interface (SPI) mechanism to register themselves. This mechanism relies on a text file in META-INF/services in the jar file containing the fully-qualified name of each parser implementation.
The problem comes when you merge jars: each Rio parser jar has a registry file with the same name, but different contents. If you are using something like the maven assembly plugin to create the fat jar, each registry file gets overwritten by the next one. The consequence is that at the end, RDF4J can only find one parser - the one whose registry file was added last to the fat jar.
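To illustrate (a hypothetical sketch; the exact entries depend on which Rio modules are on the classpath), each rdf4j-rio-* jar ships a registry file under the same path, for example:
# META-INF/services/org.eclipse.rdf4j.rio.RDFParserFactory
# in rdf4j-rio-turtle:
org.eclipse.rdf4j.rio.turtle.TurtleParserFactory

# the same path in rdf4j-rio-rdfxml:
org.eclipse.rdf4j.rio.rdfxml.RDFXMLParserFactory
When a naive fat-jar build keeps only one of these identically named files, only that one parser remains discoverable at runtime.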
The solution is to either not create a fat jar at all, or if you must, use a different technique to create it, which merges the registry files rather than overwriting them. The maven shade plugin has a good config option for this: the ServicesResourceTransformer.
I am bumping this post because I was stuck on this for several hours. In the end I was able to generate an executable jar by using the maven shade plugin with the following configuration:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <configuration>
        <filters>
            <filter>
                <artifact>*:*</artifact>
                <excludes>
                    <exclude>META-INF/*.SF</exclude>
                    <exclude>META-INF/*.DSA</exclude>
                    <exclude>META-INF/*.RSA</exclude>
                </excludes>
            </filter>
        </filters>
    </configuration>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>${fully.qualified.main.class}</mainClass>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>
I used the shade plugin with the ManifestResourceTransformer to create the executable jar (indicating the main class of my project), and with the ServicesResourceTransformer to merge the RDF4J service registry files so that one parser's registration does not overwrite the others. I also had to include the filter section to avoid JNI errors caused by jar signature files.
I hope this is useful for someone.
Greetings.

How to show the name of the currently executed test method

I have a large number of tests and want to see on stdout/stderr (when triggered via Maven) which test method is currently being executed.
I do not want or need that inside the test itself, but rather to follow what is going on from the outside -- a little bit like IntelliJ does with the circles that turn red or green.
Is there a command line option to do this, or do I have to write my own test runner, which does that for me?
Your best option is probably a RunListener, which you can plug into Maven:
RunListener listens to test events, such as test start, test end, test failure, test success etc.
public class RunListener {
    public void testRunStarted(Description description) throws Exception {}
    public void testRunFinished(Result result) throws Exception {}
    public void testStarted(Description description) throws Exception {}
    public void testFinished(Description description) throws Exception {}
    public void testFailure(Failure failure) throws Exception {}
    public void testAssumptionFailure(Failure failure) {}
    public void testIgnored(Description description) throws Exception {}
}
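A minimal concrete listener that prints each test method as it starts could look like this (a sketch; the class name com.mycompany.MyResultListener is only the placeholder used in the surefire config below):
package com.mycompany;

import org.junit.runner.Description;
import org.junit.runner.notification.RunListener;

public class MyResultListener extends RunListener {

    @Override
    public void testStarted(Description description) {
        // Print the fully-qualified test method name as soon as it starts.
        System.out.println("Running: " + description.getClassName() + "." + description.getMethodName());
    }
}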
Then, in surefire, you can specify a custom listener, using:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.10</version>
    <configuration>
        <properties>
            <property>
                <name>listener</name>
                <value>com.mycompany.MyResultListener,com.mycompany.MyResultListener2</value>
            </property>
        </properties>
    </configuration>
</plugin>
This is from the Maven Surefire Plugin documentation, Using JUnit: Using custom listeners and reporters.
What Matthew wrote works for JUnit; TestNG has a similar interface as well.
I'm using TestNG, and setting the verbosity high does the job for me:
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>${maven-surefire-plugin.version}</version>
            <configuration>
                <properties>
                    <property>
                        <name>surefire.testng.verbose</name>
                        <value>10</value>
                    </property>
                </properties>
            </configuration>
        </plugin>
    </plugins>
</build>
It is documented here http://maven.apache.org/surefire/maven-surefire-plugin/examples/testng.html
The verbosity level is between 0 and 10 where 10 is the most detailed.
You can specify -1 and this will put TestNG in debug mode (no longer
slicing off stack traces and all). The default level is 0.
It has also been answered here Set TestNG's verbosity level from Maven
The output then contains lines like the following as the tests progress:
[TestNG] INVOKING: "Surefire test" -
org.example.MyTest.myMethod(java.lang.String,
java.lang.String)(value(s) ...
Also the @BeforeClass and @AfterClass methods are mentioned.

How to exclude all JUnit4 tests with a given category using Maven surefire?

I intend to annotate some of my JUnit4 tests with a @Category annotation. I would then like to exclude that category of tests from running in my Maven builds.
I see from the Maven documentation that it is possible to specify a category of tests to run. I want to know how to specify a category of tests not to run.
Thanks!
You can do this as follows:
Your pom.xml should contain the following setup.
<configuration>
    <excludes>
        <exclude>**/TestCircle.java</exclude>
        <exclude>**/TestSquare.java</exclude>
    </excludes>
</configuration>
If you want regex support just use
<excludes>
    <exclude>%regex[.*[Cat|Dog].*Test.*]</exclude>
</excludes>
http://maven.apache.org/surefire/maven-surefire-plugin/examples/inclusion-exclusion.html
If you want to use Category annotation, you need to do the following:
@Category(com.myproject.annotations.Exclude.class)
@Test
public void testFoo() {
    ....
}
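The category itself is just a marker type that the annotation can reference, e.g. (a minimal sketch reusing the package name from the snippet above):
package com.myproject.annotations;

// Marker interface used only to tag tests; it has no members.
public interface Exclude {
}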
In your maven configuration, you can have something like this
<configuration>
    <excludedGroups>com.myproject.annotations.Exclude</excludedGroups>
</configuration>

Is there a way to divide artifacts between test and compile using the maven-dependency-plugin during the copy-dependencies goal?

I have the following configuration:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <version>2.6</version>
    <executions>
        <execution>
            <id>analyze</id>
            <goals>
                <goal>analyze-only</goal>
            </goals>
            <configuration>
                <failOnWarning>false</failOnWarning>
            </configuration>
        </execution>
        <!-- Copy the dependencies so the ant build has the same versions -->
        <execution>
            <id>copy-dependencies</id>
            <phase>package</phase>
            <goals>
                <goal>copy-dependencies</goal>
            </goals>
            <configuration>
                <outputDirectory>${project.basedir}/lib</outputDirectory>
                <overWriteIfNewer>true</overWriteIfNewer>
                <stripVersion>true</stripVersion>
                <overWriteReleases>false</overWriteReleases>
                <overWriteSnapshots>true</overWriteSnapshots>
                <excludeTransitive>false</excludeTransitive>
            </configuration>
        </execution>
    </executions>
</plugin>
The above configuration dumps everything into the same folder. I tried excluding the test scope by adding the test configuration, but it gives an error:
Failed to execute goal org.apache.maven.plugins:maven-dependency-plugin:2.6:copy-dependencies (copy-dependencies) on project pcgen: Can't exclude Test scope, this will exclude everything.
Is there a way to separate test dependencies from the rest so I can copy to different folders?
I tried excluding the test scope by adding the test configuration but gives an error
I just stumbled across this, probably for very different reasons, but I think I found us both the answer. Try this, for example. You'll need pom.xml in the current directory, of course.
mvn dependency:copy-dependencies \
-DincludeScope=runtime \
-DexcludeScope=provided \
-DoutputDirectory=target/war/WEB-INF/lib
A huge belated thanks to Brian Fox, who writes on Maven Dependency Plugin Issue #128:
You shouldn't ever need to include or exclude two scopes at the same time because they are comprised of each other. The default is to include test scope, which includes everything. If you don't want any test dependencies or provided dependencies, then include runtime and exclude provided.
The scopes being interpreted are the scopes as maven sees them, not as specified in the pom. So the "test" scope includes everything, runtime includes compile but not provided etc.
In May 2013, the includeScope documentation was updated to:
/**
 * Scope to include. An Empty string indicates all scopes (default).
 * The scopes being interpreted are the scopes as
 * Maven sees them, not as specified in the pom. In summary:
 * <ul>
 * <li><code>runtime</code> scope gives runtime and compile dependencies,</li>
 * <li><code>compile</code> scope gives compile, provided, and system dependencies,</li>
 * <li><code>test</code> (default) scope gives all dependencies,</li>
 * <li><code>provided</code> scope just gives provided dependencies,</li>
 * <li><code>system</code> scope just gives system dependencies.</li>
 * </ul>
 *
 * @since 2.0
 */
@Parameter( property = "includeScope", defaultValue = "" )
protected String includeScope;
Use includeScope indeed; the test scope includes every other scope, which is why it fails.
<includeScope>runtime</includeScope>
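To actually split the jars into different folders, one approach is to run the goal twice with different scope filters inside the maven-dependency-plugin declaration shown in the question. This is only a rough sketch, assuming excludeScope mirrors the includeScope semantics quoted above; the execution ids and output paths are placeholders:
<executions>
    <!-- Compile + runtime dependencies go to lib/ -->
    <execution>
        <id>copy-runtime-dependencies</id>
        <phase>package</phase>
        <goals>
            <goal>copy-dependencies</goal>
        </goals>
        <configuration>
            <includeScope>runtime</includeScope>
            <outputDirectory>${project.basedir}/lib</outputDirectory>
        </configuration>
    </execution>
    <!-- Everything NOT needed at runtime (test-only, provided, system) goes to lib/test -->
    <execution>
        <id>copy-test-dependencies</id>
        <phase>package</phase>
        <goals>
            <goal>copy-dependencies</goal>
        </goals>
        <configuration>
            <excludeScope>runtime</excludeScope>
            <outputDirectory>${project.basedir}/lib/test</outputDirectory>
        </configuration>
    </execution>
</executions>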

InvalidUseOfMatchersException when setting expectations in Mockito and Groovy 2.0.4

I am getting an unexpected error when trying to perform a simple test with mocks.
@RunWith(MockitoJUnitRunner)
class AccessorTest {

    @Mock
    private DeviceBuilder deviceBuilder

    @Test
    void shouldCreateDeviceFromFilesystem() {
        //given
        URI uri = this.class.classLoader.getResource("sample-filesystem").toURI()
        File deviceRoot = new File(uri)
        Accessor accessor = new Accessor(deviceBuilder)
        Device expectedDevice = new Device(deviceRoot)
        when(deviceBuilder.build(eq(deviceRoot))).thenReturn(expectedDevice)

        //when
        Device device = accessor.readFrom(deviceRoot)

        //then
        assert device == expectedDevice
        verify(deviceBuilder).build(deviceRoot)
    }
}
The DeviceBuilder is a single-method interface, Device DeviceBuilder#build(File root). Device has a well-defined equals method, as per Josh Bloch.
The exception is thrown on the when() line, and none of the variables in scope are null. The full exception is:
org.mockito.exceptions.misusing.InvalidUseOfMatchersException:
Invalid use of argument matchers!
0 matchers expected, 1 recorded.
This exception may occur if matchers are combined with raw values:
//incorrect:
someMethod(anyObject(), "raw String");
When using matchers, all arguments have to be provided by matchers.
For example:
//correct:
someMethod(anyObject(), eq("String by matcher"));
For what it's worth here's a snippet of my POM:
<build>
    <plugins>
        <plugin>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>2.3.2</version>
            <configuration>
                <compilerId>groovy-eclipse-compiler</compilerId>
            </configuration>
            <dependencies>
                <dependency>
                    <groupId>org.codehaus.groovy</groupId>
                    <artifactId>groovy-eclipse-compiler</artifactId>
                    <version>2.7.0-01</version>
                </dependency>
            </dependencies>
        </plugin>
    </plugins>
</build>
<dependencies>
    <dependency>
        <groupId>org.codehaus.groovy</groupId>
        <artifactId>groovy-all</artifactId>
        <version>2.0.4</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.10</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.mockito</groupId>
        <artifactId>mockito-all</artifactId>
        <version>1.8.5</version>
        <scope>test</scope>
    </dependency>
</dependencies>
My suspicions are either some weirdness from the way Groovy manages classes, or some version incompatibility, but I hope it's something obvious I just can't see.
I know this is an old question, but this may be helpful to someone else.
The issue described in the question is detailed at http://code.google.com/p/mockito/issues/detail?id=303
I ran into the very same issue and I got it fixed by using the library provided at https://github.com/cyrusinnovation/mockito-groovy-support
Sometimes Mockito doesn't flag an inappropriate use at the time it occurs but at the next invocation. You could have a look at the test that runs before this one; maybe you have a problem there.
The message says that in an invocation you have to provide either only matchers or only real objects, but not a combination of the two. You can usually fix that by using eq(obj) instead of the object itself (or sameInstance(obj)), so that all arguments are matchers. However, in this case I can't find any place in the posted code that fits that description, which is why I suspect the problem is in an earlier part of the code.
I'm not answering the original question.
I've searched everywhere, and the only question about this error that I found points to this discussion.
In case someone else gets this error: do not call mock methods to provide values for matchers. The call to deviceBuilder.build should happen BEFORE verify( is invoked.
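To illustrate that point with a hypothetical snippet (not the original poster's code; otherMock and getRoot() are made-up names):
// Wrong: getRoot() is a call on another mock, evaluated while Mockito is
// recording the verify(...) invocation, which can corrupt the matcher state.
verify(deviceBuilder).build(otherMock.getRoot());

// Right: call the mock first, keep the plain value, then verify with it.
File root = otherMock.getRoot();
verify(deviceBuilder).build(root);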