I have a Maven multi-module project. The structure looks like this:
modulA (Main project)
+-- pom.xml
parentModul (Aggregator)
+-- pom.xml
+-- ModulB (Integration Test Project)
    +-- pom.xml
The modules definition in the aggregator looks like:
<project ...>
<modules>
<module>../modulA</module>
<module>ModulB</module>
</modules>
</project>
One of the modules (modulA) sits at the same hierarchy level as the aggregator; the other is inside it.
I want to add a Jenkins job that builds everything automatically (clean package). How should I configure the job so that parentModul finds the other modules and builds the whole project?
Add another aggregator. Place modulA and parentModul in the same directory and, one level above them, simply add another pom like this:
aggregator/
|- modulA/
| |- pom.xml
|- parentModul/
| |- modulB/
| | |- pom.xml
| |- pom.xml
|- pom.xml
In the aggregator/pom.xml define a modules section as follows:
<project ...>
<modules>
<module>modulA</module>
<module>parentModul</module>
</modules>
</project>
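In Jenkins, all the job then needs is to point at the new aggregator POM. A sketch of the job configuration (assuming a Maven project job and that aggregator/ is the checkout root):

```
Root POM:          aggregator/pom.xml
Goals and options: clean package
```

Maven then builds modulA, ModulB, and parentModul in one reactor, in the correct order.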
One solution would be to use svn:externals to assemble a sane Maven project workspace from an insane SVN structure, even across different SVN servers. Using svn:externals is not an ideal solution, as it will cause problems with the maven-release-plugin and with CI SCM polling in certain configurations. People tell you "you should not use svn externals" because it makes life more difficult. But so do a leg cast and crutches, and when you have a broken leg (or an SVN structure that is FUBAR) you really do not mind the inconvenience.
There are several ways of doing this, depending on your SVN structure and the desired workspace structure:
svn co http://path.to.your.svn.location.of.modulA modulA
cd modulA
svn propedit svn:externals .
(NB: note the (curdir) dot at the end of the line!)
In the editor, add the following line:
parentModul http://path.to.your.svn.location.of.parentModul
then commit and update:
svn ci -m "added externals for parent"
svn up
and svn retrieves your code into the following workspace structure:
modulA/
|- pom.xml
|- parentModul/
|  |- modulB/
|  |  |- pom.xml
|  |- pom.xml
Or you could add a new 'workspace' directory to SVN, add a new pom.xml there, and pull in all the modules you could ever want via svn:externals at one level (or at multiple levels).
In my case it is not possible to add another parent project; modulA is already checked into SVN independently from the others.
This indicates that you're probably wrong in putting modulA in the <modules> section. If modulA has a different life cycle, then you should include it as a dependency instead.
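A sketch of the dependency approach, assuming hypothetical coordinates for modulA (the groupId and version here are made up):

```xml
<dependency>
  <groupId>com.example</groupId>   <!-- hypothetical groupId -->
  <artifactId>modulA</artifactId>
  <version>1.0.0</version>         <!-- hypothetical version -->
</dependency>
```

modulA is then built and released on its own cycle, and the modules under parentModul simply consume the installed/deployed artifact.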
Related
I have a C# .NET v4.6.1 compiled Azure Function, I am using Microsoft.NET.Sdk.Functions and I deploy it using my usual CI/CD Pipeline.
That being: a TeamCity MSBuild build step to create an OctoPack package (because I have the OctoPack 3.6.3 NuGet package installed).
I then publish the resulting *.nupkg file to Octopus and Create a Release.
This is how I do all my Azure App Services, however as is nicely described in this post, compiled Azure Functions create a few extra files/folders when they are published to describe the entry point for the Function.
I can see (in the TeamCity build logs) that these extra files/folders are created by MSBuild (15.3.409.57025), but only AFTER it has prepared the OctoPack package. Meaning my OctoPack artifact contains neither the necessary function-specific folder(s) with the function.json file nor the functionsSdk.out.
I have managed to get around this issue by adding an extra TeamCity NuGet Pack build step to build the OctoPack package again. I also had to create a *.nuspec file in the project root, where I tell NuGet Pack to include everything (see below), because using just the *.csproj file also ignored the extra folders/files.
<files>
<file src="bin\Release\net461\**\*.*" />
</files>
This works because it runs after the MSBuild step, when the extra folders/files are present. It will also be robust enough to support other Functions when they are added to the project going forward.
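For completeness, a minimal *.nuspec sketch around that <files> section (id, version, and authors are placeholders, not the real project values):

```xml
<?xml version="1.0"?>
<package xmlns="http://schemas.microsoft.com/packaging/2010/07/nuspec.xsd">
  <metadata>
    <id>MyFunctionApp</id>     <!-- placeholder -->
    <version>1.0.0</version>   <!-- placeholder -->
    <authors>me</authors>      <!-- placeholder -->
    <description>Azure Function deployment package</description>
  </metadata>
  <files>
    <file src="bin\Release\net461\**\*.*" />
  </files>
</package>
```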
The need for this extra step and the *.nuspec file seems unnecessary. Can anyone see where I went wrong, and why MSBuild seems to have the sequence of Publish and OctoPack wrong?
This could be a reason:
If the <files> section exists, OctoPack by default won't attempt to automatically add any extra files to your package, so you'll need to be explicit about which files you want to include. You can override this behavior with /p:OctoPackEnforceAddingFiles=true, which will instruct OctoPack to package a combination of files using its conventions and those defined by your <files> section.
https://octopus.com/docs/packaging-applications/creating-packages/nuget-packages/using-octopack
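If that is the cause, the flag can be passed on the MSBuild command line (/p:OctoPackEnforceAddingFiles=true) or, equivalently, set as an MSBuild property in the .csproj; a sketch:

```xml
<PropertyGroup>
  <OctoPackEnforceAddingFiles>true</OctoPackEnforceAddingFiles>
</PropertyGroup>
```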
Another idea: a broken .csproj file. Please check it. Maybe during a merge these two lines were reordered:
<Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
<Import Project="..\packages\OctoPack.3.6.3\build\OctoPack.targets" Condition="Exists('..\packages\OctoPack.3.6.3\build\OctoPack.targets')" />
Microsoft.CSharp.targets should come first; order matters.
Workaround: run OctoPack after publish:
<Target Name="SetRunOctoPack">
<PropertyGroup>
<RunOctoPack>true</RunOctoPack>
</PropertyGroup>
</Target>
<Target Name="AfterPublish" DependsOnTargets="SetRunOctoPack">
<CallTarget Targets="OctoPack"/>
</Target>
I am working with the following project structure
parent
+-- pom.xml (parent and reactor)
module-1
+-- pom.xml
module-...
+-- pom.xml
I would like to be able to do a mvn release:prepare on the parent project and have the resulting war as well as a consistent tag structure in svn.
Right now everything seems to work fine except the tagging of the modules; that is, mvn release:prepare will tag the parent project but none of the child projects. I have already found and tried the switch commitByProject in the configuration of the parent POM. I have entered and removed scm configurations in the module POMs, and I have tried configuring the release plugin in the module POMs, all to no avail. The release step never asks me for a tag name for any of the modules and consequently does not create a tag later on in the process.
How do I configure parent and module such that a mvn release:prepare will tag the modules?
I would suggest reorganizing the structure to fit Maven's best practice, like the following:
root (pom.xml; parent)
+-- module-1 (pom.xml)
+-- module-2 (pom.xml)
+-- module-...
This will make your life easier with Maven and also in doing a release via mvn release:prepare etc.
I assume you have in VCS the following folder structure:
root
+-- parent (pom.xml)
+-- module-1 (pom.xml)
+-- module-2 (pom.xml)
+-- module-...
root is the folder which is checked out from version control (trunk in SVN, or master in Git).
If you have given the correct relative path to the parent in each of the modules, everything should work without any problem, once the scm part is configured in the parent.
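A sketch of such an scm section in the parent POM (the repository URLs are placeholders):

```xml
<scm>
  <connection>scm:svn:http://svn.example.com/repo/trunk/root</connection>
  <developerConnection>scm:svn:http://svn.example.com/repo/trunk/root</developerConnection>
  <url>http://svn.example.com/repo/trunk/root</url>
</scm>
```

With this in place, the release plugin can derive the tag location for the whole reactor from the parent's scm information.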
After further countless hours of searching, I no longer believe it is possible to tag each module independently of the others using the maven-release-plugin.
I have found (and lost) an explicit comment that this is not possible with the release plugin, and there are further hints, for example that the release plugin only accepts exactly one SCM tag in non-interactive mode.
As I'm a Java developer, not a Maven developer, I refuse to change my package structure and am thus stuck with doing the tagging by hand.
I developed a custom mediator and its corresponding Factory/Serializer classes so that I can configure complex configuration options for it inside a sequence. This was made with a carbon app project using carbon studio.
The thing is that the only way I can make the configuration element to work is by deploying the jar file outside the .car file.
Can I just deploy the .car file to make it work? Where do I have to put the META-INF/services folder in order to work properly?
Here's my CarbonAppProject structure
CarbonApp
+--artifacts
+--lib
+--library
+--bundle
+--jXLS <-- Java Library Artifact
+--synapse
+--mediator
+--XlsToObjectMediator <-- Custom Mediator Artifact
+--builder
+--META-INF
+--services
+--org.apache.synapse.config.xml.MediatorFactory <-- FILE
+--org.apache.synapse.config.xml.MediatorSerializer <-- FILE
+--src
+--main
+--java
+--<package> <-- mediator, factory and serializer classes
Any help will be much appreciated.
You cannot get this to work with this version of Carbon Studio, but it is possible with a newer version. You can get it from http://builder1.us1.wso2.org/~developerstudio/developer-studio/2.0.0/RC1/wso2-developer-studio_2.0.0.RC1.zip
Steps:
Create an ESB Custom Mediator project and create your mediator sources there
Copy the META-INF/services folder to the src/main/resources folder of the same project.
Build the Custom Mediator project with Maven
Create a Java Library Artifact project and make sure to add jXLS library to it.
Build the Java Library Artifact Project with Maven
Create Carbon Application Project
Add the Custom Mediator Project and Java Library Artifact project as dependencies of the Carbon Application Project.
Build the C-App project with Maven
Now you will be able to get the ESB Custom Mediator running in ESB without any issue.
.
|-- pom.xml
`-- src
`-- main
|-- java
| `-- ddd
| `-- dd.java
`-- resources
`-- META-INF
`-- services
|-- org.apache.synapse.config.xml.MediatorFactory
`-- org.apache.synapse.config.xml.MediatorSerializer
Your mediator project structure would be similar to above
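Each file under META-INF/services is a plain-text Java service provider file listing one fully qualified implementation class per line. A hypothetical example for org.apache.synapse.config.xml.MediatorFactory (the class name is made up; use the factory class from your own <package>):

```
org.example.xls.XlsToObjectMediatorFactory
```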
Hope this helps!!
Thanks.
/Harshana
I don't think declaring a project-level dependency between the JavaLibraryArtifact project and the Custom mediator project will solve this.
But you can achieve this by adding the dependency to the actual 3rd Party Library from the mediator project.
Steps:
Right click on the Custom mediator project, select Build Path -> Configure Build Path
Go to the Libraries tab and click the "Add Jars" button. This will open the Project Browser dialog
Expand the JavaLibraryArtifact Project in the Project Browser and select the jXLS library from the file list in there and click on "Ok"
Now you have added the jXLS to your Custom mediator project buildpath. So you won't see any errors in your Custom Mediator project.
If you open the .classpath file of the Custom mediator project, you will see an entry similar to the following:
<classpathentry kind="lib" path="/JavaLibArtifactProject/jXLS.jar"/>
To avoid compilation errors from Maven, you need to add a Dependency to jXLS library in the Custom mediator project pom.xml.
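That pom.xml addition would look something like this (the jXLS coordinates and version are a guess; check the ones you actually bundle, and keep the scope provided so the jar is not duplicated in the build output):

```xml
<dependency>
  <groupId>net.sf.jxls</groupId>      <!-- assumed coordinates -->
  <artifactId>jxls-core</artifactId>  <!-- assumed coordinates -->
  <version>1.0</version>              <!-- hypothetical version -->
  <scope>provided</scope>
</dependency>
```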
By following the above steps you can avoid duplicating the JXLS library in your projects.
Hope this helps!!
Thanks.
/Harshana
I have a mercurial repository which has several projects (from an IntelliJ IDEA sense) within it. for example, I might have:
foo/
projects/
project1/
.idea/
project2/
.idea/
I can push, pull, commit etc. fine with the command line and TortoiseHg. I've enabled Mercurial (hg4idea plugin) within IntelliJ, yet almost nothing seems to be working. If I add a source code file it doesn't get added; files I've added manually show no changes. In IntelliJ the Mercurial menu is enabled, but "add to VCS" is always greyed out. However, IntelliJ correctly lists available changesets from a remote repository.
In the Version Control window for commands like "hg status" I get errors like:
abort: repository C:/foo/projects/project1 not found!
Commands like "hg incoming" seem to be succeeding.
I suspect this might be because the project root (project1) is below the repository root (foo). Does anyone know how to resolve this problem? Is there a configuration change I can make? If so, where in the settings is it?
I'm using the latest (10.0.3) IntelliJ IDEA Community edition.
I managed to work this one out myself. When you enable the project for Mercurial, IntelliJ sets the project root by default as the mercurial repository directory. With the help of this help page I worked out this was what I needed to change.
Go to the Settings dialog (File-Settings)
Choose Version Control
At the top level it shows a list of Directories and VCS's, mine said "project root" and "Mercurial"
Click "Remove" to remove the existing mapping.
Click "Add" to add a mapping, and use the "..." button to choose the mercurial root - "foo" in the example above.
This changes the vcs.xml file in the .idea directory.
The steps provided by Nick worked for me in a similar setup with multiple projects living in the same Mercurial repository.
I was worried that IDEA would use absolute paths in vcs.xml but it's more intelligent:
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="VcsDirectoryMappings">
<mapping directory="$PROJECT_DIR$/.." vcs="hg4idea" />
</component>
</project>
I've got a foo.war file from a third-party vendor. I've defined a context in my Tomcat configuration by creating conf/Catalina/localhost/foo.xml that contains:
<Context docBase="/path/to/foo.war" ...> ... </Context>
I want Tomcat to load up the foo context at startup. But the WEB-INF/web.xml (deployment descriptor) in the foo.war file does not include a <load-on-startup>, so Tomcat waits until the first request. I'd really rather not unpack the third-party foo.war to edit their web.xml. Plus, I'd have to do it every time the vendor releases a new version of their .war.
Is there any way within Tomcat configuration to tell Tomcat to load the foo context at startup? I know that within the <Context> element you can set parameters, env vars, etc without editing the web.xml. But I can't find anything in the Tomcat docs about loading on startup.
This is tricky. You're limited by the conventions of Tomcat and other containers, so there's no straightforward solution.
You could use the global web.xml to initialize specific servlets and/or JSPs from the .war using the <load-on-startup> element. This is the only way I know of to force load-on-startup without modifying the .war file or the WEB-INF/web.xml inside it. Note that you may need to initialize the servlets and JSPs using different names/paths to avoid conflicts.
Of course, doing it that way means you have to know enough about the .war to initialize the app, which might mean looking at its web.xml to determine what to load. This might defeat the purpose, since it's not exactly a hands-off approach to loading just any .war on startup. But with a little extra work, you could write a script that extracts the necessary information from the .war file's web.xml and adds it to your global web.xml automatically.
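For illustration, the kind of entry such a script would add to the global web.xml (the servlet name and class are placeholders to be copied from the war's own web.xml):

```xml
<servlet>
  <servlet-name>fooServlet</servlet-name>                  <!-- placeholder -->
  <servlet-class>com.vendor.foo.FooServlet</servlet-class> <!-- placeholder -->
  <load-on-startup>1</load-on-startup>
</servlet>
```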
Now, if you're willing to consider script writing to modify the .war file, you could just write a script that extracts WEB-INF/web.xml from the .war file, adds <load-on-startup> child elements to all the <servlet> elements, and updates the .war with the new copy. I'm not sure what environment you're using to run Tomcat, but here's an example bash script that would do the job:
#!/bin/bash
TEMPDIR=/tmp/temp$$
WARFILE=/path-to-tomcat/webapps/foo.war
mkdir -p $TEMPDIR/WEB-INF
pushd $TEMPDIR
# extract web.xml, injecting a <load-on-startup> into each <servlet> element
unzip -qq -c $WARFILE WEB-INF/web.xml \
| sed 's#</servlet>#<load-on-startup>99</load-on-startup></servlet>#' \
> WEB-INF/web.xml
# freshen the existing WEB-INF/web.xml entry inside the war
zip -f $WARFILE WEB-INF/web.xml
popd
rm -rf $TEMPDIR
You could run this script or something similar as part of your Tomcat startup. Hope this helps.