MSBuild doing OctoPack before Publish - function

I have a C# .NET v4.6.1 compiled Azure Function, I am using Microsoft.NET.Sdk.Functions and I deploy it using my usual CI/CD Pipeline.
That being, a TeamCity MSBuild Build Step to create an OctoPack (because I have installed the OctoPack 3.6.3 NuGet package).
I then publish the resulting *.nupkg file to Octopus and Create a Release.
This is how I do all my Azure App Services, however as is nicely described in this post, compiled Azure Functions create a few extra files/folders when they are published to describe the entry point for the Function.
I can see (in the TeamCity build logs) that these extra files/folders are created by MSBuild (15.3.409.57025), but only AFTER it has prepared the OctoPack. This means my OctoPack artifact contains neither the necessary function-specific folder(s) with the function.json file nor the functionsSdk.out.
I have managed to get around this issue by doing an extra TeamCity NuGet Pack Build Step to build the OctoPack again. I also had to create a *.nuspec file in the project root, where I tell NuGet Pack to include everything (see below), because using just the *.csproj file also ignored the extra folders/files.
<files>
  <file src="bin\Release\net461\**\*.*" />
</files>
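(For reference, a complete minimal *.nuspec built around that <files> section might look like the following; the id, version, authors, and description metadata here are placeholders, not the actual project's values.)

<?xml version="1.0"?>
<package>
  <metadata>
    <id>MyFunctionApp</id>
    <version>1.0.0</version>
    <authors>me</authors>
    <description>Azure Function deployment package</description>
  </metadata>
  <files>
    <file src="bin\Release\net461\**\*.*" />
  </files>
</package>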
This extra NuGet Pack Step works because it runs after the MSBuild Step, when the extra folders/files are present. It should also be robust enough to support other Functions as they are added to the Project going forward.
The need for this extra step and the *.nuspec file seems unnecessary. Can anyone see where I went wrong, and why MSBuild seems to have the sequence of Publish and OctoPack wrong?

This could be a reason:
If the <files> section exists, OctoPack by default won't attempt to automatically add any extra files to your package, so you'll need to be explicit about which files you want to include. You can override this behavior with /p:OctoPackEnforceAddingFiles=true which will instruct OctoPack to package a combination of files using its conventions, and those defined by your <files> section.
https://octopus.com/docs/packaging-applications/creating-packages/nuget-packages/using-octopack
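For example, the TeamCity MSBuild step could pass both properties on the command line (the project file name here is illustrative):

msbuild MyFunctionApp.csproj /t:Build /p:Configuration=Release /p:RunOctoPack=true /p:OctoPackEnforceAddingFiles=true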
Another idea: a broken .csproj file. Please check it.
Maybe during a merge these two lines were reordered:
<Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
<Import Project="..\packages\OctoPack.3.6.3\build\OctoPack.targets" Condition="Exists('..\packages\OctoPack.3.6.3\build\OctoPack.targets')" />
Microsoft.CSharp.targets should come first; order matters.

Workaround: OctoPack running after publish
<Target Name="SetRunOctoPack">
  <PropertyGroup>
    <RunOctoPack>true</RunOctoPack>
  </PropertyGroup>
</Target>
<Target Name="AfterPublish" DependsOnTargets="SetRunOctoPack">
  <CallTarget Targets="OctoPack" />
</Target>
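With those targets added to the .csproj, invoking the Publish target should trigger OctoPack after publishing has finished, e.g. (project file name illustrative):

msbuild MyFunctionApp.csproj /t:Publish /p:Configuration=Release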

Related

SSIS Deployment Woes

I'm quite confused as to how to create a deployment in SSIS 2008 that I can use throughout the various sites we are going to deploy to. I'm using the deployment utility to deploy my ETL packages which are file based and executed using a SQL job.
When I rebuild my solution, the deployment files are created along with their configuration files, which I bind my connection strings to. I've discovered that each of the packages is still referencing the configuration files in my project folder, rather than the configuration files in the deployment folder. I thought that when I created a deployment, the paths referencing the configuration files would be relative paths.
Ideally, what I would have liked to have been able to do would be to copy the contents of the deployment folder to a flash drive, plug it in at the site I'm deploying to and edit the configuration file per the customer site, execute the deployment manifest file in the folder and expect everything to work. But this doesn't seem to be the case.
I also notice that the SQL job has an option to specify the configuration files for the packages, but this doesn't seem to have an effect either. I must clearly be doing something wrong here; please could someone assist.
Seems like you are encountering these two issues with SSIS deployment and execution:
Configuration file references are stored as absolute paths, meaning the concrete path used in the development environment when the configuration file reference was created; in production, that same path will be used.
Specifying a different configuration file at runtime in SSIS 2008 cannot override values specified at design time (see Understanding How SSIS Package Configurations Are Applied at Run Time).
To deploy your packages with a simple file copy the way you describe, you must change your packages to use a relative reference to your configuration files:
Right click the package file and select View Source to open the XML view of the package source. Search for your configuration file, which will include the path, and remove the path; keeping only the filename portion. Alternatively, change the absolute path to a relative path to the configuration file. Save and close the XML view of the package.
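For illustration, the relevant fragment of the package XML looks something like this in the SSIS 2008 format (the path here is a placeholder):

<!-- Before: absolute design-time path -->
<DTS:Property DTS:Name="ConfigurationString">C:\Dev\MyProject\MyPackage.dtsConfig</DTS:Property>

<!-- After: filename only, resolved relative to the execution directory -->
<DTS:Property DTS:Name="ConfigurationString">MyPackage.dtsConfig</DTS:Property>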
Now when you deploy the package and the configuration file together, ensuring they have the same relative location to each other, the package will find the config file by the relative path, and work the way you expect.
Note: from this point forward you will need to open the BIDS IDE by double-clicking on the project or solution file. If you launch Visual Studio, and then open the project or solution from within the IDE, the IDE will not be able to find the configuration file when you execute the package (the current directory will be Windows\System32, not your package folder).

Automated deployment options for SSRS

I have been tasked to look into ways to automate the deployment process for our SSRS 2012 reports. Are there any good tools out there? I'm thinking of something along the lines of press a button and the report gets deployed.
Thanks!
To deploy our SSRS reports, we're using this lovely powershell project:
https://github.com/timabell/ssrs-powershell-deploy
Usage:
.\Deploy-SSRSProject.ps1 -path YourReportsProject.rptproj -configuration Release -verbose
or you can use the alternate parameter set:
.\Deploy-SSRSProject\Deploy-SSRSProject.ps1 -path .\AFS.Reports.rptproj -ServerUrl http://localhost/Reportserver -Folder MyReports -DataSourceFolder "MyReports/Data Sources" -DataSetFolder "MyReports/Datasets" -verbose
The full deployment story (for us):
ssrs-powershell-deploy scripts, .rptproj, .rds, .rdl files are all packaged into a nuget package by our build server.
Octopus Deploy extracts the nuget package on our SSRS server and calls Deploy-SSRSProject.ps1
Visual Studio Deployment
Visual Studio is actually really good at automatic deployment. I've used it a number of times with great results. You need to split your solution into separate projects for each folder on the report server and then it will take a bit of time to configure each project & deployment environment. But after that initial time investment it works wonders and when you add a new project you can simply copy the deployment settings for an existing project.
MSDN article: Set Deployment Properties (Reporting Services)
Rs.exe Utility
Alternatively you can use the Rs.exe utility which comes with SSRS. It is a command-line utility used for automated deployment and administration. I haven't personally used this one, but I know of it. It is my understanding that there are also third-party utilities which leverage Rs.exe to automate report deployment, but I haven't used any of them, so I can't recommend one.
More info on MSDN: RS.exe Utility (SSRS)
I'm sure there are also other third-party tools you could get but I haven't ever looked into them. I've always found the Visual Studio deployment functionality sufficient for my needs.
I have done it using devenv, which is located in:
C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE. The ...10.0 folder depends on the version of your Visual Studio; I have only used this version, so I cannot vouch for any other. If you view the help via the /? command-line switch, you can see there are options to build and/or deploy a solution.
In brief I used: devenv {solutionfile} /Deploy {configuration} /Project {projectname}.
I did not use any deployment software, but I do know how to code in C#.NET, so it pays to be familiar with the System.IO and System.Xml namespaces.
The requirement was to deploy any file (reports, datasets, or data sources) that has been modified within the past two-week sprint.
So basically my .NET code worked as follows (NB: there are areas you might have to consider first, such as how well you know how solution and project files work, and whether you have more than one platform; a platform, for the uninitiated, is a distinct set of project build and deployment settings):
1. Read the .sln file line by line to get the list of projects for the specific platform that are ready to be built/deployed; for simplicity my code assumed only one platform and that all projects were to be deployed.
2. The list from step 1 gives me the subproject\subproject.rptproj entries, which I break on the backslash to get the subfolder name; from there I iterate over all files in the project folder and check each file's LastWritten datetime stamp to determine which files need to be deployed (a sketch of this step follows below).
3. Back up the entire project file (declaration and contents).
4. If a file has not changed, edit the project file on the fly as XML and remove all unwanted files (ProjectItems) that are not to be deployed.
5. If any dataset or data source files have changed, also edit the respective configuration section of the project file accordingly.
6. Run my build solution process, i.e. devenv with command-line args (FYI: I did not encounter any .NET exceptions in this step).
7. Restore my project file.
Providing your SSRS solution is configured correctly and the person running the .NET command-line solution has permissions to deploy, all should be well. It was easy enough to share my command-line solution's source code with anyone else in my team, to avoid having to white-list the exe if your company has employed such restrictions.
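To make steps 2 and 4 concrete, here is a minimal C# sketch of the timestamp filter (this is not the answerer's actual code; the paths, the two-week cutoff, and the older .rptproj layout with <ProjectItem>/<FullPath> elements are assumptions):

using System;
using System.IO;
using System.Linq;
using System.Xml;

class SprintDeployFilter
{
    static void Main()
    {
        // Hypothetical paths; substitute your own solution layout.
        string projectDir = @"C:\Reports\SubProject";
        string projectFile = Path.Combine(projectDir, "SubProject.rptproj");
        DateTime cutoff = DateTime.Now.AddDays(-14); // the two-week sprint window

        var doc = new XmlDocument();
        doc.Load(projectFile);

        // Pre-2016 .rptproj files list reports as <ProjectItem> elements with a
        // <FullPath> child; copy to a list so removal doesn't break iteration.
        var items = doc.SelectNodes("//ProjectItem").Cast<XmlNode>().ToList();
        foreach (XmlNode item in items)
        {
            XmlNode fullPath = item.SelectSingleNode("FullPath");
            if (fullPath == null) continue;

            string file = Path.Combine(projectDir, fullPath.InnerText);
            if (File.GetLastWriteTime(file) < cutoff)
                item.ParentNode.RemoveChild(item); // step 4: drop unchanged items
        }

        // Save over the project file (it was backed up in step 3 and is
        // restored in step 7 after devenv /Deploy has run).
        doc.Save(projectFile);
    }
}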

Jenkins build outside of workspace

I am new to Jenkins/Hudson and am trying to migrate a C make-based project from buildbot. For legacy reasons, the build system is hard-coded to build outside of the versioned source tree (git), one directory above, in a separate directory. E.g.:
workspace/
    .git
    foo
    bar
build/
    artifacts
Besides the fact that this creates a directory outside the workspace, Jenkins won't recognize items in the build/ directory above as artifacts to archive.
How can I make this kind of build system work with Hudson? Building in-source-tree is not a short-term option. The only option I found was "use custom workspace," but all this does is hard-code the workspace directory to some other directory.
To answer my own question: there is indeed an option in the Jenkins git plugin to check out to a local subdirectory instead of the root of the workspace. With the git plugin, click the Advanced button and fill in the field "Local subdirectory for repo (optional)".
I don't find the option that djs mentioned, but you can specify a different work directory:
Configure the job
Extended Project settings
Use custom workspace
This can be set to anywhere you want, including the workspace of a different job.

Ant script generates a findbugs_result.xml, but Hudson cannot display it on the main interface. Why?

My FindBugs target in build.xml (its output: "Findbugs checks Coding...") runs the analysis and generates a findbugs_result.xml.
I downloaded the FindBugs plugin for Hudson and entered the findbugs_result.xml path in the job configuration.
But Hudson cannot display the results on the main interface. Why?
I've found some annoying inconsistencies in the way Hudson stores paths; sometimes you use the overall workspace directory for the Hudson job and other times you have to use the path to the code that the job checks out of source control. For example, if your code gets checked out to MyProject under your workspace directory, and then the test XML files go into MyProject/target/test-reports, try specifying the path with and without the MyProject at the beginning of the path.
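For example, if the test reports land in MyProject/target/test-reports under the workspace, the two variants to try would be (paths illustrative):

target/test-reports/*.xml
MyProject/target/test-reports/*.xml

A recursive Ant-style pattern such as **/findbugs_result.xml can also sidestep the ambiguity for the FindBugs plugin's result path.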

Load a context/servlet at startup in Tomcat *WITHOUT* changing deployment descriptor (web.xml)

I've got a foo.war file from a third-party vendor. I've defined a context in my Tomcat configuration by creating conf/Catalina/localhost/foo.xml that contains:
<Context docBase="/path/to/foo.war" ...> ... </Context>
I want Tomcat to load up the foo context at startup. But the WEB-INF/web.xml (deployment descriptor) in the foo.war file does not include a <load-on-startup>, so Tomcat waits until the first request. I'd really rather not unpack the third-party foo.war to edit their web.xml. Plus, I'd have to do it every time the vendor releases a new version of their .war.
Is there any way within Tomcat configuration to tell Tomcat to load the foo context at startup? I know that within the <Context> element you can set parameters, env vars, etc without editing the web.xml. But I can't find anything in the Tomcat docs about loading on startup.
This is tricky. You're limited by the conventions of Tomcat and other containers, so there's no straightforward solution.
You could use the global web.xml to initialize specific servlets and/or JSPs from the .war using the <load-on-startup> element. This is the only way I know of to force load-on-startup without modifying the .war file or the WEB-INF/web.xml inside it. Note that you may need to initialize the servlets and JSPs using different names/paths to avoid conflicts.
Of course, doing it that way means you have to know enough about the .war to initialize the app, which might mean looking at its web.xml to determine what to load. This might defeat the purpose, since it's not exactly a hands-off approach to loading just any .war on startup. But with a little extra work, you could write a script that extracts the necessary information from the .war file's web.xml and adds it to your global web.xml automatically.
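For illustration, a global web.xml entry along these lines would force one of the .war's servlets to initialize at startup (the servlet name and class here are hypothetical; the real class name has to be taken from the vendor's own WEB-INF/web.xml, and remember that conf/web.xml is merged into every deployed webapp):

<servlet>
    <servlet-name>foo-startup</servlet-name>
    <servlet-class>com.vendor.foo.FooServlet</servlet-class>
    <load-on-startup>1</load-on-startup>
</servlet>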
Now, if you're willing to consider script writing to modify the .war file, you could just write a script that extracts WEB-INF/web.xml from the .war file, adds <load-on-startup> child elements to all the <servlet> elements, and updates the .war with the new copy. I'm not sure what environment you're using to run Tomcat, but here's an example bash script that would do the job:
#!/bin/bash
# bash rather than plain sh: pushd/popd are bash builtins.
TEMPDIR=/tmp/temp$$
WARFILE=/path-to-tomcat/webapps/foo.war

mkdir -p $TEMPDIR/WEB-INF
pushd $TEMPDIR

# Extract web.xml to stdout, append <load-on-startup> to each servlet
# definition (assumes </servlet> ends its line), and save the result.
unzip -qq -c $WARFILE WEB-INF/web.xml \
    | sed 's#</servlet>.*#<load-on-startup>99</load-on-startup></servlet>#' \
    > WEB-INF/web.xml

# Freshen the archive entry with the modified copy.
zip -f $WARFILE WEB-INF/web.xml

popd
rm -rf $TEMPDIR
You could run this script or something similar as part of your Tomcat startup. Hope this helps.