How can i combine (mercurial) changesets from two jobs in jenkins - mercurial

I want to combine the (Mercurial) changesets from two Jenkins jobs so that, instead of getting them individually on a per-job basis, I get them all in one report and can also send them via the Email-ext plugin.

Use the Multiple SCM Plugin.

Related

Executing multiple SSRS reports from SSIS package

I have developed an SSIS package to run 3 reports from Reporting Services that are data driven subscriptions.
When I run the SSIS job, it executes all 3 reports at once; what I need is to run the reports sequentially, in other words, one by one. How can I do this?
This is expected behavior. When you trigger a data-driven subscription job, the SQL Server Agent starts the job, and that completes the whole transaction. The SSIS package then goes on to trigger the next data-driven subscription job, and the next (assuming you have put the job triggering in sequence).
Now, if you want to create a dependency in the way the jobs should run, i.e. Job1 followed by Job2 followed by Job3, you need to write an additional piece of code. The way to go about it would be to monitor the status of the subscription.
In the ReportServer database there is a table called dbo.Subscriptions containing a column 'LastStatus'. Currently I don't have any subscriptions in my local DB, and I have not been able to find any documentation for the table, but I am fairly sure this would be either a boolean or a status flag such as 'Success' or 'Failure'. Upon triggering the first job, you would need to write some .NET code to monitor this status with a polling interval. Once you get the desired outcome, move on to triggering the next job.
Hope this is clear. I will edit this answer with a working example.
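The polling approach described above can be sketched as follows. This is a minimal illustration only: the terminal values 'Success'/'Failure' are an assumption (the dbo.Subscriptions table is undocumented), and real code would query the ReportServer database through a SQL client such as pyodbc; here an in-memory SQLite table stands in for it.

```python
import sqlite3
import time

def wait_for_subscription(conn, subscription_id, poll_seconds=1.0, timeout=60.0):
    """Poll the Subscriptions table until LastStatus reaches a terminal state.

    'Success'/'Failure' terminal values are an assumption; adjust once you
    have inspected what your ReportServer actually writes to LastStatus.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        row = conn.execute(
            "SELECT LastStatus FROM Subscriptions WHERE SubscriptionID = ?",
            (subscription_id,),
        ).fetchone()
        if row and row[0] in ("Success", "Failure"):
            return row[0]
        time.sleep(poll_seconds)
    raise TimeoutError(f"Subscription {subscription_id!r} did not finish in time")

# Stand-in for the ReportServer database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Subscriptions (SubscriptionID TEXT, LastStatus TEXT)")
conn.execute("INSERT INTO Subscriptions VALUES ('job1', 'Success')")
status = wait_for_subscription(conn, "job1", poll_seconds=0.01, timeout=1.0)
```

In the SSIS package you would trigger Job1, call something like `wait_for_subscription`, and only trigger Job2 once it returns 'Success'.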

Is it possible to run all the reports in a given folder from report server?

I have a folder with around 15 reports in it, these are Report Server reports. To run each report individually will take a while, so I want them to run together. So, what I want to be able to do is somehow run all the reports in this folder, is this possible?
This is somewhat of an ambiguous question. Let me explain. What are you asking specifically?
Q: Can you run multiple reports at the same time?
A: Yes, and there are several ways to accomplish this.
1. Use SQL Server Agent jobs
2. Use batch files with Task Scheduler
3. Use an SSIS package and an agent to run them at specific times... etc.
Hopefully none of the reports depends on another. Another thing you have to take into consideration is how hard you will be hitting the SSRS or SQL Server: running them all at one time may take longer than one at a time, depending on the bandwidth of the SQL Server and which tables are going to be locked during each of these processes.
You might want to give a little more detail in your question...
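Whichever of the options above triggers the reports, the fan-out itself can be done with a small script. The following is a sketch with hypothetical names: the `render` callable stands in for whatever actually executes one report (an SSRS URL-access request, an agent job start, etc.).

```python
from concurrent.futures import ThreadPoolExecutor

def run_reports(report_names, render, max_workers=4):
    """Run several reports concurrently and collect their results by name."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {name: pool.submit(render, name) for name in report_names}
        # .result() re-raises any exception a report run produced.
        return {name: f.result() for name, f in futures.items()}

# Stub renderer standing in for a real SSRS call.
results = run_reports(["Sales", "Inventory", "Returns"], lambda name: f"{name}.pdf")
```

Capping `max_workers` is one way to limit how hard you hit the server, per the caveat above.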
I would recommend an SSIS package, especially as it is also one of the options presented by @Michael, and it can email the Excel workbook too, which you mentioned in an earlier comment.
The following resource covers quite well the execution and export of an SSRS report using SSIS, including code you will need as a starting point: Executing an SSRS Report from an SSIS Package.
You could save some time in coding the solution by using the following custom Task that can be integrated into SSIS: SSIS ReportGenerator Task.
There is one problem in your requirements, though: merging reports into one Excel workbook, where I assume you want a separate sheet for each report within the same workbook?
Reporting Services can use multiple worksheets (to divide a report up into pages, a.k.a. pagination), but only for a single report; it can't merge reports into one Excel file. This can be accomplished with custom code, however. There's a somewhat basic example here: Merging workbooks into a master workbook with separate sheet for each file.
One way to run all the reports at once is to add a subscription to each of them and set the same subscription start time on all of the reports. Once the start time arrives, all the reports will run simultaneously and will generate an Excel/PDF (any format specified) file at the shared location.

SSIS package data flow tasks report

We have many SSIS packages that move, import, and export large amounts of data. What is the best way to get alerts or notifications if the expected amount of data is not processed? Or how can we get a daily report on how the different SSIS packages are functioning? Is there a way to write/use a custom component and simply plug it into the SSIS packages, instead of writing a custom component for each package?
For your first question: we use user variables in SSIS to log the number of rows processed by each step, along with the package name and execution ID. You can then run reports on the history table, and if any of the executions have a large variance in the rowcounts processed, you can trigger an event.
For your second question: yes. See here, or alternatively, Google "custom ssis component tutorial".
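The variance check on logged rowcounts can be sketched like this. The names and the 50% threshold are illustrative; `history` stands in for whatever you read back from the logging table populated by the user variables.

```python
def rowcount_alerts(history, threshold=0.5):
    """Flag packages whose latest rowcount deviates from the historical
    mean by more than `threshold` (a fraction, e.g. 0.5 = 50%)."""
    alerts = []
    for package, counts in history.items():
        if len(counts) < 2:
            continue  # not enough history to compare against
        baseline = sum(counts[:-1]) / len(counts[:-1])
        if baseline and abs(counts[-1] - baseline) / baseline > threshold:
            alerts.append(package)
    return alerts

history = {
    "LoadOrders": [1000, 1050, 990, 80],    # sudden drop: should alert
    "LoadCustomers": [200, 210, 205, 198],  # stable: no alert
}
flagged = rowcount_alerts(history)
```

Anything in `flagged` would then trigger the notification event mentioned above.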

How to pass committers between jobs if only the first job refreshes sources from SVN?

There is one job which refreshes the sources from SVN and builds them.
After the build, this job sends a notification about the build to the committers.
Then this job triggers a second job using the "Trigger parameterized build on other projects" plugin.
The second job does not refresh anything from SVN. It just runs some tools using classes compiled by the first job.
I need to send a notification to the committers if the second job fails.
Is it possible to pass the committers from the first job to the second?
I use the Blame Upstream Committers plugin. Currently it works properly only when the Parameterized Build plugin is invoked as a post-build action, not as a build step (due to a bug in the Parameterized Build plugin which is claimed to be fixed, but not yet released).

Configuring project report generation for Hudson within Hudson

There are a number of plugins for Hudson to create coverage, test result, metrics and other reports.
It seems that all of them require you to add extra configuration to your build scripts (or Maven POM) for every project that you want to have the reporting done. For example, if you want to have a FindBugs or a Cobertura report, you need to add the report-generating step to your projects.
Is it really necessary to update every single POM file? That is a lot of repeating oneself, and it requires updating the target project's source repository (where the POM is located).
Is it possible to instead have a setting within Hudson to enable report generation? It seems that all you are required to do is enable the respective Maven plugin with its default settings. Can't this be done externally by Hudson?
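For context, the per-project configuration being duplicated typically looks something like this in each POM (plugin versions here are illustrative, not a recommendation):

```xml
<reporting>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>findbugs-maven-plugin</artifactId>
      <version>3.0.5</version>
    </plugin>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>cobertura-maven-plugin</artifactId>
      <version>2.7</version>
    </plugin>
  </plugins>
</reporting>
```

This is the block that would have to be repeated in every project's POM.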
This is not possible - the Hudson philosophy is that your build tools should generate the reports. Hudson will pick those up and render them in the UI.
I think this is what Sonar is trying to solve. It generates all kinds of reports (code coverage, PMD, Checkstyle, etc.) for your projects without you having to add configuration to them. This helps a lot to cut down duplication if you have many projects to check.