I have developed an SSIS package that runs 3 reports from Reporting Services, set up as data-driven subscriptions.
When I run the SSIS job it fires all 3 reports at once; what I need is to run the reports sequentially, in other words, one by one. How can I do this?
This is expected behavior. When you trigger a data-driven subscription job, SQL Server Agent starts the job, and that alone completes the whole transaction. The SSIS package then goes on to trigger the next data-driven subscription job, and the next (assuming you have put the job-triggering tasks in sequence).
Now if you want to create a dependency in the way the jobs run, i.e. Job1 followed by Job2 followed by Job3, you need to write an additional piece of code. The way to go about it is to monitor the status of each subscription.
In the ReportServer database there is a table called dbo.Subscriptions containing a column named LastStatus. I don't have any subscriptions in my local database at the moment, and I haven't found documentation for the table, but I am fairly sure this holds either a boolean or a status string such as 'Success' or 'Failure'. After triggering the first job, you would write a small piece of code that polls this status at an interval; once you get the desired outcome, move on to triggering the next job.
Hope this is clear.
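As a rough illustration, here is the polling idea sketched in T-SQL rather than .NET. The status strings, the 30-second interval, and the name of the follow-on job are all assumptions; check the actual LastStatus values in your own ReportServer database before relying on them.

    -- Minimal polling sketch: wait for the first subscription to finish,
    -- then start the next job. Replace the GUID with your subscription's ID.
    DECLARE @SubscriptionID uniqueidentifier = '00000000-0000-0000-0000-000000000000';
    DECLARE @Status nvarchar(260);

    WHILE 1 = 1
    BEGIN
        SELECT @Status = LastStatus
        FROM   ReportServer.dbo.Subscriptions
        WHERE  SubscriptionID = @SubscriptionID;

        -- Assumed "still running" markers; verify the wording on your server.
        IF @Status NOT LIKE 'Pending%' AND @Status NOT LIKE 'Processing%'
            BREAK;

        WAITFOR DELAY '00:00:30';  -- polling interval
    END

    IF @Status NOT LIKE 'Failure%' AND @Status NOT LIKE '%error%'
        EXEC msdb.dbo.sp_start_job @job_name = N'Job2';  -- hypothetical next job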
I have a simple package that produces a CSV file from a query and updates a 'lastrun' time on a table. It was written in 2014 and runs on a test server with SQL Server 2014. The agent job that runs it simply executes it via an SSIS Package step; no other steps are involved.
However, I get the above error message in the agent log file. The job will successfully execute and produce a file, but ONLY after either restarting the agent service or changing the properties of the job (after refreshing the job list in SSMS). And because the job seemingly deletes itself during execution, there is no job history to view, and the schedule then stops repeating.
I can't find anything like this on here, and wondered if anyone has ever seen this, or has any ideas?
Thanks.
Note (update): All other agent jobs run OK on the same server. The only difference with this one is that it is the only job that calls an SSIS package.
This can happen when you restore the same database (subscriber/distribution) under a different database name: doing so clears the job automatically.
We have many SSIS packages that move, import, and export large amounts of data. What is the best way to get alerts or notifications when the expected amount of data is not processed? And how can we get a daily report on how the different SSIS packages are performing? Is there a way to write a single custom component and plug it into each SSIS package, instead of writing a custom component for every package?
For your first question, we use user variables in SSIS to log the number of rows processed by each step, along with the package name and execution ID. You can then run reports on the history table, and if any of the executions show a large variance in the row counts processed, you can trigger an event.
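As a sketch of that variance check, assuming a log table such as dbo.PackageRowCounts(PackageName, StepName, RowsProcessed, LoggedAt) populated by the packages (all names here are illustrative):

    -- Flag today's executions whose row count deviates more than 50% from
    -- that package/step's 30-day average. Table and column names are made up.
    WITH history AS (
        SELECT PackageName, StepName,
               AVG(CAST(RowsProcessed AS float)) AS AvgRows
        FROM   dbo.PackageRowCounts
        WHERE  LoggedAt >= DATEADD(DAY, -30, GETDATE())
        GROUP BY PackageName, StepName
    )
    SELECT t.PackageName, t.StepName, t.RowsProcessed, h.AvgRows
    FROM   dbo.PackageRowCounts AS t
    JOIN   history AS h
           ON  h.PackageName = t.PackageName
           AND h.StepName    = t.StepName
    WHERE  t.LoggedAt >= CAST(GETDATE() AS date)
      AND  ABS(t.RowsProcessed - h.AvgRows) > 0.5 * h.AvgRows;  -- >50% variance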
Yes. See here, or alternatively, google "custom SSIS component tutorial".
In short: is it possible to recover a report that was generated by SSRS when the SMTP send fails?
Sometimes we have network problems with SMTP, and even though the report was generated, it couldn't be sent. We can send the reports manually later, but the problem is that most reports need to be generated at a specific time, otherwise the data may not be correct (for example, end-of-day reports need to be generated at 12:00am, otherwise some updated numbers may be wrong; we want to resend the report that was generated at 12:00am when we come to work in the morning).
I considered sending these reports through snapshots, but some report subscriptions have parameters, and I couldn't find how to take a snapshot for each subscription.
Any ideas?
Thanks in advance,
Onur
You could capture the data each day at 12am using a SQL Agent job that executes a stored procedure to populate a table (or tables) with the data.
If this is a report that can also be run live, you would drive it with a hidden parameter, @extract_type, indicating whether the run is live or a snapshot. The default would be "live", and the subscription would pass "snapshot" for this parameter. The code in each dataset would filter on @extract_type, perhaps through a view.
This way, the subscription can be re-run at any time of day with the same report results.
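A minimal sketch of a dataset query for that approach, with illustrative table names (dbo.SalesSnapshot is populated by the 12am job, dbo.SalesLive is the live source):

    -- The hidden @extract_type parameter picks the source; the subscription
    -- passes 'snapshot', and interactive runs default to 'live'.
    SELECT OrderId, Amount, CapturedAt
    FROM   dbo.SalesSnapshot
    WHERE  @extract_type = 'snapshot'
    UNION ALL
    SELECT OrderId, Amount, GETDATE() AS CapturedAt
    FROM   dbo.SalesLive
    WHERE  @extract_type = 'live';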
Well, first, you're not alone; I've had issues with SMTP errors everywhere I've worked.
There is a LastStatus column in the [dbo].[Subscriptions] table. You can create a job that runs every few minutes, checks this table, and restarts all subscriptions that have SMTP errors.
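A sketch of such a retry job step, assuming the failure text starts with 'Failure sending mail' (verify the exact LastStatus wording on your server):

    -- Re-queue every subscription whose last run failed at the SMTP step.
    DECLARE @EventData nvarchar(260);

    DECLARE retry_cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT CONVERT(nvarchar(36), SubscriptionID)
        FROM   ReportServer.dbo.Subscriptions
        WHERE  LastStatus LIKE 'Failure sending mail%';

    OPEN retry_cur;
    FETCH NEXT FROM retry_cur INTO @EventData;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- The same event the report server's own agent jobs raise.
        EXEC ReportServer.dbo.AddEvent
             @EventType = N'TimedSubscription',
             @EventData = @EventData;
        FETCH NEXT FROM retry_cur INTO @EventData;
    END
    CLOSE retry_cur;
    DEALLOCATE retry_cur;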
Like SQLDiver, I think it's a good idea to store the data behind the reports in a table. Storing your data is not a job for a reporting tool.
I want to create a one-time SSRS shared schedule that can be altered after it has run, so that it executes again at a future point in time. The reason I have to set up a subscription this way is that I need to email a set of reports based on a bespoke accounting calendar; we don't have Enterprise edition, so data-driven subscriptions are not an option.
I have tested this approach against the ReportServer dbo.Subscriptions and dbo.Schedule tables, but have so far not managed to get it to work. I updated the NextRunTime and StartTime columns of the Schedule table to the time I need the reports to be sent, and this is reflected on the Report Server Shared Schedules page (Next Run), but the email does not send.
Is there another table in the ReportServer database that I need to update, or is this simply not possible?
Thanks
Andy
I suggest taking a look at the article below. It details how to set up data-driven subscriptions on SQL Server Standard Edition and walks through manually activating subscriptions via SQL Server Agent.
http://www.sqlservercentral.com/articles/Development/datadrivensubscriptions/2432/
I've used it for a couple of clients and it generally works pretty well. The only thing I've since altered is increasing the timer to 10 seconds between running each data-driven report.
Regards,
Jason
From what I've read, messing around in these tables is not supported by Microsoft. USE AT YOUR OWN RISK!
I attempted the very same thing. When you change a subscription on the front end, the ReportServer.dbo.Schedule.StartDate column changes with each edit. I tried just altering this value, but adding a value here made no difference: there was no email, and when I went back to my subscription in the SSRS web page, the original datetime value was still there. So that datetime must be stored elsewhere.
I ran a SQL trace and found that the stored procedures below were being called; I copied out the lines that looked promising. By changing the time value @active_start_time=160400 in the last one and running the entire batch, I could get an emailed report at the desired time. I'm still not content with this, though, because when I go back and view the My Subscriptions page, the original datetime is still listed.
    exec msdb.dbo.sp_delete_job @job_name=N'D478A346-ED05-422D-A73E-023080AD56DD'
    exec msdb.dbo.sp_add_job @job_name=N'D478A346-ED05-422D-A73E-023080AD56DD', @description=N'This job is owned by a report server process. Modifying this job could result in database incompatibilities. Use Report Manager or Management Studio to update this job.', @category_id=101
    exec msdb.dbo.sp_add_jobserver @job_name=N'D478A346-ED05-422D-A73E-023080AD56DD', @server_name=N'(LOCAL)'
    exec msdb.dbo.sp_add_jobstep @job_name=N'D478A346-ED05-422D-A73E-023080AD56DD', @step_name=N'D478A346-ED05-422D-A73E-023080AD56DD_step_1', @command=N'exec [ReportServer$instanceDB].dbo.AddEvent @EventType=''TimedSubscription'', @EventData=''db7d7b08-3eee-4bb8-b354-b58c653b8ab6'''
    exec msdb.dbo.sp_add_jobschedule @job_name=N'D478A346-ED05-422D-A73E-023080AD56DD', @name=N'Schedule_1', @freq_type=1, @active_start_date=20180405, @active_start_time=160400, @freq_subday_type=1
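Incidentally, the job step above suggests a lighter option for a one-off re-send: raise the subscription's event directly, without touching the msdb schedule at all. This is the same AddEvent call the job step issues (the GUID is the subscription ID from that step; substitute your own):

    -- Fire the subscription immediately, bypassing the schedule.
    EXEC [ReportServer$instanceDB].dbo.AddEvent
         @EventType = 'TimedSubscription',
         @EventData = 'db7d7b08-3eee-4bb8-b354-b58c653b8ab6';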
Is it possible to distribute reports based on a database insert trigger, using the Standard edition of SQL Server 2008 R2? The trigger would look for a certain column to have a certain value, which indicates that it is time to fire off a report. This will not happen very often, probably a couple of times a day.
This must be a common scenario, and yet I can't find any information on it, which probably means I'm asking the wrong question. (I am completely new to SQL Server.)
TSQL using FireEvent?
Thanks,
Mike
Is there any aversion to checking for the value in the application, then executing and delivering the report from there using the report execution web service? I don't like the idea of database triggers kicking off external processes; too many clunky configuration dependencies and opportunities for failure for my taste.
If you really want to pursue this, the least complicated way seems to be the following (it requires Reporting Services and the application database to be on the same instance):
Create a Reporting Services subscription that delivers the report in the desired format.
Open the SQL Server Agent job list and find the unique identifier that is the name of the job corresponding to the subscription you created (you'll have to look at the job creation time and the last/next run datetimes to work out which job relates to your subscription).
Create a trigger that executes the msdb.dbo.sp_start_job procedure, passing the job name that matches your subscription; see the sketch below.
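A minimal sketch of that trigger, with hypothetical table, column, and job names:

    -- Hypothetical trigger: when a row arrives with the flag value set,
    -- start the agent job behind the SSRS subscription (the GUID-named
    -- job found in the previous step).
    CREATE TRIGGER trg_Orders_FireReport
    ON dbo.Orders
    AFTER INSERT
    AS
    BEGIN
        SET NOCOUNT ON;
        IF EXISTS (SELECT 1 FROM inserted WHERE Status = 'ReadyToReport')
            EXEC msdb.dbo.sp_start_job @job_name = N'YourSubscriptionJobGuid';
    END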
See what I mean about clunky dependencies? I would definitely stick with checking values and delivering the report in the application layer if at all possible.
The reason you do not want to use a trigger for this is that you will not be able to insert/update/delete if reporting is down.
Can you tolerate some slack time (five minutes or so)? If so, use a trigger to write to a different table recording that the change was made, and have a job check that table for the change every five minutes; a sketch follows below.
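A sketch of that decoupled pattern, with illustrative names throughout:

    -- The trigger only records that a report is due; a five-minute agent
    -- job does the actual work, so reporting being down can't block DML.
    CREATE TABLE dbo.ReportQueue (
        Id        int IDENTITY PRIMARY KEY,
        QueuedAt  datetime NOT NULL DEFAULT GETDATE(),
        Processed bit NOT NULL DEFAULT 0
    );
    GO
    CREATE TRIGGER trg_Orders_QueueReport
    ON dbo.Orders
    AFTER INSERT
    AS
    BEGIN
        SET NOCOUNT ON;
        IF EXISTS (SELECT 1 FROM inserted WHERE Status = 'ReadyToReport')
            INSERT dbo.ReportQueue DEFAULT VALUES;
    END
    GO
    -- Agent job step, scheduled every five minutes:
    IF EXISTS (SELECT 1 FROM dbo.ReportQueue WHERE Processed = 0)
    BEGIN
        EXEC msdb.dbo.sp_start_job @job_name = N'YourSubscriptionJob';
        UPDATE dbo.ReportQueue SET Processed = 1 WHERE Processed = 0;
    END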