Crystal Reports Server - get list of currently running reports and their progress

Hope it's the right place to ask this question - usually I use SO to ask about programming...
I'm working on a project that involves Crystal Reports Server. From code, I'm able to schedule reports successfully, but when I look at the BI launch pad I don't see the report in My Recently Run Documents (I do see failed reports in that list - ones that have wrong database credentials).
When I go to the Central Management Console, find my reports in their folders, and open Properties > History, I see the report status as "Running" - and it has been like that for far longer than it should, for two different reports I have sent.
How can I diagnose what the problem is, and why it is stuck? There are no error messages anywhere about it.
How can I get a full history of all reports in the system (not just one report at a time)? And how can I see currently running reports?
How can I stop a running report?
I really hope this is the right place for this kind of question... if not, I'd be very happy to get a referral.
Thanks

How can I get a full history of all reports in the system?
Open the CMC and then click on the Instance Manager. At the bottom of the page, you can filter on the object type and status. That way, you can get a full overview of all running reports on your platform.
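If you'd rather pull the same list from code, the CMS can be queried directly. Below is a minimal sketch, assuming a BI 4.x system with the RESTful web services enabled on their default port (6405) and a release recent enough to expose the /v1/cmsquery endpoint; the server name, credentials, and the status value 0 for "Running" (from the SDK's CeScheduleStatus constants) are assumptions to verify against your own deployment.

```python
import requests

BASE = "http://your-bi-server:6405/biprws"  # hypothetical host; adjust to your system

# Log on and capture the session token from the response header.
logon = requests.post(
    f"{BASE}/logon/long",
    json={"userName": "Administrator", "password": "secret", "auth": "secEnterprise"},
    headers={"Accept": "application/json"},
)
logon.raise_for_status()
# The token is usually expected back wrapped in double quotes.
token = '"' + logon.headers["X-SAP-LogonToken"].strip('"') + '"'

# Ask the CMS for all report instances whose schedule status is 0 ("Running").
query = ("SELECT SI_ID, SI_NAME, SI_PARENT_FOLDER "
         "FROM CI_INFOOBJECTS "
         "WHERE SI_INSTANCE = 1 AND SI_SCHEDULE_STATUS = 0")
resp = requests.post(
    f"{BASE}/v1/cmsquery",
    json={"query": query},
    headers={"Accept": "application/json", "X-SAP-LogonToken": token},
)
resp.raise_for_status()
# The exact response shape varies by version; "entries" is typical.
for entry in resp.json().get("entries", []):
    print(entry.get("SI_ID"), entry.get("SI_NAME"))
```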
How can I stop a running report?
If you select a running instance (either in a document's history page or in the Instance Manager), you'll notice that there is no stop button. Instead, you have to delete the running instance. It might not stop running immediately (depending on what it's doing), but it will disappear from the list of instances right away.
How can I diagnose what the problem is?
What I would recommend is to enable tracing on all related servers (i.e. your Job Server, Processing Server, etc.) and then retry scheduling the report. This should generate additional logging on the server, which you can use to diagnose the issue.
The trace files have the extension .glf (generic log file) and are located in the logging folder on your Crystal Server. Have a look at the command-line property of each server for which you're enabling tracing; you should find a log folder path in there somewhere.
Make sure to turn tracing off again as soon as you're finished: tracing not only puts extra strain on your servers (slowing the system down), but also results in very large log files.
Before starting with tracing, have a look at the existing log files to see whether they already contain error messages that might help you diagnose the issue. Sort the log files by date and look at the most recent one for each of the servers involved. If there's nothing in there, start tracing, but remove the existing .glf files first to minimise log contamination (some files will be locked; just ignore them).
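If the logging folder holds hundreds of .glf files, a short script can pick out the recent ones for you. A minimal sketch, assuming the default BI 4.x logging directory on Windows (adjust the path, the file count, and the keywords to taste):

```python
import glob
import os

# Default logging folder on a Windows Crystal Server install; adjust to yours.
LOG_DIR = (r"C:\Program Files (x86)\SAP BusinessObjects"
           r"\SAP BusinessObjects Enterprise XI 4.0\logging")

# Sort all .glf files newest-first and scan the most recent ones for error lines.
logs = sorted(glob.glob(os.path.join(LOG_DIR, "*.glf")),
              key=os.path.getmtime, reverse=True)
for path in logs[:10]:  # the ten most recently modified files
    with open(path, errors="replace") as fh:
        hits = [line.rstrip() for line in fh
                if "error" in line.lower() or "exception" in line.lower()]
    if hits:
        print(f"=== {os.path.basename(path)} ===")
        print("\n".join(hits[:20]))  # first 20 matches per file
```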

Related

Sudden increase in package failures executed against SharePoint/Office 365 from SSIS

Beginning yesterday afternoon (12/9, US Central time) we began seeing a marked increase in SSIS package execution failures. These packages have been in operation for several months and experienced no failures on 12/8. Initially I brushed it off as temporary, but now it seems as if "none" of them are working. Several of these packages run hourly, with the first failure around 10:30 on 12/9. Between 10:30 and 15:00 'most' succeeded, but after 15:00 on 12/9 most failed.
I'm testing with a relatively simple data flow package. I have two sources (SQL and SharePoint). From the sources, I compare the two and then update the SharePoint list with any changes that have been made (the SQL query is the authoritative record). The source SharePoint list is the same list that is being updated. As a further test, I removed all steps except querying the SharePoint list and sorting it. The initial query still fails.
Errors are happening inconsistently within the data flow package. For example, since I've been testing this morning, I had one run (and only one) that made it through the package to the point where it should have tried to add, update, or delete list items. The table comparison resulted in updates to the SharePoint list, and the package failed when attempting to update the records. Most runs (and all recent attempts) fail when the data flow queries the SharePoint list initially. There are only two records on the SharePoint list and two records in the SQL table.
I'm connecting to SharePoint using MS Graph. Testing the connection (Connection Manager) within VS 2019 succeeds every time. I've verified that the secret I'm using is not expired; I created a new secret and am receiving the same error. 'Usually' previewing the SharePoint source succeeds, but not always. Even when it is successful, attempting to debug and run the package fails. I'm not seeing any alerts on Microsoft or Azure that would indicate the problem is there, though I feel like something must have changed.
I have opened a support ticket with CozyRoc and they have directed me to open a ticket with Microsoft. Microsoft's support request workflow is directing me here.
In the production All Execution reports, the error I'm getting back is:
"Data Flow Task:Error: Attempt to read message string for 0xc02090f5 failed with error 0xc02090f2. Make sure all message related files are registered."
Initial research pointed me toward a data typing issue, but I haven't changed anything in our SharePoint, SSIS, or SQL environment that would have changed the data types.
This appears to be very repeatable so I can try providing more information if needed.
Looks like the answer was to wait. After about a week, the issue resolved as quickly as it appeared. I didn't find a way to report my issue directly to Microsoft without adding a support plan. I was in the process of finding alternative methods to address our needs when it resolved 'by itself'.

SSRS Subscription strategies

I'm hoping someone with a little more experience than me will know a better way to schedule SSRS reports.
Here is my situation: I have a report on my SSRS report server which takes about a minute to run. I have about a dozen subscriptions set up on the report which cause it to output to an Excel file on a network drive every hour. About a dozen users check this file regularly, as it contains a queue of work that's come in from their helpdesk.
I believe the users are leaving the file open in Excel, because periodically the file starts locking and I get these errors:
Failure writing file : The process cannot access the file
'' because it is being used by another process.
Even though I've told them all to make copies of the file and only open it locally, they still are having trouble with this.
It seems like a natural way to use SSRS, but because of human tendencies, it is not really working. Is there a better way to do this?

SSRS Reports location when being deployed

I'm just helping out while our regular SSRS guy is away and having an issue.
There is currently a very large report being generated. The subscription settings for this report are to FTP the report. The subscription status for the report currently says "Processing delivery...", which I take to mean that it finished generating and is now trying to send the file to the FTP location. I can see that the file is there in the FTP folder, but its size is still 0 KB. It has been saying "Processing delivery..." for a long time now.
My question is: is there a location (folder or SQL table) I can check to see if SSRS is actually doing something? I assume it would create a temporary file somewhere locally before sending it on its way.
Check the ExecutionLog views in the ReportServer database of your SQL instance. There are three views which show everything from the requested format to start times, completion times, and parameters used. Take a look at the time difference between start and completion for previous runs and compare it to the running time of the current report, whose completion time will be null while it is still processing.
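For example, a query along these lines pulls the recent history. This is a sketch, assuming the ExecutionLog3 view (present on SQL Server 2008 R2 and later; older versions use ExecutionLog2 or ExecutionLog) and a placeholder connection string:

```python
import pyodbc

# Placeholder connection string; point it at your ReportServer database.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=your-sql-server;DATABASE=ReportServer;Trusted_Connection=yes;"
)

# Recent executions, newest first. DurationSec is NULL while a run is
# still in flight, so compare it against past runs of the same report.
sql = """
SELECT TOP 50 ItemPath, Format, TimeStart, TimeEnd,
       DATEDIFF(second, TimeStart, TimeEnd) AS DurationSec, Status
FROM dbo.ExecutionLog3
ORDER BY TimeStart DESC;
"""
for row in conn.cursor().execute(sql):
    print(row.ItemPath, row.TimeStart, row.DurationSec, row.Status)
```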
For bonus points: set up an SSRS report based on the view and add a subscription for daily email delivery. This saves a TON of time when troubleshooting reports where users can't tell you the parameters they used, or the morning reports are empty, etc.
Also, funny as it sounds, you might want to check the free space on the server that the report is output to. With no free space, the report will show "Processing delivery..." for a long time. If it's FTP and you don't have control of the FTP server, it still might be worth the quick phone call.

TFS Requirements Overview Report showing wrong data

I have an interesting issue with TFS reports. When I run the QUERY: Team Queries->Planning and Tracking->Work Breakdown, I see the correct information, which is to say that I see the work items, etc. that are entered into TFS. However, when I run the REPORT: Reports->Project Management->Requirements Overview I see that same data PLUS data that is no longer in the system.
Important information:
* I am using TFS 2010
* When I originally created this project, I used a Microsoft Project plan to upload the work items. Before my team started using it, I decided to forget about Project and just use the web/studio interface, so I used the query "Delete all items" to clean the database.
While the clean worked in all other cases, this report seems to be holding on to those items, and I would like to know if there is a way to fix that. It has been several weeks, and I ran the cube reports to see if it was updating (everything updates fine).
Anyone have a clue what's going on here?
I'm not familiar with the query that you talk about, but if you delete work items, the delete may not have been propagated to your warehouse (and subsequently the cube). If you have a relatively small number of work items in your TFSWorkItemTracking database, it may be a good idea to rebuild your TFSWarehouse, which will then refresh your cube.
Take a look at the SetupWarehouse.exe command, which should be installed on your application tier. This could take anywhere from an hour to a day to run, depending on the size of your version control and work item tracking databases, so you may want to run it off hours. It shouldn't affect the day-to-day execution of TFS, just the reports.
The above is for TFS 2008 only. Per Matthew below, here's the answer for TFS 2010:
From what I found, SetupWarehouse.exe no longer exists in TFS 2010. In the Administration Console, under Application Tier -> Reporting, there is an option called "Start Rebuild". Using this completely resolved my problem. Thank you. It should be noted that there is NO feedback from clicking on "Start Rebuild". At first it looked like the admin panel hung, then it came back without feedback. It took about an hour for reports to start working again, which is the only way I knew it was done.
If you ever get into the situation again where you need to permanently get rid of one or more work items, you should get the TFS Power Tools. The TFPT utility has a "destroywi" command that allows you to permanently (and safely) remove work items from TFS.
Power Tools are available here: http://msdn.microsoft.com/en-us/vstudio/bb980963

Having issues with the SSRS Reports site

So I'm working with SSRS (SQL Server 2005), which some of our applications use to generate downloads. The problem, though, is that the Reports website used to manage it seems to crash randomly. I haven't yet figured out a rhyme or reason to it - only that it will suddenly throw a 'Specified cast is not valid' exception, and any further attempts to do anything will fail with:
The item '/' cannot be found (RsItemNotFound)
Is there a place I can start looking to help me debug this issue? Are there logs that might have more in-depth information than the useless error messages I'm getting?
Is there anything in the standard event logs? You'll probably also find lots of logs to go through in C:\Program Files\Microsoft SQL Server\MSSQL.3\Reporting Services\LogFiles... that's the default path, at least.
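If wading through those by hand gets tedious, a short script can surface the relevant lines. A minimal sketch, assuming the default LogFiles path above (the MSSQL.3 instance folder is a placeholder for whatever yours is):

```python
import glob
import os

# Default SSRS 2005 log folder; adjust MSSQL.3 to match your instance.
LOG_DIR = r"C:\Program Files\Microsoft SQL Server\MSSQL.3\Reporting Services\LogFiles"

# Open the newest log file and print lines mentioning the errors seen.
newest = max(glob.glob(os.path.join(LOG_DIR, "*.log")), key=os.path.getmtime)
print(f"Scanning {newest}")
with open(newest, errors="replace") as fh:
    for line in fh:
        low = line.lower()
        if "rsitemnotfound" in low or "specified cast is not valid" in low:
            print(line.rstrip())
```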
We have a web app where I work that occasionally, after it's been left alone for a while, has to "recompile" and takes a few seconds before it'll show a page. I've noticed that SSRS does this at times, and it might or might not be causing your problem.
You could test this by setting up an instance of SSRS on your local machine (which you know no one else will use) and running your downloads against that.
Also, check and make sure no one is turning anything off or running backups while you are doing the downloads.