RSTempFiles - Linking Them To A Specific Report

Is there any way to identify which report/subscription caused an RSTempFile to be written? We have had a few sizeable ones recently, and short of matching approximate file times with the runs in the execution log, it's hard to say exactly which report run caused it.
The file names all contain some sort of UniqueIdentifier, but it doesn't link back to a report or subscription ID.
C:\Program Files\Microsoft SQL Server\MSRS##.MSSQLSERVER\Reporting Services\RSTempFiles
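For what it's worth, the closest workaround we have is scripting that approximate time-matching: take each temp file's size and modification time and look for execution log entries in the same window. A rough sketch of the idea (the connection string, instance folder, and time window are placeholders; ExecutionLog3 is the standard execution-log view in the ReportServer database, and the match is only approximate):

```python
# Rough sketch: list RSTempFiles and show ExecutionLog3 runs that started around
# the same time. Connection string, instance folder and window are placeholders;
# this narrows down candidates rather than proving a link.
import os
from datetime import datetime, timedelta

import pyodbc  # assumes pyodbc and a SQL Server ODBC driver are installed

TEMP_DIR = r"C:\Program Files\Microsoft SQL Server\MSRS##.MSSQLSERVER\Reporting Services\RSTempFiles"
CONN_STR = "Driver={ODBC Driver 17 for SQL Server};Server=MYSERVER;Database=ReportServer;Trusted_Connection=yes"
WINDOW = timedelta(minutes=5)

conn = pyodbc.connect(CONN_STR)
cursor = conn.cursor()

for name in os.listdir(TEMP_DIR):
    path = os.path.join(TEMP_DIR, name)
    size_mb = os.path.getsize(path) / (1024 * 1024)
    mtime = datetime.fromtimestamp(os.path.getmtime(path))
    print(f"{name}  ({size_mb:.0f} MB, modified {mtime})")
    cursor.execute(
        """
        SELECT ItemPath, UserName, TimeStart, TimeEnd, ByteCount
        FROM dbo.ExecutionLog3
        WHERE TimeStart BETWEEN ? AND ?
        ORDER BY ByteCount DESC
        """,
        mtime - WINDOW, mtime + WINDOW,
    )
    for row in cursor.fetchall():
        print(f"   candidate: {row.ItemPath} by {row.UserName}, started {row.TimeStart}")
```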
Thanks,
Ralph

Related

When executing report part in Report Builder, The path of the item "~" is not valid. The full path must be less than 260 characters long

All,
Recently ran into this issue while working on Report Parts for a client. I have a utility that creates report parts in batches based on information you provide to it. This includes the paths for the datasets and data sources.
After creating the Report Parts and the corresponding Data Sources (all were just tablixes based on the views), I tried to run them in Report Builder 3.0. I got this error:
The path of the item '{Path to Report Part Here}' is not valid. The full path must be less than 260 characters long
At first I assumed that the path length was the issue, since it IS a long path. However, it was only around 160 characters. Not even close. When you expand the error message, which is essential, you get this:
...; other restrictions apply. If the report server is in native mode, the path must start with slash.
This ended up being the issue. The person who ran the utility incorrectly eliminated the leading forward slash in the path to the data sets and data sources. Something along these lines is on MSDN, but I wanted to have it here as well.
My suggestion would be to open the XML files in Notepad++ and replace the paths across all the files you have. You'll have to do this for both the report parts and their data sources, as far as I know.
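If there are a lot of files, the same replacement can be scripted. A minimal sketch, assuming the broken references are reference elements whose content is simply missing the leading slash (the folder path, file extensions, and element names here are examples; check one broken file first to see exactly what your utility wrote):

```python
# Rough sketch: prepend the missing leading "/" in report part and data source
# reference elements across a folder of files. Folder, extensions and element
# names are assumptions - inspect one broken file before running this.
import os
import re

ROOT = r"C:\ReportParts"                 # placeholder: folder with the generated files
EXTENSIONS = (".rsc", ".rdl", ".rsds")   # placeholder: file types to touch

# e.g. <DataSourceReference>Data Sources/Sales</DataSourceReference>
# becomes <DataSourceReference>/Data Sources/Sales</DataSourceReference>
pattern = re.compile(r"(<(DataSourceReference|SharedDataSetReference)>)(?!/)")

for dirpath, _, filenames in os.walk(ROOT):
    for filename in filenames:
        if not filename.lower().endswith(EXTENSIONS):
            continue
        path = os.path.join(dirpath, filename)
        with open(path, encoding="utf-8") as f:
            text = f.read()
        fixed = pattern.sub(r"\1/", text)
        if fixed != text:
            with open(path, "w", encoding="utf-8") as f:
                f.write(fixed)
            print("fixed", path)
```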

Crystal Reports Server - get list of currently running reports and their progress

Hope it's the right place to ask this question - usually I use SO to ask about programming...
I'm doing a project that involves Crystal Reports Server. From code, I'm able to schedule reports successfully, but when I look at the BI launch pad I don't see the report in My Recently Run Documents (I do see failed reports in that list - ones that have wrong database credentials).
When I go to the Central Management Console, find my reports in their folders and go to Properties > History, I see the report status as "Running" - and it has been like that for a long while (much longer than it should) for 2 different reports I have sent.
How can I diagnose what the problem is, and why it is stuck? There are no error messages anywhere about it.
How can I get a full history of all reports in the system (not just one single report at a time)? And how can I see currently running reports?
How can I stop a running report?
I really hope this is the right place for these kinds of questions... if not, I would be very happy to get a referral.
Thanks
How can I get a full history of all reports in the system?
Open the CMC and then click on the Instance Manager. At the bottom of the page, you can filter on the object type and status. That way, you can get a full overview of all running reports on your platform.
How can I stop a running report?
If you select a running instance (either in a document's history page or in the Instance Manager), you'll notice that there is no stop button. Instead, you have to delete the running instance. It might not stop running immediately though (depending on what it's doing), but it will be removed immediately from the list of instances.
How can I diagnose what the problem is?
What I would recommend is to enable tracing on all related servers (i.e. your Job Server, Processing Server, etc.) and then retry scheduling the report. This should generate additional logging on the server, which you can use to diagnose the issue.
The trace files have the extension .glf (generic log file) and are located in the logging folder on your Crystal Server. Have a look at the command-line property of each of the servers for which you're enabling the tracing; you should find a log folder there somewhere.
Make sure to turn the tracing off again as soon as you're finished, as tracing will not only create extra strain on your servers (causing the system to slow down), but it will also result in very large log files.
Before starting with tracing, have a look at the existing log files to see if it doesn't already contain error messages that might help you diagnose the issue. Sort the log files by date, and look at the most recent one for each of the servers involved. If there's nothing in there, start with tracing, but remove the existing .glf files to minimise log contamination (some files will be locked, just ignore them).
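If there are a lot of servers, a short script can pull out the newest .glf per server and any lines mentioning errors, something along these lines (the logging folder path and the assumption that the server name is the part of the file name before the first underscore are placeholders; adjust to whatever naming your installation uses):

```python
# Rough sketch: show the most recent .glf file per server prefix and any lines
# containing "error". Folder path and the prefix convention are assumptions.
import glob
import os
from collections import defaultdict

LOG_DIR = r"D:\BusinessObjects\logging"   # placeholder: your Crystal Server logging folder

newest = defaultdict(lambda: ("", 0.0))   # server prefix -> (path, mtime)
for path in glob.glob(os.path.join(LOG_DIR, "*.glf")):
    prefix = os.path.basename(path).split("_")[0]
    if os.path.getmtime(path) > newest[prefix][1]:
        newest[prefix] = (path, os.path.getmtime(path))

for prefix, (path, _) in sorted(newest.items()):
    print(f"== {prefix}: {os.path.basename(path)}")
    with open(path, errors="replace") as f:
        for line in f:
            if "error" in line.lower():
                print("  ", line.rstrip())
```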

SSIS ‘Data Flow Task’: No records in flat file destination

Please forgive my initial posting being a question instead of a solution.
I’ve got two SSIS packages that basically do the same thing. The last step of both is a ‘Data Flow Task’ that queries the database and attempts to write the results to a flat file. One of the packages builds the flat file correctly, the other builds the file but doesn’t populate it with any records. Running SQL Server 2008 R2.
This is in a university setting involving transferring degree_codes and demographics between two systems. The degree_code package is functioning, the demographics one isn’t. Both ‘Data Flow Tasks’ consist of an OLE DB Source linked to a Flat File Destination (tab-delimited text). Both packages display the correct data set when previewing the OLE DB Source.
In the Flat File Destination, the mappings are correct in both packages. However, when previewing the data, the degree details display correctly, but there are no records in the demographic preview. That’s also true when looking at the connection managers. And when the packages run, the degree_codes file is correct while the demographics file only contains a header. It seems there is a problem with the link between the OLE DB Source and the Flat File Destination.
Both packages run with only a warning about shared global memory impacting performance. I’ve deleted and rebuilt the non-functioning Data Flow Task and connection managers without fixing the problem. At this point I am at a loss as to which direction to go and don’t know how to diagnose the problem. Have any of you folks run into a similar situation, or do you have any suggestions on how to chase it down? I’d be grateful for any solutions.
Try to export the data to a temp table in your DB. If the data is saved there, the issue is with the file connection; if not, your query needs to be rewritten.
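Along the same lines, you can split the problem outside SSIS entirely by running the exact source query yourself and counting the rows: rows back means the query is fine and the problem is on the destination/connection side; no rows means the query itself (or the credentials it runs under) is the issue. A rough sketch with placeholder connection details:

```python
# Rough sketch: run the Data Flow's source query outside SSIS and count the rows.
# Connection string and query are placeholders - paste in the exact SQL used by
# the OLE DB Source so you are testing the same statement the package runs.
import pyodbc  # assumes pyodbc and a SQL Server ODBC driver are installed

CONN_STR = "Driver={ODBC Driver 17 for SQL Server};Server=MYSERVER;Database=MyDb;Trusted_Connection=yes"
QUERY = "SELECT * FROM dbo.Demographics"  # placeholder

conn = pyodbc.connect(CONN_STR)
rows = conn.cursor().execute(QUERY).fetchall()
print(f"{len(rows)} row(s) returned")
if rows:
    print("first row:", rows[0])
```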
Verify that the query columns you are executing against the tables match and that the data types are as expected in the output. Try setting them all as string types initially and check if it works; after it executes successfully, you can apply the correct data types as needed.

Deadlock on logging variable value changes using a SQL task

Morning
I've been reading "SQL Server 2008 Integration Services Problem - Design - Solution". It outlines a way of logging variable changes which I'm trying to replicate in SQL 2005.
Create variables e.g. PackageId, RecordsAffected. - Set Raise ChangeEvent to true.
Create a string variable, e.g. strVariableValue. - Set Raise ChangeEvent to false.
On the package event handler: OnVariableValueChanged add a script task "SCR Convert value to string".
Add ReadOnlyVariables: System::VariableValue
Add ReadWriteVariables: User::strVariableValue
In the script, set a local variable to System::VariableValue.value.tostring
Set the variable User::strVariableValue to the local variable
Add an "Execute SQL Task" component "SQL Log Variable Value Changed" calling a SP with no resultsets.
Set parameter mapping to User::PackageId, System::VariableName, User::strVariableValue
When this is run, I get a deadlock on User::PackageID
Error: 0xC001405B at SQL Log Variable Value Changed: A deadlock was detected while trying to lock variable "User::PackageID" for read access. A lock could not be acquired after 16 attempts and timed out.
The script step succeeds but the Execute SQL task fails. I'm using Visual Studio 2005 Version 8.0.50727.42, Microsoft SQL Server Integration Services Designer Version 9.00.4035.00 and BIDSHelper Version 1.4.3.0.
Any ideas?
Eureka!
I had the same problem, and it led me to a few dead-end posts before I discovered the root cause.
I had the framework working just fine and wanted to force some info to be logged.
So I changed the value of the framework variable "strVariableValue" and this caused the deadlock with the change event task.
I fixed it by creating my own variable, "strLogMe", and putting whatever I wanted to log in that instead.
Moral: don't touch the framework variables
Did you use the code sample from the book? All the files are available on the Wiley website for free. The code sample includes an SSIS package, SQL scripts, and VB code for the script task. If this doesn't work for you, then let me know, since one of my team members found a way to log variable changes that was different from this methodology.
I was suddenly getting this error ("a deadlock was detected", etc.), which seemed to coincide with I.T. having applied a Microsoft Windows patch on the server. The packages used script tasks, with read-only and/or read-write variables set in the SSIS UI. It seemed to be an environmental issue, because the packages had worked for months and then suddenly stopped working even though I hadn't changed any code. Various blog posts from years gone by described companies applying server patches and then having their SSIS packages break, and the advice was to change the way the variables are locked: don't reference them in the UI; instead, lock them explicitly in code. So I tried the same thing. It didn't fix it.
It turns out someone had removed the user under whose identity the packages run from the AD group; that membership was required because the package was trying to copy a file from a directory which required read permissions. These packages are typically called by a SQL Agent job using a proxy identity. When the package was executed manually from SSMS, it worked. But when it was run by calling the SQL Agent job, it failed.
The bottom line is that it was just coincidence that the packages started failing around the time of the Windows update. The other (main) point is: if your package is trying to access a file on the network, and the identity (or proxy identity) under which that package runs does not have permissions to the source or target directory, then your package can fail, and the problem can manifest itself in this cryptic way, where it looks like a variable deadlock issue but is actually a file share permissions issue. I only wasted a day on this, but... maybe this will be useful to somebody in the future.

How to fix SSIS: "Value does not fall within expected range"?

When I open up the solution that contains SSIS packages created by a colleague, I get this awkward error that tells me nothing about what I'm supposed to do to fix it.
He left instructions to take all the "variables" out of the connection string in the dtsx file manually before opening up the solution. I have done that; now when I try to view the package in the designer I just get an image of a red X and this message.
EDIT: You cannot see any design elements, no tabs across the top to switch to errors or data flows. Just a gray center area on the screen with a red X and the message; it's like Visual Studio dies in the process of reading the dtsx file.
The question is rather unspecific, so it’s of course difficult to get on the right track here. All of the given answers focus on different issues. I would say that PeterX had the best guess. The reason for the error could be as simple as a modified data source.
I came across the "error output has no corresponding output" bug quite often when adding a new column to a table that needs to be processed by an existing SSIS package. This bug came along with an error message saying that a "Value does not fall within the expected range".
A newly added column needed to be processed by an existing SSIS package. The expected behavior is that SSIS will recognize that there is a new column and offer it for selection on the Columns page of the OLE DB Source task. However, when opening the OLE DB Source task for the first time after having modified the table, I got the following error message twice: "Value does not fall within the expected range." The error message showed up when opening the editor and again when opening the Columns page of the editor. Within the Advanced Editor of the OLE DB Source task, the new column showed up in the OLE DB Source Output Columns tree, but not in the OLE DB Source Error Output Columns tree. This is the actual underlying problem behind the error message. Unfortunately, there seems to be no way to add the missing column manually.
To solve the problem, remove and re-add the newly added column on the Columns Page of the normal Editor as mentioned by Jeff.
It is worth mentioning that the data source of the OLEDB Source task was a modified MDS view. Microsoft Dynamics CRM – as mentioned in the related thread – is using views, too. That leads me to the conclusion that using views as a data source may produce either of the above-mentioned errors when modifying data types or adding/removing columns.
Related Thread: Error" ...The OLE DB Source.Outputs[OLE DB Source Output].Columns[XXXXXXXX] on the non-error output has no corresponding output
The described workaround refers to Visual Studio 2008 Version 9.0.30729.4462 QFE with Microsoft .NET Framework 3.5 SP1. The database is SQL Server 2008 R2 (SP2).
I had to delete and recreate the OLE DB Data Source in my Data Flow - this is where I got the error. I also noted I had to "re-select" the "OLE DB connection manager" in the drop-down list to force it to recognise the new connection.
This was probably a combination of getting the solution from TFS (where I noticed the data sources didn't come across properly and it complained about a missing connection GUID) and/or copying and pasting the elements from another package.
(For BIDS 2008).
I had this issue for my OLE DB Source component with an SQL command after adding new columns to the database, and it wouldn't let me select columns or anything else to add the new columns.
I'm working with an Oracle database, and the only way I could get it to update was to change the SQL query to select 1 from dual, and preview it. Then revert it back to my old query.
You get a similar message if someone uses EncryptAllWithUserKey as the ProtectionLevel. However, I believe the message is slightly different (even though you get a grey design surface with a red X).
Have you tried viewing the file in Notepad? Is it just a series of GUIDs or is there anything in it that is humanly readable? If it doesn't have any readable code, then it was probably encrypted with the user key.
If the employee deployed the packages to a server and used SQL Server as the deployment destination (not File System or SSIS Package Store), then you can download the packages to your machine. Just connect to the SQL Server Integration Services engine, expand Stored Packages, expand MSDB, expand the relevant folder, right-click on the package, and click Export Package. Save the file on your local machine and open it. The package will probably lose annotations and pretty formatting, but otherwise it should be identical to what the employee deployed.
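If SSMS isn't available, the same packages can also be pulled out of msdb with a script, roughly like this (server name and output folder are placeholders; msdb.dbo.sysssispackages is where SQL Server 2008+ keeps packages deployed to the SQL Server store, while SQL Server 2005 uses sysdtspackages90 instead - and packages saved with a user-key protection level will still be unreadable on another machine):

```python
# Rough sketch: dump packages from the msdb (SQL Server) package store to .dtsx
# files. Server and output folder are placeholders; on SQL Server 2005 query
# msdb.dbo.sysdtspackages90 instead of sysssispackages.
import os
import pyodbc

CONN_STR = "Driver={ODBC Driver 17 for SQL Server};Server=MYSERVER;Database=msdb;Trusted_Connection=yes"
OUT_DIR = r"C:\ExportedPackages"

os.makedirs(OUT_DIR, exist_ok=True)
cursor = pyodbc.connect(CONN_STR).cursor()
cursor.execute("SELECT name, packagedata FROM dbo.sysssispackages")
for name, data in cursor.fetchall():
    out_path = os.path.join(OUT_DIR, f"{name}.dtsx")
    with open(out_path, "wb") as f:
        f.write(data)  # packagedata holds the raw package XML
    print("exported", out_path)
```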
I just struck the same issue. After flailing about for a bit, I found the solution was to edit the Solution Configuration.
The Solution Configuration appeared to have a matching Project configuration.
However, clicking the drop-down arrow for that Project (SSIS-Advance in this example) revealed that there was no Project Configuration for that project called Production - Sub Reports. I'm not sure how that came about - this Solution has a 7-year history and many developers.
Anyway once I created a New Project configuration (using that same drop-down menu), it is all happy now.
If it has Oracle data sources, you may need to install the Microsoft Connectors v4.0 for Oracle by Attunity:
https://www.microsoft.com/en-us/download/details.aspx?id=52950
I also had to use VS 2015 - the version originally used to create the project and package.
I had this exact problem and installing these connectors and using VS 2015 fixed the issue.
I had this occur as well when I tried to call a stored procedure with OUTPUT parameters with OLE DB.
I found this: http://sqlsolutions.blogspot.com/2013/04/ssis-value-does-not-fall-within.html, which resolved my issue. The relevant action was to rename the SSIS parameter mappings to '0', '1', etc.
So, for example, when calling dbo.StoredProc @variable0 = ?, @variable1 = ? OUTPUT, @variable2 = ?;, in the parameter mapping dialog you would name the parameters '0', '1', '2' to correspond to those. Ah, SSIS <3
I get this when I do not follow the convention for parameter naming, e.g. not naming parameters 0, 1, 2, ... in the right order for OLE DB connections.
The details are documented here.
In your connection manager, convert your connections to package level instead of project level
Deleting the connection manager, re-creating it, and setting up the SSIS package again solved the problem.
I got this issue after using Add Existing Connection Manager in an SSIS project. I was just importing a Project Connection Manager from a different project (.conmgr) into my project. My solution to fix the issue was:
Deleting the imported .conmgr
Recreating it from scratch