Associated dimension of a detail disappears after reload / save - business-objects

In BusinessObjects Web Intelligence (Webi), I want to merge two dimensions: one from the universe and one from an Excel file.
Query CC data (universe): Site
Query Site mapping table (Excel file): Phenix Mapping Key (Entity Name)
If I merge the two dimensions in the report, it works.
However, if I merge the two dimensions using intermediate variables, the "associated dimension" of the detail gets lost when I save, close and reopen the document (or when I refresh it).
When I create the variable, it works and the values appear in the table.
However, after saving, closing and reopening, the "associated dimension" disappears and there is a #DATASYNC error in the table (which is expected when there is no associated dimension).
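For reference, the intermediate-variable setup is roughly the following (a sketch only; the variable names are hypothetical, and a detail's associated dimension is set in the Webi variable editor, not in the formula itself):

Var SiteKey (Dimension) = [Site]
Var MappingKey (Dimension) = [Phenix Mapping Key]
Var EntityName (Detail, associated dimension: [MappingKey]) = [Entity Name]

The two dimension variables are merged, and the detail is expected to follow its associated dimension into the merged table.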
Is this a problem with my variables, or an SAP bug?
Version of SAP: SAP BusinessObjects BI Platform 4.2 Support Pack 3 Patch 2
Version of webi: 14.2.3.2220

Related

SSRS Document Map is gone after exporting report with cached reports

We are using Microsoft SQL Server Reporting Services Version 15.0.1102.932 and Microsoft Power BI Report Server Version 15.0.1108.153.
Some of our reports have a Document Map. When we export to Excel we don't want the Document Map. In the DocumentMapLabel we have logic like this:
=IIF(Globals!RenderFormat.Name = "EXCELOPENXML",Nothing,Fields!PAGR_ID.Value)
This works great. However, we have found an issue when caching is set to expire after X minutes or to always run from a history snapshot (option 2 or 3). After you export to Excel and then run the report again with the same parameters, the document map is gone.
It appears that when you export, the server fetches the data again so it can render it in the desired output format, and creates a new record in the SnapshotData table in ReportServerTempDB. I queried that table before running a report for the first time; after running the report I queried again and saw a new SnapshotData record, as expected. I ran the report several times and noticed the TransientRefcount field increasing with each run. Nice to know we can look at that and see how many times a cache has been accessed. I then exported to Excel, queried the table, and noticed a new record had been created. When I ran the report again with the same parameters, I saw that new record's TransientRefcount increase instead of the previous SnapshotData record's.
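For reference, the check amounts to something like the following query against ReportServerTempDB (a minimal sketch; the exact column list varies by version, but TransientRefcount is the one to watch):

SELECT * FROM ReportServerTempDB.dbo.SnapshotData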
Since our report turns off the document map when exporting to Excel, this new SnapshotData record doesn't have the document map in it. Now when you run the report again with the same parameters, the document map is gone, because the server is grabbing the most recent cached data.
Has anyone encountered this and found a way to fix it? A different way to turn off the document map?
I tried creating two tables, one with the document map and one without, where the table with the document map is hidden when the RenderFormat is Excel and the other is hidden when it's NOT Excel. But that didn't work: the cached version only displays the table without the document map. Which makes sense, since the most recent cached version doesn't have a document map and the renderer can't turn on something that isn't there.
One thought I had was to set Globals!RenderFormat.Name to "RDL" when the report first runs, before any parameters are entered, but I can't figure out how to do that. Maybe with some code? I've never used code in a report before, so I don't know how to do that.
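Another idea I have sketched but not tested (ExportToExcel here is a hypothetical hidden Boolean parameter defaulting to False): drive the label from a parameter instead of RenderFormat. Since cached instances are keyed by parameter values, an export run with ExportToExcel = True would get its own cache entry and would not replace the snapshot that interactive runs use:

=IIF(Parameters!ExportToExcel.Value, Nothing, Fields!PAGR_ID.Value)

The obvious cost is that users would have to set the parameter before exporting instead of relying on the render format.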
Any thoughts? Thanks very much!

Is there a way to publish main and sub SSRS reports using report builder?

Dear all,
I am quite new to Report Builder, and also to SSRS, so sorry in advance if I raise irrelevant questions. My goal, essentially, is to deploy a sort of SQL monitoring dashboard (I just uploaded a demo picture of it; see link 1), composed of a main report with some charts and toggled subreports. All these subreports are filtered by the SQL instance name, plus other parameters such as the number of monitoring days, months, etc.
I also use only shared datasets.
Now, if I deploy the whole solution with Visual Studio against my report server, it works perfectly and all the report parts are uploaded to the report site without any issue.
If I try to deploy the report using the report website manager (uploading the whole solution file by file), it does not upload anything, even if I create the data source first and then try to add the existing datasets.
If I try to deploy the solution with Report Builder (after giving it the target report server URL and an existing folder that starts with a slash), the only option I have is to open the solution as a file and save it against the configured target report server. But then the annoying error appears, even though the path has fewer than 260 characters and starts with a slash:
"Error: The path of the dataset "" is not valid. The full path must be
less than 260 characters long; other restrictions apply. If the report
server is in native mode, the path must start with slash.
(rsInvalidItemPath)"
What am I doing wrong? I thought that a tool like this, which is also meant for publishing, would make life easier, but apparently I don't know exactly how to use it.
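A note on the error itself: rsInvalidItemPath is raised when a referenced item path is not a valid catalog path. In native mode, the dataset reference stored in the report must be an absolute server path such as the first line below, not a project-relative file path such as the second (folder and dataset names here are made up for illustration):

/SQLMonitoring/Datasets/InstanceList
Datasets\InstanceList.rsd

The empty dataset name in the error message suggests the report still carries a reference whose path was never resolved to a server path.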

TF293000: The data warehouse has detected data conflicts for the following work item fields

Hi, I'm looking for help with the following issue:
In TFS, on our SSRS report server, whenever I run any of the out-of-the-box Sprint Burndown reports, the report seems to run successfully but I get the TF293000 error from the title in the bottom right-hand corner.
Through some research I found that the issue was due to the field definitions in that particular Collection not matching the other collections that we have in TFS. Simple...
In order to determine which field definition in the collection was the issue, I used the witadmin listfields command for all of my collections:
witadmin listfields /collection:Collection /n:Microsoft.VSTS.Common.ReviewedBy
This led me to find that the Synchronizes Identity Name Changes definition in the collection mentioned in the TF293000 error was set to true, while it is false in all of my other collections. Issue found! Should be easy from here... wrong.
The following command should solve my problem:
witadmin changefield /collection:Collection /n:Microsoft.VSTS.Common.ReviewedBy /syncnamechanges:false
*of course with the proper collection URL subbed in for the word Collection
However, when I run it and confirm that I want to make the change, I get the following error:
TF401327: The operation is not supported. The feature is obsolete.
I looked the error up and it took me to this page: TFS Known Issue, which tells me it's a known issue that was resolved in Update 1... we have Update 3.
I then attempted to simply edit the WIT .xml file and update the attribute for that WIT on that collection to false, but when I import the change to the server it tells me it has imported successfully; however, when I export it again I see that the file has not changed.
I have also tried copying the .xml file from the same WIT in another collection and uploading that to the offending collection, and that does not work either. I've never had an issue with uploading a WIT before, as we've made several changes to our TFS workflow. I'm pretty stuck at this point and just wondering if anyone else has experienced this issue before. Thanks!
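For reference, the export/import round trip I am using looks like this (ProjectName and Task are placeholders; as above, the real collection URL is subbed in for Collection):

witadmin exportwitd /collection:Collection /p:ProjectName /n:Task /f:Task.xml
witadmin importwitd /collection:Collection /p:ProjectName /f:Task.xml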
According to the error info, it seems there is a conflict in the TFS data warehouse, caused by two fields in different collections having different attributes; there is only one data warehouse. To avoid schema conflicts when you export and process data to the data warehouse databases, you must assign the same values to these attributes across all collections:
Field type (the value for this field cannot be changed for an existing field).
Reporting type.
Reporting name.
What you have done is the correct operation: change/update the attribute for the field in one project collection to match the assignments made in the other project collections.
You could try to narrow down the issue: does it happen only for that specific field in that team project collection? Are all other work item fields working correctly? Also give it a try in other collections, for example set syncnamechanges=true and then set it back to syncnamechanges=false, to see whether the same issue occurs.
Run the command line on the TFS server machine instead of your development machine, and clear the TFS cache. If the field is not used for reporting in those project collections, you could also try marking it as non-reportable (see the example after the links below). For more details, refer to these links:
Resolve data warehouse schema conflicts
Change a reportable attribute for a work item field
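For example, marking the field non-reportable would look roughly like this (only appropriate if nothing reports on the field; as in the question, substitute the real collection URL for Collection):

witadmin changefield /collection:Collection /n:Microsoft.VSTS.Common.ReviewedBy /reportingtype:none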

How do I combine the controls from 2 or more C++ MFC OCXs into 1 OCX?

It has been decreed that we will merge several (C++) OCXs, all of which use MFC as a static library, into one OCX, to save the download overhead of multiple copies of the C runtime. Management is afraid that the shared-DLL setting may cause an installation problem on some versions of Windows. I am unwilling to play "you bet your job" that they are wrong.
I have merged two, by turning one into a static library, linking it into the other, and adding the linked type library as a resource. I am attempting to load the linked control into an ASPX page by its GUID. IE finds the merged OCX (because I modified the registry). The combined OCX loads, but the linked control does not.
To FORCE it to load, I manually create a new instance in the OCX "application" code.
Now it won't receive commands from IE or send events. Creating the linked control's frame fails, but GetLastError returns 0.
I've tried other approaches to merge the controls; this is the first one to build and register.
How do I combine the controls from two or more OCXs into one?
How can I get some diagnostic information about my linked control's failure to create its frame?
What am I missing?
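On the diagnostic question, one approach (a sketch only, not the author's code; the GUID below is a placeholder for the linked control's CLSID) is to try creating the control directly through COM, outside IE, and inspect the HRESULT, which is usually more informative than GetLastError for COM creation failures:

#include <windows.h>
#include <cstdio>
#pragma comment(lib, "ole32.lib")
#pragma comment(lib, "uuid.lib")

int main()
{
    HRESULT hr = CoInitialize(nullptr);
    if (FAILED(hr))
        return 1;

    // Placeholder GUID: substitute the linked control's CLSID.
    CLSID clsid;
    hr = CLSIDFromString(L"{00000000-0000-0000-0000-000000000000}", &clsid);

    IUnknown* pUnk = nullptr;
    if (SUCCEEDED(hr))
        hr = CoCreateInstance(clsid, nullptr, CLSCTX_INPROC_SERVER,
                              IID_IUnknown, reinterpret_cast<void**>(&pUnk));

    if (FAILED(hr))
    {
        // Common values: 0x80040154 (REGDB_E_CLASSNOTREG) means the CLSID
        // registration is wrong; CO_E_DLLNOTFOUND means the OCX was not found.
        wprintf(L"Creation failed: hr=0x%08lX\n", static_cast<unsigned long>(hr));
    }
    else
    {
        wprintf(L"Control created successfully outside IE.\n");
        pUnk->Release();
    }

    CoUninitialize();
    return 0;
}

If creation succeeds here but fails inside IE, the problem is more likely the control's registration for the browser (for example the safe-for-scripting component categories) than the merge itself.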

SSIS adding new columns to the Source

I am new to SSIS and I have a problem. I get data from two different data sources. I am able to merge them using the Merge component and feed the output to a Script component, where I validate the data and move it to the destination. Everything is working fine. I want it to keep working when extra columns are added to either source.
The problem is that when I add extra columns to a source, I have to add the input columns in the Script component manually (checking the checkboxes under Input Columns). Is there any way to automate this?
Please help.
Try adding the new column to the source, then open the package and follow the flow through. You will have to go into subsequent components, including the Merge component, to refresh the metadata with this new field.
SSIS will prompt you with an exclamation mark at each stage where a refresh is mandatory (such as the source component), although you will have to manually step through the flow components (such as Merge) where a column output is optional.
For example, I added cost as a new column to a basic OLE DB source, and after updating the source component I have the opportunity to add it to the Script component as you mention - but it is not mandatory for me to do so.
The new field should be available in the Sort component even if it's not used within the Script component. You will, however, need to tick the field as a pass-through on the Sort component to get it into the Merge component.