I have a report published to the Power BI Service that had been running fine, but the Power BI dataset started failing with the error below:
{"error":{"code":"Premium_ASWL_Error","pbi.error":{"code":"Premium_ASWL_Error","parameters":{},"details":[{"code":"Premium_ASWL_Error_Details_Label","detail":{"type":1,"value":"Refresh is not supported for datasets with a calculated table or calculated column that depends on a table which references Analysis Services using DirectQuery."}}],"exceptionCulprit":1}}}
Strangely, the AAS model is not in DirectQuery mode when I check it in Visual Studio (attached image below).
The refresh works fine in Power BI Desktop.
Wondering if anyone can give some guidance on the error and how to fix this.
TIA
We're gearing up for a potential cloud server migration, as our in-house server is aging and its capabilities are limited compared to current technology. However, one constraint I'm looking at is the problem of migrating our Power BI Report Server reports.
In the past, I was able to migrate from SSRS to Power BI Report Server with the help of a script that automatically backs up all reports on the server while preserving the same folder structure they had on the server.
I've done some extensive research, but I haven't seen any similar approach for Power BI, and I was wondering if anyone else has encountered the same problem and/or has a solution for it.
I found a question posted here on Stack Overflow that is centered on SSRS but also works for Power BI Report Server. You just need to install this PowerShell extension and fire away with the script.
The only issue is that it only backs up paginated reports (.rdl), not .pbix (graphical) reports. A possible workaround for the .pbix files is sketched below.
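For the .pbix files, the same ReportingServicesTools PowerShell module has REST-based cmdlets that can download Power BI reports as well. This is only a minimal sketch, assuming the module is installed and that the portal URL and backup path below (both placeholders) are adjusted to your environment:

# Sketch: back up .pbix and other catalog items from Power BI Report Server,
# preserving the folder structure, via the module's REST cmdlets.
Install-Module -Name ReportingServicesTools -Scope CurrentUser
$portal      = 'http://localhost/reports'   # placeholder: your web portal URL
$destination = 'C:\ReportBackup'            # placeholder: local backup target
# Download everything under the root folder, recursing into subfolders.
Out-RsRestFolderContent -ReportPortalUri $portal -RsFolder '/' -Destination $destination -Recurse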
I am using Visual Studio 2019 with the Microsoft Reporting Services Projects extension v2.6.7. The problem I am facing is that I have a report that processes about 60k records; the report is complex and has groups, repeated headers, dataset filters, and VB code.
The stored procedure used for this report runs in less than 10 seconds, and when the report is deployed to the Report Server, it finishes rendering in less than 2 minutes. But when I run the same report in Visual Studio in preview or run mode (Report Viewer), it runs for a whopping 17-20 minutes. Using SQL Profiler, I can see that the stored procedure execution time is almost the same as the report execution time. The stored procedure is designed to handle parameter sniffing, and I don't see any issue with the procedure itself.
On the report side, I have tried the settings that could impact performance (KeepTogether=false, interactive size, etc.). They look fine.
I also tried adding WorkingSetMaximum to increase memory, but still no luck. The client I am working with requires the RDLC file to be integrated into their app and does not want to deploy to a Report Server, for their own reasons.
How can I make my report run faster in Visual Studio preview mode / Report Viewer (run mode), so that it matches the performance I get on the Report Server?
Also, can anyone tell me whether there is a difference in how report rendering works on the Report Server vs. in preview mode?
Edit 1 - The Report Server and the database are both configured on my laptop, so there is no configuration difference between the two environments.
Edit 2 - Another observation from running SQL Profiler: in preview mode the connection is kept open, and the data retrieval time accounts for the whole report run time; the two are the same. But when I run the report through Report Manager from the same machine, the procedure completes in seconds and the report renders faster. As mentioned above, I have taken care of parameter sniffing. I am now trying to understand whether the SSRS engine treats report rendering and data retrieval differently in preview mode than when the report is deployed to the Reporting Service.
I came across this Q&A discussion on MSDN. I replicated it, and it gave me a fix: changing the trust level for CAS in the config file. But I still have a question about how the Report Viewer in Visual Studio behaves: is there a similar setting in the application config that can be used to improve development and test performance in Visual Studio?
MSDN Blog
Use the existing framework but force the use of legacy CAS (code access security) security:
In WinForms, in app.config under <runtime>: <NetFx40_LegacySecurityPolicy enabled="true" />
In an ASP.NET application, in web.config under <system.web>: <trust legacyCasModel="true" level="Full" />
I am new to Power BI. I need to create a dashboard and publish it so that users without the desktop app can still access it.
The table I am trying to retrieve statistics from is very large (400 MM rows). I can write queries that use a parameter to filter the results down to a much smaller number (2 MM rows). I need to create a link in the application that takes the user to the online dashboard showing the filtered results (retrieved based on the search parameter), which should be refreshed every time a user accesses the dashboard.
A few questions I have are:
1) Are 2 MM rows already too much data for Power BI Online to manage?
2) I know parameters are available in Power BI Desktop, but I read that if I publish a report with a parameter, it will use the parameter value that was set at publish time, and changing this parameter from the URI or per request won't be possible. Is this accurate?
3) Am I better off creating an SSRS report instead of querying the database directly from Power BI? And if I do use SSRS, is there any value in surfacing the report through a Power BI dashboard, rather than embedding the SSRS report in the application?
The database technology I am currently using is Oracle, but I am going to migrate to SQL Server (either on-premises or in the cloud; I haven't decided yet).
Thanks in advance!
It seems like you have several requirements that are difficult to combine. Power BI can easily handle 400 MM rows and deliver good, responsive interactivity.
You could easily schedule a refresh of this data a couple of times per day, but if you want it to refresh whenever a user accesses the report, that will cause some waiting, even if it is only loading 2 million rows.
Perhaps a solution is to have “near real time” data in Power BI, containing all 400 million rows, and “live” data available through SSRS, which will be less responsive and less interactive.
Migrating to SQL Server later might allow you to use DirectQuery, which would give you live data (aggregated by the database engine) in Power BI.
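If you do end up needing an on-demand refresh (for example, triggered when the user opens the application), the Power BI REST API can queue one. A minimal sketch, assuming the MicrosoftPowerBIMgmt PowerShell module and placeholder workspace and dataset IDs that you would substitute with your own:

# Sketch: queue an asynchronous dataset refresh through the Power BI REST API.
Install-Module -Name MicrosoftPowerBIMgmt -Scope CurrentUser
Connect-PowerBIServiceAccount                 # interactive sign-in
$groupId   = '<workspace-id>'                 # placeholder: workspace (group) ID
$datasetId = '<dataset-id>'                   # placeholder: dataset ID
# POST to .../refreshes queues a refresh; it returns before the refresh completes.
Invoke-PowerBIRestMethod -Method Post -Url "groups/$groupId/datasets/$datasetId/refreshes"

Keep in mind that refreshes on shared capacity are rate-limited by the service, so triggering one per user visit may not be viable.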
In Microsoft Visual Studio, when I run a report in preview, the data shows correctly, but when I run the same report from the server side, it does not show the same data as the Visual Studio preview.
Please advise.
Visual Studio caches data locally in order to reduce generation time during design. It only updates the data if you change the parameter values, so if your report doesn't have any parameters, you will get stale data.
To refresh the data in Visual Studio, preview the report and press the refresh button inside the generated report. You can also clear the cache files directly, as sketched below.
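The cached preview data is stored in *.rdl.data files alongside the reports in the project folder, and deleting them forces fresh data retrieval on the next preview. A minimal sketch, with a placeholder project path:

# Sketch: clear Visual Studio's cached SSRS preview data.
# The *.rdl.data files hold the datasets cached for report preview.
$projectPath = 'C:\Projects\MyReports'   # placeholder: your report project folder
Get-ChildItem -Path $projectPath -Filter '*.rdl.data' -Recurse | Remove-Item -Verbose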
Let me know if this helps.
I had a very similar problem. When I previewed in VS 2015, everything displayed perfectly, but when I deployed to my server successfully, one field wasn't showing up.
What ended up working was changing my report datasets from shared to embedded. Even though they referenced the same query, somehow there was a disconnect when referencing the shared dataset rather than embedding it directly in the report.
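Before switching everything to embedded datasets, it may be worth checking what the deployed report actually points at. A sketch, assuming the ReportingServicesTools module and its Get-RsItemReference cmdlet (the report path is a placeholder):

# Sketch: list the shared dataset and data source references of a deployed report.
Install-Module -Name ReportingServicesTools -Scope CurrentUser
Get-RsItemReference -Path '/MyFolder/MyReport'   # placeholder report path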
I can see the table/matrix in preview, but not in the deployed report on the SSRS server. I have rebuilt it several times with no luck. I am using a shared database and shared datasets for the report. Is that causing this issue? Any solutions?