Method not found: Microsoft.AnalysisServices - ssis

I have created an ETL setup for a data warehouse with SSIS packages.
Everything is working fine until the very last step, which is an "Analysis Services Processing Task".
Whenever I add my cube in the task editor and press OK, I get the following error:
"Method not found: 'Void Microsoft.AnalysisServices.Commands.ProcessCommand.set_Type(Microsoft.AnalysisServices.ProcessType)'."
I suspect there is some issue with a DLL, but I'm not sure which.
I found a Microsoft.AnalysisServices.dll under my SQL Server install (C:\Program Files (x86)\Microsoft SQL Server\120\SDK\Assemblies)
I did not find it in my Visual Studio installation folders.

I was able to find a workaround.
I was trying to reproduce the problem with the AdventureWorks dataset, to verify that the problem was occurring due to the .dll and not my cube or anything else.
So I created a new OLTP & DWH with a cube from the AdventureWorks dataset and created a new SSIS project where the only step was to process the cube.
With this setup I did not get the same error I got in the original project, so seemingly there was nothing wrong with the .dll?
However, I also tried changing the target server and cube to my original ones, and to my surprise it worked!
So I saved that package, imported it into my original project, executed the package from there, and it works.
EDIT: Please also see Pavel's possible solution.
Strange thing: if we create a new SSIS project in Project deployment mode, leave the 2017 version, and deploy it on our 2016 SSIS services, everything works perfectly fine. So... we just need to migrate to Project deployment mode ))) – Pavel Botygin

We have the same problem.
One interesting thing: you can try switching your project compatibility to SQL Server vNext, then create your processing task normally, clicking OK and everything else successfully (what a miracle!), then switch back to the desired version and deploy.
But if you have Script Tasks in the same package, then you can try another workaround (the one I'm actually using now): use a Script Task to populate a variable (User::DimensionsProcessingCommand in my example) for an "Analysis Services Execute DDL Task". It's a slightly more complicated way, but very useful in the future.
public void Main()
{
    // Requires a reference to Microsoft.AnalysisServices (AMO) and, at the top of the script,
    // using System.Collections.Generic; and using Microsoft.AnalysisServices;
    Boolean tstFire = false;
    Microsoft.AnalysisServices.Server myServer = new Microsoft.AnalysisServices.Server();
    // Get the connection to the SSAS database from the package
    ConnectionManager myConn = Dts.Connections["SSAS"];
    // Template for the resulting XMLA batch
    String ProcessingCommandTemplate = "<Batch xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\"><Parallel>XXXXXXX</Parallel></Batch>";
    String myProcessingCommand = "";
    // Dictionary for gathering dimensions without duplicates
    Dictionary<Dimension, Cube> amoDimDictionary = new Dictionary<Dimension, Cube>();
    String myServerName = myConn.ConnectionString;
    String myDatabaseName = myConn.Properties["InitialCatalog"].GetValue(myConn).ToString();
    // Connect to the SSAS server instance
    myServer.Connect(myServerName);
    Database amoDb = myServer.Databases.FindByName(myDatabaseName);
    // Get all dimensions used in cubes
    foreach (Cube amoCube in amoDb.Cubes)
    {
        foreach (CubeDimension amoDimension in amoCube.Dimensions)
        {
            if (!amoDimDictionary.ContainsKey(amoDimension.Dimension))
            {
                amoDimDictionary.Add(amoDimension.Dimension, amoCube);
            }
        }
    }
    // Start XML capture; the Process calls below are only captured, not executed
    myServer.CaptureXml = true;
    foreach (Dimension amoDimension in amoDimDictionary.Keys)
    {
        if (amoDimension.State == AnalysisState.Unprocessed)
        {
            amoDimension.Process(ProcessType.ProcessFull);
        }
        else
        {
            amoDimension.Process(ProcessType.ProcessUpdate);
        }
    }
    myServer.CaptureXml = false;
    // Build the command from the captured XMLA log
    foreach (String strXML in myServer.CaptureLog)
    {
        myProcessingCommand = myProcessingCommand + strXML.ToString();
    }
    myProcessingCommand = ProcessingCommandTemplate.Replace("XXXXXXX", myProcessingCommand);
    Dts.Variables["User::DimensionsProcessingCommand"].Value = myProcessingCommand.ToString();
    // Command output, visible at runtime from VS 2015
    Dts.Events.FireInformation(1, "", Dts.Variables["User::DimensionsProcessingCommand"].Value.ToString(), "", 0, ref tstFire);
}
P.S.
Our DEV machine has a from-scratch install of SQL Server 2016, Visual Studio 2015, and SSDT 17.1.
When we were trying to develop some SSIS packages under SQL Server 2016 compatibility, we stumbled on so many problems that we just stopped counting them. We googled GAC fixes and tweaked it back and forth without any result.
The 14.0 Microsoft dev environment seems so buggy and... just broken if you try to create something targeting version 13.0 and lower.

This MS Forum post has the following advice.
If it exists, cut and paste the following folder from the GAC to somewhere else.
C:\Windows\Microsoft.NET\assembly\GAC_MSIL\Microsoft.AnalysisServices.DeploymentEngine
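For example, from an elevated command prompt (the destination folder below is just an arbitrary backup location):
move "C:\Windows\Microsoft.NET\assembly\GAC_MSIL\Microsoft.AnalysisServices.DeploymentEngine" "C:\Temp\Microsoft.AnalysisServices.DeploymentEngine"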
Then, rebuild the project.
Right click on the project
Click Properties
Expand Configuration Properties -> select Deployment
For Target Server, enter <Server Name>\<SSAS Instance Name> (make sure the SSAS server is a Multidimensional SSAS instance)
Click Apply
Save the project and restart the IDE (SSDT)
Start SSDT, select the project, and Rebuild Solution / Rebuild Project.
Test again.

This seems to be because of a deployment target version that is not supported by the VS 2015 components.
Go to Project -> Properties
In Configuration Properties -> General
Set TargetServerVersion to SQL Server 2017.

Related

The given key was not present in the dictionary - Visual studio SSDT Deploy

I have an SSDT project in Visual Studio 2017 and multiple reports working off a shared data source. I can build all the reports and also preview them, with the ability to refresh the data. However, when I click Deploy I receive the error message:
Error : The given key was not present in the dictionary.
I have checked the project deployment details (e.g. target server details, Target Report Folder, etc.) and they are all correct. Could anyone advise the best way to debug this error?
I had exactly the same problem using Microsoft Reporting Services 1.17, installed from Extensions and Updates in VS 2017 15.3.5. I debugged with Visual Studio and decompiled with JetBrains dotPeek, which pointed to an error in Microsoft.ReportingServices.BuildProcess.dll. Connection properties were being looked up by URL in a dictionary. As far as I can see, this dictionary never gets populated.
The solution was to uninstall the extension and install SSDT for Visual Studio 2017 (15.3.0 preview).
Thanks for everyone's feedback. I found the bug and will release a fix with the next version of the Reporting Services VSIX.
Thanks,
Matt
There is an .rdl.data file in the report project directory next to your .rdl file. Close Report Designer, delete that file, and try to preview again. Hope this helps.
I came across the same issue. In my case there were no problems when I deployed my report and accessed it through the URL, but when I tried to access it programmatically the browser raised the error 'The given key was not present in the dictionary'. I investigated and found that I had not set the visibility flag to 'false' in the code.
https://msdn.microsoft.com/en-us/library/microsoft.reporting.webforms.reportparameter.aspx
ReportParameter(String, String, Boolean)
Instantiates a new ReportParameter with a name, a value, and a visibility flag.
Finally, my code looked something like this:
this.rptViewer.Reset();
rptViewer.ProcessingMode = Microsoft.Reporting.WebForms.ProcessingMode.Remote;
// Custom credentials for the report server
MyReportServerCredentials credencial = new MyReportServerCredentials();
rptViewer.ServerReport.ReportServerCredentials = credencial;
this.rptViewer.ServerReport.ReportServerUrl = new System.Uri(this.ReportServerUrl);
this.rptViewer.ServerReport.ReportPath = ReportUrl;
// The third argument is the parameter visibility flag
ReportParameter p1 = new ReportParameter("param1", new string[] { null }, false);
ReportParameter p2 = new ReportParameter("param2", new string[] { null }, false);
ReportParameter p3 = new ReportParameter("param3", new string[] { null }, false);
ReportParameter p4 = new ReportParameter("param4", new string[] { null }, false);
this.rptViewer.ServerReport.SetParameters(new ReportParameter[] { p1, p2, p3, p4 });
this.rptViewer.AsyncRendering = false;
this.rptViewer.SizeToReportContent = true;
this.rptViewer.ServerReport.Refresh();
I hope it helps!

Crystal Reports (runtime 13.0.2) not releasing ODBC connection to MySQL server

Using Crystal Reports for .NET runtime 13.0.2 with MySQL Server. The report is designed with an ODBC connection. When exporting the report to PDF, the connection to MySQL Server is not released. Searching the net, I found some suggestions that closing the report document will release the connection, but whatever I try doesn't seem to work. I went as far as going through each table, disposing the objects, and closing connections on the main report and subreports.
foreach (Table table in _reportDocument.Database.Tables)
{
    table.Dispose();
}
// Now loop through all the sections and their objects to do the same for the subreports
foreach (Section section in _reportDocument.ReportDefinition.Sections)
{
    // In each section we need to loop through all the reporting objects
    foreach (ReportObject reportObject in section.ReportObjects)
    {
        if (reportObject.Kind == ReportObjectKind.SubreportObject)
        {
            var subReport = (SubreportObject)reportObject;
            ReportDocument subDocument = subReport.OpenSubreport(subReport.SubreportName);
            foreach (Table table in subDocument.Database.Tables)
            {
                table.Dispose();
            }
            subDocument.Database.Dispose();
            subDocument.Close();
            subDocument.Dispose();
        }
    }
}
_reportDocument.Database.Dispose();
_reportDocument.DataSourceConnections.Clear();
_reportDocument.Close();
_reportDocument.Dispose();
_reportDocument = null;
Here's what we are running: MySQL 5.6.20, MySQL ODBC 5.3 ANSI driver, ASP.NET 4.0, and Crystal Reports 13.0.12 64-bit on Windows Server 2008 R2 64-bit; the system DSN is a 64-bit DSN.
Is it an issue with the ODBC driver?
Digging more into the Crystal Reports support forum, it turns out that it's a bug they introduced into Crystal Reports. See the detailed discussion here:
http://scn.sap.com/thread/3664672
Hope anybody else having the same issue with Crystal Reports will stop pulling their hair out (for now).
Switching back to 13 SP 10 fixed the issue! Please stay away from SP 12.

How to override a dtsx connection setting in a SQL Server job step

I have a Web Service Task in a dtsx package developed in Visual Studio. It has an HTTP connection with a Timeout setting of 30 seconds. The package is included as a step in a SQL Server (2008 R2) Agent job. When I deployed the package, I set it up to be stored in SQL Server.
I would like to be able to change just the Timeout setting in the SQL job step, but I'm not sure how to do this or even if it's possible. At the moment I'm changing the setting within VS and then redeploying the package each time.
Can anyone give me any help on how to do this? Which tab of the job step should this be set on?
One thing to be aware of: there is the Timeout property on the HTTP Connection Manager, but that controls the actual connection to the web service. It does not control the time allowed for invoking a particular method, if that makes sense.
I had a 2005 package that consumed a web service for cleaning addresses. The web service was hosted internally, so the HTTP connection was at LAN speeds; no issue there. The service itself could standardize one address pretty quickly. When I need to bulk clean a few hundred thousand, it takes a not-insignificant amount of time. The XML task has a built-in, as of 2008 R2, unchangeable default timeout of 6 minutes. That's not so handy if you need it to be 3601 seconds or to never time out. I'm having trouble finding documentation calling that out, but you can verify the behaviour by ginning up a service that sleeps for 6+ minutes.
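For illustration, a minimal sketch of such a test service (an ASMX-style web method; the class name, method name, and 7-minute sleep are purely illustrative):
using System;
using System.Threading;
using System.Web.Services;

// Hypothetical service used only to demonstrate the ~6 minute task timeout
public class SleepyService : WebService
{
    [WebMethod]
    public bool SlowOperation()
    {
        // Sleep past the 6-minute mark so the calling task times out
        Thread.Sleep(TimeSpan.FromMinutes(7));
        return true;
    }
}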
Our resolution was to use a script task to handle the actual service call so that we could override the Timeout property for the service call.
Public Sub Main()
    Dim url As String
    Dim inboundFile As String
    Dim success As Boolean
    Dim timeoutMs As Integer

    ' 1 hour = 60 min * 60 sec * 1000 milliseconds
    timeoutMs = 60 * 60 * 1000
    inboundFile = CStr(Dts.Variables("NetworkShareInput").Value)
    url = CStr(Dts.Variables("WebService").Value)

    Try
        Dim svc As New AddressCleanerService(url)
        ' Explicitly provide a timeout for the web service connection
        svc.Timeout = timeoutMs
        svc.Credentials = System.Net.CredentialCache.DefaultCredentials
        success = svc.CleanBulkAddresses(inboundFile)
    Catch ex As Exception
        Dts.Events.FireError(0, "Address cleaning", "Something failed in the address component stuff", String.Empty, 0)
        Dts.Events.FireError(0, "Address cleaning", ex.ToString(), String.Empty, 0)
    End Try

    If (success) Then
        Dts.TaskResult = ScriptResults.Success
    Else
        Dts.TaskResult = ScriptResults.Failure
    End If
End Sub
One way to do it is to use expressions and pass the timeout value from the SQL Agent job. Below are the high-level steps:
Create a variable in your SSIS package to hold the timeout value.
In the Properties window of the HTTP connection, click the Expressions ellipsis button.
Expand the Property dropdown in the Property Expressions Editor and select Timeout.
Use the timeout variable you created earlier as the expression, something like: @[User::Timeout]
In SQL Agent, use a command-line job step type and use dtexec to execute the SSIS package.
In the dtexec command you can pass values to variables. Below is a command example:
dtexec /f C:\SSIS\Package.dtsx /set \package.variables[Timeout].Value;45
So, when you want to change the timeout value, simply change it in the SQL Agent job instead of redeploying the package.
First, if you still have control over the source code, I would point you to package configurations. Then you can edit these settings in an XML file or a data table.
Assuming you don't, you can push some values into the package using the "Set Values" tab of the job step. The hard part is getting the property path correct. Again, using Visual Studio and the package configurations feature, you should be able to find the right name.
Try this for the property path: \Package.Connections[myHttpConnection].Properties[Timeout].Value
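For example, an entry in the job step's Set Values tab might then look like this (3600 is just a placeholder value; the connection name comes from your package):
Property Path: \Package.Connections[myHttpConnection].Properties[Timeout].Value
Value: 3600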

SSIS Package won't do anything when Run through SQL Server Job

I have a simple SSIS Package, which has
an Execute SQL Task on the Control Flow, which fetches some value from the database
In the Data Flow, I am using a Script Component which, based on the values given by the Execute SQL Task, does this:
public override void CreateNewOutputRows()
{
    try
    {
        string loginURL = "http://maps.googleapis.com/maps/api/geocode/xml?address=" + Variables.ProjectAddress + "&sensor=true";
        WebClient client = new WebClient();
        string downloadString = client.DownloadString(loginURL);
        XmlDocument xml = new XmlDocument();
        xml.LoadXml(downloadString);
        // ... setting output buffer variables
    }
    catch (Exception ex)
    {
        // NOTE: any exception is silently swallowed here
    }
}
So basically I am requesting a web service for latitude and longitude inside the package.
The retrieved values are then updated in the database.
Everything works fine when I run the package from the Visual Studio SSIS project console.
But when I try to run the package through a SQL Server 2008 R2 job, nothing happens. The job executes successfully, but no rows are updated (or inserted).
I tried importing the package into the SQL Server MSDB and setting the protection level to each of the items in the dropdown, one by one, as given here
...and then running this imported package from the SQL job. Still... nothing happened.
Does anyone know what's wrong? How do I deal with the following possibilities:
1. It has to do with the permissions of the SQL user to make a web service request. How do I configure that?
2. It has to do with the configuration file of the imported SSIS package. What should I look for?
Help me out please.
I hope I have given all the info required to look into the problem.
Is the job on a SQL Server instance on your computer? I ask because there may be firewall or permission issues between the SQL Server and the computer hosting the web service.
Also, I advise removing that try/catch and enabling package configurations so you can see if it is throwing an error.
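If you prefer to keep the try/catch, a minimal sketch of surfacing the failure instead of swallowing it (this assumes the Script Component's ComponentMetaData.FireError event; the subcomponent name is illustrative):
catch (Exception ex)
{
    // Surface the failure to the package/job log instead of swallowing it
    bool cancel;
    ComponentMetaData.FireError(0, "Geocode Script Component", ex.ToString(), string.Empty, 0, out cancel);
    throw;
}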
Regarding the protection level: if you are using EncryptSensitiveWithUserKey, the package won't load the sensitive database information (login and password) unless it runs on the computer where you developed it. The same thing applies to EncryptAllWithUserKey, but in that case it won't even open the package.

SSRS engine

This is regarding the SSRS engine. For my project we are not going to buy SQL Server. Instead, I just want to know: is there any possibility of using only the SSRS engine, so that I can have a reference (DLL) and use it wherever I want?
Please help with this.
If we don't consider the technical aspects of getting SSRS to run without SQL Server, you still need SQL Server for the license. SSRS is an integrated part of SQL Server, not a standalone component.
So no, without buying SQL Server licenses, you cannot use SSRS, even if you were technically able to do it, which I doubt anyway.
SQL Server Express, the free one, comes with Reporting Services. No need to shell out for a full blown SQL Server.
http://www.microsoft.com/Sqlserver/2005/en/us/express.aspx
The SQL Server Reporting Services engine is not just a DLL, so it will not be as simple as adding a reference.
You typically have to deploy reports to a web server and invoke the reports through there.
I'd advise you to look here: http://en.wikipedia.org/wiki/SQL_Server_Reporting_Services
You can create local reports that are not stored in a database if that's what you mean.
http://msdn.microsoft.com/en-us/library/ms252067(v=vs.80).aspx
SSRS needs SQL Server, as the ReportServer and ReportServerTempDB databases must be hosted on SQL Server. SQL Agent is also used by the SSRS service:
https://msdn.microsoft.com/en-us/library/ms156421.aspx
You could try PowerBI as an alternative solution for your project if you are not going to purchase SQL Server software licenses:
https://powerbi.microsoft.com/en-us/
You need to create local reports (.rdlc) using BIDS in Visual Studio. To render these reports, you need to write code to pass parameter and dataset values. You need to add a reference to Report Viewer and use LocalReport to call the Render method. It returns a byte array of the rendered report.
using (Microsoft.Reporting.WebForms.LocalReport report = new LocalReport())
{
    string appPath = AppDomain.CurrentDomain.BaseDirectory;
    report.ReportPath = System.IO.Path.Combine(appPath, "Reports", RDLName.ToString());
    foreach (KeyValuePair<string, object> dataSource in ReportDataSource)
    {
        report.DataSources.Add(new ReportDataSource(dataSource.Key, dataSource.Value));
    }
    if (ReportParameters != null)
    {
        foreach (KeyValuePair<string, object> parameter in ReportParameters)
        {
            report.SetParameters(new ReportParameter(parameter.Key, (parameter.Value == null ? null : parameter.Value.ToString())));
        }
    }
    result = report.Render(Format);
    report.Dispose();
    return result;
}
Using the Report Viewer runtime (download it from Microsoft; there is a tutorial on how to use it), you can render .rdl and .rdlc files locally; you just have to execute the query yourself.
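A minimal sketch of that idea (the query, connection string, report path, and the dataset name "DataSet1" are illustrative assumptions; the dataset name must match the one defined inside the .rdlc):
// Hypothetical helper: run the report's query yourself, then render an .rdlc to PDF bytes
static byte[] RenderLocalReport()
{
    var table = new System.Data.DataTable();
    using (var adapter = new System.Data.SqlClient.SqlDataAdapter(
        "SELECT * FROM SomeTable",
        "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI"))
    {
        adapter.Fill(table);
    }

    using (var report = new Microsoft.Reporting.WebForms.LocalReport())
    {
        report.ReportPath = @"Reports\MyReport.rdlc";
        report.DataSources.Add(new Microsoft.Reporting.WebForms.ReportDataSource("DataSet1", table));
        return report.Render("PDF");
    }
}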