Continuous integration with Reporting Services - sql-server-2008

I'm implementing a continuous integration environment with SVN and reporting services.
The reports are stored in the SVN repository. When a change occurs, they are automatically downloaded from the repository, and any changed file should be uploaded to the Reporting Services server.
How could you automate the upload/update process for .rdl files?

One way would be to upload them via the Reporting Services web service. You'll have to generate a proxy and then write some code that reads the reports from the local file system and uploads them to the report server. You can use the ReportingService2005.CreateReport method to do this.
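For example, here is a minimal sketch of that approach, assuming a proxy generated from ReportService2005.asmx (the server URL, folder, and file path are placeholders):

// Proxy generated from http://yourserver/reportserver/ReportService2005.asmx?wsdl
ReportingService2005 rs = new ReportingService2005();
rs.Url = "http://yourserver/reportserver/ReportService2005.asmx";
rs.Credentials = System.Net.CredentialCache.DefaultCredentials;

// Read the RDL from disk and publish it, overwriting any existing copy
byte[] definition = System.IO.File.ReadAllBytes(@"C:\Reports\MyReport.rdl");
Warning[] warnings = rs.CreateReport("MyReport", "/TargetFolder", true, definition, null);

// CreateReport reports publishing problems as warnings rather than exceptions
if (warnings != null)
    foreach (Warning warning in warnings)
        Console.WriteLine(warning.Message);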
You can also use the rs utility to write a script; here's a link to Scripting Deployment and Administrative Tasks.
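The rs utility runs a VB.NET script (.rss extension) against the report server; the invocation looks roughly like this (the script name and server URL are placeholders):

rs.exe -i DeployReports.rss -s http://yourserver/reportserver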

Related

Azure DevOps deploy does not always overwrite cshtml files

Sometimes when we do a deploy, the most recently committed files with the .cshtml extension are not updated. We have not seen other file types fail to update. It could be that a file is in use and can't be overwritten. Is it possible to add an extra step to the deployment process so we can avoid this?
That is strange. If you are using the Azure App Service Deploy task within Azure Pipelines to deploy to your Web App, you have the option to Remove additional files at destination. Enabling this option deletes files in the Azure App Service that have no matching files in the App Service artifact package or folder being deployed.
Based on the chosen deployment method, there are other helpful additional deployment options like:
Rename locked files: Rename any file that is still in use by the web server by enabling the msdeploy flag MSDEPLOY_RENAME_LOCKED_FILES=1 in the Azure App Service settings. This option, if set, enables msdeploy to rename files that are locked during app deployment. This way, you can avoid deployment failures with ERROR_FILE_IN_USE errors.
Take App Offline: Select this option to take the Azure App Service offline by placing an app_offline.htm file in the root directory before the synchronization operation begins. The file will be removed after the synchronization completes successfully.
Having these in place can streamline your deployments and make them robust. Here is the complete reference for the task: Azure App Service Deploy task.
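For reference, here is a rough sketch of how those options can be enabled in a YAML pipeline, assuming the AzureRmWebAppDeployment@4 task (the service connection, app name, and package path are placeholders; verify the input names against the task reference):

- task: AzureRmWebAppDeployment@4
  inputs:
    azureSubscription: 'MyServiceConnection'  # placeholder service connection
    WebAppName: 'my-web-app'                  # placeholder app name
    packageForLinux: '$(Build.ArtifactStagingDirectory)/**/*.zip'
    RemoveAdditionalFilesFlag: true           # Remove additional files at destination
    RenameFilesFlag: true                     # Rename locked files
    TakeAppOfflineFlag: true                  # Take App Offline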

SSRS passing Data Source credentials when deploying to SSRS Web Portal

I want certain reports to execute as a super user, but when I change the Credentials section for the .rdl embedded Data Sources, these changes do not propagate through to the web server. That is, users still have to enter their username/password on the web server, and run into permissions issues.
The current workaround is to "Manage" the report on the web server (using the ellipsis menu), tell each report to log into the data source "Using the following credentials...", and enter the super-user credentials.
But this means that every time we redeploy the report, we need to do this again. We would prefer that the settings we have in the actual .rdl actually show up on the web server after deployment.
I suggest using shared data sources for your reports. For each database a report needs access to, create one such shared data source. The default project setting for Report Server projects in Visual Studio is to not overwrite data sources that already exist on the server when deploying the project. This way, you only have to set the credentials in the Web Portal once per data source, and you don't have to worry about it when deploying updated reports.
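If you ever want to script that one-time credential setup instead of clicking through the Web Portal, the same web service used for deployment can create a shared data source with stored credentials. A rough sketch, assuming a ReportingService2005 proxy instance rs as in the first answer, with placeholder names and connection string:

DataSourceDefinition definition = new DataSourceDefinition();
definition.Extension = "SQL";                                            // SQL Server data source
definition.ConnectString = "Data Source=dbserver;Initial Catalog=MyDb";  // placeholder
definition.CredentialRetrieval = CredentialRetrievalEnum.Store;          // store credentials on the server
definition.UserName = "superuser";                                       // placeholder account
definition.Password = "secret";                                          // placeholder password
definition.WindowsCredentials = false;

// Create (or overwrite) the shared data source in the target folder
rs.CreateDataSource("MyDataSource", "/Data Sources", true, definition, null);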

Deploy SSRS Report to Remote Server

I'm trying to deploy an SSRS report to a remote server (that is not on my network). I'm not sure how to do this. For a machine on my network, I would just change the TargetURL, but I'm guessing there should be somewhere that I can associate credentials to deploy to a remote server, but I'm not finding it.
I know this is an old post, but in case someone is wondering about the same question:
I am using VS 2017 Enterprise. Once you have configured your remote URL and folder name and the project builds cleanly, hit F5 or deploy the project. Upon a successful build, Visual Studio will prompt you for your report server's credentials; enter them and Visual Studio will do the rest. In a moment you will be able to access your report.
Just read about the permission requirements before you try it.
Hope this helps someone.
Thanks
I face a similar issue delivering reports to various servers (customers, plus dev, QA, staging, and production). In Visual Studio, the best way I've found is to start a new project for each server and import the reports into it; you end up with a new project per server.
I found TFS/VS unwieldy, so unfortunately my workflow is to do it manually, or to use one of a few open-source report-uploading tools (there are PowerShell scripts for this, but I find the tools more user-friendly).
The best thing to start with is doing it manually, which will sort out your initial problem:
Save the file out of your report writer to disk.
In Internet Explorer, log into the Report Manager of the remote server at http(s)://remoteservername/reports and navigate to the folder you want. Then upload the report.
Once it's uploaded, you may need to fix the connection to the database.
Once you get used to doing this, you can use a tool like reportsync to move reports between servers quickly and easily.
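If you'd rather script the upload than use Report Manager or a GUI tool, the web-service approach from the first question works against a remote server as well; the main difference is supplying explicit credentials instead of your domain login. A sketch with placeholder server, account, and path:

ReportingService2005 rs = new ReportingService2005();
rs.Url = "https://remoteservername/reportserver/ReportService2005.asmx";
// Explicit credentials for the remote machine instead of DefaultCredentials
rs.Credentials = new System.Net.NetworkCredential("deployUser", "password", "REMOTEDOMAIN");

byte[] definition = System.IO.File.ReadAllBytes(@"C:\Reports\MyReport.rdl");
rs.CreateReport("MyReport", "/CustomerReports", true, definition, null);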

Is virus scanning files during upload into SQL via HttpHandler or HttpModule a bad idea?

We are implementing a COTS package that lets you upload files from a thick client, over HTTP web services, to SQL Server 2008, where the files are stored in a VARBINARY(MAX) column. The solution will run in a Microsoft-based environment.
We have a requirement to "virus-scan the files" during upload.
I was wondering whether doing this as an API call from an HttpHandler or HttpModule is a bad idea (or even feasible). Has anyone done this before?
This should be possible with any antivirus that supports batch (command-line) scanning. For example, here is how to do it with Security Essentials on the server:
Use Microsoft Security Essentials in C# when downloading email attachment
My only note would be to keep a queue of pending files and process them one by one, so that you don't spawn multiple antivirus instances for concurrent requests, which could slow down your servers.
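A rough sketch of serializing the scans, assuming Security Essentials' command-line scanner MpCmdRun.exe (the install path and exit-code interpretation are assumptions to verify against your antivirus documentation):

using System.Diagnostics;
using System.Threading;

static readonly SemaphoreSlim scanQueue = new SemaphoreSlim(1, 1);

static bool ScanFile(string path)
{
    scanQueue.Wait();  // one scan at a time across concurrent requests
    try
    {
        var psi = new ProcessStartInfo
        {
            FileName = @"C:\Program Files\Microsoft Security Client\MpCmdRun.exe",  // assumed install path
            Arguments = "-Scan -ScanType 3 -File \"" + path + "\"",  // ScanType 3 = custom scan of one file
            UseShellExecute = false,
            CreateNoWindow = true
        };
        using (var scan = Process.Start(psi))
        {
            scan.WaitForExit();
            return scan.ExitCode == 0;  // assumption: non-zero exit signals a detection or error
        }
    }
    finally
    {
        scanQueue.Release();
    }
}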

How to restart SSRS on changed files in an extended DataReader provider

I'm using Microsoft Reporting Services (SSRS) 2008 with an extended DataReader/DataSet provider (i.e., a DataReader that I wrote myself and integrated into SSRS). Every time I have an update (i.e., copy new binaries to the bin folder in SSRS), I have to restart the service manually.
I noticed that the mechanism SSRS uses is similar to IIS, but IIS has a file watcher and restarts automatically (or at least loads the new DLLs/configs automatically) when files change.
Is there a mechanism (ideally already integrated into SSRS) that does the same for SQL Server Reporting Services 2008?
If not, what would be other options to handle this?
Seeing as no one is having a go at answering, I'll have a stab. Could you not develop a small Windows service that monitors the directory for file updates using the FileSystemWatcher in .NET and then programmatically restarts the SSRS service? You may be able to invoke a restart using WMI (check here). If not, you could run net stop and net start commands, e.g.
net stop ReportServer$SQL2008
net start ReportServer$SQL2008
You may need to change the service name to match.
If you're copying over binaries that are in use, you'll need to stop the service before copying anyway, which points more toward a deployment script/app than a file watcher.
You can also use the ServiceController class to stop and start services by name. So it wouldn't be that difficult to stop the service, push the new binaries, and then start the service back up.
Here is an example of stopping and starting the service.
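A minimal sketch combining the two suggestions, FileSystemWatcher plus ServiceController (the service name matches the net commands above; the bin path is a placeholder, and in real use you'd debounce the Changed event, which can fire several times per copy):

using System;
using System.IO;
using System.ServiceProcess;  // reference System.ServiceProcess.dll

var watcher = new FileSystemWatcher(
    @"C:\Program Files\Microsoft SQL Server\MSRS10.SQL2008\Reporting Services\ReportServer\bin",  // placeholder bin path
    "*.dll");
watcher.Changed += (sender, e) =>
{
    using (var service = new ServiceController("ReportServer$SQL2008"))
    {
        service.Stop();
        service.WaitForStatus(ServiceControllerStatus.Stopped, TimeSpan.FromSeconds(30));
        service.Start();
        service.WaitForStatus(ServiceControllerStatus.Running, TimeSpan.FromSeconds(30));
    }
};
watcher.EnableRaisingEvents = true;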