Can you upload non-SSRS files to an SSRS Server with PowerShell? - reporting-services

SQL Server 2019. In the SSRS Report Server web interface I can upload any file: PDF, Excel, Word, etc. I've got a lot of files to upload, and the web interface only lets me do one at a time. Can I upload all the files in a folder to the SSRS server using PowerShell? So far, what I've found only seems to work for SSRS file types (.rdl, .rsd, etc.). Is there some other way to upload multiple non-SSRS files? Thanks!

You can use the PowerShell script below; you will need to change the folder location and the report server URL, as well as the -RsFolder reference. The script will upload all files within the folder. Please be aware that SSRS does restrict some file types; the allowed extensions can be found with the following SQL:
SELECT ConfigInfoID
     , Name
     , Value
FROM ReportServer.dbo.ConfigurationInfo
WHERE Name = 'AllowedResourceExtensionsForUpload'
# PowerShell script
$FileLocation = "C:\Files\"
$Files = Get-ChildItem $FileLocation -File   # -File skips any subfolders
foreach ($File in $Files)
{
    # Use $File (the current item), not $Files (the whole collection)
    Write-RsRestCatalogItem -Overwrite -ReportPortalUri "http://ReportServer/Reports/" -Path $File.FullName -RsFolder "Files" -RestApiVersion V1.0
}

Also, I should have mentioned that you first need to install the ReportingServicesTools PowerShell module:
Install-Module -Name ReportingServicesTools

Related

Downloading .csv file from website using .bat or XSLT 2.0

I am trying to download a .csv file from a website using a .bat file, saving it to the current (or a specified) folder.
I think there is some issue with how the download works for this .csv file; it seems to download only on a button/tab click.
If you scroll down, there is the option "Download this time series" and a green .csv tab with the following URI:
https://www.ons.gov.uk/generator?format=csv&uri=/economy/inflationandpriceindices/timeseries/chaw/mm23
This is my .bat file content:
@echo off
SET "FILENAME=%~dp0series.csv"
bitsadmin.exe /transfer "JobName" "https://www.ons.gov.uk/generator?format=csv&uri=/economy/inflationandpriceindices/timeseries/chaw/mm23" "%FILENAME%"
I have also tried to do this using XSLT 2.0, but I was unable to automate the download.
I'm not so familiar with batch, and I think PowerShell is a much better tool for this job in terms of long-term support and ease of use. Here is a simple script to download a file from a given link:
$url = "https://www.ons.gov.uk/generator?format=csv&uri=/economy/inflationandpriceindices/timeseries/chaw/mm23"
$output = "$PSScriptRoot\series.csv"
$start_time = Get-Date
Invoke-WebRequest -Uri $url -OutFile $output
Write-Output "Time taken: $(((Get-Date) - $start_time).TotalSeconds) second(s)"
Save the script to a file named, for example, download.ps1.
You'll need to cd to the file's directory and run it like .\download.ps1

To identify linked server in SSIS package

I have around 500 SSIS packages, and I want to get the list of packages where a linked server is used. The reason is that we are now removing the linked server. I don't want to open every SSIS package and check all the tasks to see whether a linked server is referenced.
Is there any way to do this?
If you don't want to use PowerShell, I use a tool called FnR.EXE, which you can google and download. Again, you just search through the XML, which is just a text file. If you know the names of your linked servers, that's good. If you don't, you'll have to search for something of the form %.%.%.% (a four-part name). It would be much more reliable to know all the linked server names (it should be quicker to work that out than to go through all of your packages); a rough PowerShell version of the pattern search is sketched below.
You also need to consider whether your package uses a view which in turn references a linked server; in that case the linked server name won't actually appear in the package.
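For completeness, a PowerShell equivalent of that %.%.%.% search might look like this (a sketch only; the package folder is hypothetical, and the pattern can also match things like IP addresses, so expect false positives):
# Scan all .dtsx packages for four-part names (server.database.schema.object),
# which usually indicate a linked-server reference
Get-ChildItem -Path "C:\SSIS\Packages" -Recurse -Filter *.dtsx |
    Select-String -Pattern '\b\w+\.\w+\.\w+\.\w+\b' |
    Select-Object Path, LineNumber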
It's not really an answer, but it's too long for a comment. You could simply search for the given text in your SSIS packages; they are nothing more than XML files.
You could use, for example, PowerShell:
Get-ChildItem -Recurse -Filter *.dtsx | Select-String -Pattern "YOUR_LINKED_SERVER" | Group-Object Path | Select-Object Name
This will at least give you a list of the packages that use the linked server. Then, depending on where the linked server is referenced, you might want to:
If it's in SQL strings, just replace its name with an empty string (PowerShell or something else).
If it's in other components, you might want to look into the Microsoft.SqlServer.Dts.Runtime namespace and write either a PowerShell script or a .NET app and alter the files from code; see the sketch below.
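A minimal sketch of that approach, listing each package's connection managers (it assumes the SSIS client components that provide Microsoft.SqlServer.ManagedDTS are installed; the assembly version below is from SQL Server 2016 and varies by release, and the package folder is hypothetical):
# Load the SSIS runtime assembly (version varies by SQL Server release)
Add-Type -AssemblyName "Microsoft.SqlServer.ManagedDTS, Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91"
$app = New-Object Microsoft.SqlServer.Dts.Runtime.Application
Get-ChildItem "C:\SSIS\Packages" -Recurse -Filter *.dtsx | ForEach-Object {
    # Load the package and dump its connection managers; linked-server names
    # typically show up in connection strings or SQL statements
    $pkg = $app.LoadPackage($_.FullName, $null)
    foreach ($cm in $pkg.Connections) {
        [pscustomobject]@{ Package = $_.Name; Connection = $cm.Name; ConnectionString = $cm.ConnectionString }
    }
}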

Accessing variables in JSON from within Azure VM?

I'm using the Azure PowerShell command New-AzureRmResourceGroupDeployment together with a JSON template file, and I'm feeding a bunch of parameters to the command for it to use with the JSON file. The JSON also instructs the newly created VM to download a PowerShell script from Azure storage and run it.
I need to pass some values from my Azure PowerShell script to that "VM-local" PowerShell script. For argument's sake, let's say my Azure PowerShell script has a variable $foo with a value of bar, representing "the name of a folder to be created on C:\ (so C:\bar)".
How?
How can a script running within the VM access the value bar (by any means)? It's fine if I need to use the JSON file as a "messenger", or any other necessary trick. I don't think I can modify the "VM-local" PowerShell script between downloading it from Azure storage and subsequently running it.
If you use the script extension in your JSON template on the VM to run the script, you can specify the entire command line for that script, and on that command line you can pass parameters just as you would when running it interactively. In other words, think about the command line you'd use to run that script, and that's what you put into the script extension of the template.
Take a look at this example:
https://github.com/Azure/azure-quickstart-templates/blob/f18a95e857a4caf86b4c2e77e652cec678cd524c/201-vm-winrm-windows/azuredeploy.json
Look at the "commandToExecute" property. You can see how to invoke powershell.exe with parameters, the script file being one of those parameters, while the script file itself also accepts some parameters through a variable.
You could also do this with DSC (very similar in JSON, but very different PS) but if you already have a PS script you want to use, this should work.
Is that what you needed?
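For illustration, the settings block of such a script extension might look like the following (a sketch only; the script name setup.ps1 and the FolderName parameter are hypothetical, not taken from the linked template):
"settings": {
    "fileUris": [ "https://mystorage.blob.core.windows.net/scripts/setup.ps1" ],
    "commandToExecute": "[concat('powershell.exe -ExecutionPolicy Bypass -File setup.ps1 -FolderName ', parameters('folderName'))]"
}
Inside the VM, setup.ps1 then receives the value through an ordinary param($FolderName) block.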
You can pass variables like this:
New-AzureRmResourceGroupDeployment -Name ((Get-ChildItem $TemplateFile).BaseName + '-' + ((Get-Date).ToUniversalTime()).ToString('MMdd-HHmm')) `
-ResourceGroupName $ResourceGroupName `
-TemplateFile $TemplateFile -TemplateParameterObject @{accountName=$AutomationAccount;moduleName=$Name;moduleURI=$ModuleURI} -Force -Verbose
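On the template side, each key in that hashtable must match a parameter declared in the JSON template, along the lines of this sketch (parameter names mirror the hashtable above):
"parameters": {
    "accountName": { "type": "string" },
    "moduleName": { "type": "string" },
    "moduleURI": { "type": "string" }
}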

How do I export an SSRS report to a pdf file in a batch job using X++?

Is it possible to generate an SSRS report in Dynamics AX 2009 and save it as a PDF file using X++?
The problem I have is that I need to generate the data for the report and then generate the report. Reporting server subscriptions won't work in this case, as there is no way for them to call the X++ that generates the data.
I have also had a look at passing the rendering type to the SSRS report in the URL, but it doesn't seem to accept a filename to save the report as.
The logic that generates the data is not a straightforward query, and takes quite a while to run. I want to be able to turn this into a batch process so that several reports can be generated by a batch server.
Ensure that AX is configured as a batch server, and then you will need to create a batch job.
The art of creating a batch class (for the batch job to call) which runs a report and generates a PDF file overnight has already been mastered here.
The following snippet for generating a PDF file is from the class EPSendDocument.makeDocument():
// Assumes reportRun and printSettings are declared in the surrounding method
Filename file = "\\\\Server\\SharedFolder\\File.pdf";
printSettings = reportRun.parmReportContract().parmPrintSettings();
printSettings.printMediumType(SRSPrintMediumType::File);  // print to file rather than screen
printSettings.fileFormat(SRSReportFileFormat::PDF);       // render as PDF
printSettings.fileName(file);
printSettings.overwriteFile(true);
Another link for converting a report to a PDF file.
Finally, do check first whether the files are generated when executing the class from within your AX client, and then when it is run on the batch server; there may be permission or path issues.

automatically download csv files from a website and add to database

I want to write a program that automatically downloads all the CSV files on a webpage and then enters the data from those CSV files into SQL Server. I have already written the macro that loads the data from CSV into SQL Server; I just need help automatically downloading the files from a website every day. Is this similar to web scraping? I am new to web programming, so please advise which languages I should look into.
Quite easy with PowerShell:
$web = New-Object System.Net.WebClient
$web.DownloadString("http://<your url here>") | Out-File "$env:tmp\MyFile.csv"
Then use Import-Csv and the SQL Server PowerShell provider to inject your data into SQL Server.
I recommend Python, as usual.
After you figure out how to work with it, here are the modules I would use:
csv - for parsing CSV files.
urllib2 - for downloading files.
SQLAlchemy - for database operations.
I also recommend the PowerShell way to do the job:
Download the data from the web with WebClient, as @David Brabant said.
Then use Import-Csv to get the data row by row: Import-Csv -Path file.csv
Finally, make use of the SQL Server PowerShell module in your script to upload the data into SQL Server: Invoke-Sqlcmd -Query '<sql statements>'
There is a complete sample script for you to download and learn from: How to use SQL Server PowerShell Module to import data from CSV file. A minimal end-to-end sketch follows.
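Putting the steps together, a minimal sketch might look like this (the URL, server, database, table, and column names are all hypothetical; for real workloads, prefer parameterized inserts or a bulk-load approach):
# Step 1: download the CSV with WebClient
$web = New-Object System.Net.WebClient
$web.DownloadFile("http://example.com/data.csv", "$env:TEMP\data.csv")
# Step 2: read it row by row
$rows = Import-Csv -Path "$env:TEMP\data.csv"
# Step 3: insert each row; assumes a table dbo.MyTable(Col1, Col2) matching the CSV headers
foreach ($row in $rows) {
    $query = "INSERT INTO dbo.MyTable (Col1, Col2) VALUES ('$($row.Col1)', '$($row.Col2)')"
    Invoke-Sqlcmd -ServerInstance "localhost" -Database "MyDb" -Query $query
}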