I'm trying to write some PowerShell that will copy an API and its policy from one subscription to another.
This is what I have:
Connect-AzAccount
Set-AzContext -Subscription "x"
$ApiMgmtContext = New-AzApiManagementContext -ResourceGroupName "rg-apim-dev-001" -ServiceName "apim-dev-002"
Export-AzApiManagementApi -Context $ApiMgmtContext -ApiId "365-response" -SpecificationFormat OpenApi -SaveAs "C:\robtemp\365-response.yml"
$policy = Get-AzApiManagementPolicy -Context $ApiMgmtContext -ApiId "365-response" -OperationId "invoice"
Set-AzContext -Subscription "y"
$ApiMgmtContext = New-AzApiManagementContext -ResourceGroupName "rg-apim-int-001" -ServiceName "apim-int-001"
Import-AzApiManagementApi -Context $ApiMgmtContext -SpecificationFormat OpenApi -SpecificationPath "C:\robtemp\365-response.yml" -Path "apis"
Set-AzApiManagementPolicy -Context $ApiMgmtContext -ApiId "365-response" -OperationId "invoice" -Policy $policy.ToString()
The $policy variable is populated OK, but the Set-AzApiManagementPolicy call throws the following error:
Set-AzApiManagementPolicy:
Error Code: ValidationError
Error Message: Entity with specified identifier not found
Request Id: 7a0ece56-9e95-4eae-af58-f3b96f3ac23e
I needed to copy the source policy into a string rather than reference the policy object directly when assigning it to the target operation.
The following did the trick:
param ($tempFolder = "c:\robtemp")
Connect-AzAccount
Set-AzContext -Subscription "x"
$ctx = New-AzApiManagementContext -ResourceGroupName "rg-apim-dev-001" -ServiceName "apim-dev-002"
Export-AzApiManagementApi -Context $ctx -ApiId "365-response" -SpecificationFormat OpenApi -SaveAs "$tempFolder/365-response.yml"
$policy = Get-AzApiManagementPolicy -Context $ctx -ApiId "365-response" -OperationId "invoice"
$policyToCopy = $policy.ToString()
Set-AzContext -Subscription "y"
$ctx = New-AzApiManagementContext -ResourceGroupName "rg-apim-int-001" -ServiceName "apim-int-001"
$api = Get-AzApiManagementApi -Context $ctx -Name "365-response"
if ($api) {
Remove-AzApiManagementApi -Context $ctx -ApiId $api.ApiId
}
Import-AzApiManagementApi -Context $ctx -SpecificationFormat OpenApi -SpecificationPath "$tempFolder/365-response.yml" -Path "apis"
$api = Get-AzApiManagementApi -Context $ctx -Name "365-response"
$operation = Get-AzApiManagementOperation -Context $ctx -ApiId $api.ApiId -OperationId "invoice"
Set-AzApiManagementPolicy -Context $ctx -ApiId $api.ApiId -OperationId $operation.OperationId -Policy $policyToCopy
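To confirm the copy, the policy can be read back from the target with the same cmdlet used on the source side (a quick optional check, re-using $ctx, $api and $operation from the script above):
# Fetch the policy that was just applied to the target operation
Get-AzApiManagementPolicy -Context $ctx -ApiId $api.ApiId -OperationId $operation.OperationId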
I would like to pass JSON data to a PowerShell script.
PowerShell script:
Get-AzMySqlFirewallRule -ResourceGroupName "dev" -ServerName "dev-DB-Server" | Out-File "file.json"
New-AzMySqlFirewallRule -Name "" -ResourceGroupName "dev" -ServerName "dev-core" -EndIPAddress "" -StartIPAddress ""
In the above PowerShell script I need to fill in the "" values from the JSON file mentioned below. So how do I get the JSON parameter values at run time so that all 3 parameters are passed to the above command and it creates the new firewall rules on the new DB server?
Also, when I run the PowerShell command (Get-AzMySqlFirewallRule -ResourceGroupName "dev" -ServerName "dev-DB-Server" | Out-File "file.json") I get my JSON file data in the format below. I'm not sure whether this format looks good, but I need the values (starting from pdbr_Home, 1.2.3.4 and 5.6.7.8, and similarly the other rows of data) passed via a for loop to my PowerShell command New-AzMySqlFirewallRule -Name "" -ResourceGroupName "dev" -ServerName "dev-core" -EndIPAddress "" -StartIPAddress "".
file.Json:
[
{
"EndIPAddress": "1.3.2.2",
"Id": "/subscriptions/abcdefg/resourceGroups/dev/providers/Microsoft.DBforMySQL/servers/db-dev- vm/firewallRules/praveen_Home",
"Name": "praveen_Home",
"StartIPAddress": "4.3.1.2",
"Type": "Microsoft.DBforMySQL/servers/firewallRules"
},
{
"EndIPAddress": "2.4.5.6",
"Id": "/subscriptions/abcdefg/resourceGroups/dev/providers/Microsoft.DBforMySQL/servers/db-dev- vm/firewallRules/pdbr_Home",
"Name": "pdbr_Home",
"StartIPAddress": "3.2.1.2",
"Type": "Microsoft.DBforMySQL/servers/firewallRules"
}
]
The command below produces the following output:
PS /home/praveen> Get-Command json
CommandType Name Version Source
----------- ---- ------- ------
Cmdlet ConvertFrom-Json 7.0.0.0 Microsoft.PowerShell.Utility
Cmdlet ConvertTo-Json 7.0.0.0 Microsoft.PowerShell.Utility
Cmdlet Test-Json 7.0.0.0 Microsoft.PowerShell.Utility
Application json_pp 0.0.0.0 /usr/bin/json_pp
Application json_pp 0.0.0.0 /bin/json_pp
Error:
New-AzMySqlFirewallRule: /home/praveen/dbtest.ps1:21
Line |
21 | … -ServerName "praveen-dev" -EndIPAddress $entry.EndIPAddress -StartI …
| ~~~~~~~~~~~~~~~~~~~
| Cannot bind argument to parameter 'EndIPAddress' because it is an empty string.
The final solution that worked for me:
##################### Updating Firewall rules from Source DB server to Target DB server ##################
Write-Host -NoNewline "Updating Firewall rules from Source DB server to Target DB server"
Get-AzMySqlFirewallRule -ResourceGroupName $ResourceGroupName -ServerName $SourceDBServerName | Select-Object Name, StartIPaddress, EndIPaddress | Convertto-Json | Out-File "firewallrule.json"
foreach ($frule in (Get-Content firewallrule.json -raw | ConvertFrom-Json)) {
New-AzMySqlFirewallRule -Name $frule.Name -ResourceGroupName $ResourceGroupName -ServerName $TargetDBServerName -EndIPAddress $frule.EndIPAddress -StartIPAddress $frule.StartIPAddress
}
Use ConvertTo-Json before writing the file.
Docs: https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/convertto-json?view=powershell-7.2
Get-AzMySqlFirewallRule -ResourceGroupName "dev" -ServerName "dev-DB-Server" | Convertto-Json | Out-File "file.json"
EDIT as requested:
Get-AzMySqlFirewallRule -ResourceGroupName "dev" -ServerName "dev-DB-Server" | Select-Object Name, StartIPAddress, EndIPAddress | Convertto-Json | Out-File "file.json"
Another EDIT as requested (fixed my mistake - thank you @Sage Pourpre):
foreach ($entry in (Get-Content file.json -raw | ConvertFrom-Json)) {
New-AzMySqlFirewallRule -name $entry.Name `
-ResourceGroupName "dev" `
-ServerName "dev-core" `
-StartIPAddress $entry.StartIPAddress `
-EndIPAddress $entry.EndIPAddress
}
try the below:
Get-AzMySqlFirewallRule -ResourceGroupName "dev" -ServerName "dev-DB-Server" | Convertto-Json | Out-File "file.json"
$data = Get-Content "C:\Users\me\file.json" | Out-String | ConvertFrom-Json #replace path to where you have exported the json file
foreach ($line in $data) {
New-AzMySqlFirewallRule -name $line.Name `
-ResourceGroupName "dev" `
-ServerName "dev-core" `
-StartIPAddress $line.StartIPAddress `
-EndIPAddress $line.EndIPAddress
}
A different method, although the question is for JSON, would be to just store the rules in a variable.
$rules = Get-AzMySqlFirewallRule -ResourceGroupName "dev" -ServerName "dev-DB-Server"
foreach ($rule in $rules){
New-AzMySqlFirewallRule -name $rule.Name `
-ResourceGroupName "dev" `
-ServerName "dev-core" `
-StartIPAddress $rule.StartIPAddress `
-EndIPAddress $rule.EndIPAddress
}
I was able to fix this with the help of all the above inputs a few days ago. Thanks to all, I really appreciate your help.
Updating Firewall rules from Source DB server to Target DB server
Write-Host -NoNewline "Updating Firewall rules from Source DB server to Target DB server"
Get-AzMySqlFirewallRule -ResourceGroupName $ResourceGroupName -ServerName $SourceDBServerName | Select-Object Name, StartIPaddress, EndIPaddress | Convertto-Json | Out-File "firewallrule.json"
foreach ($frule in (Get-Content firewallrule.json -raw | ConvertFrom-Json)) {
New-AzMySqlFirewallRule -Name $frule.Name -ResourceGroupName $ResourceGroupName -ServerName $TargetDBServerName -EndIPAddress $frule.EndIPAddress -StartIPAddress $frule.StartIPAddress
}
//role-definition.json
{
"RoleName": "MyReadWriteRole",
"Type": "CustomRole",
"AssignableScopes": ["/"],
"Permissions": [{
"DataActions": [
"Microsoft.DocumentDB/databaseAccounts/readMetadata",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/*",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/*"
]
}]
}
I'm converting it into a JSON object as I need to pass it as an argument to a PowerShell parameter.
So I did the below:
Created a PowerShell variable
$roleDef= #"
{
"RoleName": "MyReadWriteRole",
"Type": "CustomRole",
"AssignableScopes": ["/"],
"Permissions": [{
"DataActions": [
"Microsoft.DocumentDB/databaseAccounts/readMetadata",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/items/",
"Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers/"
]
}]
}
"#
$jsonObject = $roleDef | ConvertFrom-Json
az cosmosdb sql role definition create --account-name cosmosdbaccname --resource-group 'my-rg' --body $jsonObject
but I get the below error:
Failed to parse string as JSON:
@{RoleName=MyReadWriteRole; Type=CustomRole; AssignableScopes=System.Object[]; Permissions=System.Object[]}
Error detail: Expecting value: line 1 column 1 (char 0)
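The error text points at the cause: ConvertFrom-Json turns the here-string into a PSCustomObject, and when that object is handed to the CLI it is stringified as @{RoleName=...}, which is not valid JSON. Since --body expects the JSON text itself, a minimal sketch (re-using the $roleDef here-string defined above) is to skip the conversion:
# Pass the raw JSON text rather than a converted PowerShell object
az cosmosdb sql role definition create --account-name cosmosdbaccname --resource-group 'my-rg' --body $roleDef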
I am trying to retrieve files from a remote system via PowerShell. In order to do this I utilize New-PSSession and Invoke-Command in this session.
$latestFolder = Invoke-Command -Session $remoteSession -ScriptBlock{param($path) Get-ChildItem $path -Directory -Name} -ArgumentList $path
If I set the path to $Path = "$env:SystemRoot\System32" it works just fine, but if I set it to a string read from a configuration file (JSON) it gives me the weirdest problems. One is that it cannot find the parameter -Directory; if I omit the -Directory and -Name parameters, the error message is Ein Laufwerk mit dem Namen "$env" ist nicht vorhanden, roughly translated: A drive named "$env" does not exist.
The configuration file looks like this:
{
"File": [
{
"Name": "TEST",
"Active": true,
"Path": "$env:SystemRoot\\System32"
}
]
}
The Powershell script is the following:
$logFilePath = Join-Path (get-item $PSScriptRoot).FullName "Test.json"
$logConfig = Get-Content -Path $logFilePath | ConvertFrom-Json
$windowsUser = "username"
$windowsUserPassword = "password"
$windowsUserPasswordsec = $windowsUserPassword | ConvertTo-SecureString -AsPlainText -Force
$Server = "server"
$Path = "$env:SystemRoot\System32"
$Path = $logConfig.File[0].Path
$sessioncred = new-object -typeName System.Management.Automation.PSCredential -ArgumentList $windowsUser, $windowsUserPasswordsec
$remoteSession = New-PSSession -ComputerName $Server -Credential $sessioncred -Authentication "Kerberos"
$latestFolder = Invoke-Command -Session $remoteSession -ScriptBlock{param($path) Get-ChildItem $path -Directory -Name} -ArgumentList $Path
$latestFolder
You need to explicitly expand the string read from the Path element in the JSON file.
This should do it:
$Path = $ExecutionContext.InvokeCommand.ExpandString($logConfig.File[0].Path)
An alternative could be to use Invoke-Expression:
$Path = Invoke-Expression ('"{0}"' -f $logConfig.File[0].Path)
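Plugged into the original script, the expansion sits between reading the config and invoking the remote command (all names taken from the question):
$logConfig = Get-Content -Path $logFilePath | ConvertFrom-Json
# Expand the "$env:SystemRoot\System32" text read from JSON into a real path
$Path = $ExecutionContext.InvokeCommand.ExpandString($logConfig.File[0].Path)
$latestFolder = Invoke-Command -Session $remoteSession -ScriptBlock { param($path) Get-ChildItem $path -Directory -Name } -ArgumentList $Path
Note that ExpandString runs locally, so $env:SystemRoot resolves against the local machine; if the remote machine's value is needed, the expansion would have to happen inside the script block instead.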
So I'm trying to convert XML files in an Azure Storage container to JSON in the same container.
This way I'm able to read the information into an Azure SQL Database via an Azure Datafactory.
I'd like to stay clear from using Logic apps if able.
The JSON files need to be formatted.
And all this through the use of PowerShell scripting.
What I've got so far, after some searching on the interwebs and shamelessly copying and pasting PowerShell code:
#Connect-AzAccount
# Helper function that converts a *simple* XML document to a nested hashtable
# with ordered keys.
function ConvertFrom-Xml {
param([parameter(Mandatory, ValueFromPipeline)] [System.Xml.XmlNode] $node)
process {
if ($node.DocumentElement) { $node = $node.DocumentElement }
$oht = [ordered] @{}
$name = $node.Name
if ($node.FirstChild -is [system.xml.xmltext]) {
$oht.$name = $node.FirstChild.InnerText
} else {
$oht.$name = New-Object System.Collections.ArrayList
foreach ($child in $node.ChildNodes) {
$null = $oht.$name.Add((ConvertFrom-Xml $child))
}
}
$oht
}
}
function Format-Json
{
<#
.SYNOPSIS
Prettifies JSON output.
.DESCRIPTION
Reformats a JSON string so the output looks better than what ConvertTo-Json outputs.
.PARAMETER Json
Required: [string] The JSON text to prettify.
.PARAMETER Minify
Optional: Returns the json string compressed.
.PARAMETER Indentation
Optional: The number of spaces (1..1024) to use for indentation. Defaults to 4.
.PARAMETER AsArray
Optional: If set, the output will be in the form of a string array, otherwise a single string is output.
.EXAMPLE
$json | ConvertTo-Json | Format-Json -Indentation 2
#>
[CmdletBinding(DefaultParameterSetName = 'Prettify')]
Param(
[Parameter(Mandatory = $true, Position = 0, ValueFromPipeline = $true)]
[string]$Json,
[Parameter(ParameterSetName = 'Minify')]
[switch]$Minify,
[Parameter(ParameterSetName = 'Prettify')]
[ValidateRange(1, 1024)]
[int]$Indentation = 4,
[Parameter(ParameterSetName = 'Prettify')]
[switch]$AsArray
)
if ($PSCmdlet.ParameterSetName -eq 'Minify')
{
return ($Json | ConvertFrom-Json) | ConvertTo-Json -Depth 100 -Compress
}
# If the input JSON text has been created with ConvertTo-Json -Compress
# then we first need to reconvert it without compression
if ($Json -notmatch '\r?\n')
{
$Json = ($Json | ConvertFrom-Json) | ConvertTo-Json -Depth 100
}
$indent = 0
$regexUnlessQuoted = '(?=([^"]*"[^"]*")*[^"]*$)'
$result = $Json -split '\r?\n' |
ForEach-Object {
# If the line contains a ] or } character,
# we need to decrement the indentation level unless it is inside quotes.
if ($_ -match "[}\]]$regexUnlessQuoted")
{
$indent = [Math]::Max($indent - $Indentation, 0)
}
# Replace all colon-space combinations by ": " unless it is inside quotes.
$line = (' ' * $indent) + ($_.TrimStart() -replace ":\s+$regexUnlessQuoted", ': ')
# If the line contains a [ or { character,
# we need to increment the indentation level unless it is inside quotes.
if ($_ -match "[\{\[]$regexUnlessQuoted")
{
$indent += $Indentation
}
$line
}
if ($AsArray) { return $result }
return $result -Join [Environment]::NewLine
}
# Storage account details
$resourceGroup = "insert resource group here"
$storageAccountName = "insert storage account name here"
$container = "insert container here"
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroup -Name $storageAccountName).Value[0]
$storageAccount = Get-AzStorageAccount -ResourceGroupName $resourceGroup -Name $storageAccountName
# Creating Storage context for Source, destination and log storage accounts
#$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$context = New-AzStorageContext -ConnectionString "insert connection string here"
$blob_list = Get-AzStorageBlob -Container $container -Context $context
foreach($blob_iterator in $blob_list){
[XML](Get-AzStorageBlobContent $blob_iterator.name -Container $container -Context $context) | ConvertFrom-Xml | ConvertTo-Json -Depth 11 | Format-Json | Set-Content ($blob_iterator.name + '.json')
}
Output =
Cannot convert value "Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageBlob" to type "System.Xml.XmlDocument". Error: "The specified node cannot
be inserted as the valid child of this node, because the specified node is the wrong type."
At C:\Users\.....\Convert XML to JSON.ps1:116 char:6
+ [XML](Get-AzStorageBlobContent $blob_iterator.name -Container $c ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [], RuntimeException
+ FullyQualifiedErrorId : InvalidCastToXmlDocument
When I run the code the script asks me if I want to download the xml file to a local folder on my laptop.
This is not what I want, I want the conversion to be done in Azure on the Storage container.
And I think that I'm adding ".json" to the .xml file name.
So the output would become something like filename.xml.json instead of just filename.json
What's going wrong here?
And how can it be fixed?
Thank you in advance for your help.
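For the naming issue specifically, one option (a sketch using the .NET Path helper; $jsonName is a new variable, the rest comes from the script above) is to swap the extension rather than append to it:
# Turn "file.xml" into "file.json" instead of "file.xml.json"
$jsonName = [System.IO.Path]::ChangeExtension($blob_iterator.Name, '.json')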
In the Azure CLI there is az functionapp, but no equivalent can be found in the PowerShell AzureRM library or the Az library.
Using raw Azure resources, I've attempted something like this to create a function app on my Application Service Plan:
New-AzResource -ResourceType 'Microsoft.Web/Sites' `
-ResourceGroupName "MyRgName" `
-Location "westeurope" `
-ResourceName "MyFunctionName" `
-kind 'functionapp' `
-Properties @{ServerFarmId="abc-123"; alwaysOn=$True;} `
-ApiVersion '2018-11-01' `
-Force;
It almost works, but doesn't create a 100% working Function App. The Azure Portal will spit out lots of errors and warnings, for example from missing host keys.
Alternatives:
ARM templates: what to put into a template to successfully create an Azure Function, I have no idea. The one generated by the Azure Portal is useless.
Azure Portal: not really a handy approach for environment setup from an Azure DevOps release pipeline, but it will create a fully working Function App.
The question is: How to create a Function App from a Powershell script?
I am doing the exact same thing to create a dev sandbox environment.
Provisioning function apps is a gap in the Az PowerShell module, but it does appear to be possible.
I provisioned my function app by following the steps here https://clouddeveloper.space/2017/10/26/deploy-azure-function-using-powershell/ but changed it to use an existing app service plan instead of a consumption plan.
$AppServicePlan = "abc-123"
$AppInsightsKey = "your key here"
$ResourceGroup = "MyRgName"
$Location = "westeurope"
$FunctionAppName = "MyFunctionName"
$AzFunctionAppStorageAccountName = "MyFunctionAppStorageAccountName"
$FunctionAppSettings = @{
ServerFarmId="/subscriptions/<GUID>/resourceGroups/$ResourceGroup/providers/Microsoft.Web/serverfarms/$AppServicePlan";
alwaysOn=$True;
}
# Provision the function app service
New-AzResource -ResourceGroupName $ResourceGroup -Location $Location -ResourceName $FunctionAppName -ResourceType "microsoft.web/sites" -Kind "functionapp" -Properties $FunctionAppSettings -Force | Out-Null
$AzFunctionAppStorageAccountKey = Get-AzStorageAccountKey -ResourceGroupName $ResourceGroup -AccountName $AzFunctionAppStorageAccountName | Where-Object { $_.KeyName -eq "Key1" } | Select-Object Value
$AzFunctionAppStorageAccountConnectionString = "DefaultEndpointsProtocol=https;AccountName=$AzFunctionAppStorageAccountName;AccountKey=$($AzFunctionAppStorageAccountKey.Value)"
$AzFunctionAppSettings = @{
APPINSIGHTS_INSTRUMENTATIONKEY = $AppInsightsKey;
AzureWebJobsDashboard = $AzFunctionAppStorageAccountConnectionString;
AzureWebJobsStorage = $AzFunctionAppStorageAccountConnectionString;
FUNCTIONS_EXTENSION_VERSION = "~2";
FUNCTIONS_WORKER_RUNTIME = "dotnet";
}
# Set the correct application settings on the function app
Set-AzWebApp -Name $FunctionAppName -ResourceGroupName $ResourceGroup -AppSettings $AzFunctionAppSettings | Out-Null
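To sanity-check the result, the app and its settings can be read back afterwards (a quick optional check, re-using the variable names above):
# Confirm the function app exists and the app settings were applied
$app = Get-AzWebApp -ResourceGroupName $ResourceGroup -Name $FunctionAppName
$app.SiteConfig.AppSettings | Format-Table Name, Value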
This might help:
To create an Azure Function we have dependencies on a Storage Account, a Service Plan, a resource group, and (optionally) Application Insights. Below I initially define the variables, then check whether the resource group exists; if not, a new one is created. After that I create the Azure Storage Account, the Service Plan, and Application Insights. For an Azure Function we need to select a runtime stack, which can be Java/DotNet/Python etc.; here I am using DotNet. The Azure Function requires the Storage Account keys to link it, which are extracted below into variables such as "WEBSITE_CONTENTAZUREFILECONNECTIONSTRING". To link App Insights with the Function we need to map the Application Insights instrumentation key. Please follow the inline comments:
#=============Defining All Variables=========
$location = 'Southeast Asia'
$resourceGroupName = 'functionrgnew1'
$storageAccount = 'functionsasdnewqq1'
$subscriptionId = '<id>'
$functionAppName = 'functionapppsdfsdnew1'
$appInsightsName = 'appinsightnameprdad'
$appServicePlanName = 'functionappplan'
$tier = 'Premium'
#========Creating Azure Resource Group========
$resourceGroup = Get-AzResourceGroup | Where-Object { $_.ResourceGroupName -eq $resourceGroupName }
if ($resourceGroup -eq $null)
{
New-AzResourceGroup -Name $resourceGroupName -Location $location -force
}
#selecting default azure subscription by name
Select-AzSubscription -SubscriptionID $subscriptionId
Set-AzContext $subscriptionId
#========Creating Azure Storage Account========
if(!(Test-AzureName -Storage $storageAccount))
{
New-AzStorageAccount -ResourceGroupName $resourceGroupName -AccountName $storageAccount -Location $location -SkuName "Standard_LRS"
}
#========Creating App Service Plan============
New-AzAppServicePlan -ResourceGroupName $resourceGroupName -Name $appServicePlanName -Location $location -Tier $tier
$functionAppSettings = @{
ServerFarmId="/subscriptions/$subscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Web/serverfarms/$appServicePlanName";
alwaysOn=$True;
}
#========Creating Azure Function========
$functionAppResource = Get-AzResource | Where-Object { $_.ResourceName -eq $functionAppName -And $_.ResourceType -eq "Microsoft.Web/Sites" }
if ($functionAppResource -eq $null)
{
New-AzResource -ResourceType 'Microsoft.Web/Sites' -ResourceName $functionAppName -kind 'functionapp' -Location $location -ResourceGroupName $resourceGroupName -Properties $functionAppSettings -force
}
#========Creating AppInsight Resource========
New-AzApplicationInsights -ResourceGroupName $resourceGroupName -Name $appInsightsName -Location $location
$resource = Get-AzResource -Name $appInsightsName -ResourceType "Microsoft.Insights/components"
$details = Get-AzResource -ResourceId $resource.ResourceId
$appInsightsKey = $details.Properties.InstrumentationKey
#========Retrieving Keys========
$keys = Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -AccountName $storageAccount
$accountKey = $keys | Where-Object { $_.KeyName -eq 'Key1' } | Select-Object -ExpandProperty Value
$storageAccountConnectionString = 'DefaultEndpointsProtocol=https;AccountName='+$storageAccount+';AccountKey='+$accountKey
#========Defining Azure Function Settings========
$AppSettings = @{}
$AppSettings = @{'APPINSIGHTS_INSTRUMENTATIONKEY' = $appInsightsKey;
'AzureWebJobsDashboard' = $storageAccountConnectionString;
'AzureWebJobsStorage' = $storageAccountConnectionString;
'FUNCTIONS_EXTENSION_VERSION' = '~2';
'FUNCTIONS_WORKER_RUNTIME' = 'dotnet';
'WEBSITE_CONTENTAZUREFILECONNECTIONSTRING' = $storageAccountConnectionString;
'WEBSITE_CONTENTSHARE' = $storageAccount;}
Set-AzWebApp -Name $functionAppName -ResourceGroupName $resourceGroupName -AppSettings $AppSettings
The best way to do this from PowerShell is to use an ARM template, rather than trying to create each resource individually. You can find an example template here. It also hooks the app up to GitHub, but you can leave out that part if you just want an empty app.
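Deploying such a template from PowerShell is then a single call; a sketch, where the template file name and parameter name are placeholders for whatever the chosen template defines:
# Hypothetical template path and parameter; adjust to the template actually used
New-AzResourceGroupDeployment -ResourceGroupName "MyRgName" `
    -TemplateFile ".\azuredeploy.json" `
    -TemplateParameterObject @{ appName = "MyFunctionName" }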
I've written the following PS script to delete log files from specific server paths. I'm a novice to PS, and I'm getting some errors with a few of the functions I have written in this script:
#* FileName: FileCleaner.ps1
#Clear the screen
Clear
#Read XML Config File to get settings
[xml]$configfile = Get-Content "C:\Users\pmcma\Documents\Projects\Replace FileCleaner with PowerShell Script\FileCleaner.config.xml"
#Declare and set variables from Config values
$hostServer = $configfile.Settings.HostServer
$dirs = @($configfile.Settings.DirectoryName.Split(",").Trim())
$scanSubDirectories = $configfile.Settings.ScanSubDirectories
$deleteAllFiles = $configfile.Settings.deleteAllFiles
$fileTypesToDelete = @($configfile.Settings.FileTypesToDelete.Split(";").Trim())
$liveSiteLogs = $configfile.Settings.LiveSiteLogs
$fileExclusions = @($configfile.Settings.FileExclusions.Split(";").Trim())
$retentionPeriod = $configfile.Settings.RetentionPeriod
$AICLogs = $configfile.Settings.AICLogs
$AICLogsRententionPeriod = $configfile.Settings.AICLogsRententionPeriod
$fileCleanerLogs = $configfile.Settings.FileCleanerLogs
$fileCleanerLogsRententionPeriod = $configfile.Settings.FileCleanerLogsRententionPeriod
#Setup FileCleaner output success logfiles
$successLogfile = $configfile.Settings.SuccessOutputLogfile
$dirName = [io.path]::GetDirectoryName($successLogfile)
$filename = [io.path]::GetFileNameWithoutExtension($successLogfile)
$ext = [io.path]::GetExtension($successLogfile)
$successLogfile = "$dirName\$filename$(get-date -Format yyyy-MM-dd)$ext"
#Setup FileCleaner output error logfiles
$errorLogfile = $configfile.Settings.ErrorOutputLogfile
$dirName = [io.path]::GetDirectoryName($errorLogfile)
$filename = [io.path]::GetFileNameWithoutExtension($errorLogfile)
$ext = [io.path]::GetExtension($errorLogfile)
$errorLogfile = "$dirName\$filename$(get-date -Format yyyy-MM-dd)$ext"
#Setup Retention Period
$LastWrite = (Get-Date).AddDays(-$retentionPeriod)#.ToString("d")
$AICLastWrite = (Get-Date).AddDays(-$AICLogsRententionPeriod)#.ToString("d")
$fileCleanerLastWrite = (Get-Date).AddDays(-$fileCleanerLogsRententionPeriod)
#EMAIL SETTINGS
$smtpServer = $configfile.Settings.SMTPServer
$emailFrom = $configfile.Settings.EmailFrom
$emailTo = $configfile.Settings.EmailTo
$emailSubject = $configfile.Settings.EmailSubject
#Update the email subject to display the Host Server value
$emailSubject -replace "HostServer", $hostServer
$countUnaccessibleUNCPaths = 0
#Check Logfiles exists, if not create them
if(!(Test-Path -Path $successLogfile))
{
New-Item -Path $successLogfile -ItemType file
}
if(!(Test-Path -Path $errorLogfile))
{
New-Item -Path $errorLogfile -ItemType file
}
foreach ($dir in $dirs)
{
#needs a check to determine if the server/UNC path is accessible. If it fails to connect, it needs to move on to the next UNC share, but a flag needs to
#be generated to alert us to investigate why the UNC share was not accessible during the job run.
If(Test-Path -Path $dir)
{
#write to output logfile Directory info
$Msg = Write-Output "$(Get-Date -UFormat "%D / %T") - Accessing: $dir"
$Msg | out-file $successLogfile
If ($scanSubDirectories -eq "True")
{
If ($deleteAllFiles -eq "True")
{
#ScanSubDirectories and delete all files older than the $retentionPeriod, include Sub-Directories / also forces the deletion of any hidden files
$logFiles = Get-ChildItem -Path $dir -Force -Recurse -Exclude $fileExclusions[0],$fileExclusions[1] | Where { $_.LastWriteTime -le "$LastWrite" }
DeleteLogFiles($logFiles)
#foreach($logFile in $logFiles)
#{
# if($logFile -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $logFile")"
# $Msg | out-file $successLogfile -append
# Remove-Item $logFile.FullName -Force
# }
#}
}
Else
{
#"ScanSubDirectories but only delete specified file types."
$logFiles = Get-Childitem $dir -Include $fileTypesToDelete[0],$fileTypesToDelete[1],$fileTypesToDelete[2], $liveSiteLogs -Recurse -Exclude $fileExclusions[0],$fileExclusions[1] | Where {$_.LastWriteTime -le "$LastWrite"}
DeleteLogFiles($logFiles)
#foreach($logFile in $logFiles)
#{
# if($logFile -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $logFile")"
# $Msg | out-file $successLogfile -append
# Remove-Item $logFile.FullName -Force
# }
#}
}
}
Else
{
#Only delete files in top level Directory
If ($deleteAllFiles -eq "True")
{
$logFiles = Get-ChildItem -Path $dir -Force -Exclude $fileExclusions[0],$fileExclusions[1] | Where { $_.LastWriteTime -le "$LastWrite" }
DeleteLogFiles($logFiles)
#foreach($logFile in $logFiles)
#{
# if($logFile -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $logFile")"
# $Msg | out-file $successLogfile -append
# Remove-Item $logFile.FullName -Force
# }
#}
}
Else
{
$logFiles = Get-Childitem $dir -Include $fileTypesToDelete[0],$fileTypesToDelete[1],$fileTypesToDelete[2], $liveSiteLogs -Exclude $fileExclusions[0],$fileExclusions[1] | Where {$_.LastWriteTime -le "$LastWrite"}
DeleteLogFiles($logFiles)
#foreach($logFile in $logFiles)
#{
# if($logFile -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $logFile")"
# $Msg | out-file $successLogfile -append
# Remove-Item $logFile.FullName -Force
# }
#}
}
}
}
Else
{
$countUnaccessibleUNCPaths++
#server/the UNC Path is inaccessible
$Msg = Write-Output "$(Get-Date -UFormat "%D / %T") Unable to access $dir."
$Msg | out-file $errorLogfile -append
}
# Call the function to Delete the AIC XML Logfiles
DeleteAICXMLLogs $dir
}
#If any of the directories were inaccessible send an email to alert the team
if($countUnaccessibleUNCPaths.count -gt 0)
{
# Call the function to send the email
SendEmail $emailSubject $emailFrom $emailTo
}
#Only keep 2 weeks worth of the FileCleaner App logs for reference purposes
If(Test-Path -Path $fileCleanerLogs)
{
#write to output logfile Directory info
$Msg = Write-Output "$(Get-Date -UFormat "%D / %T") - Accessing: $fileCleanerLogs"
$Msg | out-file $successLogfile
$fileCleanerLogs = Get-Childitem $fileCleanerLogs -Recurse | Where {$_.LastWriteTime -le "$fileCleanerLastWrite"}
DeleteLogFiles($fileCleanerLogs)
#foreach($fileCleanerLog in $fileCleanerLogs)
#{
# if($fileCleanerLog -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $fileCleanerLog")"
# $Msg | out-file $successLogfile -append
# Remove-Item $fileCleanerLog.FullName -Force
# }
#}
}
Function DeleteLogFiles($logFiles)
{
foreach($logFile in $logFiles)
{
if($logFile -ne $null)
{
$Msg = Write-Output "$("Deleting File $logFile")"
$Msg | out-file $successLogfile -append
Remove-Item $logFile.FullName -Force
}
}
}
Function DeleteAICXMLLogs($dir)
{
#Split the UNC path $dir to retrieve the server value
$parentpath = "\\" + [string]::join("\",$dir.Split("\")[2])
#test access to the \\server\D$\DebugXML path
If(Test-Path -Path $parentpath$AICLogs)
{
$Msg = Write-Output "$(Get-Date -UFormat "%D / %T") - Accessing: $parentpath$AICLogs"
$Msg | out-file $successLogfile
#Concatenate server value to $AICLogs to delete all xml logs in \\server\D$\DebugXML with a retention period of 30 days
$XMLlogFiles = Get-ChildItem -Path $parentpath$AICLogs -Force -Include $fileTypesToDelete[3] -Recurse -Exclude $fileExclusions[0],$fileExclusions[1] | Where { $_.LastWriteTime -le "$AICLastWrite" }
#get each file and add the filename to be deleted to the successLogfile before deleting the file
DeleteLogFiles($XMLlogFiles)
#foreach($XMLlogFile in $XMLlogFiles)
#{
# if($XMLlogFile -ne $null)
# {
# $Msg = Write-Output "$("Deleting File $XMLlogFile")"
# $Msg | out-file $successLogfile -append
# Remove-Item $XMLlogFile.FullName -Force
# }
#}
}
Else
{
$Msg = Write-Output "$("$parentpath$AICLogs does not exist.")"
$Msg | out-file $successLogfile -append
}
}
Function SendEmail($emailSubject, $emailFrom, $emailTo)
{
$MailMessage = New-Object System.Net.Mail.MailMessage
$SMTPClient = New-Object System.Net.Mail.smtpClient
$SMTPClient.host = $smtpServer
$Recipient = New-Object System.Net.Mail.MailAddress($emailTo, "Recipient")
$Sender = New-Object System.Net.Mail.MailAddress($emailFrom, "Sender")
$MailMessage.Sender = $Sender
$MailMessage.From = $Sender
$MailMessage.Subject = $emailSubject
$MailMessage.Body = #"
This email was generated because the FileCleaner script was unable to access some UNC Paths, please refer to $errorLogfile for more information.
Please inform the Team if you plan to resolve this.
This is an automated email please do not respond.
"#
$SMTPClient.Send($MailMessage)
}
when debugging I'm getting these errors:
DeleteAICXMLLogs : The term 'DeleteAICXMLLogs' is not recognized as
the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify
that the path is correct and try again. At
C:\Users\pmcma\Documents\Projects\Replace FileCleaner with PowerShell
Script\FileCleaner.ps1:158 char:5
+ DeleteAICXMLLogs $dir
+ ~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (DeleteAICXMLLogs:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
SendEmail : The term 'SendEmail' is not recognized as the name of a
cmdlet, function, script file, or operable program. Check the spelling
of the name, or if a path was included, verify that the path is
correct and try again. At C:\Users\pmcma\Documents\Projects\Replace
FileCleaner with PowerShell Script\FileCleaner.ps1:164 char:5
+ SendEmail $emailSubject $emailFrom $emailTo
+ ~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (SendEmail:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
DeleteLogFiles : The term 'DeleteLogFiles' is not recognized as the
name of a cmdlet, function, script file, or operable program. Check
the spelling of the name, or if a path was included, verify that the
path is correct and try again. At
C:\Users\pmcma\Documents\Projects\Replace FileCleaner with PowerShell
Script\FileCleaner.ps1:175 char:5
+ DeleteLogFiles($fileCleanerLogs)
+ ~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (DeleteLogFiles:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
I don't see anything wrong with how I'm declaring the functions or calling them. Any ideas why this script is failing?
PowerShell scripts are read from top to bottom, so you can't reference a function before it has been defined; most probably that is why you are receiving these errors.
Try moving your function definition blocks above the point where you call them.
Alternatively, you can give a function global scope by prefacing the function name with the global: scope modifier, like this:
function global:test ($x, $y)
{
$x * $y
}
I've had this happen as well. Try placing the functions before the business logic. This is a script, not compiled code, so the functions haven't been declared yet at the point where you call them.
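In other words, the script skeleton should end up looking something like this (a minimal sketch using the function names from the question):
# Function definitions first
Function DeleteLogFiles($logFiles) { <# body as in the script above #> }
Function DeleteAICXMLLogs($dir) { <# body as in the script above #> }
Function SendEmail($emailSubject, $emailFrom, $emailTo) { <# body as in the script above #> }
# ...then the main logic that calls them
foreach ($dir in $dirs)
{
    # directory processing as above
    DeleteAICXMLLogs $dir
}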