Log Analytics: adding custom dimensions possible via PowerShell? - json

Hello, I am trying to add a custom dimension, or something similar, called properties. Below I added a screenshot of something similar found online.
Here is the code I used to try and create this:
$JSON = @{
    Type = 'SQL'
    Subscriptionname = "123"
    property = @{
        SQLServerName = "myServer";
        DatabaseName = "myDatabase";
    }
}
$json2 = $JSON | ConvertTo-Json
# $json2
# Submit the data to the API endpoint
Post-LogAnalyticsData -customerId $customerId -sharedKey $sharedKey -body ([System.Text.Encoding]::UTF8.GetBytes($json2)) -logType "MyRecordTypetoo"
But the results came out like below.
Does anyone have any ideas to get this working?

Please follow the below ways to fix the issue:
Way 1
Using the keithbabinec/AzurePowerShellUtilityFunctions GitHub repo:
# Import the AzurePowerShellUtilityFunctions module from the above GitHub repo
Import-Module AzurePowerShellUtilityFunctions.psd1
# Add custom properties using event telemetry
Send-AppInsightsEventTelemetry -InstrumentationKey '<Instrumentation Key of AI>' -EventName '<Event Name>' -CustomProperties @{ '<Custom Property>' = '<Property Value>' }
Way 2
Using PSModule
For adding custom dimensions in Application Insights, there is a PSModule to add the properties. Refer here for the detailed steps.
Way 3
Using the REST API
I hope you already tried the same: calling the REST API as mentioned in the above PowerShell module.
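Independently of the options above, a common cause of the flattening seen in the original attempt is ConvertTo-Json's default serialization depth (2 in Windows PowerShell): anything nested deeper gets stringified. A minimal sketch mirroring the question's hashtable (the values are illustrative) that passes -Depth explicitly:

```powershell
# Nested hashtable with a 'property' sub-object, mirroring the question.
$record = @{
    Type             = 'SQL'
    SubscriptionName = '123'
    property         = @{
        SQLServerName = 'myServer'
        DatabaseName  = 'myDatabase'
    }
}

# Pass -Depth explicitly so nested objects are serialized as JSON
# objects rather than being flattened to strings.
$json = $record | ConvertTo-Json -Depth 5
$json
```

The resulting JSON keeps property as a nested object, which is what the custom-dimensions payload needs.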
Result

Related

definition inside a property in invoke-restmethod (JSON Body)

I'm pretty stuck and can't find anything about it on the internet. I'm also not sure how to describe what I'm looking for, so maybe someone can help me.
I've got some code to create a ticket in TopDesk through its API using Invoke-RestMethod in PowerShell.
For the request field in TopDesk, I need some output stored in a variable, but if I want to use a variable in the PowerShell command, I need to define the JSON body using @{} | ConvertTo-Json (found that somewhere on the internet).
Now the parameter I need to pass through has to have a definition: I need to state whether the value is an email or a name.
$json = @{
    "callerLookup" = "{ email : email@domain.com }"
} | ConvertTo-Json
Now the thing is, TopDesk doesn't see the "{ email : email@domain.com }" string as a correct value.
Before, I just used the following (which works, but can't use variables):
$body = '{"email": "automation@rid-utrecht.nl"}'
I hope I described my problem clearly enough and hope that someone can help me.
Thanks in advance.
Kind regards,
Damian
For ConvertTo-Json to produce the serialized { "property" : "value" } syntax, you must pass it an object that has a property called property and an associated value equal to value. You can easily create this scenario with the [pscustomobject] accelerator.
$json = @{
    callerLookup = [pscustomobject]@{ email = 'email@domain.com' }
} | ConvertTo-Json
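Since the original goal was to use a variable in the body, here is a short sketch of the same pattern with the address coming from a variable (the $callerEmail value is illustrative):

```powershell
# The caller's address comes from a variable, as the question wanted.
$callerEmail = 'automation@rid-utrecht.nl'

# The pscustomobject becomes a nested JSON object under callerLookup.
$json = @{
    callerLookup = [pscustomobject]@{ email = $callerEmail }
} | ConvertTo-Json

$json
```

The serialized body now contains callerLookup as a real nested object rather than a literal string.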

List Agent real names from Get-CsCallQueue cmdlet

I run Get-CsCallQueue | Select-Object -Property Name,Agents, but I want to see the real names of the agents. Instead I get something that looks like hashes(?).
How can I see the names?
Get-CsCallQueue | Select-Object -Property Name,Agents
Name Agents
---- ------
CQ1 {adfe5681-ebc8-xxx-xxxx-........, OptIn}
CQ2 {adfe5681-ebc8-xxx-xxxx-......., OptIn}
CQ3 {baae77b8-5ace-xxx-xxxx-......, OptOut}
Is this Skype for Business Online (SFBO) or on-prem? You need to match up the agent GUID with the agent name, and you'll need different cmdlets depending on your answer. Here's an example of how to do this using SFBO:
$queue = Get-CsCallQueue -NameFilter "<queue name here>"
$agents = $queue.Agents
foreach ($agent in $agents) {
    $user = $agent.ObjectId | Get-CsOnlineUser
    $agent | Add-Member -NotePropertyName Name -NotePropertyValue $user.Alias
}
$agents | Select-Object Name, OptIn
Thanks,
Jason
I am working on a script that does the heavy lifting for you.
The module is TeamsFunctions on the PSGallery.
The command is Get-TeamsCallQueue. I have surfaced all friendly names for Get/New/Set/Remove for call queues (I still need to finish testing them, so handle with care :), should be finished in the coming weeks).
There is also Find-AzureAdUser in my module, so that you can get the object by feeding it the UPN instead of the ObjectId.
Hope that helps :)

PowerShell JSON object manipulation

I am hitting a REST API and collected a gnarly block of JSON. I'm running ConvertFrom-Json on that to get a PowerShell object which I would like to manipulate. Essentially I need to prune a number of field/values.
It's no issue to 'get' the fields I want to remove from the object, as I can just drill down to the field and collect the value; that's easy. Where I am stuck is how to trim that field off the posh object. Would appreciate any assistance. Thanks.
Example:
$sample_json = @"
{
    "fields": {
        "field_one": 1,
        "field_two": 2,
        "field_three": "three",
        "field_four": "remove_me",
        "field_five": 5
    }
}
"@
Clear-Host
$json_object = ConvertFrom-Json -InputObject $sample_json
$json_object
Gives:
fields
------
@{field_one=1; field_two=2; field_three=three; field_four=remove_me; field_five=5}
So the question is how can I remove "field_four" key, and it's value, from $json_object ? Apologies if this is crazy simple; I'm a bit out of touch with Powershell these last few years.
You can remove "field_four" with the Remove method from PSObject.Properties:
$json_object.fields.PSObject.Properties.Remove("field_four")
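A short round trip, using the sample JSON from the question, to confirm the property is gone after re-serializing:

```powershell
$sample_json = @"
{
    "fields": {
        "field_one": 1,
        "field_four": "remove_me"
    }
}
"@

$json_object = ConvertFrom-Json -InputObject $sample_json

# Drop the unwanted property in place, then serialize back to JSON.
$json_object.fields.PSObject.Properties.Remove("field_four")
$json_object | ConvertTo-Json -Depth 5
```

The output JSON contains field_one but no trace of field_four.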

Making a script read a CSV and output an answer

I am looking to add a line (or lines) to a PowerShell script. I want the script to check whether something exists in a CSV and give a true/false response.
Basically I have a script to remove user access from O365 and AD, including mailbox changes.
My company also uses a lot of external portals which need removing manually.
I would like it to check the $EmailAddress input against CSVs (i.e. DomainHostAccess.csv, WebsiteHostAccess.csv, SupplierAccess.csv); then, if their name is in a list, it will output something similar to:
DomainHostAccess | True
WebsiteHostAccess | False
SupplierAccess | True
That way we know that we need to manually log into these services and remove the accounts.
I have looked at the posts on here already and couldn't find anything suitable. I am fairly new to PowerShell and this is a little advanced for me, so I would appreciate any help that can be given.
You could try something like this:
$UserList = Import-Csv -Path C:\EmailAddress.CSV
$UserData = Import-Csv -Path C:\DomainHostAccess.CSV
foreach ($Email in $UserList) {
    $userMatch = $UserData | Where-Object { $_.Name -like $Email.Name }
    if ($userMatch) {
        Write-Host $Email.Name "Domain Access True"
    }
    else {
        Write-Host $Email.Name "Domain Access False"
    }
}
You would need to have headers on your CSVs for this check to work.
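To get the exact "Name | True/False" summary across all three files, the same check can be extended to a loop over the CSVs. A minimal sketch, where the file paths, the Name column header, and the sample address are assumptions:

```powershell
# Map each portal name to its CSV; paths and the 'Name' header are assumed.
$lists = @{
    DomainHostAccess  = 'C:\DomainHostAccess.csv'
    WebsiteHostAccess = 'C:\WebsiteHostAccess.csv'
    SupplierAccess    = 'C:\SupplierAccess.csv'
}

$EmailAddress = 'jane.doe@contoso.com'   # illustrative input

foreach ($entry in $lists.GetEnumerator()) {
    # -contains gives a clean $true/$false membership test.
    $found = (Import-Csv -Path $entry.Value).Name -contains $EmailAddress
    "{0} | {1}" -f $entry.Key, $found
}
```

Each iteration prints one "PortalName | True" or "PortalName | False" line, matching the desired output.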

Writing a CSV file to HDFS: how to filter the CSV with a Flume static interceptor before writing

I used the configuration below to filter my CSV file, but it is unable to filter.
I used the static interceptor for filtering; do I need to use another interceptor? Please suggest what I need to write in the config file below.
A sample of my CSV file is below.
Id,Name,First,Last,Display,Job,Department,OfficeNo,OfficePh,MobilePh,Fax,Address,City,State,ZIP,Country
1,chris@contoso.com,Chris,Green,Chris Green,IT Manager,Information Technology,123451,123-555-1211,123-555-6641,9821,1 Microsoft way,Redmond,Wa,98052,United States
2,ben@contoso.com,Ben,Andrews,Ben Andrews,IT Manager,Information Technology,123452,123-555-1212,123-555-6642,9822,1 Microsoft way,Redmond,Wa,98052,United States
3,david@contoso.com,David,Longmuir,David Longmuir,IT Manager,Information Technology,123453,123-555-1213,123-555-6643,9823,1 Microsoft way,Redmond,Wa,98052,United States
4,cynthia@contoso.com,Cynthia,Carey,Cynthia Carey,IT Manager,Information Technology,123454,123-555-1214,123-555-6644,9824,1 Microsoft way,Redmond,Wa,98052,United States
My flume-conf.properties file is listed below. I am expecting events with Id=1 to go through ch1, Id=2 to go through ch2, and the other Ids (3, 4) to go through the default channel.
Please help me out with doing this.
a1.sources=src1
a1.channels=ch1 ch2
a1.sinks=s1 s2
a1.sources.src1.type=exec
a1.sources.src1.command=tail -F /home/manish/TwitterExample /Import_User_Sample_en.csv
a1.channels.ch1.type=memory
a1.channels.ch1.capacity=10000
a1.channels.ch1.transactionCapacity=100
a1.channels.ch2.type = memory
a1.channels.ch2.capacity = 10000
a1.channels.ch2.transactionCapacity = 100
The static interceptors are as follows:
a1.sources.src1.interceptors=i1
a1.sources.src1.interceptors.i1.type=static
a1.sources.src1.interceptor.i1.key=Id
a1.sources.src1.interceptor.i1.value=1
a1.sources.src1.interceptor.i1.preserveExisting=false
a1.sources.src1.interceptor=i2
a1.sources.src1.interceptor.i2.type=static
a1.sources.src1.interceptor.i2.key=Id
a1.sources.src1.interceptor.i2.value=2
a1.sources.src1.interceptor.i2.preserveExisting=false
a1.sources.src1.fileHeader=true
a1.sources.src1.selector.type=multiplexing
a1.sources.src1.selector.header=Id
a1.sources.src1.selector.mapping.1=ch1
a1.sources.src1.selector.mapping.2 =ch2
a1.sources.src1.selector.default = ch2
a1.sinks.s1.type=hdfs
a1.sinks.s1.hdfs.path=hdfs://kdp.ambarikdp1.com:8020/user/data/twitter
a1.sinks.s1.hdfs.fileType=DataStream
a1.sinks.s1.hdfs.rollCount=0
a1.sinks.s1.hdfs.rollSize=0
a1.sinks.s1.hdfs.rollInterval=300
a1.sinks.s1.hdfs.serializer=HEADER_AND_TEXT
a1.sinks.s2.type = hdfs
a1.sinks.s2.hdfs.path = hdfs://kdp.ambarikdp1.com:8020/user/data/t2
a1.sinks.s2.hdfs.fileType = DataStream
a1.sinks.s2.hdfs.rollCount = 0
a1.sinks.s2.hdfs.rollSize = 0
a1.sinks.s2.hdfs.rollInterval = 300
a1.sinks.s2.hdfs.serializer=HEADER_AND_TEXT
a1.sources.src1.channels=ch1 ch2
a1.sinks.s1.channel=ch1
a1.sinks.s2.channel = ch2
At the very least, the way you are using multiple interceptors is wrong. Interceptors must be added as a list, this way:
a1.sources.src1.interceptors = i1 i2
Then, you can configure each one of them:
a1.sources.src1.interceptors.i1.type = ...
a1.sources.src1.interceptors.i1.other = ...
...
a1.sources.src1.interceptors.i2.type = ...
a1.sources.src1.interceptors.i2.other = ...
...
That being said, I think the static interceptor is not what you need, since AFAIK it always adds the same static header to all the events. I mean, all the events will have the same header name and value, independently of the "Id" field.
Please have a look at How to use regex_extractor selector and multiplexing interceptor together in flume?, i.e. use the Regex Extractor Interceptor instead.
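For reference, a regex-extractor interceptor block for this use case would look roughly like the sketch below. The regex assumes the Id is the first comma-separated field of each line; adjust it to your data:

```
# Extract the leading numeric field into an 'Id' header,
# which the multiplexing selector can then route on.
a1.sources.src1.interceptors = i1
a1.sources.src1.interceptors.i1.type = regex_extractor
a1.sources.src1.interceptors.i1.regex = ^(\\d+),
a1.sources.src1.interceptors.i1.serializers = s1
a1.sources.src1.interceptors.i1.serializers.s1.name = Id
```

With the Id header populated per event, the existing selector.mapping entries for ch1/ch2 and the default channel should behave as intended.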