How to retrieve Azure IoT Hub storage account name - JSON

I am writing a PowerShell script that creates an IoT Hub with a storage account and a Stream Analytics job.
To update the JSON file for the Stream Analytics job, I need to retrieve the name of the storage account that has just been created. Unfortunately, I could not find an AzureRM function that retrieves the storage account name.
Any suggestions on how to do that?
My current script receives it as input from the user, but I want the script to be fully automated, with no need for user input.

Got it; I just used:

# Returns the storage account name for the resource group (assumes the group contains a single storage account; with several, this returns an array of names)
$storageAccountName = (Get-AzureRmStorageAccount -ResourceGroupName $IotHubResourceGroupName).StorageAccountName

Related

Sending email with file attachment using Google Cloud Function and Pub/Sub

I have a Pub/Sub topic where I publish messages, and that topic is subscribed to by a Google Cloud Function which sends the email to the customer using the Mailgun API.
This works fine so far. Now I have a new requirement: I have to send the email with a file attachment.
I can easily do this with an HTTP-triggered Cloud Function, but how do I do it with a Pub/Sub-triggered one?
Note: the file size could be more than 10 MB in my case.
First of all, if I understand your goal correctly, Pub/Sub alone will never be enough, since the actual work happens in the function.
To achieve this you need a Cloud Function with a Pub/Sub trigger.
Once the function is invoked, you need to either fetch the attachment from a Cloud Storage bucket or download it from another API/source you have. You'll have to download it into the /tmp directory, which is used to store temporary files (and is the only writable part of the function's filesystem).
Once that is done, you can use Mailgun or any other tool of your choice. Remember that the attachment is inside the /tmp directory, not ./
Using Mailgun or any other method does not change the overall approach; for example, if you were to use the Gmail API instead, you would only swap it in for the final step.
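A minimal sketch of that flow in Node.js: since the attachment can be larger than what you would want to push through a Pub/Sub message, the message carries only a reference to the file (bucket and object name). The Mailgun domain, environment variable, and payload fields are hypothetical placeholders:

const {Storage} = require('@google-cloud/storage');
const path = require('path');
// Mailgun credentials are assumed to be provided via environment variables
const mailgun = require('mailgun-js')({
  apiKey: process.env.MAILGUN_API_KEY,
  domain: 'mg.example.com' // hypothetical Mailgun domain
});

// Background function with a Pub/Sub trigger; message.data is base64-encoded
exports.sendEmailWithAttachment = async (message) => {
  const payload = JSON.parse(Buffer.from(message.data, 'base64').toString());

  // /tmp is the only writable directory inside the function
  const localPath = path.join('/tmp', path.basename(payload.file));
  await new Storage().bucket(payload.bucket).file(payload.file)
    .download({destination: localPath});

  // mailgun-js accepts a file path for the attachment
  await mailgun.messages().send({
    from: 'noreply@mg.example.com',
    to: payload.to,
    subject: payload.subject,
    text: payload.body,
    attachment: localPath
  });
};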

Date-based folder creation in Google Cloud Storage

Use case:
I have to store request/response objects in Google Cloud Storage on a daily basis, and I wanted to create a folder per day (bucket/year/month/day format) and store all the objects for the current day under it.
My typical flow is like below:
JSON message to Pub/Sub --> Cloud Function (Python) --> Google Cloud Storage, on a daily basis.
Query:
Since the Cloud Function can trigger in parallel for each event in Pub/Sub (millions of messages a day) and might create duplicate folders in GCS, is there any way to synchronise folder creation before creating the object in GCS for a given day?
In Google Cloud Storage, the object name includes the full path (the namespace is flat).
For example, a hypothetical object is named "abc/file.txt" in the bucket "your-bucket", rather than being a file "file.txt" inside a folder "abc".
That said, since folders don't actually exist in Cloud Storage, you don't have to worry about creating folders, or about creating them concurrently; you only need to avoid creating objects with the same name.
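Here is a minimal sketch of the write path (shown in Node.js; the same idea applies to the Python client the question mentions). The date "folder" is simply part of the object name, and the bucket name is a hypothetical placeholder:

const {Storage} = require('@google-cloud/storage');

// Pub/Sub-triggered function: no folder is ever created, the date prefix is
// just part of the object name, so parallel invocations cannot conflict as
// long as each object name is unique
exports.storeMessage = async (message) => {
  const day = new Date().toISOString().slice(0, 10).replace(/-/g, '/'); // e.g. 2019/07/31
  const name = `${day}/${Date.now()}-${Math.random().toString(36).slice(2)}.json`;

  await new Storage().bucket('your-bucket').file(name)
    .save(Buffer.from(message.data, 'base64'));
};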

Google Data Studio cannot take a parameter from the URL, but could I use a Google Apps Script Data Studio community connector to pass the parameter value?

I have a Google Data Studio report with an "account IDs filter"; for example, I may have 100 client accounts.
I want to share this report with these 100 accounts, but each account should only see its own report.
However, Google Data Studio does not take parameters from the URL, so I cannot pass an account ID value into the report URL to filter the report accordingly.
I have a thought: what if I use Google Apps Script to create a Data Studio community connector, and use this connector as the data source for my Data Studio report?
Then I will share this report with my 100 client accounts.
Each client account will access this report, whose data source is the Apps Script connector. The connector would run under that client's Google account and do the following: authenticate the account, determine which client it is based on the Google account, and fetch only that client's data as the data source for the report. This way, each client would get a report for himself/herself.
Would that work? Does anyone have resources or code to share for this problem and this solution?
In your connector code, use Session.getEffectiveUser() to get the user's identity and filter your data by it.
Using your connector, create a data source and enforce the viewer's credentials.
Create a dashboard from that data source.
When your clients view the dashboard, they will have to authorize the connector the first time. After that, they will only see the data that is applicable to them.
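As a rough illustration, the filtering step inside the connector's getData() could look like the sketch below; lookupAccountId() and fetchRowsForAccount() are hypothetical helpers standing in for however your backend maps users to accounts and serves data:

function getData(request) {
  // With viewer's credentials enforced, this is the person viewing the report
  var viewerEmail = Session.getEffectiveUser().getEmail();

  // Map the viewer to their client account (hypothetical helper)
  var accountId = lookupAccountId(viewerEmail);

  // Fetch only that account's rows (hypothetical helper) and map them to the schema
  var rows = fetchRowsForAccount(accountId).map(function (record) {
    return {values: [record.date, record.metric]};
  });

  return {
    schema: getSchema(request).schema,
    rows: rows
  };
}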

How to reset a user's password in Google Apps Script using the Admin SDK?

I have a list of newly created users in my domain. I want to reset all of their passwords, but I don't want to do it through the admin console. How can I do this using the Admin SDK and Google Apps Script?
My idea is to get the list of users whose passwords I need to reset, then assign a random string to each of those email IDs and set it as the new password.
Is this possible? If not, is there a better approach? (I have tried GAM, but I don't want to go through CSV files every time.)
Resetting a password is possible via the Directory API within the Admin SDK; therefore, you would be able to do this via Apps Script.
Depending on how comfortable you are with coding, you might instead want to use Google Apps Manager (GAM) to complete this and other tasks; GAM has a command to change a password. These commands can be run in bulk, either by reading a CSV of all users (which can also be created with GAM) or by writing a small script to go through a group, OU, etc. and reset only those users.
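In Apps Script itself, a minimal sketch could look like the following, assuming the Admin SDK Directory API is enabled as an advanced service for the script; the user list is a hypothetical example:

function resetPasswords() {
  // Hypothetical list of newly created users; this could also come from a sheet
  var users = ['new.user1@example.com', 'new.user2@example.com'];

  users.forEach(function (email) {
    // Generate a random temporary password
    var newPassword = Utilities.getUuid().slice(0, 12);

    AdminDirectory.Users.update({
      password: newPassword,
      changePasswordAtNextLogin: true // force the user to pick their own password
    }, email);

    Logger.log('Reset password for ' + email);
  });
}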

Add-on one-time configuration parameters

I'm working on a script to sync email from Gmail to another system. One thing the script will need is configuration parameters from the user for the other system (username/password).
The configuration should only need to be done once per user, and the script will then be set up to run at regular intervals (probably hourly).
How can I prompt the user for the username and password when they install the add-on, and store them so the script can use those values?
Take a look in the docs at the Properties Service: store the values in user properties, which are kept per user.
Note that add-ons don't yet allow time-driven triggers, so you can't yet do the 'regular intervals' part with add-ons.
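A minimal sketch of the storage side, assuming the credentials have been collected from a dialog in your add-on UI; the property keys are illustrative:

// Save the credentials once, e.g. from your add-on's configuration dialog
function saveCredentials(username, password) {
  PropertiesService.getUserProperties().setProperties({
    TARGET_USERNAME: username,
    TARGET_PASSWORD: password // user properties are scoped to the current user
  });
}

// Read them back later, e.g. from the function your trigger runs
function getCredentials() {
  var props = PropertiesService.getUserProperties();
  return {
    username: props.getProperty('TARGET_USERNAME'),
    password: props.getProperty('TARGET_PASSWORD')
  };
}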