Azure Monitor: How to fetch UsedCapacity for a particular FileShare? - azure-files

I'm trying to get the UsedCapacity metric from the Azure Monitor REST API for a particular file share.
But all I get is the capacity summarized for the whole account. Does anyone know the trick?
This request delivers the UsedCapacity for the entire storage account:
GET https://management.azure.com/subscriptions/<MYSUBSCRIPTION>/resourceGroups/<MYRG>/providers/Microsoft.Storage/storageAccounts/<MYSTORAGEACCOUNT>/fileServices/default/providers/Microsoft.Insights/metrics?api-version=2018-01-01

You can get the share usage by using the Get Share Stats API: Get Share Stats (FileREST API) - Azure Files | Microsoft Docs
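In case it helps, here is a minimal sketch of that call, assuming you already have a SAS token with read access on the share (the account name, share name, and token below are placeholders); the share's usage comes back in the XML body:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ShareStats {
    public static void main(String[] args) throws Exception {
        // Placeholders: substitute your storage account, share name and a SAS token
        // that grants read access on the share.
        String account = "<MYSTORAGEACCOUNT>";
        String share = "<MYSHARE>";
        String sasToken = "<SAS-TOKEN>"; // e.g. "sv=...&sig=..." generated in the portal

        // Get Share Stats: restype=share&comp=stats returns an XML ShareStats element
        // containing the share's usage (ShareUsageBytes in recent service versions).
        String url = "https://" + account + ".file.core.windows.net/" + share
                + "?restype=share&comp=stats&" + sasToken;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(url))
                .header("x-ms-version", "2019-02-02")
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // XML with the per-share used capacity
    }
}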

Related

How to get the number of PCF instances running in Java code?

I have an app that uses Spring REST and is deployed on PCF. Now, inside the code, I have to get the number of PCF instances currently running. Can anyone help?
Before I answer this - why do you want to know? It's an anti-pattern for cloud native apps to know about their peers; they should each be working in total isolation.
You can discover this by looking up application details by GUID in the CloudController. You can get your current app's GUID in the VCAP_APPLICATION environment variable.
https://apidocs.cloudfoundry.org/245/apps/get_app_summary.html
In order to hit the CloudController, your app will need to know the system domain of your Cloud Foundry (e.g. api.mycf.com) and credentials that allow it to make that request.
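If you do decide to go that route, here is a rough sketch of the lookup (the API endpoint and the way the OAuth token is obtained are assumptions you would adapt to your foundation; running_instances is part of the app summary response):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class InstanceCount {
    public static void main(String[] args) throws Exception {
        // The app's own GUID is published in VCAP_APPLICATION under "application_id".
        String vcap = System.getenv("VCAP_APPLICATION");
        Matcher m = Pattern.compile("\"application_id\"\\s*:\\s*\"([^\"]+)\"").matcher(vcap);
        if (!m.find()) throw new IllegalStateException("application_id not found in VCAP_APPLICATION");
        String appGuid = m.group(1);

        // Assumptions: you know your CF API endpoint (system domain) and you have an
        // OAuth token with permission to read the app; CF_OAUTH_TOKEN is hypothetical.
        String apiEndpoint = "https://api.mycf.com";
        String token = System.getenv("CF_OAUTH_TOKEN");

        // GET /v2/apps/:guid/summary returns, among other fields, "instances" and "running_instances".
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(apiEndpoint + "/v2/apps/" + appGuid + "/summary"))
                .header("Authorization", "bearer " + token)
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // parse running_instances with your JSON library of choice
    }
}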

Retrieving WSO2 DAS API usage information

I have a custom front end set up and running in Grails. This is linked to WSO2 user creation and subscription, and to the subsequent clicks where users can invoke the published APIs. I also have DAS configured with a custom database (MySQL). I want to keep a graph in my front end that shows API usage. How do I get this information? Is it possible through some REST call, or perhaps by querying the database directly? I have significant scaling needs, so querying directly might incur high latency costs.
The API usage data is collected, analysed, and exposed by DAS. DAS has a REST API as well as a JavaScript API for retrieving the summarised data.
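So rather than querying MySQL directly, your Grails front end could call DAS over HTTP. A very rough sketch, assuming the DAS 3.x Analytics REST API search endpoint (the path, port, credentials, payload field names and the summary table name below are all assumptions to verify against your DAS version's REST API docs):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class DasUsageQuery {
    public static void main(String[] args) throws Exception {
        // Assumptions: DAS management HTTPS port 9443 and basic auth with the default
        // admin credentials; both may well differ in your setup.
        String dasHost = "https://das.example.com:9443";
        String auth = Base64.getEncoder().encodeToString("admin:admin".getBytes());

        // Placeholder table name: whichever summary table your API usage data is published into.
        String body = "{\"tableName\":\"API_USAGE_SUMMARY\",\"query\":\"*:*\",\"start\":0,\"count\":100}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(dasHost + "/analytics/search"))
                .header("Authorization", "Basic " + auth)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // JSON records to feed your usage graph
    }
}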

What does this error mean?

Got the following in logcat while using the Google Fit API for Android.
No live data sources available for Sensory Registration Request{type Data Type{com.google.calories.expended[calories(f)]}
I have registered a listener and the following request to get calories expended by the user:
SensorRequest sensorRequest = new SensorRequest.Builder()
        .setDataType(DataType.TYPE_CALORIES_EXPENDED)
        .setSamplingRate(10, TimeUnit.SECONDS)
        .build();
Are you trying to fetch the data from the Google Fit cloud or from the sensors? If it's the cloud, go for the History API. If it's from the sensors, maybe the device doesn't have built-in sensors to measure calories burned, so try adding sensors via BLE with the help of the Ble API and then make this call.
You should then get a proper response, I believe.
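If the History API turns out to be the right fit, here is a rough sketch of an aggregated calories query using the same (GoogleApiClient-based) Fitness API as in your snippet; the connected client and the one-day window are assumptions:

import java.util.concurrent.TimeUnit;

import android.util.Log;

import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.common.api.ResultCallback;
import com.google.android.gms.fitness.Fitness;
import com.google.android.gms.fitness.data.Bucket;
import com.google.android.gms.fitness.data.DataPoint;
import com.google.android.gms.fitness.data.DataSet;
import com.google.android.gms.fitness.data.DataType;
import com.google.android.gms.fitness.data.Field;
import com.google.android.gms.fitness.request.DataReadRequest;
import com.google.android.gms.fitness.result.DataReadResult;

// Assumes 'client' is an already-connected GoogleApiClient built with Fitness.HISTORY_API
// and the appropriate fitness scopes; call this from the Activity that owns the client.
void readCaloriesForLastDay(GoogleApiClient client) {
    long end = System.currentTimeMillis();
    long start = end - TimeUnit.DAYS.toMillis(1);

    DataReadRequest readRequest = new DataReadRequest.Builder()
            .aggregate(DataType.TYPE_CALORIES_EXPENDED, DataType.AGGREGATE_CALORIES_EXPENDED)
            .bucketByTime(1, TimeUnit.HOURS)                  // one aggregated value per hour
            .setTimeRange(start, end, TimeUnit.MILLISECONDS)
            .build();

    Fitness.HistoryApi.readData(client, readRequest)
            .setResultCallback(new ResultCallback<DataReadResult>() {
                @Override
                public void onResult(DataReadResult result) {
                    for (Bucket bucket : result.getBuckets()) {
                        for (DataSet dataSet : bucket.getDataSets()) {
                            for (DataPoint dp : dataSet.getDataPoints()) {
                                Log.d("Fit", "calories expended: " + dp.getValue(Field.FIELD_CALORIES));
                            }
                        }
                    }
                }
            });
}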

Transform service types when using Bluemix Cloud Integration Service

I have been doing some research about the IBM Bluemix Cloud Integration Service and found the following links:
ftp://public.dhe.ibm.com/cloud/bluemix/cloudintegration/Cloud_Integration_for_Bluemix_User_Guide.pdf
https://www.ng.bluemix.net/docs/services/CloudIntegration/index.html
From what I have read, I have not been able to understand whether it is able to run some kind of "protocol transformation" or if it just publishes a REST or SOAP API.
I mean, imagine for example that I have a full backend publishing everything as SOAP services, but for some reason my apps can only get information through REST APIs. Does the basic connector, or maybe the standard one, perform that kind of integration? Or do I need to put a third-party product (or maybe even DataPower) in between to make that transformation?
Using the Cloud Integration service you can also create a REST API that links to an existing on-premises API (both SOAP and REST). Please take a look here: Creating a REST API that links to an existing on-premises API. You can upload a file that defines the on-premises API (WSDL or Swagger definition).
Please note that currently Cloud Integration cannot retrieve automatically that definition from your on-premises system. It has to be uploaded manually by the user.

Implementing IoT PowerBI table schema

I'm currently implementing an IoT solution that has a bunch of sensors sending information in JSON format through a gateway.
I was reading about doing this on Azure but couldn't quite figure out how the JSON schema and Event Hubs work together to display the info in PowerBI.
Can I create a schema, upload it to PowerBI, and then connect it to my device?
There are multiple sides to this. To start with, IoT ingestion in Azure is done through Event Hubs, as you've mentioned. If your gateway is able to make a RESTful call to the Event Hubs entry point, Event Hubs will receive this data and store it temporarily for the retention period specified. Then Stream Analytics will consume the data from Event Hubs and let you do further processing and divert the data to different outputs. In your case, you can set one of the outputs to be a PowerBI dashboard, which you can authorize with an organizational account (more on that later), and the output will automatically be tied to PowerBI.
The data schema part is interesting: the JSON itself defines the data table schema to be used on the PowerBI side, and it propagates from Event Hubs to Stream Analytics to PowerBI with the first JSON package sent. Once the schema is there, it is fixed, and the rest of the data being streamed in should be in the same format.
If you don't have an organizational account at hand to use with PowerBI, you can register your domain under Azure Active Directory and use that account since it is considered within your org.
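To make the ingestion side concrete, here is a minimal sketch of the RESTful call a gateway could make to the Event Hubs entry point (the namespace, hub name and pre-generated SAS token are placeholders); the flat JSON body is what ends up defining the PowerBI table schema downstream:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SendToEventHub {
    public static void main(String[] args) throws Exception {
        // Placeholders: your Event Hubs namespace, event hub name and a SAS token
        // generated for a policy with Send rights.
        String namespace = "<MYNAMESPACE>";
        String eventHub = "<MYEVENTHUB>";
        String sasToken = "<SAS-TOKEN>"; // full "SharedAccessSignature sr=...&sig=...&se=...&skn=..." string

        // Keep the field set stable: this JSON shape becomes the PowerBI table schema
        // once it flows through Stream Analytics.
        String json = "{\"deviceId\":\"sensor-01\",\"temperature\":21.5,\"timestamp\":\"2016-01-01T00:00:00Z\"}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://" + namespace + ".servicebus.windows.net/" + eventHub + "/messages"))
                .header("Authorization", sasToken)
                // Content type as documented for the Event Hubs "Send Event" REST operation.
                .header("Content-Type", "application/atom+xml;type=entry;charset=utf-8")
                .POST(HttpRequest.BodyPublishers.ofString(json))
                .build();

        HttpResponse<Void> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.discarding());
        System.out.println("Event Hubs returned HTTP " + response.statusCode()); // 201 on success
    }
}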
There may be a way of altering the schema afterwards using the PowerBI REST API. Kindly find the links below; I haven't tried it myself, though.
https://msdn.microsoft.com/en-us/library/mt203557.aspx
Stream analytics with powerbi
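If you do end up needing that, the call would look roughly like this (a sketch against the Power BI push-dataset REST API; the access token, dataset id, table name and column list are placeholders, and as said above I haven't verified this end to end):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class UpdatePowerBiTable {
    public static void main(String[] args) throws Exception {
        // Placeholders: an Azure AD access token for the Power BI API, plus the
        // dataset and table created earlier by the Stream Analytics output.
        String accessToken = "<AAD-ACCESS-TOKEN>";
        String datasetId = "<DATASET-ID>";
        String tableName = "<TABLE-NAME>";

        // New schema for the table; column names and types here are illustrative only.
        String schema = "{\"name\":\"" + tableName + "\",\"columns\":["
                + "{\"name\":\"deviceId\",\"dataType\":\"String\"},"
                + "{\"name\":\"temperature\",\"dataType\":\"Double\"},"
                + "{\"name\":\"timestamp\",\"dataType\":\"DateTime\"}]}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.powerbi.com/v1.0/myorg/datasets/" + datasetId
                        + "/tables/" + tableName))
                .header("Authorization", "Bearer " + accessToken)
                .header("Content-Type", "application/json")
                .PUT(HttpRequest.BodyPublishers.ofString(schema))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Power BI returned HTTP " + response.statusCode());
    }
}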
Hope this helps, let me know if you need further info.
One way to achieve this is to send your data to Azure Event Hubs, read it, and send it to PowerBI with Stream Analytics. Listing all the steps here would be too long. I suggest that you take a look at a series of blog posts I wrote describing how I built a demo similar to what you are trying to achieve. That should give you enough info to get started.
http://guyb.ca/IoTAzureDemo