I'm connecting to Azure resources via API at resources.azure.com, from there I'm taking the API for Microsoft.Compute and importing all the VM details into Power BI via JSON.
The import works fine; however, in some parts of the data there are case discrepancies. For example, in the tags value, some people have typed the same word in different case, such as:
"tags": {
"Project": "DT",
"SLStandard": "Yes"
compared to
"tags": {
"project": "DT",
"SlStandard": "Yes"
When expanding the columns in Power BI, it treats the items listed above as two different values.
Ideally I would like the JSON imported with case ignored, or perhaps have all incoming field names converted to either upper or lower case.
I have read the links below, but I'm new to Power BI and I'm unsure how to implement them, or even if they are what I need.
Case sensitivity in Power BI
and
Power BI changing text case automatically
and
http://www.thebiccountant.com/2016/10/27/tame-case-sensitivity-power-query-powerbi/
Here is my Advanced Editor code:
let
iterations = 10,
url =
"https://management.azure.com/subscriptions/< subscription id >/providers/Microsoft.Compute/virtualMachines?api-version=2017-12-01",
FnGetOnePage =
(url) as record =>
let
Source = Json.Document(Web.Contents(url)),
data = try Source[value] otherwise null,
next = try Source[nextLink] otherwise null,
res = [Data=data, Next=next]
in
res,
GeneratedList =
List.Generate(
()=>[i=0, res = FnGetOnePage(url)],
each [i]<iterations and [res][Data]<>null,
each [i=[i]+1, res = FnGetOnePage([res][Next])],
each [res][Data]),
#"Converted to Table" = Table.FromList(GeneratedList, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Expanded Column1" = Table.ExpandListColumn(#"Converted to Table", "Column1"),
#"Expanded Column2" = Table.ExpandRecordColumn(#"Expanded Column1", "Column1", {"tags"}, {"Column1.tags"}),
#"Expanded Column1.tags" = Table.ExpandRecordColumn(#"Expanded Column2", "Column1.tags", {"Project", "project", "SLStandard", "sLStandard", "BIOffline", "bIStandard", "AutomationBI", "biStandard", "BIStandard", "asdf-U001", "TestVM"}, {"Column1.tags.Project.1", "Column1.tags.project", "Column1.tags.SLStandard.1", "Column1.tags.sLStandard", "Column1.tags.BIOffline", "Column1.tags.bIStandard.1", "Column1.tags.AutomationBI", "Column1.tags.biStandard.2", "Column1.tags.BIStandard", "Column1.tags.asdf-U001", "Column1.tags.TestVM"})
in
#"Expanded Column1.tags"
If you're wondering why my import query is so long, check out my previous post here: Power BI - Call Azure API with nextLink (next page)
Any help would be greatly appreciated.
I'm struggling with the same issue today. One workaround, although not elegant, is to lowercase or UPPERCASE the JSON upstream.
Worked for me as a temporary solution.
Hope it helps,
Chris
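A minimal sketch of that upstream-lowercase workaround (assuming the response is UTF-8 JSON text): lowercase the raw response before parsing it. Note this also lowercases the tag values, not just the field names.
let
    url = "https://management.azure.com/subscriptions/< subscription id >/providers/Microsoft.Compute/virtualMachines?api-version=2017-12-01",
    raw = Text.FromBinary(Web.Contents(url)),      // raw JSON as text
    Source = Json.Document(Text.Lower(raw))        // every field name (and value) is now lower case
in
    Source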
It seems that your issue comes directly from the data source, from the discrepancy between the field names at the source level.
Here's some sample code showing how you can force them all to have the same casing:
let
Source = {[A=24,b=53], [a=43,B=43], [a=3,b=3]},
Custom1 = List.Transform(Source, (_)=> Record.RenameFields(_, List.Zip({Record.FieldNames(_), List.Transform(Record.FieldNames(_), Text.Lower)})))
in
Custom1
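For illustration only, the renamed records can then be turned into a single table, where the previously mismatched fields now share one column:
Custom2 = Table.FromRecords(Custom1)    // columns "a" and "b", regardless of the original casing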
I've had this answered on another forum and tried it, and it does work.
https://community.powerbi.com/t5/Desktop/Power-BI-case-insensitive-with-JSON/m-p/360134
"You can rename your record fields (by applying Text.Proper for example) before expanding (in the last step):"
Table.TransformColumns(#"Expanded Column2", {{"tags", each Record.RenameFields(_, List.Zip({Record.FieldNames(_), List.Transform(Record.FieldNames(_), (name)=> Text.Proper(name))}))}}),
And the final code looks like this:
let
iterations = 10,
url =
"https://management.azure.com/subscriptions/< subscription >/providers/Microsoft.Compute/virtualMachines?api-version=2017-12-01",
FnGetOnePage =
(url) as record =>
let
Source = Json.Document(Web.Contents(url)),
data = try Source[value] otherwise null,
next = try Source[nextLink] otherwise null,
res = [Data=data, Next=next]
in
res,
GeneratedList =
List.Generate(
()=>[i=0, res = FnGetOnePage(url)],
each [i]<iterations and [res][Data]<>null,
each [i=[i]+1, res = FnGetOnePage([res][Next])],
each [res][Data]),
#"Converted to Table" = Table.FromList(GeneratedList, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Expanded Column1" = Table.ExpandListColumn(#"Converted to Table", "Column1"),
#"Expanded Column2" = Table.ExpandRecordColumn(#"Expanded Column1", "Column1", {"properties", "tags"}, {"properties", "tags"}),
#"Expanded properties" = Table.TransformColumns(#"Expanded Column2", {{"tags", each Record.RenameFields(_, List.Zip({Record.FieldNames(_), List.Transform(Record.FieldNames(_), (name)=> Text.Proper(name))}))}}),
#"Expanded properties1" = Table.ExpandRecordColumn(#"Expanded properties", "properties", {"vmId"}, {"properties.vmId"}),
#"Expanded tags" = Table.ExpandRecordColumn(#"Expanded properties1", "tags", {"Project", "Slstandard", "Bioffline", "Bistandard", "Automationbi", "asdf-U001", "Testvm"}, {"tags.Project", "tags.Slstandard", "tags.Bioffline", "tags.Bistandard", "tags.Automationbi", "tags.asdf-U001", "tags.Testvm"})
in
#"Expanded tags"
Hope this helps other people!
Related
I have to get data from 2 APIs (one key is entered as input and the other is hardcoded).
My scenario: I input my 1st key (as the authentication key) to run the 1st API, and inside it I have a column that calls the 2nd API (whose API key I hardcoded), but the system still alerts me about wrong credentials. If I run them separately, each runs normally. I've done a similar case before, but that one only used 1 API key. I don't know how to resolve it.
ExpandData = (dataList as list) as table =>
let
FromList = Table.FromList(dataList, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Expanded Column1" = Table.ExpandRecordColumn(FromList, "Column1", {"search-results"}, {"search-results"}),
#"Expanded search-results" = Table.ExpandRecordColumn(#"Expanded Column1", "search-results", {"entry"}, {"entry"}),
#"Expanded entry" = Table.ExpandListColumn(#"Expanded search-results", "entry"),
#"Expanded entry1" = Table.ExpandRecordColumn(#"Expanded entry", "entry", {"dc:identifier", "prism:doi"}, {"identifier", "doi"}),
#"Reordered Columns" = Table.ReorderColumns(#"Expanded entry1",{"doi", "identifier"}),
#"Add Column" = Table.AddColumn(#"Reordered Columns", "Detail", each Text.Range(Text.From([identifier]),10)),
#"Add Column2" = Table.AddColumn(#"Add Column", "Detail2", each ExpandDataPublish(Text.From([Detail])))
in
#"Add Column2";
ExpandDataPublish = (data as text) as table =>
let
Source = Json.Document(Web.Contents("https://******" & data,
[Headers=[#"Accept" = "application/json;odata.metadata=minimal",
#"X-ELS-APIKey"="7f59af901d2d86f78a1fd60c1bf9426a"]])),
#"Converted to Table" = Table.FromRecords({Source})
in
#"Converted to Table";
It runs normally if I don't call the 2nd API, but fails if I do.
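One pattern that sometimes helps with nested calls like this (a sketch, not a confirmed fix for this exact error) is to keep the base URL static and pass the variable part through the RelativePath option, so Power Query treats both calls as the same data source while the hardcoded key stays in the headers. The base URL and key below are placeholders:
ExpandDataPublish = (data as text) as table =>
let
    Source = Json.Document(Web.Contents("https://******",      // static base URL
        [RelativePath = data,                                   // variable part of the address
         Headers = [#"Accept" = "application/json;odata.metadata=minimal",
                    #"X-ELS-APIKey" = "<hardcoded key>"]])),    // 2nd API key stays hardcoded here
    #"Converted to Table" = Table.FromRecords({Source})
in
    #"Converted to Table";
You may also need to set both sources to the same privacy level (or ignore privacy levels) so Power Query is allowed to combine them.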
I am trying to get financial data from Financial Modeling Prep's API into an excel spreadsheet. I am beginning to think that Power Query just does not do what I am looking for. I want to have one column with a static list of stock symbols (DAL, GOOG, AAL etc) and populate each row with financial data from various api calls such as the Net Income field from https://financialmodelingprep.com/api/v3/financials/income-statement/DAL and the current stock price from https://financialmodelingprep.com/api/v3/stock/real-time-price/DAL
What exactly have you tried? It's very simple to extract data from the first link you gave with the M code below (all UI based, nothing advanced about it at all). Converting that into a function to go to the relevant URL for each symbol and do the same transformation is also trivial.
let
Source = Json.Document(Web.Contents("https://financialmodelingprep.com/api/v3/financials/income-statement/DAL ")),
financials = Source[financials],
#"Converted to Table" = Table.FromList(financials, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"date", "Net Income"}, {"date", "Net Income"}),
#"Changed Type" = Table.TransformColumnTypes(#"Expanded Column1",{{"Net Income", type number}, {"date", type date}})
in
#"Changed Type"
Here is my solution in Python:
Imports and parameters:
import requests
import pandas as pd

company = "NVDA"
years = 5
Add Key:
api_key = 'YOUR_KEY'
Request:
r = requests.get(f'https://financialmodelingprep.com/api/v3/income-statement/{company}?limit={years}&apikey={api_key}')
data = r.json()
data
Extract data
date = []
symbol = []
revenue = []
costOfRevenue = []
grossProfit = []
for finance in data:
    date.append(finance["date"])
    symbol.append(finance["symbol"])
    revenue.append(finance["revenue"])
    costOfRevenue.append(finance["costOfRevenue"])
    grossProfit.append(finance["grossProfit"])
income_nvda_dict = {
    "Date": date,
    "Ticket": symbol,
    "Revenue": revenue,
    "CostOfRevenue": costOfRevenue,
    "grossProfit": grossProfit,
}
From dict to pandas DataFrame:
income_nvda_df = pd.DataFrame(income_nvda_dict, columns = ['Date', 'Ticket', 'Revenue', 'CostOfRevenue', 'grossProfit'])
I have searched a few sites, and the suggestions I'm finding don't fit my situation. In many cases, such as how to export an HTML table to Excel with pagination, there are only limited responses. I am able to pull data using an API key from a website, but it is paginated. I have been able to adjust my query to pull 100 records per page (the default is 25) and can input a page number to pull a selected page, but I have been unsuccessful in pulling everything as one table. Currently one of the data sets is over 800 records, so my workaround is 8 queries, each pulling a separate page, then using the append-queries feature to group them into one table. I have a new project that will likely return several thousand line items, and would prefer a simpler way to handle this.
This is my current code:
let
Source = Json.Document(Web.Contents("https://api.keeptruckin.com/v1/vehicles?access_token=xxxxxxxxxxxxxx&per_page=100&page_no=1", [Headers=[#"X-Api-Key"="f4f1f1f0-005b-4fbb-a525-3144ba89e1f2", #"Content-Type"="application/x-www-form-urlencoded"]])),
vehicles = Source[vehicles],
#"Converted to Table" = Table.FromList(vehicles, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"vehicle"}, {"Column1.vehicle"}),
#"Expanded Column1.vehicle" = Table.ExpandRecordColumn(#"Expanded Column1", "Column1.vehicle", {"id", "company_id", "number", "status", "ifta", "vin", "make", "model", "year", "license_plate_state", "license_plate_number", "metric_units", "fuel_type", "prevent_auto_odometer_entry", "eld_device", "current_driver"}, {"Column1.vehicle.id", "Column1.vehicle.company_id", "Column1.vehicle.number", "Column1.vehicle.status", "Column1.vehicle.ifta", "Column1.vehicle.vin", "Column1.vehicle.make", "Column1.vehicle.model", "Column1.vehicle.year", "Column1.vehicle.license_plate_state", "Column1.vehicle.license_plate_number", "Column1.vehicle.metric_units", "Column1.vehicle.fuel_type", "Column1.vehicle.prevent_auto_odometer_entry", "Column1.vehicle.eld_device", "Column1.vehicle.current_driver"})
in
#"Expanded Column1.vehicle"
I am using the following code in my own Power BI data connector to get some data from a JSON document:
{
  "Customers": [
    {
      "CustomerId": "8cd72f16-8d7b-48b0-90d9-71df011502c8",
      "CustomerTitle": "Test Customer"
    }
  ]
}
Code:
GetCustomerTable = (url as text) as table =>
let
source = Test.Feed(url & "/overview"),
value = source[Customers],
toTable = Table.FromList(value, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"expandColumn" = Table.ExpandRecordColumn(toTable, "Column1", {"CustomerId", "CustomerTitle"}, {"CustomerId", "CustomerTitle"}),
#"ChangedType" = Table.TransformColumnTypes(#"expandColumn",{{"CustomerTitle", type text}, {"CustomerId", type text})
in
ChangedType;
The column "CustomerId" referes to another url where the actual data about the customer is available in json format:
URL: /Details/8cd72f16-8d7b-48b0-90d9-71df011502c8
{
"Category": "B",
}
What is the best approach to use data from another url with the ExpandRecordColumn function?
So what you need is another custom function that obtains the customer details for each CustomerId, as one of the steps:
GetCustomerDetails = (url as text, customer_id as text) =>
let
Source = Json.Document(Web.Contents(url & "/Details/" & customer_id)),
#"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Expanded Column" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"Category"}, {"Category"})
in
#"Expanded Column"
And then you can invoke this function in your original code by passing url and the CustomerId column:
GetCustomerTable = (url as text) as table =>
let
source = Test.Feed(url & "/overview"),
value = source[Customers],
toTable = Table.FromList(value, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
"expandColumn" = Table.ExpandRecordColumn(toTable, "Column1", {"CustomerId", "CustomerTitle"}, {"CustomerId", "CustomerTitle"}),
"ChangedType" = Table.TransformColumnTypes(#"expandColumn",{{"CustomerTitle", type text}, {"CustomerId", type text}),
#"Invoked Custom Function" = Table.AddColumn(#"ChangedType", "GetCustomerDetails", each GetCustomerDetails("http://testing.com/", [CustomerId]))
in
#"Invoked Custom Function"
You may need to make some adjustments to the code, depending on what yours looks like exactly, but I hope you get the point.
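For illustration, once the custom-function column is in place, the nested tables it returns can be expanded so the detail fields (here just Category, per the Details response above) land next to each customer:
#"Expanded Details" = Table.ExpandTableColumn(#"Invoked Custom Function", "GetCustomerDetails", {"Category"}, {"Category"})
That leaves one row per customer with CustomerId, CustomerTitle and Category side by side.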
I am learning M (Power Query Language). I'd like to use M to parse JSON from REST APIs. For example one can use the Stack Overflow REST API. I can see how to drill down into a simple JSON string using say
let
Source = Json.Document("{ ""glossary"": { ""title"": ""example glossary"", ""GlossDiv"": { ""title"": ""S"", ""GlossList"": { ""GlossEntry"":
{ ""ID"": ""SGML"", ""SortAs"": ""SGML"", ""GlossTerm"": ""Standard Generalized Markup Language"", ""Acronym"": ""SGML"",
""Abbrev"": ""ISO 8879:1986"", ""GlossDef"": { ""para"": ""A meta-markup language, used to create markup languages such as DocBook."",
""GlossSeeAlso"": [""GML"", ""XML""] }, ""GlossSee"": ""markup"" } } } } }"),
glossary = Source[glossary],
GlossDiv = glossary[GlossDiv],
GlossList = GlossDiv[GlossList],
GlossEntry = GlossList[GlossEntry],
ConvertedToTable = Record.ToTable(GlossEntry)
in
ConvertedToTable
But what happens when I have a list from which I want to drill in, fetch a subproperty, and then return all of those like a SQL UNION query? Actually it is more of a For Each type query.
So here is my non-working query; it does not do a union but unfortunately glues the second record onto the side:
let
Source = "{""items"":["
{""tags"":[""vba"",""permissions""],""owner"":
{""reputation"":49,""user_id"":9073241,""user_type"":""registered"",""accept_rate"":86,""display_name"":""Kam""},
""is_answered"":false,""view_count"":4,""answer_count"":0,""score"":0,""question_id"":48229549},
{""tags"":[""excel"",""vba"",""excel-vba""],""owner"":
{""reputation"":18,""user_id"":9057704,""user_type"":""registered"",""accept_rate"":29,""display_name"":""Gregory""},
""is_answered"":false,""view_count"":6,""answer_count"":0,""score"":0,""question_id"":48229590}
]}",
#"Parsed JSON" = Json.Document(Source),
items = #"Parsed JSON"[items],
item0 = items{0},
owner0 = item0[owner],
item1 = items{1},
owner1 = item1[owner],
#"Converted to Table" = Table.Combine( {Record.ToTable(owner0), Record.ToTable(owner1) })
in
#"Converted to Table"
What I am really aiming for is this output but not limited to 2 records, but all the records from the list. (The above sample source has been simplified from this REST API StackOverflow questions tagged VBA)
reputation user_id user_type accept_rate display_name
49 9073241 registered 86 Kam
18 9057704 registered 29 Gregory
I think you want to pivot your tables before you try to combine them. Try this query, for example.
let
Source1 = Json.Document("{""tags"":[""vba"",""permissions""],""owner"":
{""reputation"":49,""user_id"":9073241,""user_type"":""registered"",""accept_rate"":86,""display_name"":""Kam""},
""is_answered"":false,""view_count"":4,""answer_count"":0,""score"":0,""question_id"":48229549}"),
Owner1 = Table.Pivot(Record.ToTable(Source1[owner]), List.Distinct(Record.ToTable(Source1[owner])[Name]), "Name", "Value"),
Source2 = Json.Document("{""tags"":[""excel"",""vba"",""excel-vba""],""owner"":
{""reputation"":18,""user_id"":9057704,""user_type"":""registered"",""accept_rate"":29,""display_name"":""Gregory""},
""is_answered"":false,""view_count"":6,""answer_count"":0,""score"":0,""question_id"":48229590}"),
Owner2 = Table.Pivot(Record.ToTable(Source2[owner]), List.Distinct(Record.ToTable(Source2[owner])[Name]), "Name", "Value"),
#"Appended Query" = Table.Combine({Owner1, Owner2})
in
#"Appended Query"
If you just want to expand all of the owners, try a query more like this:
let
Source = "{""items"":[{""tags"":[""vba"",""permissions""],""owner"":{""reputation"":49,""user_id"":9073241,""user_type"":""registered"",""accept_rate"":86,""display_name"":""Kam""},""is_answered"":false,""view_count"":4,""answer_count"":0,""score"":0,""question_id"":48229549},{""tags"":[""excel"",""vba"",""excel-vba""],""owner"":{""reputation"":18,""user_id"":9057704,""user_type"":""registered"",""accept_rate"":29,""display_name"":""Gregory""},""is_answered"":false,""view_count"":6,""answer_count"":0,""score"":0,""question_id"":48229590}]}",
#"Parsed JSON" = Json.Document(Source),
items = #"Parsed JSON"[items],
#"Converted to Table" = Table.FromList(items, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"owner"}, {"owner"}),
#"Expanded owner" = Table.ExpandRecordColumn(#"Expanded Column1", "owner", {"reputation", "user_id", "user_type", "display_name"}, {"reputation", "user_id", "user_type", "display_name"})
in
#"Expanded owner"