Handling Pagination from API (JSON) with Excel Query

I have searched a few sites, and the suggestions I'm finding don't fit my situation. In many cases, such as how to export an HTML table to Excel with pagination, there are only limited responses. I am able to pull data from a website using an API key, but it is paginated. I have been able to adjust my query to pull 100 records per page (the default is 25) and can input a page number to pull a specific page, but I have been unsuccessful in pulling everything as one table. Currently one of the data sets is over 800 records, so my workaround is 8 queries, each pulling down a separate page, which I then combine into one table. I have a new project that will likely return several thousand line items, and I would prefer a simpler way to handle this.
This is the current code:
let
Source = Json.Document(Web.Contents("https://api.keeptruckin.com/v1/vehicles?access_token=xxxxxxxxxxxxxx&per_page=100&page_no=1", [Headers=[#"X-Api-Key"="f4f1f1f0-005b-4fbb-a525-3144ba89e1f2", #"Content-Type"="application/x-www-form-urlencoded"]])),
vehicles = Source[vehicles],
#"Converted to Table" = Table.FromList(vehicles, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"vehicle"}, {"Column1.vehicle"}),
#"Expanded Column1.vehicle" = Table.ExpandRecordColumn(#"Expanded Column1", "Column1.vehicle", {"id", "company_id", "number", "status", "ifta", "vin", "make", "model", "year", "license_plate_state", "license_plate_number", "metric_units", "fuel_type", "prevent_auto_odometer_entry", "eld_device", "current_driver"}, {"Column1.vehicle.id", "Column1.vehicle.company_id", "Column1.vehicle.number", "Column1.vehicle.status", "Column1.vehicle.ifta", "Column1.vehicle.vin", "Column1.vehicle.make", "Column1.vehicle.model", "Column1.vehicle.year", "Column1.vehicle.license_plate_state", "Column1.vehicle.license_plate_number", "Column1.vehicle.metric_units", "Column1.vehicle.fuel_type", "Column1.vehicle.prevent_auto_odometer_entry", "Column1.vehicle.eld_device", "Column1.vehicle.current_driver"})
in
#"Expanded Column1.vehicle"

Related

Is there a way to get records from 2 APIs with different API keys in 1 Power BI connector using Power Query?

I have to get data from 2 APIs (one key is entered as input and the other is hardcoded).
My scenario is that I input my 1st key (as the authentication key) to run the 1st API, and inside I have a column that calls the 2nd API (whose API key I hardcoded), but the system still alerts me about wrong credentials. If I run them separately, each runs normally. I've done a similar case before, but that one only used 1 API key. I don't know how to resolve it.
ExpandData = (dataList as list) as table =>
let
FromList = Table.FromList(dataList, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Expanded Column1" = Table.ExpandRecordColumn(FromList, "Column1", {"search-results"}, {"search-results"}),
#"Expanded search-results" = Table.ExpandRecordColumn(#"Expanded Column1", "search-results", {"entry"}, {"entry"}),
#"Expanded entry" = Table.ExpandListColumn(#"Expanded search-results", "entry"),
#"Expanded entry1" = Table.ExpandRecordColumn(#"Expanded entry", "entry", {"dc:identifier", "prism:doi"}, {"identifier", "doi"}),
#"Reordered Columns" = Table.ReorderColumns(#"Expanded entry1",{"doi", "identifier"}),
#"Add Column" = Table.AddColumn(#"Reordered Columns", "Detail", each Text.Range(Text.From([identifier]),10)),
#"Add Column2" = Table.AddColumn(#"Add Column", "Detail2", each ExpandDataPublish(Text.From([Detail])))
in
#"Add Column2";
ExpandDataPublish = (data as text) as table =>
let
Source = Json.Document(Web.Contents("https://******" & data,
[Headers=[#"Accept" = "application/json;odata.metadata=minimal",
#"X-ELS-APIKey"="7f59af901d2d86f78a1fd60c1bf9426a"]])),
#"Converted to Table" = Table.FromRecords({Source})
in
#"Converted to Table";
It runs normally if I don't call the 2nd API, but fails if I do.
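One pattern that sometimes helps here (not a confirmed fix for this exact case) is to keep the second call out of Power BI's credential negotiation: pass the hardcoded key only through the header, and split the URL into a fixed base plus a RelativePath so the engine sees one stable data source that can be set to Anonymous. The host below is a hypothetical placeholder for the masked URL above:
ExpandDataPublish = (data as text) as table =>
let
// "https://api.example.com" is a hypothetical stand-in for the masked host above
Source = Json.Document(Web.Contents("https://api.example.com",
[RelativePath = data,
Headers = [#"Accept" = "application/json;odata.metadata=minimal",
// the hardcoded 2nd key stays in the header, so the data source itself can be Anonymous
#"X-ELS-APIKey" = "your-second-api-key"]])),
#"Converted to Table" = Table.FromRecords({Source})
in
#"Converted to Table";
You would then set the credentials for that host to Anonymous in the data source settings.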

Parsing website search history JSON data in PowerBI

I am doing some analytics on a certain website, and I'm trying to get people's search terms into an easier format. They're in JSON format, and I've already parsed the column several times to get other relevant data. The problem is that when I try to parse the keywords/search data, each search word goes into its own column, and with tens of thousands of searches that becomes a problem. It's hard to explain, so I will add a picture that illustrates the problem.
The "Ai O" and "JAHAHASHAS" entries are searches that I did on the website; the rest are hidden, but the list goes on forever:
https://i.imgur.com/5IXTXn1.png
Here is an example of a person searching "business model" in the JSON
"{""sort"": ""lastUpdated"", ""limit"": 5, ""fields"": {""keywords"": {""business model"": true}}, ""offset"": 0}"
I tried to parse the keywords column anyway, but creating thousands of new columns doesn't really work.
I added some extra records to your JSON file; I hope it matches what you have to work with in reality. In the future, you'll get faster and better results if you're willing to share a bit more concrete information about your project. In most cases, it's very difficult to provide meaningful advice if we can't reasonably recreate your situation.
Anyway, I wanted the solution to work against multiple search terms, so I tweaked the JSON you provided to look like this:
{ "Searches": [
{"sort": "lastUpdated", "limit": 5, "fields": {"keywords": {"business model": true}}, "offset": 0},
{"sort": "lastUpdated", "limit": 5, "fields": {"keywords": {"Alpha Beta": true}}, "offset": 0},
{"sort": "lastUpdated", "limit": 5, "fields": {"keywords": {"Gamma Delta Epsilon": true}}, "offset": 0}
]}
and then provided the following Power Query (taken from the advanced editor)
let
Source = Json.Document(File.Contents("C:\Users\XXXXXX\Desktop\foo.json")),
#"Converted to Table" = Table.FromList(Source[Searches], Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"fields"}, {"Fields"}),
#"Expanded Fields" = Table.ExpandRecordColumn(#"Expanded Column1", "Fields", {"keywords"}, {"keywords"}),
#"Added Custom" = Table.AddColumn(#"Expanded Fields", "keywords_text", each Record.FieldNames([keywords])),
#"Extracted Values" = Table.TransformColumns(#"Added Custom", {"keywords_text", each Text.Combine(List.Transform(_, Text.From), " "), type text}),
#"Added Custom1" = Table.AddColumn(#"Extracted Values", "keywords_text_list", each Text.Split([keywords_text], " ")),
Custom1 = List.Combine( #"Added Custom1"[keywords_text_list])
in
Custom1
After loading the data from the file, I extracted the records (the searches, that is) from the JSON.
Next, having converted the data to a table, I expanded twice to dive down from the records to fields, and from fields to keywords. Then I added a custom column and used the M 'Record.FieldNames' function to generate a list of the field names in each keywords record. Keep in mind, each list here only has one element, even though there might be spaces in that text.
Then I extracted the values from those lists and followed that up by splitting those strings back into lists -- this time using space as a delimiter.
As a final step, I combined the multiple lists into a single list of all our keywords as single elements.
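With the tweaked sample above, that final combined list should come out as the individual words:
{"business", "model", "Alpha", "Beta", "Gamma", "Delta", "Epsilon"}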
So, that's my approach. Hope it helps.

Power BI case insensitive with JSON

I'm connecting to Azure resources via the API at resources.azure.com; from there I'm taking the Microsoft.Compute API and importing all the VM details into Power BI via JSON.
The import works fine; however, in some of the data there are case discrepancies. For example, when working with the tags value, some people have typed the same word in a different case, such as:
"tags": {
"Project": "DT",
"SLStandard": "Yes"
compared to
"tags": {
"project": "DT",
"SlStandard": "Yes"
When expanding the columns out in Power BI, it will treat the items listed above as two different values.
Ideally I would like to have the JSON imported with the case ignored, or perhaps to convert everything incoming to either upper or lower case.
I have read the links below, but I'm new to Power BI and I'm unsure how to implement this, or even whether it's what I need.
Case sensitivity in Power BI
and
Power BI changing text case automatically
and
http://www.thebiccountant.com/2016/10/27/tame-case-sensitivity-power-query-powerbi/
Here is my Advanced Editor code:
let
iterations = 10,
url =
"https://management.azure.com/subscriptions/< subscription id >/providers/Microsoft.Compute/virtualMachines?api-version=2017-12-01",
FnGetOnePage =
(url) as record =>
let
Source = Json.Document(Web.Contents(url)),
data = try Source[value] otherwise null,
next = try Source[nextLink] otherwise null,
res = [Data=data, Next=next]
in
res,
GeneratedList =
List.Generate(
()=>[i=0, res = FnGetOnePage(url)],
each [i]<iterations and [res][Data]<>null,
each [i=[i]+1, res = FnGetOnePage([res][Next])],
each [res][Data]),
#"Converted to Table" = Table.FromList(GeneratedList, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Expanded Column1" = Table.ExpandListColumn(#"Converted to Table", "Column1"),
#"Expanded Column2" = Table.ExpandRecordColumn(#"Expanded Column1", "Column1", {"tags"}, {"Column1.tags"}),
#"Expanded Column1.tags" = Table.ExpandRecordColumn(#"Expanded Column2", "Column1.tags", {"Project", "project", "SLStandard", "sLStandard", "BIOffline", "bIStandard", "AutomationBI", "biStandard", "BIStandard", "asdf-U001", "TestVM"}, {"Column1.tags.Project.1", "Column1.tags.project", "Column1.tags.SLStandard.1", "Column1.tags.sLStandard", "Column1.tags.BIOffline", "Column1.tags.bIStandard.1", "Column1.tags.AutomationBI", "Column1.tags.biStandard.2", "Column1.tags.BIStandard", "Column1.tags.asdf-U001", "Column1.tags.TestVM"})
in
#"Expanded Column1.tags"
If you're wondering about why my query is so long for import, then check out my previous post here: Power BI - Call Azure API with nextLink (next page)
Any help would be greatly appreciated.
I'm struggling with the same issue today. One workaround, although not elegant, is to lowercase or UPPERCASE the JSON upstream.
Worked for me as a temporary solution.
Hope it helps,
Chris
It seems that your issue is coming directly from the data source and the discrepancy between the field names at the data source level.
Here's some sample code showing how you can make sure that they all have the same casing:
let
Source = {[A=24,b=53], [a=43,B=43], [a=3,b=3]},
Custom1 = List.Transform(Source, (_)=> Record.RenameFields(_, List.Zip({Record.FieldNames(_), List.Transform(Record.FieldNames(_), Text.Lower)})))
in
Custom1
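With that sample, the renamed records come out with consistent lowercase field names:
{[a=24, b=53], [a=43, b=43], [a=3, b=3]}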
I've had this answered on another forum and tried it, and it does work.
https://community.powerbi.com/t5/Desktop/Power-BI-case-insensitive-with-JSON/m-p/360134
"You can rename your record fields (by applying Text.Proper for example) before expanding (in the last step):"
Table.TransformColumns(#"Expanded Column2", {{"tags", each Record.RenameFields(_, List.Zip({Record.FieldNames(_), List.Transform(Record.FieldNames(_), (name)=> Text.Proper(name))}))}}),
And the final code looks like this:
let
iterations = 10,
url =
"https://management.azure.com/subscriptions/< subscription >/providers/Microsoft.Compute/virtualMachines?api-version=2017-12-01",
FnGetOnePage =
(url) as record =>
let
Source = Json.Document(Web.Contents(url)),
data = try Source[value] otherwise null,
next = try Source[nextLink] otherwise null,
res = [Data=data, Next=next]
in
res,
GeneratedList =
List.Generate(
()=>[i=0, res = FnGetOnePage(url)],
each [i]<iterations and [res][Data]<>null,
each [i=[i]+1, res = FnGetOnePage([res][Next])],
each [res][Data]),
#"Converted to Table" = Table.FromList(GeneratedList, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Expanded Column1" = Table.ExpandListColumn(#"Converted to Table", "Column1"),
#"Expanded Column2" = Table.ExpandRecordColumn(#"Expanded Column1", "Column1", {"properties", "tags"}, {"properties", "tags"}),
#"Expanded properties" = Table.TransformColumns(#"Expanded Column2", {{"tags", each Record.RenameFields(_, List.Zip({Record.FieldNames(_), List.Transform(Record.FieldNames(_), (name)=> Text.Proper(name))}))}}),
#"Expanded properties1" = Table.ExpandRecordColumn(#"Expanded properties", "properties", {"vmId"}, {"properties.vmId"}),
#"Expanded tags" = Table.ExpandRecordColumn(#"Expanded properties1", "tags", {"Project", "Slstandard", "Bioffline", "Bistandard", "Automationbi", "asdf-U001", "Testvm"}, {"tags.Project", "tags.Slstandard", "tags.Bioffline", "tags.Bistandard", "tags.Automationbi", "tags.asdf-U001", "tags.Testvm"})
in
#"Expanded tags"
Hope this helps other people!

Replace different values with specific text in report builder

I have a table that will return a number, and I need to convert it into a text label:
20 = Entered, 30 = Returned, 200 = Cancelled, 220 = Complete, 300 = Deleted
I want these to show in my report as simply 'Complete', etc.
I'm able to use the Replace function to get one value to show correctly in the report:
=Replace(Fields!status.Value,"220","Complete")
But I can't work out how to do this for each possible number that will show in this column.
The best way would likely be modifying the query with a CASE statement, as mentioned in the other answer, if you are able to do that. If not, a cleaner alternative to the nested Replaces is to simply use a Switch statement:
=Switch(
Fields!Status.Value = "20", "Entered",
Fields!Status.Value = "30", "Returned",
Fields!Status.Value = "200", "Cancelled",
Fields!Status.Value = "220", "Complete",
Fields!Status.Value = "300", "Deleted"
)
This is not the most efficient way to do this, but it's a quick fix:
=Replace(Replace(Replace(Replace(Replace(Fields!status.Value,"220","Complete"), "200","Cancelled"),"300","Deleted"),"20","Entered"),"30","Returned")
A better way would be to modify your DataSet query to replace the numbers with a CASE statement. See this documentation:
https://learn.microsoft.com/en-us/sql/t-sql/language-elements/case-transact-sql

Smartsheet data in Excel via Power Query

The API output from Smartsheet returns rows and columns as separate objects that are independent of each other.
This results in one set of records for the columns (a list of field names) and another set of records for the rows (records with a single field of values from various fields).
Is there a way to return a single list of JSON (with rows and columns combined into a single list of records)?
This is the code I'm using in the Query Editor, which returns separate rows and columns:
= Web.Contents(
"https://api.smartsheet.com/1.1/sheet/[SHEET_ID]",
[
Headers =
[
#"Authorization" = "Bearer YOUR_API_TOKEN"
]
]
)
I used the sample data on their site to come up with this set of transformations:
let
// parse the saved sample response from Smartsheet
Source = Json.Document(File.Contents("D:\testdata\foo.json")),
// parallel lists of column ids (as text) and display titles, taken from the columns object
ColumnIds = List.Transform(Source[columns], each Text.From([id])),
ColumnNames = List.Transform(Source[columns], each [title]),
// one record per row from the rows object
Table = Table.FromList(Source[rows], Splitter.SplitByNothing(), null, null, ExtraValues.Error),
Expanded = Table.ExpandRecordColumn(Table, "Column1", {"rowNumber", "cells"}, {"rowNumber", "cells"}),
// collapse each row's cells list into a single record keyed by columnId
Mapped = Table.TransformColumns(Expanded, {"cells",
each Record.Combine(List.Transform(_, each Record.AddField([], Text.From([columnId]), [value])))}),
// expand that record, renaming the columnId keys to the column titles
Result = Table.ExpandRecordColumn(Mapped, "cells", ColumnIds, ColumnNames)
in
Result
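To run this against the live API instead of a saved file, the Source step could presumably be swapped for the Web.Contents call from the question, wrapped in Json.Document (the sheet id and token are placeholders, as above):
Source = Json.Document(Web.Contents("https://api.smartsheet.com/1.1/sheet/[SHEET_ID]", [Headers = [#"Authorization" = "Bearer YOUR_API_TOKEN"]])),
The rest of the steps should then work the same way.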