SSRS - extract data from column containing JSON

I have a dataset with a column containing arrays of JSON data that looks like:
[{"name":"aaa","type":"yyy"},{"name":"bbb","type":"ccc"}]
Is there any straightforward method of extracting the JSON data from the column using something like JSON_QUERY, so that I can use it in a report?

As far as I can tell, the existing JSON array format is not usable with any of the T-SQL JSON functions.
The array in the column "jsonCol" needs to be in the form of:
{ "tag": [{"name":"aaa","type":"yyy"},{"name":"bbb","type":"ccc"}]}
and then I can extract each array element individually with:
SELECT JSON_QUERY(jsonCol, '$.tag[0]') AS tag
FROM myTable -- hypothetical table name; the original was truncated here
So I could concatenate a prefix and suffix string onto the column in the SELECT statement to fix this, as long as no one else needs to see it.
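For example, a minimal sketch of that wrapping approach (myTable is a hypothetical table holding the jsonCol column):

SELECT JSON_QUERY('{"tag":' + jsonCol + '}', '$.tag[0]') AS tag
FROM myTable

As an aside, OPENJSON (SQL Server 2016+) does accept a bare top-level array, so the elements can also be shredded into rows without the wrapper:

SELECT j.[name], j.[type]
FROM myTable
CROSS APPLY OPENJSON(jsonCol)
    WITH ([name] nvarchar(50) '$.name',
          [type] nvarchar(50) '$.type') AS j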

How to Query Nested JSON in Google BigQuery

I'm currently having trouble extracting data from a JSON String.
The way the data has been pulled, everything has been nested into a single string under the data field name.
Here is what it looks like in BigQuery (screenshot of the schema):
Below is an example of what the string looks like:
{"id":1381,"email":"J.Smith#gmail.com","name":"Jake Smith","sub_network_ids":[2375,2270],"extended_updated_at":"2022-01-27T00:02:14Z"}
If I simply wanted to pull the id, email, and name from this string into a table, how would I go about doing that? Currently, I was trying to use JSON_EXTRACT with UNNEST, but that didn't pan out the way I thought it would.
Any help would be appreciated, thanks.
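For reference, a minimal sketch of the scalar-extraction route (assuming the string column is named data and a hypothetical table mydataset.mytable; UNNEST would only be needed for the array field sub_network_ids):

SELECT
  JSON_EXTRACT_SCALAR(data, '$.id')    AS id,
  JSON_EXTRACT_SCALAR(data, '$.email') AS email,
  JSON_EXTRACT_SCALAR(data, '$.name')  AS name
FROM mydataset.mytable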

How to convert Excel to JSON in Azure Data Factory?

I want to convert an Excel file which contains two tables in a single worksheet into this JSON format:
{
  "parent": {
    "P1": "x1",
    "P2": "y1",
    "P3": "z1"
  },
  "children": [
    {"C1":"a1", "C2":"b1", "C3":"c1", "C4":"d1"},
    {"C1":"a2", "C2":"b2", "C3":"c2", "C4":"d2"},
    ...
  ]
}
And then post the JSON to a REST endpoint.
How do I perform the mapping and post the result to the REST service?
Also, it appears that I need to sink the JSON to a physical JSON file before I can post it as a payload to the REST service - is this physical sink step necessary, or can the JSON be held in memory?
I cannot use the Lookup activity to read in the Excel file because it is limited to 5,000 rows and 4 MB.
I managed to do it in ADF; the solution is a bit long, but you could also use Azure Functions to do it programmatically.
Here is a quick demo that I built.
The main idea is to split the data, add headers as requested, then re-join the data and add the relevant keys, like parent and children.
ADF:
Added a conditional split to separate the data (see attached pictures).
Added a surrogate key for each table.
Filtered the first row to get rid of the headers in the CSV.
Mapped the children/parent columns: renamed columns using a derived column activity.
Added a constant value in the children data flow so I can aggregate by it and convert the CSV into a complex data type.
ChildrenArray: in a derived column, added a subcolumn to a new column named children, and in the values added the relevant columns.
Aggregated the children JSONs by grouping on the constant value.
In the parents data flow: after mapping the columns, I created the JSON using a derived column (please see attached pictures).
Joined the children array and the parent JSON into one table so it would be converted to the requested JSON.
Wrote to a cached sink (here you can do the POST request instead of writing to a sink).
DataFlow:
(screenshot of the overall data flow)
Activities:
Conditional Split:
AddSurrogateKey:
(it's the same for the parents data flow; just change the name of the incoming stream, as shown in the data flow above)
FilterFirstRow:
MapChildrenColumns:
MapParentColumns:
AddConstantValue:
ParentsJson:
Here I added a subcolumn in the Expression Builder and sent the column name as the value; this builds the parent JSON.
ChildrenArray:
Again in a derived column, I added a column named "children" and added the relevant columns in the Expression Builder.
Aggregate:
The purpose of this activity is to aggregate the children JSONs and build the array; without it you will not get an array. The aggregation function is collect() (a rough sketch of these expressions follows the output below).
Join Activity:
Here I added an outer join to join the parent JSON and the children array.
Select Relevant Columns:
Output:
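As a rough sketch, the two key expressions in the data flow expression language look something like this (the column names C1..C4 and the children name follow this example; your own columns will differ):

ChildrenArray (derived column): children = @(C1 = C1, C2 = C2, C3 = C3, C4 = C4)
Aggregate (grouped by the constant column): children = collect(children)

The @(...) syntax builds a complex (struct) column from the child fields, and collect() gathers one struct per row into a single array per group, which is what produces the children array in the output JSON.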

How do I retrieve all column fields as objects from Tabulator?

I want to make the custom filter dynamic, so that for future code I could pass in a list of references to each field object in the table.
That way I do not have to hardcode data.(field name here); instead, it would work off the list of properties of the column object.
I know there are ways to get the field normally, but they are always returned as strings, not object references. This obviously will not work with the dot operator.
I had some success using JSON.parse followed by looping through the entries, but as before it returns the field as a string instead of a reference.
So, is there a way to retrieve the column fields as objects, and if so, how?
I tried using getColumns, but I am still getting undefined when grabbing the fields. There is something wrong with my code.
function customFilter(data, filterParams) {
    // data - the data for the row being filtered
    // filterParams - params object passed to the filter
    for (const column of table.getColumns()) {
        const field = column.getField();
        // logs undefined: data.field looks up a key literally named "field";
        // bracket notation (data[field]) is needed to use the variable
        console.log(data.field);
    }
}
You speak about references in your question, but references to what? The field names themselves aren't references to anything; they simply show Tabulator how to access the underlying row data. Without a specific row data object to reference, there isn't anything to build any references from.
You can only have a reference if it points to something, and there is nothing for the field definitions to point to without the row data.
If you are looking for objects that you can manipulate, the getColumns function returns an array of Column Components, with each component having a range of functions that can be called to manipulate that column, including the getField function that returns the field for that column.
Given that the Tabulator filter functions will accept the field names with dot notation, that shouldn't be an issue at all, but you can also pass the column component directly into the filter, so it shouldn't be a problem there either.
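For instance, a minimal sketch of a dynamic filter built from the column components (it assumes a Tabulator instance named table and a hypothetical filterParams.term to match against):

function customFilter(data, filterParams) {
    // loop over the Column Components rather than hardcoded field names
    return table.getColumns().some(function (column) {
        var field = column.getField();
        // bracket notation reads the row value dynamically; note this handles
        // top-level fields only - dotted nested field names would need walking
        return field && String(data[field]).includes(filterParams.term);
    });
}

table.setFilter(customFilter, {term: "aaa"});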

Alpha Anywhere: Can I populate JSON data into a list?

Can I populate a list with JSON data? I have a general list containing data available for several sessions, but I need to filter it for my current session and insert it into another list. My idea is to use the filtered JSON data, since I have successfully filtered it in JSON format. I've looked into some threads that might be related but have found nothing so far. I hope someone can point me to the right page.
I missed this page, or maybe I overlooked it: https://forum.alphasoftware.com/showthread.php?119524-How-to-populate-a-List-from-a-JSON-formatted-field.
Anyway, populating JSON data into a list in Alpha Anywhere is easy to do. First, get the JSON data (in my case I produce it from another list). With this data (already in JSON format), I do the filtering using:
var filtered_json = find_in_object(JSON.parse('my_JSON_data'), {my_filter_condition});
The result will be an array of objects (displayed as [object Object][object Object]).
Finally, populate the result into the list:
var lObj = {dialog.object}.getControl('my_list_ID');
lObj.populate(filtered_json);

In Talend, how do you keep input values provided to tSoap so that you can use them with the SOAP response?

I have a Talend Job that currently does the following:
Input CSV --Main--> tMap --Output--> tSoap --Main--> Output CSV
The input CSV has ID and TYPE as input columns.
The tMap creates a SOAP XML (String) message using the ID from the CSV and passes that String to the tSoap component.
The tSoap component fires the web request and sends the response on to the next component. That data is then written to CSV.
The problem is that TYPE from the input CSV is not passed through to be amalgamated with the SOAP response data. Only the response data seems accessible.
I've had a look at tBufferInput / tBufferOutput and tFlowToIterate, but they seem to work in scenarios where the tSoap component does not depend on an input from the main flow.
Does anyone know which components can be used to achieve the amalgamation?
Thank you
If you output the data you need to reuse to a tHashOutput component, you should be able to rejoin it with the response output from tSoap, assuming there's some natural join element in the response.
I solved this in the end by:
Placing a new component, tSetGlobalVar, between the output from the tMap and the input to the tSoap.
Inside tSetGlobalVar, you can then create a new row, which maps an input column (Value) to a named variable that you specify as the Key.
E.g. Key = "ID", Value = row11.ID
The output from tSetGlobalVar then goes into the tSoap component.
The output from tSoap goes into a new tMap.
Inside this new tMap, the Body column from the previous tSoap component maps to an output column. To access the stored "ID" variable for the current flow/iteration, I created a new output column and, instead of mapping any columns from the inputs, used (String)globalMap.get("ID"), which inserts the value back into the flow.
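As a sketch of what is going on underneath (the key and row names follow the example above), tSetGlobalVar is essentially a put into the job's shared globalMap, and the tMap expression reads it back:

// performed by tSetGlobalVar for each row, with Key = "ID" and Value = row11.ID
globalMap.put("ID", row11.ID);

// tMap output column expression, restoring the value for the current flow
(String) globalMap.get("ID")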