JSON Table Creation

Just gonna start by saying, I sure do know how to pick a challenge (given my skill set, or the lack thereof).
I'm trying to figure out how to generate a true/false style table through Power Automate from a Power BI output (exciting, right?!).
The data is a report of Active Directory users and their group memberships. A sample of the source data looks like this:
The desired output would look something like this:
How the data is being generated through Power Automate and Power BI:
I'm trying to research JSON Schemas and how that is all supposed to work, but I'm really not sure where to start. Finding an example that matches my use case well enough to replicate is turning out to be difficult - assuming this is even something JSON can do.
I'm trying out these sites; they're helping me learn, but they probably won't solve my issue:
https://odileeds.github.io/JSONSchema/
https://json-schema.org/blog/posts/applicability-json-schema-fundamentals-part-1
Any help would be very much appreciated. I'm not necessarily looking for the direct answer to the problem; pointers to documentation that would push me in the right direction are just as welcome.

There is a new DAX function added recently called TOJSON that might solve your problem, and I believe its intended use cases include your current challenge.
Here is what I would do:
Prepare your table exactly as in your desired output, in Power BI
Lift the entire query used to generate the table using Performance Analyzer
In DAX Studio, paste the query you copied and isolate the final output table - wrap this in TOJSON and verify your results (see the sketch after these steps)
Use this query in Power Automate - it should give you the JSON response you are after directly
If you need to parse the JSON, you can copy the output from DAX Studio as the basis for the schema in Power Automate
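
For illustration, a minimal sketch of what step 3 might look like in DAX Studio. TOJSON serializes a table expression to a JSON string; the table and column names here ('Users'[UserName], 'Users'[GroupName]) are placeholders, not your actual report fields:

// sketch only - substitute the query Performance Analyzer gave you
EVALUATE
ROW (
    "Json",
    TOJSON (
        SELECTCOLUMNS ( 'Users', "UserName", 'Users'[UserName], "GroupName", 'Users'[GroupName] )
    )
)

In Power Automate, the same query can then be sent through the Power BI connector (for example the "Run a query against a dataset" action), and the returned string parsed against a schema copied from the DAX Studio output.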


SSIS and JSON Flat Files

What's the best way to get JSON flat files into SQL Server using SSIS?
Currently I've tried parsing the data in a script component, but with the number of JSON files I'm parsing (around 120 at a time) it takes upwards of 15 minutes to get the data in. I also don't consider this very practical.
Is there a way to combine the powers of SSIS and the OPENJSON command in SQL Server? I'm running SQL Server 2016, so I'm trying to leverage that command in the hope that it works faster.
Also, I have no problem getting the JSON data in without losing format.
Looks like this:
Is there a way for me to leverage that and get the JSON into a more normalized format?
This post has an example of splitting a JSON string stored in a column, which would be a good, easy basis.
You would want a class referencing a class if you have subclasses - kind of like an order class referencing a line-item class (see the sketch below).
In that example, you would have a data flow (DF) on a foreach over orders, and within that a foreach over line items that includes the order ID.
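A minimal sketch of that parent/child shape in C# (class and property names are illustrative, not from the original post):

using System.Collections.Generic;

// hypothetical order/line-item classes: the parent holds a list of the child,
// and the child carries the parent's ID so each row can be tied back to its order
public class LineItem
{
    public int OrderId { get; set; }
    public string Sku { get; set; }
    public int Quantity { get; set; }
}

public class Order
{
    public int OrderId { get; set; }
    public List<LineItem> Items { get; set; } = new List<LineItem>();
}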
I had a good example with Survey Monkey but I can't find it right now.
I actually didn't use data flows with that example and just directly loaded from C#.
Here is the Survey Monkey class structure I referenced above, from the question "Trouble using all members in a class. Why can I only use the list?"
Good luck.
Actually figured this out.
I bring the files in one at a time, with all the JSON text in a single row.
From there I can use the OPENJSON command in SQL Server 2016.
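For anyone following along, a minimal sketch of that pattern - the staging table, column names, and JSON paths are illustrative only:

-- assume each file's JSON text was loaded as a single row into dbo.JsonStage
SELECT s.FileName, j.*
FROM dbo.JsonStage AS s
CROSS APPLY OPENJSON(s.JsonText)
    WITH (
        [Id]      int           '$.id',
        [Name]    nvarchar(100) '$.name',
        [Created] datetime2     '$.created'
    ) AS j;

Note that OPENJSON requires database compatibility level 130 or higher, which SQL Server 2016 supports.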

MS Access data lineage documentation

I am looking for a scripted/automated way (presumably VBA?) to take an Access query and generate some kind of savable, searchable, publishable documentation of the data lineage. So if there are a bunch of layered/nested queries, or even passthrough queries, along the way, I want to trace the final fields in the specified query back to the original source tables/fields.
Everything I've found seems to do database documentation focused on how the table relationships are configured. I'm looking for a way to document the user-created portion, down to the field level. I'm very open-minded about the output format. I'm convinced this must be possible, but haven't had any luck yet.
I'm also open to recommendations for a third-party application if it could do this.
Thanks in advance!
Access does have a built-in "dependency" feature. The result is a VERY nice tree view of those dependencies, and you can even launch objects from that tree view to "navigate" the application, so to speak.
The option is found under database tools and is appropriately called Object Dependencies.
The result looks like this:
Note that while you may not want to use Name AutoCorrect itself, this feature forces on the 'Track name AutoCorrect info' option, which it needs in order to track dependencies. If this is a large application, there will be a significant delay on first run; after that, the results can be viewed instantly. As noted, not only do you get a hierarchical tree view, but objects in the tree view can be clicked to launch the object in question.
And the above will work for a query based on a query, and so on, all the way down to the base tables.
https://www.dropbox.com/sh/f73rs3h9u9q2xk5/AAArloN_Cmf_WbPZ4W75I6KVa?dl=0
This is a set of queries I wrote to provide the kind of documentation you're looking for. It seems a bit kludgy, but it works for me. It's not as simple as the other response, but it provides output that can be incorporated into other documentation.
Note - the documentation is out of date with respect to Union queries. The query I have to analyze Union queries seems to pick up only the first two inputs to the Union, so I changed it to a Make Table query, and I manually edit the resulting table to add the missing relationships.
To use the queries:
Copy the table and all the queries into your database
Run the "Mapping Unions Make Table" query
Manually edit the Unions table if necessary
When you run any of the 3 main output queries, you are prompted for the Top object you want to analyze. Enter the name of a query or table to find all the dependencies for that object. The three main outputs are:
Mapping Summary - lists all of the objects that go into the top object and all of the objects that go into them, to a depth of about 10 (depth is controlled in the "Mapping all parents" query)
Mapping summary without duplicates
Mapping summary duplicates
I especially like the 2nd output - this is in a format that can be saved in Excel and used as input to Visio's Org Chart Wizard to get a simple graphical representation of the relationships. Then the 3rd output query can be used to manually add in the queries that feed more than one other query, which Visio's wizard cannot handle.
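If you do want a purely scripted starting point, here is a rough VBA sketch of the same idea. It does crude substring matching of object names against each saved query's SQL, so expect false positives (names appearing inside other names, or in comments); a real lineage tool would parse the SQL properly:

' dump each saved query and the tables/queries its SQL appears to reference
Public Sub DumpQueryLineage()
    Dim db As DAO.Database
    Dim qd As DAO.QueryDef, q2 As DAO.QueryDef
    Dim td As DAO.TableDef
    Set db = CurrentDb
    For Each qd In db.QueryDefs
        If Left$(qd.Name, 1) <> "~" Then    ' skip hidden system queries
            Debug.Print qd.Name
            For Each td In db.TableDefs
                If InStr(1, qd.SQL, td.Name, vbTextCompare) > 0 Then
                    Debug.Print "    <- table: " & td.Name
                End If
            Next td
            For Each q2 In db.QueryDefs
                If q2.Name <> qd.Name And InStr(1, qd.SQL, q2.Name, vbTextCompare) > 0 Then
                    Debug.Print "    <- query: " & q2.Name
                End If
            Next q2
        End If
    Next qd
End Sub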

SSRS -> Using code function to create my query dynamically, how do I get at my data (put data into table)?

I am not a seasoned SSRS veteran. I have made quite a few but they were pretty simple.
Today, I am attempting to use the Code tab of the Report Properties to write some VB functions that will return my query as a string. I am passing in a date parameter that is used to create my dynamic query.
My problem/issue is that I do not know how to pull this information into my table.
I have seen instances where a developer calls the code from each individual field to get a specific value. I was under the impression that I could somehow use a dataset to do this - I have some documentation on it, but I can't seem to find anything on the web about how to do it.
This is probably a pretty poorly written question, but does anyone know how to do this?
I was thinking in the Dataset Properties, I would code something like this in the expression field.
=Code.GetReportDetail(Parameters!InputDate.Value)
GetReportDetail being the starting function within my code window.
I am having difficulty figuring out how to pull that dataset into my table from that point, though.
Any advice on this is greatly appreciated....Thanks.
After further review: I was creating this in VS2010 for SQL Server 2012 RC0, which I should have noted above.
All you have to do is create your report by adding a new item (Report). Add your code by right-clicking in the pink area, going to Report Properties, and pasting your code into the Code tab.
Next, when you pull a table onto the report design surface (the white space), it allows you to create your dataset.
I chose 'Use a dataset embedded in my report', a data source of OLE DB because I am doing MDX, and a query type of Text. Then in the expression I pasted this (omitting parameters for now):
=Code.GetReportDetail()
I then filled in the fields manually, because it seems that a dynamically generated query does not pull in the fields automatically.
I was then able to reference these manually created fields via my table detail row.
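For reference, the Code-tab function behind that expression might look something like this - a hypothetical sketch, with the query text and date handling as placeholders:

' Report Properties -> Code tab. The dataset's query expression
' (=Code.GetReportDetail(...)) is evaluated first, and the string this
' function returns becomes the query the dataset executes.
Public Function GetReportDetail(ByVal inputDate As Date) As String
    Dim query As String
    query = "SELECT ... FROM ... " & _
            "WHERE ReportDate = '" & Format(inputDate, "yyyy-MM-dd") & "'"
    Return query
End Function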

How do I re-use reports on different datasets?

What is the best way to re-use reports on different tables / datasets?
I have a number of reports built in BIRT, which get their data from a flat (un-normalized) MySQL table whose data has in turn been imported from an Excel sheet.
In BIRT, I've constructed my query like this, such that I can change the field names and re-use the report:
SELECT * FROM
(SELECT `index` AS "Index", name AS "Name", param1 AS "First Parameter" FROM mytable) t
However, when I switch to a new client's data, I need to change the query to point at the new data source, and this doesn't seem sustainable or anywhere near good practice.
So... what is a good practice?
Is this a reporting issue, or a database-design issue?
Do I create a standard view that the report connects to?
If I have a standard view, do I create a different view with the same structure for each data table, or keep replacing the view with a reference to the correct data table each time I run the report?
What's annoying is that the Excel sheets keep changing - new columns are added, and different clients name their data differently. Even if I can standardize this, I'd store different clients' data in different tables... so would I need to create a different report for each client, or pass the table name in to the report?
There are two ways and the path you choose is really dictated by how much flexibility you have architecturally.
First, you are on the right track by renaming your selected columns to a common name, since that name is what is used to bind the data to the control on the report.

Have you considered a stored procedure to access the data? This removes the query from the report and allows you to set up the stored proc on any database to return the necessary columns.

If you cannot off-load the query to a stored proc, you can always rely on altering the query text at run time. Because BIRT reports are not compiled (they are XML), you can change the query based on parameters and have it executed for each run of the design. Look at the script events for the Data Set (beforeOpen is the usual place) - there you can access this.queryText and do any dynamic string substitution you need via JavaScript, as in the sketch below. Hidden parameters are a good way to help alter/tune the query.

If you build the Data Set correctly, changing the underlying data could be as easy as changing the Data Source and then re-associating the Data Set to the new Data Source (in the edit data set window). I have done this MANY times and it works well.

If you are going down this route, I would add the Data Source(s), Data Set(s) and any controls that they provide data to into a report library. With a library you can use the controls in many reports and maintain them in one spot; if you update the library, all the reports using it get updated as well.
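A minimal sketch of that substitution in the Data Set's beforeOpen script (the parameter name clientTable is hypothetical):

// swap the placeholder table name in the query for one supplied
// by a hidden report parameter
this.queryText = this.queryText.replace(
    "mytable",
    params["clientTable"].value
);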
Alternatively, if you want to really commit to a fully re-usable strategy that lets you build a library of reusable components, you could check out the free Reusable Component Library at BIRT Exchange. In my opinion this strategy would give you the re-use you are looking for, but at the expense of maintainability - it is abstraction to the point of obfuscation, requiring totally generic names for columns and controls that make debugging very difficult. While it would not be my first choice (the option above would be), others have used it successfully, so I include it here since it directly speaks to your question.

Grouping by a report item in SSRS 2005 - textbox - any workarounds?

I want to group by a report item, but that's not allowed.
So I tried creating a parameter...not allowed as well.
Tried referencing from footer...failed again.
This is somewhat complicated.
Let me explain:
I have textbox22; its value is:
=Code.Calc_Factor(Fields!xx.Value, Fields!yy.Value...)
This is embedded VB code in the report that's called for each row to calculate a standard factor.
Now to calculate the deviation from the standard factor, I use textbox89, whose value is:
=(Fields!FACTOR.Value - ReportItems!textbox22.Value)/ReportItems!textbox22.Value
Don't get confused between Fields!FACTOR.Value and textbox22.Value, they are different.
Fields!FACTOR.Value is the factor used, textbox22.Value is what it should be (standard factor).
Now I want to create a group which splits deviations into 2 groups, > 1% or not.
So I tried creating a group:
=IIF(ReportItems!textbox89.Value > 1,0,1)
...But then SSRS complains about using report items.
I have run into a similar problem of using report items in the past, but this is a new case!
Any help greatly appreciated.
Have you tried adding a calculated field to your dataset?
Here is how it works:
While you are in the layout view of the report, open the "Datasets" tool window (in my environment it is on the left).
Right-click on the dataset you are working with and add a field; you can use a calculated field and build your formula appropriately.
Then you should be able to group on this field (see the sketch below).
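For example - a sketch only, assuming the calculated field is named StdFactor and that calculated fields in your environment accept the custom-code call:

Calculated field StdFactor:
=Code.Calc_Factor(Fields!xx.Value, Fields!yy.Value)

Group expression (using 0.01 rather than 1, since the deviation is a fraction and 1% = 0.01):
=IIF((Fields!FACTOR.Value - Fields!StdFactor.Value) / Fields!StdFactor.Value > 0.01, 0, 1)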
-Dan
I'm not 100% sure that someone won't have some magic solution for this, but I have run across similar problems myself in the past. I believe (though I could be wrong) the problem Reporting Services is having is that it only renders once, and what you're asking it to do is render the data before rendering the grouping, which it doesn't do.
The only way I have ever been able to produce the exact results I need is to make the data preparation happen exclusively in the SQL (through the use of table variables, usually) and then use Reporting Services merely as a display platform. This requires that your factoring algorithm be expressed in T-SQL, within the stored procedure you will likely have to write to get the data into shape - a rough sketch follows. This would appear to be the only way to achieve your end result.
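As a rough sketch of that shape - the table, columns, and dbo.Calc_Factor function are placeholders standing in for your actual data and factoring algorithm:

-- compute the standard factor and the deviation group in SQL,
-- so SSRS only has to group on the precomputed DeviationGroup column
DECLARE @t TABLE (xx int, yy int, FACTOR float, StdFactor float, DeviationGroup int);

INSERT INTO @t (xx, yy, FACTOR, StdFactor, DeviationGroup)
SELECT s.xx, s.yy, s.FACTOR, f.StdFactor,
       CASE WHEN (s.FACTOR - f.StdFactor) / NULLIF(f.StdFactor, 0) > 0.01 THEN 0 ELSE 1 END
FROM dbo.SourceData AS s
CROSS APPLY (SELECT dbo.Calc_Factor(s.xx, s.yy) AS StdFactor) AS f;

SELECT * FROM @t;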
This has the bonus feature of separating report design and presentation from data manipulation.
Sorry I couldn't provide a SSRS solution, maybe someone else will know more.