Pass a Parameter to a DAX Query in Power BI Report Builder

I need some help here. Below is a DAX query that I have copied over from Power BI into Power BI Report Builder. I'm looking to pass a parameter into this query for 'edw dim_paycom_amcs_location_xref'[Paycom_Location_Desc]. However, I'm not sure where to place it within the query. I've researched the heck out of this and no matter where I try to place it I receive errors. Can anyone help with this? Thank you very much.
// DAX Query
DEFINE
    VAR __DS0FilterTable =
        TREATAS ( { "2021" }, 'edw dimDate'[Year] )
    VAR __DS0FilterTable2 =
        TREATAS ( { "August" }, 'edw dimDate'[MonthName] )
    VAR __DS0Core =
        SUMMARIZECOLUMNS (
            'edw dimDate'[MonthYear],
            'edw dimDate'[Month],
            'edw dim_paycom_amcs_location_xref'[Paycom_Location_Desc],
            __DS0FilterTable,
            __DS0FilterTable2,
            "SumOvertime_Hours_by_Day", CALCULATE ( SUM ( 'PaycomHours'[Overtime_Hours_by_Day] ) ),
            "SumReg_Hours_by_Day", CALCULATE ( SUM ( 'PaycomHours'[Reg_Hours_by_Day] ) ),
            "Transportation", 'PaycomHours'[Transportation],
            "Total_Inbound_Tons", 'PaycomHours'[Total Inbound Tons],
            "Total_Inbound_Tons__excl_Yakima_", 'PaycomHours'[Total Inbound Tons (excl Yakima)],
            "No_Operating_Days", 'edw dimDate'[No.Operating Days],
            "Tonnage_Inbound__3rd_Party", 'PaycomHours'[Tonnage Inbound- 3rd Party],
            "Tonnage_Inbound__Intercompany", 'PaycomHours'[Tonnage Inbound- Intercompany],
            "Tonnage_Inbound___3rd_Party_Metal", 'PaycomHours'[Tonnage Inbound - 3rd Party Metal],
            "Tonnage___Intercompany_Metal", 'PaycomHours'[Tonnage - Intercompany Metal],
            "Tonnage___Intercompany_Hog_Fuel", 'PaycomHours'[Tonnage - Intercompany Hog Fuel],
            "Tonnage___3rd_Party_Hog_Fuel", 'PaycomHours'[Tonnage - 3rd Party Hog Fuel],
            "Total_Commodities_Volume_Sold", 'PaycomHours'[Total Commodities Volume Sold],
            "Tonnage___Intercompany_Cardboard", 'PaycomHours'[Tonnage - Intercompany Cardboard],
            "Tonnage___Intercompany_ALL", 'PaycomHours'[Tonnage - Intercompany ALL],
            "Tonnage___3rd_Party_ALL", 'PaycomHours'[Tonnage - 3rd Party ALL]
        )
    VAR __DS0PrimaryWindowed =
        TOPN (
            501,
            __DS0Core,
            'edw dimDate'[Month], 1,
            'edw dimDate'[MonthYear], 1,
            'edw dim_paycom_amcs_location_xref'[Paycom_Location_Desc], 1
        )
EVALUATE
    __DS0PrimaryWindowed
ORDER BY
    'edw dimDate'[Month],
    'edw dimDate'[MonthYear],
    'edw dim_paycom_amcs_location_xref'[Paycom_Location_Desc]

I'd suggest the following:
DEFINE
    VAR __DS0FilterTable =
        TREATAS ( { "2021" }, 'edw dimDate'[Year] )
    VAR __DS0FilterTable2 =
        TREATAS ( { "August" }, 'edw dimDate'[MonthName] )
    VAR __DS0FilterTable3 =
        TREATAS ( { #Location }, 'edw dim_paycom_amcs_location_xref'[Paycom_Location_Desc] )
    VAR __DS0Core =
        SUMMARIZECOLUMNS (
            'edw dimDate'[MonthYear],
            'edw dimDate'[Month],
            'edw dim_paycom_amcs_location_xref'[Paycom_Location_Desc],
            __DS0FilterTable,
            __DS0FilterTable2,
            __DS0FilterTable3,
[... Remainder of query the same ...]
Be sure to map #Location to your report parameter in the dataset's Parameters tab:
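In case the screenshot doesn't come through, the mapping lives under Dataset Properties > Parameters and looks roughly like this (report parameter name assumed to be Location):
Parameter Name : Location
Parameter Value: [#Location]   (i.e. the expression =Parameters!Location.Value)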

I've struggled a lot with parameters in DAX in Report Builder. There are a few different approaches depending on what your needs are.
The first question: do you need users to be able to change the value of the parameter? If so, you will need to define it under "Parameters" in Report Data.
Next, regardless of your previous answer, you need to define the parameter in your dataset.
If you are using a user-selectable parameter, use the [#User_Parameter_Name] notation. If you click into the function editor, you will see this corresponds to the VB.Net expression =Parameters!User_Parameter_Name.Value.
The second question is whether you are entering the query directly into the Query box in the Dataset Properties. If you are just pasting your DAX from PBI into that box, then you should be good to go.
However, if you are using the Query Designer (and I can't blame you if you don't), then you need to do as you have already discovered and re-declare the parameter within the Query Designer.
The problem with the Query Parameters here is that they really struggle with defaulting to empty values... Also, when I have a parameter, the Query Designer always seems to forget that I'm using DAX and not MDX every time I open it.
But anyway, once you declare it there, you can refer to it in the query itself, as in the sketch below.
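For example, once #Location is declared there, it can sit anywhere a literal could; a minimal sketch reusing the filter-table pattern from the answer above:
VAR __DS0FilterTable3 =
    TREATAS ( { #Location }, 'edw dim_paycom_amcs_location_xref'[Paycom_Location_Desc] )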
Finally, if you need to pass a list of values instead of a single value, the only way I've gotten this to work is to use this strange, little-documented function RSCustomDaxFilter to create a filter table:
VAR MonthFilter =
    SUMMARIZECOLUMNS (
        'Month'[Month],
        RSCustomDaxFilter ( #MonthQP, EqualToCondition, [Month].[Month], String )
    )
I can make some inferences on how this function works; the Table.Field syntax and the need to specify the object type make me think it was at least modeled after VB.Net. But aside from examples of how to use it in pretty much only this exact scenario, I have yet to find any official documentation on how it works; the best explanation is from Chris Webb's blog, back in 2019. Also, it loves to freeze during query preparation if you don't use the Query Designer!
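For intuition only (this is an inference from Chris Webb's write-up, not documented behavior): the call above appears to expand into one equality condition per selected value, so its effect is roughly what you would get by hand-writing the multi-value filter yourself. The month names here are placeholders for whatever the report parameter contains:
VAR MonthFilter =
    SUMMARIZECOLUMNS (
        'Month'[Month],
        FILTER ( ALL ( 'Month'[Month] ), 'Month'[Month] IN { "January", "August" } )
    )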

DAX pattern schematic that I use. There is no need for the RSCustomDaxFilter function, so you can use DAX Studio to work on your queries. Hallelujah :}
/*
History
yyyymmdd aa build
Datasets
abbrev = workspace.dataset.table
Notes
+ etc
*/
EVALUATE
-- parse the parameter value lists ready for the main queries
-- e.g. the multi-select value "Alpha, Beta, NA" becomes the PATH-style string "Alpha|Beta|NA"
VAR ParameterA = SUBSTITUTE ( TRIM ( SUBSTITUTE ( #ParameterA, ",", "|" ) ), "| ", "|" )
-- build the lists
VAR TableA =
    FILTER (
        SELECTCOLUMNS (
            'table',
            "column a", 'table'[column a],
            "column b", 'table'[column b],
            etc
        ),
        -- column a vs ParameterA
        SWITCH (
            TRUE (),
            LEN ( ParameterA ) = 0, TRUE (), -- no selection: ignore this filter
            PATHCONTAINS ( ParameterA, "All" ), TRUE (), -- "All" selected: ignore this filter
            IF (
                PATHCONTAINS ( ParameterA, "NA" ), -- NA parameter selection
                IF (
                    [column a] IN { "NA", "0", "Not Applicable", "Select", " ", "", BLANK () },
                    TRUE (), -- NA value found
                    IF (
                        PATHCONTAINS ( ParameterA, [column a] ),
                        TRUE (), -- direct match found
                        FALSE () -- out of scope condition
                    )
                ),
                FALSE ()
            ), TRUE (),
            PATHCONTAINS ( ParameterA, [column a] ), TRUE (), -- direct match found
            FALSE ()
        )
            && etc
    )
VAR TableB =
    etc
-- join the lists
VAR Result =
    NATURALINNERJOIN ( TableA, TableB )
RETURN
    Result
    --ROW ( "ParameterA", """" & ParameterA & """" ) -- parameter checker
ORDER BY
    [column a] ASC,
    [column b] ASC
/* testing framework (DAX Studio) */
<Parameters etc
    <Parameter>
        <Name></Name>
        <Value xsi:type="xsd:string"></Value>
    </Parameter>
    etc
</Parameters>


Modifying an SSRS textbox expression

I designed a tablix report with a text box called Student_Attendance which displays the information below.
Student_Attendance
Sick
Absence
Present
I have tried to use an IIF statement in order to show it as S, A, P. Other than IIF, is there anything else I could use to get my result?
IIF (Fields!Student_Attendance.value = "Sick", "S" ) and
IIF(Fields!Student_Attendance.value = "Absence" , "A")
IIF takes 3 arguments:
1. The condition (if field = value)
2. What to return if true (e.g. "S")
3. What to return if false - you are missing this
If you want to use IIF then you have to nest the IIFs:
=IIF(Fields!Student_Attendance.Value = "Sick", "S", IIF(Fields!Student_Attendance.Value = "Absence", "A", "P"))
What might be simpler is SWITCH, especially if you have more than a few options; something like this:
=SWITCH(
    Fields!Student_Attendance.Value = "Sick", "S",
    Fields!Student_Attendance.Value = "Absence", "A",
    Fields!Student_Attendance.Value = "Present", "P"
)

Django / PostgreSQL jsonb (JSONField) - convert select and update into one query

Versions: Django 1.10 and Postgres 9.6
I'm trying to modify a nested JSONField's key in place without a round trip to Python. The reason is to avoid race conditions, with multiple queries overwriting the same field with different updates.
I tried to chain the methods in the hope that Django would make a single query but it's being logged as two:
Original field value (demo only, real data is more complex):
from exampleapp.models import AdhocTask
record = AdhocTask.objects.get(id=1)
print(record.log)
> {'demo_key': 'original'}
Query:
from django.db.models import F
from django.db.models.expressions import RawSQL
(AdhocTask.objects.filter(id=1)
    .annotate(temp=RawSQL(
        # `jsonb_set` takes the current json value of the `log` field,
        # finds the nominated key ("demo_key" in this example)
        # and replaces its value with the json provided ('"new value"').
        # The raw SQL is wrapped in triple quotes to avoid escaping each quote.
        """jsonb_set(log, '{"demo_key"}','"new value"', false)""", []))
    # Finally, get the temp field and overwrite the original JSONField
    .update(log=F('temp'))
)
Query history (shows this as two separate queries):
from django.db import connection
print(connection.queries)
> [{'sql': 'SELECT "exampleapp_adhoctask"."id", "exampleapp_adhoctask"."description", "exampleapp_adhoctask"."log" FROM "exampleapp_adhoctask" WHERE "exampleapp_adhoctask"."id" = 1', 'time': '0.001'},
> {'sql': 'UPDATE "exampleapp_adhoctask" SET "log" = (jsonb_set(log, \'{"demo_key"}\',\'"new value"\', false)) WHERE "exampleapp_adhoctask"."id" = 1', 'time': '0.001'}]
It would be much nicer without RawSQL.
Here's how to do it:
from django.db.models.expressions import Func

class ReplaceValue(Func):
    function = 'jsonb_set'
    template = "%(function)s(%(expressions)s, '{\"%(keyname)s\"}','\"%(new_value)s\"', %(create_missing)s)"
    arity = 1

    def __init__(
        self, expression: str, keyname: str, new_value: str,
        create_missing: bool = False, **extra,
    ):
        super().__init__(
            expression,
            keyname=keyname,
            new_value=new_value,
            create_missing='true' if create_missing else 'false',
            **extra,
        )
AdhocTask.objects.filter(id=1) \
    .update(log=ReplaceValue(
        'log',
        keyname='demo_key',
        new_value='another value',
        create_missing=False,
    ))
ReplaceValue.template is the same as your raw SQL statement, just parametrized.
(jsonb_set(log, '{"demo_key"}','"another value"', false)) from your query is now jsonb_set("exampleapp_adhoctask"."log", '{"demo_key"}','"another value"', false). The parentheses are gone (you can get them back by adding them to the template) and log is referenced in a fully qualified way.
Anyone interested in more details regarding jsonb_set should have a look at table 9-45 in postgres' documentation: https://www.postgresql.org/docs/9.6/static/functions-json.html#FUNCTIONS-JSON-PROCESSING-TABLE
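For reference, that table gives jsonb_set the signature jsonb_set(target jsonb, path text[], new_value jsonb, create_missing boolean). A quick illustration directly in SQL:
SELECT jsonb_set('{"demo_key": "original"}'::jsonb, '{demo_key}', '"new value"', false);
-- returns {"demo_key": "new value"}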
Rubber duck debugging at its best - in writing the question, I realised the solution. Leaving the answer here in the hope of helping someone in the future:
Looking at the queries, I realised that the RawSQL was actually being deferred until query two, so all I was doing was storing the RawSQL as a subquery for later execution.
Solution:
Skip the annotate step altogether and pass the RawSQL expression straight into the .update() call. This allows you to dynamically update PostgreSQL jsonb sub-keys on the database server without overwriting the whole field:
(AdhocTask.objects.filter(id=1)
    .update(log=RawSQL(
        """jsonb_set(log, '{"demo_key"}','"another value"', false)""", [])
    )
)
> 1 # Success
print(connection.queries)
> {'sql': 'UPDATE "exampleapp_adhoctask" SET "log" = (jsonb_set(log, \'{"demo_key"}\',\'"another value"\', false)) WHERE "exampleapp_adhoctask"."id" = 1', 'time': '0.001'}]
print(AdhocTask.objects.get(id=1).log)
> {'demo_key': 'another value'}

SSIS Derived Column translated from a T-SQL query

New to SSIS; I've mostly been working in SSMS. Can anyone help translate the T-SQL CASE statement below into an SSIS Derived Column Transformation? Many thanks.
ReliabilityFactorInput = CASE
    WHEN (ISNULL(pn.LBOXMATL, 'OTHER') = 'OTHER' AND (ROUND(ISNULL(edd.cal_year, eqd.YearManuf) + 1, -4) / 10000 <= 2003) OR pn.LBOXMATL = 'Cast Iron') AND (CEILING((pn.NOWAYS + 1) / 2) * 2 >= 4) THEN '1.3'
    WHEN (ISNULL(pn.LBOXMATL, 'OTHER') = 'OTHER' AND (ROUND(ISNULL(edd.cal_year, eqd.YearManuf) + 1, -4) / 10000 <= 2003) OR pn.LBOXMATL = 'Cast Iron') AND (CEILING((pn.NOWAYS + 1) / 2) * 2 < 4) THEN '1.1'
    ELSE ''
END
1. Create a variable with whatever name you want; give it a String (or Decimal) data type, since the values '1.3' and '1.1' are not integers.
2. Use an Execute SQL Task.
3. Copy the complete query into that task and set the ResultSet property to "Single row".
4. Switch to the Result Set page, choose the variable you created, and set the result name to 0.
5. Now every time you run the package the variable will be assigned either 1.3 or 1.1.
That variable can then be used in a Derived Column transformation in the data flow, as sketched below.
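To make the steps concrete, here is a sketch. The variable name User::ReliabilityFactor is assumed, and the FROM/JOIN clauses are whatever your existing query already uses for pn, edd and eqd:
-- Execute SQL Task query; set ResultSet = "Single row" and map result name 0 to User::ReliabilityFactor
SELECT CASE
           WHEN (ISNULL(pn.LBOXMATL, 'OTHER') = 'OTHER'
                 AND ROUND(ISNULL(edd.cal_year, eqd.YearManuf) + 1, -4) / 10000 <= 2003
                 OR pn.LBOXMATL = 'Cast Iron')
                AND CEILING((pn.NOWAYS + 1) / 2) * 2 >= 4 THEN '1.3'
           WHEN (ISNULL(pn.LBOXMATL, 'OTHER') = 'OTHER'
                 AND ROUND(ISNULL(edd.cal_year, eqd.YearManuf) + 1, -4) / 10000 <= 2003
                 OR pn.LBOXMATL = 'Cast Iron')
                AND CEILING((pn.NOWAYS + 1) / 2) * 2 < 4 THEN '1.1'
           ELSE ''
       END AS ReliabilityFactorInput
FROM ... -- same source tables and joins as in your original query
The Derived Column expression is then just a reference to the variable:
@[User::ReliabilityFactor]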

Hiding a table or assigning temp data based on a visibility expression in SSRS 2008

I have a table in ssrs 2008. This table has a row visibility expression like:
=IIF(
    IIF(max(Fields!VExpected.Value) <> "", 1, 0) +
    IIF(max(Fields!MExpected.Value) <> "", 1, 0) +
    IIF(max(Fields!PExpected.Value) <> "", 1, 0) = 3, False, True)
Sometimes the data source returns no data, or the returned data does not match this expression. In that case what I see is a table with borders and column names but no data in it, like:
id Vex Mex Pex
However, I want to show it as
id Vex Mex Pex
- - - -
Or if possible:
id Vex Mex Pex
No Data
Another question: is there any way to hide the complete table if there is no returned data, or no data matching the expression?
Thanks
You can use the CountRows function to determine how many rows your dataset returns. If it is zero, hide the table; otherwise show it.
=iif(CountRows("DataSetName")=0,true,false)
Replace DataSetName with the actual name of your dataset.
For data that doesn't match the expression, you can use this expression:
=IIF(
    max(Fields!VExpected.Value) <> "" AND
    max(Fields!MExpected.Value) <> "" AND
    max(Fields!PExpected.Value) <> "", False, True
)
The whole expression, covering both the no-rows case and the not-matching case, could be something like this:
=Switch(
    CountRows("DataSetName") = 0, True,
    max(Fields!VExpected.Value) = "", True,
    max(Fields!MExpected.Value) = "", True,
    max(Fields!PExpected.Value) = "", True,
    True, False
)
Supposing the VExpected, MExpected and PExpected values are numeric, I'd use the ISNOTHING() function to determine when null values are being returned:
=Switch(
    CountRows("DataSetName") = 0, True,
    ISNOTHING(max(Fields!VExpected.Value)), True,
    ISNOTHING(max(Fields!MExpected.Value)), True,
    ISNOTHING(max(Fields!PExpected.Value)), True,
    True, False
)
Additionally, you can set a message for when no rows are returned from your dataset. Select the tablix and press F4 to see the Properties window. Go to the NoRowsMessage property and use an expression to tell your users there is no data:
="There is no data."
In this case the tablix will not appear in your report, but the message you set will be rendered in the location where the tablix would have been.
Let me know if this helps.

postgres crosstab query with $libdir/tablefunc crosstab_hash function

My crosstab query (see below) runs just fine. However, I have to generate a large number of such queries, and - crucially - the number of column definitions will vary from day to day. If the number of output column definitions does not match the result of the crosstab's second argument, the crosstab will throw an error and abort. Therefore, I cannot "hard-wire" the column definitions as in my current query; instead I need a function which will ensure that the column definitions are synchronized on the fly. Is it possible to write a generic Postgres function that will be reusable in all such instances? Here is my query:
SELECT *
FROM crosstab
('SELECT
to_char(ipstimestamp, ''mon DD HH24h'') As row_name,
ips.objectid::text As category,
COUNT(*)::integer As value
FROM loggingdb_ips_boolean As log
INNER JOIN IpsObjects As ips
ON log.Varid=ips.ObjectId
WHERE (( log.varid = 37551)
OR (log.varid = 27087)
OR (log.varid = 29469)
OR (log.varid = 50876)
OR (log.varid = 45096)
OR (log.varid = 54708)
OR (log.varid = 47475)
OR (log.varid = 54606)
OR (log.varid = 25528)
OR (log.varid = 54729))
GROUP BY to_char(ipstimestamp, ''yyyy MM DD HH24h''), row_name, objectid, category
ORDER BY to_char(ipstimestamp, ''yyyy MM DD HH24h''), row_name, objectid, category',
'SELECT DISTINCT varid
FROM loggingdb_ips_boolean ORDER BY 1;'
)
As CountsPerHour(row_name text,
"25528" integer,
"27087" integer,
"29469" integer,
"37551" integer,
"45096" integer,
"54606" integer,
"54708" integer,
"54729" integer)
PS: Note that this query can be run against test data at the following server:
host: bellariastrasse.com
database: IpsLogging
user: guest
password: guest
I am afraid what you want is not completely possible. If the return type varies, you can either:
1. Create a function returning a generic SETOF record. But then you'd have to provide a column definition list with every call - bringing you right back to where you started.
2. Create a new function with a matching return type for every different case. But that's what you are trying to avoid ...
If you have to write "a large number of such queries" you could utilize a query-generator function instead, which would not return the results but the SQL statement, which you would then execute in a second step. Basically, a function that takes in the variable parts as parameters and generates the query string of your example ... RETURNS text.
This can get pretty complex. Several meta-levels on top of each other have to be considered, but it is absolutely possible. Be sure to make heavy use of dollar-quoting to keep the quoting madness at bay.
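A minimal sketch of such a generator, under assumptions: the function name crosstab_sql is invented, the source query is simplified from the one in the question, and you execute the returned text yourself in a second step (e.g. with \gexec in psql, or EXECUTE in a DO block):
CREATE OR REPLACE FUNCTION crosstab_sql(_varids int[]) RETURNS text
LANGUAGE plpgsql AS
$func$
DECLARE
   _cols text;  -- generated column definition list, e.g. "25528" integer, "27087" integer, ...
BEGIN
   SELECT string_agg(format('%I integer', v::text), ', ' ORDER BY v)
   INTO   _cols
   FROM   unnest(_varids) v;

   RETURN format(
$sql$SELECT *
FROM crosstab(
   $q$SELECT to_char(ipstimestamp, 'mon DD HH24h') AS row_name,
             ips.objectid::text AS category,
             count(*)::integer  AS value
      FROM   loggingdb_ips_boolean log
      JOIN   IpsObjects ips ON log.varid = ips.objectid
      WHERE  log.varid = ANY ('%s'::int[])
      GROUP  BY 1, 2
      ORDER  BY 1, 2$q$,
   $q$SELECT unnest('%s'::int[])::text ORDER BY 1$q$
) AS CountsPerHour(row_name text, %s)$sql$,
   _varids, _varids, _cols);
END
$func$;

-- Generate the statement for whatever varids exist today, then execute the result:
SELECT crosstab_sql(ARRAY(SELECT DISTINCT varid FROM loggingdb_ips_boolean ORDER BY 1));
Because the column definition list and the category query are built from the same array, they can no longer drift apart. Note the three levels of dollar-quoting ($func$, $sql$, $q$) doing exactly the job described above.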