Using Expressions in SSIS - Casting Issue

I'm using expressions in the Control Flow to choose different Data Flow Tasks based upon the name of the file being parsed (basically loading data from an Excel Source into an OLE DB Destination).
I'm using the following Expression and its variants to get the desired result:
(DT_I4) (DT_WSTR, 2) FINDSTRING( @[User::srcFilePath] , "DIVISION", 1)
(DT_BOOL) (DT_WSTR, 2) FINDSTRING( @[User::srcFilePath] , "DIVISION", 1)
(DT_BOOL) FINDSTRING( @[User::srcFilePath] , "DIVISION", 1)
I basically just want the DFT to be executed only if the expression is true, but I keep getting casting errors.
The most promising result comes when I use this expression:
(DT_WSTR, 2) FINDSTRING( @[User::srcFilePath] , "DIVISION", 1)
where the result is something like '45' in string form, so I cannot use any logical operators on it or compare it against a number.
Any help would be much appreciated.

Why not just compare the FINDSTRING result against 0 (not found)?
FINDSTRING("DIVISION_ABC.xls", "DIVISION", 1 ) > 0
That would evaluate to True - just replace "DIVISION_ABC.xls" with @[User::srcFilePath].
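So, as a minimal sketch (using the variable name from the question), the expression on the precedence constraint leading into the DFT could simply be:
FINDSTRING( @[User::srcFilePath] , "DIVISION", 1) > 0
The comparison itself already yields a Boolean, so no extra cast to DT_BOOL is needed.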


Double Quotes in temporary JSON variable on MySQL using R

I have a table in MySQL that contains user interactions with a web page. I need to extract the rows for each user where the date of the interaction is earlier than a certain benchmark date, and that benchmark date is different for each customer (I extract that date from a different database).
My approach was to set a JSON variable in which the key is a user and the value is the benchmark date, and to use it in the query to extract the intended fields.
Example in R:
# MainDF contains the user and the benchmark date from a different database
json_str <- mapply(function(uid, bench_date) {
  paste0('{', '"', uid, '"', ':', '"', bench_date, '"', '}')
}, MainDF[, 'uid'],
   MainDF[, 'date']
)
json_str <- paste0("'", '[', paste0(json_str , collapse = ','), ']', "'")
temp_var <- paste('set @test=', json_str)
The intention was for temp_var to look like:
set @test= '{"0001":"2010-05-05",
"0012":"2015-05-05",
"0101":"2018-07-20"}'
but it actually looks like:
set @test= '{\"0001\":\"2010-05-05\",
\"0012\":\"2015-05-05\",
\"0101\":\"2018-07-20\"}'
then create the main query:
main_Q <- "select user_id, date
from interaction
where 1=1
and json_contains(json_keys(@test), concat('\"',user_id,'\"')) = 1
and date <= json_unquote(json_extract(@test,
concat('$.','\"',user_id, '\"')
)
)
"
To execute it, I first set the temporary variable and then run the main query:
dbSendQuery(connection, temp_var)
resp <- dbSendQuery(connection, main_Q )
target_df <- fetch(resp, n=-1)
dbClearResult(resp )
When I test a fraction of it in a SQL IDE it does work. However, in R it doesn't return anything.
I think the issue is that R escapes the double quotes in temp_var, so SQL ends up reading
set @test= '{\"0001\":\"2010-05-05\",
\"0012\":\"2015-05-05\",
\"0101\":\"2018-07-20\"}'
which won't work.
For example if I execute:
set @test= '{"0001":"2010-05-05",
"0012":"2015-05-05",
"0101":"2018-07-20"}'
select json_keys(@test)
it will return an array with the keys, but that is not the case with
set @test= '{\"0001\":\"2010-05-05\",
\"0012\":\"2015-05-05\",
\"0101\":\"2018-07-20\"}'
select json_keys(@test)
I am not sure how to solve the issue, but I need the double quotes to specify the JSON. Is there any other approach I should try, or a way to make this work?
First, I think it is generally better to use a well-known library/package for converting to/from JSON, for several reasons.
Using jsonlite::toJSON gives you a string that you should be able to place just about anywhere:
json_str <- jsonlite::toJSON(setNames(as.list(MainDF$date), MainDF$uid), auto_unbox=TRUE)
json_str
# {"0001":"2010-05-05","0012":"2015-05-05","0101":"2018-07-20"}
And while looking at the object on the R console will show escaped double quotes,
as.character(json_str)
# [1] "{\"0001\":\"2010-05-05\",\"0012\":\"2015-05-05\",\"0101\":\"2018-07-20\"}"
that is merely R's representation (it shows all strings within double quotes, and therefore needs to escape any double quotes inside the string).
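If you want to convince yourself of that, here is a quick illustrative check (a throw-away example, not the data above):
s <- '{"a":1}'
nchar(s)       # 7 -- there are no backslash characters in the string itself
cat(s, '\n')   # {"a":1}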
Adding it into some script should be straight-forward:
cat(paste('set @test=', sQuote(json_str)), '\n')
# set @test= '{"0001":"2010-05-05","0012":"2015-05-05","0101":"2018-07-20"}'
I'm assuming that having each on its own row is not critical. If it is, and indentation is important, perhaps this is more your style:
spaces <- strrep(' ', 2 + nchar('set @test = '))
cat(paste0('set @test = ', sQuote(gsub(",", paste0(",\n", spaces), json_str))), '\n')
# set @test = '{"0001":"2010-05-05",
#               "0012":"2015-05-05",
#               "0101":"2018-07-20"}'
Data:
MainDF <- read.csv(stringsAsFactors=FALSE, colClasses='character', text='
uid,date
0001,2010-05-05
0012,2015-05-05
0101,2018-07-20')

Error DeserializeJSON() MySQL json_object

I am getting back a JSON string from a MySQL 5.7 query in ColdFusion 9.0.1. Here is my query:
SELECT (
SELECT GROUP_CONCAT(
JSON_OBJECT(
'nrtype', nrt.nrtype,
'number', nr.number
)
)
) AS nrJSON
FROM ...
The returned data looks like this:
{"nrtype": "Phone 1", "number": "12345678"},{"nrtype": "E-Mail 1", "number": "some#email.com"}
But as soon as I try to use DeserializeJSON() on it I am getting the following error:
JSON parsing failure at character 44:',' in {"nrtype": "Phone 1", "number": "12345678"},{"nrtype": "E-Mail 1", "number": "some@email.com"}
I am a little confused. What I want to get is a structure created by the DeserializeJSON() function.
What can I do?
That is not valid JSON, as the parser is telling you. If you wrap that JSON within square brackets '[' and ']' it would be valid (or at least parsable). The brackets will make it an array of structures. I'm not sure how to make MySQL return the data within those brackets, though.
I guess you could add the brackets using ColdFusion but I would prefer to have the source do it correctly.
jsonhack = '[' & queryname.nrJSON & ']';
datarecord = DeserializeJSON(jsonhack);
writeDump(datarecord);
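The parsed result is then an array of structs, so - just to sketch the usage - the individual values can be read like this:
writeOutput(datarecord[1].nrtype); // Phone 1
writeOutput(datarecord[2].number); // some@email.com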
I created an example with your data that you can see here - trycf.com gist
From the comments
The solution indeed was [to add the following to the SQL statement]:
CONCAT('[',
GROUP_CONCAT(
JSON_OBJECT(...)
),
']')
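Applied to the query from the question, that would look roughly like this (a sketch; the FROM clause is elided just as in the original):
SELECT CONCAT(
    '[',
    GROUP_CONCAT(
        JSON_OBJECT(
            'nrtype', nrt.nrtype,
            'number', nr.number
        )
    ),
    ']'
) AS nrJSON
FROM ...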
If you have columns where some values already contain a JSON-formatted string, try this: https://stackoverflow.com/a/45278722/2282880
Portion of code with JSON_MERGE():
...
CONCAT(
'{"elements": [',
GROUP_CONCAT(
JSON_MERGE(
JSON_OBJECT(
'type', T2.`type`,
'data', T2.`data`
),
CONCAT('{"info": ', T2.`info`, '}')
)
),
']}'
) AS `elements`,
...

SSIS Filename variable with wildcards

I am looking at creating an SSIS package that copies the latest file from a specific directory. There are multiple files output to the directory each day, and each gets date- and time-stamped in the file name:
FLATFILE_20150909_130801.txt
FLATFILE_20150909_230508.txt
I am only interested in the latest file of the day, whose time portion will always have the same prefix, 23. I've written the expression as
"\\\\DESTINATION\\FLATFILE"+ "_"+ (DT_STR,4,1252)DATEPART( "yyyy" , getdate() ) + RIGHT("0" + (DT_STR,4,1252)DATEPART( "mm" , getdate() ), 2) + RIGHT("0" + (DT_STR,4,1252)DATEPART( "dd" , getdate() ), 2) + "_"+ "23"+ "****"+ ".sqb"
which returns FLATFILE_20150909_23****.txt.
Is there any way to assign wildcards to those characters in the string (the ****) so they are ignored?
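For what it's worth, the Foreach Loop container's file enumerator accepts * wildcards in its Files specification, so one possible sketch is to build only the wildcard pattern with an expression and let the enumerator do the matching (the prefix and extension are taken from the file names above; everything else is an assumption about the setup):
"FLATFILE_"
+ (DT_STR, 4, 1252) DATEPART("yyyy", GETDATE())
+ RIGHT("0" + (DT_STR, 2, 1252) DATEPART("mm", GETDATE()), 2)
+ RIGHT("0" + (DT_STR, 2, 1252) DATEPART("dd", GETDATE()), 2)
+ "_23*.txt"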

SSRS Format seconds as time (negative seconds)

I have a column of integers, TotalSec, that holds seconds. It can be 0, negative, or positive.
I have to format these seconds in a report, but I can't get anything working for the negative seconds.
My logic:
For 0: Nothing
For positive: format as HH:mm:ss
For negative: ABS the value, then format as -HH:mm:ss
=IIF(SUM(Fields!TotalSec.Value)=0, Nothing, IIF(SUM(Fields!TotalSec.Value)>0,
Format(DateAdd("s",SUM(Fields!TotalSec.Value), "00:00:00"), "HH:mm:ss"), "-" & Format(DateAdd("s",ABS(SUM(Fields!TotalSec.Value)), "00:00:00"), "HH:mm:ss")))
I get an #Error for the negative numbers, with the warning:
Warning 2 [rsRuntimeErrorInExpression] The Value expression for the textrun ‘TotalSec.Paragraphs[0].TextRuns[0]’ contains an error: The added or subtracted value results in an un-representable DateTime. Parameter name: value
It worked like this (with ABS in both branches, so neither DateAdd ever receives a negative value - SSRS's IIF evaluates both branches regardless of the condition):
=IIF(SUM(Fields!TotalSec.Value)=0,Nothing,IIF(SUM(Fields!TotalSec.Value)< 0,"-"&Format(DateAdd("s",ABS(SUM(Fields!TotalSec.Value)), "00:00:00"), "HH:mm:ss"),Format(DateAdd("s",ABS(SUM(Fields!TotalSec.Value)), "00:00:00"), "HH:mm:ss")))
I cleaned up the previous answer:
=IIf(Fields!elapsed.Value < 0, "-", "+")
&
Format(
DateAdd(
"s",
Abs(Fields!elapsed.Value),
"00:00:00"
),
"HH:mm:ss" ' can be m:ss
)
The "+" is to keep the results lined up, and can be replaced with "" if desired.

IIF Statements with blank or alpha characters in MS Visual Basic

I have a field CODE_USER_2 that can be equal to 1.75, 2, 2.62, 3.75, 5.25, 6, "OT" followed by 2 spaces, or a blank of 4 spaces. If it is 1.75, 2, 2.62, 3.75, 5.25, or 6, I would like corresponding weights to result (this part works).
If the field is blank or "OT", I would like the expression to return 0. I currently get #Error with the following formula.
=IIf(Fields!CODE_USER_2_IM.Value= " " OR "OT " ,0,Switch(Fields!CODE_USER_2_IM.Value=1.75,.629,Fields!CODE_USER_2_IM.Value=2,.67,Fields!CODE_USER_2_IM.Value=2.62,1.089,Fields!CODE_USER_2_IM.Value=3.75,1.767,Fields!CODE_USER_2_IM.Value=5.25,3.224,Fields!CODE_USER_2_IM.Value=6,3.895))
Please let me know if you have ideas!
You have to repeat "Fields!CODE_USER_2_IM.Value = " after the OR.
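In other words, something along these lines (a sketch built from the expression in the question; the exact number of spaces in the two string literals is an assumption):
=IIf(Fields!CODE_USER_2_IM.Value = "    " OR Fields!CODE_USER_2_IM.Value = "OT  "
    , 0
    , Switch(Fields!CODE_USER_2_IM.Value=1.75, .629,
             Fields!CODE_USER_2_IM.Value=2, .67,
             Fields!CODE_USER_2_IM.Value=2.62, 1.089,
             Fields!CODE_USER_2_IM.Value=3.75, 1.767,
             Fields!CODE_USER_2_IM.Value=5.25, 3.224,
             Fields!CODE_USER_2_IM.Value=6, 3.895,
             True, 0))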
Or try this - less wordy but more obscure:
=IIf( ( " |OT |" ).Contains(Fields!CODE_USER_2_IM.Value + "|"
,0,Switch(Fields!CODE_USER_2_IM.Value=1.75,.629,Fields!CODE_USER_2_IM.Value=2,.67,Fields!CODE_USER_2_IM.Value=2.62,1.089,Fields!CODE_USER_2_IM.Value=3.75,1.767,Fields!CODE_USER_2_IM.Value=5.25,3.224,Fields!CODE_USER_2_IM.Value=6,3.895
, True , 0))
Always close out a Switch with:
, True , [default value] )
Also beware that matching in SSRS expressions is exact, e.g. 2 spaces does not equal 4 spaces.