What should the structure be if I want to use concat and encodeURIComponent in the same JSON for an Azure DevOps pipeline?

I have code in my logic app that gets a list of warehouses from an API URL. In it, I am trying to define a parameter that picks my URL for the Azure DevOps pipeline when deploying my logic app, but when I run the pipeline I get the error:
Deployment template validation failed: 'The template resource
'MyLogicApp' at line '1' and column '1465' is not valid: Unable to
parse language expression
'concat('/v2/datasets/#{encodeURIComponent(encodeURIComponent(''https://',parameters('myApiUrl'),'.com''))}/tables/#{encodeURIComponent(encodeURIComponent('msdyn_warehouses'))}/items')':
expected token 'RightParenthesis' and actual 'Identifier'.. Please see
https://aka.ms/arm-template-expressions for usage details.'.
This is what my JSON looks like:
"
[
concat
(
'
/v2/datasets/#
{
encodeURIComponent
(
encodeURIComponent
(
'
'
https://
',
parameters
(
'
myApiUrl
'
),
'
.com
'
'
)
)
}/tables/#
{
encodeURIComponent
(
encodeURIComponent
(
'
msdyn_warehouses
'
)
)
}/items
'
)
]
"
If I use it without concat, it works:
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('https://bla.bla3.com'))}/tables/#{encodeURIComponent(encodeURIComponent('msdyn_warehouses'))}/items"
Is there a specific way to format concat and encodeURI together that I am missing?

You are receiving the above error because the expression is malformed: inside the #{...} token, the string literal is interrupted by the comma and the parameters() call, so the parser finds an identifier where it expects a right parenthesis. Since you need the URI built from the parameter inside the encoding call, you should place concat() inside encodeURIComponent. This is how to frame it:
/v2/datasets/#{encodeURIComponent(encodeURIComponent(concat('https://',parameters('myApiUrl'),'.com')))}/tables/#{encodeURIComponent(encodeURIComponent('msdyn_warehouses'))}/items
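To see why the value is encoded twice, here is a minimal sketch in Python, using urllib.parse.quote as a rough stand-in for JavaScript's encodeURIComponent (the two differ on a few characters such as !'()*). The value 'myapi' is made up, standing in for parameters('myApiUrl'):

```python
from urllib.parse import quote

def encode_uri_component(s):
    # Rough stand-in for encodeURIComponent: percent-encode
    # everything except unreserved characters.
    return quote(s, safe="")

# Build the full URL first (what concat() does in the corrected
# expression), then apply the encoding twice to the result.
url = "https://" + "myapi" + ".com"
once = encode_uri_component(url)
twice = encode_uri_component(once)

print(once)   # https%3A%2F%2Fmyapi.com
print(twice)  # https%253A%252F%252Fmyapi.com
```

The doubly encoded form is what the connector path expects; the point of the fix is that the whole URL must be assembled before either encoding pass runs, not spliced together afterwards.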


I want to use a few variables extracted with a JSON Extractor in another controller in JMeter

In JMeter, I have extracted two variables from an API using a JSON Extractor. Each variable holds many values.
The first variable is name, with data like {abc,asd,qwe,dff,hjk,lku,ghs,jjss}. I used a JSON Extractor with the variable name name, the path expression $..name, match number -1, and the ALL suffix checked.
The second variable is id, with data like {123,344,6383,0298383,8282}. Again I used a JSON Extractor with the variable name id, the path expression $..id, match number -1, and the ALL suffix checked.
These two variables come from a GET API placed inside ForEach Controller 1.
There is another ForEach Controller 2 with a different GET API; in it I have likewise used a JSON Extractor and extracted two variables named idd and namee.
I have written comparison code in this controller using a BeanShell Assertion (the general compare code).
When the script runs, I get an assertion error: the expected data field is empty. The expected data should hold the ForEach Controller 1 variable data; the actual data (the ForEach Controller 2 data) is shown correctly.
If I don't place these two APIs inside ForEach Controllers, the assertion works fine, but I need the controllers because looping is needed.
How can I use one controller's variables in the other controller?
I have tried using ${__setProperty(name,${name})} in the BeanShell of the first ForEach Controller, then ${__property(name)} in the Controller 2 BeanShell Assertion compare code, but this didn't work.
If you have JMeter Variables like:
id_1=123
id_2=344
id_3=6383
etc.
from the first sampler
and variables like:
idd_1=123
idd_2=344
idd_3=6383
from the second sampler
and want to compare them one by one, set up a ForEach Controller that iterates over the idd prefix (output variable name idd).
Then in the JSR223 Assertion you will be able to refer to the current value of the idd variable as vars.get('idd') and to the current value of the matching id variable as vars.get('id_' + ((vars.get('__jm__ForEach Controller__idx') as int) + 1))
Example code:
// Current value supplied by the ForEach Controller
def expected = vars.get('idd')
// Matching value from the first controller: extractor suffixes are
// one-based, the ForEach __idx variable is zero-based, hence the +1
def actual = vars.get('id_' + ((vars.get('__jm__ForEach Controller__idx') as int) + 1))
log.info('idd variable value: ' + expected)
log.info('id variable value: ' + actual)
if (expected != actual) {
    AssertionResult.setFailure(true)
    AssertionResult.setFailureMessage('Expected: ' + expected + ', actual: ' + actual)
}
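The index arithmetic in that assertion can be illustrated outside JMeter. A minimal Python sketch with made-up sample values (the id_N keys stand in for the first controller's extracted variables, and the list stands in for the values the second ForEach Controller iterates over):

```python
# Sample values standing in for the JMeter variables: id_N comes from
# the first controller's extractor (one-based suffixes), idds holds
# the values the second ForEach Controller iterates over as 'idd'.
ids = {"id_1": "123", "id_2": "344", "id_3": "6383"}
idds = ["123", "344", "9999"]

failures = []
for idx, expected in enumerate(idds):
    # The ForEach __idx variable is zero-based while the extractor
    # suffix is one-based, hence the +1 (as in the Groovy assertion).
    actual = ids["id_" + str(idx + 1)]
    if expected != actual:
        failures.append((expected, actual))

print(failures)  # [('9999', '6383')]
```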

Error parsing JSON: more than one document in the input (Redshift to Snowflake SQL)

I'm trying to convert a query from Redshift to Snowflake SQL.
The Redshift query looks like this:
SELECT
cr.creatives as creatives
, JSON_ARRAY_LENGTH(cr.creatives) as creatives_length
, JSON_EXTRACT_PATH_TEXT(JSON_EXTRACT_ARRAY_ELEMENT_TEXT (cr.creatives,0),'previewUrl') as preview_url
FROM campaign_revisions cr
The Snowflake query looks like this:
SELECT
cr.creatives as creatives
, ARRAY_SIZE(TO_ARRAY(ARRAY_CONSTRUCT(cr.creatives))) as creatives_length
, PARSE_JSON(PARSE_JSON(cr.creatives)[0]):previewUrl as preview_url
FROM campaign_revisions cr
It seems like JSON_EXTRACT_PATH_TEXT isn't converted correctly, as the Snowflake query results in error:
Error parsing JSON: more than one document in the input
cr.creatives is formatted like this:
"[{""previewUrl"":""https://someurl.com/preview1.png"",""device"":""desktop"",""splitId"":null,""splitType"":null},{""previewUrl"":""https://someurl.com/preview2.png"",""device"":""mobile"",""splitId"":null,""splitType"":null}]"
It seems to me that you are not working with valid JSON data inside Snowflake.
Please review your file format used for the copy into command.
If you open the "JSON" text provided in a text editor, note that the information is not parsed or formatted as JSON because of the quoting you have. Once your issue with the double/escaped quotes is handled, you should be able to make good progress.
(Screenshot: proper JSON on the left || original data on the right.)
If you are not inclined to reload your data, see if you can create a JavaScript user-defined function (UDF) to remove the quotes from your string; then you can use Snowflake to process the variant column.
The following code is a working proof of concept that removes the double quotes for you.
// Sample of the raw column value, with the doubled quotes intact
var textOriginal = '[{""previewUrl"":""https://someurl.com/preview1.png"",""device"":""desktop"",""splitId"":null,""splitType"":null},{""previewUrl"":""https://someurl.com/preview2.png"",""device"":""mobile"",""splitId"":null,""splitType"":null}]';

function parseText(input) {
    // Collapse each doubled quote into a single one, then parse
    var a = input.replaceAll('""', '"');
    return JSON.parse(a);
}

x = parseText(textOriginal);
console.log(x);
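The same quote-collapsing step can be sketched in Python, using a simplified copy of the sample data. (Note the naive replace would corrupt a field that legitimately contained a doubled quote, so it is a workaround rather than a general parser.)

```python
import json

# Simplified copy of the sample data, with CSV-style doubled quotes.
raw = ('[{""previewUrl"":""https://someurl.com/preview1.png"",""device"":""desktop""},'
       '{""previewUrl"":""https://someurl.com/preview2.png"",""device"":""mobile""}]')

# Collapse each doubled quote into a single one, then parse as JSON.
fixed = raw.replace('""', '"')
creatives = json.loads(fixed)

print(len(creatives))              # 2
print(creatives[0]["previewUrl"])  # https://someurl.com/preview1.png
```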
For anyone else seeing this double-double-quote issue in JSON fields coming from CSV files in a Snowflake external stage (a slightly different issue from the one originally posted):
The issue is likely that you need to use the FIELD_OPTIONALLY_ENCLOSED_BY setting, specifically FIELD_OPTIONALLY_ENCLOSED_BY = '"', when setting up your file format.
Example of creating such a file format:
create or replace file format mydb.myschema.my_tsv_file_format
type = CSV
field_delimiter = '\t'
FIELD_OPTIONALLY_ENCLOSED_BY = '"';
And example of querying from a stage using this file format:
select
    $1 field_one,
    $2 field_two
    -- ...and so on
from '@my_s3_stage/path/to/file/my_tab_separated_file.csv' (file_format => 'my_tsv_file_format')

Save the XML result to a column in a SQL table

I copied some code from the internet and created an HTML table using something like
''FOR XML RAW (''TR''), ELEMENTS, TYPE) AS ''TBODY''',
' FOR XML PATH (''''), ROOT (''TABLE'')'
in SQL. The result is an HTML table, as expected.
Can someone point me to how to get the HTML string and save it into a column in my table? My thought was to get the result, save it as a string, and then insert it into a column in my table, but after some time I failed.
The code example can be retrieved from https://www.mssqltips.com/sqlservertip/5025/stored-procedure-to-generate-html-tables-for-sql-server-query-output/
Cheers!! It's simple, but I went around the globe. What I did was add SET @Myvariable = inside the dynamic query. Previously, I had tried hard to assign the result to a variable outside the dynamic query, which would never work for me.
Originally:
SET @DynTSQL = CONCAT (
    'SELECT (SELECT '
    , @columnslist
    , ' '
    , @restOfQuery
    , ' FOR XML RAW (''TR''), ELEMENTS, TYPE) AS ''TBODY'''
    , ' FOR XML PATH (''''), ROOT (''TABLE'')'
)
What I did:
SET @DynTSQL = 'SET @SQLQuery1 = (' + CONCAT (
    'SELECT (SELECT '
    , @columnslist
    , ' '
    , @restOfQuery
    , ' FOR XML RAW (''TR''), ELEMENTS, TYPE) AS ''TBODY'''
    , ' FOR XML PATH (''''), ROOT (''TABLE'')'
) + ')'
The result will be assigned to @SQLQuery1.
Cheers!!!

PostgreSQL: read JSON that contains the character ' in a string

I am trying to read the OV-fiets JSON (http://fiets.openov.nl/locaties.json) into a Postgres database with json_array_elements. Some train station names contain the character '.
Example: "description": "Helmond 't Hout"
I believe my script fails because of the ' between Helmond and the t.
The script I use:
WITH data AS (
    SELECT 'paste the json from http://fiets.openov.nl/locaties.json'::json AS fc
)
SELECT
    row_number() OVER () AS gid,
    feat->'locaties' AS locaties
FROM (
    SELECT json_array_elements(fc->'locaties') AS feat
    FROM data
) AS f;
The error I get:
syntax error at or near "Hout"
LINE 3: ...Images": [], "name": "HMH - OV-fiets - Helmond 't Hout", "ex...
How can I change the script to avoid the syntax error caused by the ' character?
The easiest workaround here would probably be dollar quoting:
SELECT $dq$paste the json from http://fiets.openov.nl/locaties.json$dq$::json
In SQL, single quotes need to be escaped by doubling them, e.g.:
select 'Arthur''s house';
As an alternative (in Postgres) you can use dollar quoting to avoid changing the string:
SELECT $data$Arthur's house$data$
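When a literal has to be assembled in application code, the doubling rule can be applied programmatically. A minimal Python sketch (quote_literal is a hypothetical helper; for real applications, parameterized queries are the safer route since they avoid string assembly entirely):

```python
def quote_literal(text):
    # Double every embedded single quote, then wrap the result in
    # single quotes, mirroring the SQL escaping rule shown above.
    return "'" + text.replace("'", "''") + "'"

print(quote_literal("Helmond 't Hout"))  # 'Helmond ''t Hout'
```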

MySQL to JSON not formed properly

I am trying to return JSON-formatted results from a MySQL query but cannot get the correct format. It needs to be e.g.
{comCom:'test 3', comUid:'63',... etc
But what I'm getting is without apostrophes
{comCom:test 3, comUid:63,... etc
I am running the query in PHP as follows (shortened for ease of reading)
$result = mysql_query("select...
...GROUP_CONCAT(CONCAT('{comCom:',ww.comment, ', comUid:',h.user_id,', comName:',h.name,', comPic:',h.live_prof_pic,',comUrl:',h.url,',comWhen:',time_ago(ww.dateadded),'}')) comment,...
How can I get the quotation marks included?
I know mysql_query is deprecated btw, just in process of moving things to MySQLi
Can you not just escape the ' character with \'?
...GROUP_CONCAT(CONCAT('{comCom:\'',ww.comment, '\', comUid:\'',h.user_id,'\', comName:\'',h.name,'\', comPic:\'',h.live_prof_pic,'\',comUrl:\'',h.url,'\',comWhen:\'',time_ago(ww.dateadded),'\'}'))
Or use a mixture of " and ':
...GROUP_CONCAT(CONCAT("{comCom:'",ww.comment, "', comUid:'",h.user_id,"', comName:'",h.name,"', comPic:'",h.live_prof_pic,"',comUrl:'",h.url,"',comWhen:'",time_ago(ww.dateadded),"'}"))
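As an aside, even with the apostrophes added, output like {comCom:'test 3', ...} is a JavaScript object literal rather than strict JSON, which requires double-quoted keys and strings. If consumers need strict JSON, it is usually easier to serialize in application code (PHP's json_encode, for instance) than to hand-assemble it in SQL. A minimal Python sketch of the idea, with made-up values and the key names mirroring the question's column aliases:

```python
import json

# One row as fetched from the database (sample values only; the keys
# mirror the aliases in the question: comCom, comUid, comName).
row = {"comCom": "test 3", "comUid": 63, "comName": "Alice"}

# The serializer handles quoting and escaping for us.
payload = json.dumps(row)
print(payload)  # {"comCom": "test 3", "comUid": 63, "comName": "Alice"}
```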