I have this stored procedure:
dbo.SprocName (@Id UNIQUEIDENTIFIER,
@ResponseCode INT OUTPUT,
@ResponseDescription VARCHAR(500) OUTPUT)
It also returns a single-row result set containing one NVARCHAR(MAX) column, called, say, Result.
I've tried OLE DB and ADO.NET connections, as well as result sets. I've tried creating a table variable and storing the value there.
Nothing works.
I can see in the database that the procedure runs successfully; it then fails when the result data set is returned.
I've done some debugging and I can confirm the result string is returned as it should be. The problem is that I don't know how to handle this in SSIS.
The error that I get is:
Input string was not in a correct format
I appreciate any ideas.
Thanks.
EDIT: I have tried using a table variable again and it works. I guess I didn't do it correctly the first time; sorry about that. Thanks!
One potential cause for your problem could be a mismatch in data types between SSIS and SQL Server.
An SSIS GUID data type does not match a SQL Server uniqueidentifier: the SSIS GUID includes curly braces (e.g., {00000000-0000-0000-0000-000000000000}), while the SQL Server value does not, so SQL Server cannot recognize the value as a uniqueidentifier and the conversion fails.
To pass down a GUID, you will need to remove those curly braces, either in SSIS or in SQL. One approach I've used is to send it across as a VARCHAR and then strip out the curly braces, e.g.,
DECLARE @GUID VARCHAR(40) = '{00000000-0000-0000-0000-000000000000}';
DECLARE @CnvtGUID UNIQUEIDENTIFIER = REPLACE(REPLACE(@GUID, '}', ''), '{', '');
SELECT @GUID, @CnvtGUID;
I am fetching data from a third party API, which responds with a JSON payload. However, this JSON contains another JSON object, stored as a string including escape characters. Example:
{
"aggregationType": "IDENTITY",
"outputs": [
{
"name": "Sinusoid|Sinusoid"
}
],
"value": "{\"dataX\":[1,2,3,4],\"dataY\":[1,4,9,16]}"
}
In the first part of the file, we have some regular parameters like 'aggregationType' and 'outputs', but the last parameter 'value' is the JSON object I am talking about.
What I would like to do is to enter the 'dataX' and 'dataY' arrays together into a table on a SQL DB. I haven't found a straightforward way of doing it so far.
What I've tried:
Using a simple copy activity, but I can only access the whole 'value' field, not separate out 'dataX' from 'dataY', let alone the arrays' individual values.
Using the lookup activity to then store 'value' in a variable. From here I can get to a usable JSON object in ADF, but the only way I've found of then sending the data to the DB is to use a ForEach activity containing a copy activity. Since dataX and dataY are in reality much larger, this seems to take forever when I debug.
Copying only the 'value' object to a blob and trying to retrieve the data from there. This hasn't worked because the object always ends up stored with the surrounding quote marks and the \ escape characters.
Is there any way of getting around this issue?
You can store the value in a staging table in SQL Server and then create a stored procedure to separate the objects out as arrays.
JSON_VALUE can help you extract scalar values (note that for a whole array or object you would need JSON_QUERY instead):
SELECT JSON_VALUE('{"dataX": [1,2,3,4]}', '$.dataX[0]') AS 'Output';
In your stored procedure you can use the above query and insert the values into a SQL table.
To expand on the tip from @Pratik Somaiya, I've written up a stored procedure that does the work of inserting the data into a persistent table.
I've had to use a WHILE loop, which doesn't really feel right, so I'm still on the lookout for a better solution on that.
CREATE OR ALTER PROCEDURE dataset_outputs_from_adf
    @json_output_value NVARCHAR(MAX),
    @output_name NVARCHAR(100)
AS
    DECLARE @i INT = 0;
    WHILE JSON_VALUE(@json_output_value, CONCAT('$.dataX[', @i, ']')) IS NOT NULL
    BEGIN
        INSERT INTO my_table
        VALUES (
            @output_name,
            JSON_VALUE(@json_output_value, CONCAT('$.dataX[', @i, ']')),
            JSON_VALUE(@json_output_value, CONCAT('$.dataY[', @i, ']'))
        );
        SET @i = @i + 1;
    END;
GO
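For reference, a call with the sample payload from the question might look like this (my_table and its three columns are assumed from the procedure above, not shown in the original):

-- Hedged usage sketch: assumes my_table has three columns
-- (output name, X value, Y value) matching the INSERT above.
EXEC dataset_outputs_from_adf
    @json_output_value = N'{"dataX":[1,2,3,4],"dataY":[1,4,9,16]}',
    @output_name = N'Sinusoid|Sinusoid';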
I should be able to make the whole thing repeatable without repetitive code by parameterizing Data Factory with the output name.
As I understand it, you can retrieve the embedded "value" JSON but it retains its escape characters. You would like to pass the arrays [1,2,3,4] and [1,4,9,16] to a relational database (Microsoft SQL Server?) to store them.
The embedded "value" can be converted to referencable JSON using the expression json(). This handles the escaped characters.
@json(variables('payload')).dataX
will return the array [1,2,3,4] as expected.
How best to get this into SQL Server? We are limited to the activities ADF supports, which really comes down to a stored procedure (SP). Using a table valued parameter would be ideal, but not possible in current ADF. So I would suggest passing it to the SP as a string.
@string(json(variables('payload')).dataX)
This will look much the same as above but will be a string not an array.
In the SP there are a couple of ways to parse this string. If your version supports it, STRING_SPLIT is convenient. Note the passed string will retain its leading and trailing square brackets. These can be removed in ADF or in SQL; it doesn't much matter where.
Since the data is JSON it may make more sense to use OPENJSON instead. Let's say we pass the contents of dataX to an SP parameter @dataX VARCHAR(4000). Inside the SP we write:
CREATE PROCEDURE dbo.HandleData
    @dataX VARCHAR(4000),
    @dataY VARCHAR(4000)
AS
INSERT dbo.SomeTable (ColumnX, ColumnY)
SELECT x.value, y.value
FROM OPENJSON(@dataX) x
INNER JOIN OPENJSON(@dataY) y
    ON y.[key] = x.[key];
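A call equivalent to what ADF's Stored Procedure activity would issue might look like this (table and column names as assumed above):

-- Hedged usage sketch: the JSON arrays arrive as plain strings,
-- as ADF would pass them after @string(json(...)).
EXEC dbo.HandleData
    @dataX = '[1,2,3,4]',
    @dataY = '[1,4,9,16]';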
Further code may be needed if the arrays could be of different lengths, or there are NULLs, or non-integer values etc. Of course the resultset from OPENJSON can be used for joining or any other purpose within the SP.
If both dataX and dataY from the original payload end up in the same DB table the SP can be called twice, once for each. Alternatively you can union their arrays in ADF and call the SP once.
I have a simple report and would like to pass a multi-value INT parameter (Product ID, int) when running the report. When I choose a single value (Product ID), it runs well. But when I choose more than one value, the report preview shows 'Error converting data type nvarchar to int'.
Does anyone have any idea how to fix this 'simple' problem? I think maybe I need to convert the parameter in the SP, but I've tried for 2 days and got nothing.
Really appreciate it!
(I am using SQL SERVER 2008.)
Yes, you might need to convert your parameter's data type to varchar and use it in an IN query on ProductID, like:
SELECT * FROM Product WHERE ProductID IN (@ProductIDs)
where @ProductIDs will be varchar.
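Note that inside a stored procedure, a single varchar parameter is not expanded by IN; one workaround is dynamic SQL. A hedged sketch (the procedure name is illustrative, and the usual injection caveats apply):

-- Hedged sketch: @ProductIDs arrives as a comma-separated string,
-- e.g. '1,2,3'. Dynamic SQL expands it into the IN list.
-- Validate the input in real code to avoid SQL injection.
CREATE PROCEDURE dbo.GetProductsByIds
    @ProductIDs VARCHAR(4000)
AS
DECLARE @sql NVARCHAR(MAX) =
    N'SELECT * FROM Product WHERE ProductID IN (' + @ProductIDs + N');';
EXEC sp_executesql @sql;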
I'm just starting developing reports in SSRS and would appreciate some help with this issue if possible! I'm selecting a dataset from a Dynamics database and want to then pass them to a SQL Server stored procedure referenced in another dataset to retrieve data from another database. I have created a report parameter and set it to Allow multiple values and to retrieve its values from a query and set it to the field that I want to retrieve.
The dataset would look like this:
U1234
U5678
U6789
In the dataset that uses the stored procedure I have set up a parameter, @pnum, and in the Parameter Value field I have created an expression using Join like this:
=Join(Parameters!pnum.Value, ", ")
When this gets passed to the stored proc it seems to be passing a string formatted like this:
'U1234, U5678, U6789'
Whereas what I would like to achieve is this:
'U1234', 'U5678', 'U6789'
so that I can use them in an IN statement. Is there a way of doing this within SSRS?
Many Thanks!
To anyone else experiencing this issue: the assumptions made in the question about how the values are passed to the stored procedure, and how they can be used, are incorrect.
The value passed from the join expression would be formatted as such, without single quotes at the start and end:
U1234, U5678, U6789
Further to this, when passed to a stored procedure as a single string this can only be used as an in list by using dynamic SQL.
To parse out and filter on the passed values, the string will need to be split on the delimiter and inserted into a table (temporary or otherwise) to be joined to.
A suitable splitting function can be found here (though others exist that may better suit your needs), utilising logic as follows:
DECLARE @xml AS XML, @str AS VARCHAR(100), @delimiter AS VARCHAR(10);
SET @str = 'A,B,C,D,E';
SET @delimiter = ',';
SET @xml = CAST(('<X>' + REPLACE(@str, @delimiter, '</X><X>') + '</X>') AS XML);
SELECT N.value('.', 'VARCHAR(10)') AS value FROM @xml.nodes('X') AS T(N);
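If your SQL Server version is 2016 or later (with compatibility level 130+), the built-in STRING_SPLIT function is a simpler alternative to the XML trick:

-- Simpler splitter on SQL Server 2016+ (compatibility level 130+):
DECLARE @str VARCHAR(100) = 'A,B,C,D,E';
SELECT value FROM STRING_SPLIT(@str, ',');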
If you don't have to pass the values to a stored procedure and are using hardcoded datasets (Shared or not) you can actually directly use the parameter value without additional processing either in the query or by adding a join expression to the parameter value in the report:
SELECT cols
FROM tables
WHERE cols IN (@MultiValueParameterName)
You have to add an extra field with the value wrapped in quotes.
Like this:
SELECT field1 AS display, '''' + field1 + '''' AS value
I've got a column called description of type NVARCHAR(MAX) - biggest you can have. I need to return this field with quotes around it, so I'm trying to
SELECT QUOTENAME(description, '"')
This doesn't work: I get a "string or binary data would be truncated" error.
My googling tells me that this problem can be solved by using SET ANSI_WARNINGS OFF, but if I do that, I still get the same error anyway.
Normally I would just pull the values into a temp table and use a field that is two characters bigger than the field I'm pulling in, thus ensuring that the QUOTENAME function won't cause any problems. How do I make a column two characters bigger than MAX, though?
QUOTENAME is a function intended for working with strings containing SQL Server identifier names, and thus only works for strings no longer than sysname (128 characters).
Why doesn't SELECT '"' + description +'"' work for you?
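A quick sanity check (a sketch; the 200,000-character length is arbitrary) that plain concatenation handles NVARCHAR(MAX) values where QUOTENAME cannot:

-- Plain concatenation works for NVARCHAR(MAX), unlike QUOTENAME:
DECLARE @description NVARCHAR(MAX) = REPLICATE(CAST(N'x' AS NVARCHAR(MAX)), 200000);
SELECT LEN(N'"' + @description + N'"') AS quoted_len;  -- 200002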
Using SQL Server 2008 and concatenating string literals to more than 8000 characters by an obvious modification of the following script, I always get the result 8000. Is there a way to tag string literals as varchar(max)?
DECLARE @t TABLE (test VARCHAR(MAX));
INSERT INTO @t VALUES ('0123456789012345678901234567890123456789'
    + '0123456789012345678901234567890123456789'
    + '... and 200 times the previous line'
);
SELECT DATALENGTH(test) FROM @t;
I used the following code on SQL Server 2008
CREATE TABLE [dbo].[Table_1](
    [first] [int] IDENTITY(1,1) NOT NULL,
    [third] [varchar](max) NOT NULL
) ON [PRIMARY]
GO
DECLARE @maxVarchar VARCHAR(MAX);
SET @maxVarchar = REPLICATE('x', 7199);
SET @maxVarchar = @maxVarchar + REPLICATE('x', 7199);
SELECT LEN(@maxVarchar);

INSERT table_1 (third)
VALUES (@maxVarchar);

SELECT LEN(third), SUBSTRING(REVERSE(third), 1, 1) FROM table_1;
The value you are inserting in your example is being stored temporarily as a varchar(8000), because string literals (and concatenations of them) are typed as plain varchar, which is capped at 8000 characters. To get past that internal 8000-character limit, build the value in a varchar(max) variable and append to it, as above.
Try casting your value being inserted as a varchar(max):
INSERT INTO @t VALUES (CAST('0123456789012345678901234567890123456789'
    + '0123456789012345678901234567890123456789'
    + '... and 200 times the previous line' AS VARCHAR(MAX))
);
Also, you may have to concatenate several <8000-length strings (each cast as varchar(max)).
See this MSDN Forum Post.
When I posted the question, I was convinced that there are limitations on the length or maximum line width of a single string literal used in INSERT and UPDATE statements.
This assumption is wrong.
I was led to this impression by the fact that SSMS limits output width for a single column in text mode to 8,192 characters, and output of PRINT statements to 8,000 characters.
In fact, as far as I know, you need only enclose the string in apostrophes and double all embedded apostrophes. I found no restrictions concerning the width or total length of a string.
For the opposite task, converting such strings from the database back into a script, the best tool I found is SSMS Tools Pack, which works for SQL Server 2005+.