JSON Serialization and .NET SQL Parameters

I have recently joined a team that uses JSON serialization to pass parameter arrays to SQL Server stored procedures, where they are then deserialized and the required values extracted; i.e. each stored procedure has a '@Parameters' parameter of type VARCHAR(MAX). The framework that creates and executes the command is written in C# and uses the standard .NET types (SqlCommand, SqlParameter, etc.).
It appears that when the serialized content exceeds some length threshold, the stored procedure is not executed, but no exception is raised. Nothing appears to happen at all. Running SQL Profiler, I observed that no attempt is made to execute the stored procedure in SQL Server.
For example:
In one case there are just 30 instances of a type with no more than eight properties being serialized. The serialization succeeds and the value is assigned to the single SqlParameter in the parameters collection of a SqlCommand. The command is executed but nothing happens. With fewer instances of the type it succeeds; when it does not succeed, no exception is raised.
Using:
SQL Server 2008
C# .NET 4.0
JSON Serialization provided by Newtonsoft.
The SqlParameter in code is created as VARCHAR(MAX).
Client Server architecture - there are no intermediary services.
Does anyone know of a limit on JSON-serialized values passed as a SQL parameter with a SqlCommand, or have any ideas as to what might be causing this behaviour?

I've figured out what was happening in my case. It had nothing to do with the serialization of the value into a SQL parameter: I found that if I waited long enough (5-7 minutes in my case), the procedure eventually executed.
In the stored procedure a cursor is used to extract the records of interest (this is a batch update procedure). This cursor was calling a CLR function to deserialize the JSON for each value it required. Inserting the deserialized data into a temp table for the cursor to use resolved the 'issue'.
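A minimal T-SQL sketch of that fix, with hypothetical object names (dbo.fn_DeserializeJson stands in for the team's CLR deserializer, and the column shapes are assumptions): deserialize the JSON once into a temp table, then drive the cursor from the temp table instead of re-invoking the CLR function per row.

```sql
-- Hypothetical names throughout; dbo.fn_DeserializeJson is a stand-in
-- for the actual CLR table-valued deserializer.
CREATE TABLE #Params (ItemId INT, ItemValue VARCHAR(100));

-- One deserialization for the whole batch instead of one per cursor fetch
INSERT INTO #Params (ItemId, ItemValue)
SELECT ItemId, ItemValue
FROM dbo.fn_DeserializeJson(@Parameters);

-- The batch-update cursor now reads cheap rows from the temp table
DECLARE batch_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT ItemId, ItemValue FROM #Params;
```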

Related

MySQL Stored Procedure Read Replica Issue - Strange Stored Procedure/Function Behavior

UPDATE 11.15.2022
I have conducted extensive testing and found the pattern of the problem. Once again, what's strange is this ONLY happens if you pass a function as a parameter to the originating stored procedure; passing a hardcoded value or a variable works just fine.
The issue arises when the stored procedure calls another stored procedure that checks @@read_only to see if it can WRITE to the database. I confirmed that removing any code that writes data fixes the issue -- so ultimately it appears passing a STATIC value to the SP causes the procedure execution to bypass any writing (as expected) because of the IF @@read_only = FALSE THEN ...write...
It seems passing a function somehow causes MySQL to compile a "tree" of calls and subcalls to see if they CAN write rather than if they DO write.
It appears the only way to work around this is to pass the parameters as variables rather than function calls. We can do this, but it will require substantial refactoring.
I just wonder why MySQL is doing this - why passing a function is causing the system to look ahead and see IF it COULD write rather than if it does.
We have a Read Replica that's up and running just fine. We can execute reads against it without a problem.
We can do this:
CALL get_table_data(1, 1, "SELECT * from PERSON where ID=1;", @out_result, @out_result_value);
And it executes fine. Note the procedure is tagged READS SQL DATA; it does not write anything out.
We can also do this:
SELECT get_value("OBJECT_BASE", "NAME");
Which is a SELECT function that is read-only.
However, if we try to execute this:
CALL get_table_data(1, get_value("OBJECT_BASE", "NAME"), "SELECT * from PERSON where ID=1;", @out_result, @out_result_value);
We get the error:
Error: ER_OPTION_PREVENTS_STATEMENT: The MySQL server is running with the --read-only option so it cannot execute this statement
We're baffled at what could cause this. Both the SP and function are read-only and execute individually just fine, but the second we embed the function result in the call of the SP, the system chokes.
Any ideas?
So AWS cannot figure this out. The issue only happens when a function is passed as a parameter to a stored procedure that calls another stored procedure (without even passing the function's value along) that has a @@read_only check before doing an INSERT or UPDATE. So for some reason, the system is doing a pre-scan check when a function is passed vs. a variable or hardcoded value.
The workaround is to pass the function value as a variable.
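A sketch of that workaround, using the procedure and function names from the question (the parameter values are unchanged from the earlier examples): evaluate the function into a user variable first, then pass the variable.

```sql
-- Evaluate the read-only function outside the CALL...
SET @obj_name = get_value("OBJECT_BASE", "NAME");

-- ...then pass the variable; this avoids the spurious --read-only
-- error that the embedded function call triggers on the replica.
CALL get_table_data(1, @obj_name, "SELECT * from PERSON where ID=1;", @out_result, @out_result_value);
```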
I'm going to report this issue to Oracle as it might be some sort of bug, especially given the function is DETERMINISTIC.

How to pass an Azure pipeline variable to a MySQL stored procedure query in a Lookup activity

I have to call a stored procedure in the Lookup activity of Azure Data Factory for MySQL that takes an Azure pipeline variable as input, but I don't know the exact syntax.
Something like call stored_procedure("@variables('BAtchID')")
The variable is of string type.
Does anyone know how exactly I can call it? Please do share.
You cannot directly use call stored_procedure("@variables('BAtchID')") in the query section of the Lookup activity.
The query field expects a string value; when you use call stored_procedure("@variables('BAtchID')") directly, it is parsed as-is, not as a pipeline variable.
Instead, you need to concatenate the query with the pipeline variable using the @concat() function in Data Factory.
The following is a demonstration of how I used query field to execute stored procedure using dynamic content.
You can use the dynamic content below to successfully achieve your requirement (replace stored procedure name and variable name)
@concat('call demo("',variables('value_to_pass'),'")')
The above content will be parsed as call demo("Welcome") (in the activity output, \ indicates an escape character).
Note: my debug run failed only because I don't have such a stored procedure in the MySQL database.
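For completeness, a minimal sketch of a MySQL procedure that would satisfy the parsed call demo("Welcome") (the body is an assumption; the answer above never defines demo):

```sql
DELIMITER $$
CREATE PROCEDURE demo(IN msg VARCHAR(100))
BEGIN
    -- A Lookup activity expects a result set, so just echo the parameter back
    SELECT msg AS echoed_value;
END $$
DELIMITER ;
```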

Running Stored Procedure from Matlab Only Returns 1 row

I am retrieving data from a MySQL database in MATLAB.
conn=database('my_database','','');
sql = 'call latest_hl_tradables()';
curs = exec(conn,sql);
curs = fetch(curs);
Yesterday, the code returned 600 rows. This morning, it returns 1 row. If I run the stored procedure (latest_hl_tradables) in MySQL Workbench, it still returns 600 rows.
Strangely, the code then started working again.
All I did was write some diagnostic code to count the number of records in the tables that latest_hl_tradables() queries. Initially those counts also showed only 1 row; then they started returning all the rows. I don't know what changed.
(My configuration is R2014b / SQL Server 2014 / MS JDBC 4.0. I believe the OP describes a generic issue, not a database-vendor-specific one.)
I have also experienced unreliable results within MATLAB from stored procedures that return result sets. I submitted a service request to MathWorks. The short answer is that in this use case neither of the MATLAB Database Toolbox functions exec nor runstoredprocedure is appropriate; rather, stored procedures should be called from MATLAB via runsqlscript. Here's the tech's response:
Unfortunately there is no way to get output of the query using the cursor object. The only way to get the output on DML is by running them as a script. The syntax is:
>>results = runsqlscript(connObject,'ScriptName.sql')
where the SQL queries are placed in the file "ScriptName.sql". This will return a cell array of results as the output.
Note that this solution complicates life when your stored procedure requires input parameters. In the typical case where SP parameters aren't known a priori, this means you have to generate a custom SQL script on the fly and write it out to disk.
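For example, a generated script for the procedure above might contain nothing more than the call itself, with any input parameters already substituted as literals (latest_hl_tradables is from the question; the parameterized variant is hypothetical):

```sql
-- Contents of a script generated on the fly, e.g. ScriptName.sql:
CALL latest_hl_tradables();
-- or, for a hypothetical parameterized procedure, with the value baked in:
-- CALL latest_hl_tradables_for_symbol('AAPL');
```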

Accessing stored procedure in Linq-to-SQL

I'm using a Linq-to-SQL query that calls a stored procedure, and I'm getting the error:
Specified cast is not valid.
How can I solve it?
Check your TDetail.AMOUNT values.
Your error is not when casting to an array, but rather in the Convert.ToDouble(TDetail.AMOUNT).
Run your stored proc with those same arguments (in SSMS or Visual Studio), and try to determine which value in TDetail.AMOUNT is causing this problem.
You're seeing this exception thrown when you cast to an array, but it would happen whenever you evaluated your LINQ query. It has nothing to do with ToArray(); it could be ToList(), and you'd hit the same exception.
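A hedged diagnostic sketch along those lines (TDetail appears in the question as a LINQ range variable, so substitute the underlying table name; the query shape is an assumption): look for NULLs in AMOUNT before the C# side calls Convert.ToDouble, since a NULL reaching a non-nullable double conversion is the usual cause of "Specified cast is not valid."

```sql
-- Find rows whose AMOUNT would make Convert.ToDouble fail on the client;
-- NULL is the usual culprit when the C# property is a non-nullable double.
SELECT *
FROM TDetail          -- replace with the actual table behind the range variable
WHERE AMOUNT IS NULL;
```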

SSIS 2008 Execute SQL output parameter mapping datetime2 problem

I'm trying to use an Execute SQL Task in SSIS 2008 to map a stored procedure output parameter to a package variable.
The package variable is SSIS type DateTime and the stored procedure parameter is SQL type DATETIME.
The SQL statement is EXEC GetCurrentDate @CurrentDate=? and in the parameter mapping screen, the parameter is mapped to the package variable with direction Output and data type DBTIMESTAMP.
When I run the package I get the following error:
[Execute SQL Task] Error: Executing the query "EXEC GetCurrentDate @CurrentDate=? " failed with the following error: "The type of the value being assigned to variable "User::CurrentDate" differs from the current variable type. Variables may not change type during execution. Variable types are strict, except for variables of type Object.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
If I run a trace on the query, I see the type is being assumed to be datetime2:
declare @p3 datetime2(7)
set @p3=NULL
exec sp_executesql N'EXEC GetCurrentDate @CurrentDate=@P1 ',N'@P1 datetime2(7) OUTPUT',@p3 output
select @p3
Does anyone know why it is assuming the type is datetime2?
Thanks
Found the answer in a Microsoft Connect bug report:
We are closing this case as this is expected behaviour and is a result of the new sql datetime type change. You are using a native oledb connection manager for sql task, in the process of COM interop, we use VARIANT to hold the value and the only way to prevent data loss is to store the value as BSTR variant. If you change User::dateParam to String type it will work, or you can switch to use managed connection manager to bypass the COM interop.
http://connect.microsoft.com/SQLServer/feedback/ViewFeedback.aspx?FeedbackID=307835
Try specifying the input/output parameters as DATE rather than DBTIMESTAMP in the SSIS task.
This certainly works in SSIS 2005 packages I've worked on.
It's also worth taking a look at this link, which covers this as well as a couple of other issues with SSIS and dates.
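One hedged sketch of the String-variable workaround from the Connect response (the body here and the switch from an output-parameter mapping to a single-row result set are my assumptions, not from the original answers): do the DATETIME-to-string conversion in T-SQL, so the value reaching the SSIS package variable is already a plain string.

```sql
-- Sketch: sidestep the datetime2/DBTIMESTAMP variant problem by returning
-- the value as text and mapping it to a String package variable.
DECLARE @d DATETIME;
EXEC GetCurrentDate @CurrentDate = @d OUTPUT;          -- proc name from the question
SELECT CONVERT(VARCHAR(23), @d, 121) AS CurrentDate;   -- read via ResultSet = Single row
```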