Executing a stored procedure with a large JSON parameter via pyodbc

In my system, I have a stored procedure set up in SQL Server which takes a JSON document as its sole parameter. I want to call this procedure from Python via pyodbc, but the JSON parameter I'd like to pass is in a large file.
Is there any way to avoid reading the entire JSON file into memory at once? I'm not opposed to using BCP or related tools here, but none of the answers I've seen on SO deal with large input files.
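One approach worth sketching: rather than streaming the file through Python at all, let SQL Server read it server-side with `OPENROWSET(BULK ..., SINGLE_CLOB)` and hand the result straight to the procedure. This is a hedged sketch, not a tested answer — the procedure name `dbo.ImportJson` is hypothetical, and the file path must be visible to the SQL Server service account, not the Python client.

```python
def build_bulk_call(proc_name: str, server_side_path: str) -> str:
    """Build a T-SQL batch that loads a file server-side and passes it to a proc.

    The file never passes through Python's memory; SQL Server reads it directly.
    """
    return (
        "DECLARE @json NVARCHAR(MAX);\n"
        f"SELECT @json = BulkColumn "
        f"FROM OPENROWSET(BULK '{server_side_path}', SINGLE_CLOB) AS src;\n"
        f"EXEC {proc_name} @json = @json;"
    )

def run_import(conn_str: str, proc_name: str, server_side_path: str) -> None:
    """Execute the batch over pyodbc (not run here; needs a live SQL Server)."""
    import pyodbc  # third-party: pip install pyodbc
    conn = pyodbc.connect(conn_str)
    conn.execute(build_bulk_call(proc_name, server_side_path))
    conn.commit()
```

Whether this helps depends on deployment: if the client and server are different machines, the file has to live somewhere the server can reach (a share, or uploaded first).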

Related

Is it possible to run a stored procedure using sqlcmd and save the output into an Excel or .csv file?

I have a stored procedure which outputs 2 tables. I would like to save the output into two .csv files or into one Excel file. Is it possible to do so?
I am not sure how much this will help, but you could create a DLL that generates the CSV by accepting some input data, and use that DLL inside an extended stored procedure. Then call that stored procedure using sqlcmd.
Extended stored procedure
But be aware that Microsoft advises against using extended stored procedures, as they will not be available in future versions:
Important
This feature will be removed in a future version of Microsoft SQL Server. Do not use this feature in new development work, and modify applications that currently use this feature as soon as possible. Use CLR Integration instead.

Progress4gl - exporting to single csv file from multiple procedures?

In Progress 4GL, I am exporting some values from multiple procedure (.p) files to a single CSV file. But when running the second .p file, the values I got from the previous file get overwritten... How can I export the data from all of the procedure files to a single CSV file? Thanks in advance.
The quick answer is to open the second and subsequent outputs to the file as
OUTPUT TO file.txt APPEND.
if that is all you need. If you are looking to do something more complex, then you could define and open a new shared stream in the calling program, and use that stream in each of the called programs, thus only opening and closing the stream once.
If you're using persistent procedures and functions, this answer may help, as it's a little more complex than normal shared streams.
I would really not suggest using a SHARED stream, especially with persistent procedures or OO. STREAM-HANDLEs provide a more flexible way of distributing the stream.
So, as was previously suggested:
on the first job running you do:
OUTPUT TO file.txt.
on all the other jobs running after this you do:
OUTPUT TO file.txt APPEND.

How to export a csv file based on a SQL query programmatically

I am not a SQL expert, so this may be obvious to those of you who are. I have created an application that has to generate a CSV file based on a SQL query. The tips I've found online for exporting a CSV file from SQL all refer to Management Studio: right-click on the query results and use the Save As technique. That is not practical for my purposes, because the CSV file has to be generated at the push of a button within my application. I was hoping (and expecting) to be able to do it with a pass-through query or stored procedure, but I'm not seeing that SQL Server 2008 supports this. Any help would be very much appreciated!
The answer will vary depending on the language your application is written in, but to use C# as an example: a common approach is to populate a DataSet from the SQL query and then loop through the DataSet to generate the CSV.
Here is an example of that approach from the interweb.
http://www.diaryofaninja.com/blog/2009/12/16/c-convert-dataset-to-csv
Here is another example using VB
http://www.vbnettutorial.net/?Id=119&Desc=Export-CSV-from-Dataset
The complexity of the data may require that you get fancy (for example, does your data contain double quotes, commas, or binary data?).
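The same loop-over-results approach works in any language. Here is a minimal sketch in Python, using the stdlib sqlite3 module as a stand-in for SQL Server (with pyodbc the cursor usage is essentially the same); the csv module handles the quoting concerns mentioned above — embedded quotes and commas — automatically.

```python
import csv
import sqlite3

def query_to_csv(conn, sql, out_path):
    """Run a query and stream its rows to a CSV file, header row included."""
    cur = conn.execute(sql)
    with open(out_path, "w", newline="") as f:
        # csv.writer quotes fields containing commas or quotes automatically
        writer = csv.writer(f)
        writer.writerow(col[0] for col in cur.description)
        writer.writerows(cur)
```

Usage: `query_to_csv(conn, "SELECT name, note FROM t", "out.csv")`. Wiring it to a button in the application is then just a call to this function.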

CF9 SerializeJSON gives "out of memory" error

I'm trying to serialize a query to JSON. The query returns about 300,000 records. When serializing, a 500 "out of memory" error occurs.
How do I solve this? Is there a way to stream the query directly to some file format?
300,000 records shouldn't be enough to overflow the JSON library...
How much memory does your server have available / assigned to cf?
Can you paste a stack trace?
We use a handy little library called javacsv.
It is marvelous at creating CSVs from arrays of strings. You simply add the .jar file to your classpath, create the Java CSV class, then call a bunch of methods to add columns or rows. It's good as it automagically quotes all of your data so you don't even have to think about it. It's fast, too! I can post some code samples if you are interested.
http://sourceforge.net/projects/javacsv/
CF9 has some spreadsheet exporting methods too, which you should probably check out if you haven't already.
http://cfquickdocs.com/cf9/#cfspreadsheet
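Outside ColdFusion, the general fix for this class of error is the same: write the result set out one row at a time instead of serializing one giant in-memory structure. A hedged sketch in Python (the function name and JSON Lines output format are my choices, not from the thread), again using stdlib sqlite3 as the stand-in database:

```python
import json
import sqlite3

def stream_query_to_jsonl(conn, sql, out_path):
    """Write each row as one JSON object per line (JSON Lines),
    so memory use stays flat regardless of how many rows the query returns."""
    cur = conn.execute(sql)
    cols = [c[0] for c in cur.description]
    with open(out_path, "w") as f:
        for row in cur:  # the cursor fetches incrementally; rows are never all in memory
            f.write(json.dumps(dict(zip(cols, row))) + "\n")
```

A consumer can then read the file line by line, which keeps the reading side memory-flat as well.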

Optimizing Stored Procedures so they will be processed properly by Linq2SQL

Where I work, it is a requirement that we access our data through stored procedures. I am using LINQ2SQL to minimize this pain so that I can work with objects instead of ADO.NET directly. I have a situation where LINQ2SQL is consuming one of my stored procedures and generating code in which the return type of the stored proc call is an int, while the stored procedure actually returns a dataset. After doing a little research, I found that this is because the SqlClient library cannot properly parse the sproc to generate the metadata that LINQ2SQL uses to create the object graph. My question is: how can sprocs (even complex ones) be structured so that you get an object graph out of LINQ2SQL? In other words, what should you avoid putting in your stored procedure that would prevent the SqlClient library from generating the metadata that LINQ2SQL consumes to create an object graph?
This is not actually a limitation of LINQ to SQL but rather of SQL Server, which cannot always tell a client what the return type would be without actually running the procedure when it contains temporary tables, cursors, or dynamic SQL.
As running it with invalid parameters could be potentially catastrophic, it doesn't try.
You can either set the return type by hand using the designer or, if it is absolutely okay to run the stored procedure with invalid data (i.e. it is purely passive), you can add SET FMTONLY OFF to the start of the stored procedure.
DamienG works on the LinqToSql team at Microsoft and I have upvoted his answer as correct.
That said, he likely won't advise you away from LinqToSql and I think it's very important to consider that option.
Trying to guess the return type of a stored procedure is very difficult and LinqToSql does it as well as anyone (for SQL Server). That said, there are very compelling reasons not to use stored procedures:
Stored procedures are bad, m'kay?
If you must protect your tables from developers for "security reasons" (which I'm guessing is the situation you are in), it's best to do that with views instead of stored procedures.
If you are using views, you then have a lot better options in the ORM department than LinqToSql.
The main problem you are going to run into with LinqToSql in this regard is that what works fine for 5 stored procedures in a tiny database doesn't work fine for 50 or 500 stored procedures. You can use the O/R Designer to "override" the return type of a stored procedure, but you will have significant syncing issues when stored procedures or the tables, etc. they operate on change. Changes to stored procedures will not get reflected in the O/R Designer unless you remove the stored procedure from the O/R Designer, re-add it, and then reapply your custom override. If your project is like any normal project, the tables and stored procedures change often and this sync issue soon becomes a nightmare because it's completely manual and if you fail to do it correctly you will get very strange errors at runtime.
I would strongly advise against continuing down the path you are on.