Override SQL generated by LINQ to SQL? - linq-to-sql

Is it possible to override the SQL generated by LINQ to SQL, for optimisation purposes?

You could use the ExecuteQuery method instead. This is useful if you want to leverage a feature that's available in SQL Server but not in LINQ (e.g. PIVOT, etc...)
For instance:
var query = db.ExecuteQuery<MyType>(@"SELECT ... FROM ... WHERE ...");

One way I have used:
Create a stored proc, use the LINQ to SQL designer to drag the proc onto the design surface, and call the resulting method instead.

Related

How to use MySQL's find_in_set in Sybase?

Previously, in MySQL I used find_in_set(idField, :ids) to delete or update multiple rows whose IDs are passed as a comma-separated string, for example:
UPDATE USER SET name = 'a' WHERE find_in_set(id,'1,2,3,4') > 0
How can I rewrite this query for use in Sybase?
NOTE: You haven't mentioned which Sybase database product you're using (ASE? SQLAnywhere? IQ? Advantage?), nor the version. While ASE does not have anything like find_in_set(), I can't speak for the other database products.
From an ASE perspective you have a few options:
create your own user-defined function; you have T-SQL (since ASE 15.0.2) and Java options
build a dynamic query and submit via execute()
rewrite your query to use available ASE functions (e.g., patindex(), charindex())
rewrite your query to use the like operator (see alternate to find_in_set() for non-MySQL databases for an example)
As Sybase does not have a function like find_in_set(), you have a couple of options: find a function that parses the comma-separated list into a temp table, or use dynamic SQL with the execute command.
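Where the ID list is assembled in application code anyway, one portable alternative to find_in_set() is to expand the list into one placeholder per value and bind each ID. This is a sketch in Java/JDBC, not Sybase-specific; the USER table and column names are taken from the question:

```java
import java.sql.*;

public class InListUpdate {
    // Build "UPDATE USER SET name = ? WHERE id IN (?, ?, ...)" with one
    // placeholder per id, instead of relying on find_in_set().
    static String buildSql(int idCount) {
        StringBuilder sql = new StringBuilder("UPDATE USER SET name = ? WHERE id IN (");
        for (int i = 0; i < idCount; i++) {
            sql.append(i == 0 ? "?" : ", ?");
        }
        return sql.append(")").toString();
    }

    static int updateNames(Connection conn, String name, int[] ids) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(buildSql(ids.length))) {
            ps.setString(1, name);          // JDBC parameters are 1-based
            for (int i = 0; i < ids.length; i++) {
                ps.setInt(i + 2, ids[i]);   // IDs start at parameter 2
            }
            return ps.executeUpdate();
        }
    }
}
```

This also sidesteps quoting and injection issues, since the IDs never appear in the SQL text itself.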

Passing Dynamic code to MySQL execute

I am writing some reporting code, which requires executing complex SQL over a raw connection. I am fine with the static parameters but not sure how to handle the dynamic values.
I prepare the dynamic SQL and then create a statement object:
st = conn.prepare(dynamic_sql_string)
st.execute(dynamic_values)
How do I create the code for these dynamic values?
In one case it will be
st.execute(@first_name)
and in the second case it will be
st.execute(@last_name)
How do I write this dynamic code?
Got it, you do this using
eval "st.execute(dynamic_values)"
Be aware that eval executes arbitrary code, so binding parameters directly is safer where possible.

How to change update statement before executing: Linq2Sql Classes

I have implemented Change Tracking (http://msdn.microsoft.com/en-us/library/cc280462.aspx) on some tables I am using Linq2Sql on.
As part of this I need to prepend the SQL below to the update statements that are generated.
DECLARE @originator_id varbinary(128);
SET @originator_id = CAST('SyncService' AS varbinary(128));
WITH CHANGE_TRACKING_CONTEXT (@originator_id)
....generated statements....
....
....
I know I can create stored procedures and manually map the fields, but I would like to avoid that if possible.
Does anyone know a way to override and edit the SQL on SubmitChanges()?
You can override the Update method by implementing a partial method on your DataContext that LINQ to SQL will call instead. Just give it the signature:
partial void UpdateClassName(ClassName instance)
You can also pass through to what it would normally do using:
ExecuteDynamicUpdate(instance);
Unfortunately there is no mechanism to just get the intended SQL back for inserts/updates/deletes (you can get SELECT statements with GetCommand on the DataContext).

how to use a variable in a where clause

I use JSP and SQL, and I want to query a table using a variable that I have already declared (entite).
Here is my query:
..
String entite = "informatique";
...
rs = st.executeQuery("select * from User where User.id_e= &entite ");
entite is a variable
my question is: how to use a variable in a where clause
My preferred solution: use a PreparedStatement with ? placeholders and setString(...) for the parameters.
See further details here:
http://download.oracle.com/javase/tutorial/jdbc/basics/prepared.html
The best and safest way is to use PreparedStatement to bind variables to the query.
For an example, see (hoping the link does not break):
http://download.oracle.com/javase/1.4.2/docs/api/java/sql/PreparedStatement.html
You can use escaping techniques instead, but they are error-prone and quite often lead to SQL injection attacks and/or database corruption.
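A minimal sketch of that PreparedStatement approach, using the table and column names from the question (assuming id_e is a character column):

```java
import java.sql.*;

public class UserQuery {
    // The ? placeholder is bound at execute time; the driver handles quoting,
    // which also prevents SQL injection via the entite value.
    static final String SQL = "SELECT * FROM User WHERE User.id_e = ?";

    static ResultSet findByEntite(Connection conn, String entite) throws SQLException {
        PreparedStatement ps = conn.prepareStatement(SQL);
        ps.setString(1, entite); // JDBC parameters are 1-based
        return ps.executeQuery();
    }
}
```

Calling findByEntite(conn, "informatique") then replaces the string-concatenated executeQuery call from the question.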

Any best practices for inserting table records with a SQL CLR stored procedure?

Recently we turned a set of complicated C#-based scheduling logic into a SQL CLR stored procedure (running in SQL Server 2005). We believed our code was a great SQL CLR candidate because:
The logic involves tons of data from SQL Server.
The logic is complicated and hard to express in T-SQL.
There is no threading, synchronization, or access to resources outside of the sandbox.
The results of our SP are pretty good so far. However, since the output of our logic takes the form of several tables of data, we can't just return a single rowset from the SP. Instead, our code contains a lot of "INSERT INTO ..." statements inside foreach loops, saving each record from a C# generic collection into SQL tables. During code review, someone raised a concern about whether this inline SQL INSERT approach within SQL CLR could cause performance problems, and wondered whether there is a better way to dump the data out (from our C# generic collections).
So, any suggestion?
I ran across this while working on an SQLite project a few months back and found it enlightening. I think it might be what you're looking for.
...
Fastest universal way to insert data using standard ADO.NET constructs
Now that the slow stuff is out of the way, let's talk about some hardcore bulk loading. Aside from SqlBulkCopy and specialized constructs involving ISAM or custom bulk insert classes from other providers, there is simply no beating the raw power of ExecuteNonQuery() on a parameterized INSERT statement. I will demonstrate:
internal static void FastInsertMany(DbConnection cnn)
{
    using (DbTransaction dbTrans = cnn.BeginTransaction())
    {
        using (DbCommand cmd = cnn.CreateCommand())
        {
            cmd.CommandText = "INSERT INTO TestCase(MyValue) VALUES(?)";
            DbParameter Field1 = cmd.CreateParameter();
            cmd.Parameters.Add(Field1);
            for (int n = 0; n < 100000; n++)
            {
                Field1.Value = n + 100000;
                cmd.ExecuteNonQuery();
            }
        }
        dbTrans.Commit();
    }
}
You could return a table with two columns (COLLECTION_NAME nvarchar(max), CONTENT xml), with one row per internal collection. CONTENT holds an XML representation of the data in that collection.
Then you can use the XML features of SQL Server 2005/2008 to shred each collection's XML into tables, and perform your INSERT INTO or MERGE statements on the whole set.
That should be faster than individual INSERTs inside your C# code.