How to execute multiple statements with variables in a C# OdbcCommand object - MySQL

I want to execute the MySQL statements below as a single batch through an OdbcCommand object in C#, as a dynamic query, but it always fails:
SET SESSION TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
set @row=0;
select * from
(
  select @row:=@row+1 as my____row_num,
    cities.`cityid`,
    cities.`cityname`,
    cities.`countryid`,
    cities.`countryname`,
    '1' as my____data_row_created,
    '1' as my____data_row_updated
  from `cities`
) p
where my____row_num>=101 and my____row_num<=200;
SET SESSION TRANSACTION ISOLATION LEVEL REPEATABLE READ;
I'm using the method below to execute the MySQL statements above:
DataTable ExcuteCommand(string Sql)
{
    DataTable dt = new DataTable();
    OdbcCommand SQLCommand = new OdbcCommand(Sql);
    OdbcConnection Con = new OdbcConnection(ConnectionString);
    try
    {
        Con.Open();
        SQLCommand.Connection = Con;
        OdbcDataAdapter da = new OdbcDataAdapter(SQLCommand);
        da.Fill(dt);
        Con.Close();
        Con.Dispose();
    }
    catch
    {
        try
        {
            Con.Close();
        }
        catch { }
        throw;
    }
    return dt;
}

I found the solution from here. When executing multiple dynamic MySQL statements through ODBC in C#, we have two options:
Execute every command separately
Use stored procedures
In my case I'm bound to dynamic queries because I only have read access on the database.
Solution:
Rather than declaring a variable and setting it in a separate statement, I used another technique: initialize the session variable in a derived table and cross join it with the main table. In my scenario I changed the code to the MySQL query below, removed both SET SESSION statements, and it worked properly:
select * from
(
  select @row:=@row+1 as my____row_num,
    cities.`cityid`,
    cities.`cityname`,
    cities.`countryid`,
    cities.`countryname`,
    '1' as my____data_row_created,
    '1' as my____data_row_updated
  from `cities`, (select @row:=0) as t
) p
where my____row_num>=101 and my____row_num<=200;
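For completeness, this is roughly how the single-statement version plugs into the ODBC call from C# (just a sketch; ConnectionString is the same field used in the method above):

string sql =
    "select * from " +
    "( select @row:=@row+1 as my____row_num, " +
    "  cities.`cityid`, cities.`cityname`, cities.`countryid`, cities.`countryname`, " +
    "  '1' as my____data_row_created, '1' as my____data_row_updated " +
    "  from `cities`, (select @row:=0) as t ) p " +
    "where my____row_num >= 101 and my____row_num <= 200";

using (var con = new OdbcConnection(ConnectionString))
using (var cmd = new OdbcCommand(sql, con))
using (var da = new OdbcDataAdapter(cmd))
{
    var page = new DataTable();
    da.Fill(page);   // Fill opens and closes the connection itself
}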

I'm not going to attempt to solve your MySQL problem, but your C# code can and should be written better, and since comments are not suited for code, I thought I'd better write this as an answer.
So here is an improvement to your C# part:
DataTable FillDataTable(string sql)
{
    var dataTable = new DataTable();
    using (var con = new OdbcConnection(ConnectionString))
    {
        using (var command = new OdbcCommand(sql, con))
        {
            using (var dataAdapter = new OdbcDataAdapter(command))
            {
                dataAdapter.Fill(dataTable);
            }
        }
    }
    return dataTable;
}
Points of interest:
I've renamed your method to a more descriptive name. ExecuteCommand doesn't say anything about what this method does; FillDataTable is self-explanatory.
The using statement ensures the disposal of instances implementing the IDisposable interface, and almost all ADO.NET classes implement it.
Disposing an OdbcConnection also closes it, so you don't need to explicitly close it yourself.
There is no point in catching exceptions if you are not doing anything with them. The rule of thumb is to throw early, catch late (actually, catch as soon as you can do something about it, like write to a log, show a message to the user, retry, etc.).
DataAdapters implicitly open the Connection object, so there is no need to explicitly open it.
Two other improvements you can make are:
Have this method also accept parameters.
Have this method also accept the CommandType as a parameter (currently, using a stored procedure with this will not work since the default value of CommandType is Text).
So, an even better version would be this:
DataTable FillDataTable(string sql, CommandType commandType, params OdbcParameter[] parameters)
{
    var dataTable = new DataTable();
    using (var con = new OdbcConnection(ConnectionString))
    {
        using (var command = new OdbcCommand(sql, con))
        {
            command.CommandType = commandType;
            command.Parameters.AddRange(parameters);
            using (var dataAdapter = new OdbcDataAdapter(command))
            {
                dataAdapter.Fill(dataTable);
            }
        }
    }
    return dataTable;
}
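For example (a sketch; note that ODBC itself only supports positional ? markers, so the parameter name here is just a label and the parameter order has to match the query):

var dt = FillDataTable(
    "select * from `cities` where `countryid` = ?",   // ? is the ODBC parameter marker
    CommandType.Text,
    new OdbcParameter("@countryid", 5));               // name is only a label; position matters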
If you want to improve this even further, you can have a look at my ADONETHelper project on GitHub. There I have a single private Execute method, and the methods for filling data tables, filling data sets, executing non-queries, etc. all use this single method.
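The idea looks roughly like this (a sketch of the pattern, not the actual ADONETHelper code): one private Execute method owns the connection and command plumbing, and the public helpers only differ in what they do with the prepared command. It assumes a ConnectionString field on the containing class.

private T Execute<T>(string sql, CommandType commandType,
    Func<OdbcCommand, T> action, params OdbcParameter[] parameters)
{
    using (var con = new OdbcConnection(ConnectionString))
    using (var command = new OdbcCommand(sql, con))
    {
        command.CommandType = commandType;
        command.Parameters.AddRange(parameters);
        con.Open();
        return action(command);   // the caller decides how to execute the command
    }
}

public DataTable FillDataTable(string sql, CommandType commandType, params OdbcParameter[] parameters)
{
    return Execute(sql, commandType, command =>
    {
        var dataTable = new DataTable();
        using (var dataAdapter = new OdbcDataAdapter(command))
        {
            dataAdapter.Fill(dataTable);
        }
        return dataTable;
    }, parameters);
}

public int ExecuteNonQuery(string sql, CommandType commandType, params OdbcParameter[] parameters)
{
    return Execute(sql, commandType, command => command.ExecuteNonQuery(), parameters);
}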

Would you please try this instead:
declare @row int
set @row=0;
select * from
(
  select SUM(@row,1) as my____row_num,
    cities.cityid as CityID,
    cities.cityname as CityName,
    cities.countryid as CountryID,
    cities.countryname as CountryName,
    '1' as my____data_row_created,
    '1' as my____data_row_updated
  from cities
) -- I did not understand the meaning of this
where (my____row_num BETWEEN 100 AND 200)
Back end:
DataTable ExcuteCommand(string Sql)
{
    // Add this: read the connection string from configuration.
    ConnectionString = ConfigurationManager.ConnectionStrings["YourDataBaseLocation_OR_theConnectionCreatedViaProperties"].ConnectionString;
    DataTable dt = new DataTable();
    // Delete this: OdbcCommand SQLCommand = new OdbcCommand(Sql);
    // You need to pass the connection you are using to the OdbcCommand.
    // If you call a stored procedure: SQLCommand.CommandType = CommandType.StoredProcedure;
    OdbcConnection Con = new OdbcConnection(ConnectionString);
    // Add this: create the command with the connection.
    OdbcCommand SQLCommand = new OdbcCommand(Sql, Con);
    try
    {
        Con.Open();
        SQLCommand.Connection = Con;
        OdbcDataAdapter da = new OdbcDataAdapter(SQLCommand);
        da.Fill(dt);
        // Add this (as suggested; note that Fill has already executed the query):
        // SQLCommand.ExecuteNonQuery();
        Con.Close();
        // Delete this: Con.Dispose();
    }
    catch
    {
        try
        {
            Con.Close();
        }
        catch { }
        throw;
    }
    return dt;
}

Related

Returning an object from a stored procedure

I am trying to create a stored procedure which reads data from a local database, creates an object, and returns it. My problem is that I have not worked with stored procedures, so I don't have much knowledge about them.
I know how to use stored procedures to store data in a database, but I don't know how to return data through stored procedures.
Below is the stored procedure which I have created to return an object.
CREATE PROCEDURE [dbo].[get_Advertisements]
AS
BEGIN
    Select * From Advertisements;
END
I know the above stored procedure only selects the records, but what I want to do is:
Select one record at a time from the advertisement table
Create an object of the advertisement class and pass it the values read from the advertisement table
Return the object
Continue the above until the full table is read.
The Advertisement object has the following properties:
- topic
- content
How do I achieve this? Please help; I tried to do it myself but I am confused by the returning part.
Thank you for your time
Try this (this is for SQL Server and ASP.NET):
string connectionString = null;
SqlConnection sqlCnn;
SqlCommand sqlCmd;
SqlDataAdapter adapter = new SqlDataAdapter();
DataTable ds = new DataTable();
string sql = null;
connectionString = "Data Source=ServerName;Initial Catalog=DatabaseName;User ID=UserName;Password=Password";
// this should always be in the web.config file
sql = "Select * from Advertisements";
sqlCnn = new SqlConnection(connectionString);
try
{
    sqlCnn.Open();
    sqlCmd = new SqlCommand(sql, sqlCnn);
    adapter.SelectCommand = sqlCmd;
    adapter.Fill(ds);
    adapter.Dispose();
    sqlCmd.Dispose();
    sqlCnn.Close();
    // in ds you will get a table
    foreach (DataRow row in ds.Rows)
    {
        foreach (DataColumn column in ds.Columns)
        {
            Response.Write(row[column] + "<br/>");
            // read all values of the table
        }
    }
}
catch (Exception ex)
{
}
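To get from the filled DataTable to the objects the question asks about, you can map the rows yourself. A sketch (the Advertisement class and the topic/content column names are taken from the question and assumed to match the table):

public class Advertisement
{
    public string Topic { get; set; }
    public string Content { get; set; }
}

// ds is the DataTable filled by adapter.Fill(ds) above
List<Advertisement> advertisements = new List<Advertisement>();
foreach (DataRow row in ds.Rows)
{
    advertisements.Add(new Advertisement
    {
        Topic = row["topic"].ToString(),      // assumed column name
        Content = row["content"].ToString()   // assumed column name
    });
}
// return advertisements from your data access method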

MySqlClient: SaveChanges in ASP.NET doesn't update DB table

I'm using a MySQL database in an ASP.NET MVC 4 project with MySqlClient (MySQL Connector/NET).
The references include these DLLs: MySql.Data, MySql.Data.Entity, MySql.Web
Selects from the MySQL database execute successfully, but inserts and updates don't execute. No errors, no exceptions.
code №1:
var connectionString = "Server=my_server;Uid=my_login;Pwd=my_password;Old Guids=true;persist security info=True;database=clientest;allow zero datetime=True;convert zero datetime=True";
using (MySqlConnection conn = new MySqlConnection(connectionString))
{
    String commandText = "update testdb.visit set doctor_spec='dentist' where visit_id = 2;";
    MySqlCommand cmd = new MySqlCommand(commandText, conn);
    cmd.CommandType = System.Data.CommandType.Text;
    conn.Open();
    cmd.ExecuteNonQuery();
}
No errors, no exceptions, but the table isn't updated.
code №2
using (var db = new MySqlDBEntities())
{
    var vx = (from v in db.visit where v.visit_id == 1 select v).FirstOrDefault();
    vx.doctor_spec = "dentist";
    db.SaveChanges();
}
No errors, no exceptions, but the table isn't updated.
What's wrong? Is there maybe another way to use MySQL in ASP.NET MVC projects?
P.S. Sorry for my poor English :(
Check the connection string in web.config and make sure it points to the correct database. After SaveChanges completes successfully, any edited or new entity changes should be saved. Check this part of the code too:
from v in db.visit /* db.visits */ where ...
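One way to narrow this down (a sketch, not a guaranteed fix) is to check the return value of ExecuteNonQuery, which tells you whether the statement reached the server and how many rows it matched:

using (MySqlConnection conn = new MySqlConnection(connectionString))
using (MySqlCommand cmd = new MySqlCommand(
    "update testdb.visit set doctor_spec = @spec where visit_id = @id;", conn))
{
    cmd.Parameters.AddWithValue("@spec", "dentist");
    cmd.Parameters.AddWithValue("@id", 2);
    conn.Open();
    int affected = cmd.ExecuteNonQuery();
    // affected == 0 means the update ran but matched no rows (wrong database or row),
    // which points at the connection string rather than at the connector itself.
}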

How can I speed up updating lots of rows

I have a table that has 1,400,000 entries. It is a simple list of documents:
Table - Document
  ID             int
  DocumentPath   nvarchar
  DocumentValid  bit
I scan a directory and set any document found in the directory as valid.
public void SetReportsToValidated(List<int> validatedReports)
{
    SqlConnection myCon = null;
    try
    {
        myCon = new SqlConnection(_conn);
        myCon.Open();
        foreach (int id in validatedReports)
        {
            SqlDataAdapter myAdap = new SqlDataAdapter("update_DocumentValidated", myCon);
            myAdap.SelectCommand.CommandType = CommandType.StoredProcedure;
            SqlParameter pId = new SqlParameter("@Id", SqlDbType.Int);
            pId.Value = id;
            myAdap.SelectCommand.Parameters.Add(pId);
            myAdap.SelectCommand.ExecuteNonQuery();
        }
    }
    catch (SystemException ex)
    {
        _log.Error(ex);
        throw;
    }
    finally
    {
        if (myCon != null)
        {
            myCon.Close();
        }
    }
}
The performance of the updates is OK, but I want more. It takes more than an hour to update 1,000,000 of the documents to valid. Is there a good way to speed up the updates? I am thinking of using some kind of batching (like table-valued parameters).
Each update takes some 5-10 ms when profiled on SQL Server.
Read the reports in and append them together in a DataTable (since they have the same dimensions), then use the SqlBulkCopy object to upload the entire thing. It will probably work better for you. I don't think you will have memory issues given the small number of columns and rows.
At the moment you are calling the DB for each record individually. You can use the SqlDataAdapter to do bulk updates by (in a very brief nutshell):
1) defining one SqlDataAdapter
2) setting the .UpdateCommand on the adapter to your update sproc
3) calling the .Update method on the adapter, passing it a DataTable containing the IDs of documents to be updated. This will batch up the updated rows from the DataTable into the DB, calling the sproc for each record in a batched manner. You can control the batch size via the .UpdateBatchSize property.
4) What you're doing is removing the manual, row-by-row looping, which is inefficient compared to batched updates (a sketch follows the links below).
See examples:
http://support.microsoft.com/kb/308055
http://www.c-sharpcorner.com/UploadFile/61b832/4430/
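A sketch of the batched adapter approach, using the stored procedure and parameter from the question (the batch size is an arbitrary value to tune):

var table = new DataTable();
table.Columns.Add("Id", typeof(int));
foreach (int id in validatedReports)
    table.Rows.Add(id);
table.AcceptChanges();
foreach (DataRow row in table.Rows)
    row.SetModified();                          // Update() only sends Modified rows to UpdateCommand

using (var con = new SqlConnection(_conn))
using (var adapter = new SqlDataAdapter())
{
    var update = new SqlCommand("update_DocumentValidated", con)
    {
        CommandType = CommandType.StoredProcedure,
        UpdatedRowSource = UpdateRowSource.None // required for batching
    };
    update.Parameters.Add("@Id", SqlDbType.Int, 4, "Id");  // bound to the Id column
    adapter.UpdateCommand = update;
    adapter.UpdateBatchSize = 500;              // assumed value; tune it
    adapter.Update(table);                      // opens/closes the connection itself
}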
Alternatively, you could:
1) Use SqlBulkCopy to bulk insert all the IDs into a new table in the database (highly efficient)
2) Once loaded into that staging table, run a single SQL statement to update your main table from that staging table to validate the documents (sketched after the links below).
See examples:
http://www.adathedev.co.uk/2010/02/sqlbulkcopy-bulk-load-to-sql-server.html
http://www.adathedev.co.uk/2011/01/sqlbulkcopy-to-sql-server-in-parallel.html
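And a sketch of the staging-table alternative (the staging table name DocumentIdsStaging is an assumption; the Document table and DocumentValid column come from the question):

var ids = new DataTable();
ids.Columns.Add("Id", typeof(int));
foreach (int id in validatedReports)
    ids.Rows.Add(id);

using (var con = new SqlConnection(_conn))
{
    con.Open();
    using (var bulk = new SqlBulkCopy(con))
    {
        bulk.DestinationTableName = "DocumentIdsStaging";   // assumed staging table
        bulk.WriteToServer(ids);
    }
    using (var cmd = new SqlCommand(
        @"UPDATE d SET d.DocumentValid = 1
          FROM Document d
          INNER JOIN DocumentIdsStaging s ON s.Id = d.ID;", con))
    {
        cmd.ExecuteNonQuery();   // one set-based update instead of a call per row
    }
}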
Instead of creating the adapter and parameter every time in the loop, just create them once and assign a different value to the parameter:
SqlDataAdapter myAdap = new SqlDataAdapter("update_DocumentValidated", myCon);
myAdap.SelectCommand.CommandType = CommandType.StoredProcedure;
SqlParameter pId = new SqlParameter("@Id", SqlDbType.Int);
myAdap.SelectCommand.Parameters.Add(pId);
foreach (int id in validatedReports)
{
    myAdap.SelectCommand.Parameters[0].Value = id;
    myAdap.SelectCommand.ExecuteNonQuery();
}
This might not result in a very dramatic improvement, but it is better than the original code. Also, as you are manually executing the command, you do not need the adapter at all; just use the SqlCommand directly, for example:
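Something like this (a sketch using the same stored procedure and the already-open connection from the question):

using (var cmd = new SqlCommand("update_DocumentValidated", myCon))   // myCon is already open
{
    cmd.CommandType = CommandType.StoredProcedure;
    SqlParameter pId = cmd.Parameters.Add("@Id", SqlDbType.Int);
    foreach (int id in validatedReports)
    {
        pId.Value = id;
        cmd.ExecuteNonQuery();
    }
}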

SqlDependency and table update do not refresh DataContext

I'm having trouble with the implementation of SqlDependency in my project.
I'm using SqlDependency in a WCF service. The WCF service then holds all results from all tables in an in-memory cache in order to get a huge speed gain. Everything seems to be working fine, except when I'm doing a table row update. If I add or delete a row in my table, the DataContext is refreshed and the cache is invalidated without problems. But when it comes to a table row update, nothing happens: the cache is not invalidated, and when I look in debug mode at the content of the DataContext, no changes seem to be there.
Here's the code I'm using (note that I'm using the System.Runtime.Caching classes):
public static List<T> LinqCache<T>(this Table<T> query) where T : class
{
ObjectCache cache = MemoryCache.Default;
string tableName =
query.Context.Mapping.GetTable(typeof(T)).TableName;
List<T> result = cache[tableName] as List<T>;
if (result == null)
{
using (SqlConnection conn =
new SqlConnection(query.Context.Connection.ConnectionString))
{
conn.Open();
SqlCommand cmd = new SqlCommand(
query.Context.GetCommand(query).CommandText, conn);
cmd.Notification = null;
cmd.NotificationAutoEnlist = true;
SqlDependency dependency = new SqlDependency(cmd);
SqlChangeMonitor sqlMonitor =
new SqlChangeMonitor(dependency);
CacheItemPolicy policy = new CacheItemPolicy();
policy.ChangeMonitors.Add(sqlMonitor);
cmd.ExecuteNonQuery();
result = query.ToList();
cache.Set(tableName, result, policy);
}
}
return result;
}
I created an extension method, so all I have to do to query any table is this:
List<MyTable> list = context.MyTable.LinqCache();
My DataContext is opened in the Global.asax Application_OnStart and stored in the cache, so I can use it whenever I want in my WCF service. At that moment I'm also starting SqlDependency with
SqlDependency.Start(
ConfigurationManager.ConnectionStrings[myConnectionString].ConnectionString);
So, is that a limitation of SqlDependency, or am I doing something wrong / missing something in the process?
I think the problem is that although you do all the work in setting up the command object, you then do:
cmd.ExecuteNonQuery();
result = query.ToList();
This is going to execute your SqlCommand and throw away the results; then LINQ to SQL will generate its own query internally via query.ToList(). Thankfully, you can ask LINQ to SQL to execute your own command and translate the results for you, so try replacing those two lines with:
result = query.Context.Translate<T>(cmd.ExecuteReader()).ToList();

MySQL Connector: parameters not being added

Looking at my query log for MySQL, I see my parameters aren't being added. Here's my code:
MySqlConnection conn = new MySqlConnection(ApplicationVariables.ConnectionString());
MySqlCommand com = new MySqlCommand();
try
{
    conn.Open();
    com.Connection = conn;
    com.CommandText = String.Format(@"SELECT COUNT(*) AS totalViews
        FROM pr_postreleaseviewslog AS prvl
        WHERE prvl.dateCreated BETWEEN (@startDate) AND (@endDate) AND prvl.postreleaseID IN ({0})", ids);
    com.CommandType = CommandType.Text;
    com.Parameters.Add(new MySqlParameter("@startDate", thisCampaign.Startdate));
    com.Parameters.Add(new MySqlParameter("@endDate", endDate));
    numViews = Convert.ToInt32(com.ExecuteScalar());
}
catch (Exception ex)
{
}
finally
{
    conn.Dispose();
    com.Dispose();
}
Looking at the query log, I see this:
SELECT COUNT(*) AS totalViews
FROM pr_postreleaseviewslog AS prvl
WHERE prvl.dateCreated BETWEEN (@startDate) AND (@endDate) AND prvl.postreleaseID IN (1,2)
I've used the MySQL .NET connector on countless projects (I actually have a base class that takes care of opening these connections, and closing them with transactions, etc.). However, I took over this application, and here I am now.
Thanks for the help!
Try it like this:
mySqlCommand.Parameters.Add("@CustomerID", SqlDbType.NChar, 5);
mySqlCommand.Parameters["@CustomerID"].Value = "T1COM";
Some SQL clients, especially for MySQL, use "?" instead of "@".
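Applied to the query from the question, that would look roughly like this (a sketch; whether the connector expects "?" or "@" as the parameter marker depends on the Connector/NET version, so treat the marker itself as the thing to experiment with):

com.CommandText = String.Format(@"SELECT COUNT(*) AS totalViews
    FROM pr_postreleaseviewslog AS prvl
    WHERE prvl.dateCreated BETWEEN ?startDate AND ?endDate AND prvl.postreleaseID IN ({0})", ids);
com.Parameters.Add(new MySqlParameter("?startDate", thisCampaign.Startdate));   // ? marker instead of @
com.Parameters.Add(new MySqlParameter("?endDate", endDate));
numViews = Convert.ToInt32(com.ExecuteScalar());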