I'm having trouble with the implementation of SqlDependency in my project.
I'm using SqlDependency in a WCF service. The service holds an in-memory cache of the results from all tables for a big speed gain. Everything seems to be working fine, except for table row updates. If I add or delete a row in a table, the DataContext is refreshed and the cache is invalidated without problems. But when a table row is updated, nothing happens: the cache is not invalidated, and when I look at the DataContext's contents in debug mode, no changes appear there.
Here's the code I'm using (note that I'm using System.Runtime.Caching):
public static List<T> LinqCache<T>(this Table<T> query) where T : class
{
    ObjectCache cache = MemoryCache.Default;
    string tableName = query.Context.Mapping.GetTable(typeof(T)).TableName;
    List<T> result = cache[tableName] as List<T>;
    if (result == null)
    {
        using (SqlConnection conn =
            new SqlConnection(query.Context.Connection.ConnectionString))
        {
            conn.Open();
            SqlCommand cmd = new SqlCommand(
                query.Context.GetCommand(query).CommandText, conn);
            cmd.Notification = null;
            cmd.NotificationAutoEnlist = true;
            SqlDependency dependency = new SqlDependency(cmd);
            SqlChangeMonitor sqlMonitor = new SqlChangeMonitor(dependency);
            CacheItemPolicy policy = new CacheItemPolicy();
            policy.ChangeMonitors.Add(sqlMonitor);
            cmd.ExecuteNonQuery();
            result = query.ToList();
            cache.Set(tableName, result, policy);
        }
    }
    return result;
}
I created an extension method so all I have to do is query any table like this:
List<MyTable> list = context.MyTable.LinqCache();
My DataContext is opened in Global.asax Application_OnStart and stored in the cache, so I can use it whenever I want in my WCF service. At the same point I'm starting the SqlDependency listener with:
SqlDependency.Start(
    ConfigurationManager.ConnectionStrings[myConnectionString].ConnectionString);
So, is this a limitation of SqlDependency, or am I doing something wrong/missing something in the process?
I think the problem is that although you do all the work in setting up the command object, you then do:
cmd.ExecuteNonQuery();
result = query.ToList();
This is going to execute your SQL command and throw away the results, and then LINQ to SQL will generate its own command internally via query.ToList(). Thankfully you can ask LINQ to SQL to execute your own command and translate the results for you, so try replacing those two lines with:
result = query.Context.Translate<T>(cmd.ExecuteReader()).ToList();
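In the context of the extension method above, the tail of the using block would then look roughly like this (a sketch; the dependency setup is unchanged):
// Execute our own command so the notification subscription is tied
// to the reader that actually produces the cached results.
using (SqlDataReader reader = cmd.ExecuteReader())
{
    // Translate materializes the reader's rows as T entities;
    // enumerate with ToList() before the reader is disposed.
    result = query.Context.Translate<T>(reader).ToList();
}
cache.Set(tableName, result, policy);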
Similar to the question answered here (https://stackoverflow.com/a/42932812/1321510), we need to execute a raw SQL query. For the query we don't have any DbContext model (so any .FromSql answers won't work for us). However, we need to execute it within an existing transaction (created with context.Database.BeginTransaction()). None of the solutions found on SO work with an existing transaction.
Example:
var connection = context.Database.GetDbConnection();
using (var command = connection.CreateCommand())
{
    command.CommandText = sql;
    command.Transaction = context.Database.CurrentTransaction.GetDbTransaction();
    var executeReader = command.ExecuteReader();
    var values = new object[executeReader.FieldCount];
    if (!executeReader.Read())
    {
        return values;
    }
    executeReader.GetValues(values);
    return values;
}
Committing the transaction then throws System.InvalidOperationException: 'This MySqlConnection is already in use. See https://fl.vu/mysql-conn-reuse'.
The link provided in the exception doesn't seem helpful at all, since we're neither using async nor using the connection while reading from it.
We're using Pomelo.EntityFrameworkCore.MySql as the database connector.
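For what it's worth, a likely culprit in the snippet above is that the data reader is never disposed, so the connection still considers itself busy when the transaction is committed. A minimal sketch of the same code with the reader wrapped in a using block (assuming the same context, sql, and transaction as above):
var connection = context.Database.GetDbConnection();
using (var command = connection.CreateCommand())
{
    command.CommandText = sql;
    // Enlist in the transaction EF Core has already opened.
    command.Transaction = context.Database.CurrentTransaction.GetDbTransaction();
    using (var reader = command.ExecuteReader())
    {
        var values = new object[reader.FieldCount];
        if (reader.Read())
        {
            reader.GetValues(values);
        }
        // Disposing the reader here releases the connection, so a later
        // Commit() should no longer see it as "in use".
        return values;
    }
}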
I'm using a MySQL database in an ASP.NET MVC 4 project with MySqlClient (MySQL Connector/NET).
The references include these DLLs: MySql.Data, MySql.Data.Entity, MySql.Web.
Selects from the MySQL database execute successfully, but inserts and updates don't execute. No errors, no exceptions.
Code №1:
var connectionString = "Server=my_server;Uid=my_login;Pwd=my_password;Old Guids=true;persist security info=True;database=clientest;allow zero datetime=True;convert zero datetime=True";
using (MySqlConnection conn = new MySqlConnection(connectionString))
{
    String commandText = "update testdb.visit set doctor_spec='dentist' where visit_id = 2;";
    MySqlCommand cmd = new MySqlCommand(commandText, conn);
    cmd.CommandType = System.Data.CommandType.Text;
    conn.Open();
    cmd.ExecuteNonQuery();
}
No errors, no exceptions, but the table isn't updated.
Code №2:
using (var db = new MySqlDBEntities())
{
    var vx = (from v in db.visit where v.visit_id == 1 select v).FirstOrDefault();
    vx.doctor_spec = "dentist";
    db.SaveChanges();
}
No errors, no exceptions, but the table isn't updated.
What's wrong? Is there perhaps another way to use MySQL in ASP.NET MVC projects?
P.S. Sorry for my poor English :(
Check the connection string in web.config and make sure it points at the correct database. After SaveChanges completes successfully, any edited or new entities should be persisted.
Check this code too; the entity set may be named visits rather than visit:
from v in db.visit /* db.visits */ where ...
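It may also help to check how many rows the UPDATE in code №1 actually matches; ExecuteNonQuery returns the affected row count, so a sketch like this (same connection string as above) shows whether the WHERE clause finds the row at all:
using (MySqlConnection conn = new MySqlConnection(connectionString))
{
    conn.Open();
    MySqlCommand cmd = new MySqlCommand(
        "update testdb.visit set doctor_spec='dentist' where visit_id = 2;", conn);
    // 0 affected rows means the WHERE clause matched nothing; note that the
    // connection string selects `clientest` while the UPDATE targets `testdb`.
    int affected = cmd.ExecuteNonQuery();
    Console.WriteLine("Rows affected: " + affected);
}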
Edit: I solved my problem, but if you have anything to add, please do. Thanks.
Note: I did not create the DB; it was created by WordPress, hosted on GoDaddy with my site.
I have a MySQL database called "wordpress" (for clarity). I want to be able to grab the most recent post from my blog and show it on the landing page for my URL.
So my thought is this: connect to the MySQL DB, run a query to grab the most recent post, and display the post.
I built a class to handle the connection and process the request:
public class DAL
{
    private string connectionString = "DRIVER={MySQL ODBC 3.51 Driver}; SERVER=[server here]; PORT=[port]; DATABASE=wordpress; USER=[user name here]; PASSWORD=[password here];";
    private OdbcConnection blogConnection;

    public DAL()
    {
        blogConnection = new OdbcConnection(connectionString);
    }

    public String[] GetRecentPost()
    {
        string queryString = "SELECT * FROM RecentPost";
        String[] recentPost = new String[3];
        //ODBC
        blogConnection.Open();
        OdbcCommand MySqlDB = new OdbcCommand(queryString, blogConnection);
        OdbcDataReader reader = MySqlDB.ExecuteReader();
        while (reader.NextResult())
        {
            recentPost[0] = reader.GetString(0);
            recentPost[1] = reader.GetString(1);
        }
        recentPost[2] = reader.HasRows.ToString();
        blogConnection.Close();
        return recentPost;
    }
}
In the queryString above, RecentPost is a view I created to simplify the query string, since the query was a bit long.
I already know the view works: I tested it by opening phpMyAdmin from within the GoDaddy hosting center and executing the query above, and I got the correct result, so I don't think the query/view is wrong.
The code-behind for the landing page:
protected void Page_Load(object sender, EventArgs e)
{
    DAL dataAccess = new DAL();
    String[] recentPost = dataAccess.GetRecentPost();
    Title.Text = recentPost[0];
    Post.Text = recentPost[1];
    Extra.Text = recentPost[2];
}
So when my page loads, the Title and Post texts are empty and Extra.Text is False (which, from the DAL, is the value of reader.HasRows).
So my guess is that it's connecting fine and running the query, but maybe on the wrong database? I don't know.
I also tried to debug, but then my code throws an error about trying to connect to the database.
So my questions are: do you see anything wrong with the connection string?
If not, do you see anything else that would cause a connection to be established and a query to run, with no exceptions thrown but no results returned?
Anyone with experience trying to grab data from their own WordPress blog?
Thanks for the help - this one has been driving me crazy.
I don't know why my original code wasn't working, but I solved my issue. For anyone else having this problem, here is how I changed my code (in the GetRecentPost method):
DataSet ds = new DataSet();
//ODBC
blogConnection.Open();
OdbcDataAdapter MySqlDB = new OdbcDataAdapter(queryString, blogConnection);
MySqlDB.Fill(ds);
return ds.Tables[0];
So instead of an array of strings I used a DataSet. Instead of the OdbcDataReader I used an OdbcDataAdapter and populated the DataSet with its Fill() method, then returned the first table from the DataSet to my Page_Load method.
Here is my new Page_Load():
DataTable table = dataAccess.GetRecentPost();
if (table.Rows.Count > 0)
{
    Title.Text = table.Rows[0]["title"].ToString();
    Post.Text = table.Rows[0]["content"].ToString();
}
else
    Extra.Text = table.Rows.Count.ToString(); // if nothing was returned, output the 0 just to be sure
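For what it's worth, the original reader-based version probably failed because OdbcDataReader.NextResult() advances to the next result set of a batched query, not to the next row, so with a single SELECT the loop body never runs. Reading rows with Read() instead would look roughly like this (a sketch using the same fields as the original GetRecentPost):
OdbcDataReader reader = MySqlDB.ExecuteReader();
// Read() advances one row at a time; NextResult() jumps to the next
// result set, which this single-statement query doesn't have.
while (reader.Read())
{
    recentPost[0] = reader.GetString(0);
    recentPost[1] = reader.GetString(1);
}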
Hope this helps anyone else with this issue.
And thanks to anyone who took the time to look.
I have a table with 1,400,000 entries. It is a simple list of documents.
Table - Document
ID            int
DocumentPath  nvarchar
DocumentValid bit
I scan a directory and set any document found in the directory as valid.
public void SetReportsToValidated(List<int> validatedReports)
{
    SqlConnection myCon = null;
    try
    {
        myCon = new SqlConnection(_conn);
        myCon.Open();
        foreach (int id in validatedReports)
        {
            SqlDataAdapter myAdap = new SqlDataAdapter("update_DocumentValidated", myCon);
            myAdap.SelectCommand.CommandType = CommandType.StoredProcedure;
            SqlParameter pId = new SqlParameter("@Id", SqlDbType.Int);
            pId.Value = id;
            myAdap.SelectCommand.Parameters.Add(pId);
            myAdap.SelectCommand.ExecuteNonQuery();
        }
    }
    catch (SystemException ex)
    {
        _log.Error(ex);
        throw;
    }
    finally
    {
        if (myCon != null)
        {
            myCon.Close();
        }
    }
}
The performance of the updates is OK, but I want more. It takes more than an hour to update 1,000,000 of the documents to valid. Is there a good way to speed up the updates? I am thinking of using some kind of batching (like table-valued parameters).
Each update takes some 5-10 ms when profiled on SQL Server.
Read the reports in and append them together in a DataTable (since they have the same dimensions), then use the SqlBulkCopy object to upload the entire thing. It will probably work better for you. I don't think you will have memory issues given the small number of columns and rows.
At the moment you are calling the DB for each record individually. You can use the SqlDataAdapter to do bulk updates by (in a very brief nutshell):
1) defining one SqlDataAdapter
2) setting the .UpdateCommand on the adapter to your update sproc
3) calling the .Update method on the adapter, passing it a DataTable containing the IDs of the documents to be updated. This will batch up the rows from the DataTable to the DB, calling the sproc for each record in a batched manner. You can control the batch size via the .BatchSize property.
4) This removes the manual row-by-row looping, which is inefficient for batched updates. A sketch follows the links below.
See examples:
http://support.microsoft.com/kb/308055
http://www.c-sharpcorner.com/UploadFile/61b832/4430/
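A rough sketch of the adapter-based approach (assuming the update_DocumentValidated sproc and @Id parameter from the question; illustrative rather than definitive):
DataTable table = new DataTable();
table.Columns.Add("Id", typeof(int));
foreach (int id in validatedReports)
{
    table.Rows.Add(id);
}
table.AcceptChanges();      // rows become Unchanged...
foreach (DataRow row in table.Rows)
{
    row.SetModified();      // ...then flag them so Update() sends each one
}

using (SqlConnection con = new SqlConnection(_conn))
using (SqlDataAdapter adapter = new SqlDataAdapter())
{
    SqlCommand update = new SqlCommand("update_DocumentValidated", con);
    update.CommandType = CommandType.StoredProcedure;
    update.Parameters.Add("@Id", SqlDbType.Int, 4, "Id"); // bound to the Id column
    update.UpdatedRowSource = UpdateRowSource.None;       // required for batching
    adapter.UpdateCommand = update;
    adapter.UpdateBatchSize = 500;                        // rows per round trip
    con.Open();
    adapter.Update(table);
}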
Alternatively, you could:
1) Use SqlBulkCopy to bulk insert all the IDs into a new table in the database (highly efficient)
2) Once loaded into that staging table, run a single SQL statement to update your main table from the staging table, validating the documents. A sketch follows the links below.
See examples:
http://www.adathedev.co.uk/2010/02/sqlbulkcopy-bulk-load-to-sql-server.html
http://www.adathedev.co.uk/2011/01/sqlbulkcopy-to-sql-server-in-parallel.html
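Roughly, the staging-table route could look like this (the #ValidatedIds temp table and its schema are assumptions for illustration):
DataTable ids = new DataTable();
ids.Columns.Add("Id", typeof(int));
foreach (int id in validatedReports)
{
    ids.Rows.Add(id);
}

using (SqlConnection con = new SqlConnection(_conn))
{
    con.Open();
    new SqlCommand("CREATE TABLE #ValidatedIds (Id int PRIMARY KEY);", con)
        .ExecuteNonQuery();

    using (SqlBulkCopy bulk = new SqlBulkCopy(con))
    {
        bulk.DestinationTableName = "#ValidatedIds";
        bulk.WriteToServer(ids);  // one highly efficient bulk insert
    }

    // One set-based UPDATE instead of a round trip per document.
    new SqlCommand(
        "UPDATE d SET d.DocumentValid = 1 " +
        "FROM Document d JOIN #ValidatedIds v ON v.Id = d.ID;", con)
        .ExecuteNonQuery();
}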
Instead of creating the adapter and the parameter every time in the loop, just create them once and assign different values to the parameter:
SqlDataAdapter myAdap = new SqlDataAdapter("update_DocumentValidated", myCon);
myAdap.SelectCommand.CommandType = CommandType.StoredProcedure;
SqlParameter pId = new SqlParameter("@Id", SqlDbType.Int);
myAdap.SelectCommand.Parameters.Add(pId);
foreach (int id in validatedReports)
{
    myAdap.SelectCommand.Parameters[0].Value = id;
    myAdap.SelectCommand.ExecuteNonQuery();
}
This might not result in a very dramatic improvement, but it is better than the original code. Also, as you are executing the SqlCommand manually, you do not need the adapter at all; just use the SqlCommand directly.
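A sketch of the same loop with a bare SqlCommand (same sproc and parameter assumed):
using (SqlCommand cmd = new SqlCommand("update_DocumentValidated", myCon))
{
    cmd.CommandType = CommandType.StoredProcedure;
    SqlParameter pId = cmd.Parameters.Add("@Id", SqlDbType.Int);
    foreach (int id in validatedReports)
    {
        pId.Value = id;        // reuse the command; only the value changes
        cmd.ExecuteNonQuery();
    }
}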
Suppose I have an Orders table in my database and a corresponding model class generated by the VS2008 "Linq to SQL Classes" designer. Suppose I also have a stored procedure (ProcessOrder) in my database that I use to do some processing on an order record.
If I do the following:
var order = dataContext.Orders.Where(o => o.id == orderId).First();
// More code here
dataContext.ProcessOrder(orderId);
order.Status = "PROCESSED";
dataContext.SubmitChanges();
...then I'll get a concurrency violation if the ProcessOrder stored proc has modified the order (which is of course very likely), because L2S will detect that the order record has changed and will fail to submit the changes to that order.
That's all fairly logical, but what if I want to update the order record after calling the stored proc? How do I tell L2S to forget about its cached copy and refresh it from the DB?
You can do it with the Refresh method on your data context, like so:
dataContext.Refresh(System.Data.Linq.RefreshMode.OverwriteCurrentValues,
    dataContext.Orders);
DataContext.Refresh works, but it is very slow.
A faster way is to clear the DataContext's internal cache using this method:
public static void ClearCache(this DataContext context)
{
    // DataContext.ClearCache is internal, so it has to be invoked via reflection.
    const BindingFlags FLAGS = BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic;
    var method = context.GetType().GetMethod("ClearCache", FLAGS);
    method.Invoke(context, null);
}
Then just call dataContext.ClearCache() (it's an extension method) and the next query will go back to the database.
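Applied to the scenario in the question, usage might look roughly like this (a sketch; ClearCache discards the tracked copies, so re-query the order afterwards):
dataContext.ProcessOrder(orderId);  // the sproc modifies the row in the DB
dataContext.ClearCache();           // drop L2S's stale cached copy
// Re-query to pick up the post-sproc values before changing them further.
var order = dataContext.Orders.First(o => o.id == orderId);
order.Status = "PROCESSED";
dataContext.SubmitChanges();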