DuplicateKeyException on the SubmitChanges method - Windows Phone - LINQ to SQL

I am learning how to insert data into the database using LINQ and also into an ObservableCollection.
The data gets inserted into the ObservableCollection but not into the database. Could somebody explain what's going on with the code below? The system throws an unhandled exception on the SubmitChanges method. Please advise.
public void populateDates(DateTime theWeek)
{
    ObservableCollection<theSchedule> theDatesList = new ObservableCollection<theSchedule>();
    for (int i = 0; i < 7; i++)
    {
        theSchedule theShift = new theSchedule
        {
            theDay = (theWeek.AddDays(i).ToString("dd/MM/yyyy")),
            theTime = (theWeek.AddHours(i).ToString("HH:mm")) + " - " + (theWeek.AddHours(i + 8).ToString("HH:mm"))
        };
        theDatesList.Add(theShift);
        //MessageBox.Show(theWeek.AddDays(i).ToString("HH:mm"));
        shiftsDb.theSchedules.InsertOnSubmit(theShift);
    }
    shiftsDb.SubmitChanges();
    mylistbox.ItemsSource = theDatesList;
}
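A DuplicateKeyException from SubmitChanges generally means two rows end up with the same primary-key value: either the key column is not database-generated, so every new theSchedule carries the same default key, or the method runs more than once and re-inserts dates whose keys already exist in the table. A minimal sketch of the first fix, assuming an integer surrogate key (the class shape and attribute values below are assumptions, not taken from your code):

[Table]
public class theSchedule
{
    // Assumed identity column; with IsDbGenerated the database assigns a unique key per insert.
    [Column(IsPrimaryKey = true, IsDbGenerated = true, DbType = "INT IDENTITY NOT NULL")]
    public int Id { get; set; }

    [Column]
    public string theDay { get; set; }

    [Column]
    public string theTime { get; set; }
}

If the key is meant to be theDay itself, check whether the rows already exist before calling InsertOnSubmit.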

Related

I have a D365 database and a MySQL database and need to make this generic, but the catch is that the attribute names are not the same in both databases.

using (var s = webRequest.GetResponse().GetResponseStream())
{
    using (var sr = new StreamReader(s))
    {
        var entitydatafromMysql = sr.ReadToEnd();
        string bsObj2 = string.Empty;
        if (entitydatafromMysql.Contains("Success0:"))
        {
            DataContractJsonSerializer deserializer = new DataContractJsonSerializer(typeof(string));
            using (var ms = new MemoryStream(Encoding.Unicode.GetBytes(entitydatafromMysql)))
            {
                bsObj2 = (string)deserializer.ReadObject(ms);
                bsObj2 = bsObj2.Remove(0, 9);
                dtMySQLdata = (DataTable)JsonConvert.DeserializeObject(bsObj2, (typeof(DataTable)));
                for (int i = 0; i < dtMySQLdata.Rows.Count; i++)
                {
                    Entity entity = new Entity("efl_archivalemail");
                    entity["efl_archivalemailid"] = Convert.ToString(dtMySQLdata.Rows[i]["activityid"]);
                    entity.Id = new Guid(Convert.ToString(dtMySQLdata.Rows[i]["activityid"]));
                    entity["efl_name"] = Convert.ToString(dtMySQLdata.Rows[i]["sender"]);
                    entity["efl_description"] = Convert.ToString(dtMySQLdata.Rows[i]["description"]);
                    ec.Entities.Add(entity);
                }
            }
        }
        else
        {
            throw new InvalidPluginExecutionException(entitydatafromMysql);
        }
    }
}
This is plugin code, so I am using a POST API to connect and get data from the MySQL database. I know I need to map the CRM and MySQL entity attributes, but I can only access the CRM attributes through the IServiceProvider execution context. How can I do this?
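One way to keep the mapping out of the loop is to describe the MySQL-column-to-CRM-attribute pairs in a dictionary and build each Entity from it. This is only a sketch, assuming the column and attribute names visible in the snippet above; adjust the dictionary to your real schemas:

// Hypothetical mapping: MySQL column name -> CRM attribute logical name.
var columnToAttribute = new Dictionary<string, string>
{
    { "activityid",  "efl_archivalemailid" },
    { "sender",      "efl_name" },
    { "description", "efl_description" }
};

foreach (DataRow row in dtMySQLdata.Rows)
{
    Entity entity = new Entity("efl_archivalemail");
    foreach (var map in columnToAttribute)
        entity[map.Value] = Convert.ToString(row[map.Key]);

    // The id column still needs special handling because it also becomes the record GUID.
    entity.Id = new Guid(Convert.ToString(row["activityid"]));
    ec.Entities.Add(entity);
}

If another source system shows up later, only the dictionary (or a configuration record holding it) needs to change, not the plugin logic.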

SLARToolkit Solution Error

I'm getting a very strange error in an AR project I'm working on: "A 'System.NullReferenceException' occurred in SLARToolKit WinPhone.DLL", with a specific line of code being highlighted, that being:
private static Marker LoadFromResource(string relativePath, int segmentsX, int segmentsY, double width, System.Reflection.Assembly assembly)
{
    var asmName = new System.Reflection.AssemblyName(assembly.FullName).Name;
    using (var markerStream = Application.GetResourceStream(new Uri(asmName + ";component/" + relativePath, UriKind.Relative)).Stream)
    {
        return Load(markerStream, segmentsX, segmentsY, width);
    }
}
Specifically, this line is being highlighted:
using (var markerStream = Application.GetResourceStream(new Uri(asmName + ";component/" + relativePath, UriKind.Relative)).Stream)
This code is found in a "Marker.cs" file, which I'm guessing is part of SLARToolkit.DLL. Any ideas why I'm getting this error? Thanks in advance.
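Application.GetResourceStream returns null, rather than throwing, when the resource cannot be found, so the chained .Stream access inside the toolkit is what raises the NullReferenceException. The usual cause is that the marker file's relative path does not match its location in the project, or its Build Action is not set so it gets embedded where the toolkit expects it. A hedged rewrite of that method that surfaces the real failure could look like this:

private static Marker LoadFromResource(string relativePath, int segmentsX, int segmentsY, double width, System.Reflection.Assembly assembly)
{
    var asmName = new System.Reflection.AssemblyName(assembly.FullName).Name;
    var uri = new Uri(asmName + ";component/" + relativePath, UriKind.Relative);

    // GetResourceStream returns null when the resource is missing instead of throwing.
    var resource = Application.GetResourceStream(uri);
    if (resource == null)
        throw new System.IO.FileNotFoundException("Marker resource not found: " + uri);

    using (var markerStream = resource.Stream)
    {
        return Load(markerStream, segmentsX, segmentsY, width);
    }
}

Since Marker.cs ships with the toolkit, the practical fix is on the calling side: double-check the path you pass to the marker loader and how the marker file is included in your project.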

How to append results in Processing?

I have implemented the Table() function in order to save the results generated by the application. However, it seems that the Timer function in the application causes the application to write over the existing CSV file each time it runs. Rather than write over the existing CSV file, I would like to append the newest search results to the existing CSV file. Is there a way to do this? Is it easier to append the results if the results are stored in a different format such as JSON?
import java.util.List;

Timer timer;
Table table;
long lastID = Long.MAX_VALUE;

void setup() {
  timer = new Timer(30000);
  timer.start();
  goTwitter();
  table = new Table();
  table.addColumn("id");
  table.addColumn("latitude");
  table.addColumn("longitude");
}

void draw() {
  if (timer.isFinished()) {
    goTwitter();
    timer.start();
  }
}

void goTwitter() {
  ConfigurationBuilder cb = new ConfigurationBuilder();
  cb.setOAuthConsumerKey("");
  cb.setOAuthConsumerSecret("");
  cb.setOAuthAccessToken("");
  cb.setOAuthAccessTokenSecret("");
  Twitter twitter = new TwitterFactory(cb.build()).getInstance();
  Query query = new Query("#love");
  int numberOfTweets = 300;
  ArrayList<Status> tweets = new ArrayList<Status>();
  while (tweets.size() < numberOfTweets) {
    if (numberOfTweets - tweets.size() > 100)
      query.setCount(100);
    else
      query.setCount(numberOfTweets - tweets.size());
    //long lastID = Long.MAX_VALUE;
    try {
      QueryResult result = twitter.search(query);
      tweets.addAll(result.getTweets());
      println("Gathered " + tweets.size() + " tweets");
      for (Status t : tweets)
        if (t.getId() < lastID) lastID = t.getId();
    }
    catch (TwitterException te) {
      println("Couldn't connect: " + te);
    }
    query.setSinceId(lastID);
  }
  for (int i = 0; i < tweets.size(); i++) {
    Status t = (Status) tweets.get(i);
    GeoLocation loc = t.getGeoLocation();
    String user = t.getUser().getScreenName();
    String msg = t.getText();
    String time = "";
    if (loc != null) {
      Double lat = t.getGeoLocation().getLatitude();
      Double lon = t.getGeoLocation().getLongitude();
      println(i + " USER: " + user + " wrote: " + msg + " located at " + lat + ", " + lon);
      TableRow newRow = table.addRow();
      newRow.setString("id", user);
      newRow.setDouble("latitude", lat);
      newRow.setDouble("longitude", lon);
      saveTable(table, "data2/syria_16500_5.csv");
    }
  }
  println("lastID= " + lastID);
}
class Timer {
  int savedTime;
  int totalTime;

  Timer(int tempTotalTime) {
    totalTime = tempTotalTime;
  }

  void start() {
    savedTime = millis();
  }

  boolean isFinished() {
    int passedTime = millis() - savedTime;
    if (passedTime > totalTime) {
      return true;
    } else {
      return false;
    }
  }
}
Well, there does not seem to be a direct way to append to a table, so you'll have to resort to a hack: load the table in Processing, write to it, and resave it, sort of like this:
processing.data.Table table;

void setup() {
  File f = new File(sketchPath("") + "data2/syria_16500_5.csv");
  println(f.getAbsolutePath());
  if (!f.exists()) {
    table = new processing.data.Table();
    table.addColumn("id");
    table.addColumn("latitude");
    table.addColumn("longitude");
  }
  else
    table = loadTable("data2/syria_16500_5.csv", "header, csv");

  TableRow newRow = table.addRow();
  newRow.setString("id", "asad");
  newRow.setDouble("latitude", 234);
  newRow.setDouble("longitude", 2523);
  saveTable(table, "data2/syria_16500_5.csv");
}
The sketch first checks if the file exists. If it does not, it creates a new table, otherwise it loads the old table in with its header.
Be warned, this is not particularly safe... If you change your columns (say, in a text editor) and try to run the sketch again you will get an exception.

Bulk Insert into SQL Server 2008

for (int i = 0; i < myClass.Length; i++)
{
    string upSql = "UPDATE CumulativeTable SET EngPosFT = @EngPosFT, EngFTAv = @EngFTAv WHERE RegNumber = @RegNumber AND Session = @Session AND Form = @Form AND Class = @Class";
    SqlCommand cmdB = new SqlCommand(upSql, connection);
    cmdB.CommandTimeout = 980000;
    cmdB.Parameters.AddWithValue("@EngPosFT", Convert.ToInt32(Pos.GetValue(i)));
    cmdB.Parameters.AddWithValue("@RegNumber", myClass.GetValue(i));
    cmdB.Parameters.AddWithValue("@EngFTAv", Math.Round((engtot / arrayCount), 2));
    cmdB.Parameters.AddWithValue("@Session", drpSess.SelectedValue);
    cmdB.Parameters.AddWithValue("@Form", drpForm.SelectedValue);
    cmdB.Parameters.AddWithValue("@Class", drpClass.SelectedValue);
    int idd = Convert.ToInt32(cmdB.ExecuteScalar());
}
Assuming myClass.Length is 60, this does 60 update statements. How can I limit it to 1 update statement? A code example using the above code will be appreciated. Thanks.
I tried using this:
StringBuilder command = new StringBuilder();
SqlCommand cmdB = null;
for (int i = 0; i < myClass.Length; i++)
{
    command.Append("UPDATE CumulativeTable SET" + " EngPosFT = " + Convert.ToInt32(Pos.GetValue(i)) + "," + " EngFTAv = " + Math.Round((engtot / arrayCount), 2) +
        " WHERE RegNumber = " + myClass.GetValue(i) + " AND Session= " + drpSess.SelectedValue + " AND Form= " + drpForm.SelectedValue + " AND Class= " + drpClass.SelectedValue + ";");
    //or command.AppendFormat("UPDATE CumulativeTable SET EngPosFT = {0},EngFTAv={1} WHERE RegNumber ={2} AND Session={3} AND Form={4} AND Class={5};", Convert.ToInt32(Pos.GetValue(i)), Math.Round((engtot / arrayCount), 2), myClass.GetValue(i), drpSess.SelectedValue, drpForm.SelectedValue, drpClass.SelectedValue);
} //max length is 128 error is encountered
Look at the BULK INSERT T-SQL command. I don't have a lot of personal experience with that command, but I do see some immediate opportunity to improve this code using the same SQL, by creating the command and parameters outside of the loop and only making the necessary changes inside the loop:
string upSql = "UPDATE CumulativeTable SET EngPosFT = @EngPosFT, EngFTAv = @EngFTAv WHERE RegNumber = @RegNumber AND Session = @Session AND Form = @Form AND Class = @Class";
SqlCommand cmdB = new SqlCommand(upSql, connection);
cmdB.CommandTimeout = 980000;

// I had to guess at the SQL types you used here.
// Adjust this to match your actual column data types.
cmdB.Parameters.Add("@EngPosFT", SqlDbType.Int);
cmdB.Parameters.Add("@RegNumber", SqlDbType.Int);

// It's really better to use explicit types here, too.
// I'll just update the first parameter as an example of how it looks:
cmdB.Parameters.Add("@EngFTAv", SqlDbType.Decimal).Value = Math.Round((engtot / arrayCount), 2);
cmdB.Parameters.AddWithValue("@Session", drpSess.SelectedValue);
cmdB.Parameters.AddWithValue("@Form", drpForm.SelectedValue);
cmdB.Parameters.AddWithValue("@Class", drpClass.SelectedValue);

for (int i = 0; i < myClass.Length; i++)
{
    cmdB.Parameters[0].Value = Convert.ToInt32(Pos.GetValue(i));
    cmdB.Parameters[1].Value = myClass.GetValue(i);
    int idd = Convert.ToInt32(cmdB.ExecuteScalar());
}
It would be better in this case to create a stored procedure that accepts a Table Valued Parameter. On the .NET side of things, you create a DataTable object containing a row for each set of values you want to use.
On the SQL Server side of things, you can treat the parameter as another table in a query. So inside the stored proc, you'd have:
UPDATE a
SET
    EngPosFT = b.EngPosFT,
    EngFTAv = b.EngFTAv
FROM
    CumulativeTable a
    INNER JOIN @MyParm b
        ON a.RegNumber = b.RegNumber AND
           a.Session = b.Session AND
           a.Form = b.Form AND
           a.Class = b.Class
Where @MyParm is your table valued parameter.
This will then be processed as a single round-trip to the server.
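For completeness, a rough sketch of the .NET side follows, assuming a user-defined table type (here called dbo.CumulativeUpdateType) and a stored procedure (here called dbo.UpdateCumulative) containing the UPDATE ... JOIN above; the names and column types are placeholders to adapt to your schema:

DataTable updates = new DataTable();
updates.Columns.Add("EngPosFT", typeof(int));
updates.Columns.Add("EngFTAv", typeof(decimal));
updates.Columns.Add("RegNumber", typeof(string));
updates.Columns.Add("Session", typeof(string));
updates.Columns.Add("Form", typeof(string));
updates.Columns.Add("Class", typeof(string));

for (int i = 0; i < myClass.Length; i++)
{
    updates.Rows.Add(
        Convert.ToInt32(Pos.GetValue(i)),
        Math.Round((engtot / arrayCount), 2),
        myClass.GetValue(i),
        drpSess.SelectedValue,
        drpForm.SelectedValue,
        drpClass.SelectedValue);
}

using (SqlCommand cmd = new SqlCommand("dbo.UpdateCumulative", connection))
{
    cmd.CommandType = CommandType.StoredProcedure;
    SqlParameter p = cmd.Parameters.AddWithValue("@MyParm", updates);
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName = "dbo.CumulativeUpdateType"; // the user-defined table type
    cmd.ExecuteNonQuery();
}

All 60 rows then travel to the server in one call, and the join inside the procedure applies them as a single UPDATE.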
In such scenarios it is always best to write a Stored Procedure and call that stored proc in the for loop, passing the necessary arguments at each call.
using System;
using System.Data;
using System.Data.SqlClient;

namespace DataTableExample
{
    class Program
    {
        static void Main(string[] args)
        {
            DataTable prodSalesData = new DataTable("ProductSalesData");

            // Create Column 1: SaleDate
            DataColumn dateColumn = new DataColumn();
            dateColumn.DataType = Type.GetType("System.DateTime");
            dateColumn.ColumnName = "SaleDate";

            // Create Column 2: ProductName
            DataColumn productNameColumn = new DataColumn();
            productNameColumn.ColumnName = "ProductName";

            // Create Column 3: TotalSales
            DataColumn totalSalesColumn = new DataColumn();
            totalSalesColumn.DataType = Type.GetType("System.Int32");
            totalSalesColumn.ColumnName = "TotalSales";

            // Add the columns to the ProductSalesData DataTable
            prodSalesData.Columns.Add(dateColumn);
            prodSalesData.Columns.Add(productNameColumn);
            prodSalesData.Columns.Add(totalSalesColumn);

            // Let's populate the datatable with our stats.
            // You can add as many rows as you want here!

            // Create a new row
            DataRow dailyProductSalesRow = prodSalesData.NewRow();
            dailyProductSalesRow["SaleDate"] = DateTime.Now.Date;
            dailyProductSalesRow["ProductName"] = "Nike";
            dailyProductSalesRow["TotalSales"] = 10;

            // Add the row to the ProductSalesData DataTable
            prodSalesData.Rows.Add(dailyProductSalesRow);

            // Copy the DataTable to SQL Server using SqlBulkCopy
            using (SqlConnection dbConnection = new SqlConnection("Data Source=ProductHost;Initial Catalog=dbProduct;Integrated Security=SSPI;Connection Timeout=60;Min Pool Size=2;Max Pool Size=20;"))
            {
                dbConnection.Open();
                using (SqlBulkCopy s = new SqlBulkCopy(dbConnection))
                {
                    s.DestinationTableName = prodSalesData.TableName;
                    foreach (var column in prodSalesData.Columns)
                        s.ColumnMappings.Add(column.ToString(), column.ToString());
                    s.WriteToServer(prodSalesData);
                }
            }
        }
    }
}

SQL Server 2008 Tuning with large transactions (700k+ rows/transaction)

So, I'm working on a database that I will be adding to my future projects as sort of a supporting db, but I'm having a bit of an issue with it, especially the logs.
The database basically needs to be updated once a month. The main table has to be purged and then refilled from a CSV file. The problem is that SQL Server will generate a log for this which is MEGA big. I was successful in filling it up once, but wanted to test the whole process by purging it and then refilling it.
That's when I get an error that the log file is full. It jumps from 88MB (after shrinking via maintenance plan) to 248MB, then stops the process altogether and never completes.
I've capped its growth at 256MB, incrementing by 16MB, which is why it failed, but in reality I don't need it to log anything at all. Is there a way to just completely bypass logging on any query being run against the database?
Thanks for any responses in advance!
EDIT: Per the suggestions of @mattmc3, I've implemented SqlBulkCopy for the whole procedure. It works AMAZING, except my loop is somehow crashing on the very last remaining chunk that needs to be inserted. I'm not too sure where I'm going wrong, heck, I don't even know if this is a proper loop, so I'd appreciate some help on it.
I do know that it's an issue with the very last GetDataTable or SetSqlBulkCopy call. I'm trying to insert 788189 rows; 788000 get in and the remaining 189 are crashing...
string[] Rows;

using (StreamReader Reader = new StreamReader("C:/?.csv")) {
    Rows = Reader.ReadToEnd().TrimEnd().Split(new char[1] {
        '\n'
    }, StringSplitOptions.RemoveEmptyEntries);
}

int RowsInserted = 0;

using (SqlConnection Connection = new SqlConnection("")) {
    Connection.Open();

    DataTable Table = null;

    while ((RowsInserted < Rows.Length) && ((Rows.Length - RowsInserted) >= 1000)) {
        Table = GetDataTable(Rows.Skip(RowsInserted).Take(1000).ToArray());
        SetSqlBulkCopy(Table, Connection);
        RowsInserted += 1000;
    }

    Table = GetDataTable(Rows.Skip(RowsInserted).ToArray());
    SetSqlBulkCopy(Table, Connection);

    Connection.Close();
}
static DataTable GetDataTable(
    string[] Rows) {
    using (DataTable Table = new DataTable()) {
        Table.Columns.Add(new DataColumn("A"));
        Table.Columns.Add(new DataColumn("B"));
        Table.Columns.Add(new DataColumn("C"));
        Table.Columns.Add(new DataColumn("D"));

        for (short a = 0, b = (short)Rows.Length; a < b; a++) {
            string[] Columns = Rows[a].Split(new char[1] {
                ','
            }, StringSplitOptions.RemoveEmptyEntries);

            DataRow Row = Table.NewRow();
            Row["A"] = Columns[0];
            Row["B"] = Columns[1];
            Row["C"] = Columns[2];
            Row["D"] = Columns[3];
            Table.Rows.Add(Row);
        }

        return (Table);
    }
}
static void SetSqlBulkCopy(
    DataTable Table,
    SqlConnection Connection) {
    using (SqlBulkCopy SqlBulkCopy = new SqlBulkCopy(Connection)) {
        SqlBulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping("A", "A"));
        SqlBulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping("B", "B"));
        SqlBulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping("C", "C"));
        SqlBulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping("D", "D"));
        SqlBulkCopy.BatchSize = Table.Rows.Count;
        SqlBulkCopy.DestinationTableName = "E";
        SqlBulkCopy.WriteToServer(Table);
    }
}
EDIT/FINAL CODE: So the app is now finished and works AMAZING, and quite speedy! @mattmc3, thanks for all the help! Here is the final code for anyone who may find it useful:
List<string> Rows = new List<string>();

using (StreamReader Reader = new StreamReader(@"?.csv")) {
    string Line = string.Empty;
    while (!String.IsNullOrWhiteSpace(Line = Reader.ReadLine())) {
        Rows.Add(Line);
    }
}

if (Rows.Count > 0) {
    int RowsInserted = 0;

    DataTable Table = new DataTable();
    Table.Columns.Add(new DataColumn("Id"));
    Table.Columns.Add(new DataColumn("A"));

    while ((RowsInserted < Rows.Count) && ((Rows.Count - RowsInserted) >= 1000)) {
        Table = GetDataTable(Rows.Skip(RowsInserted).Take(1000).ToList(), Table);
        PerformSqlBulkCopy(Table);
        RowsInserted += 1000;
        Table.Clear();
    }

    Table = GetDataTable(Rows.Skip(RowsInserted).ToList(), Table);
    PerformSqlBulkCopy(Table);
}

static DataTable GetDataTable(
    List<string> Rows,
    DataTable Table) {
    for (short a = 0, b = (short)Rows.Count; a < b; a++) {
        string[] Columns = Rows[a].Split(new char[1] {
            ','
        }, StringSplitOptions.RemoveEmptyEntries);

        DataRow Row = Table.NewRow();
        Row["A"] = "";
        Table.Rows.Add(Row);
    }

    return (Table);
}

static void PerformSqlBulkCopy(
    DataTable Table) {
    using (SqlBulkCopy SqlBulkCopy = new SqlBulkCopy(@"", SqlBulkCopyOptions.TableLock)) {
        SqlBulkCopy.BatchSize = Table.Rows.Count;
        SqlBulkCopy.DestinationTableName = "";
        SqlBulkCopy.WriteToServer(Table);
    }
}
If you are doing a Bulk Insert into the table in SQL Server, which is how you should be doing this (BCP, Bulk Insert, Insert Into...Select, or in .NET, the SqlBulkCopy class), you can use the "Bulk Logged" recovery model. I highly recommend reading the MSDN articles on recovery models: http://msdn.microsoft.com/en-us/library/ms189275.aspx
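If the database can tolerate being switched for the duration of the monthly load, a hedged sketch of the surrounding steps follows (MyMonthlyDb is a placeholder name, and Connection is assumed to be an open SqlConnection with rights to run ALTER DATABASE):

// Switch to minimally logged bulk operations for the load (placeholder database name).
using (SqlCommand cmd = new SqlCommand("ALTER DATABASE [MyMonthlyDb] SET RECOVERY BULK_LOGGED;", Connection)) {
    cmd.ExecuteNonQuery();
}

// ... purge the table and run the SqlBulkCopy load here ...

// Switch back to the normal recovery model afterwards.
using (SqlCommand cmd = new SqlCommand("ALTER DATABASE [MyMonthlyDb] SET RECOVERY FULL;", Connection)) {
    cmd.ExecuteNonQuery();
}

If you rely on point-in-time restores, take a log backup after switching back, since a log interval containing bulk-logged operations cannot be restored to an arbitrary point in time.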
You can set the recovery model for each database separately. Maybe the simple recovery model will work for you. The simple model:
Automatically reclaims log space to keep space requirements small, essentially eliminating the need to manage the transaction log space.
Read up on it here.
There is no way to bypass using the transaction log in SQL Server.